00:00:00.001 Started by user sys_sgci
00:00:00.007 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/autotest-per-patch_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-upstream/autotest.groovy
00:00:02.297 The recommended git tool is: git
00:00:02.297 using credential 00000000-0000-0000-0000-000000000002
00:00:02.299 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/autotest-per-patch_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:02.307 Fetching changes from the remote Git repository
00:00:02.309 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:02.317 Using shallow fetch with depth 1
00:00:02.317 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:02.317 > git --version # timeout=10
00:00:02.324 > git --version # 'git version 2.39.2'
00:00:02.324 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:02.333 Setting http proxy: proxy-dmz.intel.com:911
00:00:02.333 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:04.353 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:04.365 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:04.378 Checking out Revision 308e970df89ed396a3f9dcf22fba8891259694e4 (FETCH_HEAD)
00:00:04.378 > git config core.sparsecheckout # timeout=10
00:00:04.389 > git read-tree -mu HEAD # timeout=10
00:00:04.407 > git checkout -f 308e970df89ed396a3f9dcf22fba8891259694e4 # timeout=5
00:00:04.429 Commit message: "jjb/create-perf-report: make job run concurrent"
00:00:04.429 > git rev-list --no-walk 308e970df89ed396a3f9dcf22fba8891259694e4 # timeout=10
00:00:04.604 [Pipeline] Start of Pipeline
00:00:04.618 [Pipeline] library
00:00:04.620 Loading library shm_lib@master
00:00:04.620 Library shm_lib@master is cached. Copying from home.
00:00:04.648 [Pipeline] node
00:00:04.654 Running on ME2 in /var/jenkins/workspace/autotest-per-patch
00:00:04.660 [Pipeline] {
00:00:04.676 [Pipeline] cleanWs
00:00:04.687 [WS-CLEANUP] Deleting project workspace...
00:00:04.687 [WS-CLEANUP] Deferred wipeout is used...
00:00:04.693 [WS-CLEANUP] done
00:00:04.698 [Pipeline] stage
00:00:04.701 [Pipeline] { (Prologue)
00:00:04.797 [Pipeline] withCredentials
00:00:04.806 > git --version # timeout=10
00:00:04.818 > git --version # 'git version 2.39.2'
00:00:04.834 Masking supported pattern matches of $GIT_USERNAME or $GIT_PASSWORD or $GIT_ASKPASS
00:00:04.836 [Pipeline] {
00:00:04.844 [Pipeline] retry
00:00:04.845 [Pipeline] {
00:00:04.998 [Pipeline] sh
00:00:05.278 + git ls-remote https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master
00:00:07.202 [Pipeline] }
00:00:07.227 [Pipeline] // retry
00:00:07.233 [Pipeline] }
00:00:07.255 [Pipeline] // withCredentials
00:00:07.266 [Pipeline] httpRequest
00:00:07.282 [Pipeline] echo
00:00:07.284 Sorcerer 10.211.164.101 is alive
00:00:07.292 [Pipeline] httpRequest
00:00:07.296 HttpMethod: GET
00:00:07.297 URL: http://10.211.164.101/packages/jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz
00:00:07.297 Sending request to url: http://10.211.164.101/packages/jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz
00:00:07.298 Response Code: HTTP/1.1 200 OK
00:00:07.299 Success: Status code 200 is in the accepted range: 200,404
00:00:07.299 Saving response body to /var/jenkins/workspace/autotest-per-patch/jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz
00:00:07.444 [Pipeline] sh
00:00:07.726 + tar --no-same-owner -xf jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz
00:00:07.744 [Pipeline] httpRequest
00:00:07.758 [Pipeline] echo
00:00:07.759 Sorcerer 10.211.164.101 is alive
00:00:07.767 [Pipeline] httpRequest
00:00:07.771 HttpMethod: GET
00:00:07.772 URL: http://10.211.164.101/packages/spdk_ea3ef9678a81e1b9ff1db2058df2d6062fa6ecc2.tar.gz
00:00:07.772 Sending request to url: http://10.211.164.101/packages/spdk_ea3ef9678a81e1b9ff1db2058df2d6062fa6ecc2.tar.gz
00:00:07.773 Response Code: HTTP/1.1 404 Not Found
00:00:07.774 Success: Status code 404 is in the accepted range: 200,404
00:00:07.774 Saving response body to /var/jenkins/workspace/autotest-per-patch/spdk_ea3ef9678a81e1b9ff1db2058df2d6062fa6ecc2.tar.gz
00:00:07.780 [Pipeline] sh
00:00:08.059 + rm -f spdk_ea3ef9678a81e1b9ff1db2058df2d6062fa6ecc2.tar.gz
00:00:08.074 [Pipeline] retry
00:00:08.077 [Pipeline] {
00:00:08.100 [Pipeline] checkout
00:00:08.107 The recommended git tool is: NONE
00:00:08.116 using credential 00000000-0000-0000-0000-000000000002
00:00:08.121 Cloning the remote Git repository
00:00:08.124 Honoring refspec on initial clone
00:00:08.128 Cloning repository https://review.spdk.io/gerrit/a/spdk/spdk
00:00:08.128 > git init /var/jenkins/workspace/autotest-per-patch/spdk # timeout=10
00:00:08.132 Using reference repository: /var/ci_repos/spdk_multi
00:00:08.132 Fetching upstream changes from https://review.spdk.io/gerrit/a/spdk/spdk
00:00:08.132 > git --version # timeout=10
00:00:08.133 > git --version # 'git version 2.42.0'
00:00:08.133 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:08.135 Setting http proxy: proxy-dmz.intel.com:911
00:00:08.135 > git fetch --tags --force --progress -- https://review.spdk.io/gerrit/a/spdk/spdk refs/changes/40/24040/6 +refs/heads/master:refs/remotes/origin/master # timeout=10
00:00:13.291 Avoid second fetch
00:00:13.300 Checking out Revision ea3ef9678a81e1b9ff1db2058df2d6062fa6ecc2 (FETCH_HEAD)
00:00:13.479 Commit message: "accel: introduce tasks in sequence limit"
00:00:13.483 First time build. Skipping changelog.
00:00:13.285 > git config remote.origin.url https://review.spdk.io/gerrit/a/spdk/spdk # timeout=10
00:00:13.287 > git config --add remote.origin.fetch refs/changes/40/24040/6 # timeout=10
00:00:13.288 > git config --add remote.origin.fetch +refs/heads/master:refs/remotes/origin/master # timeout=10
00:00:13.295 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:13.300 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:13.304 > git config core.sparsecheckout # timeout=10
00:00:13.306 > git checkout -f ea3ef9678a81e1b9ff1db2058df2d6062fa6ecc2 # timeout=10
00:00:13.483 > git rev-list --no-walk b2ac96cc231173e6cb375d29bc2848cfae3add6a # timeout=10
00:00:13.489 > git remote # timeout=10
00:00:13.491 > git submodule init # timeout=10
00:00:13.515 > git submodule sync # timeout=10
00:00:13.539 > git config --get remote.origin.url # timeout=10
00:00:13.543 > git submodule init # timeout=10
00:00:13.565 > git config -f .gitmodules --get-regexp ^submodule\.(.+)\.url # timeout=10
00:00:13.567 > git config --get submodule.dpdk.url # timeout=10
00:00:13.568 > git remote # timeout=10
00:00:13.570 > git config --get remote.origin.url # timeout=10
00:00:13.571 > git config -f .gitmodules --get submodule.dpdk.path # timeout=10
00:00:13.573 > git config --get submodule.intel-ipsec-mb.url # timeout=10
00:00:13.574 > git remote # timeout=10
00:00:13.576 > git config --get remote.origin.url # timeout=10
00:00:13.577 > git config -f .gitmodules --get submodule.intel-ipsec-mb.path # timeout=10
00:00:13.579 > git config --get submodule.isa-l.url # timeout=10
00:00:13.580 > git remote # timeout=10
00:00:13.582 > git config --get remote.origin.url # timeout=10
00:00:13.583 > git config -f .gitmodules --get submodule.isa-l.path # timeout=10
00:00:13.584 > git config --get submodule.ocf.url # timeout=10
00:00:13.586 > git remote # timeout=10
00:00:13.587 > git config --get remote.origin.url # timeout=10
00:00:13.589 > git config -f .gitmodules --get submodule.ocf.path # timeout=10
00:00:13.590 > git config --get submodule.libvfio-user.url # timeout=10
00:00:13.592 > git remote # timeout=10
00:00:13.593 > git config --get remote.origin.url # timeout=10
00:00:13.595 > git config -f .gitmodules --get submodule.libvfio-user.path # timeout=10
00:00:13.596 > git config --get submodule.xnvme.url # timeout=10
00:00:13.597 > git remote # timeout=10
00:00:13.599 > git config --get remote.origin.url # timeout=10
00:00:13.600 > git config -f .gitmodules --get submodule.xnvme.path # timeout=10
00:00:13.602 > git config --get submodule.isa-l-crypto.url # timeout=10
00:00:13.603 > git remote # timeout=10
00:00:13.605 > git config --get remote.origin.url # timeout=10
00:00:13.606 > git config -f .gitmodules --get submodule.isa-l-crypto.path # timeout=10
00:00:13.608 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:13.608 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:13.608 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:13.608 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:13.609 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:13.609 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:13.609 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:13.610 Setting http proxy: proxy-dmz.intel.com:911
00:00:13.610 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi libvfio-user # timeout=10
00:00:13.611 Setting http proxy: proxy-dmz.intel.com:911
00:00:13.611 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi xnvme # timeout=10
00:00:13.611 Setting http proxy: proxy-dmz.intel.com:911
00:00:13.611 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi intel-ipsec-mb # timeout=10
00:00:13.611 Setting http proxy: proxy-dmz.intel.com:911
00:00:13.611 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi ocf # timeout=10
00:00:13.611 Setting http proxy: proxy-dmz.intel.com:911
00:00:13.611 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi isa-l-crypto # timeout=10
00:00:13.611 Setting http proxy: proxy-dmz.intel.com:911
00:00:13.611 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi isa-l # timeout=10
00:00:13.611 Setting http proxy: proxy-dmz.intel.com:911
00:00:13.611 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi dpdk # timeout=10
00:00:34.558 [Pipeline] dir
00:00:34.559 Running in /var/jenkins/workspace/autotest-per-patch/spdk
00:00:34.560 [Pipeline] {
00:00:34.577 [Pipeline] sh
00:00:34.853 ++ nproc
00:00:34.854 + threads=8
00:00:34.854 + git repack -a -d --threads=8
00:00:38.124 + git submodule foreach git repack -a -d --threads=8
00:00:38.124 Entering 'dpdk'
00:00:42.301 Entering 'intel-ipsec-mb'
00:00:42.301 Entering 'isa-l'
00:00:42.301 Entering 'isa-l-crypto'
00:00:42.301 Entering 'libvfio-user'
00:00:42.301 Entering 'ocf'
00:00:42.558 Entering 'xnvme'
00:00:42.818 + find .git -type f -name alternates -print -delete
00:00:42.818 .git/objects/info/alternates
00:00:42.818 .git/modules/libvfio-user/objects/info/alternates
00:00:42.818 .git/modules/isa-l-crypto/objects/info/alternates
00:00:42.818 .git/modules/ocf/objects/info/alternates
00:00:42.818 .git/modules/intel-ipsec-mb/objects/info/alternates
00:00:42.818 .git/modules/isa-l/objects/info/alternates
00:00:42.818 .git/modules/xnvme/objects/info/alternates
00:00:42.818 .git/modules/dpdk/objects/info/alternates
00:00:42.829 [Pipeline] }
00:00:42.854 [Pipeline] // dir
00:00:42.860 [Pipeline] }
00:00:42.881 [Pipeline] // retry
00:00:42.888 [Pipeline] sh
00:00:43.173 + hash pigz
00:00:43.173 + tar -cf spdk_ea3ef9678a81e1b9ff1db2058df2d6062fa6ecc2.tar.gz -I pigz spdk
00:00:45.127 [Pipeline] httpRequest
00:00:45.133 HttpMethod: PUT
00:00:45.134 URL: http://10.211.164.101/cgi-bin/sorcerer.py?group=packages&filename=spdk_ea3ef9678a81e1b9ff1db2058df2d6062fa6ecc2.tar.gz
00:00:45.134 Sending request to url: http://10.211.164.101/cgi-bin/sorcerer.py?group=packages&filename=spdk_ea3ef9678a81e1b9ff1db2058df2d6062fa6ecc2.tar.gz
00:00:48.122 Response Code: HTTP/1.1 200 OK
00:00:48.128 Success: Status code 200 is in the accepted range: 200
00:00:48.132 [Pipeline] echo
00:00:48.134 
00:00:48.134 Locking
00:00:48.134 Waited 0s for lock
00:00:48.134 Everything Fine. Saved: /storage/packages/spdk_ea3ef9678a81e1b9ff1db2058df2d6062fa6ecc2.tar.gz
00:00:48.134 
00:00:48.138 [Pipeline] sh
00:00:48.457 + git -C spdk log --oneline -n5
00:00:48.457 ea3ef9678 accel: introduce tasks in sequence limit
00:00:48.457 719d03c6a sock/uring: only register net impl if supported
00:00:48.457 e64f085ad vbdev_lvol_ut: unify usage of dummy base bdev
00:00:48.457 9937c0160 lib/rdma: bind TRACE_BDEV_IO_START/DONE to OBJECT_NVMF_RDMA_IO
00:00:48.457 6c7c1f57e accel: add sequence outstanding stat
00:00:48.473 [Pipeline] setCustomBuildProperty
00:00:48.480 [Pipeline] setCustomBuildProperty
00:00:48.488 [Pipeline] catchError
00:00:48.489 [Pipeline] {
00:00:48.505 [Pipeline] sh
00:00:48.784 + git -C spdk describe --tags --abbrev=0 origin/master
00:00:48.796 [Pipeline] sh
00:00:49.073 + git -C spdk describe --tags --abbrev=0 --exclude=LTS HEAD
00:00:49.088 [Pipeline] echo
00:00:49.090 Branch: master
00:00:49.093 [Pipeline] fileExists
00:00:49.111 [Pipeline] readJSON
00:00:49.125 [Pipeline] }
00:00:49.151 [Pipeline] // catchError
00:00:49.162 [Pipeline] sh
00:00:49.442 + /var/jenkins/workspace/autotest-per-patch/jbp/jenkins/jjb-config/jobs/scripts/get-pkgdep-jobs.sh /var/jenkins/workspace/autotest-per-patch/spdk
00:00:49.465 [Pipeline] }
00:00:49.492 [Pipeline] // stage
00:00:49.511 [Pipeline] catchError
00:00:49.512 [Pipeline] {
00:00:49.533 [Pipeline] stage
00:00:49.535 [Pipeline] { (Pre tests)
00:00:49.571 [Pipeline] parallel
00:00:49.583 [Pipeline] { (Branch: check-format-docker-autotest)
00:00:49.584 [Pipeline] { (Branch: check-so-deps-docker-autotest)
00:00:49.586 [Pipeline] { (Branch: doc-docker-autotest)
00:00:49.587 [Pipeline] { (Branch: build-files-docker-autotest)
00:00:49.610 [Pipeline] retry
00:00:49.612 [Pipeline] {
00:00:49.618 [Pipeline] retry
00:00:49.619 [Pipeline] {
00:00:49.624 [Pipeline] retry
00:00:49.625 [Pipeline] {
00:00:49.631 [Pipeline] retry
00:00:49.632 [Pipeline] {
00:00:49.653 [Pipeline] build
00:00:49.656 Scheduling project: check-format-docker-autotest
00:00:49.663 [Pipeline] build
00:00:49.666 Scheduling project: check-so-deps-docker-autotest
00:00:49.673 [Pipeline] build
00:00:49.676 Scheduling project: doc-docker-autotest
00:00:49.683 [Pipeline] build
00:00:49.686 Scheduling project: build-files-docker-autotest
00:00:55.044 Starting building: doc-docker-autotest #26703
00:00:55.047 Starting building: check-format-docker-autotest #26509
00:00:55.051 Starting building: build-files-docker-autotest #26488
00:00:55.055 Starting building: check-so-deps-docker-autotest #26520
00:01:31.338 Build doc-docker-autotest #26703 completed: SUCCESS
00:01:31.343 [Pipeline] }
00:01:31.381 [Pipeline] // retry
00:01:31.389 [Pipeline] }
00:01:39.968 Build check-format-docker-autotest #26509 completed: SUCCESS
00:01:39.972 [Pipeline] }
00:01:40.006 [Pipeline] // retry
00:01:40.012 [Pipeline] }
00:02:25.737 Build build-files-docker-autotest #26488 completed: FAILURE
00:02:25.778 [Pipeline] echo
00:02:25.779 No retry patterns found.
00:02:25.780 [Pipeline] }
00:02:25.802 [Pipeline] // retry
00:02:25.807 [Pipeline] error
00:02:25.812 [Pipeline] }
00:02:25.816 Failed in branch build-files-docker-autotest
00:03:04.058 Build check-so-deps-docker-autotest #26520 completed: FAILURE
00:03:04.123 [Pipeline] echo
00:03:04.124 No retry patterns found.
00:03:04.125 [Pipeline] }
00:03:04.147 [Pipeline] // retry
00:03:04.154 [Pipeline] error
00:03:04.160 [Pipeline] }
00:03:04.165 Failed in branch check-so-deps-docker-autotest
00:03:04.209 [Pipeline] // parallel
00:03:04.215 [Pipeline] }
00:03:04.234 [Pipeline] // stage
00:03:04.240 [Pipeline] }
00:03:04.245 ERROR: Build build-files-docker-autotest #26488 failed
00:03:04.245 Setting overall build result to FAILURE
00:03:04.262 [Pipeline] // catchError
00:03:04.271 [Pipeline] catchError
00:03:04.273 [Pipeline] {
00:03:04.294 [Pipeline] stage
00:03:04.296 [Pipeline] { (Tests)
00:03:04.316 [Pipeline] unstable
00:03:04.320 WARNING: Previous stages failed
00:03:04.321 [Pipeline] }
00:03:04.351 [Pipeline] // stage
00:03:04.356 [Pipeline] }
00:03:04.383 [Pipeline] // catchError
00:03:04.393 [Pipeline] stage
00:03:04.396 [Pipeline] { (Autorun Post and Coverage)
00:03:04.418 [Pipeline] setCustomBuildProperty
00:03:04.443 [Pipeline] dir
00:03:04.443 Running in /var/jenkins/workspace/autotest-per-patch/doc-docker-autotest_26703
00:03:04.445 [Pipeline] {
00:03:04.472 [Pipeline] copyArtifacts
00:03:04.729 Copied 5 artifacts from "doc-docker-autotest" build number 26703
00:03:04.734 [Pipeline] writeFile
00:03:04.760 [Pipeline] }
00:03:04.791 [Pipeline] // dir
00:03:04.809 [Pipeline] dir
00:03:04.810 Running in /var/jenkins/workspace/autotest-per-patch/check-format-docker-autotest_26509
00:03:04.812 [Pipeline] {
00:03:04.839 [Pipeline] copyArtifacts
00:03:04.892 Copied 4 artifacts from "check-format-docker-autotest" build number 26509
00:03:04.897 [Pipeline] writeFile
00:03:04.922 [Pipeline] }
00:03:04.954 [Pipeline] // dir
00:03:05.003 [Pipeline] dir
00:03:05.003 Running in /var/jenkins/workspace/autotest-per-patch/build-files-docker-autotest_26488
00:03:05.005 [Pipeline] {
00:03:05.028 [Pipeline] copyArtifacts
00:03:05.064 Copied 2 artifacts from "build-files-docker-autotest" build number 26488
00:03:05.068 [Pipeline] writeFile
00:03:05.094 [Pipeline] }
00:03:05.116 [Pipeline] // dir
00:03:05.174 [Pipeline] dir
00:03:05.174 Running in /var/jenkins/workspace/autotest-per-patch/check-so-deps-docker-autotest_26520
00:03:05.175 [Pipeline] {
00:03:05.199 [Pipeline] copyArtifacts
00:03:05.253 Copied 4 artifacts from "check-so-deps-docker-autotest" build number 26520
00:03:05.257 [Pipeline] writeFile
00:03:05.285 [Pipeline] }
00:03:05.308 [Pipeline] // dir
00:03:05.318 [Pipeline] catchError
00:03:05.320 [Pipeline] {
00:03:05.340 [Pipeline] sh
00:03:05.620 + jbp/jenkins/jjb-config/jobs/scripts/post_gen_coverage.sh
00:03:05.620 + shopt -s globstar nullglob
00:03:05.620 + echo 'Start stage post_gen_coverage.sh'
00:03:05.620 Start stage post_gen_coverage.sh
00:03:05.620 + cd /var/jenkins/workspace/autotest-per-patch
00:03:05.620 + rm -rf /var/jenkins/workspace/autotest-per-patch/spdk/doc
00:03:05.620 + trap 'compress_coverage_and_docs; remove_partial_coverage_files && echo '\''End stage post_gen_coverage.sh'\''' EXIT
00:03:05.620 + move_artifacts
00:03:05.620 + local out_dirs
00:03:05.620 + out_dirs=(./**/output/)
00:03:05.620 + for dir in "${out_dirs[@]}"
00:03:05.620 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:05.620 + [[ -f ./build-files-docker-autotest_26488/output//doc.tar.xz ]]
00:03:05.620 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:05.620 + [[ -f ./build-files-docker-autotest_26488/output//ut_coverage.tar.xz ]]
00:03:05.620 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:05.620 + [[ -f ./build-files-docker-autotest_26488/output//llvm.tar.xz ]]
00:03:05.620 + mv ./build-files-docker-autotest_26488/output//build-repo-manifest.txt ./build-files-docker-autotest_26488/output//power.tar.xz ./build-files-docker-autotest_26488/output//..
00:03:05.620 + rmdir ./build-files-docker-autotest_26488/output/
00:03:05.620 + for dir in "${out_dirs[@]}"
00:03:05.620 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:05.620 + [[ -f ./check-format-docker-autotest_26509/output//doc.tar.xz ]]
00:03:05.620 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:05.620 + [[ -f ./check-format-docker-autotest_26509/output//ut_coverage.tar.xz ]]
00:03:05.620 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:05.620 + [[ -f ./check-format-docker-autotest_26509/output//llvm.tar.xz ]]
00:03:05.620 + mv ./check-format-docker-autotest_26509/output//build-repo-manifest.txt ./check-format-docker-autotest_26509/output//power.tar.xz ./check-format-docker-autotest_26509/output//test_completions.txt ./check-format-docker-autotest_26509/output//timing.txt ./check-format-docker-autotest_26509/output//..
00:03:05.620 + rmdir ./check-format-docker-autotest_26509/output/
00:03:05.620 + for dir in "${out_dirs[@]}"
00:03:05.620 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:05.620 + [[ -f ./check-so-deps-docker-autotest_26520/output//doc.tar.xz ]]
00:03:05.620 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:05.620 + [[ -f ./check-so-deps-docker-autotest_26520/output//ut_coverage.tar.xz ]]
00:03:05.620 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:05.620 + [[ -f ./check-so-deps-docker-autotest_26520/output//llvm.tar.xz ]]
00:03:05.620 + mv ./check-so-deps-docker-autotest_26520/output//build-repo-manifest.txt ./check-so-deps-docker-autotest_26520/output//power.tar.xz ./check-so-deps-docker-autotest_26520/output//test_completions.txt ./check-so-deps-docker-autotest_26520/output//timing.txt ./check-so-deps-docker-autotest_26520/output//..
00:03:05.620 + rmdir ./check-so-deps-docker-autotest_26520/output/
00:03:05.620 + for dir in "${out_dirs[@]}"
00:03:05.620 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:05.620 + [[ -f ./doc-docker-autotest_26703/output//doc.tar.xz ]]
00:03:05.620 + tar -C ./doc-docker-autotest_26703/output/ -xf ./doc-docker-autotest_26703/output//doc.tar.xz
00:03:05.880 + rm ./doc-docker-autotest_26703/output//doc.tar.xz
00:03:05.880 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:05.880 + [[ -f ./doc-docker-autotest_26703/output//ut_coverage.tar.xz ]]
00:03:05.880 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:05.880 + [[ -f ./doc-docker-autotest_26703/output//llvm.tar.xz ]]
00:03:05.880 + mv ./doc-docker-autotest_26703/output//build-repo-manifest.txt ./doc-docker-autotest_26703/output//doc ./doc-docker-autotest_26703/output//power.tar.xz ./doc-docker-autotest_26703/output//test_completions.txt ./doc-docker-autotest_26703/output//timing.txt ./doc-docker-autotest_26703/output//..
00:03:05.880 + rmdir ./doc-docker-autotest_26703/output/
00:03:05.880 + unpack_cov_files
00:03:05.880 + local info_files
00:03:05.880 + info_files=(*/cov_*.info.xz)
00:03:05.880 + printf '%s\n'
00:03:05.880 + xargs -P0 -r -n1 xz -d
00:03:05.880 + fix_downstream_job_paths
00:03:05.880 + sed -i -e 's#^SF:/.\+/spdk/#SF:/var/jenkins/workspace/autotest-per-patch/spdk/#g'
00:03:05.880 sed: no input files
00:03:05.880 + compress_coverage_and_docs
00:03:05.880 + echo 'Start compress coverage and docs'
00:03:05.880 Start compress coverage and docs
00:03:05.880 + tar -C coverage -czf coverage_autotest-per-patch_126108.tar.gz ./ --remove-files
00:03:05.880 tar: coverage: Cannot open: No such file or directory
00:03:05.880 tar: Error is not recoverable: exiting now
00:03:05.898 [Pipeline] }
00:03:05.901 ERROR: script returned exit code 2
00:03:05.932 [Pipeline] // catchError
00:03:05.942 [Pipeline] catchError
00:03:05.944 [Pipeline] {
00:03:05.963 [Pipeline] dir
00:03:05.964 Running in /var/jenkins/workspace/autotest-per-patch/post_process
00:03:05.965 [Pipeline] {