00:00:00.001 Started by user sys_sgci
00:00:00.009 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/autotest-per-patch_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-upstream/autotest.groovy
00:00:00.009 The recommended git tool is: git
00:00:00.010 using credential 00000000-0000-0000-0000-000000000002
00:00:00.012 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/autotest-per-patch_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.022 Fetching changes from the remote Git repository
00:00:00.024 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.034 Using shallow fetch with depth 1
00:00:00.034 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.034 > git --version # timeout=10
00:00:00.043 > git --version # 'git version 2.39.2'
00:00:00.043 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.052 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.052 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.150 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.165 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.178 Checking out Revision 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d (FETCH_HEAD)
00:00:02.178 > git config core.sparsecheckout # timeout=10
00:00:02.189 > git read-tree -mu HEAD # timeout=10
00:00:02.205 > git checkout -f 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=5
00:00:02.225 Commit message: "inventory: add WCP3 to free inventory"
00:00:02.225 > git rev-list --no-walk 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=10
00:00:02.368 [Pipeline] Start of Pipeline
00:00:02.383 [Pipeline] library
00:00:02.384 Loading library shm_lib@master
00:00:02.384 Library shm_lib@master is cached. Copying from home.
00:00:02.412 [Pipeline] node
00:00:02.418 Running on ME1 in /var/jenkins/workspace/autotest-per-patch
00:00:02.424 [Pipeline] {
00:00:02.440 [Pipeline] cleanWs
00:00:02.451 [WS-CLEANUP] Deleting project workspace...
00:00:02.451 [WS-CLEANUP] Deferred wipeout is used...
00:00:02.457 [WS-CLEANUP] done
00:00:02.462 [Pipeline] stage
00:00:02.466 [Pipeline] { (Prologue)
00:00:02.583 [Pipeline] withCredentials
00:00:02.592 > git --version # timeout=10
00:00:02.603 > git --version # 'git version 2.39.2'
00:00:02.620 Masking supported pattern matches of $GIT_USERNAME or $GIT_PASSWORD or $GIT_ASKPASS
00:00:02.622 [Pipeline] {
00:00:02.633 [Pipeline] retry
00:00:02.635 [Pipeline] {
00:00:02.821 [Pipeline] sh
00:00:03.105 + git ls-remote https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master
00:00:05.032 [Pipeline] }
00:00:05.055 [Pipeline] // retry
00:00:05.061 [Pipeline] }
00:00:05.084 [Pipeline] // withCredentials
00:00:05.122 [Pipeline] httpRequest
00:00:05.141 [Pipeline] echo
00:00:05.143 Sorcerer 10.211.164.101 is alive
00:00:05.154 [Pipeline] httpRequest
00:00:05.159 HttpMethod: GET
00:00:05.160 URL: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:05.161 Sending request to url: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:05.162 Response Code: HTTP/1.1 200 OK
00:00:05.163 Success: Status code 200 is in the accepted range: 200,404
00:00:05.163 Saving response body to /var/jenkins/workspace/autotest-per-patch/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:05.308 [Pipeline] sh
00:00:05.592 + tar --no-same-owner -xf jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:05.612 [Pipeline] httpRequest
00:00:05.631 [Pipeline] echo
00:00:05.632 Sorcerer 10.211.164.101 is alive
00:00:05.641 [Pipeline] httpRequest
00:00:05.646 HttpMethod: GET
00:00:05.646 URL: http://10.211.164.101/packages/spdk_a84bf89b48208c8f0849441de018ed17a2056966.tar.gz
00:00:05.647 Sending request to url: http://10.211.164.101/packages/spdk_a84bf89b48208c8f0849441de018ed17a2056966.tar.gz
00:00:05.648 Response Code: HTTP/1.1 404 Not Found
00:00:05.649 Success: Status code 404 is in the accepted range: 200,404
00:00:05.649 Saving response body to /var/jenkins/workspace/autotest-per-patch/spdk_a84bf89b48208c8f0849441de018ed17a2056966.tar.gz
00:00:05.659 [Pipeline] sh
00:00:05.945 + rm -f spdk_a84bf89b48208c8f0849441de018ed17a2056966.tar.gz
00:00:05.961 [Pipeline] retry
00:00:05.963 [Pipeline] {
00:00:05.985 [Pipeline] checkout
00:00:05.993 The recommended git tool is: NONE
00:00:06.019 using credential 00000000-0000-0000-0000-000000000002
00:00:06.025 Cloning the remote Git repository
00:00:06.028 Honoring refspec on initial clone
00:00:06.031 Cloning repository https://review.spdk.io/gerrit/a/spdk/spdk
00:00:06.031 > git init /var/jenkins/workspace/autotest-per-patch/spdk # timeout=10
00:00:06.037 Using reference repository: /var/ci_repos/spdk_multi
00:00:06.037 Fetching upstream changes from https://review.spdk.io/gerrit/a/spdk/spdk
00:00:06.037 > git --version # timeout=10
00:00:06.038 > git --version # 'git version 2.42.0'
00:00:06.038 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:06.068 Setting http proxy: proxy-dmz.intel.com:911
00:00:06.068 > git fetch --tags --force --progress -- https://review.spdk.io/gerrit/a/spdk/spdk refs/changes/68/24168/1 +refs/heads/master:refs/remotes/origin/master # timeout=10
00:00:11.300 Avoid second fetch
00:00:11.310 Checking out Revision a84bf89b48208c8f0849441de018ed17a2056966 (FETCH_HEAD)
00:00:11.490 Commit message: "nvmf: consolidate listener addition in avahi_entry_group_add_listeners"
00:00:11.494 First time build. Skipping changelog.
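The prologue above first probes the package cache ("Sorcerer", 10.211.164.101) for a prebuilt SPDK tarball; the 404 is treated as an accepted cache miss, after which the pipeline clones the change from Gerrit while borrowing objects from a local reference repository. A minimal bash sketch of that flow follows; the helper name and control flow are assumptions (the real logic lives in the pipeline's Groovy libraries), while the URLs, refspec, commit, and /var/ci_repos/spdk_multi path are taken from the log:

    #!/usr/bin/env bash
    # Sketch only: approximates the cache-or-clone behaviour traced above.
    set -euo pipefail

    pkg="spdk_a84bf89b48208c8f0849441de018ed17a2056966.tar.gz"

    fetch_cached_spdk() {
        # A 404 from the cache is a normal miss (status range 200,404 is accepted above).
        curl -sf -o "$pkg" "http://10.211.164.101/packages/$pkg"
    }

    if fetch_cached_spdk; then
        # Cache hit: unpack the prebuilt source tree.
        tar --no-same-owner -xf "$pkg"
    else
        rm -f "$pkg"
        # Cache miss: fresh clone that borrows objects from the local mirror
        # ("Using reference repository: /var/ci_repos/spdk_multi").
        git clone --reference /var/ci_repos/spdk_multi https://review.spdk.io/gerrit/a/spdk/spdk spdk
        git -C spdk fetch origin refs/changes/68/24168/1
        git -C spdk checkout -f a84bf89b48208c8f0849441de018ed17a2056966
        git -C spdk submodule update --init --recursive --reference /var/ci_repos/spdk_multi
    fi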
00:00:11.287 > git config remote.origin.url https://review.spdk.io/gerrit/a/spdk/spdk # timeout=10
00:00:11.289 > git config --add remote.origin.fetch refs/changes/68/24168/1 # timeout=10
00:00:11.291 > git config --add remote.origin.fetch +refs/heads/master:refs/remotes/origin/master # timeout=10
00:00:11.299 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:11.304 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:11.308 > git config core.sparsecheckout # timeout=10
00:00:11.310 > git checkout -f a84bf89b48208c8f0849441de018ed17a2056966 # timeout=10
00:00:11.488 > git rev-list --no-walk 897e912d5ef39b95adea4d69f24b5af81e596e94 # timeout=10
00:00:11.495 > git remote # timeout=10
00:00:11.496 > git submodule init # timeout=10
00:00:11.521 > git submodule sync # timeout=10
00:00:11.544 > git config --get remote.origin.url # timeout=10
00:00:11.548 > git submodule init # timeout=10
00:00:11.570 > git config -f .gitmodules --get-regexp ^submodule\.(.+)\.url # timeout=10
00:00:11.572 > git config --get submodule.dpdk.url # timeout=10
00:00:11.573 > git remote # timeout=10
00:00:11.575 > git config --get remote.origin.url # timeout=10
00:00:11.576 > git config -f .gitmodules --get submodule.dpdk.path # timeout=10
00:00:11.578 > git config --get submodule.intel-ipsec-mb.url # timeout=10
00:00:11.579 > git remote # timeout=10
00:00:11.581 > git config --get remote.origin.url # timeout=10
00:00:11.582 > git config -f .gitmodules --get submodule.intel-ipsec-mb.path # timeout=10
00:00:11.583 > git config --get submodule.isa-l.url # timeout=10
00:00:11.585 > git remote # timeout=10
00:00:11.586 > git config --get remote.origin.url # timeout=10
00:00:11.588 > git config -f .gitmodules --get submodule.isa-l.path # timeout=10
00:00:11.589 > git config --get submodule.ocf.url # timeout=10
00:00:11.590 > git remote # timeout=10
00:00:11.592 > git config --get remote.origin.url # timeout=10
00:00:11.593 > git config -f .gitmodules --get submodule.ocf.path # timeout=10
00:00:11.594 > git config --get submodule.libvfio-user.url # timeout=10
00:00:11.596 > git remote # timeout=10
00:00:11.597 > git config --get remote.origin.url # timeout=10
00:00:11.599 > git config -f .gitmodules --get submodule.libvfio-user.path # timeout=10
00:00:11.600 > git config --get submodule.xnvme.url # timeout=10
00:00:11.602 > git remote # timeout=10
00:00:11.603 > git config --get remote.origin.url # timeout=10
00:00:11.605 > git config -f .gitmodules --get submodule.xnvme.path # timeout=10
00:00:11.606 > git config --get submodule.isa-l-crypto.url # timeout=10
00:00:11.608 > git remote # timeout=10
00:00:11.609 > git config --get remote.origin.url # timeout=10
00:00:11.611 > git config -f .gitmodules --get submodule.isa-l-crypto.path # timeout=10
00:00:11.613 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:11.613 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:11.613 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:11.613 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:11.613 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:11.613 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:11.614 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:11.615 Setting http proxy: proxy-dmz.intel.com:911
00:00:11.615 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi dpdk # timeout=10
00:00:11.615 Setting http proxy: proxy-dmz.intel.com:911
00:00:11.615 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi ocf # timeout=10
00:00:11.616 Setting http proxy: proxy-dmz.intel.com:911
00:00:11.616 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi isa-l # timeout=10
00:00:11.616 Setting http proxy: proxy-dmz.intel.com:911
00:00:11.616 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi xnvme # timeout=10
00:00:11.616 Setting http proxy: proxy-dmz.intel.com:911
00:00:11.616 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi intel-ipsec-mb # timeout=10
00:00:11.617 Setting http proxy: proxy-dmz.intel.com:911
00:00:11.617 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi isa-l-crypto # timeout=10
00:00:11.618 Setting http proxy: proxy-dmz.intel.com:911
00:00:11.618 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi libvfio-user # timeout=10
00:00:34.256 [Pipeline] dir
00:00:34.257 Running in /var/jenkins/workspace/autotest-per-patch/spdk
00:00:34.259 [Pipeline] {
00:00:34.277 [Pipeline] sh
00:00:34.563 ++ nproc
00:00:34.563 + threads=4
00:00:34.563 + git repack -a -d --threads=4
00:00:38.743 + git submodule foreach git repack -a -d --threads=4
00:00:38.743 Entering 'dpdk'
00:00:42.028 Entering 'intel-ipsec-mb'
00:00:42.028 Entering 'isa-l'
00:00:42.028 Entering 'isa-l-crypto'
00:00:42.300 Entering 'libvfio-user'
00:00:42.300 Entering 'ocf'
00:00:42.557 Entering 'xnvme'
00:00:42.814 + find .git -type f -name alternates -print -delete
00:00:42.814 .git/objects/info/alternates
00:00:42.814 .git/modules/xnvme/objects/info/alternates
00:00:42.814 .git/modules/dpdk/objects/info/alternates
00:00:42.814 .git/modules/ocf/objects/info/alternates
00:00:42.814 .git/modules/isa-l-crypto/objects/info/alternates
00:00:42.814 .git/modules/libvfio-user/objects/info/alternates
00:00:42.814 .git/modules/isa-l/objects/info/alternates
00:00:42.814 .git/modules/intel-ipsec-mb/objects/info/alternates
00:00:42.825 [Pipeline] }
00:00:42.847 [Pipeline] // dir
00:00:42.852 [Pipeline] }
00:00:42.874 [Pipeline] // retry
00:00:42.882 [Pipeline] sh
00:00:43.164 + hash pigz
00:00:43.164 + tar -cf spdk_a84bf89b48208c8f0849441de018ed17a2056966.tar.gz -I pigz spdk
00:00:45.705 [Pipeline] httpRequest
00:00:45.714 HttpMethod: PUT
00:00:45.715 URL: http://10.211.164.101/cgi-bin/sorcerer.py?group=packages&filename=spdk_a84bf89b48208c8f0849441de018ed17a2056966.tar.gz
00:00:45.715 Sending request to url: http://10.211.164.101/cgi-bin/sorcerer.py?group=packages&filename=spdk_a84bf89b48208c8f0849441de018ed17a2056966.tar.gz
00:00:49.549 Response Code: HTTP/1.1 200 OK
00:00:49.554 Success: Status code 200 is in the accepted range: 200
00:00:49.560 [Pipeline] echo
00:00:49.561
00:00:49.561 Locking
00:00:49.561 Waited 0s for lock
00:00:49.561 Everything Fine.
Saved: /storage/packages/spdk_a84bf89b48208c8f0849441de018ed17a2056966.tar.gz
00:00:49.561
00:00:49.565 [Pipeline] sh
00:00:49.851 + git -C spdk log --oneline -n5
00:00:49.851 a84bf89b4 nvmf: consolidate listener addition in avahi_entry_group_add_listeners
00:00:49.851 719d03c6a sock/uring: only register net impl if supported
00:00:49.851 e64f085ad vbdev_lvol_ut: unify usage of dummy base bdev
00:00:49.851 9937c0160 lib/rdma: bind TRACE_BDEV_IO_START/DONE to OBJECT_NVMF_RDMA_IO
00:00:49.851 6c7c1f57e accel: add sequence outstanding stat
00:00:49.870 [Pipeline] setCustomBuildProperty
00:00:49.879 [Pipeline] setCustomBuildProperty
00:00:49.889 [Pipeline] catchError
00:00:49.891 [Pipeline] {
00:00:49.910 [Pipeline] sh
00:00:50.193 + git -C spdk describe --tags --abbrev=0 origin/master
00:00:50.210 [Pipeline] sh
00:00:50.495 + git -C spdk describe --tags --abbrev=0 --exclude=LTS HEAD
00:00:50.511 [Pipeline] echo
00:00:50.513 Branch: master
00:00:50.517 [Pipeline] fileExists
00:00:50.535 [Pipeline] readJSON
00:00:50.550 [Pipeline] }
00:00:50.577 [Pipeline] // catchError
00:00:50.589 [Pipeline] sh
00:00:50.874 + /var/jenkins/workspace/autotest-per-patch/jbp/jenkins/jjb-config/jobs/scripts/get-pkgdep-jobs.sh /var/jenkins/workspace/autotest-per-patch/spdk
00:00:50.893 [Pipeline] }
00:00:50.922 [Pipeline] // stage
00:00:50.942 [Pipeline] catchError
00:00:50.944 [Pipeline] {
00:00:50.966 [Pipeline] stage
00:00:50.968 [Pipeline] { (Pre tests)
00:00:51.008 [Pipeline] parallel
00:00:51.022 [Pipeline] { (Branch: check-format-docker-autotest)
00:00:51.024 [Pipeline] { (Branch: check-so-deps-docker-autotest)
00:00:51.025 [Pipeline] { (Branch: doc-docker-autotest)
00:00:51.027 [Pipeline] { (Branch: build-files-docker-autotest)
00:00:51.051 [Pipeline] retry
00:00:51.053 [Pipeline] {
00:00:51.057 [Pipeline] retry
00:00:51.059 [Pipeline] {
00:00:51.065 [Pipeline] retry
00:00:51.066 [Pipeline] {
00:00:51.071 [Pipeline] retry
00:00:51.073 [Pipeline] {
00:00:51.098 [Pipeline] build
00:00:51.101 Scheduling project: check-format-docker-autotest
00:00:51.109 [Pipeline] build
00:00:51.112 Scheduling project: check-so-deps-docker-autotest
00:00:51.120 [Pipeline] build
00:00:51.123 Scheduling project: doc-docker-autotest
00:00:51.131 [Pipeline] build
00:00:51.134 Scheduling project: build-files-docker-autotest
00:00:56.372 Starting building: doc-docker-autotest #26761
00:00:56.376 Starting building: check-format-docker-autotest #26567
00:00:56.380 Starting building: build-files-docker-autotest #26546
00:00:56.384 Starting building: check-so-deps-docker-autotest #26578
00:01:30.187 Build doc-docker-autotest #26761 completed: SUCCESS
00:01:30.191 [Pipeline] }
00:01:30.225 [Pipeline] // retry
00:01:30.230 [Pipeline] }
00:01:42.140 Build check-format-docker-autotest #26567 completed: FAILURE
00:01:42.162 [Pipeline] echo
00:01:42.164 No retry patterns found.
00:01:42.166 [Pipeline] }
00:01:42.201 [Pipeline] // retry
00:01:42.212 [Pipeline] error
00:01:42.218 [Pipeline] }
00:01:42.224 Failed in branch check-format-docker-autotest
00:03:20.673 Build build-files-docker-autotest #26546 completed: SUCCESS
00:03:20.679 [Pipeline] }
00:03:20.729 [Pipeline] // retry
00:03:20.734 [Pipeline] }
00:03:59.408 Build check-so-deps-docker-autotest #26578 completed: SUCCESS
00:03:59.410 [Pipeline] }
00:03:59.448 [Pipeline] // retry
00:03:59.454 [Pipeline] }
00:03:59.508 [Pipeline] // parallel
00:03:59.516 [Pipeline] }
00:03:59.548 [Pipeline] // stage
00:03:59.557 [Pipeline] }
00:03:59.562 ERROR: Build check-format-docker-autotest #26567 failed
00:03:59.562 Setting overall build result to FAILURE
00:03:59.590 [Pipeline] // catchError
00:03:59.599 [Pipeline] catchError
00:03:59.601 [Pipeline] {
00:03:59.622 [Pipeline] stage
00:03:59.624 [Pipeline] { (Tests)
00:03:59.645 [Pipeline] unstable
00:03:59.648 WARNING: Previous stages failed
00:03:59.649 [Pipeline] }
00:03:59.679 [Pipeline] // stage
00:03:59.685 [Pipeline] }
00:03:59.714 [Pipeline] // catchError
00:03:59.723 [Pipeline] stage
00:03:59.725 [Pipeline] { (Autorun Post and Coverage)
00:03:59.747 [Pipeline] setCustomBuildProperty
00:03:59.771 [Pipeline] dir
00:03:59.771 Running in /var/jenkins/workspace/autotest-per-patch/doc-docker-autotest_26761
00:03:59.773 [Pipeline] {
00:03:59.797 [Pipeline] copyArtifacts
00:04:00.167 Copied 5 artifacts from "doc-docker-autotest" build number 26761
00:04:00.172 [Pipeline] writeFile
00:04:00.196 [Pipeline] }
00:04:00.230 [Pipeline] // dir
00:04:00.249 [Pipeline] dir
00:04:00.249 Running in /var/jenkins/workspace/autotest-per-patch/check-format-docker-autotest_26567
00:04:00.251 [Pipeline] {
00:04:00.276 [Pipeline] copyArtifacts
00:04:00.324 Copied 4 artifacts from "check-format-docker-autotest" build number 26567
00:04:00.329 [Pipeline] writeFile
00:04:00.353 [Pipeline] }
00:04:00.385 [Pipeline] // dir
00:04:00.477 [Pipeline] dir
00:04:00.478 Running in /var/jenkins/workspace/autotest-per-patch/build-files-docker-autotest_26546
00:04:00.479 [Pipeline] {
00:04:00.502 [Pipeline] copyArtifacts
00:04:00.554 Copied 4 artifacts from "build-files-docker-autotest" build number 26546
00:04:00.559 [Pipeline] writeFile
00:04:00.607 [Pipeline] }
00:04:00.631 [Pipeline] // dir
00:04:00.671 [Pipeline] dir
00:04:00.671 Running in /var/jenkins/workspace/autotest-per-patch/check-so-deps-docker-autotest_26578
00:04:00.673 [Pipeline] {
00:04:00.692 [Pipeline] copyArtifacts
00:04:00.741 Copied 4 artifacts from "check-so-deps-docker-autotest" build number 26578
00:04:00.745 [Pipeline] writeFile
00:04:00.771 [Pipeline] }
00:04:00.807 [Pipeline] // dir
00:04:00.813 [Pipeline] catchError
00:04:00.817 [Pipeline] {
00:04:00.829 [Pipeline] sh
00:04:01.106 + jbp/jenkins/jjb-config/jobs/scripts/post_gen_coverage.sh
00:04:01.106 + shopt -s globstar nullglob
00:04:01.106 + echo 'Start stage post_gen_coverage.sh'
00:04:01.106 Start stage post_gen_coverage.sh
00:04:01.106 + cd /var/jenkins/workspace/autotest-per-patch
00:04:01.106 + rm -rf /var/jenkins/workspace/autotest-per-patch/spdk/doc
00:04:01.106 + trap 'compress_coverage_and_docs; remove_partial_coverage_files && echo '\''End stage post_gen_coverage.sh'\''' EXIT
00:04:01.106 + move_artifacts
00:04:01.106 + local out_dirs
00:04:01.106 + out_dirs=(./**/output/)
00:04:01.106 + for dir in "${out_dirs[@]}"
00:04:01.106 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:01.106 + [[ -f ./build-files-docker-autotest_26546/output//doc.tar.xz ]]
00:04:01.106 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:01.106 + [[ -f ./build-files-docker-autotest_26546/output//ut_coverage.tar.xz ]]
00:04:01.106 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:01.106 + [[ -f ./build-files-docker-autotest_26546/output//llvm.tar.xz ]]
00:04:01.106 + mv ./build-files-docker-autotest_26546/output//build-repo-manifest.txt ./build-files-docker-autotest_26546/output//power.tar.xz ./build-files-docker-autotest_26546/output//test_completions.txt ./build-files-docker-autotest_26546/output//timing.txt ./build-files-docker-autotest_26546/output//..
00:04:01.106 + rmdir ./build-files-docker-autotest_26546/output/
00:04:01.106 + for dir in "${out_dirs[@]}"
00:04:01.106 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:01.106 + [[ -f ./check-format-docker-autotest_26567/output//doc.tar.xz ]]
00:04:01.106 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:01.106 + [[ -f ./check-format-docker-autotest_26567/output//ut_coverage.tar.xz ]]
00:04:01.106 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:01.106 + [[ -f ./check-format-docker-autotest_26567/output//llvm.tar.xz ]]
00:04:01.106 + mv ./check-format-docker-autotest_26567/output//build-repo-manifest.txt ./check-format-docker-autotest_26567/output//power.tar.xz ./check-format-docker-autotest_26567/output//test_completions.txt ./check-format-docker-autotest_26567/output//timing.txt ./check-format-docker-autotest_26567/output//..
00:04:01.106 + rmdir ./check-format-docker-autotest_26567/output/
00:04:01.106 + for dir in "${out_dirs[@]}"
00:04:01.106 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:01.106 + [[ -f ./check-so-deps-docker-autotest_26578/output//doc.tar.xz ]]
00:04:01.106 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:01.106 + [[ -f ./check-so-deps-docker-autotest_26578/output//ut_coverage.tar.xz ]]
00:04:01.106 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:01.106 + [[ -f ./check-so-deps-docker-autotest_26578/output//llvm.tar.xz ]]
00:04:01.106 + mv ./check-so-deps-docker-autotest_26578/output//build-repo-manifest.txt ./check-so-deps-docker-autotest_26578/output//power.tar.xz ./check-so-deps-docker-autotest_26578/output//test_completions.txt ./check-so-deps-docker-autotest_26578/output//timing.txt ./check-so-deps-docker-autotest_26578/output//..
00:04:01.106 + rmdir ./check-so-deps-docker-autotest_26578/output/
00:04:01.106 + for dir in "${out_dirs[@]}"
00:04:01.106 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:01.106 + [[ -f ./doc-docker-autotest_26761/output//doc.tar.xz ]]
00:04:01.106 + tar -C ./doc-docker-autotest_26761/output/ -xf ./doc-docker-autotest_26761/output//doc.tar.xz
00:04:01.367 + rm ./doc-docker-autotest_26761/output//doc.tar.xz
00:04:01.367 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:01.367 + [[ -f ./doc-docker-autotest_26761/output//ut_coverage.tar.xz ]]
00:04:01.367 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:01.367 + [[ -f ./doc-docker-autotest_26761/output//llvm.tar.xz ]]
00:04:01.367 + mv ./doc-docker-autotest_26761/output//build-repo-manifest.txt ./doc-docker-autotest_26761/output//doc ./doc-docker-autotest_26761/output//power.tar.xz ./doc-docker-autotest_26761/output//test_completions.txt ./doc-docker-autotest_26761/output//timing.txt ./doc-docker-autotest_26761/output//..
00:04:01.367 + rmdir ./doc-docker-autotest_26761/output/
00:04:01.367 + unpack_cov_files
00:04:01.367 + local info_files
00:04:01.367 + info_files=(*/cov_*.info.xz)
00:04:01.367 + printf '%s\n'
00:04:01.367 + xargs -P0 -r -n1 xz -d
00:04:01.367 + fix_downstream_job_paths
00:04:01.367 + sed -i -e 's#^SF:/.\+/spdk/#SF:/var/jenkins/workspace/autotest-per-patch/spdk/#g'
00:04:01.367 sed: no input files
00:04:01.367 + compress_coverage_and_docs
00:04:01.367 + echo 'Start compress coverage and docs'
00:04:01.367 Start compress coverage and docs
00:04:01.367 + tar -C coverage -czf coverage_autotest-per-patch_126150.tar.gz ./ --remove-files
00:04:01.367 tar: coverage: Cannot open: No such file or directory
00:04:01.367 tar: Error is not recoverable: exiting now
00:04:01.383 [Pipeline] }
00:04:01.387 ERROR: script returned exit code 2
00:04:01.416 [Pipeline] // catchError
00:04:01.426 [Pipeline] catchError
00:04:01.428 [Pipeline] {
00:04:01.450 [Pipeline] dir
00:04:01.450 Running in /var/jenkins/workspace/autotest-per-patch/post_process
00:04:01.452 [Pipeline] {
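The "exit code 2" above comes from the last tar call, not from the earlier format-check failure: with nullglob set, the glob */cov_*.info.xz matched nothing (printf runs with no arguments and sed reports "no input files"), and since none of the upstream jobs in this run delivered ut_coverage.tar.xz or llvm.tar.xz, no coverage directory was ever created for tar -C coverage to enter. A minimal bash sketch of that failing step follows; the function body and the guard are reconstructions/assumptions based only on the trace, not the script as actually run, and the archive name is copied verbatim from the log (the real script presumably derives it from the job name and build number):

    # Sketch of compress_coverage_and_docs as implied by the trace above.
    compress_coverage_and_docs() {
        echo 'Start compress coverage and docs'
        # 'coverage' only exists when some upstream job shipped coverage archives;
        # in this run none did, so tar exits 2 ("coverage: Cannot open: No such file or directory").
        if [[ -d coverage ]]; then   # guard is an assumption, absent from the failing run
            tar -C coverage -czf coverage_autotest-per-patch_126150.tar.gz ./ --remove-files
        fi
    }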