00:00:00.000 Started by user sys_sgci
00:00:00.006 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/autotest-per-patch_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-upstream/autotest.groovy
00:00:00.007 The recommended git tool is: git
00:00:00.007 using credential 00000000-0000-0000-0000-000000000002
00:00:00.009 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/autotest-per-patch_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.020 Fetching changes from the remote Git repository
00:00:00.021 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.032 Using shallow fetch with depth 1
00:00:00.032 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.032 > git --version # timeout=10
00:00:00.042 > git --version # 'git version 2.39.2'
00:00:00.042 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.053 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.053 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.117 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.131 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.142 Checking out Revision 4b79378c7834917407ff4d2cff4edf1dcbb13c5f (FETCH_HEAD)
00:00:02.142 > git config core.sparsecheckout # timeout=10
00:00:02.152 > git read-tree -mu HEAD # timeout=10
00:00:02.170 > git checkout -f 4b79378c7834917407ff4d2cff4edf1dcbb13c5f # timeout=5
00:00:02.191 Commit message: "jbp-per-patch: add create-perf-report job as a part of testing"
00:00:02.191 > git rev-list --no-walk 4b79378c7834917407ff4d2cff4edf1dcbb13c5f # timeout=10
00:00:02.352 [Pipeline] Start of Pipeline
00:00:02.366 [Pipeline] library
00:00:02.367 Loading library shm_lib@master
00:00:02.367 Library shm_lib@master is cached. Copying from home.
00:00:02.394 [Pipeline] node
00:00:02.402 Running on ME1 in /var/jenkins/workspace/autotest-per-patch
00:00:02.406 [Pipeline] {
00:00:02.423 [Pipeline] cleanWs
00:00:02.433 [WS-CLEANUP] Deleting project workspace...
00:00:02.433 [WS-CLEANUP] Deferred wipeout is used...
00:00:02.439 [WS-CLEANUP] done
00:00:02.443 [Pipeline] stage
00:00:02.447 [Pipeline] { (Prologue)
00:00:02.565 [Pipeline] withCredentials
00:00:02.577 > git --version # timeout=10
00:00:02.589 > git --version # 'git version 2.39.2'
00:00:02.609 Masking supported pattern matches of $GIT_USERNAME or $GIT_PASSWORD or $GIT_ASKPASS
00:00:02.611 [Pipeline] {
00:00:02.618 [Pipeline] retry
00:00:02.620 [Pipeline] {
00:00:02.838 [Pipeline] sh
00:00:03.118 + git ls-remote https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master
00:00:05.682 [Pipeline] }
00:00:05.705 [Pipeline] // retry
00:00:05.710 [Pipeline] }
00:00:05.732 [Pipeline] // withCredentials
00:00:05.743 [Pipeline] httpRequest
00:00:05.761 [Pipeline] echo
00:00:05.763 Sorcerer 10.211.164.101 is alive
00:00:05.772 [Pipeline] httpRequest
00:00:05.778 HttpMethod: GET
00:00:05.779 URL: http://10.211.164.101/packages/jbp_4b79378c7834917407ff4d2cff4edf1dcbb13c5f.tar.gz
00:00:05.780 Sending request to url: http://10.211.164.101/packages/jbp_4b79378c7834917407ff4d2cff4edf1dcbb13c5f.tar.gz
00:00:05.781 Response Code: HTTP/1.1 200 OK
00:00:05.781 Success: Status code 200 is in the accepted range: 200,404
00:00:05.782 Saving response body to /var/jenkins/workspace/autotest-per-patch/jbp_4b79378c7834917407ff4d2cff4edf1dcbb13c5f.tar.gz
00:00:05.926 [Pipeline] sh
00:00:06.213 + tar --no-same-owner -xf jbp_4b79378c7834917407ff4d2cff4edf1dcbb13c5f.tar.gz
00:00:06.236 [Pipeline] httpRequest
00:00:06.256 [Pipeline] echo
00:00:06.258 Sorcerer 10.211.164.101 is alive
00:00:06.268 [Pipeline] httpRequest
00:00:06.273 HttpMethod: GET
00:00:06.274 URL: http://10.211.164.101/packages/spdk_cbf927de0a78e36053e56c16aea8b45914962249.tar.gz
00:00:06.274 Sending request to url: http://10.211.164.101/packages/spdk_cbf927de0a78e36053e56c16aea8b45914962249.tar.gz
00:00:06.276 Response Code: HTTP/1.1 404 Not Found
00:00:06.277 Success: Status code 404 is in the accepted range: 200,404
00:00:06.277 Saving response body to /var/jenkins/workspace/autotest-per-patch/spdk_cbf927de0a78e36053e56c16aea8b45914962249.tar.gz
00:00:06.287 [Pipeline] sh
00:00:06.573 + rm -f spdk_cbf927de0a78e36053e56c16aea8b45914962249.tar.gz
00:00:06.590 [Pipeline] retry
00:00:06.592 [Pipeline] {
00:00:06.615 [Pipeline] checkout
00:00:06.624 The recommended git tool is: NONE
00:00:06.638 using credential 00000000-0000-0000-0000-000000000002
00:00:06.645 Cloning the remote Git repository
00:00:06.649 Honoring refspec on initial clone
00:00:06.651 Cloning repository https://review.spdk.io/gerrit/a/spdk/spdk
00:00:06.651 > git init /var/jenkins/workspace/autotest-per-patch/spdk # timeout=10
00:00:06.656 Using reference repository: /var/ci_repos/spdk_multi
00:00:06.657 Fetching upstream changes from https://review.spdk.io/gerrit/a/spdk/spdk
00:00:06.657 > git --version # timeout=10
00:00:06.658 > git --version # 'git version 2.42.0'
00:00:06.658 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:06.660 Setting http proxy: proxy-dmz.intel.com:911
00:00:06.660 > git fetch --tags --force --progress -- https://review.spdk.io/gerrit/a/spdk/spdk refs/changes/53/24153/4 +refs/heads/master:refs/remotes/origin/master # timeout=10
00:00:11.578 Avoid second fetch
00:00:11.588 Checking out Revision cbf927de0a78e36053e56c16aea8b45914962249 (FETCH_HEAD)
00:00:11.764 Commit message: "bdev/nvme: populate socket_id"
00:00:11.566 > git config remote.origin.url https://review.spdk.io/gerrit/a/spdk/spdk # timeout=10
00:00:11.568 > git config --add remote.origin.fetch refs/changes/53/24153/4 # timeout=10
00:00:11.569 > git config --add remote.origin.fetch +refs/heads/master:refs/remotes/origin/master # timeout=10
00:00:11.577 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:11.582 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:11.587 > git config core.sparsecheckout # timeout=10
00:00:11.588 > git checkout -f cbf927de0a78e36053e56c16aea8b45914962249 # timeout=10
00:00:11.763 > git rev-list --no-walk b82a65c315f5b39ca2d4e6834c0366b69d384e6d # timeout=10
00:00:11.775 > git remote # timeout=10
00:00:11.777 > git submodule init # timeout=10
00:00:11.801 > git submodule sync # timeout=10
00:00:11.824 > git config --get remote.origin.url # timeout=10
00:00:11.828 > git submodule init # timeout=10
00:00:11.850 > git config -f .gitmodules --get-regexp ^submodule\.(.+)\.url # timeout=10
00:00:11.851 > git config --get submodule.dpdk.url # timeout=10
00:00:11.853 > git remote # timeout=10
00:00:11.854 > git config --get remote.origin.url # timeout=10
00:00:11.856 > git config -f .gitmodules --get submodule.dpdk.path # timeout=10
00:00:11.858 > git config --get submodule.intel-ipsec-mb.url # timeout=10
00:00:11.859 > git remote # timeout=10
00:00:11.861 > git config --get remote.origin.url # timeout=10
00:00:11.862 > git config -f .gitmodules --get submodule.intel-ipsec-mb.path # timeout=10
00:00:11.864 > git config --get submodule.isa-l.url # timeout=10
00:00:11.865 > git remote # timeout=10
00:00:11.867 > git config --get remote.origin.url # timeout=10
00:00:11.868 > git config -f .gitmodules --get submodule.isa-l.path # timeout=10
00:00:11.870 > git config --get submodule.ocf.url # timeout=10
00:00:11.871 > git remote # timeout=10
00:00:11.873 > git config --get remote.origin.url # timeout=10
00:00:11.874 > git config -f .gitmodules --get submodule.ocf.path # timeout=10
00:00:11.875 > git config --get submodule.libvfio-user.url # timeout=10
00:00:11.877 > git remote # timeout=10
00:00:11.878 > git config --get remote.origin.url # timeout=10
00:00:11.880 > git config -f .gitmodules --get submodule.libvfio-user.path # timeout=10
00:00:11.881 > git config --get submodule.xnvme.url # timeout=10
00:00:11.883 > git remote # timeout=10
00:00:11.884 > git config --get remote.origin.url # timeout=10
00:00:11.886 > git config -f .gitmodules --get submodule.xnvme.path # timeout=10
00:00:11.887 > git config --get submodule.isa-l-crypto.url # timeout=10
00:00:11.889 > git remote # timeout=10
00:00:11.890 > git config --get remote.origin.url # timeout=10
00:00:11.892 > git config -f .gitmodules --get submodule.isa-l-crypto.path # timeout=10
00:00:11.895 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:11.895 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:11.895 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:11.896 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:11.896 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:11.896 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:11.896 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:11.897 Setting http proxy: proxy-dmz.intel.com:911
00:00:11.897 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi isa-l-crypto # timeout=10
00:00:11.898 Setting http proxy: proxy-dmz.intel.com:911
00:00:11.898 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi intel-ipsec-mb # timeout=10
00:00:11.899 Setting http proxy: proxy-dmz.intel.com:911
00:00:11.899 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi xnvme # timeout=10
00:00:11.899 Setting http proxy: proxy-dmz.intel.com:911
00:00:11.899 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi libvfio-user # timeout=10
00:00:11.899 Setting http proxy: proxy-dmz.intel.com:911
00:00:11.899 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi isa-l # timeout=10
00:00:11.903 Setting http proxy: proxy-dmz.intel.com:911
00:00:11.903 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi dpdk # timeout=10
00:00:11.926 Setting http proxy: proxy-dmz.intel.com:911
00:00:11.926 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi ocf # timeout=10
00:00:44.791 [Pipeline] dir
00:00:44.791 Running in /var/jenkins/workspace/autotest-per-patch/spdk
00:00:44.792 [Pipeline] {
00:00:44.805 [Pipeline] sh
00:00:45.084 ++ nproc
00:00:45.084 + threads=4
00:00:45.084 + git repack -a -d --threads=4
00:00:49.259 + git submodule foreach git repack -a -d --threads=4
00:00:49.260 Entering 'dpdk'
00:00:52.536 Entering 'intel-ipsec-mb'
00:00:52.536 Entering 'isa-l'
00:00:52.536 Entering 'isa-l-crypto'
00:00:52.536 Entering 'libvfio-user'
00:00:52.793 Entering 'ocf'
00:00:52.793 Entering 'xnvme'
00:00:53.049 + find .git -type f -name alternates -print -delete
00:00:53.049 .git/objects/info/alternates
00:00:53.049 .git/modules/xnvme/objects/info/alternates
00:00:53.049 .git/modules/dpdk/objects/info/alternates
00:00:53.049 .git/modules/ocf/objects/info/alternates
00:00:53.049 .git/modules/isa-l-crypto/objects/info/alternates
00:00:53.049 .git/modules/libvfio-user/objects/info/alternates
00:00:53.049 .git/modules/isa-l/objects/info/alternates
00:00:53.049 .git/modules/intel-ipsec-mb/objects/info/alternates
00:00:53.060 [Pipeline] }
00:00:53.082 [Pipeline] // dir
00:00:53.087 [Pipeline] }
00:00:53.107 [Pipeline] // retry
00:00:53.115 [Pipeline] sh
00:00:53.387 + hash pigz
00:00:53.387 + tar -cf spdk_cbf927de0a78e36053e56c16aea8b45914962249.tar.gz -I pigz spdk
00:00:55.922 [Pipeline] httpRequest
00:00:55.929 HttpMethod: PUT
00:00:55.930 URL: http://10.211.164.101/cgi-bin/sorcerer.py?group=packages&filename=spdk_cbf927de0a78e36053e56c16aea8b45914962249.tar.gz
00:00:55.931 Sending request to url: http://10.211.164.101/cgi-bin/sorcerer.py?group=packages&filename=spdk_cbf927de0a78e36053e56c16aea8b45914962249.tar.gz
00:00:59.128 Response Code: HTTP/1.1 200 OK
00:00:59.134 Success: Status code 200 is in the accepted range: 200
00:00:59.138 [Pipeline] echo
00:00:59.139 
00:00:59.139 Locking
00:00:59.139 Waited 0s for lock
00:00:59.139 Everything Fine.
Saved: /storage/packages/spdk_cbf927de0a78e36053e56c16aea8b45914962249.tar.gz
00:00:59.139 
00:00:59.142 [Pipeline] sh
00:00:59.421 + git -C spdk log --oneline -n5
00:00:59.421 cbf927de0 bdev/nvme: populate socket_id
00:00:59.421 b82a65c31 bdev: add socket_id to spdk_bdev
00:00:59.421 963b39395 fio/nvme: use socket_id when allocating io buffers
00:00:59.421 7537c9118 spdk_nvme_perf: allocate buffers from socket_id reported by ctrlr
00:00:59.421 9bf834626 nvme/pcie: allocate cq from device-local numa node's memory
00:00:59.439 [Pipeline] setCustomBuildProperty
00:00:59.447 [Pipeline] setCustomBuildProperty
00:00:59.456 [Pipeline] catchError
00:00:59.458 [Pipeline] {
00:00:59.477 [Pipeline] sh
00:00:59.760 + git -C spdk describe --tags --abbrev=0 origin/master
00:00:59.774 [Pipeline] sh
00:01:00.057 + git -C spdk describe --tags --abbrev=0 --exclude=LTS HEAD
00:01:00.072 [Pipeline] echo
00:01:00.074 Branch: master
00:01:00.077 [Pipeline] fileExists
00:01:00.096 [Pipeline] readJSON
00:01:00.114 [Pipeline] }
00:01:00.140 [Pipeline] // catchError
00:01:00.148 [Pipeline] sh
00:01:00.427 + /var/jenkins/workspace/autotest-per-patch/jbp/jenkins/jjb-config/jobs/scripts/get-pkgdep-jobs.sh /var/jenkins/workspace/autotest-per-patch/spdk
00:01:00.445 [Pipeline] }
00:01:00.471 [Pipeline] // stage
00:01:00.490 [Pipeline] catchError
00:01:00.491 [Pipeline] {
00:01:00.513 [Pipeline] stage
00:01:00.515 [Pipeline] { (Pre tests)
00:01:00.552 [Pipeline] parallel
00:01:00.564 [Pipeline] { (Branch: check-format-docker-autotest)
00:01:00.565 [Pipeline] { (Branch: check-so-deps-docker-autotest)
00:01:00.567 [Pipeline] { (Branch: doc-docker-autotest)
00:01:00.568 [Pipeline] { (Branch: build-files-docker-autotest)
00:01:00.591 [Pipeline] retry
00:01:00.593 [Pipeline] {
00:01:00.597 [Pipeline] retry
00:01:00.599 [Pipeline] {
00:01:00.604 [Pipeline] retry
00:01:00.605 [Pipeline] {
00:01:00.610 [Pipeline] retry
00:01:00.612 [Pipeline] {
00:01:00.635 [Pipeline] build
00:01:00.638 Scheduling project: check-format-docker-autotest
00:01:00.645 [Pipeline] build
00:01:00.648 Scheduling project: check-so-deps-docker-autotest
00:01:00.655 [Pipeline] build
00:01:00.658 Scheduling project: doc-docker-autotest
00:01:00.665 [Pipeline] build
00:01:00.668 Scheduling project: build-files-docker-autotest
00:01:06.582 Starting building: check-format-docker-autotest #26499
00:01:06.586 Starting building: doc-docker-autotest #26693
00:01:06.589 Starting building: check-so-deps-docker-autotest #26503
00:01:06.592 Starting building: build-files-docker-autotest #26478
00:01:49.843 Build doc-docker-autotest #26693 completed: SUCCESS
00:01:49.844 [Pipeline] }
00:01:49.866 [Pipeline] // retry
00:01:49.871 [Pipeline] }
00:02:04.692 Build check-format-docker-autotest #26499 completed: FAILURE
00:02:04.714 [Pipeline] echo
00:02:04.716 No retry patterns found.
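[Editor's note] "No retry patterns found." above indicates that the console output of the failed check-format-docker-autotest #26499 build matched none of the pipeline's known-transient error patterns, so the sub-build is not rescheduled and the failure propagates. The retry decision itself is made in the shared Groovy pipeline library and is not shown in this log; the shell sketch below only illustrates the general idea of pattern-based retry classification, and both file names are hypothetical.

    #!/usr/bin/env bash
    # Illustrative only: classify a failed sub-build as retryable or not by
    # grepping its console text against a list of known-transient patterns.
    console_log=$1                          # e.g. saved console text of the failed job (hypothetical)
    patterns_file=${2:-retry_patterns.txt}  # one extended regex per line (hypothetical)

    if grep -E -q -f "$patterns_file" "$console_log"; then
        echo "Retry pattern matched - scheduling a retry."
        exit 0
    fi

    echo "No retry patterns found."
    exit 1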
00:02:04.718 [Pipeline] }
00:02:04.751 [Pipeline] // retry
00:02:04.760 [Pipeline] error
00:02:04.766 [Pipeline] }
00:02:04.772 Failed in branch check-format-docker-autotest
00:03:48.196 Build build-files-docker-autotest #26478 completed: SUCCESS
00:03:48.198 [Pipeline] }
00:03:48.234 [Pipeline] // retry
00:03:48.240 [Pipeline] }
00:04:09.320 Build check-so-deps-docker-autotest #26503 completed: SUCCESS
00:04:09.322 [Pipeline] }
00:04:09.360 [Pipeline] // retry
00:04:09.367 [Pipeline] }
00:04:09.428 [Pipeline] // parallel
00:04:09.438 [Pipeline] }
00:04:09.471 [Pipeline] // stage
00:04:09.481 [Pipeline] }
00:04:09.486 ERROR: Build check-format-docker-autotest #26499 failed
00:04:09.486 Setting overall build result to FAILURE
00:04:09.516 [Pipeline] // catchError
00:04:09.527 [Pipeline] catchError
00:04:09.529 [Pipeline] {
00:04:09.551 [Pipeline] stage
00:04:09.553 [Pipeline] { (Tests)
00:04:09.576 [Pipeline] unstable
00:04:09.579 WARNING: Previous stages failed
00:04:09.581 [Pipeline] }
00:04:09.612 [Pipeline] // stage
00:04:09.619 [Pipeline] }
00:04:09.648 [Pipeline] // catchError
00:04:09.661 [Pipeline] stage
00:04:09.663 [Pipeline] { (Autorun Post and Coverage)
00:04:09.686 [Pipeline] setCustomBuildProperty
00:04:09.711 [Pipeline] dir
00:04:09.712 Running in /var/jenkins/workspace/autotest-per-patch/doc-docker-autotest_26693
00:04:09.713 [Pipeline] {
00:04:09.739 [Pipeline] copyArtifacts
00:04:09.967 Copied 5 artifacts from "doc-docker-autotest" build number 26693
00:04:09.972 [Pipeline] writeFile
00:04:09.996 [Pipeline] }
00:04:10.028 [Pipeline] // dir
00:04:10.047 [Pipeline] dir
00:04:10.048 Running in /var/jenkins/workspace/autotest-per-patch/check-format-docker-autotest_26499
00:04:10.050 [Pipeline] {
00:04:10.076 [Pipeline] copyArtifacts
00:04:10.114 Copied 4 artifacts from "check-format-docker-autotest" build number 26499
00:04:10.119 [Pipeline] writeFile
00:04:10.144 [Pipeline] }
00:04:10.184 [Pipeline] // dir
00:04:10.277 [Pipeline] dir
00:04:10.278 Running in /var/jenkins/workspace/autotest-per-patch/build-files-docker-autotest_26478
00:04:10.279 [Pipeline] {
00:04:10.302 [Pipeline] copyArtifacts
00:04:10.342 Copied 4 artifacts from "build-files-docker-autotest" build number 26478
00:04:10.346 [Pipeline] writeFile
00:04:10.386 [Pipeline] }
00:04:10.423 [Pipeline] // dir
00:04:10.483 [Pipeline] dir
00:04:10.483 Running in /var/jenkins/workspace/autotest-per-patch/check-so-deps-docker-autotest_26503
00:04:10.484 [Pipeline] {
00:04:10.507 [Pipeline] copyArtifacts
00:04:10.549 Copied 4 artifacts from "check-so-deps-docker-autotest" build number 26503
00:04:10.553 [Pipeline] writeFile
00:04:10.576 [Pipeline] }
00:04:10.606 [Pipeline] // dir
00:04:10.615 [Pipeline] catchError
00:04:10.617 [Pipeline] {
00:04:10.634 [Pipeline] sh
00:04:10.912 + jbp/jenkins/jjb-config/jobs/scripts/post_gen_coverage.sh
00:04:10.912 + shopt -s globstar nullglob
00:04:10.912 + echo 'Start stage post_gen_coverage.sh'
00:04:10.912 Start stage post_gen_coverage.sh
00:04:10.912 + cd /var/jenkins/workspace/autotest-per-patch
00:04:10.912 + rm -rf /var/jenkins/workspace/autotest-per-patch/spdk/doc
00:04:10.912 + trap 'compress_coverage_and_docs; remove_partial_coverage_files && echo '\''End stage post_gen_coverage.sh'\''' EXIT
00:04:10.912 + move_artifacts
00:04:10.912 + local out_dirs
00:04:10.912 + out_dirs=(./**/output/)
00:04:10.912 + for dir in "${out_dirs[@]}"
00:04:10.912 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:10.912 + [[ -f ./build-files-docker-autotest_26478/output//doc.tar.xz ]]
00:04:10.912 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:10.912 + [[ -f ./build-files-docker-autotest_26478/output//ut_coverage.tar.xz ]]
00:04:10.912 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:10.912 + [[ -f ./build-files-docker-autotest_26478/output//llvm.tar.xz ]]
00:04:10.912 + mv ./build-files-docker-autotest_26478/output//build-repo-manifest.txt ./build-files-docker-autotest_26478/output//power.tar.xz ./build-files-docker-autotest_26478/output//test_completions.txt ./build-files-docker-autotest_26478/output//timing.txt ./build-files-docker-autotest_26478/output//..
00:04:10.912 + rmdir ./build-files-docker-autotest_26478/output/
00:04:10.912 + for dir in "${out_dirs[@]}"
00:04:10.912 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:10.912 + [[ -f ./check-format-docker-autotest_26499/output//doc.tar.xz ]]
00:04:10.912 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:10.912 + [[ -f ./check-format-docker-autotest_26499/output//ut_coverage.tar.xz ]]
00:04:10.912 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:10.912 + [[ -f ./check-format-docker-autotest_26499/output//llvm.tar.xz ]]
00:04:10.912 + mv ./check-format-docker-autotest_26499/output//build-repo-manifest.txt ./check-format-docker-autotest_26499/output//power.tar.xz ./check-format-docker-autotest_26499/output//test_completions.txt ./check-format-docker-autotest_26499/output//timing.txt ./check-format-docker-autotest_26499/output//..
00:04:10.912 + rmdir ./check-format-docker-autotest_26499/output/
00:04:10.912 + for dir in "${out_dirs[@]}"
00:04:10.912 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:10.912 + [[ -f ./check-so-deps-docker-autotest_26503/output//doc.tar.xz ]]
00:04:10.912 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:10.912 + [[ -f ./check-so-deps-docker-autotest_26503/output//ut_coverage.tar.xz ]]
00:04:10.912 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:10.912 + [[ -f ./check-so-deps-docker-autotest_26503/output//llvm.tar.xz ]]
00:04:10.912 + mv ./check-so-deps-docker-autotest_26503/output//build-repo-manifest.txt ./check-so-deps-docker-autotest_26503/output//power.tar.xz ./check-so-deps-docker-autotest_26503/output//test_completions.txt ./check-so-deps-docker-autotest_26503/output//timing.txt ./check-so-deps-docker-autotest_26503/output//..
00:04:10.912 + rmdir ./check-so-deps-docker-autotest_26503/output/
00:04:10.912 + for dir in "${out_dirs[@]}"
00:04:10.912 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:10.912 + [[ -f ./doc-docker-autotest_26693/output//doc.tar.xz ]]
00:04:10.912 + tar -C ./doc-docker-autotest_26693/output/ -xf ./doc-docker-autotest_26693/output//doc.tar.xz
00:04:11.170 + rm ./doc-docker-autotest_26693/output//doc.tar.xz
00:04:11.170 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:11.170 + [[ -f ./doc-docker-autotest_26693/output//ut_coverage.tar.xz ]]
00:04:11.170 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:11.170 + [[ -f ./doc-docker-autotest_26693/output//llvm.tar.xz ]]
00:04:11.170 + mv ./doc-docker-autotest_26693/output//build-repo-manifest.txt ./doc-docker-autotest_26693/output//doc ./doc-docker-autotest_26693/output//power.tar.xz ./doc-docker-autotest_26693/output//test_completions.txt ./doc-docker-autotest_26693/output//timing.txt ./doc-docker-autotest_26693/output//..
00:04:11.170 + rmdir ./doc-docker-autotest_26693/output/
00:04:11.170 + unpack_cov_files
00:04:11.170 + local info_files
00:04:11.170 + info_files=(*/cov_*.info.xz)
00:04:11.170 + printf '%s\n'
00:04:11.170 + xargs -P0 -r -n1 xz -d
00:04:11.170 + fix_downstream_job_paths
00:04:11.170 + sed -i -e 's#^SF:/.\+/spdk/#SF:/var/jenkins/workspace/autotest-per-patch/spdk/#g'
00:04:11.170 sed: no input files
00:04:11.170 + compress_coverage_and_docs
00:04:11.170 + echo 'Start compress coverage and docs'
00:04:11.170 Start compress coverage and docs
00:04:11.170 + tar -C coverage -czf coverage_autotest-per-patch_126098.tar.gz ./ --remove-files
00:04:11.170 tar: coverage: Cannot open: No such file or directory
00:04:11.170 tar: Error is not recoverable: exiting now
00:04:11.194 [Pipeline] }
00:04:11.200 ERROR: script returned exit code 2
00:04:11.241 [Pipeline] // catchError
00:04:11.254 [Pipeline] catchError
00:04:11.256 [Pipeline] {
00:04:11.280 [Pipeline] dir
00:04:11.281 Running in /var/jenkins/workspace/autotest-per-patch/post_process
00:04:11.283 [Pipeline] {
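[Editor's note] The traced post_gen_coverage.sh run above ends with "script returned exit code 2" because only the doc, format, so-deps and build-files sub-jobs ran: none of them produced ut_coverage.tar.xz or llvm.tar.xz, so no cov_*.info.xz files were unpacked, no coverage/ directory was created, and the unconditional tar in compress_coverage_and_docs failed. A minimal sketch of a guarded compression step is shown below; it is not the real script, and the JOB_NAME/BUILD_NUMBER fallbacks are assumptions.

    #!/usr/bin/env bash
    # Minimal sketch, assuming the same function name seen in the trace above:
    # only archive coverage data when a coverage/ directory actually exists.
    compress_coverage_and_docs() {
        echo 'Start compress coverage and docs'
        if [[ -d coverage ]]; then
            tar -C coverage -czf "coverage_${JOB_NAME:-job}_${BUILD_NUMBER:-0}.tar.gz" ./ --remove-files
        else
            # Nothing to compress when no sub-job produced coverage archives.
            echo 'No coverage directory found, skipping coverage archive'
        fi
    }

    compress_coverage_and_docs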