Bug #55804
Updated by Rishabh Dave almost 2 years ago
Teuthology run: https://pulpito.ceph.com/vshankar-2022-04-26_06:23:29-fs:workload-wip-45556-20220418-102656-testing-default-smithi/

These 4 jobs failed with the same error: 6806467, 6806495, 6806509, 6806523.

The failure can be reproduced easily. Mount a subvolume, get pjdfstest and then run -

<pre>
$ wget http://download.ceph.com/qa/pjd-fstest-20090130-RC-aclfixes.tgz
$ tar zxvf pjd*.tgz
</pre>

And then run -

<pre>
$ # /mnt/cephfs is the cephfs mountpoint
$ cd /mnt/cephfs && sudo cp -r ~/pjd-fstest-20090130-RC ./ && cd pjd-fstest-20090130-RC/ && sudo make clean && sudo make && cd ../ && sudo mkdir tmp && cd tmp
</pre>

And then run the failing test suites -

<pre>
$ sudo prove -r -v --exec 'bash -x' ../pjd*/tests/link/02.t
$ sudo prove -r -v --exec 'bash -x' ../pjd*/tests/link/03.t
</pre>

link/02.t output from the teuthology log -

<pre>
2022-04-26T06:59:11.148 INFO:tasks.workunit.client.0.smithi047.stdout:../pjd-fstest-20090130-RC/tests/link/02.t ......
2022-04-26T06:59:11.148 INFO:tasks.workunit.client.0.smithi047.stdout:1..10
2022-04-26T06:59:11.148 INFO:tasks.workunit.client.0.smithi047.stdout:ok 1
2022-04-26T06:59:11.148 INFO:tasks.workunit.client.0.smithi047.stdout:ok 2
2022-04-26T06:59:11.148 INFO:tasks.workunit.client.0.smithi047.stdout:ok 3
2022-04-26T06:59:11.148 INFO:tasks.workunit.client.0.smithi047.stdout:not ok 4
2022-04-26T06:59:11.148 INFO:tasks.workunit.client.0.smithi047.stdout:ok 5
2022-04-26T06:59:11.149 INFO:tasks.workunit.client.0.smithi047.stdout:not ok 6
2022-04-26T06:59:11.149 INFO:tasks.workunit.client.0.smithi047.stdout:ok 7
2022-04-26T06:59:11.149 INFO:tasks.workunit.client.0.smithi047.stdout:ok 8
2022-04-26T06:59:11.149 INFO:tasks.workunit.client.0.smithi047.stdout:ok 9
2022-04-26T06:59:11.149 INFO:tasks.workunit.client.0.smithi047.stdout:ok 10
2022-04-26T06:59:11.149 INFO:tasks.workunit.client.0.smithi047.stdout:Failed 2/10 subtests
</pre>

02.t test #4: @expect 0 link ${n0} ${name255}@
02.t test #6: @expect 0 unlink ${name255}@
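For context, the two failing subtests hard-link an existing file to a name of the maximum allowed component length (255 bytes) and then unlink it. A minimal standalone sketch of the same operations, using @ln(1)@/@rm(1)@ as stand-ins for the suite's @fstest@ helper; the name @testfile@ is illustrative, and the current directory is assumed to be the @tmp@ directory created above -

<pre>
$ n0=testfile
$ name255=$(printf 'x%.0s' {1..255})     # 255-character file name, as in pjdfstest
$ touch ${n0}
$ ln ${n0} ${name255}; echo "link: $?"   # 02.t test #4: expect 0
$ rm ${name255}; echo "unlink: $?"       # 02.t test #6: expect 0
$ rm ${n0}
</pre>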
link/03.t output from the teuthology log -

<pre>
2022-04-26T06:59:11.346 INFO:tasks.workunit.client.0.smithi047.stdout:../pjd-fstest-20090130-RC/tests/link/03.t ......
2022-04-26T06:59:11.346 INFO:tasks.workunit.client.0.smithi047.stdout:1..16
2022-04-26T06:59:11.346 INFO:tasks.workunit.client.0.smithi047.stdout:ok 1
2022-04-26T06:59:11.346 INFO:tasks.workunit.client.0.smithi047.stdout:ok 2
2022-04-26T06:59:11.346 INFO:tasks.workunit.client.0.smithi047.stdout:ok 3
2022-04-26T06:59:11.346 INFO:tasks.workunit.client.0.smithi047.stdout:ok 4
2022-04-26T06:59:11.346 INFO:tasks.workunit.client.0.smithi047.stdout:ok 5
2022-04-26T06:59:11.346 INFO:tasks.workunit.client.0.smithi047.stdout:ok 6
2022-04-26T06:59:11.346 INFO:tasks.workunit.client.0.smithi047.stdout:ok 7
2022-04-26T06:59:11.347 INFO:tasks.workunit.client.0.smithi047.stdout:not ok 8
2022-04-26T06:59:11.347 INFO:tasks.workunit.client.0.smithi047.stdout:not ok 9
2022-04-26T06:59:11.347 INFO:tasks.workunit.client.0.smithi047.stdout:ok 10
2022-04-26T06:59:11.347 INFO:tasks.workunit.client.0.smithi047.stdout:ok 11
2022-04-26T06:59:11.347 INFO:tasks.workunit.client.0.smithi047.stdout:ok 12
2022-04-26T06:59:11.347 INFO:tasks.workunit.client.0.smithi047.stdout:ok 13
2022-04-26T06:59:11.347 INFO:tasks.workunit.client.0.smithi047.stdout:ok 14
2022-04-26T06:59:11.347 INFO:tasks.workunit.client.0.smithi047.stdout:ok 15
2022-04-26T06:59:11.347 INFO:tasks.workunit.client.0.smithi047.stdout:ok 16
2022-04-26T06:59:11.347 INFO:tasks.workunit.client.0.smithi047.stdout:Failed 2/16 subtests
</pre>

03.t test #8: @expect 0 link ${n0} ${path1023}@
03.t test #9: @expect 0 unlink ${path1023}@
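The failing pair in link/03.t is the analogous case for the maximum total path length: @path1023@ in the suite is a path whose overall length is 1023 bytes. A rough standalone equivalent of subtests 8 and 9, with the same assumptions and illustrative names as the sketch above; one way to build such a path is four 255-character components joined by three slashes (4*255 + 3 = 1023 bytes) -

<pre>
$ n0=testfile
$ name255=$(printf 'x%.0s' {1..255})
$ mkdir -p ${name255}/${name255}/${name255}
$ path1023=${name255}/${name255}/${name255}/${name255}   # 1023 bytes in total
$ touch ${n0}
$ ln ${n0} ${path1023}; echo "link: $?"    # 03.t test #8: expect 0
$ rm ${path1023}; echo "unlink: $?"        # 03.t test #9: expect 0
</pre>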
Final lines of output of @pjd.sh@ -

<pre>
2022-04-26T06:59:51.015 INFO:tasks.workunit.client.0.smithi047.stdout:ok 40
2022-04-26T06:59:51.015 INFO:tasks.workunit.client.0.smithi047.stdout:ok 41
2022-04-26T06:59:51.015 INFO:tasks.workunit.client.0.smithi047.stdout:ok 42
2022-04-26T06:59:51.015 INFO:tasks.workunit.client.0.smithi047.stdout:ok
2022-04-26T06:59:51.015 INFO:tasks.workunit.client.0.smithi047.stdout:
2022-04-26T06:59:51.015 INFO:tasks.workunit.client.0.smithi047.stdout:Test Summary Report
2022-04-26T06:59:51.015 INFO:tasks.workunit.client.0.smithi047.stdout:-------------------
2022-04-26T06:59:51.015 INFO:tasks.workunit.client.0.smithi047.stdout:../pjd-fstest-20090130-RC/tests/link/02.t (Wstat: 0 Tests: 10 Failed: 2)
2022-04-26T06:59:51.015 INFO:tasks.workunit.client.0.smithi047.stdout:  Failed tests: 4, 6
2022-04-26T06:59:51.015 INFO:tasks.workunit.client.0.smithi047.stdout:../pjd-fstest-20090130-RC/tests/link/03.t (Wstat: 0 Tests: 16 Failed: 2)
2022-04-26T06:59:51.016 INFO:tasks.workunit.client.0.smithi047.stdout:  Failed tests: 8-9
2022-04-26T06:59:51.016 INFO:tasks.workunit.client.0.smithi047.stdout:Files=191, Tests=2286, 69 wallclock secs ( 1.30 usr 0.61 sys + 8.17 cusr 6.84 csys = 16.92 CPU)
2022-04-26T06:59:51.016 INFO:tasks.workunit.client.0.smithi047.stdout:Result: FAIL
2022-04-26T06:59:51.016 INFO:tasks.workunit:Stopping ['suites/pjd.sh'] on client.0...
2022-04-26T06:59:51.016 DEBUG:teuthology.orchestra.run.smithi047:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0
</pre>

Error -

<pre>
teuthology.exceptions.CommandFailedError: Command failed (workunit test suites/pjd.sh) on smithi047 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=1ccbc711b8876e630c0358e1d8d923daa34dca1e TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 6h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/pjd.sh'
</pre>

Traceback from the job 6806467 -

<pre>
Traceback (most recent call last):
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_788cfdd8098ad222aa448289edcfa4436091c32c/teuthology/run_tasks.py", line 103, in run_tasks
    manager = run_one_task(taskname, ctx=ctx, config=config)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_788cfdd8098ad222aa448289edcfa4436091c32c/teuthology/run_tasks.py", line 82, in run_one_task
    return task(**kwargs)
  File "/home/teuthworker/src/git.ceph.com_ceph-c_1ccbc711b8876e630c0358e1d8d923daa34dca1e/qa/tasks/workunit.py", line 148, in task
    cleanup=cleanup)
  File "/home/teuthworker/src/git.ceph.com_ceph-c_1ccbc711b8876e630c0358e1d8d923daa34dca1e/qa/tasks/workunit.py", line 298, in _spawn_on_all_clients
    timeout=timeout)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_788cfdd8098ad222aa448289edcfa4436091c32c/teuthology/parallel.py", line 84, in __exit__
    for result in self:
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_788cfdd8098ad222aa448289edcfa4436091c32c/teuthology/parallel.py", line 98, in __next__
    resurrect_traceback(result)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_788cfdd8098ad222aa448289edcfa4436091c32c/teuthology/parallel.py", line 30, in resurrect_traceback
    raise exc.exc_info[1]
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_788cfdd8098ad222aa448289edcfa4436091c32c/teuthology/parallel.py", line 23, in capture_traceback
    return func(*args, **kwargs)
  File "/home/teuthworker/src/git.ceph.com_ceph-c_1ccbc711b8876e630c0358e1d8d923daa34dca1e/qa/tasks/workunit.py", line 427, in _run_tests
    label="workunit test {workunit}".format(workunit=workunit)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_788cfdd8098ad222aa448289edcfa4436091c32c/teuthology/orchestra/remote.py", line 509, in run
    r = self._runner(client=self.ssh, name=self.shortname, **kwargs)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_788cfdd8098ad222aa448289edcfa4436091c32c/teuthology/orchestra/run.py", line 455, in run
    r.wait()
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_788cfdd8098ad222aa448289edcfa4436091c32c/teuthology/orchestra/run.py", line 161, in wait
    self._raise_for_status()
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_788cfdd8098ad222aa448289edcfa4436091c32c/teuthology/orchestra/run.py", line 183, in _raise_for_status
</pre>
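Since both failing pairs sit exactly at the length limits (a 255-byte component in 02.t, a 1023-byte total path in 03.t), one quick check worth running on an affected mount is what limits the client actually advertises via @pathconf()@; a sketch, assuming /mnt/cephfs is the mountpoint as in the reproduction steps above -

<pre>
$ getconf NAME_MAX /mnt/cephfs   # maximum filename component length reported for the mount
$ getconf PATH_MAX /mnt/cephfs   # maximum relative path length reported for the mount
</pre>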