
Bug #40926

"Command failed (workunit test rados/test.sh)" in rados

Added by Yuri Weinstein over 4 years ago. Updated over 4 years ago.

Status: Duplicate
Priority: Urgent
Assignee: -
Category: -
Target version: -
% Done: 0%
Source: Q/A
Tags:
Backport:
Regression: No
Severity: 3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite: rados
Pull request ID:
Crash signature (v1):
Crash signature (v2):

Description

Run: http://pulpito.ceph.com/teuthology-2019-07-24_06:00:03-smoke-mimic-testing-basic-smithi/
Jobs: '4144267', '4144268'
Logs: http://qa-proxy.ceph.com/teuthology/teuthology-2019-07-24_06:00:03-smoke-mimic-testing-basic-smithi/4144267/teuthology.log

2019-07-24T09:32:29.138 INFO:tasks.workunit.client.0.smithi090.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/rados/test.sh: line 10:  9622 Terminated              bash -o pipefail -exc "ceph_test_rados_$f $color 2>&1 | tee ceph_test_rados_$ff.log | sed \"s/^/$r: /\"" 
2019-07-24T09:32:29.139 INFO:tasks.workunit.client.0.smithi090.stderr:++ true
2019-07-24T09:32:29.141 DEBUG:teuthology.orchestra.run:got remote process result: 124
2019-07-24T09:32:29.142 INFO:tasks.workunit:Stopping ['rados/test.sh'] on client.0...
2019-07-24T09:32:29.142 INFO:teuthology.orchestra.run.smithi090:Running:
2019-07-24T09:32:29.142 INFO:teuthology.orchestra.run.smithi090:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0
2019-07-24T09:32:29.538 ERROR:teuthology.run_tasks:Saw exception from tasks.
Traceback (most recent call last):
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/run_tasks.py", line 86, in run_tasks
    manager = run_one_task(taskname, ctx=ctx, config=config)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/run_tasks.py", line 65, in run_one_task
    return task(**kwargs)
  File "/home/teuthworker/src/git.ceph.com_ceph_mimic/qa/tasks/workunit.py", line 123, in task
    timeout=timeout,cleanup=cleanup)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/parallel.py", line 85, in __exit__
    for result in self:
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/parallel.py", line 99, in next
    resurrect_traceback(result)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/parallel.py", line 22, in capture_traceback
    return func(*args, **kwargs)
  File "/home/teuthworker/src/git.ceph.com_ceph_mimic/qa/tasks/workunit.py", line 409, in _run_tests
    label="workunit test {workunit}".format(workunit=workunit)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/orchestra/remote.py", line 205, in run
    r = self._runner(client=self.ssh, name=self.shortname, **kwargs)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/orchestra/run.py", line 437, in run
    r.wait()
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/orchestra/run.py", line 162, in wait
    self._raise_for_status()
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/orchestra/run.py", line 184, in _raise_for_status
    node=self.hostname, label=self.label
CommandFailedError: Command failed (workunit test rados/test.sh) on smithi090 with status 124: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=c1cd465c4f7a185c193fc8999a74b9ff428ac254 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rados/test.sh'
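Status 124 is the exit code GNU coreutils `timeout` returns when it has to kill the wrapped command — here `timeout 3h ... rados/test.sh` — which also explains the "Terminated" line for pid 9622. A minimal illustration (any long-running command works in place of `sleep`):

```shell
# coreutils `timeout` sends SIGTERM once the limit expires and then
# exits with status 124 -- the same status teuthology reports above.
timeout 1 sleep 5
echo "exit status: $?"   # prints "exit status: 124"
```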

Related issues

Duplicates RADOS - Bug #40765: mimic: "Command failed (workunit test rados/test.sh)" in smoke/master/mimic Duplicate 07/12/2019

History

#1 Updated by Yuri Weinstein over 4 years ago

Also seen in http://pulpito.ceph.com/teuthology-2019-07-25_02:25:02-upgrade:luminous-x-mimic-distro-basic-smithi/
'4146277', '4146270', '4146281', '4146286', '4146297', '4146293', '4146298', '4146301', '4146285', '4146274', '4146290', '4146282', '4146294', '4146289', '4146273', '4146278'

#2 Updated by Yuri Weinstein over 4 years ago

  • Priority changed from Normal to Urgent

#3 Updated by Brad Hubbard over 4 years ago

  • Status changed from New to Duplicate
2019-07-24T06:32:34.563 INFO:tasks.workunit.client.0.smithi090.stdout:                   api_io: [ RUN      ] LibRadosIoPP.RemoveTestPP

LibRadosIoPP.RemoveTestPP, which is part of ceph_test_rados_api_io, started running but produced no further output.

2019-07-24T06:32:29.131 INFO:tasks.workunit.client.0.smithi090.stderr:+ bash -o pipefail -exc 'ceph_test_rados_api_aio  2>&1 | tee ceph_test_rados_api_aio.log | sed "s/^/                  api_aio: /"'
2019-07-24T06:32:29.131 INFO:tasks.workunit.client.0.smithi090.stderr:++ printf %25s api_io
2019-07-24T06:32:29.131 INFO:tasks.workunit.client.0.smithi090.stderr:+ r='                   api_io'
2019-07-24T06:32:29.132 INFO:tasks.workunit.client.0.smithi090.stdout:test api_aio on pid 9614
2019-07-24T06:32:29.132 INFO:tasks.workunit.client.0.smithi090.stderr:++ echo api_io
2019-07-24T06:32:29.133 INFO:tasks.workunit.client.0.smithi090.stderr:++ awk '{print $1}'
2019-07-24T06:32:29.134 INFO:tasks.workunit.client.0.smithi090.stderr:+ ceph_test_rados_api_aio
2019-07-24T06:32:29.135 INFO:tasks.workunit.client.0.smithi090.stderr:+ tee ceph_test_rados_api_aio.log
2019-07-24T06:32:29.135 INFO:tasks.workunit.client.0.smithi090.stderr:+ sed 's/^/                  api_aio: /'
2019-07-24T06:32:29.138 INFO:tasks.workunit.client.0.smithi090.stderr:+ ff=api_io
2019-07-24T06:32:29.139 INFO:tasks.workunit.client.0.smithi090.stderr:+ pid=9622
2019-07-24T06:32:29.140 INFO:tasks.workunit.client.0.smithi090.stderr:+ echo 'test api_io on pid 9622'
2019-07-24T06:32:29.140 INFO:tasks.workunit.client.0.smithi090.stderr:+ pids[$f]=9622
2019-07-24T06:32:29.140 INFO:tasks.workunit.client.0.smithi090.stderr:+ for f in api_aio api_io api_asio api_list api_lock api_misc api_tier api_pool api_snapshots api_stat api_watch_notify api_cmd api_service api_c_write_operations api_c_read_operations list_parallel open_pools_parallel delete_pools_parallel watch_notify
2019-07-24T06:32:29.140 INFO:tasks.workunit.client.0.smithi090.stderr:+ '[' 1 -eq 1 ']'
2019-07-24T06:32:29.140 INFO:tasks.workunit.client.0.smithi090.stderr:+ bash -o pipefail -exc 'ceph_test_rados_api_io  2>&1 | tee ceph_test_rados_api_io.log | sed "s/^/                   api_io: /"'
2019-07-24T06:32:29.141 INFO:tasks.workunit.client.0.smithi090.stderr:++ printf %25s api_asio
2019-07-24T06:32:29.141 INFO:tasks.workunit.client.0.smithi090.stdout:test api_io on pid 9622

So we are looking for pid 9622.

2019-07-24T06:45:16.449 INFO:tasks.workunit.client.0.smithi090.stderr:+ pid=9622
2019-07-24T06:45:16.449 INFO:tasks.workunit.client.0.smithi090.stderr:+ wait 9622
2019-07-24T06:45:41.035 INFO:tasks.mon_thrash.mon_thrasher:killing mon.b
2019-07-24T06:45:41.035 INFO:tasks.mon_thrash.mon_thrasher:reviving mon.b
2019-07-24T06:45:41.036 INFO:tasks.ceph.mon.b:Restarting daemon
2019-07-24T06:45:41.036 INFO:teuthology.orchestra.run.smithi138:Running:
2019-07-24T06:45:41.036 INFO:teuthology.orchestra.run.smithi138:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-mon -f --cluster ceph -i b
2019-07-24T06:45:41.041 INFO:tasks.ceph.mon.b:Started

It's the last pid we are waiting for.
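The traced `+` lines are test.sh's fan-out loop: each ceph_test_rados_* binary runs in its own background `bash -o pipefail -exc` pipeline, its pid is recorded, and the script then waits on every pid — so a single hung test (9622 here) stalls the whole workunit until the outer 3h timeout fires. A simplified sketch of that pattern, with `sleep` standing in for the real test binaries:

```shell
#!/bin/bash
# Simplified fan-out/wait pattern from rados/test.sh; `sleep 0.1`
# stands in for the ceph_test_rados_* binaries (assumption: bash 4+
# for the associative pids array).
declare -A pids
for f in api_aio api_io api_list; do
  bash -o pipefail -exc "sleep 0.1" &   # one background pipeline per test
  pids[$f]=$!
  echo "test $f on pid ${pids[$f]}"
done

ret=0
for f in "${!pids[@]}"; do
  wait "${pids[$f]}" || ret=1   # a single hung test blocks here until killed
done
echo "ret=$ret"
```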

2019-07-24T09:32:29.138 INFO:tasks.workunit.client.0.smithi090.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/rados/test.sh: line 10:  9622 Terminated              bash -o pipefail -exc "ceph_test_rados_$f $color 2>&1 | tee ceph_test_rados_$ff.log | sed \"s/^/$r: /\"" 

The same pid appears in the "Terminated" line.

Looks like this is a duplicate of http://tracker.ceph.com/issues/40765

#4 Updated by Brad Hubbard over 4 years ago

  • Duplicates Bug #40765: mimic: "Command failed (workunit test rados/test.sh)" in smoke/master/mimic added
