Bug #6733
rgw readwrite test fails on next branch
Status: Closed
% Done:
0%
Source:
Q/A
Tags:
Backport:
Regression:
Severity:
3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite:
Pull request ID:
Crash signature (v1):
Crash signature (v2):
Description
logs: ubuntu@teuthology:/a/teuthology-2013-11-05_23:01:20-rgw-next-testing-basic-plana/86307
2013-11-06T06:10:54.506 INFO:teuthology.orchestra.run.out:[10.214.131.32]: key: uljmk
2013-11-06T06:10:54.506 INFO:teuthology.orchestra.run.out:[10.214.131.32]: start: 1383746754.140498
2013-11-06T06:10:54.506 INFO:teuthology.orchestra.run.out:[10.214.131.32]: type: r
2013-11-06T06:10:54.506 INFO:teuthology.orchestra.run.out:[10.214.131.32]: worker: 5
2013-11-06T06:10:54.556 INFO:teuthology.orchestra.run.err:[10.214.131.32]: Cleaning bucket <Bucket: rwtest> key <Key: rwtest,aifhycpbxoawmqnz>
2013-11-06T06:10:54.641 INFO:teuthology.orchestra.run.err:[10.214.131.32]: Cleaning bucket <Bucket: rwtest> key <Key: rwtest,bvwhzxouudc>
2013-11-06T06:10:54.697 INFO:teuthology.orchestra.run.err:[10.214.131.32]: Cleaning bucket <Bucket: rwtest> key <Key: rwtest,cmobvdxqktsrr>
2013-11-06T06:10:54.767 INFO:teuthology.orchestra.run.err:[10.214.131.32]: Cleaning bucket <Bucket: rwtest> key <Key: rwtest,diyoulxeinbzzi>
2013-11-06T06:10:54.825 INFO:teuthology.orchestra.run.err:[10.214.131.32]: Cleaning bucket <Bucket: rwtest> key <Key: rwtest,dlxtrrvpblpwmbjrcvrhveftc>
2013-11-06T06:10:54.881 INFO:teuthology.orchestra.run.err:[10.214.131.32]: Cleaning bucket <Bucket: rwtest> key <Key: rwtest,mswinbptnzb>
2013-11-06T06:10:54.955 INFO:teuthology.orchestra.run.err:[10.214.131.32]: Cleaning bucket <Bucket: rwtest> key <Key: rwtest,uljmk>
2013-11-06T06:10:55.492 INFO:teuthology.orchestra.run.err:[10.214.131.32]: Cleaning bucket <Bucket: rwtest> key <Key: rwtest,veokrsal>
2013-11-06T06:10:55.738 INFO:teuthology.orchestra.run.err:[10.214.131.32]: Cleaning bucket <Bucket: rwtest> key <Key: rwtest,xbrophcaduunqj>
2013-11-06T06:10:55.895 INFO:teuthology.orchestra.run.err:[10.214.131.32]: Cleaning bucket <Bucket: rwtest> key <Key: rwtest,ydtniveppsmswdn>
2013-11-06T06:10:56.397 INFO:teuthology.orchestra.run.err:[10.214.131.32]: Traceback (most recent call last):
2013-11-06T06:10:56.397 INFO:teuthology.orchestra.run.err:[10.214.131.32]:   File "/home/ubuntu/cephtest/s3-tests/virtualenv/bin/s3tests-test-readwrite", line 9, in <module>
2013-11-06T06:10:56.397 INFO:teuthology.orchestra.run.err:[10.214.131.32]:     load_entry_point('s3tests==0.0.1', 'console_scripts', 's3tests-test-readwrite')()
2013-11-06T06:10:56.397 INFO:teuthology.orchestra.run.err:[10.214.131.32]:   File "/home/ubuntu/cephtest/s3-tests/s3tests/readwrite.py", line 255, in main
2013-11-06T06:10:56.398 INFO:teuthology.orchestra.run.err:[10.214.131.32]:     trace=temp_dict['error']['traceback'])
2013-11-06T06:10:56.398 INFO:teuthology.orchestra.run.err:[10.214.131.32]: Exception: exception:
2013-11-06T06:10:56.398 INFO:teuthology.orchestra.run.err:[10.214.131.32]: md5sum check failed
2013-11-06T06:10:56.398 INFO:teuthology.orchestra.run.err:[10.214.131.32]: None
2013-11-06T06:10:56.398 INFO:teuthology.orchestra.run.err:[10.214.131.32]:
2013-11-06T06:10:56.398 INFO:teuthology.orchestra.run.err:[10.214.131.32]: Exception KeyError: KeyError(17360688,) in <module 'threading' from '/usr/lib/python2.7/threading.pyc'> ignored
2013-11-06T06:10:57.740 ERROR:teuthology.contextutil:Saw exception from nested tasks
Traceback (most recent call last):
  File "/home/teuthworker/teuthology-next/teuthology/contextutil.py", line 25, in nested
    vars.append(enter())
  File "/usr/lib/python2.7/contextlib.py", line 17, in __enter__
    return self.gen.next()
  File "/home/teuthworker/teuthology-next/teuthology/task/s3readwrite.py", line 202, in run_tests
    stdin=conf,
  File "/home/teuthworker/teuthology-next/teuthology/orchestra/cluster.py", line 54, in run
    return [remote.run(**kwargs) for remote in remotes]
  File "/home/teuthworker/teuthology-next/teuthology/orchestra/remote.py", line 47, in run
    r = self._runner(client=self.ssh, **kwargs)
  File "/home/teuthworker/teuthology-next/teuthology/orchestra/run.py", line 271, in run
    r.exitstatus = _check_status(r.exitstatus)
  File "/home/teuthworker/teuthology-next/teuthology/orchestra/run.py", line 267, in _check_status
    raise CommandFailedError(command=r.command, exitstatus=status, node=host)
CommandFailedError: Command failed on 10.214.131.32 with status 1: '/home/ubuntu/cephtest/s3-tests/virtualenv/bin/s3tests-test-readwrite'

ubuntu@teuthology:/a/teuthology-2013-11-05_23:01:20-rgw-next-testing-basic-plana/86307$ cat config.yaml
archive_path: /var/lib/teuthworker/archive/teuthology-2013-11-05_23:01:20-rgw-next-testing-basic-plana/86307
description: rgw/multifs/{clusters/fixed-2.yaml fs/ext4.yaml tasks/rgw_readwrite.yaml}
email: null
job_id: '86307'
kernel: &id001
  kdb: true
  sha1: b3efdaef9e1ac82e95ade43861f625b732dab06d
last_in_suite: false
machine_type: plana
name: teuthology-2013-11-05_23:01:20-rgw-next-testing-basic-plana
nuke-on-error: true
os_type: ubuntu
overrides:
  admin_socket:
    branch: next
  ceph:
    conf:
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
      osd:
        debug ms: 1
        debug osd: 5
    fs: ext4
    log-whitelist:
    - slow request
    sha1: a3ccd29716af900be265b6f995eb4069b334c516
  ceph-deploy:
    branch:
      dev: next
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon:
        debug mon: 1
        debug ms: 20
        debug paxos: 20
  install:
    ceph:
      sha1: a3ccd29716af900be265b6f995eb4069b334c516
  s3tests:
    branch: next
  workunit:
    sha1: a3ccd29716af900be265b6f995eb4069b334c516
owner: scheduled_teuthology@teuthology
roles:
- - mon.a
  - mon.c
  - osd.0
  - osd.1
  - osd.2
  - client.0
- - mon.b
  - mds.a
  - osd.3
  - osd.4
  - osd.5
  - client.1
targets:
  ubuntu@plana08.front.sepia.ceph.com: ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDhIfaWljFSsc6hEqxUeZz9VAzYSEeJR5CjJOR+w7v35e+rLUTEQzFT1+F+8SqLexRCXa+bAV4pdMAZdrUD/a30HQ459hFMA9RU7nigOsJUb7rAb3rAzT/znSYjjN12H/jRkIg8i35uqwyZTePTC0giTnQwaQi4SJL5H+78TL6cSmC4uWsKLAvTuv15lKU5pFg+ZlVLmB4OxJIM3UUp7EQRjNx8bRhzX+SbuQIySZNXuMiAIz9kDmh79MV6i5If4DVuYq64lR//kWKKammQi24BQKLD5cck0uIvyENqLX1Ci0GwyebHDXW62iYFhuSqkf4bisWNydrALe5fFQBIo6kp
  ubuntu@plana12.front.sepia.ceph.com: ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCix3Z4OLu6zqp3TdoSLdxhEGW6XYvwNk2nAAZ1pkRlHcfT20Vrk/5XcU4uqt67zMDAxyl35J7yUkYrskb/ufhEnsFWLrLG7fgR5TuTBS+bkmIbPfSG414V65zGYTlGN86EtpHlE/KCX+oCihNn6TEUW6nZi2mavyzyLNWXwCh7asHarU0joFT1de33yUX471YejCOg+Kd+kCF/9k1b9eLKaYE1TvBFEFV1uiimtDBMWxydKFlUFpP3AnS6UraTIcoxlpJV8aUqmhFzNXjL24DekK5cjG1uw9J7hgGNhvXUSCZnmkXnrjCldQ7IqzY3EDcZqx2u9oopzrhu07X0+uOH
tasks:
- internal.lock_machines:
  - 2
  - plana
- internal.save_config: null
- internal.check_lock: null
- internal.connect: null
- internal.check_conflict: null
- internal.check_ceph_data: null
- internal.vm_setup: null
- kernel: *id001
- internal.base: null
- internal.archive: null
- internal.coredump: null
- internal.sudo: null
- internal.syslog: null
- internal.timer: null
- chef: null
- clock.check: null
- install: null
- ceph: null
- rgw:
  - client.0
- s3readwrite:
    client.0:
      readwrite:
        bucket: rwtest
        duration: 300
        files:
          num: 10
          size: 2000
          stddev: 500
        readers: 10
        writers: 3
      rgw_server: client.0
teuthology_branch: next
verbose: true

ubuntu@teuthology:/a/teuthology-2013-11-05_23:01:20-rgw-next-testing-basic-plana/86307$ cat summary.yaml
description: rgw/multifs/{clusters/fixed-2.yaml fs/ext4.yaml tasks/rgw_readwrite.yaml}
duration: 536.0474450588226
failure_reason: 'Command failed on 10.214.131.32 with status 1: ''/home/ubuntu/cephtest/s3-tests/virtualenv/bin/s3tests-test-readwrite'''
flavor: basic
mon.a-kernel-sha1: b3efdaef9e1ac82e95ade43861f625b732dab06d
mon.b-kernel-sha1: b3efdaef9e1ac82e95ade43861f625b732dab06d
owner: scheduled_teuthology@teuthology
sentry_events:
- http://sentry.ceph.com/inktank/teuthology/search?q=fe19c782d74143c9b6ce14e416ef4d22
success: false
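The underlying failure is s3tests reporting "md5sum check failed": a reader fetched an object whose checksum did not match what the writer recorded. A minimal sketch of that style of round-trip verification (hypothetical helper names, not the actual readwrite.py code):

```python
import hashlib

def md5_hex(data):
    """Hex MD5 digest of a byte string."""
    return hashlib.md5(data).hexdigest()

def verify_download(expected_digest, downloaded_bytes):
    """Re-hash the downloaded bytes and compare against the digest
    the writer recorded at upload time; raise on mismatch, the way
    the readwrite test surfaces corruption."""
    actual = md5_hex(downloaded_bytes)
    if actual != expected_digest:
        raise Exception('md5sum check failed')
    return actual

# A matching round-trip passes; corrupted data raises.
digest = md5_hex(b'payload')
verify_download(digest, b'payload')
```

In the failing run, 3 writers and 10 readers hammer the `rwtest` bucket concurrently for 300 seconds (per the config.yaml above), so a single corrupted or truncated GET is enough to abort the whole test with exit status 1.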
Updated by Tamilarasi muthamizhan over 10 years ago
- Project changed from Ceph to rgw
Updated by Tamilarasi muthamizhan over 10 years ago
- Priority changed from Urgent to Normal
Updated by Tamilarasi muthamizhan over 10 years ago
It seems to be some kind of miscommunication between radosgw and Apache; most likely Apache is going down.
Updated by Yehuda Sadeh about 10 years ago
- Status changed from New to Closed
There's a good chance that this was #7030. Closing.