Bug #49621
closedqa: ERROR: test_fragmented_injection (tasks.cephfs.test_data_scan.TestDataScan)
% Done:
0%
Source:
Development
Tags:
Backport:
Regression:
No
Severity:
3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite:
fs
Component(FS):
Labels (FS):
qa-failure
Pull request ID:
Crash signature (v1):
Crash signature (v2):
Description
When running the teuthology tests locally, tasks.cephfs.test_data_scan.TestDataScan failed:
ceph-fuse[2024651]: fuse finished with error 0 and tester_r 0
2021-03-05 16:54:17,689.689 INFO:tasks.cephfs.filesystem:Destroying file system cephfs and related pools
2021-03-05 16:54:26,055.055 INFO:__main__:Stopped test: test_fragmented_injection (tasks.cephfs.test_data_scan.TestDataScan) in 27.723641s
2021-03-05 16:54:26,056.056 INFO:__main__:test_fragmented_injection (tasks.cephfs.test_data_scan.TestDataScan) ... ERROR
2021-03-05 16:54:26,056.056 INFO:__main__:
2021-03-05 16:54:26,056.056 INFO:__main__:======================================================================
2021-03-05 16:54:26,056.056 INFO:__main__:ERROR: test_fragmented_injection (tasks.cephfs.test_data_scan.TestDataScan)
2021-03-05 16:54:26,057.057 INFO:__main__:----------------------------------------------------------------------
2021-03-05 16:54:26,057.057 INFO:__main__:Traceback (most recent call last):
2021-03-05 16:54:26,057.057 INFO:__main__:  File "/data/ceph/qa/tasks/cephfs/test_data_scan.py", line 457, in test_fragmented_injection
2021-03-05 16:54:26,057.057 INFO:__main__:    self.fs.mds_asok(["dirfrag", "split", "/subdir", "0/0", "1"], mds_id)
2021-03-05 16:54:26,057.057 INFO:__main__:  File "/data/ceph/qa/tasks/cephfs/filesystem.py", line 1122, in mds_asok
2021-03-05 16:54:26,057.057 INFO:__main__:    return self.json_asok(command, 'mds', mds_id, timeout=timeout)
2021-03-05 16:54:26,057.057 INFO:__main__:  File "/data/ceph/qa/tasks/cephfs/filesystem.py", line 246, in json_asok
2021-03-05 16:54:26,057.057 INFO:__main__:    proc = self.mon_manager.admin_socket(service_type, service_id, command, timeout=timeout)
2021-03-05 16:54:26,057.057 INFO:__main__:  File "../qa/tasks/vstart_runner.py", line 1012, in admin_socket
2021-03-05 16:54:26,057.057 INFO:__main__:    timeout=timeout, stdout=stdout)
2021-03-05 16:54:26,057.057 INFO:__main__:  File "../qa/tasks/vstart_runner.py", line 401, in run
2021-03-05 16:54:26,058.058 INFO:__main__:    return self._do_run(**kwargs)
2021-03-05 16:54:26,058.058 INFO:__main__:  File "../qa/tasks/vstart_runner.py", line 465, in _do_run
2021-03-05 16:54:26,058.058 INFO:__main__:    proc.wait()
2021-03-05 16:54:26,058.058 INFO:__main__:  File "../qa/tasks/vstart_runner.py", line 215, in wait
2021-03-05 16:54:26,058.058 INFO:__main__:    raise CommandFailedError(self.args, self.exitstatus)
2021-03-05 16:54:26,058.058 INFO:__main__:teuthology.exceptions.CommandFailedError: Command failed with status 22: ['./bin/ceph', 'daemon', 'mds', 'a', '--format=json', 'dirfrag', 'split', '/subdir', '0/0', '1']
2021-03-05 16:54:26,058.058 INFO:__main__:Cannot find device "ceph-brx"
2021-03-05 16:54:26,166.166 INFO:__main__:
2021-03-05 16:54:26,167.167 INFO:__main__:----------------------------------------------------------------------
2021-03-05 16:54:26,167.167 INFO:__main__:Ran 1 test in 27.725s
2021-03-05 16:54:26,167.167 INFO:__main__:
2021-03-05 16:54:26,168.168 INFO:__main__:FAILED (errors=1)
2021-03-05 16:54:26,168.168 INFO:__main__:
2021-03-05 16:54:26,168.168 INFO:__main__:
2021-03-05 16:54:26,168.168 INFO:__main__:======================================================================
2021-03-05 16:54:26,168.168 INFO:__main__:ERROR: test_fragmented_injection (tasks.cephfs.test_data_scan.TestDataScan)
2021-03-05 16:54:26,169.169 INFO:__main__:----------------------------------------------------------------------
2021-03-05 16:54:26,169.169 INFO:__main__:Traceback (most recent call last):
2021-03-05 16:54:26,169.169 INFO:__main__:  File "/data/ceph/qa/tasks/cephfs/test_data_scan.py", line 457, in test_fragmented_injection
2021-03-05 16:54:26,169.169 INFO:__main__:    self.fs.mds_asok(["dirfrag", "split", "/subdir", "0/0", "1"], mds_id)
2021-03-05 16:54:26,169.169 INFO:__main__:  File "/data/ceph/qa/tasks/cephfs/filesystem.py", line 1122, in mds_asok
2021-03-05 16:54:26,169.169 INFO:__main__:    return self.json_asok(command, 'mds', mds_id, timeout=timeout)
2021-03-05 16:54:26,169.169 INFO:__main__:  File "/data/ceph/qa/tasks/cephfs/filesystem.py", line 246, in json_asok
2021-03-05 16:54:26,170.170 INFO:__main__:    proc = self.mon_manager.admin_socket(service_type, service_id, command, timeout=timeout)
2021-03-05 16:54:26,170.170 INFO:__main__:  File "../qa/tasks/vstart_runner.py", line 1012, in admin_socket
2021-03-05 16:54:26,170.170 INFO:__main__:    timeout=timeout, stdout=stdout)
2021-03-05 16:54:26,170.170 INFO:__main__:  File "../qa/tasks/vstart_runner.py", line 401, in run
2021-03-05 16:54:26,170.170 INFO:__main__:    return self._do_run(**kwargs)
2021-03-05 16:54:26,170.170 INFO:__main__:  File "../qa/tasks/vstart_runner.py", line 465, in _do_run
2021-03-05 16:54:26,170.170 INFO:__main__:    proc.wait()
2021-03-05 16:54:26,171.171 INFO:__main__:  File "../qa/tasks/vstart_runner.py", line 215, in wait
2021-03-05 16:54:26,171.171 INFO:__main__:    raise CommandFailedError(self.args, self.exitstatus)
2021-03-05 16:54:26,171.171 INFO:__main__:teuthology.exceptions.CommandFailedError: Command failed with status 22: ['./bin/ceph', 'daemon', 'mds', 'a', '--format=json', 'dirfrag', 'split', '/subdir', '0/0', '1']
2021-03-05 16:54:26,171.171 INFO:__main__:
2021-03-05 16:54:26,172.172 INFO:__main__:
2021-03-05 16:54:26,172.172 INFO:__main__:
(venv) [root@lxbceph1 build]#
From https://github.com/ceph/ceph/pull/38684#issuecomment-791271513.
Updated by Xiubo Li about 3 years ago
- Assignee set to Xiubo Li
This only failed in the local test; I will work on it tomorrow.
Updated by Xiubo Li about 3 years ago
- Status changed from New to In Progress
- Priority changed from Normal to High
Updated by Xiubo Li about 3 years ago
- Status changed from In Progress to Fix Under Review
- Pull request ID set to 40174
Updated by Xiubo Li about 3 years ago
There is another error logged just ahead of the above call trace:
Can't get admin socket path: unable to get conf option admin_socket for mds: b"error parsing 'mds': expected string of the form TYPE.ID, valid types are: auth, mon, osd, mds, mgr, client\n"
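That parse error indicates the daemon name reached the config layer as the bare type "mds" rather than the combined TYPE.ID form (e.g. "mds.a") that the tooling expects; the failing command line above likewise shows 'mds' and 'a' as two separate arguments. A minimal sketch of composing the name correctly (the helper below is hypothetical, for illustration only; it is not the actual change in PR 40174):

```python
# Hypothetical helper: build the TYPE.ID daemon name expected by the
# admin-socket tooling, instead of passing type and id separately.
VALID_TYPES = {"auth", "mon", "osd", "mds", "mgr", "client"}

def daemon_name(service_type: str, service_id: str) -> str:
    """Return the combined name, e.g. daemon_name("mds", "a") -> "mds.a"."""
    if service_type not in VALID_TYPES:
        # Mirrors the "valid types are: ..." list in the error message above.
        raise ValueError("invalid service type: %r" % service_type)
    return "%s.%s" % (service_type, service_id)

# Passing 'mds' and 'a' as separate argv entries reproduces the failing
# command; the combined form is what the parser accepts:
args = ["./bin/ceph", "daemon", daemon_name("mds", "a"), "--format=json",
        "dirfrag", "split", "/subdir", "0/0", "1"]
```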
Updated by Patrick Donnelly about 3 years ago
- Status changed from Fix Under Review to Resolved
- Target version set to v17.0.0
- Source set to Development