Bug #13431
Errors in "rbd-fio-test" (Closed)
Description
Run: http://pulpito.ceph.com/teuthology-2015-10-05_23:00:07-rbd-master---basic-multi/
Job: 1091063
Logs: http://qa-proxy.ceph.com/teuthology/teuthology-2015-10-05_23:00:07-rbd-master---basic-multi/1091063/teuthology.log
2015-10-08T11:41:30.067 INFO:teuthology.orchestra.run.plana54:Running: 'rm -rf ~/rbd-fio-test'
2015-10-08T11:41:30.246 INFO:tasks.rbd_fio:Uninstall librbd devel package on plana54
2015-10-08T11:41:30.247 INFO:teuthology.orchestra.run.plana54:Running: 'sudo apt-get -y remove librbd-dev'
2015-10-08T11:41:30.287 INFO:teuthology.orchestra.run.plana54.stdout:Reading package lists...
2015-10-08T11:41:30.671 INFO:teuthology.orchestra.run.plana54.stdout:Building dependency tree...
2015-10-08T11:41:30.674 INFO:teuthology.orchestra.run.plana54.stdout:Reading state information...
2015-10-08T11:41:30.941 INFO:teuthology.orchestra.run.plana54.stdout:The following package was automatically installed and is no longer required:
2015-10-08T11:41:30.941 INFO:teuthology.orchestra.run.plana54.stdout:  librados-dev
2015-10-08T11:41:30.941 INFO:teuthology.orchestra.run.plana54.stdout:Use 'apt-get autoremove' to remove it.
2015-10-08T11:41:30.973 INFO:teuthology.orchestra.run.plana54.stdout:The following packages will be REMOVED:
2015-10-08T11:41:30.973 INFO:teuthology.orchestra.run.plana54.stdout:  librbd-dev
2015-10-08T11:41:31.263 INFO:teuthology.orchestra.run.plana54.stdout:0 upgraded, 0 newly installed, 1 to remove and 276 not upgraded.
2015-10-08T11:41:31.263 INFO:teuthology.orchestra.run.plana54.stdout:After this operation, 370 MB disk space will be freed.
2015-10-08T11:41:31.489 INFO:teuthology.orchestra.run.plana54.stdout:(Reading database ... 104787 files and directories currently installed.)
2015-10-08T11:41:31.494 INFO:teuthology.orchestra.run.plana54.stdout:Removing librbd-dev (9.0.3-2146-gd1e6976-1trusty) ...
2015-10-08T11:41:33.178 ERROR:teuthology.parallel:Exception in parallel execution
Traceback (most recent call last):
  File "/home/teuthworker/src/teuthology_master/teuthology/parallel.py", line 82, in __exit__
    for result in self:
  File "/home/teuthworker/src/teuthology_master/teuthology/parallel.py", line 101, in next
    resurrect_traceback(result)
  File "/home/teuthworker/src/teuthology_master/teuthology/parallel.py", line 19, in capture_traceback
    return func(*args, **kwargs)
  File "/var/lib/teuthworker/src/ceph-qa-suite_master/tasks/rbd_fio.py", line 186, in run_fio
    remote.run(args=['mkdir', run.Raw('~/rbd-fio-test'),])
  File "/home/teuthworker/src/teuthology_master/teuthology/orchestra/remote.py", line 156, in run
    r = self._runner(client=self.ssh, name=self.shortname, **kwargs)
  File "/home/teuthworker/src/teuthology_master/teuthology/orchestra/run.py", line 378, in run
    r.wait()
  File "/home/teuthworker/src/teuthology_master/teuthology/orchestra/run.py", line 114, in wait
    label=self.label)
CommandFailedError: Command failed on plana54 with status 1: 'mkdir ~/rbd-fio-test'
2015-10-08T11:41:33.194 ERROR:teuthology.run_tasks:Saw exception from tasks.
Traceback (most recent call last):
  File "/home/teuthworker/src/teuthology_master/teuthology/run_tasks.py", line 56, in run_tasks
    manager.__enter__()
  File "/usr/lib/python2.7/contextlib.py", line 17, in __enter__
    return self.gen.next()
  File "/var/lib/teuthworker/src/ceph-qa-suite_master/tasks/rbd_fio.py", line 60, in task
    p.spawn(run_fio, remote, config[client_config])
  File "/home/teuthworker/src/teuthology_master/teuthology/parallel.py", line 82, in __exit__
    for result in self:
  File "/home/teuthworker/src/teuthology_master/teuthology/parallel.py", line 101, in next
    resurrect_traceback(result)
  File "/home/teuthworker/src/teuthology_master/teuthology/parallel.py", line 19, in capture_traceback
    return func(*args, **kwargs)
  File "/var/lib/teuthworker/src/ceph-qa-suite_master/tasks/rbd_fio.py", line 186, in run_fio
    remote.run(args=['mkdir', run.Raw('~/rbd-fio-test'),])
  File "/home/teuthworker/src/teuthology_master/teuthology/orchestra/remote.py", line 156, in run
    r = self._runner(client=self.ssh, name=self.shortname, **kwargs)
  File "/home/teuthworker/src/teuthology_master/teuthology/orchestra/run.py", line 378, in run
    r.wait()
  File "/home/teuthworker/src/teuthology_master/teuthology/orchestra/run.py", line 114, in wait
    label=self.label)
CommandFailedError: Command failed on plana54 with status 1: 'mkdir ~/rbd-fio-test'
2015-10-08T11:41:33.235 ERROR:teuthology.run_tasks: Sentry event: http://sentry.ceph.com/sepia/teuthology/search?q=0bad53fe44b4404fbbc1b91985e838a4
CommandFailedError: Command failed on plana54 with status 1: 'mkdir ~/rbd-fio-test'
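The traceback ends in 'mkdir ~/rbd-fio-test' exiting with status 1. A minimal sketch of the most likely cause (a stale directory left behind by a previously aborted run): plain mkdir fails if its target already exists, while mkdir -p does not. The paths below are temporary stand-ins, not the real test node layout.

```python
import os
import subprocess
import tempfile

# Stand-in for ~/rbd-fio-test on the test node (hypothetical path).
base = tempfile.mkdtemp()
workdir = os.path.join(base, "rbd-fio-test")

# First run: the directory does not exist yet, so mkdir succeeds.
first = subprocess.run(["mkdir", workdir]).returncode

# A later run that skipped cleanup finds a stale directory:
# plain mkdir exits with status 1, matching the log above.
second = subprocess.run(["mkdir", workdir],
                        stderr=subprocess.DEVNULL).returncode

# mkdir -p tolerates an existing directory, so it is idempotent.
third = subprocess.run(["mkdir", "-p", workdir]).returncode

print(first, second, third)  # 0 1 0
```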
Updated by Vasu Kulkarni over 8 years ago
- Project changed from Ceph to teuthology
- Assignee set to Vasu Kulkarni
Not sure why mkdir failed; I logged into the failed node and don't see any stale directory. Will keep an eye on future runs.
Updated by Yuri Weinstein over 8 years ago
Run: http://pulpito.ceph.com/teuthology-2015-10-09_23:00:08-rbd-master---basic-multi/
Jobs: ['1098256', '1098363', '1098421']
Updated by Jason Dillaman over 8 years ago
Is there a rationale for using sudo? You should perform all work within the directory specified via the $TESTDIR environment variable.
Updated by Vasu Kulkarni over 8 years ago
@yuri, the issues you pasted are not related to mkdir; they are due to fio ending abruptly. We have a similar issue where there is a crash in mutex.cc; those issues can probably be revisited after that is fixed.
@Jason Dillaman:
The reason I am using sudo for rbd map and some operations is that, I think, on RHEL one of the commands fails without sudo. I raised an issue on that earlier; if we would rather let the test fail because of it, I can remove the sudo.
$TESTDIR, I think, is mostly useful for qa workunit scripts, since there we use environment variables to communicate between distinct processes. Here the script knows the folders are created in the home dir, and the finally: block cleans them up at the end of the test.
Updated by Vasu Kulkarni over 8 years ago
I see the issue: if the run fails abruptly, it never enters the cleanup block, leaving stale dirs. I will move the test dir inside ~/cephtest so that nuke will handle cleanup.
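A sketch of the fix described above, with hypothetical names (run_fio_in_testdir and the testdir argument are illustrations, not the actual rbd_fio.py change): put the work directory under the harness's per-run test dir, which plays the role of ~/cephtest, so an external cleanup such as teuthology's nuke reclaims it even when the task aborts before reaching its own finally block; and make creation idempotent so stale dirs from earlier aborted runs no longer break mkdir.

```python
import os
import shutil
import tempfile

def run_fio_in_testdir(testdir, body):
    """Run body(workdir) inside a work dir under the harness test dir.

    Hypothetical helper: `testdir` stands in for ~/cephtest. Because the
    work dir lives under it, an external cleanup of the whole test dir
    removes it even if this function never reaches its finally block.
    """
    workdir = os.path.join(testdir, "rbd-fio-test")
    # exist_ok makes creation idempotent across aborted runs.
    os.makedirs(workdir, exist_ok=True)
    try:
        return body(workdir)
    finally:
        shutil.rmtree(workdir, ignore_errors=True)

# Usage sketch: a throwaway dir stands in for ~/cephtest.
testdir = tempfile.mkdtemp()
run_fio_in_testdir(testdir, lambda d: open(os.path.join(d, "fio.log"), "w").close())
print(os.path.exists(os.path.join(testdir, "rbd-fio-test")))  # False
```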
Updated by Vasu Kulkarni over 8 years ago
- Status changed from New to Fix Under Review
Updated by Vasu Kulkarni over 8 years ago
- Status changed from Fix Under Review to Closed