Bug #17680

closed

ceph-disk reports "mount_activate: Failed to activate" in teuthology workunit (permissions issue)

Added by Nathan Cutler over 7 years ago. Updated over 7 years ago.

Status:
Can't reproduce
Priority:
Normal
Assignee:
Category:
-
Target version:
-
% Done:

0%

Source:
Community (dev)
Tags:
Backport:
Regression:
No
Severity:
3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite:
Pull request ID:
Crash signature (v1):
Crash signature (v2):

Description

I am developing the following ceph-qa-suite test which I am running in OVH: https://github.com/SUSE/ceph-qa-suite/blob/wip-16878/suites/rados/singleton-nomsgr/all/16878.yaml

As you can see, the test creates OSDs on two real disks, plus there is a third disk which I would like to use for external journals.
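
For orientation, a teuthology suite fragment along those lines might look like the sketch below. This is illustrative only, assuming the standard roles/openstack/tasks layout; the actual file is the linked 16878.yaml, and the volume sizes and role assignments here are assumptions, not copied from it.

```yaml
# Illustrative sketch of a singleton-nomsgr suite fragment -- not the real
# 16878.yaml. Three attached volumes: two become OSD data disks, the third
# is intended for external journals.
roles:
- [mon.a, osd.0, osd.1, client.0]
openstack:
- volumes:
    count: 3    # two OSD data disks + one journal disk (assumption)
    size: 10    # GB, illustrative
tasks:
- install:
- ceph:
- workunit:
    clients:
      client.0:
        - rados/16878.sh
```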

I am using the following workunit: https://github.com/SUSE/ceph/blob/wip-16878/qa/workunits/rados/16878.sh

The command "sudo ceph-disk --verbose activate /dev/vdb1" (near the end of the script) fails. So far, it has done so with two different error messages.

The first time it failed, I was running the commands from the workunit script manually and must have done something slightly different, because the error was http://paste2.org/1OP0JDt7, which I have been unable to reproduce since.

#1

Updated by Nathan Cutler over 7 years ago

  • Subject changed from ceph-disk reports "mount_activate: Failed to activate" in teuthology workunit to ceph-disk reports "mount_activate: Failed to activate" in teuthology workunit (permissions issue)
  • Description updated (diff)
#2

Updated by Nathan Cutler over 7 years ago

  • Status changed from New to Can't reproduce
