Documentation #45383

closed

Cephadm.py OSD deployment fails: full device path or just the name?

Added by Georgios Kyratsas about 4 years ago. Updated over 3 years ago.

Status: Can't reproduce
Priority: Normal
Assignee: -
Category: cephadm
Target version:
% Done: 0%
Tags:
Backport:
Reviewed:
Affected Versions:
Pull request ID:

Description

OSD deployment via cephadm.py fails on my local teuthology server because ceph-volume fails to recognize the device. After reverting commit f026a1c it works fine for me, so could someone clarify whether the syntax with the short device name (vde instead of /dev/vde) is supposed to work or not?

2020-05-04T09:50:46.693 INFO:tasks.cephadm:Deploying osd.0 on target-geky-069 with /dev/vde...
2020-05-04T09:50:46.694 INFO:teuthology.orchestra.run.target-geky-069:> sudo cephadm --image registry.suse.de/devel/storage/7.0/cr/containers/ses/7/ceph/ceph shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ee0a3b12-8dea-11ea-9277-fa163e22acf0 -- ceph-volume lvm zap /dev/vde
2020-05-04T09:50:48.303 INFO:teuthology.orchestra.run.target-geky-069.stderr:--> Zapping: /dev/vde
2020-05-04T09:50:48.303 INFO:teuthology.orchestra.run.target-geky-069.stderr:--> --destroy was not specified, but zapping a whole device will remove the partition table
2020-05-04T09:50:48.304 INFO:teuthology.orchestra.run.target-geky-069.stderr:Running command: /usr/bin/dd if=/dev/zero of=/dev/vde bs=1M count=10 conv=fsync
2020-05-04T09:50:48.304 INFO:teuthology.orchestra.run.target-geky-069.stderr: stderr: 10+0 records in
2020-05-04T09:50:48.305 INFO:teuthology.orchestra.run.target-geky-069.stderr:10+0 records out
2020-05-04T09:50:48.306 INFO:teuthology.orchestra.run.target-geky-069.stderr: stderr: 10485760 bytes (10 MB, 10 MiB) copied, 0.0997165 s, 105 MB/s
2020-05-04T09:50:48.307 INFO:teuthology.orchestra.run.target-geky-069.stderr:--> Zapping successful for: <Raw Device: /dev/vde>
2020-05-04T09:50:48.437 INFO:teuthology.orchestra.run.target-geky-069:> sudo cephadm --image registry.suse.de/devel/storage/7.0/cr/containers/ses/7/ceph/ceph shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ee0a3b12-8dea-11ea-9277-fa163e22acf0 -- ceph orch daemon add osd target-geky-069:vde
2020-05-04T09:50:51.597 INFO:teuthology.orchestra.run.target-geky-069.stderr:Error EINVAL: Traceback (most recent call last):
2020-05-04T09:50:51.598 INFO:teuthology.orchestra.run.target-geky-069.stderr:  File "/usr/share/ceph/mgr/mgr_module.py", line 1153, in _handle_command
2020-05-04T09:50:51.598 INFO:teuthology.orchestra.run.target-geky-069.stderr:    return self.handle_command(inbuf, cmd)
2020-05-04T09:50:51.599 INFO:teuthology.orchestra.run.target-geky-069.stderr:  File "/usr/share/ceph/mgr/orchestrator/_interface.py", line 110, in handle_command
2020-05-04T09:50:51.599 INFO:teuthology.orchestra.run.target-geky-069.stderr:    return dispatch[cmd['prefix']].call(self, cmd, inbuf)
2020-05-04T09:50:51.600 INFO:teuthology.orchestra.run.target-geky-069.stderr:  File "/usr/share/ceph/mgr/mgr_module.py", line 308, in call
2020-05-04T09:50:51.600 INFO:teuthology.orchestra.run.target-geky-069.stderr:    return self.func(mgr, **kwargs)
2020-05-04T09:50:51.601 INFO:teuthology.orchestra.run.target-geky-069.stderr:  File "/usr/share/ceph/mgr/orchestrator/_interface.py", line 72, in <lambda>
2020-05-04T09:50:51.602 INFO:teuthology.orchestra.run.target-geky-069.stderr:    wrapper_copy = lambda *l_args, **l_kwargs: wrapper(*l_args, **l_kwargs)
2020-05-04T09:50:51.602 INFO:teuthology.orchestra.run.target-geky-069.stderr:  File "/usr/share/ceph/mgr/orchestrator/_interface.py", line 63, in wrapper
2020-05-04T09:50:51.603 INFO:teuthology.orchestra.run.target-geky-069.stderr:    return func(*args, **kwargs)
2020-05-04T09:50:51.603 INFO:teuthology.orchestra.run.target-geky-069.stderr:  File "/usr/share/ceph/mgr/orchestrator/module.py", line 597, in _daemon_add_osd
2020-05-04T09:50:51.604 INFO:teuthology.orchestra.run.target-geky-069.stderr:    completion = self.create_osds(drive_group)
2020-05-04T09:50:51.604 INFO:teuthology.orchestra.run.target-geky-069.stderr:  File "/usr/share/ceph/mgr/orchestrator/_interface.py", line 1542, in inner
2020-05-04T09:50:51.604 INFO:teuthology.orchestra.run.target-geky-069.stderr:    completion = self._oremote(method_name, args, kwargs)
2020-05-04T09:50:51.605 INFO:teuthology.orchestra.run.target-geky-069.stderr:  File "/usr/share/ceph/mgr/orchestrator/_interface.py", line 1614, in _oremote
2020-05-04T09:50:51.606 INFO:teuthology.orchestra.run.target-geky-069.stderr:    return mgr.remote(o, meth, *args, **kwargs)
2020-05-04T09:50:51.607 INFO:teuthology.orchestra.run.target-geky-069.stderr:  File "/usr/share/ceph/mgr/mgr_module.py", line 1515, in remote
2020-05-04T09:50:51.607 INFO:teuthology.orchestra.run.target-geky-069.stderr:    args, kwargs)
2020-05-04T09:50:51.607 INFO:teuthology.orchestra.run.target-geky-069.stderr:RuntimeError: Remote method threw exception: Traceback (most recent call last):
2020-05-04T09:50:51.608 INFO:teuthology.orchestra.run.target-geky-069.stderr:  File "/usr/share/ceph/mgr/cephadm/module.py", line 559, in wrapper
2020-05-04T09:50:51.608 INFO:teuthology.orchestra.run.target-geky-069.stderr:    return AsyncCompletion(value=f(*args, **kwargs), name=f.__name__)
2020-05-04T09:50:51.608 INFO:teuthology.orchestra.run.target-geky-069.stderr:  File "/usr/share/ceph/mgr/cephadm/module.py", line 2142, in create_osds
2020-05-04T09:50:51.609 INFO:teuthology.orchestra.run.target-geky-069.stderr:    replace_osd_ids=drive_group.osd_id_claims.get(host, []))
2020-05-04T09:50:51.609 INFO:teuthology.orchestra.run.target-geky-069.stderr:  File "/usr/share/ceph/mgr/cephadm/module.py", line 2248, in _create_osd
2020-05-04T09:50:51.609 INFO:teuthology.orchestra.run.target-geky-069.stderr:    code, '\n'.join(err)))
2020-05-04T09:50:51.610 INFO:teuthology.orchestra.run.target-geky-069.stderr:RuntimeError: cephadm exited with an error code: 1, stderr:INFO:cephadm:/usr/bin/podman:stderr  stderr: lsblk: vde: not a block device
2020-05-04T09:50:51.610 INFO:teuthology.orchestra.run.target-geky-069.stderr:INFO:cephadm:/usr/bin/podman:stderr  stderr: blkid: error: vde: No such file or directory
2020-05-04T09:50:51.611 INFO:teuthology.orchestra.run.target-geky-069.stderr:INFO:cephadm:/usr/bin/podman:stderr  stderr: Unknown device, --name=, --path=, or absolute path in /dev/ or /sys expected.

Traceback is:

Error EINVAL: Traceback (most recent call last):
  File "/usr/share/ceph/mgr/mgr_module.py", line 1153, in _handle_command
    return self.handle_command(inbuf, cmd)
  File "/usr/share/ceph/mgr/orchestrator/_interface.py", line 110, in handle_command
    return dispatch[cmd['prefix']].call(self, cmd, inbuf)
  File "/usr/share/ceph/mgr/mgr_module.py", line 308, in call
    return self.func(mgr, **kwargs)
  File "/usr/share/ceph/mgr/orchestrator/_interface.py", line 72, in <lambda>
    wrapper_copy = lambda *l_args, **l_kwargs: wrapper(*l_args, **l_kwargs)
  File "/usr/share/ceph/mgr/orchestrator/_interface.py", line 63, in wrapper
    return func(*args, **kwargs)
  File "/usr/share/ceph/mgr/orchestrator/module.py", line 597, in _daemon_add_osd
    completion = self.create_osds(drive_group)
  File "/usr/share/ceph/mgr/orchestrator/_interface.py", line 1542, in inner
    completion = self._oremote(method_name, args, kwargs)
  File "/usr/share/ceph/mgr/orchestrator/_interface.py", line 1614, in _oremote
    return mgr.remote(o, meth, *args, **kwargs)
  File "/usr/share/ceph/mgr/mgr_module.py", line 1515, in remote
    args, kwargs)
RuntimeError: Remote method threw exception: Traceback (most recent call last):
  File "/usr/share/ceph/mgr/cephadm/module.py", line 559, in wrapper
    return AsyncCompletion(value=f(*args, **kwargs), name=f.__name__)
  File "/usr/share/ceph/mgr/cephadm/module.py", line 2142, in create_osds
    replace_osd_ids=drive_group.osd_id_claims.get(host, []))
  File "/usr/share/ceph/mgr/cephadm/module.py", line 2248, in _create_osd
    code, '\n'.join(err)))
RuntimeError: cephadm exited with an error code: 1, stderr:INFO:cephadm:/usr/bin/podman:stderr  stderr: lsblk: vde: not a block device
