Bug #63998 (open)

Containers have too old lsblk. Partitions aren't reported correctly

Added by Matti Saarinen 4 months ago. Updated 4 months ago.

Status: New
Priority: Normal
Assignee: -
Category: container tools
Target version: -
% Done: 0%
Source:
Tags:
Backport:
Regression: No
Severity: 3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite:
Pull request ID:
Crash signature (v1):
Crash signature (v2):

Description

If OSD nodes have logical drives and I want Ceph to use partitions of those drives, it doesn't work. The lsblk that ships in the container doesn't report the partitions as partitions, so ceph-volume ignores them. Could lsblk be upgraded to a version that reports the partitions correctly? Also, if there is a workaround, I would be very interested in learning it.

Here is an example:

# grep PRETTY /etc/os-release 
PRETTY_NAME="Red Hat Enterprise Linux 9.3 (Plow)" 

# cephadm version
cephadm version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)

# cephadm --image d2cdd87030d1 inspect-image                            
{
    "ceph_version": "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)",
    "image_id": "d2cdd87030d19393788f48f06e1a453e100160debff5a32c6b537a2b2e450e8e",
    "repo_digests": [
        "quay.io/ceph/ceph@sha256:5e15da7c3173070a55d3cafdf76075fbd4e49f348f1efec53235290f85659c6e" 
    ]
}

# lsblk | grep md
└─md0             9:0    0   1.5T  0 raid1
  ├─md0p1       259:4    0     1T  0 part
  └─md0p2       259:5    0 466.3G  0 part  
└─md0             9:0    0   1.5T  0 raid1
  ├─md0p1       259:4    0     1T  0 part
  └─md0p2       259:5    0 466.3G  0 part

# cephadm shell
Using ceph image with id 'd2cdd87030d1' and tag '<none>' created on 2023-12-15 19:31:03 +0000 UTC
quay.io/ceph/ceph@sha256:5e15da7c3173070a55d3cafdf76075fbd4e49f348f1efec53235290f85659c6e

# lsblk -V
lsblk from util-linux 2.32.1

# lsblk | grep md
`-md0             9:0    0   1.5T  0 raid1 
  |-md0p1       259:4    0     1T  0 md
  `-md0p2       259:5    0 466.3G  0 md
`-md0             9:0    0   1.5T  0 raid1
  |-md0p1       259:4    0     1T  0 md    
  `-md0p2       259:5    0 466.3G  0 md
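
For what it's worth, ceph-volume presumably keys off the TYPE field that lsblk reports when deciding whether a device is a partition (that is my assumption; I have not traced the exact code path). Restricting the output to that column makes the discrepancy easy to see. On the host (util-linux 2.37.4) it looks roughly like:

# lsblk -o NAME,TYPE /dev/md0
NAME      TYPE
md0       raid1
├─md0p1   part
└─md0p2   part

while inside cephadm shell (util-linux 2.32.1) the same command reports the partitions as "md":

# lsblk -o NAME,TYPE /dev/md0
NAME      TYPE
md0       raid1
|-md0p1   md
`-md0p2   md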

#1 - Updated by Matti Saarinen 4 months ago

One essential piece of information I forgot: the host has lsblk 2.37.4.
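
For a quick side-by-side comparison without opening an interactive shell, cephadm shell can run a single command (assuming this reef cephadm accepts the documented `cephadm shell -- <command>` form); the container's lsblk version is printed after the usual image banner:

# lsblk -V
lsblk from util-linux 2.37.4

# cephadm shell -- lsblk -V
...
lsblk from util-linux 2.32.1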
