Bug #53958

gibba045 is having NVMe disk issues

Added by Vikhyat Umrao about 2 years ago. Updated about 2 years ago.

Status: In Progress
Priority: Normal
Assignee:
Category: -
Target version: -
% Done: 0%
Source:
Tags:
Backport:
Regression: No
Severity: 3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite:
Crash signature (v1):
Crash signature (v2):

Description

- On gibba045, eight OSDs suddenly went down and were marked out.

# ceph osd tree down
ID   CLASS  WEIGHT    TYPE NAME          STATUS  REWEIGHT  PRI-AFF
 -1         13.27295  root default                                
-77          0.31480      host gibba045                           
888    ssd   0.01369          osd.888      down         0  1.00000
895    ssd   0.01369          osd.895      down         0  1.00000
916    ssd   0.01369          osd.916      down         0  1.00000
934    ssd   0.01369          osd.934      down         0  1.00000
940    ssd   0.01369          osd.940      down         0  1.00000
957    ssd   0.01369          osd.957      down         0  1.00000
962    ssd   0.01369          osd.962      down         0  1.00000
971    ssd   0.01369          osd.971      down         0  1.00000
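
For cross-referencing against the ceph-volume output below, the same list of down OSD ids can be pulled programmatically. This was not part of the report; it is just a sketch assuming the ceph CLI (with an admin keyring) and jq are available:

# Print only the ids of the down OSDs (the "down" filter limits the tree to them)
ceph osd tree down -f json | jq -r '.nodes[] | select(.type == "osd") | .id'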

- ceph-volume lvm list output:

[root@gibba045 log]# podman exec -ti eea512a11e9f /bin/bash
[root@gibba045 /]# ceph-volume lvm list

===== osd.888 ======

  [block]       /dev/ceph-dcfb10b5-bb93-4843-b6db-69461a979b7a/osd-block-d3684eac-a6fe-4da9-a9ed-9883f0d20d7a

      block device              /dev/ceph-dcfb10b5-bb93-4843-b6db-69461a979b7a/osd-block-d3684eac-a6fe-4da9-a9ed-9883f0d20d7a
      block uuid                T23AbT-lmWo-aVlo-Zrgc-SIxm-emhv-hSwXWU
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  d3684eac-a6fe-4da9-a9ed-9883f0d20d7a
      osd id                    888
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme10n1

===== osd.895 ======

  [block]       /dev/ceph-a790fbe7-a98e-47b5-9292-22a7d0fc46f4/osd-block-3fde610c-dbda-4034-a3f1-8ee29c59e1fa

      block device              /dev/ceph-a790fbe7-a98e-47b5-9292-22a7d0fc46f4/osd-block-3fde610c-dbda-4034-a3f1-8ee29c59e1fa
      block uuid                Smfxkn-QibQ-l4Rm-lKdC-0dml-IPlp-n3jkz0
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  3fde610c-dbda-4034-a3f1-8ee29c59e1fa
      osd id                    895
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme11n1

===== osd.902 ======

  [block]       /dev/ceph-bf656960-7bf8-4aa2-b8d6-8a2e2f92b8ba/osd-block-7b2c518a-c51e-46ed-94b6-5d3689ebe584

      block device              /dev/ceph-bf656960-7bf8-4aa2-b8d6-8a2e2f92b8ba/osd-block-7b2c518a-c51e-46ed-94b6-5d3689ebe584
      block uuid                q5sQUG-nyWO-SGjB-hFWp-1DUD-WGdr-r3D6yj
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  7b2c518a-c51e-46ed-94b6-5d3689ebe584
      osd id                    902
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme12n1

===== osd.909 ======

  [block]       /dev/ceph-6891c3ca-4ff1-4a4d-a2d7-a34e7781c066/osd-block-1ccc5615-c372-4aec-89f5-e0e7a9d9cbcc

      block device              /dev/ceph-6891c3ca-4ff1-4a4d-a2d7-a34e7781c066/osd-block-1ccc5615-c372-4aec-89f5-e0e7a9d9cbcc
      block uuid                rnQqQZ-vhDe-mPB0-Ylgj-xNpL-wavf-YKN3g3
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  1ccc5615-c372-4aec-89f5-e0e7a9d9cbcc
      osd id                    909
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme13n1

===== osd.916 ======

  [block]       /dev/ceph-cf710f92-1968-488f-9e02-65182713f614/osd-block-8ea5242a-1f0e-4484-92ae-7d0282ace900

      block device              /dev/ceph-cf710f92-1968-488f-9e02-65182713f614/osd-block-8ea5242a-1f0e-4484-92ae-7d0282ace900
      block uuid                yg5rBX-6NjV-EQ7k-YMw7-ATlS-nVQR-OiSpIf
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  8ea5242a-1f0e-4484-92ae-7d0282ace900
      osd id                    916
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme14n1

===== osd.922 ======

  [block]       /dev/ceph-b9c1b5fa-9c66-4959-a281-48ae35254519/osd-block-bac4fe8a-de9f-41fb-ba74-f100fa9cf894

      block device              /dev/ceph-b9c1b5fa-9c66-4959-a281-48ae35254519/osd-block-bac4fe8a-de9f-41fb-ba74-f100fa9cf894
      block uuid                0O7OuD-ohrv-tGu7-Wa91-7RFB-UoCK-YZenUl
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  bac4fe8a-de9f-41fb-ba74-f100fa9cf894
      osd id                    922
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme15n1

===== osd.928 ======

  [block]       /dev/ceph-b9d50d23-4fa6-4e6d-9255-d37ff557fe0d/osd-block-882f1dbb-bbe0-4a44-af05-91ecad8a94c9

      block device              /dev/ceph-b9d50d23-4fa6-4e6d-9255-d37ff557fe0d/osd-block-882f1dbb-bbe0-4a44-af05-91ecad8a94c9
      block uuid                Z5zAYJ-TqHH-LCjn-dETK-46vG-QisJ-5H65Pd
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  882f1dbb-bbe0-4a44-af05-91ecad8a94c9
      osd id                    928
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme16n1

===== osd.934 ======

  [block]       /dev/ceph-30b5dc82-f48e-449b-83a2-2bbed6354769/osd-block-6d38d53f-56f9-4771-b1f7-dc88c9c17bd0

      block device              /dev/ceph-30b5dc82-f48e-449b-83a2-2bbed6354769/osd-block-6d38d53f-56f9-4771-b1f7-dc88c9c17bd0
      block uuid                Ik4A8H-DsKr-qiiJ-cRIM-olO4-JNFr-UGgjen
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  6d38d53f-56f9-4771-b1f7-dc88c9c17bd0
      osd id                    934
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme17n1

===== osd.940 ======

  [block]       /dev/ceph-7314288b-8f7b-49e8-bb92-2d3d81741bca/osd-block-77385071-84c3-48b9-a103-d39e44734d89

      block device              /dev/ceph-7314288b-8f7b-49e8-bb92-2d3d81741bca/osd-block-77385071-84c3-48b9-a103-d39e44734d89
      block uuid                YPxW3w-qyzv-ODmB-APKN-tDbC-51xV-vovmoA
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  77385071-84c3-48b9-a103-d39e44734d89
      osd id                    940
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme18n1

===== osd.946 ======

  [block]       /dev/ceph-2159dcd0-1b21-4e3d-a1e0-1edd8a4b2d5c/osd-block-f30aff32-0724-4f60-a7ca-070b77752905

      block device              /dev/ceph-2159dcd0-1b21-4e3d-a1e0-1edd8a4b2d5c/osd-block-f30aff32-0724-4f60-a7ca-070b77752905
      block uuid                azVlX0-gKSe-FBDd-Zz0y-JiQP-RGNo-dZV5te
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  f30aff32-0724-4f60-a7ca-070b77752905
      osd id                    946
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme19n1

===== osd.952 ======

  [block]       /dev/ceph-2b4e4ce2-f59e-4e7b-99bf-4b1a7cc42a41/osd-block-2628e7df-564c-4911-8ee8-a15f32933955

      block device              /dev/ceph-2b4e4ce2-f59e-4e7b-99bf-4b1a7cc42a41/osd-block-2628e7df-564c-4911-8ee8-a15f32933955
      block uuid                rW9L5Q-5iQe-n9Bn-N4bQ-0jqL-9MN0-d9GBfG
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  2628e7df-564c-4911-8ee8-a15f32933955
      osd id                    952
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme1n1

===== osd.957 ======

  [block]       /dev/ceph-119ee58d-f1e6-4fdb-9402-5da8f377969c/osd-block-bc3d2437-1eb4-4807-93c1-200e812b26c0

      block device              /dev/ceph-119ee58d-f1e6-4fdb-9402-5da8f377969c/osd-block-bc3d2437-1eb4-4807-93c1-200e812b26c0
      block uuid                337eej-zvte-a4wc-DuDf-hABo-kM2V-VhDxQ5
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  bc3d2437-1eb4-4807-93c1-200e812b26c0
      osd id                    957
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme20n1

===== osd.962 ======

  [block]       /dev/ceph-91d8a7ae-65d9-4650-97ac-3f230ace295e/osd-block-2958c407-c3bc-4164-a76c-92fea54b263c

      block device              /dev/ceph-91d8a7ae-65d9-4650-97ac-3f230ace295e/osd-block-2958c407-c3bc-4164-a76c-92fea54b263c
      block uuid                saQmmW-5r5f-SHzn-o3Uo-uUoN-tXdT-QXKOWL
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  2958c407-c3bc-4164-a76c-92fea54b263c
      osd id                    962
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme21n1

===== osd.967 ======

  [block]       /dev/ceph-bc51a448-ad60-43d1-bbca-0ede5098b543/osd-block-7e1d3eff-3641-4333-bf77-719a8bcff97e

      block device              /dev/ceph-bc51a448-ad60-43d1-bbca-0ede5098b543/osd-block-7e1d3eff-3641-4333-bf77-719a8bcff97e
      block uuid                uXjyxH-7vk5-URfB-Xx8Z-sMce-pqw1-Bi2Zej
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  7e1d3eff-3641-4333-bf77-719a8bcff97e
      osd id                    967
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme22n1

===== osd.971 ======

  [block]       /dev/ceph-f9d42a94-0f01-402e-a596-b965fdae0c68/osd-block-874768f1-32fa-4577-8806-ed56feaf787a

      block device              /dev/ceph-f9d42a94-0f01-402e-a596-b965fdae0c68/osd-block-874768f1-32fa-4577-8806-ed56feaf787a
      block uuid                EPlhFu-XZ5F-2pGg-ULNG-JhPJ-csNU-9dgsPe
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  874768f1-32fa-4577-8806-ed56feaf787a
      osd id                    971
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme23n1

===== osd.974 ======

  [block]       /dev/ceph-33eefbec-6060-4ed0-a4d8-53cb46e5a610/osd-block-3edc946b-e864-40cd-bc8f-d79141709a52

      block device              /dev/ceph-33eefbec-6060-4ed0-a4d8-53cb46e5a610/osd-block-3edc946b-e864-40cd-bc8f-d79141709a52
      block uuid                fJgHCW-xSzP-Cw04-fN8b-RVPc-gCkT-QJexHu
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  3edc946b-e864-40cd-bc8f-d79141709a52
      osd id                    974
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme2n1

===== osd.977 ======

  [block]       /dev/ceph-49f22f33-fadb-44ef-8197-d0a58147c33c/osd-block-70010f28-0141-4b55-afdc-9cca7a8d81e6

      block device              /dev/ceph-49f22f33-fadb-44ef-8197-d0a58147c33c/osd-block-70010f28-0141-4b55-afdc-9cca7a8d81e6
      block uuid                2WwfqP-9qEQ-pTJw-dprp-Z1ON-YfSf-VeESgb
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  70010f28-0141-4b55-afdc-9cca7a8d81e6
      osd id                    977
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme3n1

===== osd.980 ======

  [block]       /dev/ceph-66cef31f-130e-4b10-9f98-2776936e57c9/osd-block-a053bfd4-9513-49ab-ac4d-83adc1071dc9

      block device              /dev/ceph-66cef31f-130e-4b10-9f98-2776936e57c9/osd-block-a053bfd4-9513-49ab-ac4d-83adc1071dc9
      block uuid                TgYyWJ-DrAg-sLNR-WcDd-7wmJ-agYP-MRhnbT
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  a053bfd4-9513-49ab-ac4d-83adc1071dc9
      osd id                    980
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme4n1

===== osd.983 ======

  [block]       /dev/ceph-989dbd6a-bf17-41fa-bfe5-172d879ff3ab/osd-block-b22f67d8-1612-406e-ac82-51981e8f91a0

      block device              /dev/ceph-989dbd6a-bf17-41fa-bfe5-172d879ff3ab/osd-block-b22f67d8-1612-406e-ac82-51981e8f91a0
      block uuid                RzDEzB-9sss-7WfO-Hay1-gK5Y-ny02-J0eGPy
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  b22f67d8-1612-406e-ac82-51981e8f91a0
      osd id                    983
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme5n1

===== osd.986 ======

  [block]       /dev/ceph-7b9f3416-d9fa-4377-b49b-87fce31cd7e6/osd-block-ed88d79b-5f06-4f44-ac55-386c182ba1ac

      block device              /dev/ceph-7b9f3416-d9fa-4377-b49b-87fce31cd7e6/osd-block-ed88d79b-5f06-4f44-ac55-386c182ba1ac
      block uuid                Bp7DO8-Zhh3-JUFb-yiZV-Bfec-T7rR-aXr7U1
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  ed88d79b-5f06-4f44-ac55-386c182ba1ac
      osd id                    986
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme6n1

===== osd.989 ======

  [block]       /dev/ceph-f6489745-1056-4be4-aea4-d13b2497f797/osd-block-b25d8a9f-8003-4ca9-b813-8973f8b0162f

      block device              /dev/ceph-f6489745-1056-4be4-aea4-d13b2497f797/osd-block-b25d8a9f-8003-4ca9-b813-8973f8b0162f
      block uuid                ik3fBp-kwLw-bMhh-eDij-Yrol-Ir11-UtO9JR
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  b25d8a9f-8003-4ca9-b813-8973f8b0162f
      osd id                    989
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme7n1

===== osd.992 ======

  [block]       /dev/ceph-089fbbfb-1a68-483a-ba85-4a0ccd1c0302/osd-block-e30bcc46-6e1b-43eb-b115-1f1c85ecb09f

      block device              /dev/ceph-089fbbfb-1a68-483a-ba85-4a0ccd1c0302/osd-block-e30bcc46-6e1b-43eb-b115-1f1c85ecb09f
      block uuid                OXbl6K-Ft7j-uOdN-wckC-6EWo-1j3O-9sriRP
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  e30bcc46-6e1b-43eb-b115-1f1c85ecb09f
      osd id                    992
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme8n1

===== osd.995 ======

  [block]       /dev/ceph-e877bc60-d4a4-4c88-8697-aed71778c1d1/osd-block-63cb34a3-9351-4ec5-81da-1237f53fa3c1

      block device              /dev/ceph-e877bc60-d4a4-4c88-8697-aed71778c1d1/osd-block-63cb34a3-9351-4ec5-81da-1237f53fa3c1
      block uuid                Rh0u3T-NX0I-gUij-LT5H-PhfB-cArz-lSjNNN
      cephx lockbox secret      
      cluster fsid              2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8
      cluster name              ceph
      crush device class        None
      encrypted                 0
      osd fsid                  63cb34a3-9351-4ec5-81da-1237f53fa3c1
      osd id                    995
      osdspec affinity          all-available-devices
      type                      block
      vdo                       0
      devices                   /dev/nvme9n1
[root@gibba045 /]# 
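
The osd-id to NVMe device mapping buried in the listing above can also be pulled out in one shot. This was not done in the ticket; it is a sketch assuming jq is available inside the OSD container and that each OSD has a single backing device, as is the case here:

# Inside the same container as above
ceph-volume lvm list --format json | \
    jq -r 'to_entries[] | "osd.\(.key)  \(.value[0].devices[0])"'
# prints lines such as:  osd.888  /dev/nvme10n1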

- OSD logs:

[root@gibba045 2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8]# grep -nr "I/O error" *
ceph-osd.916.log:7542:2022-01-20T23:03:27.901+0000 7fa44f7c5700 -1 bdev(0x561941110800 /var/lib/ceph/osd/ceph-916/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.916.log:18647:    -2> 2022-01-20T23:03:27.901+0000 7fa44f7c5700 -1 bdev(0x561941110800 /var/lib/ceph/osd/ceph-916/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.916.log:22015:2022-01-20T23:04:04.466+0000 7f6b985a4700 -1 bdev(0x55ead04d6c00 /var/lib/ceph/osd/ceph-916/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.916.log:26651:    -2> 2022-01-20T23:04:04.466+0000 7f6b985a4700 -1 bdev(0x55ead04d6c00 /var/lib/ceph/osd/ceph-916/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.916.log:29390:2022-01-20T23:04:21.486+0000 7f033555a700 -1 bdev(0x56418b676c00 /var/lib/ceph/osd/ceph-916/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.916.log:32389:    -2> 2022-01-20T23:04:21.486+0000 7f033555a700 -1 bdev(0x56418b676c00 /var/lib/ceph/osd/ceph-916/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.916.log:35100:2022-01-20T23:04:38.949+0000 7f34a7f4a700 -1 bdev(0x560ead7a8c00 /var/lib/ceph/osd/ceph-916/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.916.log:38099:    -2> 2022-01-20T23:04:38.949+0000 7f34a7f4a700 -1 bdev(0x560ead7a8c00 /var/lib/ceph/osd/ceph-916/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.916.log:40810:2022-01-20T23:04:55.977+0000 7f0c086c1700 -1 bdev(0x5636a0b70c00 /var/lib/ceph/osd/ceph-916/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.916.log:43806:    -2> 2022-01-20T23:04:55.977+0000 7f0c086c1700 -1 bdev(0x5636a0b70c00 /var/lib/ceph/osd/ceph-916/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.916.log:46517:2022-01-20T23:05:25.753+0000 7f95e83b2700 -1 bdev(0x5634df9b8c00 /var/lib/ceph/osd/ceph-916/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.916.log:49517:    -2> 2022-01-20T23:05:25.753+0000 7f95e83b2700 -1 bdev(0x5634df9b8c00 /var/lib/ceph/osd/ceph-916/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.934.log:6999:2022-01-20T22:58:21.734+0000 7fefa7104700 -1 bdev(0x556f42792800 /var/lib/ceph/osd/ceph-934/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.934.log:17563:    -2> 2022-01-20T22:58:21.734+0000 7fefa7104700 -1 bdev(0x556f42792800 /var/lib/ceph/osd/ceph-934/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.934.log:20945:2022-01-20T22:58:49.376+0000 7f7dbc91f700 -1 bdev(0x562756d68c00 /var/lib/ceph/osd/ceph-934/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.934.log:25377:    -2> 2022-01-20T22:58:49.376+0000 7f7dbc91f700 -1 bdev(0x562756d68c00 /var/lib/ceph/osd/ceph-934/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.934.log:28113:2022-01-20T22:59:06.734+0000 7f0cc83b8700 -1 bdev(0x55ca090c0c00 /var/lib/ceph/osd/ceph-934/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.934.log:31109:    -2> 2022-01-20T22:59:06.734+0000 7f0cc83b8700 -1 bdev(0x55ca090c0c00 /var/lib/ceph/osd/ceph-934/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.934.log:33820:2022-01-20T22:59:23.998+0000 7f30acf5c700 -1 bdev(0x55eb3c47f000 /var/lib/ceph/osd/ceph-934/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.934.log:36819:    -2> 2022-01-20T22:59:23.998+0000 7f30acf5c700 -1 bdev(0x55eb3c47f000 /var/lib/ceph/osd/ceph-934/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.934.log:39530:2022-01-20T22:59:41.113+0000 7f2b0f6c1700 -1 bdev(0x55ff7138ac00 /var/lib/ceph/osd/ceph-934/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.934.log:42526:    -2> 2022-01-20T22:59:41.113+0000 7f2b0f6c1700 -1 bdev(0x55ff7138ac00 /var/lib/ceph/osd/ceph-934/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.934.log:45237:2022-01-20T22:59:58.385+0000 7f27b6fc7700 -1 bdev(0x55ba3173cc00 /var/lib/ceph/osd/ceph-934/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.934.log:48236:    -2> 2022-01-20T22:59:58.385+0000 7f27b6fc7700 -1 bdev(0x55ba3173cc00 /var/lib/ceph/osd/ceph-934/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.957.log:7074:2022-01-20T22:57:33.503+0000 7f92d04a9700 -1 bdev(0x55ff5d402c00 /var/lib/ceph/osd/ceph-957/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.957.log:11749:    -2> 2022-01-20T22:57:33.503+0000 7f92d04a9700 -1 bdev(0x55ff5d402c00 /var/lib/ceph/osd/ceph-957/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.957.log:14485:2022-01-20T22:57:50.599+0000 7f419d4e3700 -1 bdev(0x5616fcae6c00 /var/lib/ceph/osd/ceph-957/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.957.log:17484:    -2> 2022-01-20T22:57:50.599+0000 7f419d4e3700 -1 bdev(0x5616fcae6c00 /var/lib/ceph/osd/ceph-957/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.957.log:20195:2022-01-20T22:58:07.731+0000 7f3b5a2a2700 -1 bdev(0x564376a8cc00 /var/lib/ceph/osd/ceph-957/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.957.log:23194:    -2> 2022-01-20T22:58:07.731+0000 7f3b5a2a2700 -1 bdev(0x564376a8cc00 /var/lib/ceph/osd/ceph-957/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.957.log:25905:2022-01-20T22:58:25.007+0000 7f4d2a581700 -1 bdev(0x55689b7ecc00 /var/lib/ceph/osd/ceph-957/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.957.log:28904:    -2> 2022-01-20T22:58:25.007+0000 7f4d2a581700 -1 bdev(0x55689b7ecc00 /var/lib/ceph/osd/ceph-957/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.957.log:31615:2022-01-20T22:58:44.176+0000 7f08c0a1b700 -1 bdev(0x55ee293d0c00 /var/lib/ceph/osd/ceph-957/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.957.log:34611:    -2> 2022-01-20T22:58:44.176+0000 7f08c0a1b700 -1 bdev(0x55ee293d0c00 /var/lib/ceph/osd/ceph-957/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.971.log:9607:2022-01-20T22:57:31.023+0000 7f224c404700 -1 bdev(0x56092a44ec00 /var/lib/ceph/osd/ceph-971/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.971.log:12599:    -2> 2022-01-20T22:57:31.023+0000 7f224c404700 -1 bdev(0x56092a44ec00 /var/lib/ceph/osd/ceph-971/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.971.log:15310:2022-01-20T22:57:48.353+0000 7ff4ad102700 -1 bdev(0x55dba1b92c00 /var/lib/ceph/osd/ceph-971/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.971.log:18303:    -2> 2022-01-20T22:57:48.353+0000 7ff4ad102700 -1 bdev(0x55dba1b92c00 /var/lib/ceph/osd/ceph-971/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.971.log:21014:2022-01-20T22:58:05.836+0000 7fa861b0e700 -1 bdev(0x55d831cacc00 /var/lib/ceph/osd/ceph-971/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.971.log:24013:    -2> 2022-01-20T22:58:05.836+0000 7fa861b0e700 -1 bdev(0x55d831cacc00 /var/lib/ceph/osd/ceph-971/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.971.log:26724:2022-01-20T22:58:23.034+0000 7efc837b3700 -1 bdev(0x557203a90c00 /var/lib/ceph/osd/ceph-971/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.971.log:29723:    -2> 2022-01-20T22:58:23.034+0000 7efc837b3700 -1 bdev(0x557203a90c00 /var/lib/ceph/osd/ceph-971/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.980.log:6999:2022-01-20T23:00:16.484+0000 7ff0afd5c700 -1 bdev(0x56515f43a800 /var/lib/ceph/osd/ceph-980/block) _aio_thread got r=-121 ((121) Remote I/O error)
ceph-osd.980.log:17563:    -2> 2022-01-20T23:00:16.484+0000 7ff0afd5c700 -1 bdev(0x56515f43a800 /var/lib/ceph/osd/ceph-980/block) _aio_thread got r=-121 ((121) Remote I/O error)
[root@gibba045 2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8]# grep -nr "unexpected" *
[root@gibba045 2c15bd96-7a20-11ec-b7e0-3cecef3d8fb8]# grep -nr "Unexpected" *
ceph-osd.916.log:7544:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.916.log:18649:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.916.log:22017:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.916.log:26653:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.916.log:29392:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.916.log:32391:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.916.log:35102:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.916.log:38101:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.916.log:40812:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.916.log:43808:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.916.log:46519:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.916.log:49519:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.934.log:7001:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.934.log:17565:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.934.log:20947:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.934.log:25379:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.934.log:28115:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.934.log:31111:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.934.log:33822:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.934.log:36821:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.934.log:39532:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.934.log:42528:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.934.log:45239:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.934.log:48238:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.957.log:7076:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.957.log:11751:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.957.log:14487:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.957.log:17486:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.957.log:20197:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.957.log:23196:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.957.log:25907:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.957.log:28906:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.957.log:31617:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.957.log:34613:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.971.log:9609:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.971.log:12601:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.971.log:15312:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.971.log:18305:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.971.log:21016:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.971.log:24015:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.971.log:26726:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.971.log:29725:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.980.log:7001:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
ceph-osd.980.log:17565:/home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/16.2.7/rpm/el8/BUILD/ceph-16.2.7/src/blk/kernel/KernelDevice.cc: 600: ceph_abort_msg("Unexpected IO error. This may suggest a hardware issue. Please check your kernel log!")
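
For reference, the r=-121 returned to _aio_thread is EREMOTEIO, the same "Remote I/O error" spelled out in the message; a quick way to confirm the errno mapping (not from the ticket):

python3 -c 'import errno, os; print(errno.errorcode[121], "=", os.strerror(121))'
# EREMOTEIO = Remote I/O error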

- System (kernel) logs:

Jan 20 22:51:40 gibba045 kernel: blk_update_request: critical target error, dev nvme21n1, sector 5323136 op 0x1:(WRITE) flags 0x8800 phys_seg 1 prio class 0
Jan 20 22:51:40 gibba045 kernel: blk_update_request: critical target error, dev nvme21n1, sector 5323144 op 0x1:(WRITE) flags 0x8800 phys_seg 1 prio class 0
Jan 20 22:51:40 gibba045 kernel: blk_update_request: critical target error, dev nvme21n1, sector 5323152 op 0x1:(WRITE) flags 0x8800 phys_seg 1 prio class 0
Jan 20 22:51:40 gibba045 kernel: blk_update_request: critical target error, dev nvme21n1, sector 5323160 op 0x1:(WRITE) flags 0x8800 phys_seg 1 prio class 0
Jan 20 22:58:05 gibba045 kernel: blk_update_request: critical target error, dev nvme23n1, sector 5320704 op 0x1:(WRITE) flags 0x8800 phys_seg 1 prio class 0
Jan 20 22:58:07 gibba045 kernel: blk_update_request: critical target error, dev nvme20n1, sector 5319680 op 0x1:(WRITE) flags 0x8800 phys_seg 1 prio class 0
Jan 20 22:58:21 gibba045 kernel: blk_update_request: critical target error, dev nvme17n1, sector 5324800 op 0x1:(WRITE) flags 0x8800 phys_seg 1 prio class 0
Jan 20 22:58:21 gibba045 kernel: blk_update_request: critical target error, dev nvme17n1, sector 5324808 op 0x1:(WRITE) flags 0x8800 phys_seg 1 prio class 0
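
If nvme-cli or smartmontools is installed on the host, the drives' own error and SMART logs would show whether the controller is reporting the failures itself. A possible follow-up (not captured in this ticket), using one of the affected namespaces as an example:

nvme list                     # enumerate the controllers/namespaces still visible
nvme smart-log /dev/nvme21n1  # critical warnings, media and data integrity errors
nvme error-log /dev/nvme21n1  # controller error log entries
smartctl -x /dev/nvme21n1     # equivalent view via smartmontools
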
#1 - Updated by adam kraitman about 2 years ago

  • Assignee set to adam kraitman

#2 - Updated by adam kraitman about 2 years ago

  • Status changed from New to In Progress

#3 - Updated by David Galloway about 2 years ago

I think the NVMe card has indeed died. Will file an RMA.
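
Not discussed in the ticket, but once the hardware is confirmed dead the usual cleanup is to let recovery finish and then purge the affected OSDs before the card is replaced; a sketch for one of them (osd.916):

# The down OSDs are already marked out (reweight 0 in the tree above)
ceph osd out 916
# wait for recovery/backfill to complete, then remove the OSD from CRUSH, auth and the osdmap:
ceph osd purge 916 --yes-i-really-mean-it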

#4 - Updated by Vikhyat Umrao about 2 years ago

David Galloway wrote:

> I think the NVMe card has indeed died. Will file an RMA.

Thanks, David.
