Bug #23092
RGW leaking/orphan data with Jewel
Description
This was previously reported under #18331 but this popped up again on a Jewel cluster.
Some information:
- Jewel 10.2.9
- Two RADOS Gateway Instances
- Docker DTR as client ( https://docs.docker.com/registry/spec/api/#detail )
The following commands can find and remove the orphans, but having to do this should not be normal:
radosgw-admin orphans find --pool default.rgw.buckets.data --job-id=checkorph > orphans2
awk '/leaked/ { print $2 }' orphans2 | while read line; do rados -p default.rgw.buckets.data rm $line; done
radosgw-admin orphans finish --job-id=checkorph
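The cleanup above can also be sketched as a small script with a dry-run guard, so the object list can be reviewed before anything is deleted. The sample scan file and the object name in it are hypothetical stand-ins; a real file comes from the `orphans find` command above.

```shell
#!/bin/sh
# Sketch of the manual cleanup with a dry-run guard; DRY_RUN=1 only prints
# what would be removed instead of calling `rados rm`.
POOL=default.rgw.buckets.data
DRY_RUN=${DRY_RUN:-1}

# A real scan file comes from:
#   radosgw-admin orphans find --pool "$POOL" --job-id=checkorph > orphans2
# Here a hypothetical sample stands in for it.
cat > orphans.sample <<'EOF'
storing 1 entries at orphan.scan.checkorph.linked.0
leaked: EXAMPLEBUCKETID__multipart_example/key/data.2~EXAMPLEUPLOADID.1
EOF

# The scan prints orphans as "leaked: <object name>"; take the second field.
awk '/^leaked:/ { print $2 }' orphans.sample | while read -r obj; do
    if [ "$DRY_RUN" = 1 ]; then
        echo "would remove: $obj"
    else
        rados -p "$POOL" rm "$obj"   # irreversible: removes the RADOS object
    fi
done
```

Running once with DRY_RUN=1 and once with DRY_RUN=0 separates review from deletion.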
I've checked a few things:
- RADOS Gateways do not crash
- Object versioning is not used
Currently we don't have any log files regarding this because running with debug_rgw=20 would generate a lot of data and might need to run for a very long time.
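If a shorter reproduction window can be found, one option is to raise the debug level temporarily on a running gateway via its admin socket rather than keeping debug_rgw=20 enabled permanently. A sketch, assuming the admin socket is enabled; "client.rgw.gateway1" is a placeholder for the actual instance name:

```shell
# Raise RGW debug logging only while reproducing, then restore the default.
# "client.rgw.gateway1" is a placeholder; use your gateway's daemon name.
ceph daemon client.rgw.gateway1 config set debug_rgw 20
# ... reproduce the Docker DTR workload here ...
ceph daemon client.rgw.gateway1 config set debug_rgw 1/5
```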
The Docker DTR performs multipart uploads and possibly overwrites objects, and the output of the orphan scan makes me suspect the leak is related to that:
2018-02-22 09:49:08.694513 7fe3c4fd0a00 0 run(): building index of all bucket indexes
storing 1 entries at orphan.scan.checkorph.buckets.9
2018-02-22 09:49:08.712446 7fe3c4fd0a00 0 run(): building index of all linked objects
2018-02-22 09:49:08.712673 7fe3c4fd0a00 0 building linked oids index: 0/64
ERROR: read_entries(orphan.scan.checkorph.buckets.0) returned ret=-2
2018-02-22 09:49:08.714690 7fe3c4fd0a00 0 building linked oids index: 1/64
ERROR: read_entries(orphan.scan.checkorph.buckets.1) returned ret=-2
2018-02-22 09:49:08.717263 7fe3c4fd0a00 0 building linked oids index: 2/64
ERROR: read_entries(orphan.scan.checkorph.buckets.2) returned ret=-2
2018-02-22 09:49:08.719710 7fe3c4fd0a00 0 building linked oids index: 3/64
ERROR: read_entries(orphan.scan.checkorph.buckets.3) returned ret=-2
2018-02-22 09:49:08.721511 7fe3c4fd0a00 0 building linked oids index: 4/64
ERROR: read_entries(orphan.scan.checkorph.buckets.4) returned ret=-2
2018-02-22 09:49:08.723848 7fe3c4fd0a00 0 building linked oids index: 5/64
ERROR: read_entries(orphan.scan.checkorph.buckets.5) returned ret=-2
2018-02-22 09:49:08.725737 7fe3c4fd0a00 0 building linked oids index: 6/64
ERROR: read_entries(orphan.scan.checkorph.buckets.6) returned ret=-2
2018-02-22 09:49:08.727379 7fe3c4fd0a00 0 building linked oids index: 7/64
ERROR: read_entries(orphan.scan.checkorph.buckets.7) returned ret=-2
2018-02-22 09:49:08.729231 7fe3c4fd0a00 0 building linked oids index: 8/64
ERROR: read_entries(orphan.scan.checkorph.buckets.8) returned ret=-2
2018-02-22 09:49:08.731622 7fe3c4fd0a00 0 building linked oids index: 9/64
2018-02-22 09:49:08.750650 7fe3c4fd0a00 0 building linked oids index: 10/64
ERROR: read_entries(orphan.scan.checkorph.buckets.10) returned ret=-2
2018-02-22 09:49:08.752870 7fe3c4fd0a00 0 building linked oids index: 11/64
ERROR: read_entries(orphan.scan.checkorph.buckets.11) returned ret=-2
2018-02-22 09:49:08.754326 7fe3c4fd0a00 0 building linked oids index: 12/64
ERROR: read_entries(orphan.scan.checkorph.buckets.12) returned ret=-2
2018-02-22 09:49:08.756223 7fe3c4fd0a00 0 building linked oids index: 13/64
ERROR: read_entries(orphan.scan.checkorph.buckets.13) returned ret=-2
2018-02-22 09:49:08.757784 7fe3c4fd0a00 0 building linked oids index: 14/64
ERROR: read_entries(orphan.scan.checkorph.buckets.14) returned ret=-2
2018-02-22 09:49:08.759495 7fe3c4fd0a00 0 building linked oids index: 15/64
ERROR: read_entries(orphan.scan.checkorph.buckets.15) returned ret=-2
2018-02-22 09:49:08.761156 7fe3c4fd0a00 0 building linked oids index: 16/64
ERROR: read_entries(orphan.scan.checkorph.buckets.16) returned ret=-2
2018-02-22 09:49:08.763629 7fe3c4fd0a00 0 building linked oids index: 17/64
ERROR: read_entries(orphan.scan.checkorph.buckets.17) returned ret=-2
2018-02-22 09:49:08.765347 7fe3c4fd0a00 0 building linked oids index: 18/64
ERROR: read_entries(orphan.scan.checkorph.buckets.18) returned ret=-2
2018-02-22 09:49:08.767029 7fe3c4fd0a00 0 building linked oids index: 19/64
ERROR: read_entries(orphan.scan.checkorph.buckets.19) returned ret=-2
2018-02-22 09:49:08.768641 7fe3c4fd0a00 0 building linked oids index: 20/64
ERROR: read_entries(orphan.scan.checkorph.buckets.20) returned ret=-2
2018-02-22 09:49:08.770190 7fe3c4fd0a00 0 building linked oids index: 21/64
ERROR: read_entries(orphan.scan.checkorph.buckets.21) returned ret=-2
2018-02-22 09:49:08.771814 7fe3c4fd0a00 0 building linked oids index: 22/64
ERROR: read_entries(orphan.scan.checkorph.buckets.22) returned ret=-2
2018-02-22 09:49:08.774429 7fe3c4fd0a00 0 building linked oids index: 23/64
ERROR: read_entries(orphan.scan.checkorph.buckets.23) returned ret=-2
2018-02-22 09:49:08.776749 7fe3c4fd0a00 0 building linked oids index: 24/64
ERROR: read_entries(orphan.scan.checkorph.buckets.24) returned ret=-2
2018-02-22 09:49:08.778995 7fe3c4fd0a00 0 building linked oids index: 25/64
ERROR: read_entries(orphan.scan.checkorph.buckets.25) returned ret=-2
2018-02-22 09:49:08.781064 7fe3c4fd0a00 0 building linked oids index: 26/64
ERROR: read_entries(orphan.scan.checkorph.buckets.26) returned ret=-2
2018-02-22 09:49:08.782814 7fe3c4fd0a00 0 building linked oids index: 27/64
ERROR: read_entries(orphan.scan.checkorph.buckets.27) returned ret=-2
2018-02-22 09:49:08.785029 7fe3c4fd0a00 0 building linked oids index: 28/64
ERROR: read_entries(orphan.scan.checkorph.buckets.28) returned ret=-2
2018-02-22 09:49:08.786391 7fe3c4fd0a00 0 building linked oids index: 29/64
ERROR: read_entries(orphan.scan.checkorph.buckets.29) returned ret=-2
2018-02-22 09:49:08.787968 7fe3c4fd0a00 0 building linked oids index: 30/64
ERROR: read_entries(orphan.scan.checkorph.buckets.30) returned ret=-2
2018-02-22 09:49:08.789261 7fe3c4fd0a00 0 building linked oids index: 31/64
ERROR: read_entries(orphan.scan.checkorph.buckets.31) returned ret=-2
2018-02-22 09:49:08.791160 7fe3c4fd0a00 0 building linked oids index: 32/64
ERROR: read_entries(orphan.scan.checkorph.buckets.32) returned ret=-2
2018-02-22 09:49:08.793280 7fe3c4fd0a00 0 building linked oids index: 33/64
ERROR: read_entries(orphan.scan.checkorph.buckets.33) returned ret=-2
2018-02-22 09:49:08.795083 7fe3c4fd0a00 0 building linked oids index: 34/64
ERROR: read_entries(orphan.scan.checkorph.buckets.34) returned ret=-2
2018-02-22 09:49:08.796738 7fe3c4fd0a00 0 building linked oids index: 35/64
ERROR: read_entries(orphan.scan.checkorph.buckets.35) returned ret=-2
2018-02-22 09:49:08.798553 7fe3c4fd0a00 0 building linked oids index: 36/64
ERROR: read_entries(orphan.scan.checkorph.buckets.36) returned ret=-2
2018-02-22 09:49:08.801154 7fe3c4fd0a00 0 building linked oids index: 37/64
ERROR: read_entries(orphan.scan.checkorph.buckets.37) returned ret=-2
2018-02-22 09:49:08.802854 7fe3c4fd0a00 0 building linked oids index: 38/64
ERROR: read_entries(orphan.scan.checkorph.buckets.38) returned ret=-2
2018-02-22 09:49:08.804554 7fe3c4fd0a00 0 building linked oids index: 39/64
ERROR: read_entries(orphan.scan.checkorph.buckets.39) returned ret=-2
2018-02-22 09:49:08.806850 7fe3c4fd0a00 0 building linked oids index: 40/64
ERROR: read_entries(orphan.scan.checkorph.buckets.40) returned ret=-2
2018-02-22 09:49:08.808575 7fe3c4fd0a00 0 building linked oids index: 41/64
ERROR: read_entries(orphan.scan.checkorph.buckets.41) returned ret=-2
2018-02-22 09:49:08.810261 7fe3c4fd0a00 0 building linked oids index: 42/64
ERROR: read_entries(orphan.scan.checkorph.buckets.42) returned ret=-2
2018-02-22 09:49:08.811851 7fe3c4fd0a00 0 building linked oids index: 43/64
ERROR: read_entries(orphan.scan.checkorph.buckets.43) returned ret=-2
2018-02-22 09:49:08.814315 7fe3c4fd0a00 0 building linked oids index: 44/64
ERROR: read_entries(orphan.scan.checkorph.buckets.44) returned ret=-2
2018-02-22 09:49:08.815980 7fe3c4fd0a00 0 building linked oids index: 45/64
ERROR: read_entries(orphan.scan.checkorph.buckets.45) returned ret=-2
2018-02-22 09:49:08.818174 7fe3c4fd0a00 0 building linked oids index: 46/64
ERROR: read_entries(orphan.scan.checkorph.buckets.46) returned ret=-2
2018-02-22 09:49:08.819920 7fe3c4fd0a00 0 building linked oids index: 47/64
ERROR: read_entries(orphan.scan.checkorph.buckets.47) returned ret=-2
2018-02-22 09:49:08.821492 7fe3c4fd0a00 0 building linked oids index: 48/64
ERROR: read_entries(orphan.scan.checkorph.buckets.48) returned ret=-2
2018-02-22 09:49:08.823463 7fe3c4fd0a00 0 building linked oids index: 49/64
ERROR: read_entries(orphan.scan.checkorph.buckets.49) returned ret=-2
2018-02-22 09:49:08.825118 7fe3c4fd0a00 0 building linked oids index: 50/64
ERROR: read_entries(orphan.scan.checkorph.buckets.50) returned ret=-2
2018-02-22 09:49:08.826757 7fe3c4fd0a00 0 building linked oids index: 51/64
ERROR: read_entries(orphan.scan.checkorph.buckets.51) returned ret=-2
2018-02-22 09:49:08.828403 7fe3c4fd0a00 0 building linked oids index: 52/64
ERROR: read_entries(orphan.scan.checkorph.buckets.52) returned ret=-2
2018-02-22 09:49:08.830000 7fe3c4fd0a00 0 building linked oids index: 53/64
ERROR: read_entries(orphan.scan.checkorph.buckets.53) returned ret=-2
2018-02-22 09:49:08.831462 7fe3c4fd0a00 0 building linked oids index: 54/64
ERROR: read_entries(orphan.scan.checkorph.buckets.54) returned ret=-2
2018-02-22 09:49:08.833956 7fe3c4fd0a00 0 building linked oids index: 55/64
ERROR: read_entries(orphan.scan.checkorph.buckets.55) returned ret=-2
2018-02-22 09:49:08.835377 7fe3c4fd0a00 0 building linked oids index: 56/64
ERROR: read_entries(orphan.scan.checkorph.buckets.56) returned ret=-2
2018-02-22 09:49:08.837581 7fe3c4fd0a00 0 building linked oids index: 57/64
ERROR: read_entries(orphan.scan.checkorph.buckets.57) returned ret=-2
2018-02-22 09:49:08.839159 7fe3c4fd0a00 0 building linked oids index: 58/64
ERROR: read_entries(orphan.scan.checkorph.buckets.58) returned ret=-2
2018-02-22 09:49:08.841316 7fe3c4fd0a00 0 building linked oids index: 59/64
ERROR: read_entries(orphan.scan.checkorph.buckets.59) returned ret=-2
2018-02-22 09:49:08.842964 7fe3c4fd0a00 0 building linked oids index: 60/64
ERROR: read_entries(orphan.scan.checkorph.buckets.60) returned ret=-2
2018-02-22 09:49:08.845283 7fe3c4fd0a00 0 building linked oids index: 61/64
ERROR: read_entries(orphan.scan.checkorph.buckets.61) returned ret=-2
2018-02-22 09:49:08.847025 7fe3c4fd0a00 0 building linked oids index: 62/64
ERROR: read_entries(orphan.scan.checkorph.buckets.62) returned ret=-2
2018-02-22 09:49:08.849227 7fe3c4fd0a00 0 building linked oids index: 63/64
ERROR: read_entries(orphan.scan.checkorph.buckets.63) returned ret=-2
storing 2 entries at orphan.scan.checkorph.linked.0
storing 1 entries at orphan.scan.checkorph.linked.2
storing 1 entries at orphan.scan.checkorph.linked.3
storing 2 entries at orphan.scan.checkorph.linked.9
storing 1 entries at orphan.scan.checkorph.linked.10
storing 1 entries at orphan.scan.checkorph.linked.12
storing 1 entries at orphan.scan.checkorph.linked.13
storing 1 entries at orphan.scan.checkorph.linked.16
storing 1 entries at orphan.scan.checkorph.linked.20
storing 1 entries at orphan.scan.checkorph.linked.25
storing 1 entries at orphan.scan.checkorph.linked.27
storing 1 entries at orphan.scan.checkorph.linked.31
storing 1 entries at orphan.scan.checkorph.linked.33
storing 1 entries at orphan.scan.checkorph.linked.41
storing 1 entries at orphan.scan.checkorph.linked.44
storing 1 entries at orphan.scan.checkorph.linked.55
storing 1 entries at orphan.scan.checkorph.linked.60
storing 1 entries at orphan.scan.checkorph.linked.62
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__multipart_docker/registry/v2/repositories/admin/nginx/_uploads/fbb24482-8c3a-4951-869a-5f4b2aed8fb1/data.2~iUCoJQLFh_AWXyfAO8_1uN6py0tf428.1
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__multipart_docker/registry/v2/repositories/admin/nginx/_uploads/7b71139a-b01b-4a08-9364-2492693b6037/data.2~RG4YW1HsWmb2UT1leHWm5_Pp5Daiyy3.1
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__multipart_docker/registry/v2/repositories/admin/nginx/_uploads/7b71139a-b01b-4a08-9364-2492693b6037/data.2~RG4YW1HsWmb2UT1leHWm5_Pp5Daiyy3.10
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__multipart_docker/registry/v2/repositories/admin/nginx/_uploads/7b71139a-b01b-4a08-9364-2492693b6037/data.2~RG4YW1HsWmb2UT1leHWm5_Pp5Daiyy3.11
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__multipart_docker/registry/v2/repositories/admin/nginx/_uploads/7b71139a-b01b-4a08-9364-2492693b6037/data.2~RG4YW1HsWmb2UT1leHWm5_Pp5Daiyy3.12
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__multipart_docker/registry/v2/repositories/admin/nginx/_uploads/7b71139a-b01b-4a08-9364-2492693b6037/data.2~RG4YW1HsWmb2UT1leHWm5_Pp5Daiyy3.13
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__multipart_docker/registry/v2/repositories/admin/nginx/_uploads/7b71139a-b01b-4a08-9364-2492693b6037/data.2~RG4YW1HsWmb2UT1leHWm5_Pp5Daiyy3.2
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__multipart_docker/registry/v2/repositories/admin/nginx/_uploads/7b71139a-b01b-4a08-9364-2492693b6037/data.2~RG4YW1HsWmb2UT1leHWm5_Pp5Daiyy3.3
...
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__shadow_docker/registry/v2/repositories/admin/nginx/_uploads/f00d6919-9739-4e62-acfd-34eec348efcd/data.2~2Ozda0EauUKukcDBnzRGA4FwA1Cwyf0.6_2
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__shadow_docker/registry/v2/repositories/admin/nginx/_uploads/f00d6919-9739-4e62-acfd-34eec348efcd/data.2~2Ozda0EauUKukcDBnzRGA4FwA1Cwyf0.7_1
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__shadow_docker/registry/v2/repositories/admin/nginx/_uploads/f00d6919-9739-4e62-acfd-34eec348efcd/data.2~2Ozda0EauUKukcDBnzRGA4FwA1Cwyf0.7_2
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__shadow_docker/registry/v2/repositories/admin/nginx/_uploads/f00d6919-9739-4e62-acfd-34eec348efcd/data.2~2Ozda0EauUKukcDBnzRGA4FwA1Cwyf0.8_1
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__shadow_docker/registry/v2/repositories/admin/nginx/_uploads/f00d6919-9739-4e62-acfd-34eec348efcd/data.2~2Ozda0EauUKukcDBnzRGA4FwA1Cwyf0.8_2
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__shadow_docker/registry/v2/repositories/admin/nginx/_uploads/f00d6919-9739-4e62-acfd-34eec348efcd/data.2~2Ozda0EauUKukcDBnzRGA4FwA1Cwyf0.9_1
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__shadow_docker/registry/v2/repositories/admin/nginx/_uploads/f00d6919-9739-4e62-acfd-34eec348efcd/data.2~2Ozda0EauUKukcDBnzRGA4FwA1Cwyf0.9_2
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__multipart_docker/registry/v2/repositories/admin/nginx/_uploads/7af6c6a4-3c7d-41ba-b258-98b04a303fd6/data.2~V-7rpyG2O1AysCIioFJiz_QXVqoLPTE.1
leaked: cb83e8f5-515d-458a-905c-5c718e1c8352.14097.2__multipart_docker/registry/v2/repositories/admin/nginx/_uploads/fe9dc324-41ae-4709-b556-daaf6ec5539c/data.2~_IIrN5zhPSky2KL_DAuZMtE67oGOic7.1
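To check how strongly the leaked objects correlate with multipart uploads, the scan output can be grouped by the upload id embedded in the object names (the token after "data.2~" in multipart/shadow tail objects). A sketch; the sample file and its contents are hypothetical stand-ins for real `orphans find` output:

```shell
# Hypothetical sample standing in for the real scan output:
cat > scan.sample <<'EOF'
leaked: BUCKETID__multipart_repo/_uploads/u1/data.2~AAAA.1
leaked: BUCKETID__multipart_repo/_uploads/u1/data.2~AAAA.2
leaked: BUCKETID__shadow_repo/_uploads/u2/data.2~BBBB.1_2
EOF

# Count leaked tail objects per multipart upload id (token after "data.2~"):
grep '^leaked:' scan.sample \
  | sed -E 's/.*data\.2~([A-Za-z0-9_-]+)\..*/\1/' \
  | sort | uniq -c | sort -rn
```

If most of the counts cluster under a handful of upload ids, that would support the theory that the leaks come from specific multipart uploads rather than being spread evenly.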
A test cluster has the same issue and will be upgraded to Luminous to verify whether the problem is still present there.
Is this a known issue?