Bug #44352
openpool listings are slow after deleting objects
Status: New
Priority: Normal
Assignee: -
Category: -
Target version: -
% Done: 0%
Source:
Tags:
Backport:
Regression: No
Severity: 3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite:
Component(RADOS):
Pull request ID:
Crash signature (v1):
Crash signature (v2):
Description
I'm seeing a weird problem on a system where the following was done:
- multi-site setup with two zones
- primary went offline (sync status was fully synced)
- secondary was made master and default
- primary was deleted
Everything's working fine, but some radosgw-admin commands are slow. radosgw startup is also slower than usual.
These commands are slow to respond:
radosgw-admin zone list
radosgw-admin zonegroup list
radosgw-admin realm list
radosgw-admin user list
They all take almost exactly one minute to complete, indicating some kind of timeout.
Apart from the slowness of these commands, everything works as expected. Not all commands are affected; for example, fetching the default realm/zonegroup/zone with "get" instead of "list" responds as quickly as usual.
Here's what it looks like with debug logging enabled for rgw:
$ radosgw-admin zone list --debug-rgw=30/30
2020-02-28 12:50:03.765 7fd109ff3700 2 RGWDataChangesLog::ChangesRenewThread: start
2020-02-28 12:50:03.765 7fd1097f2700 20 reqs_thread_entry: start
2020-02-28 12:50:25.764 7fd109ff3700 2 RGWDataChangesLog::ChangesRenewThread: start
2020-02-28 12:50:47.764 7fd109ff3700 2 RGWDataChangesLog::ChangesRenewThread: start
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got zonegroup_info.a3463f9f-2a47-4afc-b794-f8fac1904bd3
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got periods.355a1244-2ed9-430a-91a7-50ea6b0a0287.1
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got default.zone.f779b461-23be-4400-8e80-aac8be7aebb7
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got default.zone.35493e32-0bac-4cbd-8524-f5f266e9b70d
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got period_config.f779b461-23be-4400-8e80-aac8be7aebb7
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got zonegroups_names.XXXX
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got periods.27ab57cf-1845-4485-bcbd-e83972cd898f.4
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got default.zonegroup.35493e32-0bac-4cbd-8524-f5f266e9b70d
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got zone_info.9a745fce-1fd4-4ff5-ba70-311aa8932d98
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got periods.27ab57cf-1845-4485-bcbd-e83972cd898f.latest_epoch
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got realms_names.XXXXX
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got realms.0354ca1d-3ec8-426e-a6bf-8455c53a5971
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got default.realm
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got period_config.0354ca1d-3ec8-426e-a6bf-8455c53a5971
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got default.zone.0354ca1d-3ec8-426e-a6bf-8455c53a5971
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got default.zonegroup.0354ca1d-3ec8-426e-a6bf-8455c53a5971
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got periods.f779b461-23be-4400-8e80-aac8be7aebb7:staging
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got periods.355a1244-2ed9-430a-91a7-50ea6b0a0287.latest_epoch
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got default.zonegroup.f779b461-23be-4400-8e80-aac8be7aebb7
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got periods.27ab57cf-1845-4485-bcbd-e83972cd898f.3
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got realms.0354ca1d-3ec8-426e-a6bf-8455c53a5971.control
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got periods.0354ca1d-3ec8-426e-a6bf-8455c53a5971:staging
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got period_config.35493e32-0bac-4cbd-8524-f5f266e9b70d
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got period_config.e716690f-2b87-4432-aed0-24e4bb24f96e
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got periods.27ab57cf-1845-4485-bcbd-e83972cd898f.2
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got periods.27ab57cf-1845-4485-bcbd-e83972cd898f.1
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got periods.27ab57cf-1845-4485-bcbd-e83972cd898f.5
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got zonegroup_info.da0c964c-6029-48cc-bfe6-4a32aaff2f34
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got zone_names.XXXX
2020-02-28 12:51:06.307 7fd12a56d9c0 20 RGWRados::pool_iterate: got periods.35493e32-0bac-4cbd-8524-f5f266e9b70d:staging
2020-02-28 12:51:06.307 7fd12a56d9c0 20 rados->read ofs=0 len=0
2020-02-28 12:51:06.311 7fd12a56d9c0 20 rados_obj.operate() r=0 bl.length=46
2020-02-28 12:51:06.311 7fd12a56d9c0 20 rados->read ofs=0 len=0
{
    "default_info": "9a745fce-1fd4-4ff5-ba70-311aa8932d98",
    "zones": [
        "XXXX"
    ]
}
2020-02-28 12:51:06.311 7fd12a56d9c0 20 rados_obj.operate() r=0 bl.length=107
2020-02-28 12:51:06.311 7fd12a56d9c0 20 rados->read ofs=0 len=0
2020-02-28 12:51:06.311 7fd12a56d9c0 20 rados_obj.operate() r=0 bl.length=46
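The stall is easy to quantify from the debug log: the process starts at 12:50:03.765 but the first pool_iterate result only appears at 12:51:06.307. A quick sketch using the two timestamps copied from the log above (the timestamp parsing is my own illustration, not part of any Ceph tooling):

```python
from datetime import datetime

# Timestamps copied verbatim from the debug log above.
FMT = "%Y-%m-%d %H:%M:%S.%f"
start = datetime.strptime("2020-02-28 12:50:03.765", FMT)
first_result = datetime.strptime("2020-02-28 12:51:06.307", FMT)

# The gap between process start and the first pool_iterate result.
stall = (first_result - start).total_seconds()
print(f"stall before first pool_iterate result: {stall:.3f} s")  # 62.542 s
```

That is just over a minute, consistent with the suspicion that some internal operation is timing out before the listing proceeds.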
I've also encountered the following unusual warnings in the radosgw log:
Feb 27 18:36:28 XXX radosgw[189536]: 2020-02-27 18:36:28.220 7fb4c7d92700 -1 rgw realm watcher: RGWRealmWatcher::handle_error oid=realms.35493e32-0bac-4cbd-8524-f5f266e9b70d.control err=-107
Feb 27 18:36:28 XXX radosgw[189536]: 2020-02-27 18:36:28.220 7fb5eafd8700 -1 RGWWatcher::handle_error cookie 94472282110976 err (107) Transport endpoint is not connected
Feb 27 18:36:28 XXX radosgw[189536]: 2020-02-27 18:36:28.220 7fb4c7d92700 -1 rgw realm watcher: RGWRealmWatcher::handle_error oid=realms.35493e32-0bac-4cbd-8524-f5f266e9b70d.control err=-107
Feb 27 18:36:28 XXX radosgw[189536]: 2020-02-27 18:36:28.220 7fb5eafd8700 -1 RGWWatcher::handle_error cookie 94472282110976 err (107) Transport endpoint is not connected
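For reference, the err=-107 in those watcher errors is a negative Linux errno, which can be decoded with the standard library (this is just a decoding aid, not Ceph code):

```python
import errno
import os

# 107 is ENOTCONN on Linux, matching the log's
# "Transport endpoint is not connected" message.
print(errno.errorcode[107])  # ENOTCONN
print(os.strerror(107))      # Transport endpoint is not connected
```

So the realm watcher appears to be repeatedly failing on a connection that no longer exists, which would fit the deleted primary zone.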