Bug #13223

Can’t reach quota limit when using multipart upload

Added by gu ce over 8 years ago. Updated over 6 years ago.

Status:
Closed
Priority:
Normal
Assignee:
-
Target version:
-
% Done:
0%

Source:
Community (dev)
Tags:
Backport:
Regression:
No
Severity:
3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite:
Pull request ID:
Crash signature (v1):
Crash signature (v2):

Description

When using multipart upload to put objects into a bucket that has bucket quota enabled, I found that the number of objects can’t reach the quota limit. For example, with the quota configured to 10 objects, only 8 objects can be uploaded; this happens on both hammer 0.94.3 and master 9.0.3.

I looked at the code and found there are two stats, RGWStorageStats and RGWQuotaCache. The two stats values differ greatly when multipart upload is used.
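
One way to observe the mismatch is to compare the bucket-index stats with the objects actually visible in the bucket (a sketch, assuming the hammer-era radosgw-admin output where bucket usage is broken out into categories such as rgw.main and rgw.multimeta):

[inspect the two views of the stats]
# Bucket-index stats: in-flight multipart parts and meta entries show
# up here, so num_objects can exceed the number of completed S3 objects.
radosgw-admin bucket stats --bucket=bucket_111_a
# Resync the cached user stats, then list the completed objects to compare.
radosgw-admin user stats --uid=111 --sync-stats
s3cmd ls s3://bucket_111_a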

[version]
ceph version 0.94.3 (95cefea9fd9ab740263bf8bb4796fd864d9afe2b)

[config, --max-objects=1]
radosgw-admin user create --uid=111 --display-name=111 --access-key=111 --secret=111
radosgw-admin quota set --quota-scope=bucket --uid=111 --max-objects=1
radosgw-admin quota enable --quota-scope=bucket --uid=111
s3cmd mb s3://bucket_111_a

[upload only one 100M file, QuotaExceeded]
s3cmd put data_100M s3://bucket_111_a/1
data_100M -> s3://bucket_111_a/1 [part 1 of 7, 15MB]
15728640 of 15728640 100% in 0s 53.44 MB/s done
ERROR: Upload of 'data_100M' part 1 failed. Aborting multipart upload.
ERROR: S3 error: 403 (QuotaExceeded):
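
A possible reading of this failure (my assumption, not verified against the code): initiating the multipart upload writes a meta entry into the bucket index, and if the quota check already counts that entry as an object, part 1 is rejected because 1 existing + 1 new > max-objects=1, even though no object has completed. Whatever the failed put left behind can be listed with s3cmd (assuming a version that has the multipart subcommand):

[list in-flight multipart uploads]
# Uploads that were initiated but never completed; their meta entries
# and parts may still be counted by the quota stats.
s3cmd multipart s3://bucket_111_a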

[config, --max-objects=3]
radosgw-admin quota set --quota-scope=bucket --uid=111 --max-objects=3

[upload first 100M file, success]
s3cmd put data_100M s3://bucket_111_a/1
data_100M -> s3://bucket_111_a/1 [part 1 of 7, 15MB]
15728640 of 15728640 100% in 0s 43.19 MB/s done
data_100M -> s3://bucket_111_a/1 [part 2 of 7, 15MB]
15728640 of 15728640 100% in 0s 39.03 MB/s done
data_100M -> s3://bucket_111_a/1 [part 3 of 7, 15MB]
15728640 of 15728640 100% in 0s 38.62 MB/s done
data_100M -> s3://bucket_111_a/1 [part 4 of 7, 15MB]
15728640 of 15728640 100% in 0s 38.68 MB/s done
data_100M -> s3://bucket_111_a/1 [part 5 of 7, 15MB]
15728640 of 15728640 100% in 0s 43.85 MB/s done
data_100M -> s3://bucket_111_a/1 [part 6 of 7, 15MB]
15728640 of 15728640 100% in 0s 44.72 MB/s done
data_100M -> s3://bucket_111_a/1 [part 7 of 7, 10MB]
10485760 of 10485760 100% in 0s 48.12 MB/s done

[upload second 100M file, QuotaExceeded]
s3cmd put data_100M s3://bucket_111_a/2
data_100M -> s3://bucket_111_a/2 [part 1 of 7, 15MB]
15728640 of 15728640 100% in 0s 44.14 MB/s done
data_100M -> s3://bucket_111_a/2 [part 2 of 7, 15MB]
15728640 of 15728640 100% in 0s 52.70 MB/s done
ERROR: Upload of 'data_100M' part 2 failed. Aborting multipart upload.
ERROR: S3 error: 403 (QuotaExceeded):
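
Under the same assumption, the seven parts of the first object push the counted entries well past 3 before any second object completes, so the second upload is rejected mid-stream. To rule out leftovers from the aborted attempts inflating the count, the stale upload can be aborted explicitly and the stats re-checked (a sketch; UPLOAD_ID is a placeholder for the id printed by the multipart listing):

[abort stale uploads and re-check]
# Abort the in-flight upload left by the failed put, then compare the
# counted objects against the single completed one.
s3cmd multipart s3://bucket_111_a
s3cmd abortmp s3://bucket_111_a/2 UPLOAD_ID
radosgw-admin bucket stats --bucket=bucket_111_a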

History

#1 Updated by Nathan Cutler over 8 years ago

  • Target version deleted (v9.0.3)
  • Source changed from other to Community (dev)

#2 Updated by Nathan Cutler over 8 years ago

Hi! Do you think this is reproducible on latest master?

#3 Updated by Nathan Cutler over 8 years ago

  • Status changed from New to Fix Under Review

#4 Updated by Nathan Cutler over 6 years ago

  • Status changed from Fix Under Review to Closed

PR was closed, so closing the tracker as well.
