
Bug #43259

S3 CopyObject: failed to parse copy location

Added by Vladimir Buyanov over 4 years ago. Updated over 1 year ago.

Status:
Resolved
Priority:
Normal
Assignee:
-
Target version:
-
% Done:

100%

Source:
Tags:
Backport:
pacific octopus
Regression:
Yes
Severity:
2 - major
Reviewed:
Affected Versions:
ceph-qa-suite:
Pull request ID:
Crash signature (v1):
Crash signature (v2):

Description

Hello.
Commit https://github.com/ceph/ceph/commit/ea979b915581c02c0bc8dba23f4fd83e635fe9a7#diff-91c04dc5e6532eab0d6f52eec3d03b1e breaks the S3 CopyObject operation.
Copy requests fail with the error "failed to parse copy location".

Expected behavior (13.2.6 and older versions): the URL-encoded "CopySource" field is decoded without errors.
Current behavior (13.2.7): parsing the URL-encoded "CopySource" field fails. This happens because the new version splits the string on '/' before URL-decoding it.

I wrote a simple Go program that reproduces this issue.

package main

import (
    "net/url"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/credentials"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/s3"
)

const (
    endpoint = "https://s3.example.com"
    key      = "KEY"
    secret   = "SECRET"
)

func main() {
    sess := session.Must(session.NewSession())
    sess.Config.S3ForcePathStyle = aws.Bool(true)
    sess.Config.Region = aws.String("us-east-1")
    cred := credentials.NewStaticCredentials(key, secret, "")
    sess.Config.WithCredentials(cred)
    sess.Config.Endpoint = aws.String(endpoint)
    svc := s3.New(sess)

    req := &s3.CopyObjectInput{
        Key:    aws.String("file-new.txt"),
        Bucket: aws.String("bucket"),
        // URL-encoding CopySource triggers the parse error on 13.2.7.
        CopySource: aws.String(url.QueryEscape("bucket/file.txt")),
    }

    if _, err := svc.CopyObject(req); err != nil {
        panic(err)
    }
}


Related issues

Copied to rgw - Backport #51700: octopus: S3 CopyObject: failed to parse copy location Resolved
Copied to rgw - Backport #51701: pacific: S3 CopyObject: failed to parse copy location Resolved

Associated revisions

Revision b7621625 (diff)
Added by Paul Reece over 2 years ago

rgw: url_decode before parsing copysource in copyobject

If the copysource on copyobject call was URL-encoded, it would fail as it would not parse the '/' separating bucket and key name

URL encoding may be necessary for certain characters in a copysource, and several public examples show URL encoding the copysource

Fixes: #43259

Signed-off-by: Paul Reece <>

Revision c83afb43 (diff)
Added by Paul Reece over 2 years ago

Amend b7621625ed69f21a5bf701b3385ddee281ff3715 to not call url_decode excessively

Fixes: #43259

Signed-off-by: Paul Reece <>

Revision 9a4b8b40 (diff)
Added by Paul Reece about 2 years ago

rgw: url_decode before parsing copysource in copyobject

If the copysource on copyobject call was URL-encoded, it would fail as it would not parse the '/' separating bucket and key name

URL encoding may be necessary for certain characters in a copysource, and several public examples show URL encoding the copysource

Fixes: #43259

Signed-off-by: Paul Reece <>
(cherry picked from commit b7621625ed69f21a5bf701b3385ddee281ff3715)

Revision a1bcf497 (diff)
Added by Paul Reece about 2 years ago

Amend b7621625ed69f21a5bf701b3385ddee281ff3715 to not call url_decode excessively

Fixes: #43259

Signed-off-by: Paul Reece <>
(cherry picked from commit c83afb4359b9f8b6d8b6942e74a52f303a474d54)

Revision 7f3311b1 (diff)
Added by Paul Reece about 2 years ago

rgw: url_decode before parsing copysource in copyobject

If the copysource on copyobject call was URL-encoded, it would fail as it would not parse the '/' separating bucket and key name

URL encoding may be necessary for certain characters in a copysource, and several public examples show URL encoding the copysource

Fixes: #43259

Signed-off-by: Paul Reece <>
(cherry picked from commit b7621625ed69f21a5bf701b3385ddee281ff3715)

Revision 107eb0bc (diff)
Added by Paul Reece about 2 years ago

Amend b7621625ed69f21a5bf701b3385ddee281ff3715 to not call url_decode excessively

Fixes: #43259

Signed-off-by: Paul Reece <>
(cherry picked from commit c83afb4359b9f8b6d8b6942e74a52f303a474d54)

Conflicts:
src/rgw/rgw_op.cc

History

#1 Updated by Vladimir Buyanov over 4 years ago

Also affected:
- 14.2.3
- 14.2.4
- 14.2.5

#3 Updated by Kefu Chai over 4 years ago

  • Status changed from New to Fix Under Review
  • Pull request ID set to 32205

#4 Updated by Kefu Chai over 4 years ago

  • Backport set to nautilus

#5 Updated by Vladimir Buyanov over 4 years ago

A backport for mimic is needed too.

#6 Updated by Nathan Cutler over 4 years ago

  • Backport changed from nautilus to nautilus, mimic

#7 Updated by Paul Reece over 2 years ago

Looks like the PR was auto-closed

I had this bug occur, but found an easy workaround: removing the url.QueryEscape() call from the CopySource (and commenting out the now-unused net/url import) allowed the file to be copied for me.

#9 Updated by Casey Bodley over 2 years ago

  • Status changed from Fix Under Review to Pending Backport
  • Backport changed from nautilus, mimic to pacific octopus
  • Pull request ID changed from 32205 to 42126

#10 Updated by Backport Bot over 2 years ago

  • Copied to Backport #51700: octopus: S3 CopyObject: failed to parse copy location added

#11 Updated by Backport Bot over 2 years ago

  • Copied to Backport #51701: pacific: S3 CopyObject: failed to parse copy location added

#12 Updated by Backport Bot over 1 year ago

  • Tags set to backport_processed

#13 Updated by Konstantin Shalygin over 1 year ago

  • Status changed from Pending Backport to Resolved
  • % Done changed from 0 to 100
  • Tags deleted (backport_processed)
