Bug #36491


s3a-hadoop: fails to create bucket index pool

Added by Yuri Weinstein over 5 years ago. Updated over 5 years ago.

Status: New
Priority: Normal
Assignee: Vasu Kulkarni
Target version: -
% Done: 0%
Source: Q/A
Tags:
Backport:
Regression: No
Severity: 3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite: rgw
Pull request ID:
Crash signature (v1):
Crash signature (v2):

Description

This is for 12.2.9

Run: http://pulpito.front.sepia.ceph.com/yuriw-2018-10-16_15:39:25-rgw-luminous-distro-basic-smithi/
Job: 3147844
Logs: http://qa-proxy.ceph.com/teuthology/yuriw-2018-10-16_15:39:25-rgw-luminous-distro-basic-smithi/3147844/teuthology.log

2018-10-16T16:30:33.951 INFO:teuthology.orchestra.run.ovh075.stdout:conn = boto.connect_s3(
2018-10-16T16:30:33.951 INFO:teuthology.orchestra.run.ovh075.stdout:        aws_access_key_id = access_key,
2018-10-16T16:30:33.951 INFO:teuthology.orchestra.run.ovh075.stdout:        aws_secret_access_key = secret_key,
2018-10-16T16:30:33.952 INFO:teuthology.orchestra.run.ovh075.stdout:        host = 's3.ceph.com',
2018-10-16T16:30:33.952 INFO:teuthology.orchestra.run.ovh075.stdout:        is_secure=False,
2018-10-16T16:30:33.952 INFO:teuthology.orchestra.run.ovh075.stdout:        calling_format = boto.s3.connection.OrdinaryCallingFormat(),
2018-10-16T16:30:33.953 INFO:teuthology.orchestra.run.ovh075.stdout:        )
2018-10-16T16:30:33.953 INFO:teuthology.orchestra.run.ovh075.stdout:bucket = conn.create_bucket('s3atest')
2018-10-16T16:30:33.953 INFO:teuthology.orchestra.run.ovh075.stdout:for bucket in conn.get_all_buckets():
2018-10-16T16:30:33.953 INFO:teuthology.orchestra.run.ovh075.stdout:        print bucket.name + "    " + bucket.creation_date
2018-10-16T16:30:33.954 INFO:teuthology.orchestra.run.ovh075:Running: '/home/ubuntu/cephtest/venv/bin/python /home/ubuntu/cephtest/create_bucket.py'
2018-10-16T16:30:34.775 INFO:teuthology.orchestra.run.ovh075.stderr:Traceback (most recent call last):
2018-10-16T16:30:34.775 INFO:teuthology.orchestra.run.ovh075.stderr:  File "/home/ubuntu/cephtest/create_bucket.py", line 15, in <module>
2018-10-16T16:30:34.775 INFO:teuthology.orchestra.run.ovh075.stderr:    bucket = conn.create_bucket('s3atest')
2018-10-16T16:30:34.776 INFO:teuthology.orchestra.run.ovh075.stderr:  File "/home/ubuntu/cephtest/venv/lib/python2.7/site-packages/boto/s3/connection.py", line 628, in create_bucket
2018-10-16T16:30:34.779 INFO:teuthology.orchestra.run.ovh075.stderr:    response.status, response.reason, body)
2018-10-16T16:30:34.779 INFO:teuthology.orchestra.run.ovh075.stderr:boto.exception.S3ResponseError: S3ResponseError: 416 Requested Range Not Satisfiable
2018-10-16T16:30:34.779 INFO:teuthology.orchestra.run.ovh075.stderr:<?xml version="1.0" encoding="UTF-8"?><Error><Code>InvalidRange</Code><BucketName>s3atest</BucketName><RequestId>tx000000000000000000001-005bc6122a-1037-default</RequestId><HostId>1037-default-default</HostId></Error>
2018-10-16T16:30:34.791 DEBUG:teuthology.orchestra.run:got remote process result: 1
2018-10-16T16:30:34.791 ERROR:teuthology.run_tasks:Saw exception from tasks.
Traceback (most recent call last):
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/run_tasks.py", line 89, in run_tasks
    manager.__enter__()
  File "/usr/lib/python2.7/contextlib.py", line 17, in __enter__
    return self.gen.next()
  File "/home/teuthworker/src/github.com_ceph_ceph_luminous/qa/tasks/s3a_hadoop.py", line 83, in task
    setup_user_bucket(rgw_node, dnsmasq_name, access_key, secret_key, bucket_name, testdir)
  File "/home/teuthworker/src/github.com_ceph_ceph_luminous/qa/tasks/s3a_hadoop.py", line 250, in setup_user_bucket
    '{testdir}/create_bucket.py'.format(testdir=testdir),
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/orchestra/remote.py", line 193, in run
    r = self._runner(client=self.ssh, name=self.shortname, **kwargs)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/orchestra/run.py", line 429, in run
    r.wait()
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/orchestra/run.py", line 161, in wait
    self._raise_for_status()
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/orchestra/run.py", line 183, in _raise_for_status
    node=self.hostname, label=self.label
CommandFailedError: Command failed on ovh075 with status 1: '/home/ubuntu/cephtest/venv/bin/python /home/ubuntu/cephtest/create_bucket.py'
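
For reference, here is a standalone version of create_bucket.py reconstructed from the log above (a sketch only: the endpoint comes from the test's dnsmasq setup, and the access_key/secret_key values are placeholders normally injected by the s3a_hadoop task; the original runs under Python 2 with the legacy boto package):

# Reconstruction of the qa create_bucket.py from the teuthology log.
# access_key/secret_key are placeholders; the task fills in real values.
import boto
import boto.s3.connection

access_key = 'EXAMPLE_ACCESS_KEY'  # placeholder, injected by the task
secret_key = 'EXAMPLE_SECRET_KEY'  # placeholder, injected by the task

conn = boto.connect_s3(
    aws_access_key_id=access_key,
    aws_secret_access_key=secret_key,
    host='s3.ceph.com',
    is_secure=False,
    calling_format=boto.s3.connection.OrdinaryCallingFormat(),
)
bucket = conn.create_bucket('s3atest')  # fails here with 416 InvalidRange
for bucket in conn.get_all_buckets():
    print(bucket.name + "    " + bucket.creation_date)
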
#1

Updated by Casey Bodley over 5 years ago

  • Project changed from Linux kernel client to rgw
  • Subject changed from create_bucket.py error in rgw to s3a-hadoop: fails to create bucket index pool
  • Assignee set to Vasu Kulkarni
  • Priority changed from Urgent to Normal

It looks like ceph-ansible isn't configuring the pg_num values correctly, and creation of the default.rgw.buckets.index pool fails:

2018-10-15 22:47:24.514045 7f2d6b924700  1 -- 158.69.65.139:0/2905298535 --> 158.69.65.139:6789/0 -- pool_op(create pool 0 auid 0 tid 708 name default.rgw.buckets.index v0) v4 -- 0x556700649b00 con 0
2018-10-15 22:47:25.065546 7f2d8ba1d700  1 -- 158.69.65.139:0/2905298535 <== mon.0 158.69.65.139:6789/0 9 ==== osd_map(39..39 src has 1..39) v3 ==== 244+0+0 (4066938438 0 0) 0x556700649b00 con 0x5567001e1000
2018-10-15 22:47:25.067621 7f2d8ba1d700  1 -- 158.69.65.139:0/2905298535 <== mon.0 158.69.65.139:6789/0 10 ==== pool_op_reply(tid 708 (34) Numerical result out of range v39) v1 ==== 43+0+0 (401380274 0 0) 0x55670022f180 con 0x5567001e1000
2018-10-15 22:47:25.067734 7f2d6b924700  0 rgw_init_ioctx ERROR: librados::Rados::pool_create returned (34) Numerical result out of range (this can be due to a pool or placement group misconfiguration, e.g. pg_num < pgp_num or mon_max_pg_per_osd exceeded)
2018-10-15 22:47:25.067774 7f2d6b924700 20 rgw_create_bucket returned ret=-34 bucket=s3atest[f1629ca2-82a4-450f-b64c-9e3a58db5393.4152.1]
2018-10-15 22:47:25.067797 7f2d6b924700  2 req 1:0.556873:s3:PUT /s3atest/:create_bucket:completing
2018-10-15 22:47:25.068042 7f2d6b924700  2 req 1:0.557133:s3:PUT /s3atest/:create_bucket:op status=-34
2018-10-15 22:47:25.068047 7f2d6b924700  2 req 1:0.557138:s3:PUT /s3atest/:create_bucket:http status=416
2018-10-15 22:47:25.068072 7f2d6b924700  1 ====== req done req=0x7f2d6b91df90 op status=-34 http_status=416 ======
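
The -34 is ERANGE from librados::Rados::pool_create, which rgw then surfaces as the HTTP 416 seen by boto. The same failure can be reproduced outside rgw; a minimal sketch using the python-rados bindings (assuming python-rados is installed and /etc/ceph/ceph.conf plus a client keyring point at the misconfigured cluster):

# Sketch: hit the same pool_create ERANGE path directly, without rgw.
import errno
import rados

cluster = rados.Rados(conffile='/etc/ceph/ceph.conf')
cluster.connect()
try:
    cluster.create_pool('default.rgw.buckets.index')
except rados.Error as e:
    # librados returns ERANGE (34) when pg limits are violated,
    # e.g. pg_num < pgp_num or mon_max_pg_per_osd exceeded.
    if getattr(e, 'errno', None) == errno.ERANGE:
        print('pool_create hit ERANGE: check pg_num/pgp_num and '
              'mon_max_pg_per_osd')
    raise
finally:
    cluster.shutdown()
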

Lowered to Normal priority because the job had previously been failing with:

Command failed on ovh004 with status 1: 'sudo python -c \'import shutil, sys; shutil.copyfileobj(sys.stdin, file(sys.argv[1], "wb"))\' /etc/resolv.conf'
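
(For context, that one-liner just streams stdin to the path given as its first argument; it relies on the Python 2 file() builtin. A Python 3 equivalent of what it does:)

# Copy stdin to the path in argv[1], as the quoted one-liner does.
import shutil
import sys

with open(sys.argv[1], 'wb') as out:
    shutil.copyfileobj(sys.stdin.buffer, out)
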

Vasu, can you please take a look?
