Bug #15316
Updated by Loïc Dachary about 8 years ago
(rgw frontend: apache) teuthology does not check whether apache2 is installed before using it. If apache2 is missing, the run fails with the log below, which does not point at the real cause, so you cannot find the information you need. <pre>
2016-03-29T09:43:05.889 DEBUG:teuthology.run:Config:
archive_path: /home/teuthworker/archive/teuthology-2016-03-28_19:54:09-rgw:multifs-v0.94.6---basic-plana/286
branch: v0.94.6
description: rgw:multifs/{overrides.yaml clusters/fixed-2.yaml frontend/apache.yaml fs/xfs.yaml rgw_pool_type/ec.yaml tasks/rgw_multipart_upload.yaml}
email: null
job_id: '286'
last_in_suite: false
machine_type: plana
name: teuthology-2016-03-28_19:54:09-rgw:multifs-v0.94.6---basic-plana
nuke-on-error: true
os_type: ubuntu
overrides:
  admin_socket:
    branch: v0.94.6
  ceph:
    conf:
      client:
        debug rgw: 20
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
      osd:
        debug filestore: 20
        debug journal: 20
        debug ms: 1
        debug osd: 20
        osd sloppy crc: true
    fs: xfs
    log-whitelist:
    - slow request
    sha1: e832001feaf8c176593e0325c8298e3f16dfb403
  ceph-deploy:
    branch:
      dev-commit: e832001feaf8c176593e0325c8298e3f16dfb403
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon:
        debug mon: 1
        debug ms: 20
        debug paxos: 20
        osd default pool size: 2
  install:
    ceph:
      sha1: e832001feaf8c176593e0325c8298e3f16dfb403
  rgw:
    ec-data-pool: true
    frontend: apache
  s3tests:
    slow_backend: true
  workunit:
    sha1: e832001feaf8c176593e0325c8298e3f16dfb403
owner: scheduled_teuthology@scheduler
priority: 1000
roles:
- - mon.a
  - mon.c
  - osd.0
  - osd.1
  - osd.2
  - client.0
- - mon.b
  - osd.3
  - osd.4
  - osd.5
  - client.1
sha1: e832001feaf8c176593e0325c8298e3f16dfb403
suite: rgw:multifs
suite_branch: hammer
suite_path: /home/foo2/src/ceph-qa-suite_hammer
suite_sha1: 789be166faa196901c92745fe626b282c0ece6a7
tasks:
- install: null
- ceph: null
- rgw:
  - client.0
- workunit:
    clients:
      client.0:
      - rgw/s3_multipart_upload.pl
teuthology_branch: master
tube: plana
verbose: false
worker_log: /home/teuthworker/archive/worker_logs/worker.plana.10905
......
2016-03-29T09:55:36.020 INFO:teuthology.run_tasks:Running task rgw...
2016-03-29T09:55:36.022 INFO:tasks.rgw:Using apache as radosgw frontend
2016-03-29T09:55:36.023 INFO:tasks.rgw:Creating apache directories...
2016-03-29T09:55:36.023 INFO:teuthology.orchestra.run.plana36:Running: 'mkdir -p /home/ubuntu/cephtest/apache/htdocs.client.0 /home/ubuntu/cephtest/apache/tmp.client.0/fastcgi_sock && mkdir /home/ubuntu/cephtest/archive/apache.client.0'
2016-03-29T09:55:36.036 DEBUG:tasks.rgw:In rgw.configure_regions_and_zones() and regions is None. Bailing
2016-03-29T09:55:36.036 INFO:tasks.rgw:Configuring users...
2016-03-29T09:55:36.037 INFO:tasks.rgw:creating data pools
2016-03-29T09:55:36.037 INFO:teuthology.orchestra.run.plana36:Running: 'ceph osd erasure-code-profile set client.0 k=2 m=1 ruleset-failure-domain=osd'
2016-03-29T09:55:37.057 INFO:teuthology.orchestra.run.plana36:Running: 'ceph osd pool create .rgw.buckets 64 64 erasure client.0'
2016-03-29T09:55:39.551 INFO:teuthology.orchestra.run.plana36.stderr:pool '.rgw.buckets' created
2016-03-29T09:55:39.562 INFO:tasks.rgw:Shipping apache config and rgw.fcgi...
2016-03-29T09:55:39.563 INFO:teuthology.orchestra.run.plana36:Running: 'sudo lsb_release -is'
2016-03-29T09:55:39.630 DEBUG:teuthology.misc:System to be installed: Ubuntu
2016-03-29T09:55:39.631 INFO:teuthology.orchestra.run.plana36:Running: 'python -c \'import shutil, sys; shutil.copyfileobj(sys.stdin, file(sys.argv[1], "wb"))\' /home/ubuntu/cephtest/apache/apache.client.0.conf'
2016-03-29T09:55:39.661 INFO:teuthology.orchestra.run.plana36:Running: 'python -c \'import shutil, sys; shutil.copyfileobj(sys.stdin, file(sys.argv[1], "wb"))\' /home/ubuntu/cephtest/apache/htdocs.client.0/rgw.fcgi'
2016-03-29T09:55:39.795 INFO:teuthology.orchestra.run.plana36:Running: 'chmod a=rx /home/ubuntu/cephtest/apache/htdocs.client.0/rgw.fcgi'
2016-03-29T09:55:39.853 INFO:tasks.rgw:Starting rgw...
2016-03-29T09:55:39.854 INFO:tasks.rgw:rgw client.0 config is {}
2016-03-29T09:55:39.855 INFO:tasks.rgw:client client.0 is id 0
2016-03-29T09:55:39.855 INFO:tasks.rgw.client.0:Restarting daemon
2016-03-29T09:55:39.856 INFO:teuthology.orchestra.run.plana36:Running: 'sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper term radosgw --rgw-socket-path /home/ubuntu/cephtest/apache/tmp.client.0/fastcgi_sock/rgw_sock -n client.0 -k /etc/ceph/ceph.client.0.keyring --log-file /var/log/ceph/rgw.client.0.log --rgw_ops_log_socket_path /home/ubuntu/cephtest/rgw.opslog.client.0.sock --foreground | sudo tee /var/log/ceph/rgw.client.0.stdout 2>&1'
2016-03-29T09:55:39.945 INFO:tasks.rgw.client.0:Started
2016-03-29T09:55:39.945 INFO:tasks.rgw:Starting apache...
2016-03-29T09:55:39.946 INFO:teuthology.orchestra.run.plana36:Running: 'sudo lsb_release -is'
2016-03-29T09:55:40.022 DEBUG:teuthology.misc:System to be installed: Ubuntu
2016-03-29T09:55:40.022 INFO:teuthology.orchestra.run.plana36:Running: 'adjust-ulimits daemon-helper kill apache2 -X -f /home/ubuntu/cephtest/apache/apache.client.0.conf'
2016-03-29T09:55:40.033 INFO:teuthology.run_tasks:Running task workunit...
2016-03-29T09:55:40.034 INFO:tasks.workunit:Pulling workunits from ref e832001feaf8c176593e0325c8298e3f16dfb403
2016-03-29T09:55:40.034 INFO:tasks.workunit:Making a separate scratch dir for every client...
2016-03-29T09:55:40.035 DEBUG:tasks.workunit:getting remote for 0 role client.0
2016-03-29T09:55:40.035 INFO:teuthology.orchestra.run.plana36:Running: 'stat -- /home/ubuntu/cephtest/mnt.0'
2016-03-29T09:55:40.047 INFO:teuthology.orchestra.run.plana36.stderr:stat: cannot stat ‘/home/ubuntu/cephtest/mnt.0’: No such file or directory
2016-03-29T09:55:40.048 INFO:teuthology.orchestra.run.plana36:Running: 'mkdir -- /home/ubuntu/cephtest/mnt.0'
2016-03-29T09:55:40.139 INFO:tasks.workunit:Created dir /home/ubuntu/cephtest/mnt.0
2016-03-29T09:55:40.140 INFO:teuthology.orchestra.run.plana36:Running: 'cd -- /home/ubuntu/cephtest/mnt.0 && mkdir -- client.0'
2016-03-29T09:55:40.235 INFO:teuthology.orchestra.run.plana36:Running: 'mkdir -- /home/ubuntu/cephtest/workunit.client.0 && git archive --remote=git://git.ceph.com/ceph.git e832001feaf8c176593e0325c8298e3f16dfb403:qa/workunits | tar -C /home/ubuntu/cephtest/workunit.client.0 -x -f-'
2016-03-29T09:55:40.287 INFO:tasks.rgw.client.0.plana36.stderr:Traceback (most recent call last):
2016-03-29T09:55:40.288 INFO:tasks.rgw.client.0.plana36.stderr:  File "/usr/bin/daemon-helper", line 66, in <module>
2016-03-29T09:55:40.288 INFO:tasks.rgw.client.0.plana36.stderr:    preexec_fn=os.setsid,
2016-03-29T09:55:40.288 INFO:tasks.rgw.client.0.plana36.stderr:  File "/usr/lib/python2.7/subprocess.py", line 710, in __init__
2016-03-29T09:55:40.289 INFO:tasks.rgw.client.0.plana36.stderr:    errread, errwrite)
2016-03-29T09:55:40.289 INFO:tasks.rgw.client.0.plana36.stderr:  File "/usr/lib/python2.7/subprocess.py", line 1327, in _execute_child
2016-03-29T09:55:40.290 INFO:tasks.rgw.client.0.plana36.stderr:    raise child_exception
2016-03-29T09:55:40.291 INFO:tasks.rgw.client.0.plana36.stderr:OSError: [Errno 2] No such file or directory
2016-03-29T09:55:41.260 INFO:teuthology.orchestra.run.plana36:Running: "cd -- /home/ubuntu/cephtest/workunit.client.0 && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\\0' >/home/ubuntu/cephtest/workunits.list.client.0"
2016-03-29T09:55:41.469 INFO:tasks.workunit.client.0.plana36.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done
2016-03-29T09:55:41.471 INFO:tasks.workunit.client.0.plana36.stdout:make[1]: Entering directory `/home/ubuntu/cephtest/workunit.client.0/direct_io'
2016-03-29T09:55:41.472 INFO:tasks.workunit.client.0.plana36.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test
2016-03-29T09:55:48.481 INFO:tasks.workunit.client.0.plana36.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io
2016-03-29T09:55:48.958 INFO:tasks.workunit.client.0.plana36.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read
2016-03-29T09:55:49.025 INFO:tasks.workunit.client.0.plana36.stdout:make[1]: Leaving directory `/home/ubuntu/cephtest/workunit.client.0/direct_io'
2016-03-29T09:55:49.027 INFO:tasks.workunit.client.0.plana36.stdout:make[1]: Entering directory `/home/ubuntu/cephtest/workunit.client.0/fs'
2016-03-29T09:55:49.027 INFO:tasks.workunit.client.0.plana36.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc
2016-03-29T09:55:49.080 INFO:tasks.workunit.client.0.plana36.stdout:make[1]: Leaving directory `/home/ubuntu/cephtest/workunit.client.0/fs'
2016-03-29T09:55:49.180 INFO:tasks.workunit:Running workunits matching rgw/s3_multipart_upload.pl on client.0...
2016-03-29T09:55:49.181 INFO:tasks.workunit:Running workunit rgw/s3_multipart_upload.pl...
2016-03-29T09:55:49.182 INFO:teuthology.orchestra.run.plana36:Running (workunit test rgw/s3_multipart_upload.pl): 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=e832001feaf8c176593e0325c8298e3f16dfb403 TESTDIR="/home/ubuntu/cephtest" CEPH_ID="0" PATH=$PATH:/usr/sbin adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/workunit.client.0/rgw/s3_multipart_upload.pl'
2016-03-29T09:55:49.276 INFO:tasks.workunit.client.0.plana36.stderr:Can't locate Amazon/S3.pm in @INC (you may need to install the Amazon::S3 module) (@INC contains: /etc/perl /usr/local/lib/perl/5.18.2 /usr/local/share/perl/5.18.2 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.18 /usr/share/perl/5.18 /usr/local/lib/site_perl .) at /home/ubuntu/cephtest/workunit.client.0/rgw/s3_multipart_upload.pl line 30.
2016-03-29T09:55:49.276 INFO:tasks.workunit.client.0.plana36.stderr:BEGIN failed--compilation aborted at /home/ubuntu/cephtest/workunit.client.0/rgw/s3_multipart_upload.pl line 30.
2016-03-29T09:55:49.277 INFO:tasks.workunit:Stopping ['rgw/s3_multipart_upload.pl'] on client.0...
2016-03-29T09:55:49.277 INFO:teuthology.orchestra.run.plana36:Running: 'rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/workunit.client.0 /home/ubuntu/cephtest/clone'
2016-03-29T09:55:49.307 ERROR:teuthology.parallel:Exception in parallel execution
Traceback (most recent call last):
  File "/home/foo2/src/teuthology_master/teuthology/parallel.py", line 82, in __exit__
    for result in self:
  File "/home/foo2/src/teuthology_master/teuthology/parallel.py", line 101, in next
    resurrect_traceback(result)
  File "/home/foo2/src/teuthology_master/teuthology/parallel.py", line 19, in capture_traceback
    return func(*args, **kwargs)
  File "/home/foo2/src/ceph-qa-suite_hammer/tasks/workunit.py", line 385, in _run_tests
    label="workunit test {workunit}".format(workunit=workunit)
  File "/home/foo2/src/teuthology_master/teuthology/orchestra/remote.py", line 156, in run
    r = self._runner(client=self.ssh, name=self.shortname, **kwargs)
  File "/home/foo2/src/teuthology_master/teuthology/orchestra/run.py", line 378, in run
    r.wait()
  File "/home/foo2/src/teuthology_master/teuthology/orchestra/run.py", line 114, in wait
    label=self.label)
CommandFailedError: Command failed (workunit test rgw/s3_multipart_upload.pl) on plana36 with status 2: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=e832001feaf8c176593e0325c8298e3f16dfb403 TESTDIR="/home/ubuntu/cephtest" CEPH_ID="0" PATH=$PATH:/usr/sbin adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/workunit.client.0/rgw/s3_multipart_upload.pl'
2016-03-29T09:55:49.309 ERROR:teuthology.run_tasks:Saw exception from tasks.
</pre>
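The root cause is that the rgw task launches apache2 via daemon-helper without first checking that it exists, so subprocess raises a bare OSError: [Errno 2] No such file or directory and the run continues until an unrelated workunit failure. A pre-flight check before "Starting apache..." would surface the real problem. The sketch below is my illustration, not the actual teuthology code: the helper names are hypothetical, and it probes locally with `shutil.which`, whereas teuthology would run the equivalent check on the remote test node.

```python
import shutil


def find_apache_binary():
    """Return the path of an installed Apache binary, or None.

    Debian/Ubuntu ship the daemon as 'apache2'; RPM-based
    distros use 'httpd'.
    """
    for name in ("apache2", "httpd"):
        path = shutil.which(name)
        if path:
            return path
    return None


def ensure_apache_installed():
    """Fail fast with a clear message instead of the opaque
    OSError: [Errno 2] raised later inside daemon-helper."""
    binary = find_apache_binary()
    if binary is None:
        raise RuntimeError(
            "apache frontend requested but no apache2/httpd binary "
            "was found on the test node; install apache2 first"
        )
    return binary
```

With a check like this, the teuthology log would contain one explicit error naming apache2 at the point the frontend starts, instead of a traceback from daemon-helper plus a misleading Amazon::S3 failure much later.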