Bug #11994


"unknown op append_excl" in upgrade:giant-x-next-distro-basic-vps run

Added by Yuri Weinstein almost 9 years ago. Updated over 8 years ago.

Status: Resolved
Priority: Urgent
Assignee:
Category: -
Target version: -
% Done: 0%
Source: Q/A
Tags:
Backport:
Regression: No
Severity: 3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite: upgrade/firefly, upgrade/firefly-x, upgrade/giant-x, upgrade/hammer
Pull request ID:
Crash signature (v1):
Crash signature (v2):

Description

Run: http://pulpito.ceph.com/teuthology-2015-06-11_17:05:04-upgrade:giant-x-next-distro-basic-vps/
Jobs: 52
Logs for one: http://qa-proxy.ceph.com/teuthology/teuthology-2015-06-11_17:05:04-upgrade:giant-x-next-distro-basic-vps/929545/
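
The failing command, shown in full in the log excerpt below, is assembled by the suite's rados task from a mapping of op names to weights. As a rough illustration (not the actual tasks/rados.py code), here is a minimal Python sketch of that translation; the helper name build_rados_args and the exact config shape are assumptions:

# Hypothetical sketch of how a teuthology-style rados task could turn an
# op_weights mapping into the `--op NAME WEIGHT` arguments seen in the log.
# The function name and config shape are illustrative assumptions.
def build_rados_args(op_weights, pool='unique_pool_0'):
    args = [
        'ceph_test_rados',
        '--ec-pool',
        '--max-ops', '4000',
        '--objects', '50',
        '--max-in-flight', '16',
        '--size', '4000000',
        '--min-stride-size', '400000',
        '--max-stride-size', '800000',
        '--max-seconds', '0',
    ]
    # Each configured op becomes an `--op NAME WEIGHT` triple; the binary
    # under test must recognize every NAME, or it exits with status 1.
    for op, weight in op_weights.items():
        args.extend(['--op', op, str(weight)])
    args.extend(['--pool', pool])
    return args

# Weights implied by the command line in the log below.
op_weights = {
    'snap_remove': 50, 'snap_create': 50, 'rollback': 50,
    'append_excl': 50, 'setattr': 25, 'read': 100, 'copy_from': 50,
    'write': 0, 'write_excl': 0, 'rmattr': 25, 'append': 50, 'delete': 50,
}
print(' '.join(build_rados_args(op_weights)))

Every name passed via --op must be recognized by the ceph_test_rados binary under test; in these jobs the next-branch suite passes append_excl (and write_excl) to a giant-era binary, which rejects the first name it does not know.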

2015-06-11T17:18:28.217 INFO:teuthology.orchestra.run.vpm042.stdout:Ign http://gitbuilder.ceph.com wheezy/main Translation-en
2015-06-11T17:18:28.218 INFO:teuthology.orchestra.run.vpm042.stdout:Ign http://gitbuilder.ceph.com wheezy/main Translation-en_US
2015-06-11T17:18:28.218 INFO:teuthology.orchestra.run.vpm042.stdout:Ign http://gitbuilder.ceph.com wheezy/main Translation-en
2015-06-11T17:18:28.837 INFO:teuthology.orchestra.run.vpm042.stdout:Fetched 13.3 kB in 1s (8,259 B/s)
2015-06-11T17:18:29.337 INFO:teuthology.orchestra.run.vpm042.stderr:pool 'unique_pool_0' created
2015-06-11T17:18:29.422 INFO:teuthology.orchestra.run.vpm069:Running: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
2015-06-11T17:18:29.576 INFO:tasks.rados.rados.0.vpm069.stdout:adding op weight snap_remove -> 50
2015-06-11T17:18:29.577 INFO:tasks.rados.rados.0.vpm069.stdout:adding op weight snap_create -> 50
2015-06-11T17:18:29.577 INFO:tasks.rados.rados.0.vpm069.stdout:adding op weight rollback -> 50
2015-06-11T17:18:29.577 INFO:tasks.rados.rados.0.vpm069.stderr:unknown op append_excl
2015-06-11T17:18:29.593 ERROR:teuthology.parallel:Exception in parallel execution
Traceback (most recent call last):
  File "/home/teuthworker/src/teuthology_master/teuthology/parallel.py", line 82, in __exit__
    for result in self:
  File "/home/teuthworker/src/teuthology_master/teuthology/parallel.py", line 101, in next
    resurrect_traceback(result)
  File "/home/teuthworker/src/teuthology_master/teuthology/parallel.py", line 19, in capture_traceback
    return func(*args, **kwargs)
  File "/home/teuthworker/src/teuthology_master/teuthology/task/parallel.py", line 50, in _run_spawned
    mgr = run_tasks.run_one_task(taskname, ctx=ctx, config=config)
  File "/home/teuthworker/src/teuthology_master/teuthology/run_tasks.py", line 41, in run_one_task
    return fn(**kwargs)
  File "/home/teuthworker/src/teuthology_master/teuthology/task/sequential.py", line 55, in task
    mgr.__exit__(*exc_info)
  File "/usr/lib/python2.7/contextlib.py", line 24, in __exit__
    self.gen.next()
  File "/var/lib/teuthworker/src/ceph-qa-suite_next/tasks/rados.py", line 245, in task
    running.get()
  File "/usr/lib/python2.7/dist-packages/gevent/greenlet.py", line 331, in get
    raise self._exception
CommandFailedError: Command failed on vpm069 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
2015-06-11T17:18:29.612 ERROR:teuthology.run_tasks:Saw exception from tasks.
Traceback (most recent call last):
  File "/home/teuthworker/src/teuthology_master/teuthology/run_tasks.py", line 53, in run_tasks
    manager = run_one_task(taskname, ctx=ctx, config=config)
  File "/home/teuthworker/src/teuthology_master/teuthology/run_tasks.py", line 41, in run_one_task
    return fn(**kwargs)
  File "/home/teuthworker/src/teuthology_master/teuthology/task/parallel.py", line 43, in task
    p.spawn(_run_spawned, ctx, confg, taskname)
  File "/home/teuthworker/src/teuthology_master/teuthology/parallel.py", line 89, in __exit__
    raise
CommandFailedError: Command failed on vpm069 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
2015-06-11T17:18:29.647 ERROR:teuthology.run_tasks: Sentry event: http://sentry.ceph.com/sepia/teuthology/search?q=c1bc8c99606c41e5a00306b1488eb049
CommandFailedError: Command failed on vpm069 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
2015-06-11T17:18:29.647 DEBUG:teuthology.run_tasks:Unwinding manager ceph
2015-06-11T17:18:29.648 INFO:teuthology.orchestra.run.vpm069:Running: 'adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph pg dump --format json'
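
For context, the "unknown op append_excl" line comes from ceph_test_rados's own option parsing: it echoes "adding op weight NAME -> W" to stdout for each recognized op and bails out on the first name it does not know. ceph_test_rados is C++; the following is a minimal Python sketch of the same validation behavior, with an op table that, like the giant binary presumably did, lacks append_excl and write_excl. All names here are illustrative, not the binary's actual symbols.

import sys

# Ops a giant-era test binary might recognize; append_excl/write_excl are
# deliberately absent to reproduce the failure mode seen in the log.
KNOWN_OPS = {
    'read', 'write', 'delete', 'snap_create', 'snap_remove',
    'rollback', 'setattr', 'rmattr', 'append', 'copy_from',
}

def parse_op_weights(argv):
    """Scan `--op NAME WEIGHT` triples, mirroring the log output above."""
    weights = {}
    i = 0
    while i < len(argv):
        if argv[i] == '--op':
            name, weight = argv[i + 1], int(argv[i + 2])
            if name not in KNOWN_OPS:
                # Stop at the first unrecognized op, as the log shows.
                sys.stderr.write('unknown op %s\n' % name)
                sys.exit(1)
            print('adding op weight %s -> %d' % (name, weight))
            weights[name] = weight
            i += 3
        else:
            i += 1
    return weights

if __name__ == '__main__':
    parse_op_weights(['--op', 'snap_remove', '50',
                      '--op', 'snap_create', '50',
                      '--op', 'rollback', '50',
                      '--op', 'append_excl', '50'])

Since the op list comes from the next-branch qa suite (note /var/lib/teuthworker/src/ceph-qa-suite_next/tasks/rados.py in the traceback) while the binary under test comes from giant, this looks like a version mismatch between suite and binary rather than a defect in either alone.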