Bug #11994 (closed): "unknown op append_excl" in upgrade:giant-x-next-distro-basic-vps run

Added by Yuri Weinstein almost 9 years ago. Updated over 8 years ago.

Status: Resolved
Priority: Urgent
Assignee:
Category: -
Target version: -
% Done: 0%
Source: Q/A
Tags:
Backport:
Regression: No
Severity: 3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite: upgrade/firefly, upgrade/firefly-x, upgrade/giant-x, upgrade/hammer
Pull request ID:
Crash signature (v1):
Crash signature (v2):

Description

Run: http://pulpito.ceph.com/teuthology-2015-06-11_17:05:04-upgrade:giant-x-next-distro-basic-vps/
Jobs: 52 jobs
Logs for one: http://qa-proxy.ceph.com/teuthology/teuthology-2015-06-11_17:05:04-upgrade:giant-x-next-distro-basic-vps/929545/

2015-06-11T17:18:28.217 INFO:teuthology.orchestra.run.vpm042.stdout:Ign http://gitbuilder.ceph.com wheezy/main Translation-en
2015-06-11T17:18:28.218 INFO:teuthology.orchestra.run.vpm042.stdout:Ign http://gitbuilder.ceph.com wheezy/main Translation-en_US
2015-06-11T17:18:28.218 INFO:teuthology.orchestra.run.vpm042.stdout:Ign http://gitbuilder.ceph.com wheezy/main Translation-en
2015-06-11T17:18:28.837 INFO:teuthology.orchestra.run.vpm042.stdout:Fetched 13.3 kB in 1s (8,259 B/s)
2015-06-11T17:18:29.337 INFO:teuthology.orchestra.run.vpm042.stderr:pool 'unique_pool_0' created
2015-06-11T17:18:29.422 INFO:teuthology.orchestra.run.vpm069:Running: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
2015-06-11T17:18:29.576 INFO:tasks.rados.rados.0.vpm069.stdout:adding op weight snap_remove -> 50
2015-06-11T17:18:29.577 INFO:tasks.rados.rados.0.vpm069.stdout:adding op weight snap_create -> 50
2015-06-11T17:18:29.577 INFO:tasks.rados.rados.0.vpm069.stdout:adding op weight rollback -> 50
2015-06-11T17:18:29.577 INFO:tasks.rados.rados.0.vpm069.stderr:unknown op append_excl
2015-06-11T17:18:29.593 ERROR:teuthology.parallel:Exception in parallel execution
Traceback (most recent call last):
  File "/home/teuthworker/src/teuthology_master/teuthology/parallel.py", line 82, in __exit__
    for result in self:
  File "/home/teuthworker/src/teuthology_master/teuthology/parallel.py", line 101, in next
    resurrect_traceback(result)
  File "/home/teuthworker/src/teuthology_master/teuthology/parallel.py", line 19, in capture_traceback
    return func(*args, **kwargs)
  File "/home/teuthworker/src/teuthology_master/teuthology/task/parallel.py", line 50, in _run_spawned
    mgr = run_tasks.run_one_task(taskname, ctx=ctx, config=config)
  File "/home/teuthworker/src/teuthology_master/teuthology/run_tasks.py", line 41, in run_one_task
    return fn(**kwargs)
  File "/home/teuthworker/src/teuthology_master/teuthology/task/sequential.py", line 55, in task
    mgr.__exit__(*exc_info)
  File "/usr/lib/python2.7/contextlib.py", line 24, in __exit__
    self.gen.next()
  File "/var/lib/teuthworker/src/ceph-qa-suite_next/tasks/rados.py", line 245, in task
    running.get()
  File "/usr/lib/python2.7/dist-packages/gevent/greenlet.py", line 331, in get
    raise self._exception
CommandFailedError: Command failed on vpm069 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
2015-06-11T17:18:29.612 ERROR:teuthology.run_tasks:Saw exception from tasks.
Traceback (most recent call last):
  File "/home/teuthworker/src/teuthology_master/teuthology/run_tasks.py", line 53, in run_tasks
    manager = run_one_task(taskname, ctx=ctx, config=config)
  File "/home/teuthworker/src/teuthology_master/teuthology/run_tasks.py", line 41, in run_one_task
    return fn(**kwargs)
  File "/home/teuthworker/src/teuthology_master/teuthology/task/parallel.py", line 43, in task
    p.spawn(_run_spawned, ctx, confg, taskname)
  File "/home/teuthworker/src/teuthology_master/teuthology/parallel.py", line 89, in __exit__
    raise
CommandFailedError: Command failed on vpm069 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
2015-06-11T17:18:29.647 ERROR:teuthology.run_tasks: Sentry event: http://sentry.ceph.com/sepia/teuthology/search?q=c1bc8c99606c41e5a00306b1488eb049
CommandFailedError: Command failed on vpm069 with status 1: 'CEPH_CLIENT_ID=0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph_test_rados --ec-pool --max-ops 4000 --objects 50 --max-in-flight 16 --size 4000000 --min-stride-size 400000 --max-stride-size 800000 --max-seconds 0 --op snap_remove 50 --op snap_create 50 --op rollback 50 --op append_excl 50 --op setattr 25 --op read 100 --op copy_from 50 --op write 0 --op write_excl 0 --op rmattr 25 --op append 50 --op delete 50 --pool unique_pool_0'
2015-06-11T17:18:29.647 DEBUG:teuthology.run_tasks:Unwinding manager ceph
2015-06-11T17:18:29.648 INFO:teuthology.orchestra.run.vpm069:Running: 'adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph pg dump --format json'
Actions #1

Updated by Sage Weil almost 9 years ago

  • Assignee set to Sage Weil
Actions #2

Updated by Sage Weil almost 9 years ago

  • Priority changed from Normal to Urgent
Actions #3

Updated by Sage Weil almost 9 years ago

  • Status changed from New to In Progress
Actions #4

Updated by Samuel Just almost 9 years ago

It's running the new version of rados.py with an old version of ceph_test_rados.
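
In other words, the upgrade job installs the older binary but drives it with the newer task code. A simplified sketch of that mismatch, assuming (not quoting) how tasks/rados.py turns its op_weights config into command-line flags: the task appends '--op NAME WEIGHT' for every configured op and has no knowledge of which ops the installed ceph_test_rados supports, so a newer name like append_excl or write_excl reaches the old binary unchanged. The function name and config dict below are illustrative, not the real ceph-qa-suite code.

# Simplified, hypothetical sketch of the task-side behaviour (not the
# exact tasks/rados.py code): every configured op is passed through
# verbatim, whether or not the installed binary understands it.
def build_test_rados_args(op_weights, pool):
    args = ["ceph_test_rados", "--ec-pool", "--pool", pool]
    for name, weight in op_weights.items():
        args.extend(["--op", name, str(weight)])
    return args

# Example: the newer suite config includes append_excl; an old binary
# will reject it with "unknown op append_excl".
print(build_test_rados_args({"snap_create": 50, "append_excl": 50}, "unique_pool_0"))
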

Actions #6

Updated by Yuri Weinstein almost 9 years ago

Run: http://pulpito.ceph.com/teuthology-2015-06-16_17:05:01-upgrade:giant-x-next-distro-basic-multi/
Jobs: ['936330', '936331', '936332', '936334', '936344', '936346', '936348', '936359', '936363', '936365', '936367', '936369', '936377', '936379', '936382']

Actions #7

Updated by Yuri Weinstein almost 9 years ago

  • ceph-qa-suite upgrade/firefly-x added

Run: http://pulpito.ceph.com/teuthology-2015-06-21_17:18:01-upgrade:firefly-x-next-distro-basic-vps/
Jobs: ['943417', '943418', '943422', '943426', '943429', '943432', '943434', '943436', '943439', '943441', '943443', '943448', '943449', '943450', '943456', '943457', '943458', '943461', '943463', '943465', '943467', '943468', '943471', '943472', '943474', '943475']

Actions #8

Updated by Yuri Weinstein almost 9 years ago

  • Release set to hammer
  • ceph-qa-suite upgrade/hammer added

In hammer:
Run: http://pulpito.ceph.com/teuthology-2015-06-24_07:53:40-upgrade:hammer-hammer---basic-vps/
Jobs: 32
Logs for one: http://qa-proxy.ceph.com/teuthology/teuthology-2015-06-24_07:53:40-upgrade:hammer-hammer---basic-vps/947780/

2015-06-24T08:49:05.637 INFO:teuthology.task.install:Package version is 0.94.2-12-g78d894a-1precise
2015-06-24T08:49:05.638 INFO:teuthology.orchestra.run.vpm048:Running: 'echo deb http://gitbuilder.ceph.com/ceph-deb-precise-x86_64-basic/ref/hammer precise main | sudo tee /etc/apt/sources.list.d/ceph.list'
2015-06-24T08:49:05.695 INFO:teuthology.orchestra.run.vpm048:Running: 'sudo apt-get update'
2015-06-24T08:49:05.697 INFO:tasks.rados.rados.0.vpm156.stdout:adding op weight snap_remove -> 50
2015-06-24T08:49:05.697 INFO:tasks.rados.rados.0.vpm156.stdout:adding op weight snap_create -> 50
2015-06-24T08:49:05.698 INFO:tasks.rados.rados.0.vpm156.stdout:adding op weight rollback -> 50
2015-06-24T08:49:05.698 INFO:tasks.rados.rados.0.vpm156.stdout:adding op weight read -> 100
2015-06-24T08:49:05.698 INFO:tasks.rados.rados.0.vpm156.stdout:adding op weight write -> 50
2015-06-24T08:49:05.698 INFO:tasks.rados.rados.0.vpm156.stderr:unknown op write_excl
2015-06-24T08:49:05.717 ERROR:teuthology.parallel:Exception in parallel execution

Actions #11

Updated by Yuri Weinstein almost 9 years ago

Run: http://pulpito.ceph.com/teuthology-2015-07-12_17:18:02-upgrade:firefly-x-next-distro-basic-vps/
Jobs: ['971059', '971064', '971068', '971077', '971082', '971086']

2015-07-13T08:50:05.004 INFO:tasks.rados.rados.0.vpm003.stdout:adding op weight rollback -> 50
2015-07-13T08:50:05.004 INFO:tasks.rados.rados.0.vpm003.stderr:unknown op append_excl
Actions #12

Updated by Sage Weil almost 9 years ago

  • Status changed from In Progress to Resolved

Yuri Weinstein wrote:

Run: http://pulpito.ceph.com/teuthology-2015-07-12_17:18:02-upgrade:firefly-x-next-distro-basic-vps/
Jobs: ['971059', '971064', '971068', '971077', '971082', '971086']

[...]

firefly can't go to next... have to stop at hammer first.
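
A tiny illustrative check of the constraint described here, under the assumption that a run may only take individually supported hops and that firefly must stop at hammer before moving to next. The SUPPORTED_HOPS set below is hypothetical and only encodes that assumption; it is not Ceph's actual upgrade support matrix.

# Hypothetical sketch only: encodes the rule "firefly has to stop at
# hammer before next" as a set of allowed single hops.
SUPPORTED_HOPS = {("firefly", "hammer"), ("hammer", "next")}  # assumed set

def check_path(path):
    """Raise if any hop in the upgrade path is not individually supported."""
    for src, dst in zip(path, path[1:]):
        if (src, dst) not in SUPPORTED_HOPS:
            raise ValueError("unsupported upgrade hop: %s -> %s" % (src, dst))

check_path(["firefly", "hammer", "next"])   # ok under the assumed rule
# check_path(["firefly", "next"])           # would raise: has to stop at hammer
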

Actions #13

Updated by Yuri Weinstein almost 9 years ago

Sage - shall we stop running firefly-x against next?

Actions #14

Updated by Yuri Weinstein almost 9 years ago

FYI: removed dumpling-firefly-x and firefly-x upgrades to next from the nightlies schedules.

Actions #15

Updated by Yuri Weinstein almost 9 years ago

  • Status changed from Resolved to New

Reopened, as it still shows up in runs:

Run: http://pulpito.ceph.com/teuthology-2015-07-14_17:05:12-upgrade:giant-x-next-distro-basic-vps/
Jobs: ['973754', '973765', '973774', '973784', '973786', '973792', '973797', '973799', '973805', '973815', '973821', '973825', '973830', '973831', '973834', '973837', '973838', '973841']

Actions #16

Updated by Sage Weil over 8 years ago

  • Status changed from New to Resolved