Bug #62449


test/cls_2pc_queue: TestCls2PCQueue.MultiProducer and TestCls2PCQueue.AsyncConsumer failure

Added by Laura Flores 9 months ago. Updated 6 months ago.

Status: Pending Backport
Priority: Normal
Target version: -
% Done: 0%
Source:
Tags: notifications backport_processed
Backport: reef
Regression: No
Severity: 3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite:
Pull request ID:
Crash signature (v1):
Crash signature (v2):

Description

/a/yuriw-2023-08-11_02:49:40-rados-wip-yuri4-testing-2023-08-10-1739-distro-default-smithi/7367069

2023-08-11T10:27:29.321 INFO:tasks.workunit.client.0.smithi088.stdout:[ RUN      ] TestCls2PCQueue.MultiProducer
2023-08-11T10:27:30.684 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:30 smithi196 ceph-mon[163147]: osdmap e866: 8 total, 8 up, 8 in
2023-08-11T10:27:30.684 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:30 smithi196 ceph-mon[163147]: pgmap v972: 105 pgs: 105 active+clean; 583 KiB data, 2.9 GiB used, 712 GiB / 715 GiB avail
2023-08-11T10:27:30.787 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:30 smithi088 ceph-mon[178644]: osdmap e866: 8 total, 8 up, 8 in
2023-08-11T10:27:30.788 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:30 smithi088 ceph-mon[178644]: pgmap v972: 105 pgs: 105 active+clean; 583 KiB data, 2.9 GiB used, 712 GiB / 715 GiB avail
2023-08-11T10:27:30.788 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:30 smithi088 ceph-mon[182061]: osdmap e866: 8 total, 8 up, 8 in
2023-08-11T10:27:30.789 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:30 smithi088 ceph-mon[182061]: pgmap v972: 105 pgs: 105 active+clean; 583 KiB data, 2.9 GiB used, 712 GiB / 715 GiB avail
2023-08-11T10:27:31.364 INFO:tasks.workunit.client.0.smithi088.stdout:/home/jenkins-build/build/workspace/ceph-dev-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/17.2.6-882-gfd55b450/rpm/el8/BUILD/ceph-17.2.6-882-gfd55b450/src/test/cls_2pc_queue/test_cls_2pc_queue.cc:720: Failure
2023-08-11T10:27:31.364 INFO:tasks.workunit.client.0.smithi088.stdout:Expected equality of these values:
2023-08-11T10:27:31.364 INFO:tasks.workunit.client.0.smithi088.stdout:  0
2023-08-11T10:27:31.364 INFO:tasks.workunit.client.0.smithi088.stdout:  ioctx.operate(queue_name, &op)
2023-08-11T10:27:31.365 INFO:tasks.workunit.client.0.smithi088.stdout:    Which is: -22
2023-08-11T10:27:31.684 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:31 smithi196 ceph-mon[163147]: osdmap e867: 8 total, 8 up, 8 in
2023-08-11T10:27:31.684 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:31 smithi196 ceph-mon[163147]: from='client.? 172.21.15.88:0/2067352807' entity='client.admin' cmd=[{"prefix": "osd pool application enable","pool": "test-rados-api-smithi088-219235-17","app": "rados","yes_i_really_mean_it": true}]: dispatch
2023-08-11T10:27:31.684 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:31 smithi196 ceph-mon[163147]: from='client.? ' entity='client.admin' cmd=[{"prefix": "osd pool application enable","pool": "test-rados-api-smithi088-219235-17","app": "rados","yes_i_really_mean_it": true}]: dispatch
2023-08-11T10:27:31.786 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:31 smithi088 ceph-mon[178644]: osdmap e867: 8 total, 8 up, 8 in
2023-08-11T10:27:31.786 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:31 smithi088 ceph-mon[178644]: from='client.? 172.21.15.88:0/2067352807' entity='client.admin' cmd=[{"prefix": "osd pool application enable","pool": "test-rados-api-smithi088-219235-17","app": "rados","yes_i_really_mean_it": true}]: dispatch
2023-08-11T10:27:31.787 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:31 smithi088 ceph-mon[178644]: from='client.? ' entity='client.admin' cmd=[{"prefix": "osd pool application enable","pool": "test-rados-api-smithi088-219235-17","app": "rados","yes_i_really_mean_it": true}]: dispatch
2023-08-11T10:27:31.787 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:31 smithi088 ceph-mon[182061]: osdmap e867: 8 total, 8 up, 8 in
2023-08-11T10:27:31.787 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:31 smithi088 ceph-mon[182061]: from='client.? 172.21.15.88:0/2067352807' entity='client.admin' cmd=[{"prefix": "osd pool application enable","pool": "test-rados-api-smithi088-219235-17","app": "rados","yes_i_really_mean_it": true}]: dispatch
2023-08-11T10:27:31.787 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:31 smithi088 ceph-mon[182061]: from='client.? ' entity='client.admin' cmd=[{"prefix": "osd pool application enable","pool": "test-rados-api-smithi088-219235-17","app": "rados","yes_i_really_mean_it": true}]: dispatch
2023-08-11T10:27:32.684 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:32 smithi196 ceph-mon[163147]: from='client.? ' entity='client.admin' cmd='[{"prefix": "osd pool application enable","pool": "test-rados-api-smithi088-219235-17","app": "rados","yes_i_really_mean_it": true}]': finished
2023-08-11T10:27:32.684 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:32 smithi196 ceph-mon[163147]: osdmap e868: 8 total, 8 up, 8 in
2023-08-11T10:27:32.685 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:32 smithi196 ceph-mon[163147]: pgmap v975: 137 pgs: 8 creating+peering, 21 unknown, 108 active+clean; 583 KiB data, 2.9 GiB used, 712 GiB / 715 GiB avail
2023-08-11T10:27:32.786 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:32 smithi088 ceph-mon[178644]: from='client.? ' entity='client.admin' cmd='[{"prefix": "osd pool application enable","pool": "test-rados-api-smithi088-219235-17","app": "rados","yes_i_really_mean_it": true}]': finished
2023-08-11T10:27:32.786 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:32 smithi088 ceph-mon[178644]: osdmap e868: 8 total, 8 up, 8 in
2023-08-11T10:27:32.786 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:32 smithi088 ceph-mon[178644]: pgmap v975: 137 pgs: 8 creating+peering, 21 unknown, 108 active+clean; 583 KiB data, 2.9 GiB used, 712 GiB / 715 GiB avail
2023-08-11T10:27:32.787 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:32 smithi088 ceph-mon[182061]: from='client.? ' entity='client.admin' cmd='[{"prefix": "osd pool application enable","pool": "test-rados-api-smithi088-219235-17","app": "rados","yes_i_really_mean_it": true}]': finished
2023-08-11T10:27:32.787 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:32 smithi088 ceph-mon[182061]: osdmap e868: 8 total, 8 up, 8 in
2023-08-11T10:27:32.787 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:32 smithi088 ceph-mon[182061]: pgmap v975: 137 pgs: 8 creating+peering, 21 unknown, 108 active+clean; 583 KiB data, 2.9 GiB used, 712 GiB / 715 GiB avail
2023-08-11T10:27:33.286 INFO:journalctl@ceph.mgr.y.smithi088.stdout:Aug 11 10:27:32 smithi088 conmon[171663]: ::ffff:172.21.15.196 - - [11/Aug/2023:10:27:32] "GET /metrics HTTP/1.1" 200 33816 "" "Prometheus/2.43.0" 
2023-08-11T10:27:33.684 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:33 smithi196 ceph-mon[163147]: osdmap e869: 8 total, 8 up, 8 in
2023-08-11T10:27:33.786 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:33 smithi088 ceph-mon[178644]: osdmap e869: 8 total, 8 up, 8 in
2023-08-11T10:27:33.786 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:33 smithi088 ceph-mon[182061]: osdmap e869: 8 total, 8 up, 8 in
2023-08-11T10:27:34.934 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:34 smithi196 ceph-mon[163147]: pgmap v977: 137 pgs: 8 creating+peering, 7 unknown, 122 active+clean; 802 KiB data, 2.9 GiB used, 712 GiB / 715 GiB avail; 605 KiB/s rd, 1.9 MiB/s wr, 2.05k op/s
2023-08-11T10:27:35.036 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:34 smithi088 ceph-mon[178644]: pgmap v977: 137 pgs: 8 creating+peering, 7 unknown, 122 active+clean; 802 KiB data, 2.9 GiB used, 712 GiB / 715 GiB avail; 605 KiB/s rd, 1.9 MiB/s wr, 2.05k op/s
2023-08-11T10:27:35.037 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:34 smithi088 ceph-mon[182061]: pgmap v977: 137 pgs: 8 creating+peering, 7 unknown, 122 active+clean; 802 KiB data, 2.9 GiB used, 712 GiB / 715 GiB avail; 605 KiB/s rd, 1.9 MiB/s wr, 2.05k op/s
2023-08-11T10:27:37.185 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:36 smithi196 ceph-mon[163147]: pgmap v978: 137 pgs: 3 creating+peering, 134 active+clean; 802 KiB data, 3.0 GiB used, 712 GiB / 715 GiB avail; 457 KiB/s rd, 1.4 MiB/s wr, 1.55k op/s
2023-08-11T10:27:37.286 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:36 smithi088 ceph-mon[182061]: pgmap v978: 137 pgs: 3 creating+peering, 134 active+clean; 802 KiB data, 3.0 GiB used, 712 GiB / 715 GiB avail; 457 KiB/s rd, 1.4 MiB/s wr, 1.55k op/s
2023-08-11T10:27:37.287 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:36 smithi088 ceph-mon[178644]: pgmap v978: 137 pgs: 3 creating+peering, 134 active+clean; 802 KiB data, 3.0 GiB used, 712 GiB / 715 GiB avail; 457 KiB/s rd, 1.4 MiB/s wr, 1.55k op/s
2023-08-11T10:27:39.184 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:38 smithi196 ceph-mon[163147]: pgmap v979: 137 pgs: 137 active+clean; 1.9 MiB data, 3.0 GiB used, 712 GiB / 715 GiB avail; 2.4 MiB/s rd, 7.6 MiB/s wr, 8.37k op/s
2023-08-11T10:27:39.230 INFO:tasks.workunit.client.0.smithi088.stdout:/home/jenkins-build/build/workspace/ceph-dev-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/17.2.6-882-gfd55b450/rpm/el8/BUILD/ceph-17.2.6-882-gfd55b450/src/test/cls_2pc_queue/test_cls_2pc_queue.cc:726: Failure
2023-08-11T10:27:39.230 INFO:tasks.workunit.client.0.smithi088.stdout:Expected equality of these values:
2023-08-11T10:27:39.231 INFO:tasks.workunit.client.0.smithi088.stdout:  consume_count
2023-08-11T10:27:39.231 INFO:tasks.workunit.client.0.smithi088.stdout:    Which is: 0
2023-08-11T10:27:39.231 INFO:tasks.workunit.client.0.smithi088.stdout:  number_of_ops*number_of_elements*max_producer_count
2023-08-11T10:27:39.231 INFO:tasks.workunit.client.0.smithi088.stdout:    Which is: 69000
2023-08-11T10:27:39.286 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:38 smithi088 ceph-mon[178644]: pgmap v979: 137 pgs: 137 active+clean; 1.9 MiB data, 3.0 GiB used, 712 GiB / 715 GiB avail; 2.4 MiB/s rd, 7.6 MiB/s wr, 8.37k op/s
2023-08-11T10:27:39.286 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:38 smithi088 ceph-mon[182061]: pgmap v979: 137 pgs: 137 active+clean; 1.9 MiB data, 3.0 GiB used, 712 GiB / 715 GiB avail; 2.4 MiB/s rd, 7.6 MiB/s wr, 8.37k op/s
2023-08-11T10:27:39.919 INFO:tasks.workunit.client.0.smithi088.stdout:[  FAILED  ] TestCls2PCQueue.MultiProducer (10599 ms)

2023-08-11T10:27:39.919 INFO:tasks.workunit.client.0.smithi088.stdout:[ RUN      ] TestCls2PCQueue.AsyncConsumer
2023-08-11T10:27:41.184 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:40 smithi196 ceph-mon[163147]: pgmap v980: 137 pgs: 137 active+clean; 1.9 MiB data, 3.0 GiB used, 712 GiB / 715 GiB avail; 2.1 MiB/s rd, 6.7 MiB/s wr, 7.40k op/s
2023-08-11T10:27:41.184 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:40 smithi196 ceph-mon[163147]: osdmap e870: 8 total, 8 up, 8 in
2023-08-11T10:27:41.286 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:40 smithi088 ceph-mon[178644]: pgmap v980: 137 pgs: 137 active+clean; 1.9 MiB data, 3.0 GiB used, 712 GiB / 715 GiB avail; 2.1 MiB/s rd, 6.7 MiB/s wr, 7.40k op/s
2023-08-11T10:27:41.287 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:40 smithi088 ceph-mon[178644]: osdmap e870: 8 total, 8 up, 8 in
2023-08-11T10:27:41.287 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:40 smithi088 ceph-mon[182061]: pgmap v980: 137 pgs: 137 active+clean; 1.9 MiB data, 3.0 GiB used, 712 GiB / 715 GiB avail; 2.1 MiB/s rd, 6.7 MiB/s wr, 7.40k op/s
2023-08-11T10:27:41.287 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:40 smithi088 ceph-mon[182061]: osdmap e870: 8 total, 8 up, 8 in
2023-08-11T10:27:42.286 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:41 smithi088 ceph-mon[178644]: osdmap e871: 8 total, 8 up, 8 in
2023-08-11T10:27:42.287 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:41 smithi088 ceph-mon[178644]: from='client.? 172.21.15.88:0/2020995373' entity='client.admin' cmd=[{"prefix": "osd pool application enable","pool": "test-rados-api-smithi088-219235-18","app": "rados","yes_i_really_mean_it": true}]: dispatch
2023-08-11T10:27:42.287 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:41 smithi088 ceph-mon[178644]: from='mgr.34107 172.21.15.88:0/3981614991' entity='mgr.y' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2023-08-11T10:27:42.287 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:41 smithi088 ceph-mon[178644]: from='client.? 172.21.15.88:0/2020995373' entity='client.admin' cmd='[{"prefix": "osd pool application enable","pool": "test-rados-api-smithi088-219235-18","app": "rados","yes_i_really_mean_it": true}]': finished
2023-08-11T10:27:42.288 INFO:journalctl@ceph.mon.a.smithi088.stdout:Aug 11 10:27:41 smithi088 ceph-mon[178644]: osdmap e872: 8 total, 8 up, 8 in
2023-08-11T10:27:42.288 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:41 smithi088 ceph-mon[182061]: osdmap e871: 8 total, 8 up, 8 in
2023-08-11T10:27:42.288 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:41 smithi088 ceph-mon[182061]: from='client.? 172.21.15.88:0/2020995373' entity='client.admin' cmd=[{"prefix": "osd pool application enable","pool": "test-rados-api-smithi088-219235-18","app": "rados","yes_i_really_mean_it": true}]: dispatch
2023-08-11T10:27:42.289 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:41 smithi088 ceph-mon[182061]: from='mgr.34107 172.21.15.88:0/3981614991' entity='mgr.y' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2023-08-11T10:27:42.289 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:41 smithi088 ceph-mon[182061]: from='client.? 172.21.15.88:0/2020995373' entity='client.admin' cmd='[{"prefix": "osd pool application enable","pool": "test-rados-api-smithi088-219235-18","app": "rados","yes_i_really_mean_it": true}]': finished
2023-08-11T10:27:42.289 INFO:journalctl@ceph.mon.c.smithi088.stdout:Aug 11 10:27:41 smithi088 ceph-mon[182061]: osdmap e872: 8 total, 8 up, 8 in
2023-08-11T10:27:42.435 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:41 smithi196 ceph-mon[163147]: osdmap e871: 8 total, 8 up, 8 in
2023-08-11T10:27:42.435 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:41 smithi196 ceph-mon[163147]: from='client.? 172.21.15.88:0/2020995373' entity='client.admin' cmd=[{"prefix": "osd pool application enable","pool": "test-rados-api-smithi088-219235-18","app": "rados","yes_i_really_mean_it": true}]: dispatch
2023-08-11T10:27:42.435 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:41 smithi196 ceph-mon[163147]: from='mgr.34107 172.21.15.88:0/3981614991' entity='mgr.y' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2023-08-11T10:27:42.435 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:41 smithi196 ceph-mon[163147]: from='client.? 172.21.15.88:0/2020995373' entity='client.admin' cmd='[{"prefix": "osd pool application enable","pool": "test-rados-api-smithi088-219235-18","app": "rados","yes_i_really_mean_it": true}]': finished
2023-08-11T10:27:42.435 INFO:journalctl@ceph.mon.b.smithi196.stdout:Aug 11 10:27:41 smithi196 ceph-mon[163147]: osdmap e872: 8 total, 8 up, 8 in
2023-08-11T10:27:42.650 INFO:tasks.workunit.client.0.smithi088.stdout:/home/jenkins-build/build/workspace/ceph-dev-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/17.2.6-882-gfd55b450/rpm/el8/BUILD/ceph-17.2.6-882-gfd55b450/src/test/cls_2pc_queue/test_cls_2pc_queue.cc:780: Failure
2023-08-11T10:27:42.650 INFO:tasks.workunit.client.0.smithi088.stdout:Expected equality of these values:
2023-08-11T10:27:42.650 INFO:tasks.workunit.client.0.smithi088.stdout:  0
2023-08-11T10:27:42.650 INFO:tasks.workunit.client.0.smithi088.stdout:  ioctx.operate(queue_name, &wop)
2023-08-11T10:27:42.650 INFO:tasks.workunit.client.0.smithi088.stdout:    Which is: -22
2023-08-11T10:27:42.942 INFO:tasks.workunit.client.0.smithi088.stdout:[  FAILED  ] TestCls2PCQueue.AsyncConsumer (3020 ms)

Subtasks 1 (1 open, 0 closed)

Bug #63355: test/cls_2pc_queue: fails during migration tests - Pending Backport - Ali Masarwa


Related issues 1 (1 open, 0 closed)

Copied to rgw - Backport #63498: reef: test/cls_2pc_queue: TestCls2PCQueue.MultiProducer and TestCls2PCQueue.AsyncConsumer failure - New - Yuval Lifshitz
