Bug #49428
Status: Closed
ceph_test_rados_api_snapshots fails with "rados_mon_command osd pool create failed with error -22"
Description
/a/bhubbard-2021-02-22_23:51:15-rados-master-distro-basic-smithi/5904720
2021-02-23T00:19:00.254 INFO:tasks.ceph.osd.0.smithi124.stderr:2021-02-23T00:19:00.249+0000 7fc6b5614700 -1 received signal: Hangup from /usr/bin/python3 /bin/daemon-helper kill ceph-osd -f --cluster ceph -i 0 (PID: 35214) UID: 0
2021-02-23T00:19:00.349 INFO:tasks.ceph.osd.6.smithi181.stderr:2021-02-23T00:19:00.347+0000 7fb0d4277700 -1 received signal: Hangup from /usr/bin/python3 /bin/daemon-helper kill ceph-osd -f --cluster ceph -i 6 (PID: 35211) UID: 0
2021-02-23T00:19:00.453 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: Running main() from gmock_main.cc
2021-02-23T00:19:00.453 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [==========] Running 12 tests from 4 test suites.
2021-02-23T00:19:00.453 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [----------] Global test environment set-up.
2021-02-23T00:19:00.453 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [----------] 4 tests from LibRadosSnapshots
2021-02-23T00:19:00.454 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [ RUN ] LibRadosSnapshots.SnapList
2021-02-23T00:19:00.454 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [ OK ] LibRadosSnapshots.SnapList (1998 ms)
2021-02-23T00:19:00.454 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [ RUN ] LibRadosSnapshots.SnapRemove
2021-02-23T00:19:00.454 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [ OK ] LibRadosSnapshots.SnapRemove (2006 ms)
2021-02-23T00:19:00.455 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [ RUN ] LibRadosSnapshots.Rollback
2021-02-23T00:19:00.455 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [ OK ] LibRadosSnapshots.Rollback (2035 ms)
2021-02-23T00:19:00.455 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [ RUN ] LibRadosSnapshots.SnapGetName
2021-02-23T00:19:00.455 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [ OK ] LibRadosSnapshots.SnapGetName (1979 ms)
2021-02-23T00:19:00.455 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [----------] 4 tests from LibRadosSnapshots (8018 ms total)
2021-02-23T00:19:00.456 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots:
2021-02-23T00:19:00.456 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [----------] 2 tests from LibRadosSnapshotsSelfManaged
2021-02-23T00:19:00.456 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [ RUN ] LibRadosSnapshotsSelfManaged.Snap
2021-02-23T00:19:00.456 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [ OK ] LibRadosSnapshotsSelfManaged.Snap (4347 ms)
2021-02-23T00:19:00.456 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [ RUN ] LibRadosSnapshotsSelfManaged.Rollback
2021-02-23T00:19:00.457 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [ OK ] LibRadosSnapshotsSelfManaged.Rollback (3862 ms)
2021-02-23T00:19:00.457 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [----------] 2 tests from LibRadosSnapshotsSelfManaged (8209 ms total)
2021-02-23T00:19:00.457 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots:
2021-02-23T00:19:00.457 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [----------] 4 tests from LibRadosSnapshotsEC
2021-02-23T00:19:00.457 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: /home/jenkins-build/build/workspace/ceph-dev-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/17.0.0-1024-ga3591378/rpm/el8/BUILD/ceph-17.0.0-1024-ga3591378/src/test/librados/TestCase.cc:161: Failure
2021-02-23T00:19:00.458 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: Expected equality of these values:
2021-02-23T00:19:00.458 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: ""
2021-02-23T00:19:00.458 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: create_one_ec_pool(pool_name, &s_cluster)
2021-02-23T00:19:00.458 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: Which is: "rados_mon_command osd pool create failed with error -22"
2021-02-23T00:19:00.459 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: [ RUN ] LibRadosSnapshotsEC.SnapList
2021-02-23T00:19:00.459 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: /home/jenkins-build/build/workspace/ceph-dev-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/17.0.0-1024-ga3591378/rpm/el8/BUILD/ceph-17.0.0-1024-ga3591378/src/test/librados/TestCase.cc:172: Failure
2021-02-23T00:19:00.459 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: Expected equality of these values:
2021-02-23T00:19:00.459 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: 0
2021-02-23T00:19:00.460 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: rados_ioctx_create(cluster, pool_name.c_str(), &ioctx)
2021-02-23T00:19:00.460 INFO:tasks.workunit.client.0.smithi124.stdout: api_snapshots: Which is: -107
2021-02-23T00:19:00.460 INFO:tasks.ceph.osd.1.smithi124.stderr:2021-02-23T00:19:00.451+0000 7f0adb3d5700 -1 received signal: Hangup from /usr/bin/python3 /bin/daemon-helper kill ceph-osd -f --cluster ceph -i 1 (PID: 35216) UID: 0
2021-02-23T00:19:00.489 INFO:tasks.workunit.client.0.smithi124.stdout:error in api_snapshots (43999)
2021-02-23T00:19:00.489 INFO:tasks.workunit.client.0.smithi124.stderr:bash: line 1: 44005 Segmentation fault (core dumped) ceph_test_rados_api_snapshots 2>&1
Updated by Brad Hubbard about 3 years ago
Here's the error from the mon log.
2021-02-23T00:19:00.093+0000 7f1b91d60700 0 mon.a@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "test-rados-api-smithi124-44005-9", "pool_type":"erasure", "pg_num":8, "pgp_num":8, "erasure_code_profile":"testprofile-test-rados-api-smithi124-44005-9"} v 0) v1
2021-02-23T00:19:00.093+0000 7f1b91d60700 20 is_capable service=osd command=osd pool create read write addr 172.21.15.124:0/3414301217 on cap allow *
2021-02-23T00:19:00.093+0000 7f1b91d60700 20 allow so far , doing grant allow *
2021-02-23T00:19:00.093+0000 7f1b91d60700 20 allow all
2021-02-23T00:19:00.093+0000 7f1b91d60700 10 mon.a@0(leader) e1 _allowed_command capable
2021-02-23T00:19:00.093+0000 7f1b91d60700 0 log_channel(audit) log [INF] : from='client.8453 172.21.15.124:0/3414301217' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "test-rados-api-smithi124-44005-9", "pool_type":"erasure", "pg_num":8, "pgp_num":8, "erasure_code_profile":"testprofile-test-rados-api-smithi124-44005-9"}]: dispatch
2021-02-23T00:19:00.093+0000 7f1b91d60700 1 -- [v2:172.21.15.124:3300/0,v1:172.21.15.124:6789/0] --> [v2:172.21.15.124:3300/0,v1:172.21.15.124:6789/0] -- log(1 entries from seq 505 at 2021-02-23T00:19:00.095129+0000) v1 -- 0x56185fc7bdc0 con 0x56185e598c00
2021-02-23T00:19:00.093+0000 7f1b91d60700 10 mon.a@0(leader).paxosservice(osdmap 1..57) dispatch 0x5618602a7c80 mon_command({"prefix": "osd pool create", "pool": "test-rados-api-smithi124-44005-9", "pool_type":"erasure", "pg_num":8, "pgp_num":8, "erasure_code_profile":"testprofile-test-rados-api-smithi124-44005-9"} v 0) v1 from client.8453 172.21.15.124:0/3414301217 con 0x561860c2fc00
2021-02-23T00:19:00.093+0000 7f1b91d60700 5 mon.a@0(leader).paxos(paxos active c 1..211) is_readable = 1 - now=2021-02-23T00:19:00.095167+0000 lease_expire=2021-02-23T00:19:05.092739+0000 has v0 lc 211
2021-02-23T00:19:00.093+0000 7f1b91d60700 10 mon.a@0(leader).osd e57 preprocess_query mon_command({"prefix": "osd pool create", "pool": "test-rados-api-smithi124-44005-9", "pool_type":"erasure", "pg_num":8, "pgp_num":8, "erasure_code_profile":"testprofile-test-rados-api-smithi124-44005-9"} v 0) v1 from client.8453 172.21.15.124:0/3414301217
2021-02-23T00:19:00.093+0000 7f1b91d60700 7 mon.a@0(leader).osd e57 prepare_update mon_command({"prefix": "osd pool create", "pool": "test-rados-api-smithi124-44005-9", "pool_type":"erasure", "pg_num":8, "pgp_num":8, "erasure_code_profile":"testprofile-test-rados-api-smithi124-44005-9"} v 0) v1 from client.8453 172.21.15.124:0/3414301217
2021-02-23T00:19:00.093+0000 7f1b91d60700 1 mon.a@0(leader).osd e57 implicitly use rule named after the pool: test-rados-api-smithi124-44005-9
2021-02-23T00:19:00.093+0000 7f1b91d60700 10 mon.a@0(leader).osd e57 prepare_pool_crush_rule returns -22
2021-02-23T00:19:00.093+0000 7f1b91d60700 2 mon.a@0(leader) e1 send_reply 0x56185ff7fef0 0x56186054eb60 mon_command_ack([{"prefix": "osd pool create", "pool": "test-rados-api-smithi124-44005-9", "pool_type":"erasure", "pg_num":8, "pgp_num":8, "erasure_code_profile":"testprofile-test-rados-api-smithi124-44005-9"}]=-22 cannot determine the erasure code plugin because there is no 'plugin' entry in the erasure_code_profile {} v57) v1
2021-02-23T00:19:00.093+0000 7f1b91d60700 1 -- [v2:172.21.15.124:3300/0,v1:172.21.15.124:6789/0] --> 172.21.15.124:0/3414301217 -- mon_command_ack([{"prefix": "osd pool create", "pool": "test-rados-api-smithi124-44005-9", "pool_type":"erasure", "pg_num":8, "pgp_num":8, "erasure_code_profile":"testprofile-test-rados-api-smithi124-44005-9"}]=-22 cannot determine the erasure code plugin because there is no 'plugin' entry in the erasure_code_profile {} v57) v1 -- 0x56186054eb60 con 0x561860c2fc00
Updated by Brad Hubbard about 3 years ago
TL;DR: skip to *** MON.A ****** below.
So this looks like a race. The calls appear to be serialized in the client code, yet somehow mon.a processes them out of order, with the 'rm' finishing after the 'set', and therefore the 'pool create' fails because the profile is empty (the error message should print the profile, but prints nothing in this case, indicating it's empty).
$ zgrep testprofile-test-rados-api-smithi124-44005-9 ceph.audit.log.gz|sed -n 4,8p
2021-02-23T00:18:59.624810+0000 mon.a (mon.0) 499 : audit [INF] from='client.8453 172.21.15.124:0/3414301217' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile set", "name": "testprofile-test-rados-api-smithi124-44005-9", "profile": [ "k=2", "m=1", "crush-failure-domain=osd"]}]: dispatch
2021-02-23T00:18:59.632775+0000 mon.a (mon.0) 500 : audit [INF] from='client.? ' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"}]: dispatch
2021-02-23T00:19:00.092935+0000 mon.a (mon.0) 502 : audit [INF] from='client.8453 172.21.15.124:0/3414301217' entity='client.admin' cmd='[{"prefix": "osd erasure-code-profile set", "name": "testprofile-test-rados-api-smithi124-44005-9", "profile": [ "k=2", "m=1", "crush-failure-domain=osd"]}]': finished
2021-02-23T00:19:00.093691+0000 mon.a (mon.0) 503 : audit [INF] from='client.? ' entity='client.admin' cmd='[{"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"}]': finished
2021-02-23T00:19:00.095129+0000 mon.a (mon.0) 505 : audit [INF] from='client.8453 172.21.15.124:0/3414301217' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "test-rados-api-smithi124-44005-9", "pool_type":"erasure", "pg_num":8, "pgp_num":8, "erasure_code_profile":"testprofile-test-rados-api-smithi124-44005-9"}]: dispatch
I think failure injection is playing a major part here. Initially, the 'rm' command is sent to mon.c and immediately after we see a failure injection. I'll try to document the sequence as much as possible below.
Connection to mon.c and send 'rm' with socket failure injected.
2021-02-23T00:18:59.411+0000 7fc65f2e4540 10 start_mon_command cmd=[{"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"}]
2021-02-23T00:18:59.411+0000 7fc65f2e4540 10 monclient: _send_command 1 [{"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"}]
2021-02-23T00:18:59.411+0000 7fc65f2e4540 10 monclient: _send_mon_message to mon.c at v2:172.21.15.124:3301/0
2021-02-23T00:18:59.411+0000 7fc65f2e4540 1 -- 172.21.15.124:0/3414301217 --> [v2:172.21.15.124:3301/0,v1:172.21.15.124:6790/0] -- mon_command({"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"} v 0) v1 -- 0x558bf8e3af60 con 0x558bf8e814b0
2021-02-23T00:18:59.412+0000 7fc64eddb700 4 set_mon_vals no callback set
2021-02-23T00:18:59.413+0000 7fc64d5d8700 0 -- 172.21.15.124:0/3414301217 >> [v2:172.21.15.124:3301/0,v1:172.21.15.124:6790/0] conn(0x558bf8e814b0 msgr2=0x558bf8d80a00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
Reconnect to mon.a and send 'rm' again.
2021-02-23T00:18:59.414+0000 7fc65083e700 10 monclient: _finish_hunting 0
2021-02-23T00:18:59.414+0000 7fc65083e700 1 monclient: found mon.a
2021-02-23T00:18:59.414+0000 7fc65083e700 20 monclient: _un_backoff reopen_interval_multipler now 1
2021-02-23T00:18:59.414+0000 7fc65083e700 10 monclient: _send_mon_message to mon.a at v2:172.21.15.124:3300/0
2021-02-23T00:18:59.414+0000 7fc65083e700 1 -- 172.21.15.124:0/3414301217 --> [v2:172.21.15.124:3300/0,v1:172.21.15.124:6789/0] -- mon_subscribe({config=0+,mgrmap=0+,monmap=2+,osdmap=0}) v3 -- 0x7fc638015360 con 0x558bf8e31bb0
2021-02-23T00:18:59.414+0000 7fc65083e700 10 monclient: _send_command 1 [{"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"}]
2021-02-23T00:18:59.414+0000 7fc65083e700 10 monclient: _send_mon_message to mon.a at v2:172.21.15.124:3300/0
mon.0 (a) acks the command.
2021-02-23T00:18:59.568+0000 7fc63f7fe700 1 -- 172.21.15.124:0/3414301217 <== mon.0 v2:172.21.15.124:3300/0 5 ==== mon_command_ack([{"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"}]=0 erasure-code-profile testprofile-test-rados-api-smithi124-44005-9 does not exist v56) v1 ==== 211+0+0 (secure 0 0 0) 0x7fc640072c10 con 0x558bf8e31bb0
2021-02-23T00:18:59.568+0000 7fc63f7fe700 10 monclient: handle_mon_command_ack 1 [{"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"}]
2021-02-23T00:18:59.568+0000 7fc63f7fe700 10 monclient: _finish_command 1 = system:0 erasure-code-profile testprofile-test-rados-api-smithi124-44005-9 does not exist
Send 'profile set' command to mon.a and get ack.
2021-02-23T00:18:59.611+0000 7fc65f2e4540 10 start_mon_command cmd=[{"prefix": "osd erasure-code-profile set", "name": "testprofile-test-rados-api-smithi124-44005-9", "profile": [ "k=2", "m=1", "crush-failure-domain=osd"]}]
2021-02-23T00:18:59.611+0000 7fc65f2e4540 10 monclient: _send_command 3 [{"prefix": "osd erasure-code-profile set", "name": "testprofile-test-rados-api-smithi124-44005-9", "profile": [ "k=2", "m=1", "crush-failure-domain=osd"]}]
2021-02-23T00:18:59.611+0000 7fc65f2e4540 10 monclient: _send_mon_message to mon.a at v2:172.21.15.124:3300/0
2021-02-23T00:18:59.611+0000 7fc65f2e4540 1 -- 172.21.15.124:0/3414301217 --> [v2:172.21.15.124:3300/0,v1:172.21.15.124:6789/0] -- mon_command({"prefix": "osd erasure-code-profile set", "name": "testprofile-test-rados-api-smithi124-44005-9", "profile": [ "k=2", "m=1", "crush-failure-domain=osd"]} v 0) v1 -- 0x558bf8e3a440 con 0x558bf8e31bb0
2021-02-23T00:19:00.091+0000 7fc63f7fe700 1 -- 172.21.15.124:0/3414301217 <== mon.0 v2:172.21.15.124:3300/0 7 ==== mon_command_ack([{"prefix": "osd erasure-code-profile set", "name": "testprofile-test-rados-api-smithi124-44005-9", "profile": [ "k=2", "m=1", "crush-failure-domain=osd"]}]=0 v57) v1 ==== 188+0+0 (secure 0 0 0) 0x7fc640022010 con 0x558bf8e31bb0
2021-02-23T00:19:00.091+0000 7fc63f7fe700 10 monclient: handle_mon_command_ack 3 [{"prefix": "osd erasure-code-profile set", "name": "testprofile-test-rados-api-smithi124-44005-9", "profile": [ "k=2", "m=1", "crush-failure-domain=osd"]}]
2021-02-23T00:19:00.091+0000 7fc63f7fe700 10 monclient: _finish_command 3 = system:0
Send out pool create and get error.
2021-02-23T00:19:00.091+0000 7fc65f2e4540 10 start_mon_command cmd=[{"prefix": "osd pool create", "pool": "test-rados-api-smithi124-44005-9", "pool_type":"erasure", "pg_num":8, "pgp_num":8, "erasure_code_profile":"testprofile-test-rados-api-smithi124-44005-9"}]
2021-02-23T00:19:00.091+0000 7fc65f2e4540 10 monclient: _send_command 4 [{"prefix": "osd pool create", "pool": "test-rados-api-smithi124-44005-9", "pool_type":"erasure", "pg_num":8, "pgp_num":8, "erasure_code_profile":"testprofile-test-rados-api-smithi124-44005-9"}]
2021-02-23T00:19:00.091+0000 7fc65f2e4540 10 monclient: _send_mon_message to mon.a at v2:172.21.15.124:3300/0
2021-02-23T00:19:00.091+0000 7fc65f2e4540 1 -- 172.21.15.124:0/3414301217 --> [v2:172.21.15.124:3300/0,v1:172.21.15.124:6789/0] -- mon_command({"prefix": "osd pool create", "pool": "test-rados-api-smithi124-44005-9", "pool_type":"erasure", "pg_num":8, "pgp_num":8, "erasure_code_profile":"testprofile-test-rados-api-smithi124-44005-9"} v 0) v1 -- 0x558bf8e34510 con 0x558bf8e31bb0
2021-02-23T00:19:00.094+0000 7fc63f7fe700 1 -- 172.21.15.124:0/3414301217 <== mon.0 v2:172.21.15.124:3300/0 8 ==== mon_command_ack([{"prefix": "osd pool create", "pool": "test-rados-api-smithi124-44005-9", "pool_type":"erasure", "pg_num":8, "pgp_num":8, "erasure_code_profile":"testprofile-test-rados-api-smithi124-44005-9"}]=-22 cannot determine the erasure code plugin because there is no 'plugin' entry in the erasure_code_profile {} v57) v1 ==== 332+0+0 (secure 0 0 0) 0x7fc640022240 con 0x558bf8e31bb0
Meanwhile in the audit log.
mon.c receives 'rm' command.
2021-02-23T00:18:59.414603+0000 mon.c (mon.2) 79 : audit [INF] from='client.? 172.21.15.124:0/3414301217' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"}]: dispatch
mon.a receives 'rm' command.
2021-02-23T00:18:59.565685+0000 mon.a (mon.0) 497 : audit [INF] from='client.8453 172.21.15.124:0/3414301217' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"}]: dispatch
mon.a reports 'rm' command finished after the pool create error.
2021-02-23T00:19:00.093691+0000 mon.a (mon.0) 503 : audit [INF] from='client.? ' entity='client.admin' cmd='[{"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"}]': finished
Around the same time on mon.c we see the following.
2021-02-23T00:19:00.097+0000 7f582b5af700 1 -- [v2:172.21.15.124:3301/0,v1:172.21.15.124:6790/0] <== mon.0 v2:172.21.15.124:3300/0 2082 ==== route(mon_command_ack([{"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"}]=0 v57) v1 tid 1226) v3 ==== 278+0+0 (secure 0 0 0) 0x555752e17e40 con 0x5557521e2800
2021-02-23T00:19:00.097+0000 7f582b5af700 20 mon.c@2(peon) e1 _ms_dispatch existing session 0x555752254d80 for mon.0
2021-02-23T00:19:00.097+0000 7f582b5af700 20 mon.c@2(peon) e1 entity caps allow *
2021-02-23T00:19:00.097+0000 7f582b5af700 20 is_capable service=mon command= read addr v2:172.21.15.124:3300/0 on cap allow *
2021-02-23T00:19:00.097+0000 7f582b5af700 20 allow so far , doing grant allow *
2021-02-23T00:19:00.097+0000 7f582b5af700 20 allow all
2021-02-23T00:19:00.097+0000 7f582b5af700 20 is_capable service=mon command= exec addr v2:172.21.15.124:3300/0 on cap allow *
2021-02-23T00:19:00.097+0000 7f582b5af700 20 allow so far , doing grant allow *
2021-02-23T00:19:00.097+0000 7f582b5af700 20 allow all
2021-02-23T00:19:00.097+0000 7f582b5af700 10 mon.c@2(peon) e1 handle_route tid 1226 mon_command_ack([{"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"}]=0 v57) v1
2021-02-23T00:19:00.097+0000 7f582b5af700 10 mon.c@2(peon) e1 don't have routed request tid 1226
We see what appears to be further confirmation of this in mon.c's log.
2021-02-23T00:19:00.638+0000 7f582b5af700 7 mon.c@2(peon).log v89 update_from_paxos applying incremental log 89 2021-02-23T00:19:00.092935+0000 mon.a (mon.0) 502 : audit [INF] from='client.8453 172.21.15.124:0/3414301217' entity='client.admin' cmd='[{"prefix": "osd erasure-code-profile set", "name": "testprofile-test-rados-api-smithi124-44005-9", "profile": [ "k=2", "m=1", "crush-failure-domain=osd"]}]': finished
2021-02-23T00:19:00.638+0000 7f582b5af700 20 mon.c@2(peon).log v89 update_from_paxos logging for channel 'audit' to file '/var/log/ceph/ceph.audit.log'
2021-02-23T00:19:00.638+0000 7f582b5af700 7 mon.c@2(peon).log v89 update_from_paxos applying incremental log 89 2021-02-23T00:19:00.093691+0000 mon.a (mon.0) 503 : audit [INF] from='client.? ' entity='client.admin' cmd='[{"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"}]': finished
*** MON.A ******
Meanwhile on mon.a (which provides the definitive data set and probably renders most of the above redundant):
Receive the 'rm' command from the client.
2021-02-23T00:18:59.551+0000 7f1b91d60700 10 mon.a@0(leader).log v87 logging 2021-02-23T00:18:59.414603+0000 mon.c (mon.2) 79 : audit [INF] from='client.? 172.21.15.124:0/3414301217' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"}]: dispatch
Return DNE (expected response).
2021-02-23T00:18:59.568+0000 7f1b91d60700 2 mon.a@0(leader) e1 send_reply 0x56185fc6b4a0 0x561860f9b380 mon_command_ack([{"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"}]=0 erasure-code-profile testprofile-test-rados-api-smithi124-44005-9 does not exist v56) v1
2021-02-23T00:18:59.568+0000 7f1b91d60700 1 -- [v2:172.21.15.124:3300/0,v1:172.21.15.124:6789/0] --> 172.21.15.124:0/3414301217 -- mon_command_ack([{"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"}]=0 erasure-code-profile testprofile-test-rados-api-smithi124-44005-9 does not exist v56) v1 -- 0x561860f9b380 con 0x561860c2fc00
Receive the 'ec profile set' command, set the plugin to 'jerasure'.
2021-02-23T00:18:59.623+0000 7f1b91d60700 0 mon.a@0(leader) e1 handle_command mon_command({"prefix": "osd erasure-code-profile set", "name": "testprofile-test-rados-api-smithi124-44005-9", "profile": [ "k=2", "m=1", "crush-failure-domain=osd"]} v 0) v1
...
2021-02-23T00:18:59.628+0000 7f1b91d60700 20 mon.a@0(leader).osd e56 erasure code profile set testprofile-test-rados-api-smithi124-44005-9={crush-device-class=,crush-failure-domain=osd,crush-root=default,jerasure-per-chunk-alignment=false,k=2,m=1,plugin=jerasure,ruleset-failure-domain=osd,technique=reed_sol_van,w=8}
Receive a forwarded 'rm' command from mon.c
2021-02-23T00:18:59.630+0000 7f1b91d60700 1 -- [v2:172.21.15.124:3300/0,v1:172.21.15.124:6789/0] <== mon.2 v2:172.21.15.124:3301/0 1856 ==== forward(mon_command({"prefix": "osd erasure-code-profile rm", "name": "testprofile-test-rados-api-smithi124-44005-9"} v 0) v1 caps allow * tid 1226 con_features 4540138303579357183) v4 ==== 339+0+0 (unknown 0 0 0) 0x561860312480 con 0x56185e599c00
Because of this, the creation started by 'ec profile set' is cancelled.
2021-02-23T00:18:59.631+0000 7f1b91d60700 20 mon.a@0(leader).osd e56 erasure code profile rm testprofile-test-rados-api-smithi124-44005-9: creation canceled
The code.
src/mon/OSDMonitor.cc:

    } else if (prefix == "osd erasure-code-profile rm") {
      string name;
      cmd_getval(cmdmap, "name", name);

      if (erasure_code_profile_in_use(pending_inc.new_pools, name, &ss))
        goto wait;

      if (erasure_code_profile_in_use(osdmap.pools, name, &ss)) {
        err = -EBUSY;
        goto reply;
      }

      if (osdmap.has_erasure_code_profile(name) ||
          pending_inc.new_erasure_code_profiles.count(name)) {
        if (osdmap.has_erasure_code_profile(name)) {
          pending_inc.old_erasure_code_profiles.push_back(name);
        } else {
          dout(20) << "erasure code profile rm " << name << ": creation canceled" << dendl;
          pending_inc.new_erasure_code_profiles.erase(name);
        }
So the socket failure effectively produced two 'rm' commands, and the 'first' command ended up being forwarded to the relevant mon well after the 'second' command had completed, late enough that it actually cancelled the creation of the profile, which the test code assumes will be created well after the 'rm' has completed.
This kind of makes a mockery of this test, or any similar test, that makes assumptions about the order of execution of the commands it issues via rados_mon_command().
Updated by Brad Hubbard about 3 years ago
/a/nojha-2021-04-15_20:05:27-rados-wip-50217-distro-basic-smithi/6049636
$ find . -name ceph-mon.\*.log.gz -print -exec zgrep "creation canceled" {} \;
./remote/smithi167/log/ceph-mon.g.log.gz
./remote/smithi167/log/ceph-mon.f.log.gz
./remote/smithi167/log/ceph-mon.h.log.gz
./remote/smithi167/log/ceph-mon.i.log.gz
2021-04-15T22:58:09.541+0000 7f42d60dd700 20 mon.i@7(leader).osd e21 erasure code profile rm testprofile-test-rados-api-smithi167-18396-10: creation canceled
./remote/smithi039/log/ceph-mon.d.log.gz
./remote/smithi039/log/ceph-mon.a.log.gz
./remote/smithi039/log/ceph-mon.b.log.gz
./remote/smithi039/log/ceph-mon.e.log.gz
./remote/smithi039/log/ceph-mon.c.log.gz
Updated by Brad Hubbard about 3 years ago
/a/sage-2021-03-28_19:04:26-rados-wip-sage2-testing-2021-03-28-0933-pacific-distro-basic-smithi/6007274
Updated by Laura Flores about 1 year ago
- Tags set to test-failure
Seems like a similar failure:
/a/yuriw-2023-02-16_22:44:43-rados-wip-yuri-testing-2023-02-16-0839-distro-default-smithi/7177511
2023-02-17T11:58:48.411 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: /build/ceph-18.0.0-2342-g35d33aea/src/test/librados/TestCase.cc:168: Failure
2023-02-17T11:58:48.413 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: Expected equality of these values:
2023-02-17T11:58:48.414 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: ""
2023-02-17T11:58:48.414 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: create_one_ec_pool(pool_name, &s_cluster)
2023-02-17T11:58:48.414 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: Which is: "rados_mon_command: erasure-code-profile rm testprofile-LibRadosSnapshotsEC_smithi148-30258-9 failed with error -1"
2023-02-17T11:58:48.414 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: [ RUN ] LibRadosSnapshotsEC.SnapList
2023-02-17T11:58:48.415 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: /build/ceph-18.0.0-2342-g35d33aea/src/test/librados/TestCase.cc:179: Failure
2023-02-17T11:58:48.415 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: Expected equality of these values:
2023-02-17T11:58:48.415 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: 0
2023-02-17T11:58:48.415 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: rados_ioctx_create(cluster, pool_name.c_str(), &ioctx)
2023-02-17T11:58:48.416 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: Which is: -107
2023-02-17T11:58:48.416 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: [ FAILED ] LibRadosSnapshotsEC.SnapList (0 ms)
2023-02-17T11:58:48.416 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: [ RUN ] LibRadosSnapshotsEC.SnapRemove
2023-02-17T11:58:48.417 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: /build/ceph-18.0.0-2342-g35d33aea/src/test/librados/TestCase.cc:179: Failure
2023-02-17T11:58:48.417 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: Expected equality of these values:
2023-02-17T11:58:48.417 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: 0
2023-02-17T11:58:48.418 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: rados_ioctx_create(cluster, pool_name.c_str(), &ioctx)
2023-02-17T11:58:48.418 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: Which is: -107
2023-02-17T11:58:48.418 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: [ FAILED ] LibRadosSnapshotsEC.SnapRemove (0 ms)
2023-02-17T11:58:48.419 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: [ RUN ] LibRadosSnapshotsEC.Rollback
2023-02-17T11:58:48.419 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: /build/ceph-18.0.0-2342-g35d33aea/src/test/librados/TestCase.cc:179: Failure
2023-02-17T11:58:48.419 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: Expected equality of these values:
2023-02-17T11:58:48.419 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: 0
2023-02-17T11:58:48.420 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: rados_ioctx_create(cluster, pool_name.c_str(), &ioctx)
2023-02-17T11:58:48.420 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: Which is: -107
2023-02-17T11:58:48.420 INFO:tasks.workunit.client.0.smithi148.stdout: api_snapshots: [ FAILED ] LibRadosSnapshotsEC.Rollback (0 ms)
There is a coredump available.
Updated by Laura Flores about 1 year ago
A bunch of tests from LibRadosIoEC are failing with "rados_mon_command osd pool create failed with error -22":
/a/yuriw-2023-02-24_17:50:19-rados-main-distro-default-smithi/7186883
2023-02-25T01:26:10.649 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: [----------] 10 tests from LibRadosIoEC
2023-02-25T01:26:10.649 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: /home/jenkins-build/build/workspace/ceph-dev-new-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/18.0.0-2514-gf9d812a5/rpm/el8/BUILD/ceph-18.0.0-2514-gf9d812a5/src/test/librados/TestCase.cc:171: Failure
2023-02-25T01:26:10.649 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Expected equality of these values:
2023-02-25T01:26:10.650 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: ""
2023-02-25T01:26:10.650 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: create_one_ec_pool(pool_name, &s_cluster)
2023-02-25T01:26:10.650 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Which is: "rados_mon_command osd pool create failed with error -22"
2023-02-25T01:26:10.650 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: [ RUN ] LibRadosIoEC.SimpleWrite
2023-02-25T01:26:10.650 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: /home/jenkins-build/build/workspace/ceph-dev-new-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/18.0.0-2514-gf9d812a5/rpm/el8/BUILD/ceph-18.0.0-2514-gf9d812a5/src/test/librados/TestCase.cc:184: Failure
2023-02-25T01:26:10.650 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Expected equality of these values:
2023-02-25T01:26:10.651 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: 0
2023-02-25T01:26:10.651 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: rados_ioctx_create(cluster, pool_name.c_str(), &ioctx)
2023-02-25T01:26:10.651 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Which is: -107
2023-02-25T01:26:10.651 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: [ FAILED ] LibRadosIoEC.SimpleWrite (1 ms)
2023-02-25T01:26:10.651 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: [ RUN ] LibRadosIoEC.RoundTrip
2023-02-25T01:26:10.651 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: /home/jenkins-build/build/workspace/ceph-dev-new-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/18.0.0-2514-gf9d812a5/rpm/el8/BUILD/ceph-18.0.0-2514-gf9d812a5/src/test/librados/TestCase.cc:184: Failure
2023-02-25T01:26:10.651 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Expected equality of these values:
2023-02-25T01:26:10.652 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: 0
2023-02-25T01:26:10.652 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: rados_ioctx_create(cluster, pool_name.c_str(), &ioctx)
2023-02-25T01:26:10.652 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Which is: -107
2023-02-25T01:26:10.652 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: [ FAILED ] LibRadosIoEC.RoundTrip (0 ms)
2023-02-25T01:26:10.652 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: [ RUN ] LibRadosIoEC.OverlappingWriteRoundTrip
2023-02-25T01:26:10.652 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: /home/jenkins-build/build/workspace/ceph-dev-new-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/18.0.0-2514-gf9d812a5/rpm/el8/BUILD/ceph-18.0.0-2514-gf9d812a5/src/test/librados/TestCase.cc:184: Failure
2023-02-25T01:26:10.652 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Expected equality of these values:
2023-02-25T01:26:10.653 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: 0
2023-02-25T01:26:10.653 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: rados_ioctx_create(cluster, pool_name.c_str(), &ioctx)
2023-02-25T01:26:10.653 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Which is: -107
2023-02-25T01:26:10.653 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: [ FAILED ] LibRadosIoEC.OverlappingWriteRoundTrip (0 ms)
2023-02-25T01:26:10.653 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: [ RUN ] LibRadosIoEC.WriteFullRoundTrip
2023-02-25T01:26:10.653 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: /home/jenkins-build/build/workspace/ceph-dev-new-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/18.0.0-2514-gf9d812a5/rpm/el8/BUILD/ceph-18.0.0-2514-gf9d812a5/src/test/librados/TestCase.cc:184: Failure
2023-02-25T01:26:10.653 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Expected equality of these values:
2023-02-25T01:26:10.654 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: 0
2023-02-25T01:26:10.654 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: rados_ioctx_create(cluster, pool_name.c_str(), &ioctx)
2023-02-25T01:26:10.654 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Which is: -107
2023-02-25T01:26:10.654 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: [ FAILED ] LibRadosIoEC.WriteFullRoundTrip (0 ms)
2023-02-25T01:26:10.654 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: [ RUN ] LibRadosIoEC.AppendRoundTrip
2023-02-25T01:26:10.654 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: /home/jenkins-build/build/workspace/ceph-dev-new-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/18.0.0-2514-gf9d812a5/rpm/el8/BUILD/ceph-18.0.0-2514-gf9d812a5/src/test/librados/TestCase.cc:184: Failure
2023-02-25T01:26:10.654 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Expected equality of these values:
2023-02-25T01:26:10.655 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: 0
2023-02-25T01:26:10.655 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: rados_ioctx_create(cluster, pool_name.c_str(), &ioctx)
2023-02-25T01:26:10.655 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Which is: -107
2023-02-25T01:26:10.655 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: [ FAILED ] LibRadosIoEC.AppendRoundTrip (0 ms)
2023-02-25T01:26:10.655 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: [ RUN ] LibRadosIoEC.TruncTest
2023-02-25T01:26:10.655 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: /home/jenkins-build/build/workspace/ceph-dev-new-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/18.0.0-2514-gf9d812a5/rpm/el8/BUILD/ceph-18.0.0-2514-gf9d812a5/src/test/librados/TestCase.cc:184: Failure
2023-02-25T01:26:10.655 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Expected equality of these values:
2023-02-25T01:26:10.655 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: 0
2023-02-25T01:26:10.656 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: rados_ioctx_create(cluster, pool_name.c_str(), &ioctx)
2023-02-25T01:26:10.656 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Which is: -107
2023-02-25T01:26:10.656 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: [ FAILED ] LibRadosIoEC.TruncTest (0 ms)
2023-02-25T01:26:10.656 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: [ RUN ] LibRadosIoEC.RemoveTest
2023-02-25T01:26:10.656 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: /home/jenkins-build/build/workspace/ceph-dev-new-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/18.0.0-2514-gf9d812a5/rpm/el8/BUILD/ceph-18.0.0-2514-gf9d812a5/src/test/librados/TestCase.cc:184: Failure
2023-02-25T01:26:10.656 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Expected equality of these values:
2023-02-25T01:26:10.656 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: 0
2023-02-25T01:26:10.657 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: rados_ioctx_create(cluster, pool_name.c_str(), &ioctx)
2023-02-25T01:26:10.657 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Which is: -107
2023-02-25T01:26:10.657 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: [ FAILED ] LibRadosIoEC.RemoveTest (0 ms)
2023-02-25T01:26:10.657 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: [ RUN ] LibRadosIoEC.XattrsRoundTrip
2023-02-25T01:26:10.657 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: /home/jenkins-build/build/workspace/ceph-dev-new-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos8/DIST/centos8/MACHINE_SIZE/gigantic/release/18.0.0-2514-gf9d812a5/rpm/el8/BUILD/ceph-18.0.0-2514-gf9d812a5/src/test/librados/TestCase.cc:184: Failure
2023-02-25T01:26:10.657 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: Expected equality of these values:
2023-02-25T01:26:10.657 INFO:tasks.workunit.client.0.smithi008.stdout: api_io: 0
Updated by Radoslaw Zarzynski about 1 year ago
- Assignee set to Brad Hubbard
- Priority changed from Normal to Low
Updated by Laura Flores 10 months ago
This one looks like a duplicate of https://tracker.ceph.com/issues/50042. Let's close it for now, and re-open if necessary.
Updated by Laura Flores 10 months ago
- Is duplicate of Bug #50042: rados/test.sh: api_watch_notify failures added
Updated by Nitzan Mordechai 4 months ago
- Related to Bug #63789: LibRadosIoEC test failure added