Ceph : Issues
https://tracker.ceph.com/
https://tracker.ceph.com/favicon.ico
2021-08-02T14:34:50Z
Ceph
Redmine
rgw - Bug #52000 (Resolved): cephadm deployed RGW prints: "rgw is configured to optionally allow ...
https://tracker.ceph.com/issues/52000
2021-08-02T14:34:50Z
Sebastian Wagner
<blockquote>
<p>WARNING:<br />rgw is configured to optionally allow insecure connections to the monitors<br />(auth_supported, ms_mon_client_mode), ssl certificates stored at the monitor<br />configuration could leak</p>
</blockquote>
<p>Downstream: <a class="external" href="https://bugzilla.redhat.com/show_bug.cgi?id=1981682">https://bugzilla.redhat.com/show_bug.cgi?id=1981682</a></p>
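<p>One way to make the warning go away, assuming the cluster supports msgr2 secure mode, is to pin the RGW client to it; a hedged ceph.conf sketch using the option named in the warning (not an official recommendation):</p>

```ini
# Sketch: force the RGW client into the encrypted msgr2 mode when talking
# to the monitors, so ssl certificates pulled from the monitor config
# store never cross a crc-only (unencrypted) connection.
[client.rgw]
ms_mon_client_mode = secure
```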
teuthology - Bug #47441 (Closed): teuthology/task/install: verify_package_version: RuntimeError: ...
https://tracker.ceph.com/issues/47441
2020-09-14T14:25:51Z
Sebastian Wagner
<pre>
2020-09-14T13:32:56.135 INFO:teuthology.packaging:The installed version of ceph is 16.0.0-5509.g7f41e68.el8
2020-09-14T13:32:56.136 ERROR:teuthology.contextutil:Saw exception from nested tasks
Traceback (most recent call last):
File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/contextutil.py", line 31, in nested
vars.append(enter())
File "/usr/lib/python3.6/contextlib.py", line 81, in __enter__
return next(self.gen)
File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/task/install/__init__.py", line 218, in install
install_packages(ctx, package_list, config)
File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/task/install/__init__.py", line 87, in install_packages
verify_package_version(ctx, config, remote)
File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/task/install/__init__.py", line 61, in verify_package_version
pkg=pkg_to_check
RuntimeError: ceph version 16.0.0-5509.g7f41e68c8af was not installed, found 16.0.0-5509.g7f41e68.el8.
</pre>
<p>It looks like the builds were duplicated; see <a class="external" href="https://shaman.ceph.com/repos/ceph/wip-swagner-testing-2020-09-14-1230/7f41e68c8afa3f6a917ca548770374067fdb433f/">https://shaman.ceph.com/repos/ceph/wip-swagner-testing-2020-09-14-1230/7f41e68c8afa3f6a917ca548770374067fdb433f/</a></p>
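<p>The two strings differ only in the length of the embedded git sha1 and the trailing distro tag; a tolerant comparison (hypothetical helper, not teuthology's actual <code>verify_package_version</code>) might look like:</p>

```python
import re

def same_build(requested: str, installed: str) -> bool:
    """Compare package versions, tolerating a truncated git sha1 and a
    trailing distro tag such as '.el8' (hypothetical helper)."""
    def parse(version):
        # strip a distro tag like '.el8' or '.fc31'
        version = re.sub(r'\.(el|fc)\d+$', '', version)
        # split '16.0.0-5509.g7f41e68c8af' into (base, sha1)
        m = re.match(r'^(.*)\.g([0-9a-f]+)$', version)
        return (m.group(1), m.group(2)) if m else (version, '')

    base_a, sha_a = parse(requested)
    base_b, sha_b = parse(installed)
    # one sha1 may be a truncated prefix of the other
    return base_a == base_b and (sha_a.startswith(sha_b)
                                 or sha_b.startswith(sha_a))
```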
rbd - Bug #46875 (New): TestLibRBD.TestPendingAio: test_librbd.cc:4539: Failure or SIGSEGV
https://tracker.ceph.com/issues/46875
2020-08-10T01:03:45Z
Sebastian Wagner
<pre>
[ RUN ] TestLibRBD.TestPendingAio
using new format!
/home/jenkins-build/build/workspace/ceph-pull-requests/src/test/librbd/test_librbd.cc:4539: Failure
Expected equality of these values:
1
rbd_aio_is_complete(comps[i])
Which is: 0
[ FAILED ] TestLibRBD.TestPendingAio (68 ms)
</pre>
<p><a class="external" href="https://jenkins.ceph.com/job/ceph-pull-requests/57209/consoleFull#-361705261e840cee4-f4a4-4183-81dd-42855615f2c1">https://jenkins.ceph.com/job/ceph-pull-requests/57209/consoleFull#-361705261e840cee4-f4a4-4183-81dd-42855615f2c1</a></p>
rgw-testing - Bug #46734 (Resolved): unittest_rgw_dmclock_scheduler: Queue.SyncRequest: ***Timeou...
https://tracker.ceph.com/issues/46734
2020-07-28T10:46:32Z
Sebastian Wagner
<pre>
204/204 Test #183: unittest_rgw_dmclock_scheduler ............***Timeout 3600.01 sec
did not load config file, using default settings.
[==========] Running 8 tests from 1 test suite.
[----------] Global test environment set-up.
[----------] 8 tests from Queue
[ RUN ] Queue.SyncRequest
2020-07-27T20:34:39.555+0000 7fec06b58c80 -1 Errors while parsing config file!
2020-07-27T20:34:39.555+0000 7fec06b58c80 -1 parse_file: filesystem error: cannot get file size: No such file or directory [ceph.conf]
2020-07-27T20:34:39.555+0000 7fec06b58c80 -1 Errors while parsing config file!
2020-07-27T20:34:39.555+0000 7fec06b58c80 -1 parse_file: filesystem error: cannot get file size: No such file or directory [ceph.conf]
99% tests passed, 1 tests failed out of 204
Total Test time (real) = 3620.01 sec
The following tests FAILED:
183 - unittest_rgw_dmclock_scheduler (Timeout)
Errors while running CTest
Build step 'Execute shell' marked build as failure
</pre>
<p><a class="external" href="https://jenkins.ceph.com/job/ceph-pull-requests/56416/consoleFull#1569702623e840cee4-f4a4-4183-81dd-42855615f2c1">https://jenkins.ceph.com/job/ceph-pull-requests/56416/consoleFull#1569702623e840cee4-f4a4-4183-81dd-42855615f2c1</a></p>
sepia - Bug #46336 (New): https://download-cc-rdu01.fedoraproject.org is unreliable
https://tracker.ceph.com/issues/46336
2020-07-03T09:58:16Z
Sebastian Wagner
<pre>
2020-07-03T09:05:49.488 INFO:teuthology.orchestra.run.smithi058:> sudo yum -y install ceph-test
2020-07-03T09:05:49.626 INFO:teuthology.orchestra.run.smithi195.stdout:Transaction test succeeded.
2020-07-03T09:05:49.627 INFO:teuthology.orchestra.run.smithi195.stdout:Running transaction
2020-07-03T09:05:49.924 INFO:teuthology.orchestra.run.smithi058.stdout:Last metadata expiration check: 0:00:36 ago on Fri 03 Jul 2020 09:05:13 AM UTC.
2020-07-03T09:05:50.065 INFO:teuthology.orchestra.run.smithi195.stdout: Preparing : 1/1
2020-07-03T09:05:50.238 INFO:teuthology.orchestra.run.smithi195.stdout: Installing : libxslt-1.1.32-3.el8.x86_64 1/6
2020-07-03T09:05:50.310 INFO:teuthology.orchestra.run.smithi058.stdout:Dependencies resolved.
2020-07-03T09:05:50.311 INFO:teuthology.orchestra.run.smithi058.stdout:================================================================================
2020-07-03T09:05:50.311 INFO:teuthology.orchestra.run.smithi058.stdout: Package Arch Version Repository Size
2020-07-03T09:05:50.312 INFO:teuthology.orchestra.run.smithi058.stdout:================================================================================
2020-07-03T09:05:50.312 INFO:teuthology.orchestra.run.smithi058.stdout:Installing:
2020-07-03T09:05:50.312 INFO:teuthology.orchestra.run.smithi058.stdout: ceph-test x86_64 2:16.0.0-3122.ge1d6abcdc6f.el8 ceph 45 M
2020-07-03T09:05:50.313 INFO:teuthology.orchestra.run.smithi058.stdout:Installing dependencies:
2020-07-03T09:05:50.313 INFO:teuthology.orchestra.run.smithi058.stdout: jq x86_64 1.5-12.el8 CentOS-AppStream 161 k
2020-07-03T09:05:50.313 INFO:teuthology.orchestra.run.smithi058.stdout: oniguruma x86_64 6.8.2-1.el8 CentOS-AppStream 188 k
2020-07-03T09:05:50.314 INFO:teuthology.orchestra.run.smithi058.stdout: socat x86_64 1.7.3.2-6.el8 CentOS-AppStream 298 k
2020-07-03T09:05:50.314 INFO:teuthology.orchestra.run.smithi058.stdout: libxslt x86_64 1.1.32-3.el8 CentOS-Base 249 k
2020-07-03T09:05:50.314 INFO:teuthology.orchestra.run.smithi058.stdout: xmlstarlet x86_64 1.6.1-11.el8 epel 69 k
2020-07-03T09:05:50.314 INFO:teuthology.orchestra.run.smithi058.stdout:
2020-07-03T09:05:50.315 INFO:teuthology.orchestra.run.smithi058.stdout:Transaction Summary
2020-07-03T09:05:50.315 INFO:teuthology.orchestra.run.smithi058.stdout:================================================================================
2020-07-03T09:05:50.315 INFO:teuthology.orchestra.run.smithi058.stdout:Install 6 Packages
2020-07-03T09:05:50.316 INFO:teuthology.orchestra.run.smithi058.stdout:
2020-07-03T09:05:50.317 INFO:teuthology.orchestra.run.smithi058.stdout:Total download size: 46 M
2020-07-03T09:05:50.317 INFO:teuthology.orchestra.run.smithi058.stdout:Installed size: 194 M
2020-07-03T09:05:50.317 INFO:teuthology.orchestra.run.smithi058.stdout:Downloading Packages:
2020-07-03T09:05:50.348 INFO:teuthology.orchestra.run.smithi058.stdout:[MIRROR] jq-1.5-12.el8.x86_64.rpm: Status code: 503 for https://download-cc-rdu01.fedoraproject.org/pub/centos/8/AppStream/x86_64/os/Packages/jq-1.5-12.el8.x86_64.rpm
2020-07-03T09:05:50.348 INFO:teuthology.orchestra.run.smithi058.stdout:[MIRROR] oniguruma-6.8.2-1.el8.x86_64.rpm: Status code: 503 for https://download-cc-rdu01.fedoraproject.org/pub/centos/8/AppStream/x86_64/os/Packages/oniguruma-6.8.2-1.el8.x86_64.rpm
2020-07-03T09:05:50.394 INFO:teuthology.orchestra.run.smithi195.stdout: Installing : xmlstarlet-1.6.1-11.el8.x86_64 2/6
2020-07-03T09:05:50.454 INFO:teuthology.orchestra.run.smithi058.stdout:(1/6): jq-1.5-12.el8.x86_64.rpm 1.1 MB/s | 161 kB 00:00
2020-07-03T09:05:50.463 INFO:teuthology.orchestra.run.smithi058.stdout:(2/6): oniguruma-6.8.2-1.el8.x86_64.rpm 1.2 MB/s | 188 kB 00:00
2020-07-03T09:05:50.487 INFO:teuthology.orchestra.run.smithi058.stdout:[MIRROR] socat-1.7.3.2-6.el8.x86_64.rpm: Status code: 503 for https://download-cc-rdu01.fedoraproject.org/pub/centos/8/AppStream/x86_64/os/Packages/socat-1.7.3.2-6.el8.x86_64.rpm
2020-07-03T09:05:50.491 INFO:teuthology.orchestra.run.smithi058.stdout:[MIRROR] libxslt-1.1.32-3.el8.x86_64.rpm: Status code: 503 for https://download-cc-rdu01.fedoraproject.org/pub/centos/8/BaseOS/x86_64/os/Packages/libxslt-1.1.32-3.el8.x86_64.rpm
2020-07-03T09:05:50.492 INFO:teuthology.orchestra.run.smithi058.stdout:[MIRROR] socat-1.7.3.2-6.el8.x86_64.rpm: Status code: 404 for http://mirror.linux.duke.edu/pub/centos/8/AppStream/x86_64/os/Packages/socat-1.7.3.2-6.el8.x86_64.rpm
2020-07-03T09:05:50.502 INFO:teuthology.orchestra.run.smithi058.stdout:[MIRROR] socat-1.7.3.2-6.el8.x86_64.rpm: Status code: 404 for http://packages.oit.ncsu.edu/centos/8/AppStream/x86_64/os/Packages/socat-1.7.3.2-6.el8.x86_64.rpm
2020-07-03T09:05:50.508 INFO:teuthology.orchestra.run.smithi058.stdout:[MIRROR] libxslt-1.1.32-3.el8.x86_64.rpm: Status code: 404 for http://mirror.linux.duke.edu/pub/centos/8/BaseOS/x86_64/os/Packages/libxslt-1.1.32-3.el8.x86_64.rpm
2020-07-03T09:05:50.508 INFO:teuthology.orchestra.run.smithi058.stdout:[MIRROR] libxslt-1.1.32-3.el8.x86_64.rpm: Status code: 404 for http://packages.oit.ncsu.edu/centos/8/BaseOS/x86_64/os/Packages/libxslt-1.1.32-3.el8.x86_64.rpm
2020-07-03T09:05:50.539 INFO:teuthology.orchestra.run.smithi058.stdout:[MIRROR] socat-1.7.3.2-6.el8.x86_64.rpm: Status code: 404 for http://distro.ibiblio.org/centos/8/AppStream/x86_64/os/Packages/socat-1.7.3.2-6.el8.x86_64.rpm
2020-07-03T09:05:50.539 INFO:teuthology.orchestra.run.smithi058.stdout:[FAILED] socat-1.7.3.2-6.el8.x86_64.rpm: No more mirrors to try - All mirrors were already tried without success
2020-07-03T09:05:50.541 INFO:teuthology.orchestra.run.smithi058.stdout:
2020-07-03T09:05:50.542 INFO:teuthology.orchestra.run.smithi058.stdout:The downloaded packages were saved in cache until the next successful transaction.
2020-07-03T09:05:50.542 INFO:teuthology.orchestra.run.smithi058.stdout:You can remove cached packages by executing 'dnf clean packages'.
2020-07-03T09:05:50.555 INFO:teuthology.orchestra.run.smithi195.stdout: Installing : socat-1.7.3.2-6.el8.x86_64 3/6
2020-07-03T09:05:50.615 INFO:teuthology.orchestra.run.smithi058.stderr:Error: Error downloading packages:
2020-07-03T09:05:50.615 INFO:teuthology.orchestra.run.smithi058.stderr: Cannot download Packages/socat-1.7.3.2-6.el8.x86_64.rpm: All mirrors were tried
2
</pre>
<p><a class="external" href="https://pulpito.ceph.com/swagner-2020-07-03_08:12:34-rados:cephadm-wip-swagner-testing-2020-07-02-1034-distro-basic-smithi/">https://pulpito.ceph.com/swagner-2020-07-03_08:12:34-rados:cephadm-wip-swagner-testing-2020-07-02-1034-distro-basic-smithi/</a></p>
teuthology - Bug #46300 (Resolved): SELinux: denied { module_request } for comm="ksmtuned" kmod=...
https://tracker.ceph.com/issues/46300
2020-07-01T13:11:21Z
Sebastian Wagner
<p><a class="external" href="https://pulpito.ceph.com/swagner-2020-07-01_10:16:24-rados:cephadm-wip-swagner3-testing-2020-07-01-1013-distro-basic-smithi/5194327/">https://pulpito.ceph.com/swagner-2020-07-01_10:16:24-rados:cephadm-wip-swagner3-testing-2020-07-01-1013-distro-basic-smithi/5194327/</a></p>
<p>Saw this today in a PR run:<br /><pre>
2020-07-01T11:23:01.692 INFO:teuthology.orchestra.run.smithi071:> sudo grep -a 'avc: .*denied' /var/log/audit/audit.log | grep -av '\(comm="dmidecode"\|chronyd.service\|name="cephtest"\|scontext=system_u:system_r:nrpe_t:s0\|scontext=system_u:system_r:pcp_pmlogger_t\|scontext=system_u:system_r:pcp_pmcd_t:s0\|comm="rhsmd"\|scontext=system_u:system_r:syslogd_t:s0\|tcontext=system_u:system_r:nrpe_t:s0\|comm="updatedb"\|comm="smartd"\|comm="rhsmcertd-worke"\|comm="setroubleshootd"\|comm="rpm"\|tcontext=system_u:object_r:container_runtime_exec_t:s0\|scontext=system_u:system_r:logrotate_t:s0\)'
2020-07-01T11:23:01.722 DEBUG:teuthology.orchestra.run:got remote process result: 1
2020-07-01T11:23:01.723 ERROR:teuthology.run_tasks:Manager failed: selinux
Traceback (most recent call last):
File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/run_tasks.py", line 171, in run_tasks
suppress = manager.__exit__(*exc_info)
File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/task/__init__.py", line 136, in __exit__
self.teardown()
File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/task/selinux.py", line 158, in teardown
self.get_new_denials()
File "/home/teuthworker/src/git.ceph.com_git_teuthology_master/teuthology/task/selinux.py", line 208, in get_new_denials
denials=new_denials[remote.name])
teuthology.exceptions.SELinuxError: SELinux denials found on ubuntu@smithi174.front.sepia.ceph.com: ['type=AVC msg=audit(1593601294.109:4683): avc: denied { module_request } for pid=18957 comm="ksmtuned" kmod="binfmt-464c" scontext=system_u:system_r:ksmtuned_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=system permissive=1']
</pre></p>
<p>No clue where this comes from. It showed up while testing the following PRs, but they look unrelated:</p>
<ul>
<li><a class="external" href="https://github.com/ceph/ceph/pull/35850">https://github.com/ceph/ceph/pull/35850</a></li>
<li><a class="external" href="https://github.com/ceph/ceph/pull/35846">https://github.com/ceph/ceph/pull/35846</a></li>
<li><a class="external" href="https://github.com/ceph/ceph/pull/35816">https://github.com/ceph/ceph/pull/35816</a></li>
<li><a class="external" href="https://github.com/ceph/ceph/pull/35747">https://github.com/ceph/ceph/pull/35747</a></li>
</ul>
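<p>If the denial turns out to be benign, the usual fix is to extend the whitelist that the selinux task greps away; a small Python sketch of that filter (hypothetical — the real task builds a <code>grep -av</code> expression, and the <code>ksmtuned</code> entry is the addition this bug would need):</p>

```python
# Tokens whose presence marks an AVC denial as expected noise;
# the ksmtuned entry is the hypothetical addition for this bug.
IGNORED_TOKENS = [
    'comm="dmidecode"',
    'comm="updatedb"',
    'comm="smartd"',
    'comm="ksmtuned"',
]

def new_denials(audit_lines):
    """Return AVC denial lines that are not covered by the whitelist."""
    return [
        line for line in audit_lines
        if "avc:" in line and "denied" in line
        and not any(token in line for token in IGNORED_TOKENS)
    ]
```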
sepia - Bug #46299 (Closed): Trying to pull docker.io/prom/prometheus:v2.18.1: too many request t...
https://tracker.ceph.com/issues/46299
2020-07-01T10:35:53Z
Sebastian Wagner
<p><a class="external" href="https://pulpito.ceph.com/swagner-2020-07-01_09:26:21-rados:cephadm-wip-swagner-testing-2020-07-01-0956-distro-basic-smithi/5194228/">https://pulpito.ceph.com/swagner-2020-07-01_09:26:21-rados:cephadm-wip-swagner-testing-2020-07-01-0956-distro-basic-smithi/5194228/</a></p>
<pre>
2020-07-01T10:04:28.253 INFO:tasks.cephadm:Adding local image mirror vossi04.front.sepia.ceph.com:5000
2020-07-01T10:04:28.301 DEBUG:teuthology.orchestra.remote:smithi189:/etc/containers/registries.conf is 4KB
2020-07-01T10:04:28.340 INFO:teuthology.orchestra.run.smithi189:> sudo sh -c 'cat > /etc/containers/registries.conf'
2020-07-01T10:04:28.400 DEBUG:teuthology.orchestra.remote:smithi205:/etc/containers/registries.conf is 4KB
2020-07-01T10:04:28.447 INFO:teuthology.orchestra.run.smithi205:> sudo sh -c 'cat > /etc/containers/registries.conf'
...
2020-07-01T10:14:26.759 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: debug 2020-07-01T10:14:26.700+0000 7f9405ffb700 -1 log_channel(cephadm) log [ERR] : cephadm exited with an error code: 1, stderr:INFO:cephadm:Deploy daemon prometheus.a ...
2020-07-01T10:14:26.759 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:Verifying port 9095 ...
2020-07-01T10:14:26.760 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:Non-zero exit code 125 from /bin/podman run --rm --net=host --ipc=host -e CONTAINER_IMAGE=prom/prometheus:v2.18.1 -e NODE_NAME=smithi205 --entrypoint stat prom/prometheus:v2.18.1 -c %u %g /etc/prometheus
2020-07-01T10:14:26.760 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr Trying to pull registry.access.redhat.com/prom/prometheus:v2.18.1...
2020-07-01T10:14:26.760 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr name unknown: Repo not found
2020-07-01T10:14:26.760 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr Trying to pull registry.fedoraproject.org/prom/prometheus:v2.18.1...
2020-07-01T10:14:26.760 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr manifest unknown: manifest unknown
2020-07-01T10:14:26.761 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr Trying to pull registry.centos.org/prom/prometheus:v2.18.1...
2020-07-01T10:14:26.761 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr manifest unknown: manifest unknown
2020-07-01T10:14:26.761 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr Trying to pull docker.io/prom/prometheus:v2.18.1...
2020-07-01T10:14:26.761 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr time="2020-07-01T10:10:18Z" level=error msg="HEADER map[Cache-Control:[no-cache] Content-Type:[application/json] Retry-After:[60]]"
2020-07-01T10:14:26.761 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr time="2020-07-01T10:11:20Z" level=error msg="HEADER map[Cache-Control:[no-cache] Content-Type:[application/json] Retry-After:[60]]"
2020-07-01T10:14:26.762 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr time="2020-07-01T10:12:22Z" level=error msg="HEADER map[Cache-Control:[no-cache] Content-Type:[application/json] Retry-After:[60]]"
2020-07-01T10:14:26.763 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr time="2020-07-01T10:13:24Z" level=error msg="HEADER map[Cache-Control:[no-cache] Content-Type:[application/json] Retry-After:[60]]"
2020-07-01T10:14:26.763 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr too many request to registry
2020-07-01T10:14:26.763 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr Error: unable to pull prom/prometheus:v2.18.1: 4 errors occurred:
2020-07-01T10:14:26.763 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr * Error initializing source docker://registry.access.redhat.com/prom/prometheus:v2.18.1: Error reading manifest v2.18.1 in registry.access.redhat.com/prom/prometheus: name unknown: Repo not found
2020-07-01T10:14:26.763 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr * Error initializing source docker://registry.fedoraproject.org/prom/prometheus:v2.18.1: Error reading manifest v2.18.1 in registry.fedoraproject.org/prom/prometheus: manifest unknown: manifest unknown
2020-07-01T10:14:26.764 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr * Error initializing source docker://registry.centos.org/prom/prometheus:v2.18.1: Error reading manifest v2.18.1 in registry.centos.org/prom/prometheus: manifest unknown: manifest unknown
2020-07-01T10:14:26.764 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr * Error parsing image configuration: too many request to registry
2020-07-01T10:14:26.764 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: INFO:cephadm:stat:stderr
2020-07-01T10:14:26.764 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: Traceback (most recent call last):
2020-07-01T10:14:26.764 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: File "<stdin>", line 4847, in <module>
2020-07-01T10:14:26.765 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: File "<stdin>", line 1187, in _default_image
2020-07-01T10:14:26.765 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: File "<stdin>", line 2886, in command_deploy
2020-07-01T10:14:26.765 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: File "<stdin>", line 2818, in extract_uid_gid_monitoring
2020-07-01T10:14:26.765 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: File "<stdin>", line 1803, in extract_uid_gid
2020-07-01T10:14:26.766 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: File "<stdin>", line 2280, in run
2020-07-01T10:14:26.766 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: File "<stdin>", line 866, in call_throws
2020-07-01T10:14:26.766 INFO:journalctl@ceph.mgr.x.smithi205.stdout:Jul 01 10:14:26 smithi205 bash[31753]: RuntimeError: Failed command: /bin/podman run --rm --net=host --ipc=host -e CONTAINER_IMAGE=prom/prometheus:v2.18.1 -e NODE_NAME=smithi205 --entrypoint stat prom/prometheus:v2.18.1 -c %u %g /etc/prometheus
</pre>
<p>Changing the registry mirror to <code>docker-mirror.front.sepia.ceph.com:5000</code> should work; I just can't do that myself.</p>
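<p>For reference, pointing podman at the Sepia-local mirror would look roughly like this in <code>/etc/containers/registries.conf</code> (v2 TOML format; the hostname is the one mentioned above, the rest is a sketch):</p>

```toml
# Sketch: route docker.io pulls through the local Sepia mirror to avoid
# upstream registry rate limits.
[[registry]]
prefix = "docker.io"
location = "docker.io"

  [[registry.mirror]]
  location = "docker-mirror.front.sepia.ceph.com:5000"
  insecure = true
```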
teuthology - Bug #45583 (New): teuthology-suite: "--subset" combined with "--filter" generates du...
https://tracker.ceph.com/issues/45583
2020-05-18T11:03:34Z
Sebastian Wagner
<p><a class="external" href="http://pulpito.ceph.com/swagner-2020-05-18_08:24:15-rados-wip-swagner-testing-2020-05-15-2348-distro-basic-smithi/">http://pulpito.ceph.com/swagner-2020-05-18_08:24:15-rados-wip-swagner-testing-2020-05-15-2348-distro-basic-smithi/</a></p>
<p>scheduled via</p>
<pre>
teuthology-suite -k distro --priority 75 --suite rados --filter cephadm --subset 1135/9999 --email swagner@suse.com --ceph wip-swagner-testing-2020-05-15-2348 --machine-type smithi
</pre>
<p>scheduled</p>
<ul>
<li><a class="external" href="http://pulpito.ceph.com/swagner-2020-05-18_08:24:15-rados-wip-swagner-testing-2020-05-15-2348-distro-basic-smithi/5066708">http://pulpito.ceph.com/swagner-2020-05-18_08:24:15-rados-wip-swagner-testing-2020-05-15-2348-distro-basic-smithi/5066708</a> </li>
<li><a class="external" href="http://pulpito.ceph.com/swagner-2020-05-18_08:24:15-rados-wip-swagner-testing-2020-05-15-2348-distro-basic-smithi/5066741">http://pulpito.ceph.com/swagner-2020-05-18_08:24:15-rados-wip-swagner-testing-2020-05-15-2348-distro-basic-smithi/5066741</a></li>
</ul>
<p>both having the description:</p>
<pre>
rados/cephadm/upgrade/{1-start.yaml 2-start-upgrade.yaml 3-wait.yaml distro$/{rhel_8.0.yaml} fixed-2.yaml}
</pre>
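<p>Whatever the root cause in the matrix expansion, the scheduler could deduplicate by job description before submitting; a minimal sketch (hypothetical, not teuthology-suite's actual code):</p>

```python
def dedupe_jobs(jobs):
    """Drop jobs whose description was already scheduled, keeping the
    first occurrence (hypothetical helper)."""
    seen = set()
    unique = []
    for job in jobs:
        description = job["description"]
        if description not in seen:
            seen.add(description)
            unique.append(job)
    return unique
```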
teuthology - Bug #45442 (New): ubuntu 20.04: Hang on: "The following packages will be REMOVED:"
https://tracker.ceph.com/issues/45442
2020-05-08T07:43:16Z
Sebastian Wagner
<p><a class="external" href="http://pulpito.ceph.com/swagner-2020-05-07_16:10:50-rados-wip-swagner2-testing-2020-05-07-1308-distro-basic-smithi/5030975/">http://pulpito.ceph.com/swagner-2020-05-07_16:10:50-rados-wip-swagner2-testing-2020-05-07-1308-distro-basic-smithi/5030975/</a></p>
<pre>
2020-05-07T17:31:41.061 INFO:teuthology.orchestra.run.smithi086:> for d in ceph cephadm ceph-mds ceph-mgr ceph-common ceph-fuse ceph-test radosgw python3-rados python3-rgw python3-cephfs python3-rbd libcephfs2 libcephfs-dev librados2 librbd1 rbd-fuse ; do sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" purge $d || true ; done
2020-05-07T17:31:41.179 INFO:teuthology.orchestra.run.smithi086.stdout:Reading package lists...
2020-05-07T17:31:41.315 INFO:teuthology.orchestra.run.smithi086.stdout:Building dependency tree...
2020-05-07T17:31:41.315 INFO:teuthology.orchestra.run.smithi086.stdout:Reading state information...
2020-05-07T17:31:41.442 INFO:teuthology.orchestra.run.smithi086.stdout:The following packages were automatically installed and are no longer required:
2020-05-07T17:31:41.443 INFO:teuthology.orchestra.run.smithi086.stdout: ceph-mon ceph-osd libboost-iostreams1.71.0
2020-05-07T17:31:41.443 INFO:teuthology.orchestra.run.smithi086.stdout:Use 'sudo apt autoremove' to remove them.
2020-05-07T17:31:41.455 INFO:teuthology.orchestra.run.smithi086.stdout:The following packages will be REMOVED:
2020-05-07T17:31:41.456 INFO:teuthology.orchestra.run.smithi086.stdout: ceph*
2020-05-08T05:03:17.376 DEBUG:teuthology.exit:Got signal 15; running 2 handlers...
2020-05-08T05:03:17.396 DEBUG:teuthology.task.console_log:Killing console logger for smithi086
</pre>
<p>Looks as if <code>-y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold"</code> is still not enough to make the purge fully non-interactive.</p>
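<p>Since the purge apparently sat there for ~11.5 hours before teuthology was killed, a hard timeout around each apt-get call would at least turn the hang into a visible failure; a sketch using Python's subprocess timeout (hypothetical, not teuthology's actual install task):</p>

```python
import subprocess

def run_with_timeout(cmd, timeout_s):
    """Run a command, returning True on success and False on failure
    or timeout, instead of hanging forever (hypothetical helper)."""
    try:
        return subprocess.run(cmd, timeout=timeout_s).returncode == 0
    except subprocess.TimeoutExpired:
        return False

# The purge loop from the log, with each iteration bounded (sketch):
# for pkg in packages:
#     run_with_timeout(["sudo", "DEBIAN_FRONTEND=noninteractive",
#                       "apt-get", "-y", "purge", pkg], 600)
```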
sepia - Bug #45009 (Closed): https://download.ceph.com/keys/release.asc: ignored as the file has ...
https://tracker.ceph.com/issues/45009
2020-04-09T07:47:51Z
Sebastian Wagner
<p><a class="external" href="https://download.ceph.com/keys/release.asc">https://download.ceph.com/keys/release.asc</a> is in a file format that apt does not understand:</p>
<pre>
root@buster:~# wget https://download.ceph.com/keys/release.asc
root@buster:~# file release.asc
release.asc: PGP public key block Public-Key (old)
root@buster:~# cp release.asc /etc/apt/trusted.gpg
root@buster:~# apt update
Hit:1 http://httpredir.debian.org/debian buster InRelease
Hit:2 https://download.ceph.com/debian-octopus buster InRelease
Err:2 https://download.ceph.com/debian-octopus buster InRelease
The following signatures couldn't be verified because the public key is not available: NO_PUBKEY E84AC2C0460F3994
Reading package lists... Done
Building dependency tree
Reading state information... Done
All packages are up to date.
W: http://httpredir.debian.org/debian/dists/buster/InRelease: The key(s) in the keyring /etc/apt/trusted.gpg are ignored as the file has an unsupported filetype.
W: https://download.ceph.com/debian-octopus/dists/buster/InRelease: The key(s) in the keyring /etc/apt/trusted.gpg are ignored as the file has an unsupported filetype.
W: An error occurred during the signature verification. The repository is not updated and the previous index files will be used. GPG error: https://download.ceph.com/debian-octopus buster InRelease: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY E84AC2C0460F3994
W: Failed to fetch https://download.ceph.com/debian-octopus/dists/buster/InRelease The following signatures couldn't be verified because the public key is not available: NO_PUBKEY E84AC2C0460F3994
W: Some index files failed to download. They have been ignored, or old ones used instead.
</pre>
<p>However, after converting it to a GPG v4 keyring via apt-key, it works:</p>
<pre>
root@buster:~# apt-key add release.asc
root@buster:~# file /etc/apt/trusted.gpg
/etc/apt/trusted.gpg: PGP/GPG key public ring (v4) created Tue Sep 15 20:56:41 2015 RSA (Encrypt or Sign) 4096 bits MPI=0xcbaa7e8ef94169f9...
root@buster:~# apt update
Hit:1 http://httpredir.debian.org/debian buster InRelease
Get:2 https://download.ceph.com/debian-octopus buster InRelease [8557 B]
Get:3 https://download.ceph.com/debian-octopus buster/main amd64 Packages [15.7 kB]
Fetched 24.2 kB in 4s (6765 B/s)
Reading package lists... Done
Building dependency tree
Reading state information... Done
All packages are up to date.
root@buster:~# apt-key list
/etc/apt/trusted.gpg
--------------------
pub rsa4096 2015-09-15 [SC]
08B7 3419 AC32 B4E9 66C1 A330 E84A C2C0 460F 3994
uid [ unknown] Ceph.com (release key) <security@ceph.com>
</pre>
<p>This has an impact on cephadm, which needs to install gnupg on <strong>all</strong> cluster machines in order to convert the key to GPG v4.</p>
<p>Can we provide a key in the correct format?</p>
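<p>The underlying difference seems to be that apt only accepts binary OpenPGP keyrings in <code>/etc/apt/trusted.gpg</code>, while <code>release.asc</code> is ASCII-armored; <code>apt-key add</code> effectively dearmors it. A small sketch that tells the two apart (hypothetical helper):</p>

```python
def keyring_kind(data: bytes) -> str:
    """Classify an OpenPGP key file: apt's trusted.gpg must be a binary
    keyring, while .asc files are ASCII-armored (sketch)."""
    if data.lstrip().startswith(b"-----BEGIN PGP PUBLIC KEY BLOCK-----"):
        return "armored"
    # Binary OpenPGP packets always have the high bit of the first
    # byte set (RFC 4880, section 4.2).
    if data and data[0] & 0x80:
        return "binary"
    return "unknown"
```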
teuthology - Bug #44181 (New): Error in syslog: task.internal.syslog: random "*BUG*" in log message
https://tracker.ceph.com/issues/44181
2020-02-18T10:33:27Z
Sebastian Wagner
<p><a class="external" href="http://pulpito.ceph.com/sage-2020-02-18_02:48:28-rados-wip-sage4-testing-2020-02-17-1727-distro-basic-smithi/4776502">http://pulpito.ceph.com/sage-2020-02-18_02:48:28-rados-wip-sage4-testing-2020-02-17-1727-distro-basic-smithi/4776502</a></p>
<p>This job failure was caused by</p>
<p><a class="external" href="https://github.com/ceph/teuthology/blob/291d40053a7a5caedc1d683f08d25399bc3b9ccd/teuthology/task/internal/syslog.py#L98-L144">https://github.com/ceph/teuthology/blob/291d40053a7a5caedc1d683f08d25399bc3b9ccd/teuthology/task/internal/syslog.py#L98-L144</a></p>
<pre>
2020-02-18T06:48:18.388 ERROR:teuthology.task.internal.syslog:Error in syslog on ubuntu@smithi060.front.sepia.ceph.com: /home/ubuntu/cephtest/archive/syslog/misc.log:2020-02-18T06:42:25.361371+00:00 smithi060 bash[10468]: audit 2020-02-18T06:42:24.442267+0000 mon.a (mon.0) 82 : audit [INF] from='mgr.14124 172.21.15.60:0/1' entity='mgr.y' cmd=[{"prefix":"config-key set","key":"mgr/dashboard/key","val":"-----BEGIN PRIVATE KEY-----\nMIIEvQIBADANBgkqhkiG9w0BAQEFAASCBKcwggSjAgEAAoIBAQCnskmhDB10Jk6M\nXNpzP+7hOWVV7TYIeAGSapYoNcgQcPrQU0STPGuyUORmnKO4taTVz8EBvPL4p6Mv\niZFEIhL2OL07UexgDqKaAD4lne/KIhYQJVtkqPu/TYYemxa6xyl/V4LGPrSGYx0C\n8huP7RsqEMNLNMr/wG34hG7LCdGtcWk8aylma8XrXgukEHMsJJIeb+ZZKw3AnZCT\nbO++2B+V5DPtE4LId4x3G1PrVumH/whd6ciTtcImFspCRQgwlaPnDLf68bFXF67G\nBWJZZRFoTCc73fu/rUW1vGYk7WFiVi52WVbpgYrPc/AhWNOaH6d0xooBoohRjAYh\n7GDdVa+LAgMBAAECggEAM1HEZpymht0SPLJNx+dQ22wNLvahCoZvNLeZrESJLT7m\nAsr4uXZMHw3SV/SnxecQwr4JetawJJhowCuBYTBsTR2gC39OrzbLXAWm/ywOLfWw\netBz36I3KJw45zTfB9nbQTUuuCyIYngCcNxWwvz0yzLGEUXeudXR0bP1k/01RbZo\nhGe8oQSJzSN4zmfQtx/rSGCXJr23HUjPs0mVHqml2bZL9UZcuKu5RuN8PWSo0aOM\nZwGYa/1pcoo1OsN3XtujY9tU0Ykrd3rteAARMIBFzrktaWWhSdaiOQS0fYAnyGrX\njI7cjlsbtJfTt741wF0hmCZIGS40+HWTmwCTkapWwQKBgQDQkX4ZHyvFD96W81rz\nXLIdSEfgv0+andTC2v5kvlk4cxIYgic2g1R59gekZioOpVIQG/eCwiSFW5ndqjzI\nSGMj2bflL8vXv5q4EX+TL3W5LWOnR8k7FVxJpPsJbbyX93qbpcU/oKNSt5VDUPaL\neooha6lDP+HEAdfWHqe1PxP+9wKBgQDN1UzVWU8ur2tdlql2BVrwfi/J32/ZUFQG\nCPvC9RMdavZjKITu2Rg5LA6kYOnJ9MvVTU59Mf2c+6kKWQTaRhqTqhfAJYtjvmoC\naTGm7HGPywEOeMphF+LAb23DNcCzQFhBVduOfL8MSkTjJOjmaxyYc2qs+ts87NMt\nqCENAaPrDQKBgHvuV/1ZdkqsOVl81QhShku8DWnQg96d9jSqqAr4yE8woQoLHH3Z\n37JwrO3U/xygw3hrBdGextCvM2hxpZhk2vQMhKcclYVnhunlC+dLhio4fESD9WC0\nOphP/hMGL9Ak76fZArHiI+ocyAat7zHF6JofPP6G0QIFDlle8cxS5PDVAoGBAMRB\nByQ5JkV2HqG6YFNWYdICDuClOQj0DVk/wYSulY4sCUacQLtXpUAF4OQcP20/CgaT\n0i2Ot6ixTwi9veG8i+SVflXHtnLhAETSNfRZZyHaRmSdCSGwW5Rt6jMBkn2W8U9C\nZLgj+yjlu270J1hjcn1tNp4+BUG+8M+Mig7TrI4VAoGAYYltCD4bc2bBAPWnF6nk\nqrx16kKg0kjNdhATkBWt76jpsJYRmyo5NALLaB5/k0dS7ftu
TmGEZLSnNyl44O2B\n7QH6PaRoP0hX5LtLwSZhiJxd6tDrfwMFzpVGiJHeUNKGS/GKQzlvlxUJb2aOhNWu\nMgFlLWfPOMgxiRpwUhtg0Is=\n-----END PRIVATE KEY-----\n"}]: dispatch
</pre>
<p>Unfortunately, this log message contains "<code>BUG</code>" somewhere inside the base64-encoded key.</p>
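<p>A scanner that anchors on the kernel's actual "<code>BUG:</code>" prefix instead of a bare substring would not trip over base64 key material, since '<code>:</code>' never occurs inside base64; a hedged sketch (not teuthology's actual syslog check):</p>

```python
import re

# Kernel problems show up as lines like "BUG: unable to handle ...";
# a bare substring match also fires on "BUG" inside base64 blobs.
KERNEL_BUG = re.compile(r'\bBUG:')

def looks_like_kernel_bug(line: str) -> bool:
    return bool(KERNEL_BUG.search(line))
```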
rbd - Bug #43274 (Need More Info): unittest_rbd_mirror: Exception: SegFault
https://tracker.ceph.com/issues/43274
2019-12-12T09:30:06Z
Sebastian Wagner
<p>Unfortunately, I don't know what exactly went wrong:</p>
<pre>
185/191 Test #184: unittest_rbd_mirror .....................***Exception: SegFault 11.74 sec
[==========] Running 279 tests from 34 test suites.
[----------] Global test environment set-up.
[----------] 13 tests from TestMockImageMap
[ RUN ] TestMockImageMap.SetLocalImages
seed 1526
[ OK ] TestMockImageMap.SetLocalImages (8 ms)
[ RUN ] TestMockImageMap.AddRemoveLocalImage
[ OK ] TestMockImageMap.AddRemoveLocalImage (25 ms)
[ RUN ] TestMockImageMap.AddRemoveRemoteImage
[ OK ] TestMockImageMap.AddRemoveRemoteImage (15 ms)
[ RUN ] TestMockImageMap.AddRemoveRemoteImageDuplicateNotification
[ OK ] TestMockImageMap.AddRemoveRemoteImageDuplicateNotification (5 ms)
[ RUN ] TestMockImageMap.AcquireImageErrorRetry
[ OK ] TestMockImageMap.AcquireImageErrorRetry (2 ms)
[ RUN ] TestMockImageMap.RemoveRemoteAndLocalImage
[ OK ] TestMockImageMap.RemoveRemoteAndLocalImage (2 ms)
[ RUN ] TestMockImageMap.AddInstance
[ OK ] TestMockImageMap.AddInstance (4 ms)
[ RUN ] TestMockImageMap.RemoveInstance
[ OK ] TestMockImageMap.RemoveInstance (7 ms)
[ RUN ] TestMockImageMap.AddInstancePingPongImageTest
[ OK ] TestMockImageMap.AddInstancePingPongImageTest (34 ms)
[ RUN ] TestMockImageMap.RemoveInstanceWithRemoveImage
[ OK ] TestMockImageMap.RemoveInstanceWithRemoveImage (23 ms)
[ RUN ] TestMockImageMap.AddErrorAndRemoveImage
[ OK ] TestMockImageMap.AddErrorAndRemoveImage (35 ms)
[ RUN ] TestMockImageMap.MirrorUUIDUpdated
[ OK ] TestMockImageMap.MirrorUUIDUpdated (44 ms)
[ RUN ] TestMockImageMap.RebalanceImageMap
[ OK ] TestMockImageMap.RebalanceImageMap (40 ms)
[----------] 13 tests from TestMockImageMap (244 ms total)
[----------] 14 tests from TestMockImageReplayer
[ RUN ] TestMockImageReplayer.StartStop
Failed to load class: cas (/home/jenkins-build/build/workspace/ceph-pull-requests/build/lib/libcls_cas.so): /home/jenkins-build/build/workspace/ceph-pull-requests/build/lib/libcls_cas.so: undefined symbol: _Z13cls_has_chunkPvNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE
Failed to load class: log (/home/jenkins-build/build/workspace/ceph-pull-requests/build/lib/libcls_log.so): /home/jenkins-build/build/workspace/ceph-pull-requests/build/lib/libcls_log.so: undefined symbol: _Z24cls_cxx_map_write_headerPvPN4ceph6buffer7v14_2_04listE
Failed to load class: rgw (/home/jenkins-build/build/workspace/ceph-pull-requests/build/lib/libcls_rgw.so): /home/jenkins-build/build/workspace/ceph-pull-requests/build/lib/libcls_rgw.so: undefined symbol: _Z19cls_current_versionPv
Failed to load class: user (/home/jenkins-build/build/workspace/ceph-pull-requests/build/lib/libcls_user.so): /home/jenkins-build/build/workspace/ceph-pull-requests/build/lib/libcls_user.so: undefined symbol: _Z24cls_cxx_map_write_headerPvPN4ceph6buffer7v14_2_04listE
[ OK ] TestMockImageReplayer.StartStop (317 ms)
[ RUN ] TestMockImageReplayer.LocalImagePrimary
[ OK ] TestMockImageReplayer.LocalImagePrimary (146 ms)
[ RUN ] TestMockImageReplayer.LocalImageDNE
[ OK ] TestMockImageReplayer.LocalImageDNE (196 ms)
[ RUN ] TestMockImageReplayer.PrepareLocalImageError
[ OK ] TestMockImageReplayer.PrepareLocalImageError (194 ms)
[ RUN ] TestMockImageReplayer.GetRemoteImageIdDNE
[ OK ] TestMockImageReplayer.GetRemoteImageIdDNE (174 ms)
[ RUN ] TestMockImageReplayer.GetRemoteImageIdNonLinkedDNE
[ OK ] TestMockImageReplayer.GetRemoteImageIdNonLinkedDNE (224 ms)
[ RUN ] TestMockImageReplayer.GetRemoteImageIdError
[ OK ] TestMockImageReplayer.GetRemoteImageIdError (228 ms)
[ RUN ] TestMockImageReplayer.BootstrapError
[ OK ] TestMockImageReplayer.BootstrapError (154 ms)
[ RUN ] TestMockImageReplayer.StopBeforeBootstrap
[ OK ] TestMockImageReplayer.StopBeforeBootstrap (215 ms)
[ RUN ] TestMockImageReplayer.StartExternalReplayError
[ OK ] TestMockImageReplayer.StartExternalReplayError (152 ms)
[ RUN ] TestMockImageReplayer.StopError
[ OK ] TestMockImageReplayer.StopError (169 ms)
[ RUN ] TestMockImageReplayer.Replay
[ OK ] TestMockImageReplayer.Replay (177 ms)
[ RUN ] TestMockImageReplayer.DecodeError
[ OK ] TestMockImageReplayer.DecodeError (157 ms)
[ RUN ] TestMockImageReplayer.DelayedReplay
[ OK ] TestMockImageReplayer.DelayedReplay (2153 ms)
[----------] 14 tests from TestMockImageReplayer (4663 ms total)
[----------] 5 tests from TestMockImageSync
[ RUN ] TestMockImageSync.SimpleSync
[ OK ] TestMockImageSync.SimpleSync (198 ms)
[ RUN ] TestMockImageSync.RestartSync
[ OK ] TestMockImageSync.RestartSync (173 ms)
[ RUN ] TestMockImageSync.CancelNotifySyncRequest
[ OK ] TestMockImageSync.CancelNotifySyncRequest (159 ms)
[ RUN ] TestMockImageSync.CancelImageCopy
[ OK ] TestMockImageSync.CancelImageCopy (195 ms)
[ RUN ] TestMockImageSync.CancelAfterCopyImage
[ OK ] TestMockImageSync.CancelAfterCopyImage (166 ms)
[----------] 5 tests from TestMockImageSync (898 ms total)
[----------] 3 tests from TestMockInstanceReplayer
[ RUN ] TestMockInstanceReplayer.AcquireReleaseImage
[ OK ] TestMockInstanceReplayer.AcquireReleaseImage (16 ms)
[ RUN ] TestMockInstanceReplayer.RemoveFinishedImage
[ OK ] TestMockInstanceReplayer.RemoveFinishedImage (24 ms)
[ RUN ] TestMockInstanceReplayer.Reacquire
[ OK ] TestMockInstanceReplayer.Reacquire (2 ms)
[----------] 3 tests from TestMockInstanceReplayer (42 ms total)
[----------] 11 tests from TestMockInstanceWatcher
[ RUN ] TestMockInstanceWatcher.InitShutdown
[ OK ] TestMockInstanceWatcher.InitShutdown (23 ms)
[ RUN ] TestMockInstanceWatcher.InitError
[ OK ] TestMockInstanceWatcher.InitError (18 ms)
[ RUN ] TestMockInstanceWatcher.ShutdownError
[ OK ] TestMockInstanceWatcher.ShutdownError (15 ms)
[ RUN ] TestMockInstanceWatcher.Remove
[ OK ] TestMockInstanceWatcher.Remove (16 ms)
[ RUN ] TestMockInstanceWatcher.RemoveNoent
[ OK ] TestMockInstanceWatcher.RemoveNoent (12 ms)
[ RUN ] TestMockInstanceWatcher.ImageAcquireRelease
[ OK ] TestMockInstanceWatcher.ImageAcquireRelease (36 ms)
[ RUN ] TestMockInstanceWatcher.PeerImageRemoved
[ OK ] TestMockInstanceWatcher.PeerImageRemoved (36 ms)
[ RUN ] TestMockInstanceWatcher.ImageAcquireReleaseCancel
[ OK ] TestMockInstanceWatcher.ImageAcquireReleaseCancel (31 ms)
[ RUN ] TestMockInstanceWatcher.PeerImageAcquireWatchDNE
[ OK ] TestMockInstanceWatcher.PeerImageAcquireWatchDNE (17 ms)
[ RUN ] TestMockInstanceWatcher.PeerImageReleaseWatchDNE
[ OK ] TestMockInstanceWatcher.PeerImageReleaseWatchDNE (32 ms)
[ RUN ] TestMockInstanceWatcher.PeerImageRemovedCancel
[ OK ] TestMockInstanceWatcher.PeerImageRemovedCancel (12 ms)
[----------] 11 tests from TestMockInstanceWatcher (250 ms total)
[----------] 11 tests from TestMockInstanceWatcher_NotifySync
[ RUN ] TestMockInstanceWatcher_NotifySync.StartStopOnLeader
[ OK ] TestMockInstanceWatcher_NotifySync.StartStopOnLeader (48 ms)
[ RUN ] TestMockInstanceWatcher_NotifySync.CancelStartedOnLeader
[ OK ] TestMockInstanceWatcher_NotifySync.CancelStartedOnLeader (49 ms)
[ RUN ] TestMockInstanceWatcher_NotifySync.StartStopOnNonLeader
[ OK ] TestMockInstanceWatcher_NotifySync.StartStopOnNonLeader (36 ms)
[ RUN ] TestMockInstanceWatcher_NotifySync.CancelStartedOnNonLeader
[ OK ] TestMockInstanceWatcher_NotifySync.CancelStartedOnNonLeader (41 ms)
[ RUN ] TestMockInstanceWatcher_NotifySync.CancelWaitingOnNonLeader
[ OK ] TestMockInstanceWatcher_NotifySync.CancelWaitingOnNonLeader (46 ms)
[ RUN ] TestMockInstanceWatcher_NotifySync.InFlightPrevNotification
[ OK ] TestMockInstanceWatcher_NotifySync.InFlightPrevNotification (46 ms)
[ RUN ] TestMockInstanceWatcher_NotifySync.NoInFlightReleaseAcquireLeader
[ OK ] TestMockInstanceWatcher_NotifySync.NoInFlightReleaseAcquireLeader (46 ms)
[ RUN ] TestMockInstanceWatcher_NotifySync.StartedOnLeaderReleaseLeader
[ OK ] TestMockInstanceWatcher_NotifySync.StartedOnLeaderReleaseLeader (34 ms)
[ RUN ] TestMockInstanceWatcher_NotifySync.WaitingOnLeaderReleaseLeader
[ OK ] TestMockInstanceWatcher_NotifySync.WaitingOnLeaderReleaseLeader (46 ms)
[ RUN ] TestMockInstanceWatcher_NotifySync.StartedOnNonLeaderAcquireLeader
[ OK ] TestMockInstanceWatcher_NotifySync.StartedOnNonLeaderAcquireLeader (29 ms)
[ RUN ] TestMockInstanceWatcher_NotifySync.WaitingOnNonLeaderAcquireLeader
[ OK ] TestMockInstanceWatcher_NotifySync.WaitingOnNonLeaderAcquireLeader (34 ms)
[----------] 11 tests from TestMockInstanceWatcher_NotifySync (456 ms total)
[----------] 4 tests from TestMockLeaderWatcher
[ RUN ] TestMockLeaderWatcher.InitShutdown
[ OK ] TestMockLeaderWatcher.InitShutdown (33 ms)
[ RUN ] TestMockLeaderWatcher.InitReleaseShutdown
[ OK ] TestMockLeaderWatcher.InitReleaseShutdown (19 ms)
[ RUN ] TestMockLeaderWatcher.AcquireError
[ OK ] TestMockLeaderWatcher.AcquireError (12 ms)
[ RUN ] TestMockLeaderWatcher.Break
[ OK ] TestMockLeaderWatcher.Break (2012 ms)
[----------] 4 tests from TestMockLeaderWatcher (2076 ms total)
[----------] 12 tests from TestMockMirrorStatusUpdater
[ RUN ] TestMockMirrorStatusUpdater.InitShutDown
[ OK ] TestMockMirrorStatusUpdater.InitShutDown (13 ms)
[ RUN ] TestMockMirrorStatusUpdater.InitStatusWatcherError
[ OK ] TestMockMirrorStatusUpdater.InitStatusWatcherError (26 ms)
[ RUN ] TestMockMirrorStatusUpdater.ShutDownStatusWatcherError
[ OK ] TestMockMirrorStatusUpdater.ShutDownStatusWatcherError (14 ms)
[ RUN ] TestMockMirrorStatusUpdater.SmallBatch
[ OK ] TestMockMirrorStatusUpdater.SmallBatch (24 ms)
[ RUN ] TestMockMirrorStatusUpdater.LargeBatch
[ OK ] TestMockMirrorStatusUpdater.LargeBatch (30 ms)
[ RUN ] TestMockMirrorStatusUpdater.OverwriteStatus
[ OK ] TestMockMirrorStatusUpdater.OverwriteStatus (11 ms)
[ RUN ] TestMockMirrorStatusUpdater.OverwriteStatusInFlight
[ OK ] TestMockMirrorStatusUpdater.OverwriteStatusInFlight (7 ms)
[ RUN ] TestMockMirrorStatusUpdater.ImmediateUpdate
[ OK ] TestMockMirrorStatusUpdater.ImmediateUpdate (9 ms)
[ RUN ] TestMockMirrorStatusUpdater.RemoveIdleStatus
[ OK ] TestMockMirrorStatusUpdater.RemoveIdleStatus (20 ms)
[ RUN ] TestMockMirrorStatusUpdater.RemoveInFlightStatus
[ OK ] TestMockMirrorStatusUpdater.RemoveInFlightStatus (9 ms)
[ RUN ] TestMockMirrorStatusUpdater.ShutDownWhileUpdating
[ OK ] TestMockMirrorStatusUpdater.ShutDownWhileUpdating (14 ms)
[ RUN ] TestMockMirrorStatusUpdater.MirrorPeerSitePing
[ OK ] TestMockMirrorStatusUpdater.MirrorPeerSitePing (24 ms)
[----------] 12 tests from TestMockMirrorStatusUpdater (201 ms total)
[----------] 6 tests from TestMockNamespaceReplayer
[ RUN ] TestMockNamespaceReplayer.Init_LocalMirrorStatusUpdaterError
[ OK ] TestMockNamespaceReplayer.Init_LocalMirrorStatusUpdaterError (55 ms)
[ RUN ] TestMockNamespaceReplayer.Init_RemoteMirrorStatusUpdaterError
[ OK ] TestMockNamespaceReplayer.Init_RemoteMirrorStatusUpdaterError (32 ms)
[ RUN ] TestMockNamespaceReplayer.Init_InstanceReplayerError
[ OK ] TestMockNamespaceReplayer.Init_InstanceReplayerError (12 ms)
[ RUN ] TestMockNamespaceReplayer.Init_InstanceWatcherError
[ OK ] TestMockNamespaceReplayer.Init_InstanceWatcherError (20 ms)
[ RUN ] TestMockNamespaceReplayer.Init
[ OK ] TestMockNamespaceReplayer.Init (16 ms)
[ RUN ] TestMockNamespaceReplayer.AcuqireLeader
[ OK ] TestMockNamespaceReplayer.AcuqireLeader (9 ms)
[----------] 6 tests from TestMockNamespaceReplayer (144 ms total)
[----------] 4 tests from TestMockPoolReplayer
[ RUN ] TestMockPoolReplayer.ConfigKeyOverride
[ OK ] TestMockPoolReplayer.ConfigKeyOverride (47 ms)
[ RUN ] TestMockPoolReplayer.AcquireReleaseLeader
[ OK ] TestMockPoolReplayer.AcquireReleaseLeader (55 ms)
[ RUN ] TestMockPoolReplayer.Namespaces
[ OK ] TestMockPoolReplayer.Namespaces (2075 ms)
[ RUN ] TestMockPoolReplayer.NamespacesError
</pre>
<p><a class="external" href="https://jenkins.ceph.com/job/ceph-pull-requests/40443/console">https://jenkins.ceph.com/job/ceph-pull-requests/40443/console</a></p>
<p><a class="external" href="https://github.com/ceph/ceph/pull/32182">https://github.com/ceph/ceph/pull/32182</a></p>
rgw - Bug #40902 (Duplicate): make check: unittest_rgw_reshard_wait failed (ReshardWait.wait_yield)
https://tracker.ceph.com/issues/40902
2019-07-23T09:13:40Z
Sebastian Wagner
<p><a class="external" href="https://jenkins.ceph.com/job/ceph-pull-requests/29742">https://jenkins.ceph.com/job/ceph-pull-requests/29742</a></p>
<pre>
155/178 Test #162: unittest_rgw_reshard_wait ...............***Failed 1.06 sec
Running main() from gmock_main.cc
[==========] Running 5 tests from 1 test suite.
[----------] Global test environment set-up.
[----------] 5 tests from ReshardWait
[ RUN ] ReshardWait.wait_block
[ OK ] ReshardWait.wait_block (10 ms)
[ RUN ] ReshardWait.stop_block
[ OK ] ReshardWait.stop_block (13 ms)
[ RUN ] ReshardWait.wait_yield
/home/jenkins-build/build/workspace/ceph-pull-requests/src/test/rgw/test_rgw_reshard_wait.cc:72: Failure
Expected equality of these values:
1u
Which is: 1
context.poll()
Which is: 2
/home/jenkins-build/build/workspace/ceph-pull-requests/src/test/rgw/test_rgw_reshard_wait.cc:73: Failure
Value of: context.stopped()
Actual: true
Expected: false
/home/jenkins-build/build/workspace/ceph-pull-requests/src/test/rgw/test_rgw_reshard_wait.cc:75: Failure
Expected equality of these values:
1u
Which is: 1
context.run_one()
Which is: 0
[ FAILED ] ReshardWait.wait_yield (15 ms)
[ RUN ] ReshardWait.stop_yield
[ OK ] ReshardWait.stop_yield (10 ms)
[ RUN ] ReshardWait.stop_multiple
[ OK ] ReshardWait.stop_multiple (20 ms)
[----------] 5 tests from ReshardWait (68 ms total)
[----------] Global test environment tear-down
[==========] 5 tests from 1 test suite ran. (68 ms total)
[ PASSED ] 4 tests.
[ FAILED ] 1 test, listed below:
[ FAILED ] ReshardWait.wait_yield
1 FAILED TEST
</pre>
<p>Sorry, but I cannot provide any details, as the PR was not related to this failure.</p>
rgw - Documentation #38721 (Resolved): Remove OpenStack Kilo reference in the keystone documentation
https://tracker.ceph.com/issues/38721
2019-03-13T12:14:49Z
Sebastian Wagner
<p>Kilo is EOL since 2016: <a class="external" href="https://releases.openstack.org/">https://releases.openstack.org/</a></p>
<p>Ocata is the oldest (non-EOLed) version of OpenStack.</p>
<p>Relates to <a class="external" href="http://tracker.ceph.com/issues/18197">http://tracker.ceph.com/issues/18197</a></p>
rbd - Bug #22253 (Can't reproduce): "rbd info" crashed: stack smashing detected
https://tracker.ceph.com/issues/22253
2017-11-27T14:37:58Z
Sebastian Wagner
<p>Environment: a fairly small vstart cluster.</p>
<p>This is the stack trace:<br /><pre>
#3 0x00007fffed44711c in __GI___fortify_fail (msg=<optimized out>, msg@entry=0x7fffed4bd441 "stack smashing detected") at fortify_fail.c:37
#4 0x00007fffed4470c0 in __stack_chk_fail () at stack_chk_fail.c:28
#5 0x00007ffff78f0beb in librbd::ImageCtx::perf_start (this=this@entry=0x555555b7bf70, name="librbd-8c39e2ae8944a-rbd-huge2") at /home/sebastian/Repos/ceph/src/librbd/ImageCtx.cc:397
#6 0x00007ffff78f3cb4 in librbd::ImageCtx::init (this=0x555555b7bf70) at /home/sebastian/Repos/ceph/src/librbd/ImageCtx.cc:275
#7 0x00007ffff799dacd in librbd::image::OpenRequest<librbd::ImageCtx>::send_register_watch (this=this@entry=0x555555b7fe60) at /home/sebastian/Repos/ceph/src/librbd/image/OpenRequest.cc:477
#8 0x00007ffff79a3102 in librbd::image::OpenRequest<librbd::ImageCtx>::handle_v2_apply_metadata (this=this@entry=0x555555b7fe60, result=result@entry=0x7fffb77fa374) at /home/sebastian/Repos/ceph/src/librbd/image/OpenRequest.cc:471
#9 0x00007ffff79a351f in librbd::util::detail::rados_state_callback<librbd::image::OpenRequest<librbd::ImageCtx>, &librbd::image::OpenRequest<librbd::ImageCtx>::handle_v2_apply_metadata, true> (c=<optimized out>, arg=0x555555b7fe60) at /home/sebastian/Repos/ceph/src/librbd/Utils.h:39
#10 0x00007ffff75d678d in librados::C_AioComplete::finish (this=0x7fffd0001b60, r=<optimized out>) at /home/sebastian/Repos/ceph/src/librados/AioCompletionImpl.h:169
#11 0x0000555555613949 in Context::complete (this=0x7fffd0001b60, r=<optimized out>) at /home/sebastian/Repos/ceph/src/include/Context.h:70
#12 0x00007fffeeab6010 in Finisher::finisher_thread_entry (this=0x555555acb3e8) at /home/sebastian/Repos/ceph/src/common/Finisher.cc:72
#13 0x00007fffee3a86ba in start_thread (arg=0x7fffb77fe700) at pthread_create.c:333
#14 0x00007fffed4353dd in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:109
</pre></p>