Bug #18744
Updated by Nathan Cutler about 7 years ago
Jewel 10.2.6 integration testing is suffering from high rates of "saw valgrind issues" failures in rgw runs. Here is a typical run, with scads of "saw valgrind issues" failures: http://pulpito.ceph.com/smithfarm-2017-01-30_12:10:53-rgw-wip-jewel-backports-distro-basic-smithi/

Upon closer examination, Shaman is correctly being queried for the notcmalloc flavor of the build, like so: https://shaman.ceph.com/api/search/?status=ready&project=ceph&flavor=notcmalloc&distros=centos%2F7%2Fx86_64&sha1=b671230f7f70b620905eb02c6dbd93d051b53fb7

This Shaman query returns JSON like this:

<pre>
[
  {
    "status": "ready",
    "sha1": "b671230f7f70b620905eb02c6dbd93d051b53fb7",
    "extra": {
      "build_url": "https://jenkins.ceph.com/job/ceph-dev-new-build/ARCH=x86_64,AVAILABLE_ARCH=x86_64,AVAILABLE_DIST=centos7,DIST=centos7,MACHINE_SIZE=huge/847/",
      "root_build_cause": "SCMTRIGGER",
      "version": "10.2.5-6034-gb671230",
      "node_name": "172.21.1.42+slave-centos05",
      "job_name": "ceph-dev-new-build/ARCH=x86_64,AVAILABLE_ARCH=x86_64,AVAILABLE_DIST=centos7,DIST=centos7,MACHINE_SIZE=huge",
      "package_manager_version": "10.2.5-6034.gb671230"
    },
    "url": "https://4.chacra.ceph.com/r/ceph/wip-jewel-backports/b671230f7f70b620905eb02c6dbd93d051b53fb7/centos/7/flavors/notcmalloc/",
    "distro_codename": null,
    "modified": "2017-01-30 00:03:31.999555",
    "distro_version": "7",
    "project": "ceph",
    "flavor": "notcmalloc",
    "ref": "wip-jewel-backports",
    "chacra_url": "https://4.chacra.ceph.com/repos/ceph/wip-jewel-backports/b671230f7f70b620905eb02c6dbd93d051b53fb7/centos/7/flavors/notcmalloc/",
    "archs": ["x86_64", "source"],
    "distro": "centos"
  }
]
</pre>

Examining the build log from https://jenkins.ceph.com/job/ceph-dev-new-build/ARCH=x86_64,AVAILABLE_ARCH=x86_64,AVAILABLE_DIST=centos7,DIST=centos7,MACHINE_SIZE=huge/847/ it is obvious that the Ceph daemons are being linked with libtcmalloc:

<pre>
Processing files: ceph-mon-10.2.5-6034.gb671230.el7.x86_64
Provides: ceph-mon = 1:10.2.5-6034.gb671230.el7 ceph-mon(x86-64) =
1:10.2.5-6034.gb671230.el7
Requires(interp): /bin/sh /bin/sh /bin/sh
Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(FileDigests) <= 4.6.0-1 rpmlib(PartialHardlinkSets) <= 4.0.4-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1
Requires(post): /bin/sh
Requires(preun): /bin/sh
Requires(postun): /bin/sh
Requires: /usr/bin/env ld-linux-x86-64.so.2()(64bit) ld-linux-x86-64.so.2(GLIBC_2.3)(64bit)
  libboost_iostreams-mt.so.1.53.0()(64bit) libboost_random-mt.so.1.53.0()(64bit) libboost_system-mt.so.1.53.0()(64bit) libboost_thread-mt.so.1.53.0()(64bit)
  libc.so.6()(64bit) libc.so.6(GLIBC_2.10)(64bit) libc.so.6(GLIBC_2.14)(64bit) libc.so.6(GLIBC_2.16)(64bit) libc.so.6(GLIBC_2.2.5)(64bit) libc.so.6(GLIBC_2.3)(64bit)
  libc.so.6(GLIBC_2.3.2)(64bit) libc.so.6(GLIBC_2.3.3)(64bit) libc.so.6(GLIBC_2.3.4)(64bit) libc.so.6(GLIBC_2.4)(64bit) libc.so.6(GLIBC_2.5)(64bit)
  libc.so.6(GLIBC_2.6)(64bit) libc.so.6(GLIBC_2.7)(64bit) libc.so.6(GLIBC_2.8)(64bit) libc.so.6(GLIBC_2.9)(64bit)
  libdl.so.2()(64bit) libdl.so.2(GLIBC_2.2.5)(64bit) libgcc_s.so.1()(64bit) libgcc_s.so.1(GCC_3.0)(64bit) libleveldb.so.1()(64bit)
  libm.so.6()(64bit) libm.so.6(GLIBC_2.2.5)(64bit) libnspr4.so()(64bit)
  libnss3.so()(64bit) libnss3.so(NSS_3.12.5)(64bit) libnss3.so(NSS_3.12.9)(64bit) libnss3.so(NSS_3.2)(64bit) libnss3.so(NSS_3.3)(64bit)
  libpthread.so.0()(64bit) libpthread.so.0(GLIBC_2.12)(64bit) libpthread.so.0(GLIBC_2.2.5)(64bit) libpthread.so.0(GLIBC_2.3.2)(64bit)
  librt.so.1()(64bit) librt.so.1(GLIBC_2.2.5)(64bit) libsnappy.so.1()(64bit)
  libstdc++.so.6()(64bit) libstdc++.so.6(CXXABI_1.3)(64bit) libstdc++.so.6(CXXABI_1.3.1)(64bit) libstdc++.so.6(CXXABI_1.3.5)(64bit) libstdc++.so.6(CXXABI_1.3.7)(64bit)
  libstdc++.so.6(GLIBCXX_3.4)(64bit) libstdc++.so.6(GLIBCXX_3.4.11)(64bit) libstdc++.so.6(GLIBCXX_3.4.14)(64bit) libstdc++.so.6(GLIBCXX_3.4.15)(64bit)
  libstdc++.so.6(GLIBCXX_3.4.18)(64bit) libstdc++.so.6(GLIBCXX_3.4.19)(64bit) libstdc++.so.6(GLIBCXX_3.4.9)(64bit)
  libtcmalloc.so.4()(64bit) libz.so.1()(64bit)
python(abi) = 2.7 rtld(GNU_HASH) </pre>

Note the libtcmalloc.so.4 dependency, which should not appear in a notcmalloc build. The same holds for the other daemon packages. See #18084 for the last time this happened.
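The mismatch can be spotted mechanically: the Shaman record claims flavor "notcmalloc" while the RPM's Requires list pulls in libtcmalloc.so.4. A minimal sketch of such a consistency check is below; the data is trimmed from the Shaman response and build log quoted above, and the helper name <code>mislinked</code> is hypothetical, not part of any Shaman or teuthology tooling:

<pre><code class="python">
import json

# Fields that matter, trimmed from the Shaman JSON quoted in this ticket.
shaman_response = json.loads("""
[{"status": "ready",
  "sha1": "b671230f7f70b620905eb02c6dbd93d051b53fb7",
  "flavor": "notcmalloc",
  "ref": "wip-jewel-backports",
  "distro": "centos", "distro_version": "7"}]
""")

# Subset of the "Requires:" list from the ceph-mon RPM in the build log.
rpm_requires = [
    "libboost_system-mt.so.1.53.0()(64bit)",
    "libleveldb.so.1()(64bit)",
    "libtcmalloc.so.4()(64bit)",
    "libz.so.1()(64bit)",
]

def mislinked(build, requires):
    """True when a build advertised as notcmalloc still requires libtcmalloc."""
    return build["flavor"] == "notcmalloc" and any(
        dep.startswith("libtcmalloc") for dep in requires)

build = shaman_response[0]
if mislinked(build, rpm_requires):
    print("BAD: flavor %s but package requires libtcmalloc" % build["flavor"])
</code></pre>

On an installed machine the same check could be approximated with <code>rpm -qR ceph-mon | grep tcmalloc</code> or <code>ldd /usr/bin/ceph-mon | grep tcmalloc</code>, either of which should come back empty for a genuine notcmalloc build.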