Bug #54515 (open)

mon/health-mute.sh: TEST_mute: return 1 (HEALTH WARN 3 mgr modules have failed dependencies)

Added by Kamoltat (Junior) Sirivadhna about 2 years ago. Updated 4 days ago.

Status:
New
Priority:
Normal
Category:
-
Target version:
-
% Done:
0%

Source:
Tags:
Backport:
Regression:
No
Severity:
3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite:
Component(RADOS):
Pull request ID:
Crash signature (v1):
Crash signature (v2):

Description

/a/yuriw-2022-03-04_21:56:41-rados-wip-yuri4-testing-2022-03-03-1448-distro-default-smithi/6721547

2022-03-05T03:31:43.871 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mon/health-mute.sh:39: TEST_mute:  ceph health
2022-03-05T03:31:43.871 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mon/health-mute.sh:39: TEST_mute:  grep HEALTH_OK
2022-03-05T03:31:44.229 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mon/health-mute.sh:39: TEST_mute:  return 1
2022-03-05T03:31:44.229 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mon/health-mute.sh:17: run:  return 1
2022-03-05T03:31:44.230 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2261: main:  code=1
2022-03-05T03:31:44.230 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: main:  teardown td/health-mute 1
2022-03-05T03:31:44.230 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown:  local dir=td/health-mute
2022-03-05T03:31:44.230 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown:  local dumplogs=1
2022-03-05T03:31:44.231 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown:  kill_daemons td/health-mute KILL
2022-03-05T03:31:44.231 INFO:tasks.workunit.client.0.smithi053.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons:  shopt -q -o xtrace
2022-03-05T03:31:44.231 INFO:tasks.workunit.client.0.smithi053.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons:  echo true
2022-03-05T03:31:44.231 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons:  local trace=true
2022-03-05T03:31:44.232 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons:  true
2022-03-05T03:31:44.232 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons:  shopt -u -o xtrace
2022-03-05T03:31:44.336 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons:  return 0
2022-03-05T03:31:44.337 INFO:tasks.workunit.client.0.smithi053.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown:  uname
2022-03-05T03:31:44.337 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown:  '[' Linux '!=' FreeBSD ']'
2022-03-05T03:31:44.338 INFO:tasks.workunit.client.0.smithi053.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown:  stat -f -c %T .
2022-03-05T03:31:44.339 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown:  '[' ext2/ext3 == btrfs ']'
2022-03-05T03:31:44.339 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown:  local cores=no
2022-03-05T03:31:44.339 INFO:tasks.workunit.client.0.smithi053.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown:  sysctl -n kernel.core_pattern
2022-03-05T03:31:44.340 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown:  local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2022-03-05T03:31:44.341 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown:  '[' / = '|' ']'
2022-03-05T03:31:44.341 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown:  grep -q '^core\|core$'
2022-03-05T03:31:44.341 INFO:tasks.workunit.client.0.smithi053.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown:  dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2022-03-05T03:31:44.342 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown:  ls /home/ubuntu/cephtest/archive/coredump
2022-03-05T03:31:44.343 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown:  '[' no = yes -o 1 = 1 ']'
2022-03-05T03:31:44.343 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown:  '[' -n '' ']'
2022-03-05T03:31:44.343 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:194: teardown:  mkdir -p /home/ubuntu/cephtest/archive/log
2022-03-05T03:31:44.344 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:195: teardown:  mv td/health-mute/mgr.x.log td/health-mute/mon.a.log td/health-mute/osd.0.log td/health-mute/osd.1.log td/health-mute/osd.2.log /home/ubuntu/cephtest/archive/log
2022-03-05T03:31:44.344 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown:  rm -fr td/health-mute
2022-03-05T03:31:44.352 INFO:tasks.workunit.client.0.smithi053.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown:  get_asok_dir
2022-03-05T03:31:44.352 INFO:tasks.workunit.client.0.smithi053.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir:  '[' -n '' ']'
2022-03-05T03:31:44.353 INFO:tasks.workunit.client.0.smithi053.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir:  echo /tmp/ceph-asok.28375
2022-03-05T03:31:44.353 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown:  rm -rf /tmp/ceph-asok.28375
2022-03-05T03:31:44.354 DEBUG:teuthology.orchestra.run:got remote process result: 1
2022-03-05T03:31:44.355 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown:  '[' no = yes ']'
2022-03-05T03:31:44.355 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown:  return 0
2022-03-05T03:31:44.355 INFO:tasks.workunit.client.0.smithi053.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: main:  return 1
2022-03-05T03:31:44.356 INFO:tasks.workunit:Stopping ['mon'] on client.0...
2022-03-05T03:31:44.356 DEBUG:teuthology.orchestra.run.smithi053:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0
2022-03-05T03:31:44.603 ERROR:teuthology.run_tasks:Saw exception from tasks.
Traceback (most recent call last):
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_eeaeee38264fad8a01fd1ce912d8908676ed64bc/teuthology/run_tasks.py", line 91, in run_tasks
    manager = run_one_task(taskname, ctx=ctx, config=config)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_eeaeee38264fad8a01fd1ce912d8908676ed64bc/teuthology/run_tasks.py", line 70, in run_one_task
    return task(**kwargs)
  File "/home/teuthworker/src/github.com_ceph_ceph-c_c8f79f870e0d6a996c92d420e6256d312bac1c7c/qa/tasks/workunit.py", line 148, in task
    cleanup=cleanup)
  File "/home/teuthworker/src/github.com_ceph_ceph-c_c8f79f870e0d6a996c92d420e6256d312bac1c7c/qa/tasks/workunit.py", line 298, in _spawn_on_all_clients
    timeout=timeout)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_eeaeee38264fad8a01fd1ce912d8908676ed64bc/teuthology/parallel.py", line 84, in __exit__
    for result in self:
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_eeaeee38264fad8a01fd1ce912d8908676ed64bc/teuthology/parallel.py", line 98, in __next__
    resurrect_traceback(result)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_eeaeee38264fad8a01fd1ce912d8908676ed64bc/teuthology/parallel.py", line 30, in resurrect_traceback
    raise exc.exc_info[1]
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_eeaeee38264fad8a01fd1ce912d8908676ed64bc/teuthology/parallel.py", line 23, in capture_traceback
    return func(*args, **kwargs)
  File "/home/teuthworker/src/github.com_ceph_ceph-c_c8f79f870e0d6a996c92d420e6256d312bac1c7c/qa/tasks/workunit.py", line 427, in _run_tests
    label="workunit test {workunit}".format(workunit=workunit)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_eeaeee38264fad8a01fd1ce912d8908676ed64bc/teuthology/orchestra/remote.py", line 509, in run
    r = self._runner(client=self.ssh, name=self.shortname, **kwargs)
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_eeaeee38264fad8a01fd1ce912d8908676ed64bc/teuthology/orchestra/run.py", line 455, in run
    r.wait()
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_eeaeee38264fad8a01fd1ce912d8908676ed64bc/teuthology/orchestra/run.py", line 161, in wait
    self._raise_for_status()
  File "/home/teuthworker/src/git.ceph.com_git_teuthology_eeaeee38264fad8a01fd1ce912d8908676ed64bc/teuthology/orchestra/run.py", line 183, in _raise_for_status
    node=self.hostname, label=self.label
teuthology.exceptions.CommandFailedError: Command failed (workunit test mon/health-mute.sh) on smithi053 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=c8f79f870e0d6a996c92d420e6256d312bac1c7c TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/mon/health-mute.sh'
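
For reference, the assertion that fails at health-mute.sh:39 is, per the xtrace above, just a pipe of the one-line health summary into grep; a paraphrased sketch (not the verbatim upstream script) looks like:

# Paraphrase of the check traced at health-mute.sh:39 above:
# the test expects the cluster to be healthy before exercising mutes.
ceph health | grep HEALTH_OK || return 1

Because the cluster is reporting HEALTH_WARN at this point, grep matches nothing, the function returns 1, and the failure propagates up through run and main in ceph-helpers.sh.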

Actions #1

Updated by Kefu Chai about 2 years ago

/a/kchai-2022-03-17_05:18:57-rados-wip-cxx20-fixes-core-kefu-distro-default-smithi/6740941/

Actions #2

Updated by Kamoltat (Junior) Sirivadhna about 2 years ago

/a/ksirivad-2022-04-05_19:51:49-rados:standalone:workloads-master-distro-basic-smithi/6778033/

2022-04-05T20:05:50.263 INFO:tasks.workunit.client.0.smithi142.stdout:  cluster:
2022-04-05T20:05:50.264 INFO:tasks.workunit.client.0.smithi142.stdout:    id:     01eb1d2c-e97a-4f99-9175-7359c1d666c5
2022-04-05T20:05:50.264 INFO:tasks.workunit.client.0.smithi142.stdout:    health: HEALTH_WARN
2022-04-05T20:05:50.264 INFO:tasks.workunit.client.0.smithi142.stdout:            3 mgr modules have failed dependencies
2022-04-05T20:05:50.264 INFO:tasks.workunit.client.0.smithi142.stdout:
2022-04-05T20:05:50.264 INFO:tasks.workunit.client.0.smithi142.stdout:  services:
2022-04-05T20:05:50.265 INFO:tasks.workunit.client.0.smithi142.stdout:    mon: 1 daemons, quorum a (age 106s)
2022-04-05T20:05:50.265 INFO:tasks.workunit.client.0.smithi142.stdout:    mgr: x(active, since 98s)
2022-04-05T20:05:50.265 INFO:tasks.workunit.client.0.smithi142.stdout:    osd: 3 osds: 3 up (since 6s), 3 in (since 38s)
2022-04-05T20:05:50.265 INFO:tasks.workunit.client.0.smithi142.stdout:
2022-04-05T20:05:50.265 INFO:tasks.workunit.client.0.smithi142.stdout:  data:
2022-04-05T20:05:50.266 INFO:tasks.workunit.client.0.smithi142.stdout:    pools:   1 pools, 8 pgs
2022-04-05T20:05:50.266 INFO:tasks.workunit.client.0.smithi142.stdout:    objects: 0 objects, 0 B
2022-04-05T20:05:50.266 INFO:tasks.workunit.client.0.smithi142.stdout:    usage:   459 MiB used, 300 GiB / 300 GiB avail
2022-04-05T20:05:50.266 INFO:tasks.workunit.client.0.smithi142.stdout:    pgs:     8 active+clean
2022-04-05T20:05:50.266 INFO:tasks.workunit.client.0.smithi142.stdout:
2022-04-05T20:05:50.280 INFO:tasks.workunit.client.0.smithi142.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mon/health-mute.sh:39: TEST_mute:  ceph health
2022-04-05T20:05:50.280 INFO:tasks.workunit.client.0.smithi142.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mon/health-mute.sh:39: TEST_mute:  grep HEALTH_OK
2022-04-05T20:05:50.662 INFO:tasks.workunit.client.0.smithi142.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/mon/health-mute.sh:39: TEST_mute:  return 1
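
The failure here is the same precondition tripping: the pre-existing MGR_MODULE_DEPENDENCY warning keeps ceph health from ever printing HEALTH_OK. One possible way to confirm and sidestep it interactively (an illustrative sketch, not a committed fix for the test) is:

# Show which check is blocking HEALTH_OK and why.
ceph health detail

# Mute the unrelated warning, re-check, then restore it.
# (health-mute.sh itself exercises these mute/unmute commands.)
ceph health mute MGR_MODULE_DEPENDENCY
ceph health        # should report HEALTH_OK if nothing else is raised
ceph health unmute MGR_MODULE_DEPENDENCY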

Actions #3

Updated by Kamoltat (Junior) Sirivadhna about 2 years ago

  • Subject changed from mon/health-mute.sh: TEST_mute: return 1 to mon/health-mute.sh: TEST_mute: return 1 (HEALTH WARN 3 mgr modules have failed dependencies)
Actions #4

Updated by Kamoltat (Junior) Sirivadhna about 2 years ago

{
    "MGR_MODULE_DEPENDENCY": {
        "severity": "HEALTH_WARN",
        "summary": {
            "message": "3 mgr modules have failed dependencies",
            "count": 3
        },
        "detail": [
            {
                "message": "Module 'nfs' has failed dependency: 'loki'" 
            },
            {
                "message": "Module 'orchestrator' has failed dependency: 'loki'" 
            },
            {
                "message": "Module 'volumes' has failed dependency:
Actions #5

Updated by Kamoltat (Junior) Sirivadhna about 2 years ago

  • Assignee set to Kamoltat (Junior) Sirivadhna
Actions #6

Updated by Laura Flores 4 days ago

/a/lflores-2024-04-01_18:07:25-rados-wip-yuri8-testing-2024-03-25-1419-distro-default-smithi/7634102
