Bug #41817

qa/standalone/scrub/osd-recovery-scrub.sh timed out waiting for scrub

Added by Sage Weil over 4 years ago. Updated over 4 years ago.

Status:
Closed
Priority:
High
Assignee:
David Zafman
Category:
-
Target version:
-
% Done:
0%

Source:
Tags:
Backport:
Regression:
No
Severity:
3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite:
Component(RADOS):
Pull request ID:
30467
Crash signature (v1):
Crash signature (v2):

Description

2019-09-13T01:09:00.211 INFO:tasks.workunit.client.0.smithi075.stderr:10487: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1792: wait_for_scrub:  (( i++ ))
2019-09-13T01:09:00.211 INFO:tasks.workunit.client.0.smithi075.stderr:10487: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1792: wait_for_scrub:  (( i < 300 ))
2019-09-13T01:09:00.211 INFO:tasks.workunit.client.0.smithi075.stderr:10487: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1793: wait_for_scrub:  get_last_scrub_stamp 1.1a last_scrub_stamp
2019-09-13T01:09:00.211 INFO:tasks.workunit.client.0.smithi075.stderr:10487: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1431: get_last_scrub_stamp:  local pgid=1.1a
2019-09-13T01:09:00.211 INFO:tasks.workunit.client.0.smithi075.stderr:10487: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1432: get_last_scrub_stamp:  local sname=last_scrub_stamp
2019-09-13T01:09:00.212 INFO:tasks.workunit.client.0.smithi075.stderr:10487: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1433: get_last_scrub_stamp:  ceph --format json pg dump pgs
2019-09-13T01:09:00.212 INFO:tasks.workunit.client.0.smithi075.stderr:10487: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1434: get_last_scrub_stamp:  jq -r '.pg_stats | .[] | select(.pgid=="1.1a") | .last_scrub_stamp'
2019-09-13T01:09:00.212 INFO:tasks.workunit.client.0.smithi075.stderr:10487: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1793: wait_for_scrub:  test '' '>' 2019-09-13T01:01:41.335322+0000
2019-09-13T01:09:00.212 INFO:tasks.workunit.client.0.smithi075.stderr:10487: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1796: wait_for_scrub:  sleep 1
2019-09-13T01:09:00.212 INFO:tasks.workunit.client.0.smithi075.stderr:10487: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1792: wait_for_scrub:  (( i++ ))
2019-09-13T01:09:00.212 INFO:tasks.workunit.client.0.smithi075.stderr:10487: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1792: wait_for_scrub:  (( i < 300 ))
2019-09-13T01:09:00.212 INFO:tasks.workunit.client.0.smithi075.stderr:10487: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1798: wait_for_scrub:  return 1
2019-09-13T01:09:00.216 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1910: run_in_background:  return 1
2019-09-13T01:09:00.216 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1948: wait_background:  return_code=1
2019-09-13T01:09:00.216 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1945: wait_background:  for pid in $pids
2019-09-13T01:09:00.216 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1946: wait_background:  wait 15053
2019-09-13T01:09:00.216 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1948: wait_background:  return_code=1
2019-09-13T01:09:00.217 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1945: wait_background:  for pid in $pids
2019-09-13T01:09:00.217 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1946: wait_background:  wait 15081
2019-09-13T01:09:00.217 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1948: wait_background:  return_code=1
2019-09-13T01:09:00.217 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1945: wait_background:  for pid in $pids
2019-09-13T01:09:00.217 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1946: wait_background:  wait 15090
2019-09-13T01:09:00.217 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1948: wait_background:  return_code=1
2019-09-13T01:09:00.217 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1945: wait_background:  for pid in $pids
2019-09-13T01:09:00.218 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1946: wait_background:  wait 15111
2019-09-13T01:09:00.218 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1948: wait_background:  return_code=1
2019-09-13T01:09:00.218 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1945: wait_background:  for pid in $pids
2019-09-13T01:09:00.218 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1946: wait_background:  wait 15122
2019-09-13T01:09:00.218 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1948: wait_background:  return_code=1
2019-09-13T01:09:00.218 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1953: wait_background:  eval 'pids='\'''\'''
2019-09-13T01:09:00.218 INFO:tasks.workunit.client.0.smithi075.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1953: wait_background:  pids=
2019-09-13T01:09:00.218 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1955: wait_background:  return 1
2019-09-13T01:09:00.219 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:73: TEST_recovery_scrub:  return_code=1
2019-09-13T01:09:00.219 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:74: TEST_recovery_scrub:  '[' 1 -ne 0 ']'
2019-09-13T01:09:00.219 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:74: TEST_recovery_scrub:  return 1
2019-09-13T01:09:00.219 INFO:tasks.workunit.client.0.smithi075.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:31: run:  return 1

/a/sage-2019-09-12_21:32:16-rados-wip-sage-testing-2019-09-12-1043-distro-basic-smithi/4300769
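
For reference, the failure above is a polling loop running out of retries. A minimal sketch of the wait_for_scrub / get_last_scrub_stamp logic, reconstructed from the trace rather than quoted from ceph-helpers.sh (the real helpers differ in details):

# Minimal reconstruction of the polling logic traced above; the real
# ceph-helpers.sh functions differ in details.
get_last_scrub_stamp() {
    local pgid=$1
    local sname=${2:-last_scrub_stamp}
    # Pull one PG's scrub stamp out of the JSON pg dump.
    ceph --format json pg dump pgs 2>/dev/null |
        jq -r ".pg_stats | .[] | select(.pgid==\"$pgid\") | .$sname"
}

wait_for_scrub() {
    local pgid=$1
    local old_stamp=$2
    for ((i = 0; i < 300; i++)); do
        # Done once the stamp advances past the value recorded before
        # the scrub was requested.
        if test "$(get_last_scrub_stamp "$pgid")" '>' "$old_stamp"; then
            return 0
        fi
        sleep 1
    done
    return 1  # timed out after ~300 seconds
}

Note that the failing comparison in the trace is "test '' '>' 2019-09-13T01:01:41.335322+0000": jq returned an empty string, meaning pg 1.1a no longer appears in "pg dump pgs" at all, so the stamp can never advance and the loop exhausts its 300 iterations.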

Related issues 1 (0 open, 1 closed)

Related to RADOS - Bug #41900: auto-scaler breaks many standalone tests (Resolved, David Zafman, 09/17/2019)

Actions #1

Updated by Neha Ojha over 4 years ago

  • Assignee set to David Zafman

David, can you please take a look at this whenever you get a chance?

Actions #2

Updated by Kefu Chai over 4 years ago

/a/kchai-2019-09-15_15:37:26-rados-wip-kefu-testing-2019-09-15-1533-distro-basic-mira/4311115/
/a/pdonnell-2019-09-14_22:40:03-rados-master-distro-basic-smithi/4307683/

Actions #3

Updated by David Zafman over 4 years ago

  • Status changed from 12 to In Progress

This is likely caused by enabling the auto scaler.
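
That theory fits the trace: once the autoscaler merges or splits PGs, pg 1.1a can disappear from pg dump, which would explain the empty string coming back from the jq select above. A couple of commands that could confirm this on a reproducer (illustrative, not from the original report):

# Is pg 1.1a still present after the autoscaler has acted?
ceph --format json pg dump pgs | jq -r '.pg_stats[].pgid' | grep -F '1.1a'
# What does the autoscaler currently want to do with each pool?
ceph osd pool autoscale-status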

Actions #4

Updated by David Zafman over 4 years ago

  • Related to Bug #41900: auto-scaler breaks many standalone tests added

Actions #5

Updated by David Zafman over 4 years ago

  • Pull request ID set to 30467

The fix for this particular issue is to just disable the auto scaler, because the issue only causes a hang in the test, not any crashes.
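
For the record, the autoscaler can be disabled globally or per pool; the commands below are illustrative and not necessarily the exact change made in PR 30467:

# Default for newly created pools:
ceph config set global osd_pool_default_pg_autoscale_mode off
# For an existing pool:
ceph osd pool set <pool> pg_autoscale_mode off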

Actions #6

Updated by David Zafman over 4 years ago

  • Status changed from In Progress to Closed