Bug #13109 (closed): msg/async leak, observed on mon

Added by Sage Weil over 8 years ago. Updated over 8 years ago.

Status:
Resolved
Priority:
Urgent
Assignee:
-
Category:
-
Target version:
-
% Done:

0%

Source:
Q/A
Tags:
Backport:
Regression:
No
Severity:
3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite:
Pull request ID:
Crash signature (v1):
Crash signature (v2):

Description

/a/sage-2015-09-15_12:12:39-rados:verify-wip-sage-testing---basic-multi/1058089

All of the failures are processes that randomly picked the async msgr.

Tons of stuff leaked; probably a leaked cct ref (or something that holds a cct ref)?


Related issues: 1 (0 open, 1 closed)

Related to Ceph - Bug #13251: mon leaks lots on shutdown (Resolved, Sage Weil, 09/26/2015)

Actions #1

Updated by Sage Weil over 8 years ago

Hmm, the mon uses loopback heavily and other stuff doesn't; maybe that's why this triggers there?

Actions #2

Updated by Haomai Wang over 8 years ago

Hi Sage, after a detailed inspection I can't find any clue pointing to an async leak, and I don't follow your point about async causing lots of memory leaks. Could you share your thoughts on the valgrind result file?

Actions #3

Updated by Sage Weil over 8 years ago

A much simpler set of leaks:

/a/sage-2015-10-01_11:19:41-rados:verify-infernalis---basic-multi/1079757

It's MLog ... not sure which message instance got leaked, though. Setting debug refs = 10 will probably help.

another one with MCommand

/a/sage-2015-10-01_11:19:41-rados:verify-infernalis---basic-multi/1079781

Actions #4

Updated by Kefu Chai over 8 years ago

  • Status changed from 12 to Fix Under Review
Actions #5

Updated by Sage Weil over 8 years ago

One more leak in the other path too; pushed a fix to wip-sage-testing.

Actions #6

Updated by Sage Weil over 8 years ago

  • Status changed from Fix Under Review to Resolved

362b18a532a5077a1418c88f5acbc6e468a45a5a

Doesn't look like it was async msgr related after all.
