Bug #37979

mds: uses up to 80G of memory under heavy stress

Added by Min Chen 3 months ago. Updated about 2 months ago.

Status:
New
Priority:
Normal
Assignee:
-
Category:
-
Target version:
-
Start date:
Due date:
% Done:

0%

Source:
Community (user)
Tags:
Backport:
Regression:
No
Severity:
3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite:
Component(FS):
Labels (FS):
Pull request ID:

Description

ceph version 12.2.7-569-gac5687a (ac5687af649a114f3ed6d6a73d8cf475fded987f) luminous (stable)

# ceph daemon mds.node1 dump_mempools
{
    "bloom_filter": {
        "items": 36688617,
        "bytes": 36688617
    },
    "bluestore_alloc": {
        "items": 0,
        "bytes": 0
    },
    "bluestore_cache_data": {
        "items": 0,
        "bytes": 0
    },
    "bluestore_cache_onode": {
        "items": 0,
        "bytes": 0
    },
    "bluestore_cache_other": {
        "items": 0,
        "bytes": 0
    },
    "bluestore_fsck": {
        "items": 0,
        "bytes": 0
    },
    "bluestore_txc": {
        "items": 0,
        "bytes": 0
    },
    "bluestore_writing_deferred": {
        "items": 0,
        "bytes": 0
    },
    "bluestore_writing": {
        "items": 0,
        "bytes": 0
    },
    "bluefs": {
        "items": 0,
        "bytes": 0
    },
    "buffer_anon": {
        "items": 141285,
        "bytes": 80306856511
    },
    "buffer_meta": {
        "items": 68965,
        "bytes": 6068920
    },
    "osd": {
        "items": 0,
        "bytes": 0
    },
    "osd_ec_extent_cache": {
        "items": 0,
        "bytes": 0
    },
    "osd_mapbl": {
        "items": 0,
        "bytes": 0
    },
    "osd_pglog": {
        "items": 0,
        "bytes": 0
    },
    "osdmap": {
        "items": 175,
        "bytes": 10616
    },
    "osdmap_mapping": {
        "items": 0,
        "bytes": 0
    },
    "pgmap": {
        "items": 0,
        "bytes": 0
    },
    "mds_co": {
        "items": 186873569,
        "bytes": 8161771349
    },
    "unittest_1": {
        "items": 0,
        "bytes": 0
    },
    "unittest_2": {
        "items": 0,
        "bytes": 0
    },
    "total": {
        "items": 223772611,
        "bytes": 88511396013
    }
}

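To make the dump easier to read, the pools can be ranked by size. A minimal sketch, using the figures reported above: the `dump_mempools` command and the pool names come from the report, while the script itself and the truncated sample (only the non-zero pools) are illustrative. In practice the JSON would come from `ceph daemon mds.node1 dump_mempools` directly.

```python
import json

# Sample restricted to the non-zero pools from the dump in this ticket.
sample = """
{
    "bloom_filter": {"items": 36688617,  "bytes": 36688617},
    "buffer_anon":  {"items": 141285,    "bytes": 80306856511},
    "buffer_meta":  {"items": 68965,     "bytes": 6068920},
    "osdmap":       {"items": 175,       "bytes": 10616},
    "mds_co":       {"items": 186873569, "bytes": 8161771349},
    "total":        {"items": 223772611, "bytes": 88511396013}
}
"""

pools = json.loads(sample)
total = pools.pop("total")

# Sort pools by bytes, largest first, and print each pool's share of the total.
for name, stats in sorted(pools.items(), key=lambda kv: kv[1]["bytes"],
                          reverse=True):
    share = 100.0 * stats["bytes"] / total["bytes"]
    print(f"{name:15s} {stats['bytes']:>14d} bytes  ({share:5.1f}%)")
```

Ranking this way makes the imbalance obvious: buffer_anon alone accounts for roughly 90% of the ~88.5G total, dwarfing even mds_co (the MDS cache pool) at about 9%.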
History

#1 Updated by zhou yang 3 months ago

I have encountered similar problems. The mon daemon's buffer_anon pool uses too much memory, about 10G, but I have no idea why.

Min Chen wrote:

ceph version 12.2.7-569-gac5687a (ac5687af649a114f3ed6d6a73d8cf475fded987f) luminous (stable)


#2 Updated by Patrick Donnelly 3 months ago

  • Description updated (diff)
  • Target version set to v14.0.0
  • Start date deleted (01/21/2019)
  • Source set to Community (user)

Thanks for the report. Not sure on the cause yet.

#3 Updated by Patrick Donnelly about 2 months ago

  • Target version changed from v14.0.0 to v15.0.0

#4 Updated by Patrick Donnelly about 2 months ago

  • Target version deleted (v15.0.0)
