Documentation #11503


[Docs] Placement Groups

Added by Peter Matulis almost 9 years ago. Updated almost 9 years ago.

Status: Resolved
Priority: Normal
Assignee: Kefu Chai

Description

The top of the PG page [1] is completely wrong.

a) You do not create a pool this way:
ceph osd pool set {pool-name} pg_num

But you do set the number of PGs for an existing pool this way:
ceph osd pool set {pool-name} pg_num {pg_num}

b) PG counts are limited to 32 per OSD.
For instance, the docs say "Between 5 and 10 OSDs set pg_num to 512". This will not work: the maximum number of PGs I can set for 10 OSDs is 320 (10 × 32).
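As a quick sanity check on the arithmetic above, here is a minimal sketch. It assumes the 32-PGs-per-OSD cap exactly as described in this report; the constant and the helper function are illustrative and are not read from Ceph's source or configuration:

```python
# Illustrative check of the per-OSD PG cap cited in this report.
PGS_PER_OSD_LIMIT = 32  # assumed cap, taken from the report above


def max_pg_num(num_osds, limit=PGS_PER_OSD_LIMIT):
    """Largest pg_num allowed for a pool under a per-OSD PG cap."""
    return num_osds * limit


# With 10 OSDs the cap works out to 320, so the documented advice of
# pg_num = 512 for "between 5 and 10 OSDs" would exceed it.
print(max_pg_num(10))        # 320
print(max_pg_num(10) < 512)  # True
```

Under this assumption, 512 PGs would only fit once the cluster has at least 16 OSDs.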

[1]: http://ceph.com/docs/master/rados/operations/placement-groups/

Actions #1

Updated by Kefu Chai almost 9 years ago

  • Assignee set to Kefu Chai
Actions #2

Updated by Kefu Chai almost 9 years ago

Peter Matulis wrote:

The top of the PG page [1] is completely wrong.

b) PG numbers are limited by 32 per OSD
For instance, "Between 5 and 10 OSDs set pg_num to 512". This will not work. The maximum PGs I can set for 10 OSDs is 320.

Peter, I am able to set pg_num to 1024 for a 3-OSD cluster created using the vstart.sh script. Inspecting the code, there is a setting named "mon_max_pool_pg_num" which controls the maximum pg_num for a pool, but this number is not tied to the number of OSDs. The following shows how I created a 1024-PG pool with my little cluster.

$ ./ceph osd pool create cache_pool4 1024
*** DEVELOPER MODE: setting PATH, PYTHONPATH and LD_LIBRARY_PATH ***
pool 'cache_pool4' created
$ ./ceph osd pool get cache_pool4 all
*** DEVELOPER MODE: setting PATH, PYTHONPATH and LD_LIBRARY_PATH ***
size: 3
min_size: 1
crash_replay_interval: 0
pg_num: 1024
pgp_num: 1024
crush_ruleset: 0
auid: 0
write_fadvise_dontneed: false
$ ./ceph osd ls
*** DEVELOPER MODE: setting PATH, PYTHONPATH and LD_LIBRARY_PATH ***
0
1
2

[1]: http://ceph.com/docs/master/rados/operations/placement-groups/

Actions #3

Updated by Kefu Chai almost 9 years ago

  • Status changed from New to Resolved