Tasks #584

do throughput scaling tests on sepia

Added by Sage Weil over 13 years ago. Updated over 12 years ago.

Status: Rejected
Priority: Normal
Assignee: -
Category: -
Target version: -
% Done: 0%
Spent time: -
Tags: -
Reviewed: -
Affected Versions: -
Pull request ID: -

Description

Use rados bench on N nodes, scaling N, and see how the throughput scales.
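
A minimal sketch of how such a run could be driven, assuming ssh access to the client nodes, hypothetical hostnames (sepia01..sepia26), the pool name "bench" used in the results below, and that rados bench reports a "Bandwidth (MB/sec):" line (treat the exact output format as an assumption):

```python
#!/usr/bin/env python3
"""Hypothetical driver for the scaling test: run 'rados -p bench bench 20 write'
on the first N client nodes in parallel over ssh, parse each node's reported
bandwidth, and print the per-node average and the total for every N."""

import re
import subprocess
from concurrent.futures import ThreadPoolExecutor

NODES = ["sepia%02d" % i for i in range(1, 27)]          # hypothetical client hostnames
BENCH_CMD = "rados -p bench bench 20 write"              # command used in the results below
BW_RE = re.compile(r"Bandwidth \(MB/sec\):\s+([\d.]+)")  # assumed rados bench output line

def run_bench(host):
    """Run rados bench on one host via ssh and return its bandwidth in MB/s."""
    out = subprocess.run(["ssh", host, BENCH_CMD],
                         capture_output=True, text=True, check=True).stdout
    match = BW_RE.search(out)
    return float(match.group(1)) if match else 0.0

def scale_test(n):
    """Run the benchmark on the first n nodes concurrently and summarize."""
    with ThreadPoolExecutor(max_workers=n) as pool:
        bandwidths = list(pool.map(run_bench, NODES[:n]))
    return sum(bandwidths) / n, sum(bandwidths)

if __name__ == "__main__":
    for n in range(1, len(NODES) + 1):
        avg, total = scale_test(n)
        print("%2d nodes: avg %.3f MB/s, total %.3f MB/s" % (n, avg, total))
```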

History

#1 Updated by Sage Weil over 13 years ago

  • Assignee set to Samuel Just

#2 Updated by Samuel Just over 13 years ago

  • Status changed from New to Resolved

Results of running "rados -p bench bench 20 write" on <Nodes> nodes. <Average Throughput> is the average of the Bandwidth statistic reported by rados over all nodes; <Total Throughput> is the aggregate across all nodes.
The total throughput does not seem to degrade much with additional nodes running rados bench.

Nodes | Average Throughput | Total Throughput
1     | 74.20 MB/s         | 74.201 MB/s
2     | 45.95 MB/s         | 91.911 MB/s
3     | 26.56 MB/s         | 79.694 MB/s
4     | 19.50 MB/s         | 78.017 MB/s
5     | 15.68 MB/s         | 78.409 MB/s
6     | 13.42 MB/s         | 80.544 MB/s
7     | 13.12 MB/s         | 91.84 MB/s
8     | 12.63 MB/s         | 101.063 MB/s
9     | 11.60 MB/s         | 104.479 MB/s
10    | 11.22 MB/s         | 112.286 MB/s
11    | 8.725 MB/s         | 95.977 MB/s
12    | 9.127 MB/s         | 109.535 MB/s
13    | 8.258 MB/s         | 107.355 MB/s
14    | 8.095 MB/s         | 113.342 MB/s
15    | 6.959 MB/s         | 104.386 MB/s
16    | 6.825 MB/s         | 109.206 MB/s
17    | 6.467 MB/s         | 109.948 MB/s
18    | 5.924 MB/s         | 106.647 MB/s
19    | 5.861 MB/s         | 111.37 MB/s
20    | 4.997 MB/s         | 99.957 MB/s
21    | 5.213 MB/s         | 109.483 MB/s
22    | 4.962 MB/s         | 109.178 MB/s
23    | 4.453 MB/s         | 102.437 MB/s
24    | 4.614 MB/s         | 110.75 MB/s
25    | 4.425 MB/s         | 110.632 MB/s
26    | 3.873 MB/s         | 100.715 MB/s
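
As a rough consistency check (assuming <Total Throughput> is simply the per-node average multiplied by the node count), any row can be recomputed, e.g. the 10-node run:

```python
# Recompute the 10-node total from the per-node average in the table above.
avg_mb_s, nodes = 11.22, 10
print("%.2f MB/s" % (avg_mb_s * nodes))   # 112.20 MB/s vs. the reported 112.286 MB/s
```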

#3 Updated by Sage Weil over 13 years ago

  • Status changed from Resolved to In Progress

There's definitely a problem here; the total throughput should be scaling more or less linearly until we hit a bottleneck in the switch or max out the cluster's throughput.

#4 Updated by Greg Farnum over 13 years ago

What was the variance in per-node throughput? Did we have one node dominating?

#5 Updated by Sage Weil over 13 years ago

  • Target version changed from v0.24 to v0.25

#6 Updated by Greg Farnum about 13 years ago

I don't remember: did we actually finish this and just not update the bug? If not, we should decide what to do with it!

#7 Updated by Sage Weil about 13 years ago

  • Target version changed from v0.25 to v0.26

#8 Updated by Sage Weil about 13 years ago

  • Target version changed from v0.26 to v0.27

#9 Updated by Sage Weil almost 13 years ago

  • Status changed from In Progress to New
  • Assignee deleted (Samuel Just)
  • Priority changed from High to Normal
  • Target version changed from v0.27 to 12

#10 Updated by Sage Weil over 12 years ago

  • Status changed from New to Rejected

#11 Updated by Sage Weil over 12 years ago

  • Target version deleted (12)
