
Bug #5069 » bug_5069_all_mon_logs.txt

Florian Wiessner, 05/16/2013 03:25 AM

Executing zcat /var/log/ceph/ceph-mon.?.log.?.gz | grep '2013-05-16 05:15' on node01
2013-05-16 05:15:00.274581 7fa996c33700 10 mon.0@0(leader).log v15758969 update_from_paxos
2013-05-16 05:15:00.274647 7fa996c33700 10 mon.0@0(leader).log v15758969 update_from_paxos version 15758969 summary v 15758969
2013-05-16 05:15:00.274840 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:00.363566 7fa997983700 10 mon.0@0(leader).log v15758969 encode_full log v 15758969
2013-05-16 05:15:00.363685 7fa997983700 10 mon.0@0(leader).log v15758969 encode_pending v15758970
2013-05-16 05:15:00.581705 7fa997983700 10 mon.0@0(leader).pg v16089272 encode_pending v 16089273
2013-05-16 05:15:00.916155 7fa996c33700 10 mon.0@0(leader).log v15758970 update_from_paxos
2013-05-16 05:15:00.916252 7fa996c33700 10 mon.0@0(leader).log v15758970 update_from_paxos version 15758970 summary v 15758969
2013-05-16 05:15:00.916391 7fa996c33700 10 mon.0@0(leader).log v15758970 update_from_paxos latest full 15758969
2013-05-16 05:15:00.916426 7fa996c33700 7 mon.0@0(leader).log v15758970 update_from_paxos applying incremental log 15758970 2013-05-16 05:14:58.035619 mon.0 188.65.144.4:6789/0 13181 : [INF] pgmap v16089271: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 15173KB/s rd, 2925KB/s wr, 1421op/s
2013-05-16 05:15:00.916457 7fa996c33700 7 mon.0@0(leader).log v15758970 update_from_paxos applying incremental log 15758970 2013-05-16 05:14:59.586224 mon.0 188.65.144.4:6789/0 13182 : [INF] pgmap v16089272: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 9925KB/s rd, 2685KB/s wr, 814op/s
2013-05-16 05:15:00.916578 7fa996c33700 10 mon.0@0(leader).log v15758970 check_subs
2013-05-16 05:15:00.916630 7fa996c33700 10 mon.0@0(leader).log v15758970 create_pending v 15758971
2013-05-16 05:15:00.916737 7fa996c33700 7 mon.0@0(leader).log v15758970 _updated_log for mon.0 188.65.144.4:6789/0
2013-05-16 05:15:01.067342 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.6 188.65.144.10:6800/23197 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:01.067357 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x651b780 from 188.65.144.8:6789/0
2013-05-16 05:15:01.067413 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.3 188.65.144.7:6801/20087 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:01.067417 7fa996c33700 10 mon.0@0(leader) e8 mesg 0xbb27500 from 188.65.144.8:6789/0
2013-05-16 05:15:01.265370 7fa997983700 10 mon.0@0(leader).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:01.265682 7fa997983700 10 mon.0@0(leader).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:01.265733 7fa997983700 10 mon.0@0(leader).log v15758970 update_from_paxos
2013-05-16 05:15:01.265892 7fa997983700 10 mon.0@0(leader).log v15758970 update_from_paxos version 15758970 summary v 15758970
2013-05-16 05:15:01.265918 7fa997983700 10 mon.0@0(leader).log v15758970 log
2013-05-16 05:15:01.266034 7fa997983700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:01.266258 7fa997983700 10 mon.0@0(leader).auth v8461 auth
2013-05-16 05:15:02.133211 7fa996c33700 7 mon.0@0(leader).pg v16089272 update_from_paxos applying incremental 16089273
2013-05-16 05:15:02.133300 7fa996c33700 10 mon.0@0(leader).pg v16089273 v16089273: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3956KB/s rd, 1703KB/s wr, 374op/s
2013-05-16 05:15:02.133450 7fa996c33700 10 mon.0@0(leader).pg v16089273 map_pg_creates to 0 pgs
2013-05-16 05:15:02.133455 7fa996c33700 10 mon.0@0(leader).pg v16089273 send_pg_creates to 0 pgs
2013-05-16 05:15:02.133458 7fa996c33700 10 mon.0@0(leader).pg v16089273 update_logger
2013-05-16 05:15:02.133498 7fa996c33700 10 mon.0@0(leader).pg v16089273 create_pending v 16089274
2013-05-16 05:15:02.133519 7fa996c33700 7 mon.0@0(leader).pg v16089273 _updated_stats for osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:02.133581 7fa996c33700 10 mon.0@0(leader).pg v16089273 preprocess_query pg_stats(25 pgs tid 28175 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:02.133641 7fa996c33700 10 mon.0@0(leader).pg v16089273 prepare_update pg_stats(25 pgs tid 28175 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:02.133648 7fa996c33700 10 mon.0@0(leader).pg v16089273 prepare_pg_stats pg_stats(25 pgs tid 28175 v 3959) v1 from osd.4
2013-05-16 05:15:02.133655 7fa996c33700 10 mon.0@0(leader).pg v16089273 got osd.4 osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]) (was osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]))
2013-05-16 05:15:02.133742 7fa996c33700 10 mon.0@0(leader).pg v16089273 preprocess_query pg_stats(25 pgs tid 28212 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:02.134081 7fa996c33700 10 mon.0@0(leader).pg v16089273 prepare_update pg_stats(25 pgs tid 28212 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:02.134087 7fa996c33700 10 mon.0@0(leader).pg v16089273 prepare_pg_stats pg_stats(25 pgs tid 28212 v 3959) v1 from osd.6
2013-05-16 05:15:02.134091 7fa996c33700 10 mon.0@0(leader).pg v16089273 got osd.6 osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]) (was osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]))
2013-05-16 05:15:02.134174 7fa996c33700 10 mon.0@0(leader).pg v16089273 preprocess_query pg_stats(30 pgs tid 28172 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:02.134207 7fa996c33700 10 mon.0@0(leader).pg v16089273 prepare_update pg_stats(30 pgs tid 28172 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:02.134211 7fa996c33700 10 mon.0@0(leader).pg v16089273 prepare_pg_stats pg_stats(30 pgs tid 28172 v 3959) v1 from osd.5
2013-05-16 05:15:02.134215 7fa996c33700 10 mon.0@0(leader).pg v16089273 got osd.5 osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]) (was osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]))
2013-05-16 05:15:02.134294 7fa996c33700 10 mon.0@0(leader).pg v16089273 preprocess_query pg_stats(23 pgs tid 28146 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:02.134346 7fa996c33700 10 mon.0@0(leader).pg v16089273 prepare_update pg_stats(23 pgs tid 28146 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:02.134352 7fa996c33700 10 mon.0@0(leader).pg v16089273 prepare_pg_stats pg_stats(23 pgs tid 28146 v 3959) v1 from osd.3
2013-05-16 05:15:02.134356 7fa996c33700 10 mon.0@0(leader).pg v16089273 got osd.3 osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]) (was osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]))
2013-05-16 05:15:02.134413 7fa996c33700 10 mon.0@0(leader).pg v16089273 check_osd_map already seen 3959 >= 3959
2013-05-16 05:15:02.134416 7fa996c33700 10 mon.0@0(leader).pg v16089273 update_logger
2013-05-16 05:15:02.134426 7fa996c33700 0 log [INF] : pgmap v16089273: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3956KB/s rd, 1703KB/s wr, 374op/s
2013-05-16 05:15:02.134592 7fa996c33700 10 mon.0@0(leader).log v15758970 update_from_paxos
2013-05-16 05:15:02.134617 7fa996c33700 10 mon.0@0(leader).log v15758970 update_from_paxos version 15758970 summary v 15758970
2013-05-16 05:15:02.134655 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:02.134739 7fa996c33700 10 mon.0@0(leader).log v15758970 update_from_paxos
2013-05-16 05:15:02.134759 7fa996c33700 10 mon.0@0(leader).log v15758970 update_from_paxos version 15758970 summary v 15758970
2013-05-16 05:15:02.134784 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:02.134881 7fa996c33700 10 mon.0@0(leader).log v15758970 update_from_paxos
2013-05-16 05:15:02.134922 7fa996c33700 10 mon.0@0(leader).log v15758970 update_from_paxos version 15758970 summary v 15758970
2013-05-16 05:15:02.134948 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:02.135008 7fa996c33700 10 mon.0@0(leader).log v15758970 update_from_paxos
2013-05-16 05:15:02.135110 7fa996c33700 10 mon.0@0(leader).log v15758970 update_from_paxos version 15758970 summary v 15758970
2013-05-16 05:15:02.135123 7fa996c33700 10 mon.0@0(leader).log v15758970 preprocess_query log(1 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:02.135137 7fa996c33700 10 mon.0@0(leader).log v15758970 preprocess_log log(1 entries) v1 from mon.0
2013-05-16 05:15:02.135313 7fa996c33700 10 mon.0@0(leader).log v15758970 prepare_update log(1 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:02.135335 7fa996c33700 10 mon.0@0(leader).log v15758970 prepare_log log(1 entries) v1 from mon.0
2013-05-16 05:15:02.135348 7fa996c33700 10 mon.0@0(leader).log v15758970 logging 2013-05-16 05:15:02.134427 mon.0 188.65.144.4:6789/0 13183 : [INF] pgmap v16089273: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3956KB/s rd, 1703KB/s wr, 374op/s
2013-05-16 05:15:02.171939 7fa997983700 10 mon.0@0(leader).pg v16089273 encode_pending v 16089274
2013-05-16 05:15:02.633088 7fa997983700 10 mon.0@0(leader).log v15758970 encode_full log v 15758970
2013-05-16 05:15:02.633420 7fa997983700 10 mon.0@0(leader).log v15758970 encode_pending v15758971
2013-05-16 05:15:05.137990 7fa996c33700 7 mon.0@0(leader).pg v16089273 update_from_paxos applying incremental 16089274
2013-05-16 05:15:05.138216 7fa996c33700 10 mon.0@0(leader).pg v16089274 v16089274: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 9890KB/s rd, 8302KB/s wr, 1338op/s
2013-05-16 05:15:05.138248 7fa996c33700 10 mon.0@0(leader).pg v16089274 map_pg_creates to 0 pgs
2013-05-16 05:15:05.138251 7fa996c33700 10 mon.0@0(leader).pg v16089274 send_pg_creates to 0 pgs
2013-05-16 05:15:05.138253 7fa996c33700 10 mon.0@0(leader).pg v16089274 update_logger
2013-05-16 05:15:05.138287 7fa996c33700 10 mon.0@0(leader).pg v16089274 create_pending v 16089275
2013-05-16 05:15:05.138298 7fa996c33700 7 mon.0@0(leader).pg v16089274 _updated_stats for osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:05.138320 7fa996c33700 7 mon.0@0(leader).pg v16089274 _updated_stats for osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:05.138347 7fa996c33700 7 mon.0@0(leader).pg v16089274 _updated_stats for osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:05.138365 7fa996c33700 7 mon.0@0(leader).pg v16089274 _updated_stats for osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:05.138378 7fa996c33700 10 mon.0@0(leader).pg v16089274 check_osd_map already seen 3959 >= 3959
2013-05-16 05:15:05.138380 7fa996c33700 10 mon.0@0(leader).pg v16089274 update_logger
2013-05-16 05:15:05.138390 7fa996c33700 0 log [INF] : pgmap v16089274: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 9890KB/s rd, 8302KB/s wr, 1338op/s
2013-05-16 05:15:06.891612 7fa997983700 10 mon.0@0(leader).pg v16089274 check_down_pgs
2013-05-16 05:15:06.891666 7fa997983700 10 mon.0@0(leader).pg v16089274 v16089274: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 9890KB/s rd, 8302KB/s wr, 1338op/s
2013-05-16 05:15:06.891967 7fa997983700 10 mon.0@0(leader).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:06.892000 7fa997983700 10 mon.0@0(leader).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:06.892401 7fa997983700 10 mon.0@0(leader).osd e3959 min_last_epoch_clean 3925
2013-05-16 05:15:06.892750 7fa997983700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:06.892810 7fa997983700 10 mon.0@0(leader).auth v8461 auth
2013-05-16 05:15:06.892887 7fa996c33700 10 mon.0@0(leader).data_health(12158) service_dispatch mon_health( service 1 op tell e 0 r 0 flags ) v1
2013-05-16 05:15:06.892891 7fa996c33700 10 mon.0@0(leader).data_health(12158) handle_tell mon_health( service 1 op tell e 0 r 0 flags ) v1
2013-05-16 05:15:06.893052 7fa996c33700 10 mon.0@0(leader).pg v16089274 preprocess_query pg_stats(19 pgs tid 28057 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:06.893109 7fa996c33700 10 mon.0@0(leader).pg v16089274 prepare_update pg_stats(19 pgs tid 28057 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:06.893116 7fa996c33700 10 mon.0@0(leader).pg v16089274 prepare_pg_stats pg_stats(19 pgs tid 28057 v 3959) v1 from osd.1
2013-05-16 05:15:06.893122 7fa996c33700 10 mon.0@0(leader).pg v16089274 got osd.1 osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]) (was osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]))
2013-05-16 05:15:06.893198 7fa996c33700 10 mon.0@0(leader).pg v16089274 preprocess_query pg_stats(28 pgs tid 28081 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:06.893237 7fa996c33700 10 mon.0@0(leader).pg v16089274 prepare_update pg_stats(28 pgs tid 28081 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:06.893242 7fa996c33700 10 mon.0@0(leader).pg v16089274 prepare_pg_stats pg_stats(28 pgs tid 28081 v 3959) v1 from osd.2
2013-05-16 05:15:06.893246 7fa996c33700 10 mon.0@0(leader).pg v16089274 got osd.2 osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]) (was osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]))
2013-05-16 05:15:06.893330 7fa996c33700 10 mon.0@0(leader).pg v16089274 preprocess_query pg_stats(27 pgs tid 28173 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:06.893389 7fa996c33700 10 mon.0@0(leader).pg v16089274 prepare_update pg_stats(27 pgs tid 28173 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:06.893396 7fa996c33700 10 mon.0@0(leader).pg v16089274 prepare_pg_stats pg_stats(27 pgs tid 28173 v 3959) v1 from osd.5
2013-05-16 05:15:06.893400 7fa996c33700 10 mon.0@0(leader).pg v16089274 got osd.5 osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]) (was osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]))
2013-05-16 05:15:06.893457 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.6 188.65.144.10:6800/23197 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:06.893464 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x645b780 from 188.65.144.8:6789/0
2013-05-16 05:15:06.893502 7fa996c33700 10 mon.0@0(leader).pg v16089274 preprocess_query pg_stats(25 pgs tid 28213 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:06.893546 7fa996c33700 10 mon.0@0(leader).pg v16089274 prepare_update pg_stats(25 pgs tid 28213 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:06.893553 7fa996c33700 10 mon.0@0(leader).pg v16089274 prepare_pg_stats pg_stats(25 pgs tid 28213 v 3959) v1 from osd.6
2013-05-16 05:15:06.893556 7fa996c33700 10 mon.0@0(leader).pg v16089274 got osd.6 osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]) (was osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]))
2013-05-16 05:15:06.893656 7fa996c33700 10 mon.0@0(leader).mds e4483 preprocess_query mdsbeacon(53566/0 up:active seq 34448 v4483) v2 from mds.0 188.65.144.4:6800/10309
2013-05-16 05:15:06.893704 7fa996c33700 10 mon.0@0(leader).pg v16089274 preprocess_query pg_stats(20 pgs tid 28176 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:06.893752 7fa996c33700 10 mon.0@0(leader).pg v16089274 prepare_update pg_stats(20 pgs tid 28176 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:06.893758 7fa996c33700 10 mon.0@0(leader).pg v16089274 prepare_pg_stats pg_stats(20 pgs tid 28176 v 3959) v1 from osd.4
2013-05-16 05:15:06.893761 7fa996c33700 10 mon.0@0(leader).pg v16089274 got osd.4 osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]) (was osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]))
2013-05-16 05:15:06.893845 7fa996c33700 10 mon.0@0(leader).pg v16089274 preprocess_query pg_stats(21 pgs tid 28066 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:06.893881 7fa996c33700 10 mon.0@0(leader).pg v16089274 prepare_update pg_stats(21 pgs tid 28066 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:06.893886 7fa996c33700 10 mon.0@0(leader).pg v16089274 prepare_pg_stats pg_stats(21 pgs tid 28066 v 3959) v1 from osd.0
2013-05-16 05:15:06.893890 7fa996c33700 10 mon.0@0(leader).pg v16089274 got osd.0 osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]) (was osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]))
2013-05-16 05:15:06.893935 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.3 188.65.144.7:6801/20087 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:06.893941 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x2463c80 from 188.65.144.8:6789/0
2013-05-16 05:15:06.893984 7fa996c33700 10 mon.0@0(leader).pg v16089274 preprocess_query pg_stats(18 pgs tid 28147 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:06.894018 7fa996c33700 10 mon.0@0(leader).pg v16089274 prepare_update pg_stats(18 pgs tid 28147 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:06.894024 7fa996c33700 10 mon.0@0(leader).pg v16089274 prepare_pg_stats pg_stats(18 pgs tid 28147 v 3959) v1 from osd.3
2013-05-16 05:15:06.894028 7fa996c33700 10 mon.0@0(leader).pg v16089274 got osd.3 osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]) (was osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]))
2013-05-16 05:15:06.943766 7fa997983700 10 mon.0@0(leader).pg v16089274 encode_pending v 16089275
2013-05-16 05:15:08.723215 7fa996c33700 10 mon.0@0(leader).log v15758971 update_from_paxos
2013-05-16 05:15:08.723289 7fa996c33700 10 mon.0@0(leader).log v15758971 update_from_paxos version 15758971 summary v 15758970
2013-05-16 05:15:08.723573 7fa996c33700 10 mon.0@0(leader).log v15758971 update_from_paxos latest full 15758970
2013-05-16 05:15:08.723703 7fa996c33700 7 mon.0@0(leader).log v15758971 update_from_paxos applying incremental log 15758971 2013-05-16 05:15:02.134427 mon.0 188.65.144.4:6789/0 13183 : [INF] pgmap v16089273: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3956KB/s rd, 1703KB/s wr, 374op/s
2013-05-16 05:15:08.804430 7fa996c33700 10 mon.0@0(leader).log v15758971 check_subs
2013-05-16 05:15:08.804727 7fa996c33700 10 mon.0@0(leader).log v15758971 create_pending v 15758972
2013-05-16 05:15:08.804776 7fa996c33700 7 mon.0@0(leader).log v15758971 _updated_log for mon.0 188.65.144.4:6789/0
2013-05-16 05:15:08.805027 7fa996c33700 10 mon.0@0(leader).log v15758971 update_from_paxos
2013-05-16 05:15:08.805054 7fa996c33700 10 mon.0@0(leader).log v15758971 update_from_paxos version 15758971 summary v 15758971
2013-05-16 05:15:08.805070 7fa996c33700 10 mon.0@0(leader).log v15758971 preprocess_query log(1 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:08.805171 7fa996c33700 10 mon.0@0(leader).log v15758971 preprocess_log log(1 entries) v1 from mon.0
2013-05-16 05:15:08.805198 7fa996c33700 10 mon.0@0(leader).log v15758971 nothing new
2013-05-16 05:15:09.427902 7fa996c33700 10 mon.0@0(leader).mds e4483 preprocess_query mdsbeacon(53566/0 up:active seq 34449 v4483) v2 from mds.0 188.65.144.4:6800/10309
2013-05-16 05:15:10.127970 7fa996c33700 10 mon.0@0(leader).data_health(12158) service_dispatch mon_health( service 1 op tell e 0 r 0 flags ) v1
2013-05-16 05:15:10.127974 7fa996c33700 10 mon.0@0(leader).data_health(12158) handle_tell mon_health( service 1 op tell e 0 r 0 flags ) v1
2013-05-16 05:15:10.814999 7fa996c33700 7 mon.0@0(leader).pg v16089274 update_from_paxos applying incremental 16089275
2013-05-16 05:15:10.815328 7fa996c33700 10 mon.0@0(leader).pg v16089275 v16089275: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6336KB/s rd, 6513KB/s wr, 967op/s
2013-05-16 05:15:10.815373 7fa996c33700 10 mon.0@0(leader).pg v16089275 map_pg_creates to 0 pgs
2013-05-16 05:15:10.815377 7fa996c33700 10 mon.0@0(leader).pg v16089275 send_pg_creates to 0 pgs
2013-05-16 05:15:10.815379 7fa996c33700 10 mon.0@0(leader).pg v16089275 update_logger
2013-05-16 05:15:10.815420 7fa996c33700 10 mon.0@0(leader).pg v16089275 create_pending v 16089276
2013-05-16 05:15:10.815431 7fa996c33700 7 mon.0@0(leader).pg v16089275 _updated_stats for osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:10.815449 7fa996c33700 7 mon.0@0(leader).pg v16089275 _updated_stats for osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:10.815475 7fa996c33700 7 mon.0@0(leader).pg v16089275 _updated_stats for osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:10.815499 7fa996c33700 7 mon.0@0(leader).pg v16089275 _updated_stats for osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:10.815524 7fa996c33700 7 mon.0@0(leader).pg v16089275 _updated_stats for osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:10.815545 7fa996c33700 7 mon.0@0(leader).pg v16089275 _updated_stats for osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:10.815565 7fa996c33700 7 mon.0@0(leader).pg v16089275 _updated_stats for osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:10.815619 7fa996c33700 10 mon.0@0(leader).pg v16089275 preprocess_query pg_stats(23 pgs tid 28058 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:10.815672 7fa996c33700 10 mon.0@0(leader).pg v16089275 prepare_update pg_stats(23 pgs tid 28058 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:10.815679 7fa996c33700 10 mon.0@0(leader).pg v16089275 prepare_pg_stats pg_stats(23 pgs tid 28058 v 3959) v1 from osd.1
2013-05-16 05:15:10.815685 7fa996c33700 10 mon.0@0(leader).pg v16089275 got osd.1 osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]) (was osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]))
2013-05-16 05:15:10.815756 7fa996c33700 10 mon.0@0(leader).pg v16089275 preprocess_query pg_stats(34 pgs tid 28082 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:10.815802 7fa996c33700 10 mon.0@0(leader).pg v16089275 prepare_update pg_stats(34 pgs tid 28082 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:10.815808 7fa996c33700 10 mon.0@0(leader).pg v16089275 prepare_pg_stats pg_stats(34 pgs tid 28082 v 3959) v1 from osd.2
2013-05-16 05:15:10.815814 7fa996c33700 10 mon.0@0(leader).pg v16089275 got osd.2 osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]) (was osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]))
2013-05-16 05:15:10.815891 7fa996c33700 10 mon.0@0(leader).pg v16089275 preprocess_query pg_stats(27 pgs tid 28067 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:10.815932 7fa996c33700 10 mon.0@0(leader).pg v16089275 prepare_update pg_stats(27 pgs tid 28067 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:10.815938 7fa996c33700 10 mon.0@0(leader).pg v16089275 prepare_pg_stats pg_stats(27 pgs tid 28067 v 3959) v1 from osd.0
2013-05-16 05:15:10.815943 7fa996c33700 10 mon.0@0(leader).pg v16089275 got osd.0 osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]) (was osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]))
2013-05-16 05:15:10.816022 7fa996c33700 10 mon.0@0(leader).pg v16089275 preprocess_query pg_stats(24 pgs tid 28177 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:10.816062 7fa996c33700 10 mon.0@0(leader).pg v16089275 prepare_update pg_stats(24 pgs tid 28177 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:10.816068 7fa996c33700 10 mon.0@0(leader).pg v16089275 prepare_pg_stats pg_stats(24 pgs tid 28177 v 3959) v1 from osd.4
2013-05-16 05:15:10.816074 7fa996c33700 10 mon.0@0(leader).pg v16089275 got osd.4 osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]) (was osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]))
2013-05-16 05:15:10.816125 7fa996c33700 10 mon.0@0(leader).pg v16089275 check_osd_map already seen 3959 >= 3959
2013-05-16 05:15:10.816129 7fa996c33700 10 mon.0@0(leader).pg v16089275 update_logger
2013-05-16 05:15:10.816145 7fa996c33700 0 log [INF] : pgmap v16089275: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6336KB/s rd, 6513KB/s wr, 967op/s
2013-05-16 05:15:10.816247 7fa996c33700 10 mon.0@0(leader).log v15758971 update_from_paxos
2013-05-16 05:15:10.816281 7fa996c33700 10 mon.0@0(leader).log v15758971 update_from_paxos version 15758971 summary v 15758971
2013-05-16 05:15:10.816314 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:10.816523 7fa996c33700 10 mon.0@0(leader).log v15758971 update_from_paxos
2013-05-16 05:15:10.816555 7fa996c33700 10 mon.0@0(leader).log v15758971 update_from_paxos version 15758971 summary v 15758971
2013-05-16 05:15:10.816582 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:10.816637 7fa996c33700 10 mon.0@0(leader).log v15758971 update_from_paxos
2013-05-16 05:15:10.816658 7fa996c33700 10 mon.0@0(leader).log v15758971 update_from_paxos version 15758971 summary v 15758971
2013-05-16 05:15:10.816675 7fa996c33700 10 mon.0@0(leader).log v15758971 preprocess_query log(2 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:10.816690 7fa996c33700 10 mon.0@0(leader).log v15758971 preprocess_log log(2 entries) v1 from mon.0
2013-05-16 05:15:10.816721 7fa996c33700 10 mon.0@0(leader).log v15758971 prepare_update log(2 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:10.816736 7fa996c33700 10 mon.0@0(leader).log v15758971 prepare_log log(2 entries) v1 from mon.0
2013-05-16 05:15:10.816749 7fa996c33700 10 mon.0@0(leader).log v15758971 logging 2013-05-16 05:15:05.138392 mon.0 188.65.144.4:6789/0 13184 : [INF] pgmap v16089274: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 9890KB/s rd, 8302KB/s wr, 1338op/s
2013-05-16 05:15:10.816777 7fa996c33700 10 mon.0@0(leader).log v15758971 logging 2013-05-16 05:15:10.816147 mon.0 188.65.144.4:6789/0 13185 : [INF] pgmap v16089275: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6336KB/s rd, 6513KB/s wr, 967op/s
2013-05-16 05:15:10.817842 7fa996c33700 10 mon.0@0(leader).log v15758971 update_from_paxos
2013-05-16 05:15:10.817925 7fa996c33700 10 mon.0@0(leader).log v15758971 update_from_paxos version 15758971 summary v 15758971
2013-05-16 05:15:10.817984 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:10.818073 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.6 188.65.144.10:6800/23197 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:10.818080 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x68c3a00 from 188.65.144.8:6789/0
2013-05-16 05:15:10.818121 7fa996c33700 10 mon.0@0(leader).pg v16089275 preprocess_query pg_stats(32 pgs tid 28214 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:10.818181 7fa996c33700 10 mon.0@0(leader).pg v16089275 prepare_update pg_stats(32 pgs tid 28214 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:10.818191 7fa996c33700 10 mon.0@0(leader).pg v16089275 prepare_pg_stats pg_stats(32 pgs tid 28214 v 3959) v1 from osd.6
2013-05-16 05:15:10.818197 7fa996c33700 10 mon.0@0(leader).pg v16089275 got osd.6 osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]) (was osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]))
2013-05-16 05:15:10.866156 7fa997983700 10 mon.0@0(leader).pg v16089275 encode_pending v 16089276
2013-05-16 05:15:11.424368 7fa997983700 10 mon.0@0(leader).log v15758971 encode_full log v 15758971
2013-05-16 05:15:11.424694 7fa997983700 10 mon.0@0(leader).log v15758971 encode_pending v15758972
2013-05-16 05:15:11.424742 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.3 188.65.144.7:6801/20087 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:11.424757 7fa996c33700 10 mon.0@0(leader) e8 mesg 0xba31500 from 188.65.144.8:6789/0
2013-05-16 05:15:12.425174 7fa997983700 10 mon.0@0(leader).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:12.425221 7fa997983700 10 mon.0@0(leader).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:12.425288 7fa997983700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:12.425336 7fa997983700 10 mon.0@0(leader).auth v8461 auth
2013-05-16 05:15:12.425612 7fa996c33700 7 mon.0@0(leader).pg v16089275 update_from_paxos applying incremental 16089276
2013-05-16 05:15:12.425952 7fa996c33700 10 mon.0@0(leader).pg v16089276 v16089276: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3626KB/s rd, 3249KB/s wr, 562op/s
2013-05-16 05:15:12.425999 7fa996c33700 10 mon.0@0(leader).pg v16089276 map_pg_creates to 0 pgs
2013-05-16 05:15:12.426006 7fa996c33700 10 mon.0@0(leader).pg v16089276 send_pg_creates to 0 pgs
2013-05-16 05:15:12.426008 7fa996c33700 10 mon.0@0(leader).pg v16089276 update_logger
2013-05-16 05:15:12.426055 7fa996c33700 10 mon.0@0(leader).pg v16089276 create_pending v 16089277
2013-05-16 05:15:12.426070 7fa996c33700 7 mon.0@0(leader).pg v16089276 _updated_stats for osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:12.426100 7fa996c33700 7 mon.0@0(leader).pg v16089276 _updated_stats for osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:12.426117 7fa996c33700 7 mon.0@0(leader).pg v16089276 _updated_stats for osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:12.426134 7fa996c33700 7 mon.0@0(leader).pg v16089276 _updated_stats for osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:12.426158 7fa996c33700 7 mon.0@0(leader).pg v16089276 _updated_stats for osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:12.426339 7fa996c33700 10 mon.0@0(leader).pg v16089276 preprocess_query pg_stats(18 pgs tid 28174 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:12.426387 7fa996c33700 10 mon.0@0(leader).pg v16089276 prepare_update pg_stats(18 pgs tid 28174 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:12.426394 7fa996c33700 10 mon.0@0(leader).pg v16089276 prepare_pg_stats pg_stats(18 pgs tid 28174 v 3959) v1 from osd.5
2013-05-16 05:15:12.426400 7fa996c33700 10 mon.0@0(leader).pg v16089276 got osd.5 osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]) (was osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]))
2013-05-16 05:15:12.426469 7fa996c33700 10 mon.0@0(leader).pg v16089276 preprocess_query pg_stats(8 pgs tid 28148 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:12.426535 7fa996c33700 10 mon.0@0(leader).pg v16089276 prepare_update pg_stats(8 pgs tid 28148 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:12.426542 7fa996c33700 10 mon.0@0(leader).pg v16089276 prepare_pg_stats pg_stats(8 pgs tid 28148 v 3959) v1 from osd.3
2013-05-16 05:15:12.426547 7fa996c33700 10 mon.0@0(leader).pg v16089276 got osd.3 osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]) (was osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]))
2013-05-16 05:15:12.426590 7fa996c33700 10 mon.0@0(leader).pg v16089276 check_osd_map already seen 3959 >= 3959
2013-05-16 05:15:12.426594 7fa996c33700 10 mon.0@0(leader).pg v16089276 update_logger
2013-05-16 05:15:12.426610 7fa996c33700 0 log [INF] : pgmap v16089276: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3626KB/s rd, 3249KB/s wr, 562op/s
2013-05-16 05:15:13.154950 7fa997983700 10 mon.0@0(leader).pg v16089276 encode_pending v 16089277
2013-05-16 05:15:13.155423 7fa996c33700 10 mon.0@0(leader).mds e4483 preprocess_query mdsbeacon(53566/0 up:active seq 34450 v4483) v2 from mds.0 188.65.144.4:6800/10309
2013-05-16 05:15:13.633690 7fa9f3e72700 0 -- 188.65.144.4:6789/0 >> 188.65.144.5:6789/0 pipe(0x200d500 sd=25 :55877 s=2 pgs=3429 cs=1 l=0).fault with nothing to send, going to standby
2013-05-16 05:15:13.746997 7fa996c33700 10 mon.0@0(leader).log v15758972 update_from_paxos
2013-05-16 05:15:13.747076 7fa996c33700 10 mon.0@0(leader).log v15758972 update_from_paxos version 15758972 summary v 15758971
2013-05-16 05:15:13.747158 7fa996c33700 10 mon.0@0(leader).log v15758972 update_from_paxos latest full 15758971
2013-05-16 05:15:13.747321 7fa996c33700 7 mon.0@0(leader).log v15758972 update_from_paxos applying incremental log 15758972 2013-05-16 05:15:05.138392 mon.0 188.65.144.4:6789/0 13184 : [INF] pgmap v16089274: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 9890KB/s rd, 8302KB/s wr, 1338op/s
2013-05-16 05:15:13.747371 7fa996c33700 7 mon.0@0(leader).log v15758972 update_from_paxos applying incremental log 15758972 2013-05-16 05:15:10.816147 mon.0 188.65.144.4:6789/0 13185 : [INF] pgmap v16089275: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6336KB/s rd, 6513KB/s wr, 967op/s
2013-05-16 05:15:13.778354 7fa996c33700 10 mon.0@0(leader).log v15758972 check_subs
2013-05-16 05:15:13.778421 7fa996c33700 10 mon.0@0(leader).log v15758972 create_pending v 15758973
2013-05-16 05:15:13.778615 7fa996c33700 7 mon.0@0(leader).log v15758972 _updated_log for mon.0 188.65.144.4:6789/0
2013-05-16 05:15:13.778659 7fa996c33700 10 mon.0@0(leader).log v15758972 update_from_paxos
2013-05-16 05:15:13.778688 7fa996c33700 10 mon.0@0(leader).log v15758972 update_from_paxos version 15758972 summary v 15758972
2013-05-16 05:15:13.778710 7fa996c33700 10 mon.0@0(leader).log v15758972 preprocess_query log(1 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:13.778727 7fa996c33700 10 mon.0@0(leader).log v15758972 preprocess_log log(1 entries) v1 from mon.0
2013-05-16 05:15:13.778751 7fa996c33700 10 mon.0@0(leader).log v15758972 nothing new
2013-05-16 05:15:15.225366 7fa996c33700 7 mon.0@0(leader).pg v16089276 update_from_paxos applying incremental 16089277
2013-05-16 05:15:15.225447 7fa996c33700 10 mon.0@0(leader).pg v16089277 v16089277: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3262KB/s rd, 1441KB/s wr, 455op/s
2013-05-16 05:15:15.225485 7fa996c33700 10 mon.0@0(leader).pg v16089277 map_pg_creates to 0 pgs
2013-05-16 05:15:15.225489 7fa996c33700 10 mon.0@0(leader).pg v16089277 send_pg_creates to 0 pgs
2013-05-16 05:15:15.225491 7fa996c33700 10 mon.0@0(leader).pg v16089277 update_logger
2013-05-16 05:15:15.225772 7fa996c33700 10 mon.0@0(leader).pg v16089277 create_pending v 16089278
2013-05-16 05:15:15.225792 7fa996c33700 7 mon.0@0(leader).pg v16089277 _updated_stats for osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:15.225819 7fa996c33700 7 mon.0@0(leader).pg v16089277 _updated_stats for osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:15.225870 7fa996c33700 10 mon.0@0(leader).pg v16089277 preprocess_query pg_stats(2 pgs tid 28059 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:15.225927 7fa996c33700 10 mon.0@0(leader).pg v16089277 prepare_update pg_stats(2 pgs tid 28059 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:15.225933 7fa996c33700 10 mon.0@0(leader).pg v16089277 prepare_pg_stats pg_stats(2 pgs tid 28059 v 3959) v1 from osd.1
2013-05-16 05:15:15.225940 7fa996c33700 10 mon.0@0(leader).pg v16089277 got osd.1 osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]) (was osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]))
2013-05-16 05:15:15.225996 7fa996c33700 10 mon.0@0(leader).pg v16089277 preprocess_query pg_stats(4 pgs tid 28083 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:15.226051 7fa996c33700 10 mon.0@0(leader).pg v16089277 prepare_update pg_stats(4 pgs tid 28083 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:15.226057 7fa996c33700 10 mon.0@0(leader).pg v16089277 prepare_pg_stats pg_stats(4 pgs tid 28083 v 3959) v1 from osd.2
2013-05-16 05:15:15.226062 7fa996c33700 10 mon.0@0(leader).pg v16089277 got osd.2 osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]) (was osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]))
2013-05-16 05:15:15.226113 7fa996c33700 10 mon.0@0(leader).pg v16089277 preprocess_query pg_stats(6 pgs tid 28068 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:15.226157 7fa996c33700 10 mon.0@0(leader).pg v16089277 prepare_update pg_stats(6 pgs tid 28068 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:15.226163 7fa996c33700 10 mon.0@0(leader).pg v16089277 prepare_pg_stats pg_stats(6 pgs tid 28068 v 3959) v1 from osd.0
2013-05-16 05:15:15.226168 7fa996c33700 10 mon.0@0(leader).pg v16089277 got osd.0 osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]) (was osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]))
2013-05-16 05:15:15.226204 7fa996c33700 10 mon.0@0(leader).pg v16089277 check_osd_map already seen 3959 >= 3959
2013-05-16 05:15:15.226208 7fa996c33700 10 mon.0@0(leader).pg v16089277 update_logger
2013-05-16 05:15:15.226223 7fa996c33700 0 log [INF] : pgmap v16089277: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3262KB/s rd, 1441KB/s wr, 455op/s
2013-05-16 05:15:15.226449 7fa996c33700 10 mon.0@0(leader).log v15758972 update_from_paxos
2013-05-16 05:15:15.226490 7fa996c33700 10 mon.0@0(leader).log v15758972 update_from_paxos version 15758972 summary v 15758972
2013-05-16 05:15:15.226602 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:15.226747 7fa996c33700 10 mon.0@0(leader).log v15758972 update_from_paxos
2013-05-16 05:15:15.226778 7fa996c33700 10 mon.0@0(leader).log v15758972 update_from_paxos version 15758972 summary v 15758972
2013-05-16 05:15:15.226791 7fa996c33700 10 mon.0@0(leader).log v15758972 preprocess_query log(2 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:15.226820 7fa996c33700 10 mon.0@0(leader).log v15758972 preprocess_log log(2 entries) v1 from mon.0
2013-05-16 05:15:15.226862 7fa996c33700 10 mon.0@0(leader).log v15758972 prepare_update log(2 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:15.226880 7fa996c33700 10 mon.0@0(leader).log v15758972 prepare_log log(2 entries) v1 from mon.0
2013-05-16 05:15:15.226902 7fa996c33700 10 mon.0@0(leader).log v15758972 logging 2013-05-16 05:15:12.426612 mon.0 188.65.144.4:6789/0 13186 : [INF] pgmap v16089276: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3626KB/s rd, 3249KB/s wr, 562op/s
2013-05-16 05:15:15.227121 7fa996c33700 10 mon.0@0(leader).log v15758972 logging 2013-05-16 05:15:15.226225 mon.0 188.65.144.4:6789/0 13187 : [INF] pgmap v16089277: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3262KB/s rd, 1441KB/s wr, 455op/s
2013-05-16 05:15:15.228548 7fa996c33700 10 mon.0@0(leader).log v15758972 update_from_paxos
2013-05-16 05:15:15.228656 7fa996c33700 10 mon.0@0(leader).log v15758972 update_from_paxos version 15758972 summary v 15758972
2013-05-16 05:15:15.228761 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:15.289296 7fa996c33700 10 mon.0@0(leader).log v15758972 update_from_paxos
2013-05-16 05:15:15.289331 7fa996c33700 10 mon.0@0(leader).log v15758972 update_from_paxos version 15758972 summary v 15758972
2013-05-16 05:15:15.289404 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:15.370736 7fa996c33700 10 mon.0@0(leader).pg v16089277 preprocess_query pg_stats(9 pgs tid 28178 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:15.370782 7fa996c33700 10 mon.0@0(leader).pg v16089277 prepare_update pg_stats(9 pgs tid 28178 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:15.370789 7fa996c33700 10 mon.0@0(leader).pg v16089277 prepare_pg_stats pg_stats(9 pgs tid 28178 v 3959) v1 from osd.4
2013-05-16 05:15:15.370796 7fa996c33700 10 mon.0@0(leader).pg v16089277 got osd.4 osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]) (was osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]))
2013-05-16 05:15:15.550558 7fa997983700 10 mon.0@0(leader).log v15758972 encode_full log v 15758972
2013-05-16 05:15:15.550808 7fa997983700 10 mon.0@0(leader).log v15758972 encode_pending v15758973
2013-05-16 05:15:15.998175 7fa997983700 10 mon.0@0(leader).pg v16089277 encode_pending v 16089278
2013-05-16 05:15:15.998342 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.6 188.65.144.10:6800/23197 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:15.998358 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x68c3a00 from 188.65.144.8:6789/0
2013-05-16 05:15:15.998452 7fa996c33700 10 mon.0@0(leader).mds e4483 preprocess_query mdsbeacon(53566/0 up:active seq 34451 v4483) v2 from mds.0 188.65.144.4:6800/10309
2013-05-16 05:15:16.786223 7fa996c33700 10 mon.0@0(leader).log v15758973 update_from_paxos
2013-05-16 05:15:16.786283 7fa996c33700 10 mon.0@0(leader).log v15758973 update_from_paxos version 15758973 summary v 15758972
2013-05-16 05:15:16.786306 7fa996c33700 10 mon.0@0(leader).log v15758973 update_from_paxos latest full 15758972
2013-05-16 05:15:16.786347 7fa996c33700 7 mon.0@0(leader).log v15758973 update_from_paxos applying incremental log 15758973 2013-05-16 05:15:12.426612 mon.0 188.65.144.4:6789/0 13186 : [INF] pgmap v16089276: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3626KB/s rd, 3249KB/s wr, 562op/s
2013-05-16 05:15:16.786378 7fa996c33700 7 mon.0@0(leader).log v15758973 update_from_paxos applying incremental log 15758973 2013-05-16 05:15:15.226225 mon.0 188.65.144.4:6789/0 13187 : [INF] pgmap v16089277: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3262KB/s rd, 1441KB/s wr, 455op/s
2013-05-16 05:15:16.786409 7fa996c33700 10 mon.0@0(leader).log v15758973 check_subs
2013-05-16 05:15:16.786460 7fa996c33700 10 mon.0@0(leader).log v15758973 create_pending v 15758974
2013-05-16 05:15:16.786580 7fa996c33700 7 mon.0@0(leader).log v15758973 _updated_log for mon.0 188.65.144.4:6789/0
2013-05-16 05:15:17.300190 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.3 188.65.144.7:6801/20087 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:17.300200 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x7523a00 from 188.65.144.8:6789/0
2013-05-16 05:15:18.182269 7fa997983700 10 mon.0@0(leader).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:18.182376 7fa997983700 10 mon.0@0(leader).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:18.182416 7fa997983700 10 mon.0@0(leader).log v15758973 update_from_paxos
2013-05-16 05:15:18.182444 7fa997983700 10 mon.0@0(leader).log v15758973 update_from_paxos version 15758973 summary v 15758973
2013-05-16 05:15:18.182537 7fa997983700 10 mon.0@0(leader).log v15758973 log
2013-05-16 05:15:18.182568 7fa997983700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:18.182594 7fa997983700 10 mon.0@0(leader).auth v8461 auth
2013-05-16 05:15:18.182851 7fa996c33700 7 mon.0@0(leader).pg v16089277 update_from_paxos applying incremental 16089278
2013-05-16 05:15:18.182921 7fa996c33700 10 mon.0@0(leader).pg v16089278 v16089278: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 830KB/s rd, 512KB/s wr, 186op/s
2013-05-16 05:15:18.183095 7fa996c33700 10 mon.0@0(leader).pg v16089278 map_pg_creates to 0 pgs
2013-05-16 05:15:18.183100 7fa996c33700 10 mon.0@0(leader).pg v16089278 send_pg_creates to 0 pgs
2013-05-16 05:15:18.183103 7fa996c33700 10 mon.0@0(leader).pg v16089278 update_logger
2013-05-16 05:15:18.183135 7fa996c33700 10 mon.0@0(leader).pg v16089278 create_pending v 16089279
2013-05-16 05:15:18.183154 7fa996c33700 7 mon.0@0(leader).pg v16089278 _updated_stats for osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:18.183179 7fa996c33700 7 mon.0@0(leader).pg v16089278 _updated_stats for osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:18.183200 7fa996c33700 7 mon.0@0(leader).pg v16089278 _updated_stats for osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:18.183218 7fa996c33700 7 mon.0@0(leader).pg v16089278 _updated_stats for osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:18.183283 7fa996c33700 10 mon.0@0(leader).pg v16089278 preprocess_query pg_stats(12 pgs tid 28215 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:18.183613 7fa996c33700 10 mon.0@0(leader).pg v16089278 prepare_update pg_stats(12 pgs tid 28215 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:18.183622 7fa996c33700 10 mon.0@0(leader).pg v16089278 prepare_pg_stats pg_stats(12 pgs tid 28215 v 3959) v1 from osd.6
2013-05-16 05:15:18.183628 7fa996c33700 10 mon.0@0(leader).pg v16089278 got osd.6 osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]) (was osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]))
2013-05-16 05:15:18.183695 7fa996c33700 10 mon.0@0(leader).pg v16089278 preprocess_query pg_stats(9 pgs tid 28175 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:18.183738 7fa996c33700 10 mon.0@0(leader).pg v16089278 prepare_update pg_stats(9 pgs tid 28175 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:18.183744 7fa996c33700 10 mon.0@0(leader).pg v16089278 prepare_pg_stats pg_stats(9 pgs tid 28175 v 3959) v1 from osd.5
2013-05-16 05:15:18.183749 7fa996c33700 10 mon.0@0(leader).pg v16089278 got osd.5 osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]) (was osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]))
2013-05-16 05:15:18.183801 7fa996c33700 10 mon.0@0(leader).pg v16089278 preprocess_query pg_stats(10 pgs tid 28149 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:18.183852 7fa996c33700 10 mon.0@0(leader).pg v16089278 prepare_update pg_stats(10 pgs tid 28149 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:18.183858 7fa996c33700 10 mon.0@0(leader).pg v16089278 prepare_pg_stats pg_stats(10 pgs tid 28149 v 3959) v1 from osd.3
2013-05-16 05:15:18.183863 7fa996c33700 10 mon.0@0(leader).pg v16089278 got osd.3 osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]) (was osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]))
2013-05-16 05:15:18.184032 7fa996c33700 10 mon.0@0(leader).pg v16089278 check_osd_map already seen 3959 >= 3959
2013-05-16 05:15:18.184037 7fa996c33700 10 mon.0@0(leader).pg v16089278 update_logger
2013-05-16 05:15:18.184053 7fa996c33700 0 log [INF] : pgmap v16089278: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 830KB/s rd, 512KB/s wr, 186op/s
2013-05-16 05:15:18.184141 7fa996c33700 10 mon.0@0(leader).log v15758973 update_from_paxos
2013-05-16 05:15:18.184173 7fa996c33700 10 mon.0@0(leader).log v15758973 update_from_paxos version 15758973 summary v 15758973
2013-05-16 05:15:18.184212 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:18.184362 7fa996c33700 10 mon.0@0(leader).log v15758973 update_from_paxos
2013-05-16 05:15:18.184397 7fa996c33700 10 mon.0@0(leader).log v15758973 update_from_paxos version 15758973 summary v 15758973
2013-05-16 05:15:18.184416 7fa996c33700 10 mon.0@0(leader).log v15758973 preprocess_query log(1 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:18.184440 7fa996c33700 10 mon.0@0(leader).log v15758973 preprocess_log log(1 entries) v1 from mon.0
2013-05-16 05:15:18.184614 7fa996c33700 10 mon.0@0(leader).log v15758973 prepare_update log(1 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:18.184635 7fa996c33700 10 mon.0@0(leader).log v15758973 prepare_log log(1 entries) v1 from mon.0
2013-05-16 05:15:18.184657 7fa996c33700 10 mon.0@0(leader).log v15758973 logging 2013-05-16 05:15:18.184056 mon.0 188.65.144.4:6789/0 13188 : [INF] pgmap v16089278: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 830KB/s rd, 512KB/s wr, 186op/s
2013-05-16 05:15:18.185755 7fa996c33700 10 mon.0@0(leader).log v15758973 update_from_paxos
2013-05-16 05:15:18.185863 7fa996c33700 10 mon.0@0(leader).log v15758973 update_from_paxos version 15758973 summary v 15758973
2013-05-16 05:15:18.185898 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:18.270168 7fa996c33700 10 mon.0@0(leader).log v15758973 update_from_paxos
2013-05-16 05:15:18.270203 7fa996c33700 10 mon.0@0(leader).log v15758973 update_from_paxos version 15758973 summary v 15758973
2013-05-16 05:15:18.270232 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:18.403113 7fa997983700 10 mon.0@0(leader).log v15758973 encode_full log v 15758973
2013-05-16 05:15:18.403651 7fa997983700 10 mon.0@0(leader).log v15758973 encode_pending v15758974
2013-05-16 05:15:19.062319 7fa997983700 10 mon.0@0(leader).pg v16089278 encode_pending v 16089279
2013-05-16 05:15:20.131614 7fa996c33700 10 mon.0@0(leader).log v15758974 update_from_paxos
2013-05-16 05:15:20.132528 7fa996c33700 10 mon.0@0(leader).log v15758974 update_from_paxos version 15758974 summary v 15758973
2013-05-16 05:15:20.132575 7fa996c33700 10 mon.0@0(leader).log v15758974 update_from_paxos latest full 15758973
2013-05-16 05:15:20.132639 7fa996c33700 7 mon.0@0(leader).log v15758974 update_from_paxos applying incremental log 15758974 2013-05-16 05:15:18.184056 mon.0 188.65.144.4:6789/0 13188 : [INF] pgmap v16089278: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 830KB/s rd, 512KB/s wr, 186op/s
2013-05-16 05:15:20.132814 7fa996c33700 10 mon.0@0(leader).log v15758974 check_subs
2013-05-16 05:15:20.132874 7fa996c33700 10 mon.0@0(leader).log v15758974 create_pending v 15758975
2013-05-16 05:15:20.133074 7fa996c33700 7 mon.0@0(leader).log v15758974 _updated_log for mon.0 188.65.144.4:6789/0
2013-05-16 05:15:21.124055 7fa996c33700 10 mon.0@0(leader).mds e4483 preprocess_query mdsbeacon(53566/0 up:active seq 34452 v4483) v2 from mds.0 188.65.144.4:6800/10309
2013-05-16 05:15:21.124196 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.6 188.65.144.10:6800/23197 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:21.124203 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x6861780 from 188.65.144.8:6789/0
2013-05-16 05:15:23.183075 7fa997983700 10 mon.0@0(leader).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:23.183115 7fa997983700 10 mon.0@0(leader).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:23.183384 7fa997983700 10 mon.0@0(leader).log v15758974 update_from_paxos
2013-05-16 05:15:23.183425 7fa997983700 10 mon.0@0(leader).log v15758974 update_from_paxos version 15758974 summary v 15758974
2013-05-16 05:15:23.183449 7fa997983700 10 mon.0@0(leader).log v15758974 log
2013-05-16 05:15:23.183488 7fa997983700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:23.183515 7fa997983700 10 mon.0@0(leader).auth v8461 auth
2013-05-16 05:15:23.469822 7fa996c33700 7 mon.0@0(leader).pg v16089278 update_from_paxos applying incremental 16089279
2013-05-16 05:15:23.469907 7fa996c33700 10 mon.0@0(leader).pg v16089279 v16089279: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 245KB/s rd, 747KB/s wr, 148op/s
2013-05-16 05:15:23.469939 7fa996c33700 10 mon.0@0(leader).pg v16089279 map_pg_creates to 0 pgs
2013-05-16 05:15:23.469943 7fa996c33700 10 mon.0@0(leader).pg v16089279 send_pg_creates to 0 pgs
2013-05-16 05:15:23.469945 7fa996c33700 10 mon.0@0(leader).pg v16089279 update_logger
2013-05-16 05:15:23.469974 7fa996c33700 10 mon.0@0(leader).pg v16089279 create_pending v 16089280
2013-05-16 05:15:23.469996 7fa996c33700 7 mon.0@0(leader).pg v16089279 _updated_stats for osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:23.470021 7fa996c33700 7 mon.0@0(leader).pg v16089279 _updated_stats for osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:23.470048 7fa996c33700 7 mon.0@0(leader).pg v16089279 _updated_stats for osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:23.470212 7fa996c33700 10 mon.0@0(leader).pg v16089279 preprocess_query pg_stats(20 pgs tid 28060 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:23.470284 7fa996c33700 10 mon.0@0(leader).pg v16089279 prepare_update pg_stats(20 pgs tid 28060 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:23.470291 7fa996c33700 10 mon.0@0(leader).pg v16089279 prepare_pg_stats pg_stats(20 pgs tid 28060 v 3959) v1 from osd.1
2013-05-16 05:15:23.470296 7fa996c33700 10 mon.0@0(leader).pg v16089279 got osd.1 osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]) (was osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]))
2013-05-16 05:15:23.470378 7fa996c33700 10 mon.0@0(leader).pg v16089279 preprocess_query pg_stats(22 pgs tid 28084 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:23.470437 7fa996c33700 10 mon.0@0(leader).pg v16089279 prepare_update pg_stats(22 pgs tid 28084 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:23.470447 7fa996c33700 10 mon.0@0(leader).pg v16089279 prepare_pg_stats pg_stats(22 pgs tid 28084 v 3959) v1 from osd.2
2013-05-16 05:15:23.470453 7fa996c33700 10 mon.0@0(leader).pg v16089279 got osd.2 osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]) (was osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]))
2013-05-16 05:15:23.470533 7fa996c33700 10 mon.0@0(leader).pg v16089279 preprocess_query pg_stats(24 pgs tid 28069 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:23.470571 7fa996c33700 10 mon.0@0(leader).pg v16089279 prepare_update pg_stats(24 pgs tid 28069 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:23.470577 7fa996c33700 10 mon.0@0(leader).pg v16089279 prepare_pg_stats pg_stats(24 pgs tid 28069 v 3959) v1 from osd.0
2013-05-16 05:15:23.470581 7fa996c33700 10 mon.0@0(leader).pg v16089279 got osd.0 osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]) (was osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]))
2013-05-16 05:15:23.470644 7fa996c33700 10 mon.0@0(leader).pg v16089279 preprocess_query pg_stats(19 pgs tid 28179 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:23.470681 7fa996c33700 10 mon.0@0(leader).pg v16089279 prepare_update pg_stats(19 pgs tid 28179 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:23.470686 7fa996c33700 10 mon.0@0(leader).pg v16089279 prepare_pg_stats pg_stats(19 pgs tid 28179 v 3959) v1 from osd.4
2013-05-16 05:15:23.470694 7fa996c33700 10 mon.0@0(leader).pg v16089279 got osd.4 osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]) (was osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]))
2013-05-16 05:15:23.470755 7fa996c33700 10 mon.0@0(leader).pg v16089279 preprocess_query pg_stats(27 pgs tid 28176 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:23.470792 7fa996c33700 10 mon.0@0(leader).pg v16089279 prepare_update pg_stats(27 pgs tid 28176 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:23.470797 7fa996c33700 10 mon.0@0(leader).pg v16089279 prepare_pg_stats pg_stats(27 pgs tid 28176 v 3959) v1 from osd.5
2013-05-16 05:15:23.470801 7fa996c33700 10 mon.0@0(leader).pg v16089279 got osd.5 osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]) (was osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]))
2013-05-16 05:15:23.470867 7fa996c33700 10 mon.0@0(leader).pg v16089279 preprocess_query pg_stats(28 pgs tid 28216 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:23.470906 7fa996c33700 10 mon.0@0(leader).pg v16089279 prepare_update pg_stats(28 pgs tid 28216 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:23.470912 7fa996c33700 10 mon.0@0(leader).pg v16089279 prepare_pg_stats pg_stats(28 pgs tid 28216 v 3959) v1 from osd.6
2013-05-16 05:15:23.470915 7fa996c33700 10 mon.0@0(leader).pg v16089279 got osd.6 osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]) (was osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]))
2013-05-16 05:15:23.470960 7fa996c33700 10 mon.0@0(leader).pg v16089279 check_osd_map already seen 3959 >= 3959
2013-05-16 05:15:23.470964 7fa996c33700 10 mon.0@0(leader).pg v16089279 update_logger
2013-05-16 05:15:23.470976 7fa996c33700 0 log [INF] : pgmap v16089279: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 245KB/s rd, 747KB/s wr, 148op/s
2013-05-16 05:15:23.471064 7fa996c33700 10 mon.0@0(leader).log v15758974 update_from_paxos
2013-05-16 05:15:23.471091 7fa996c33700 10 mon.0@0(leader).log v15758974 update_from_paxos version 15758974 summary v 15758974
2013-05-16 05:15:23.471124 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:23.471202 7fa996c33700 10 mon.0@0(leader).log v15758974 update_from_paxos
2013-05-16 05:15:23.471235 7fa996c33700 10 mon.0@0(leader).log v15758974 update_from_paxos version 15758974 summary v 15758974
2013-05-16 05:15:23.471266 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:23.471333 7fa996c33700 10 mon.0@0(leader).log v15758974 update_from_paxos
2013-05-16 05:15:23.471359 7fa996c33700 10 mon.0@0(leader).log v15758974 update_from_paxos version 15758974 summary v 15758974
2013-05-16 05:15:23.471388 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:23.471445 7fa996c33700 10 mon.0@0(leader).log v15758974 update_from_paxos
2013-05-16 05:15:23.471480 7fa996c33700 10 mon.0@0(leader).log v15758974 update_from_paxos version 15758974 summary v 15758974
2013-05-16 05:15:23.471498 7fa996c33700 10 mon.0@0(leader).log v15758974 preprocess_query log(1 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:23.471518 7fa996c33700 10 mon.0@0(leader).log v15758974 preprocess_log log(1 entries) v1 from mon.0
2013-05-16 05:15:23.471557 7fa996c33700 10 mon.0@0(leader).log v15758974 prepare_update log(1 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:23.471696 7fa996c33700 10 mon.0@0(leader).log v15758974 prepare_log log(1 entries) v1 from mon.0
2013-05-16 05:15:23.471723 7fa996c33700 10 mon.0@0(leader).log v15758974 logging 2013-05-16 05:15:23.470978 mon.0 188.65.144.4:6789/0 13189 : [INF] pgmap v16089279: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 245KB/s rd, 747KB/s wr, 148op/s
2013-05-16 05:15:23.471795 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.3 188.65.144.7:6801/20087 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:23.471803 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x2695500 from 188.65.144.8:6789/0
2013-05-16 05:15:23.471842 7fa996c33700 10 mon.0@0(leader).pg v16089279 preprocess_query pg_stats(18 pgs tid 28150 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:23.471882 7fa996c33700 10 mon.0@0(leader).pg v16089279 prepare_update pg_stats(18 pgs tid 28150 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:23.471889 7fa996c33700 10 mon.0@0(leader).pg v16089279 prepare_pg_stats pg_stats(18 pgs tid 28150 v 3959) v1 from osd.3
2013-05-16 05:15:23.471894 7fa996c33700 10 mon.0@0(leader).pg v16089279 got osd.3 osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]) (was osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]))
2013-05-16 05:15:23.520966 7fa997983700 10 mon.0@0(leader).pg v16089279 encode_pending v 16089280
2013-05-16 05:15:27.445858 7fa997983700 10 mon.0@0(leader).log v15758974 encode_full log v 15758974
2013-05-16 05:15:27.446002 7fa997983700 10 mon.0@0(leader).log v15758974 encode_pending v15758975
2013-05-16 05:15:27.446200 7fa996c33700 10 mon.0@0(leader).mds e4483 preprocess_query mdsbeacon(53566/0 up:active seq 34453 v4483) v2 from mds.0 188.65.144.4:6800/10309
2013-05-16 05:15:27.446534 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.6 188.65.144.10:6800/23197 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:27.446541 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x7497c80 from 188.65.144.8:6789/0
2013-05-16 05:15:27.446574 7fa996c33700 10 mon.0@0(leader) e8 handle_subscribe mon_subscribe({mdsmap=4484+,monmap=9+,osdmap=3960}) v2
2013-05-16 05:15:27.446580 7fa996c33700 10 mon.0@0(leader) e8 check_sub monmap next 9 have 8
2013-05-16 05:15:27.446597 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.3 188.65.144.7:6801/20087 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:27.446601 7fa996c33700 10 mon.0@0(leader) e8 mesg 0xbb27c80 from 188.65.144.8:6789/0
2013-05-16 05:15:28.281245 7fa997983700 10 mon.0@0(leader).data_health(12158) service_tick
2013-05-16 05:15:28.281259 7fa997983700 0 mon.0@0(leader).data_health(12158) update_stats avail 69% total 701854104 used 178150900 avail 488051008
2013-05-16 05:15:28.281263 7fa997983700 10 mon.0@0(leader).data_health(12158) share_stats
2013-05-16 05:15:28.281390 7fa997983700 10 mon.0@0(leader).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:28.281439 7fa997983700 10 mon.0@0(leader).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:28.281680 7fa997983700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:28.281793 7fa997983700 10 mon.0@0(leader).auth v8461 auth
2013-05-16 05:15:28.521041 7fa996c33700 7 mon.0@0(leader).pg v16089279 update_from_paxos applying incremental 16089280
2013-05-16 05:15:28.521376 7fa996c33700 10 mon.0@0(leader).pg v16089280 v16089280: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 2214KB/s rd, 1756KB/s wr, 421op/s
2013-05-16 05:15:28.521410 7fa996c33700 10 mon.0@0(leader).pg v16089280 map_pg_creates to 0 pgs
2013-05-16 05:15:28.521414 7fa996c33700 10 mon.0@0(leader).pg v16089280 send_pg_creates to 0 pgs
2013-05-16 05:15:28.521416 7fa996c33700 10 mon.0@0(leader).pg v16089280 update_logger
2013-05-16 05:15:28.521456 7fa996c33700 10 mon.0@0(leader).pg v16089280 create_pending v 16089281
2013-05-16 05:15:28.521480 7fa996c33700 7 mon.0@0(leader).pg v16089280 _updated_stats for osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:28.521499 7fa996c33700 7 mon.0@0(leader).pg v16089280 _updated_stats for osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:28.521513 7fa996c33700 7 mon.0@0(leader).pg v16089280 _updated_stats for osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:28.521527 7fa996c33700 7 mon.0@0(leader).pg v16089280 _updated_stats for osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:28.521539 7fa996c33700 7 mon.0@0(leader).pg v16089280 _updated_stats for osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:28.521555 7fa996c33700 7 mon.0@0(leader).pg v16089280 _updated_stats for osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:28.521572 7fa996c33700 7 mon.0@0(leader).pg v16089280 _updated_stats for osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:28.521804 7fa996c33700 10 mon.0@0(leader).pg v16089280 preprocess_query pg_stats(26 pgs tid 28061 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:28.521852 7fa996c33700 10 mon.0@0(leader).pg v16089280 prepare_update pg_stats(26 pgs tid 28061 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:28.521858 7fa996c33700 10 mon.0@0(leader).pg v16089280 prepare_pg_stats pg_stats(26 pgs tid 28061 v 3959) v1 from osd.1
2013-05-16 05:15:28.521863 7fa996c33700 10 mon.0@0(leader).pg v16089280 got osd.1 osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]) (was osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]))
2013-05-16 05:15:28.521927 7fa996c33700 10 mon.0@0(leader).pg v16089280 preprocess_query pg_stats(25 pgs tid 28180 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:28.521960 7fa996c33700 10 mon.0@0(leader).pg v16089280 prepare_update pg_stats(25 pgs tid 28180 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:28.521965 7fa996c33700 10 mon.0@0(leader).pg v16089280 prepare_pg_stats pg_stats(25 pgs tid 28180 v 3959) v1 from osd.4
2013-05-16 05:15:28.521968 7fa996c33700 10 mon.0@0(leader).pg v16089280 got osd.4 osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]) (was osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]))
2013-05-16 05:15:28.522029 7fa996c33700 10 mon.0@0(leader).pg v16089280 preprocess_query pg_stats(30 pgs tid 28070 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:28.522061 7fa996c33700 10 mon.0@0(leader).pg v16089280 prepare_update pg_stats(30 pgs tid 28070 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:28.522066 7fa996c33700 10 mon.0@0(leader).pg v16089280 prepare_pg_stats pg_stats(30 pgs tid 28070 v 3959) v1 from osd.0
2013-05-16 05:15:28.522069 7fa996c33700 10 mon.0@0(leader).pg v16089280 got osd.0 osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]) (was osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]))
2013-05-16 05:15:28.522122 7fa996c33700 10 mon.0@0(leader).pg v16089280 preprocess_query pg_stats(35 pgs tid 28085 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:28.522154 7fa996c33700 10 mon.0@0(leader).pg v16089280 prepare_update pg_stats(35 pgs tid 28085 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:28.522159 7fa996c33700 10 mon.0@0(leader).pg v16089280 prepare_pg_stats pg_stats(35 pgs tid 28085 v 3959) v1 from osd.2
2013-05-16 05:15:28.522162 7fa996c33700 10 mon.0@0(leader).pg v16089280 got osd.2 osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]) (was osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]))
2013-05-16 05:15:28.522235 7fa996c33700 10 mon.0@0(leader).pg v16089280 preprocess_query pg_stats(35 pgs tid 28177 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:28.522265 7fa996c33700 10 mon.0@0(leader).pg v16089280 prepare_update pg_stats(35 pgs tid 28177 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:28.522270 7fa996c33700 10 mon.0@0(leader).pg v16089280 prepare_pg_stats pg_stats(35 pgs tid 28177 v 3959) v1 from osd.5
2013-05-16 05:15:28.522273 7fa996c33700 10 mon.0@0(leader).pg v16089280 got osd.5 osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]) (was osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]))
2013-05-16 05:15:28.522332 7fa996c33700 10 mon.0@0(leader).pg v16089280 preprocess_query pg_stats(33 pgs tid 28217 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:28.522367 7fa996c33700 10 mon.0@0(leader).pg v16089280 prepare_update pg_stats(33 pgs tid 28217 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:28.522372 7fa996c33700 10 mon.0@0(leader).pg v16089280 prepare_pg_stats pg_stats(33 pgs tid 28217 v 3959) v1 from osd.6
2013-05-16 05:15:28.522375 7fa996c33700 10 mon.0@0(leader).pg v16089280 got osd.6 osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]) (was osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]))
2013-05-16 05:15:28.522442 7fa996c33700 10 mon.0@0(leader).pg v16089280 preprocess_query pg_stats(22 pgs tid 28151 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:28.522486 7fa996c33700 10 mon.0@0(leader).pg v16089280 prepare_update pg_stats(22 pgs tid 28151 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:28.522491 7fa996c33700 10 mon.0@0(leader).pg v16089280 prepare_pg_stats pg_stats(22 pgs tid 28151 v 3959) v1 from osd.3
2013-05-16 05:15:28.522494 7fa996c33700 10 mon.0@0(leader).pg v16089280 got osd.3 osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]) (was osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]))
2013-05-16 05:15:28.522527 7fa996c33700 10 mon.0@0(leader).pg v16089280 check_osd_map already seen 3959 >= 3959
2013-05-16 05:15:28.522530 7fa996c33700 10 mon.0@0(leader).pg v16089280 update_logger
2013-05-16 05:15:28.522540 7fa996c33700 0 log [INF] : pgmap v16089280: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 2214KB/s rd, 1756KB/s wr, 421op/s
2013-05-16 05:15:29.408309 7fa996c33700 10 mon.0@0(leader).mds e4483 preprocess_query mdsbeacon(53566/0 up:active seq 34454 v4483) v2 from mds.0 188.65.144.4:6800/10309
2013-05-16 05:15:29.408734 7fa997983700 10 mon.0@0(leader).pg v16089280 encode_pending v 16089281
2013-05-16 05:15:32.561851 7fa996c33700 10 mon.0@0(leader).log v15758975 update_from_paxos
2013-05-16 05:15:32.562054 7fa996c33700 10 mon.0@0(leader).log v15758975 update_from_paxos version 15758975 summary v 15758974
2013-05-16 05:15:32.562086 7fa996c33700 10 mon.0@0(leader).log v15758975 update_from_paxos latest full 15758974
2013-05-16 05:15:32.562198 7fa996c33700 7 mon.0@0(leader).log v15758975 update_from_paxos applying incremental log 15758975 2013-05-16 05:15:23.470978 mon.0 188.65.144.4:6789/0 13189 : [INF] pgmap v16089279: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 245KB/s rd, 747KB/s wr, 148op/s
2013-05-16 05:15:32.562241 7fa996c33700 10 mon.0@0(leader).log v15758975 check_subs
2013-05-16 05:15:32.562281 7fa996c33700 10 mon.0@0(leader).log v15758975 create_pending v 15758976
2013-05-16 05:15:32.562397 7fa996c33700 7 mon.0@0(leader).log v15758975 _updated_log for mon.0 188.65.144.4:6789/0
2013-05-16 05:15:32.562442 7fa996c33700 10 mon.0@0(leader).log v15758975 update_from_paxos
2013-05-16 05:15:32.562462 7fa996c33700 10 mon.0@0(leader).log v15758975 update_from_paxos version 15758975 summary v 15758975
2013-05-16 05:15:32.562478 7fa996c33700 10 mon.0@0(leader).log v15758975 preprocess_query log(1 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:32.562493 7fa996c33700 10 mon.0@0(leader).log v15758975 preprocess_log log(1 entries) v1 from mon.0
2013-05-16 05:15:32.562514 7fa996c33700 10 mon.0@0(leader).log v15758975 nothing new
2013-05-16 05:15:36.113594 7fa996c33700 10 mon.0@0(leader).mds e4483 preprocess_query mdsbeacon(53566/0 up:active seq 34455 v4483) v2 from mds.0 188.65.144.4:6800/10309
2013-05-16 05:15:36.113712 7fa997983700 10 mon.0@0(leader).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:36.113750 7fa997983700 10 mon.0@0(leader).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:36.113792 7fa997983700 10 mon.0@0(leader).log v15758975 update_from_paxos
2013-05-16 05:15:36.113823 7fa997983700 10 mon.0@0(leader).log v15758975 update_from_paxos version 15758975 summary v 15758975
2013-05-16 05:15:36.113840 7fa997983700 10 mon.0@0(leader).log v15758975 log
2013-05-16 05:15:36.113870 7fa997983700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:36.113900 7fa997983700 10 mon.0@0(leader).auth v8461 auth
2013-05-16 05:15:36.113967 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.6 188.65.144.10:6800/23197 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:36.113974 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x60d4780 from 188.65.144.8:6789/0
2013-05-16 05:15:36.114160 7fa996c33700 10 mon.0@0(leader).mds e4483 preprocess_query mdsbeacon(53566/0 up:active seq 34456 v4483) v2 from mds.0 188.65.144.4:6800/10309
2013-05-16 05:15:36.114293 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.3 188.65.144.7:6801/20087 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:36.114298 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x6de8280 from 188.65.144.8:6789/0
2013-05-16 05:15:36.114324 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.6 188.65.144.10:6800/23197 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:36.114329 7fa996c33700 10 mon.0@0(leader) e8 mesg 0xbb27280 from 188.65.144.8:6789/0
2013-05-16 05:15:37.700404 7fa996c33700 7 mon.0@0(leader).pg v16089280 update_from_paxos applying incremental 16089281
2013-05-16 05:15:37.700832 7fa996c33700 10 mon.0@0(leader).pg v16089281 v16089281: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 7640KB/s rd, 2360KB/s wr, 899op/s
2013-05-16 05:15:37.700870 7fa996c33700 10 mon.0@0(leader).pg v16089281 map_pg_creates to 0 pgs
2013-05-16 05:15:37.700873 7fa996c33700 10 mon.0@0(leader).pg v16089281 send_pg_creates to 0 pgs
2013-05-16 05:15:37.700874 7fa996c33700 10 mon.0@0(leader).pg v16089281 update_logger
2013-05-16 05:15:37.701118 7fa996c33700 10 mon.0@0(leader).pg v16089281 create_pending v 16089282
2013-05-16 05:15:37.701139 7fa996c33700 7 mon.0@0(leader).pg v16089281 _updated_stats for osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:37.701165 7fa996c33700 7 mon.0@0(leader).pg v16089281 _updated_stats for osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:37.701186 7fa996c33700 7 mon.0@0(leader).pg v16089281 _updated_stats for osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:37.701207 7fa996c33700 7 mon.0@0(leader).pg v16089281 _updated_stats for osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:37.701243 7fa996c33700 7 mon.0@0(leader).pg v16089281 _updated_stats for osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:37.701271 7fa996c33700 7 mon.0@0(leader).pg v16089281 _updated_stats for osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:37.701299 7fa996c33700 7 mon.0@0(leader).pg v16089281 _updated_stats for osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:37.701523 7fa996c33700 10 mon.0@0(leader).pg v16089281 preprocess_query pg_stats(29 pgs tid 28062 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:37.701573 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_update pg_stats(29 pgs tid 28062 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:37.701579 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_pg_stats pg_stats(29 pgs tid 28062 v 3959) v1 from osd.1
2013-05-16 05:15:37.701584 7fa996c33700 10 mon.0@0(leader).pg v16089281 got osd.1 osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]) (was osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]))
2013-05-16 05:15:37.701652 7fa996c33700 10 mon.0@0(leader).pg v16089281 preprocess_query pg_stats(35 pgs tid 28086 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:37.701686 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_update pg_stats(35 pgs tid 28086 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:37.701691 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_pg_stats pg_stats(35 pgs tid 28086 v 3959) v1 from osd.2
2013-05-16 05:15:37.701694 7fa996c33700 10 mon.0@0(leader).pg v16089281 got osd.2 osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]) (was osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]))
2013-05-16 05:15:37.701758 7fa996c33700 10 mon.0@0(leader).pg v16089281 preprocess_query pg_stats(31 pgs tid 28071 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:37.701791 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_update pg_stats(31 pgs tid 28071 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:37.701796 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_pg_stats pg_stats(31 pgs tid 28071 v 3959) v1 from osd.0
2013-05-16 05:15:37.701799 7fa996c33700 10 mon.0@0(leader).pg v16089281 got osd.0 osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]) (was osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]))
2013-05-16 05:15:37.701864 7fa996c33700 10 mon.0@0(leader).pg v16089281 preprocess_query pg_stats(32 pgs tid 28181 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:37.701899 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_update pg_stats(32 pgs tid 28181 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:37.701904 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_pg_stats pg_stats(32 pgs tid 28181 v 3959) v1 from osd.4
2013-05-16 05:15:37.701907 7fa996c33700 10 mon.0@0(leader).pg v16089281 got osd.4 osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]) (was osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]))
2013-05-16 05:15:37.701973 7fa996c33700 10 mon.0@0(leader).pg v16089281 preprocess_query pg_stats(36 pgs tid 28178 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:37.702005 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_update pg_stats(36 pgs tid 28178 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:37.702010 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_pg_stats pg_stats(36 pgs tid 28178 v 3959) v1 from osd.5
2013-05-16 05:15:37.702013 7fa996c33700 10 mon.0@0(leader).pg v16089281 got osd.5 osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]) (was osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]))
2013-05-16 05:15:37.702081 7fa996c33700 10 mon.0@0(leader).pg v16089281 preprocess_query pg_stats(34 pgs tid 28218 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:37.702114 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_update pg_stats(34 pgs tid 28218 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:37.702118 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_pg_stats pg_stats(34 pgs tid 28218 v 3959) v1 from osd.6
2013-05-16 05:15:37.702121 7fa996c33700 10 mon.0@0(leader).pg v16089281 got osd.6 osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]) (was osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]))
2013-05-16 05:15:37.702188 7fa996c33700 10 mon.0@0(leader).pg v16089281 preprocess_query pg_stats(33 pgs tid 28182 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:37.702236 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_update pg_stats(33 pgs tid 28182 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:37.702242 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_pg_stats pg_stats(33 pgs tid 28182 v 3959) v1 from osd.4
2013-05-16 05:15:37.702245 7fa996c33700 10 mon.0@0(leader).pg v16089281 got osd.4 osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]) (was osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]))
2013-05-16 05:15:37.702308 7fa996c33700 10 mon.0@0(leader).pg v16089281 preprocess_query pg_stats(34 pgs tid 28072 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:37.702335 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_update pg_stats(34 pgs tid 28072 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:37.702340 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_pg_stats pg_stats(34 pgs tid 28072 v 3959) v1 from osd.0
2013-05-16 05:15:37.702342 7fa996c33700 10 mon.0@0(leader).pg v16089281 got osd.0 osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]) (was osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]))
2013-05-16 05:15:37.702408 7fa996c33700 10 mon.0@0(leader).pg v16089281 preprocess_query pg_stats(32 pgs tid 28063 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:37.702443 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_update pg_stats(32 pgs tid 28063 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:37.702447 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_pg_stats pg_stats(32 pgs tid 28063 v 3959) v1 from osd.1
2013-05-16 05:15:37.702451 7fa996c33700 10 mon.0@0(leader).pg v16089281 got osd.1 osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]) (was osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]))
2013-05-16 05:15:37.702511 7fa996c33700 10 mon.0@0(leader).pg v16089281 preprocess_query pg_stats(35 pgs tid 28087 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:37.702549 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_update pg_stats(35 pgs tid 28087 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:37.702554 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_pg_stats pg_stats(35 pgs tid 28087 v 3959) v1 from osd.2
2013-05-16 05:15:37.702557 7fa996c33700 10 mon.0@0(leader).pg v16089281 got osd.2 osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]) (was osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]))
2013-05-16 05:15:37.702615 7fa996c33700 10 mon.0@0(leader).pg v16089281 preprocess_query pg_stats(38 pgs tid 28179 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:37.702644 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_update pg_stats(38 pgs tid 28179 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:37.702649 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_pg_stats pg_stats(38 pgs tid 28179 v 3959) v1 from osd.5
2013-05-16 05:15:37.702652 7fa996c33700 10 mon.0@0(leader).pg v16089281 got osd.5 osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]) (was osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]))
2013-05-16 05:15:37.702742 7fa996c33700 10 mon.0@0(leader).pg v16089281 preprocess_query pg_stats(19 pgs tid 28152 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:37.702774 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_update pg_stats(19 pgs tid 28152 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:37.702779 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_pg_stats pg_stats(19 pgs tid 28152 v 3959) v1 from osd.3
2013-05-16 05:15:37.702783 7fa996c33700 10 mon.0@0(leader).pg v16089281 got osd.3 osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]) (was osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]))
2013-05-16 05:15:37.702840 7fa996c33700 10 mon.0@0(leader).pg v16089281 preprocess_query pg_stats(38 pgs tid 28219 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:37.702874 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_update pg_stats(38 pgs tid 28219 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:37.702879 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_pg_stats pg_stats(38 pgs tid 28219 v 3959) v1 from osd.6
2013-05-16 05:15:37.702881 7fa996c33700 10 mon.0@0(leader).pg v16089281 got osd.6 osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]) (was osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]))
2013-05-16 05:15:37.702926 7fa996c33700 10 mon.0@0(leader).pg v16089281 check_osd_map already seen 3959 >= 3959
2013-05-16 05:15:37.702930 7fa996c33700 10 mon.0@0(leader).pg v16089281 update_logger
2013-05-16 05:15:37.702940 7fa996c33700 0 log [INF] : pgmap v16089281: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 7640KB/s rd, 2360KB/s wr, 899op/s
2013-05-16 05:15:37.703169 7fa996c33700 10 mon.0@0(leader).log v15758975 update_from_paxos
2013-05-16 05:15:37.703277 7fa996c33700 10 mon.0@0(leader).log v15758975 update_from_paxos version 15758975 summary v 15758975
2013-05-16 05:15:37.703303 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:37.703545 7fa996c33700 10 mon.0@0(leader).log v15758975 update_from_paxos
2013-05-16 05:15:37.703572 7fa996c33700 10 mon.0@0(leader).log v15758975 update_from_paxos version 15758975 summary v 15758975
2013-05-16 05:15:37.703656 7fa996c33700 10 mon.0@0(leader).log v15758975 preprocess_query log(2 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:37.703744 7fa996c33700 10 mon.0@0(leader).log v15758975 preprocess_log log(2 entries) v1 from mon.0
2013-05-16 05:15:37.703781 7fa996c33700 10 mon.0@0(leader).log v15758975 prepare_update log(2 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:37.703796 7fa996c33700 10 mon.0@0(leader).log v15758975 prepare_log log(2 entries) v1 from mon.0
2013-05-16 05:15:37.703996 7fa996c33700 10 mon.0@0(leader).log v15758975 logging 2013-05-16 05:15:28.522542 mon.0 188.65.144.4:6789/0 13190 : [INF] pgmap v16089280: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 2214KB/s rd, 1756KB/s wr, 421op/s
2013-05-16 05:15:37.704096 7fa996c33700 10 mon.0@0(leader).log v15758975 logging 2013-05-16 05:15:37.702942 mon.0 188.65.144.4:6789/0 13191 : [INF] pgmap v16089281: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 7640KB/s rd, 2360KB/s wr, 899op/s
2013-05-16 05:15:37.706562 7fa996c33700 10 mon.0@0(leader).log v15758975 update_from_paxos
2013-05-16 05:15:37.706590 7fa996c33700 10 mon.0@0(leader).log v15758975 update_from_paxos version 15758975 summary v 15758975
2013-05-16 05:15:37.706747 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:37.708403 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.3 188.65.144.7:6801/20087 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:37.708413 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x61ee500 from 188.65.144.8:6789/0
2013-05-16 05:15:37.708453 7fa996c33700 10 mon.0@0(leader).pg v16089281 preprocess_query pg_stats(23 pgs tid 28153 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:37.708513 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_update pg_stats(23 pgs tid 28153 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:37.708519 7fa996c33700 10 mon.0@0(leader).pg v16089281 prepare_pg_stats pg_stats(23 pgs tid 28153 v 3959) v1 from osd.3
2013-05-16 05:15:37.708523 7fa996c33700 10 mon.0@0(leader).pg v16089281 got osd.3 osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]) (was osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]))
2013-05-16 05:15:37.751903 7fa997983700 10 mon.0@0(leader).pg v16089281 encode_pending v 16089282
2013-05-16 05:15:38.923561 7fa997983700 10 mon.0@0(leader).log v15758975 encode_full log v 15758975
2013-05-16 05:15:38.923672 7fa997983700 10 mon.0@0(leader).log v15758975 encode_pending v15758976
2013-05-16 05:15:38.923727 7fa996c33700 10 mon.0@0(leader).data_health(12158) service_dispatch mon_health( service 1 op tell e 0 r 0 flags ) v1
2013-05-16 05:15:38.923733 7fa996c33700 10 mon.0@0(leader).data_health(12158) handle_tell mon_health( service 1 op tell e 0 r 0 flags ) v1
2013-05-16 05:15:40.068868 7fa996c33700 7 mon.0@0(leader).pg v16089281 update_from_paxos applying incremental 16089282
2013-05-16 05:15:40.069369 7fa996c33700 10 mon.0@0(leader).pg v16089282 v16089282: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 10453KB/s rd, 3755KB/s wr, 1320op/s
2013-05-16 05:15:40.069413 7fa996c33700 10 mon.0@0(leader).pg v16089282 map_pg_creates to 0 pgs
2013-05-16 05:15:40.069417 7fa996c33700 10 mon.0@0(leader).pg v16089282 send_pg_creates to 0 pgs
2013-05-16 05:15:40.069420 7fa996c33700 10 mon.0@0(leader).pg v16089282 update_logger
2013-05-16 05:15:40.069482 7fa996c33700 10 mon.0@0(leader).pg v16089282 create_pending v 16089283
2013-05-16 05:15:40.069509 7fa996c33700 7 mon.0@0(leader).pg v16089282 _updated_stats for osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:40.069535 7fa996c33700 7 mon.0@0(leader).pg v16089282 _updated_stats for osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:40.069567 7fa996c33700 7 mon.0@0(leader).pg v16089282 _updated_stats for osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:40.069590 7fa996c33700 7 mon.0@0(leader).pg v16089282 _updated_stats for osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:40.069613 7fa996c33700 7 mon.0@0(leader).pg v16089282 _updated_stats for osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:40.069636 7fa996c33700 7 mon.0@0(leader).pg v16089282 _updated_stats for osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:40.069665 7fa996c33700 7 mon.0@0(leader).pg v16089282 _updated_stats for osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:40.069703 7fa996c33700 7 mon.0@0(leader).pg v16089282 _updated_stats for osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:40.069739 7fa996c33700 7 mon.0@0(leader).pg v16089282 _updated_stats for osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:40.069766 7fa996c33700 7 mon.0@0(leader).pg v16089282 _updated_stats for osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:40.069791 7fa996c33700 7 mon.0@0(leader).pg v16089282 _updated_stats for osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:40.069819 7fa996c33700 7 mon.0@0(leader).pg v16089282 _updated_stats for osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:40.069840 7fa996c33700 7 mon.0@0(leader).pg v16089282 _updated_stats for osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:40.069875 7fa996c33700 7 mon.0@0(leader).pg v16089282 _updated_stats for osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:40.070004 7fa996c33700 10 mon.0@0(leader).pg v16089282 preprocess_query pg_stats(31 pgs tid 28064 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:40.070061 7fa996c33700 10 mon.0@0(leader).pg v16089282 prepare_update pg_stats(31 pgs tid 28064 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:40.070068 7fa996c33700 10 mon.0@0(leader).pg v16089282 prepare_pg_stats pg_stats(31 pgs tid 28064 v 3959) v1 from osd.1
2013-05-16 05:15:40.070074 7fa996c33700 10 mon.0@0(leader).pg v16089282 got osd.1 osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]) (was osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]))
2013-05-16 05:15:40.070132 7fa996c33700 10 mon.0@0(leader).pg v16089282 check_osd_map already seen 3959 >= 3959
2013-05-16 05:15:40.070136 7fa996c33700 10 mon.0@0(leader).pg v16089282 update_logger
2013-05-16 05:15:40.070151 7fa996c33700 0 log [INF] : pgmap v16089282: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 10453KB/s rd, 3755KB/s wr, 1320op/s
2013-05-16 05:15:41.136630 7fa997983700 10 mon.0@0(leader).pg v16089282 encode_full pgmap v 16089282
2013-05-16 05:15:41.138160 7fa997983700 10 mon.0@0(leader).pg v16089282 encode_pending v 16089283
2013-05-16 05:15:41.138348 7fa997983700 10 mon.0@0(leader).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:41.138387 7fa997983700 10 mon.0@0(leader).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:41.138438 7fa997983700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:41.138485 7fa997983700 10 mon.0@0(leader).auth v8461 auth
2013-05-16 05:15:41.138632 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.6 188.65.144.10:6800/23197 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:41.138639 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x6861780 from 188.65.144.8:6789/0
2013-05-16 05:15:41.138720 7fa996c33700 10 mon.0@0(leader).mds e4483 preprocess_query mdsbeacon(53566/0 up:active seq 34457 v4483) v2 from mds.0 188.65.144.4:6800/10309
2013-05-16 05:15:41.873589 7fa996c33700 10 mon.0@0(leader).log v15758976 update_from_paxos
2013-05-16 05:15:41.873670 7fa996c33700 10 mon.0@0(leader).log v15758976 update_from_paxos version 15758976 summary v 15758975
2013-05-16 05:15:41.873722 7fa996c33700 10 mon.0@0(leader).log v15758976 update_from_paxos latest full 15758975
2013-05-16 05:15:41.873759 7fa996c33700 7 mon.0@0(leader).log v15758976 update_from_paxos applying incremental log 15758976 2013-05-16 05:15:28.522542 mon.0 188.65.144.4:6789/0 13190 : [INF] pgmap v16089280: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 2214KB/s rd, 1756KB/s wr, 421op/s
2013-05-16 05:15:41.873790 7fa996c33700 7 mon.0@0(leader).log v15758976 update_from_paxos applying incremental log 15758976 2013-05-16 05:15:37.702942 mon.0 188.65.144.4:6789/0 13191 : [INF] pgmap v16089281: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 7640KB/s rd, 2360KB/s wr, 899op/s
2013-05-16 05:15:41.873827 7fa996c33700 10 mon.0@0(leader).log v15758976 check_subs
2013-05-16 05:15:41.873868 7fa996c33700 10 mon.0@0(leader).log v15758976 create_pending v 15758977
2013-05-16 05:15:41.873899 7fa996c33700 7 mon.0@0(leader).log v15758976 _updated_log for mon.0 188.65.144.4:6789/0
2013-05-16 05:15:41.873955 7fa996c33700 10 mon.0@0(leader).log v15758976 update_from_paxos
2013-05-16 05:15:41.873975 7fa996c33700 10 mon.0@0(leader).log v15758976 update_from_paxos version 15758976 summary v 15758976
2013-05-16 05:15:41.873987 7fa996c33700 10 mon.0@0(leader).log v15758976 preprocess_query log(1 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:41.874001 7fa996c33700 10 mon.0@0(leader).log v15758976 preprocess_log log(1 entries) v1 from mon.0
2013-05-16 05:15:41.874026 7fa996c33700 10 mon.0@0(leader).log v15758976 nothing new
2013-05-16 05:15:45.606588 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.3 188.65.144.7:6801/20087 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:45.606598 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x2695500 from 188.65.144.8:6789/0
2013-05-16 05:15:45.606714 7fa996c33700 10 mon.0@0(leader).mds e4483 preprocess_query mdsbeacon(53566/0 up:active seq 34458 v4483) v2 from mds.0 188.65.144.4:6800/10309
2013-05-16 05:15:47.335805 7fa997983700 10 mon.0@0(leader).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:47.335840 7fa997983700 10 mon.0@0(leader).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:47.336049 7fa997983700 10 mon.0@0(leader).log v15758976 update_from_paxos
2013-05-16 05:15:47.336228 7fa997983700 10 mon.0@0(leader).log v15758976 update_from_paxos version 15758976 summary v 15758976
2013-05-16 05:15:47.336245 7fa997983700 10 mon.0@0(leader).log v15758976 log
2013-05-16 05:15:47.336346 7fa997983700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:47.336379 7fa997983700 10 mon.0@0(leader).auth v8461 auth
2013-05-16 05:15:47.336598 7fa996c33700 7 mon.0@0(leader).pg v16089282 update_from_paxos applying incremental 16089283
2013-05-16 05:15:47.336701 7fa996c33700 10 mon.0@0(leader).pg v16089283 v16089283: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 7797KB/s rd, 4129KB/s wr, 1105op/s
2013-05-16 05:15:47.336731 7fa996c33700 10 mon.0@0(leader).pg v16089283 map_pg_creates to 0 pgs
2013-05-16 05:15:47.336736 7fa996c33700 10 mon.0@0(leader).pg v16089283 send_pg_creates to 0 pgs
2013-05-16 05:15:47.336739 7fa996c33700 10 mon.0@0(leader).pg v16089283 update_logger
2013-05-16 05:15:47.336761 7fa996c33700 10 mon.0@0(leader).pg v16089283 create_pending v 16089284
2013-05-16 05:15:47.336773 7fa996c33700 7 mon.0@0(leader).pg v16089283 _updated_stats for osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:47.336841 7fa996c33700 10 mon.0@0(leader).pg v16089283 preprocess_query pg_stats(39 pgs tid 28088 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:47.337219 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_update pg_stats(39 pgs tid 28088 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:47.337234 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_pg_stats pg_stats(39 pgs tid 28088 v 3959) v1 from osd.2
2013-05-16 05:15:47.337241 7fa996c33700 10 mon.0@0(leader).pg v16089283 got osd.2 osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]) (was osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]))
2013-05-16 05:15:47.337331 7fa996c33700 10 mon.0@0(leader).pg v16089283 preprocess_query pg_stats(25 pgs tid 28180 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:47.337403 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_update pg_stats(25 pgs tid 28180 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:47.337413 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_pg_stats pg_stats(25 pgs tid 28180 v 3959) v1 from osd.5
2013-05-16 05:15:47.337418 7fa996c33700 10 mon.0@0(leader).pg v16089283 got osd.5 osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]) (was osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]))
2013-05-16 05:15:47.337677 7fa996c33700 10 mon.0@0(leader).pg v16089283 preprocess_query pg_stats(29 pgs tid 28220 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:47.337717 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_update pg_stats(29 pgs tid 28220 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:47.337722 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_pg_stats pg_stats(29 pgs tid 28220 v 3959) v1 from osd.6
2013-05-16 05:15:47.337726 7fa996c33700 10 mon.0@0(leader).pg v16089283 got osd.6 osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]) (was osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]))
2013-05-16 05:15:47.337799 7fa996c33700 10 mon.0@0(leader).pg v16089283 preprocess_query pg_stats(24 pgs tid 28183 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:47.337856 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_update pg_stats(24 pgs tid 28183 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:47.337863 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_pg_stats pg_stats(24 pgs tid 28183 v 3959) v1 from osd.4
2013-05-16 05:15:47.337866 7fa996c33700 10 mon.0@0(leader).pg v16089283 got osd.4 osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]) (was osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]))
2013-05-16 05:15:47.337945 7fa996c33700 10 mon.0@0(leader).pg v16089283 preprocess_query pg_stats(36 pgs tid 28073 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:47.337993 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_update pg_stats(36 pgs tid 28073 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:47.338001 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_pg_stats pg_stats(36 pgs tid 28073 v 3959) v1 from osd.0
2013-05-16 05:15:47.338006 7fa996c33700 10 mon.0@0(leader).pg v16089283 got osd.0 osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]) (was osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]))
2013-05-16 05:15:47.338091 7fa996c33700 10 mon.0@0(leader).pg v16089283 preprocess_query pg_stats(18 pgs tid 28154 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:47.338137 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_update pg_stats(18 pgs tid 28154 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:47.338144 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_pg_stats pg_stats(18 pgs tid 28154 v 3959) v1 from osd.3
2013-05-16 05:15:47.338148 7fa996c33700 10 mon.0@0(leader).pg v16089283 got osd.3 osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]) (was osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]))
2013-05-16 05:15:47.338207 7fa996c33700 10 mon.0@0(leader).pg v16089283 preprocess_query pg_stats(28 pgs tid 28184 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:47.338253 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_update pg_stats(28 pgs tid 28184 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:47.338261 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_pg_stats pg_stats(28 pgs tid 28184 v 3959) v1 from osd.4
2013-05-16 05:15:47.338264 7fa996c33700 10 mon.0@0(leader).pg v16089283 got osd.4 osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]) (was osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]))
2013-05-16 05:15:47.338329 7fa996c33700 10 mon.0@0(leader).pg v16089283 preprocess_query pg_stats(30 pgs tid 28074 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:47.338377 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_update pg_stats(30 pgs tid 28074 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:47.338383 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_pg_stats pg_stats(30 pgs tid 28074 v 3959) v1 from osd.0
2013-05-16 05:15:47.338386 7fa996c33700 10 mon.0@0(leader).pg v16089283 got osd.0 osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]) (was osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]))
2013-05-16 05:15:47.338451 7fa996c33700 10 mon.0@0(leader).pg v16089283 preprocess_query pg_stats(24 pgs tid 28065 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:47.338486 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_update pg_stats(24 pgs tid 28065 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:47.338501 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_pg_stats pg_stats(24 pgs tid 28065 v 3959) v1 from osd.1
2013-05-16 05:15:47.338504 7fa996c33700 10 mon.0@0(leader).pg v16089283 got osd.1 osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]) (was osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]))
2013-05-16 05:15:47.338570 7fa996c33700 10 mon.0@0(leader).pg v16089283 preprocess_query pg_stats(33 pgs tid 28089 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:47.338617 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_update pg_stats(33 pgs tid 28089 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:47.338625 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_pg_stats pg_stats(33 pgs tid 28089 v 3959) v1 from osd.2
2013-05-16 05:15:47.338630 7fa996c33700 10 mon.0@0(leader).pg v16089283 got osd.2 osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]) (was osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]))
2013-05-16 05:15:47.338701 7fa996c33700 10 mon.0@0(leader).pg v16089283 check_osd_map already seen 3959 >= 3959
2013-05-16 05:15:47.338706 7fa996c33700 10 mon.0@0(leader).pg v16089283 update_logger
2013-05-16 05:15:47.338720 7fa996c33700 0 log [INF] : pgmap v16089283: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 7797KB/s rd, 4129KB/s wr, 1105op/s
2013-05-16 05:15:47.338817 7fa996c33700 10 mon.0@0(leader).log v15758976 update_from_paxos
2013-05-16 05:15:47.338850 7fa996c33700 10 mon.0@0(leader).log v15758976 update_from_paxos version 15758976 summary v 15758976
2013-05-16 05:15:47.338886 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:47.338999 7fa996c33700 10 mon.0@0(leader).log v15758976 update_from_paxos
2013-05-16 05:15:47.339118 7fa996c33700 10 mon.0@0(leader).log v15758976 update_from_paxos version 15758976 summary v 15758976
2013-05-16 05:15:47.339144 7fa996c33700 10 mon.0@0(leader).log v15758976 preprocess_query log(2 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:47.339162 7fa996c33700 10 mon.0@0(leader).log v15758976 preprocess_log log(2 entries) v1 from mon.0
2013-05-16 05:15:47.339350 7fa996c33700 10 mon.0@0(leader).log v15758976 prepare_update log(2 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:47.339379 7fa996c33700 10 mon.0@0(leader).log v15758976 prepare_log log(2 entries) v1 from mon.0
2013-05-16 05:15:47.339414 7fa996c33700 10 mon.0@0(leader).log v15758976 logging 2013-05-16 05:15:40.070153 mon.0 188.65.144.4:6789/0 13192 : [INF] pgmap v16089282: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 10453KB/s rd, 3755KB/s wr, 1320op/s
2013-05-16 05:15:47.339469 7fa996c33700 10 mon.0@0(leader).log v15758976 logging 2013-05-16 05:15:47.338721 mon.0 188.65.144.4:6789/0 13193 : [INF] pgmap v16089283: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 7797KB/s rd, 4129KB/s wr, 1105op/s
2013-05-16 05:15:47.339587 7fa996c33700 10 mon.0@0(leader).pg v16089283 preprocess_query pg_stats(27 pgs tid 28181 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:47.339632 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_update pg_stats(27 pgs tid 28181 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:47.339639 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_pg_stats pg_stats(27 pgs tid 28181 v 3959) v1 from osd.5
2013-05-16 05:15:47.339644 7fa996c33700 10 mon.0@0(leader).pg v16089283 got osd.5 osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]) (was osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]))
2013-05-16 05:15:47.350307 7fa996c33700 10 mon.0@0(leader).log v15758976 update_from_paxos
2013-05-16 05:15:47.350352 7fa996c33700 10 mon.0@0(leader).log v15758976 update_from_paxos version 15758976 summary v 15758976
2013-05-16 05:15:47.350389 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:47.350825 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.6 188.65.144.10:6800/23197 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:47.350836 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x61eea00 from 188.65.144.8:6789/0
2013-05-16 05:15:47.351076 7fa996c33700 10 mon.0@0(leader).pg v16089283 preprocess_query pg_stats(32 pgs tid 28221 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:47.351130 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_update pg_stats(32 pgs tid 28221 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:47.351138 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_pg_stats pg_stats(32 pgs tid 28221 v 3959) v1 from osd.6
2013-05-16 05:15:47.351142 7fa996c33700 10 mon.0@0(leader).pg v16089283 got osd.6 osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]) (was osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]))
2013-05-16 05:15:47.351205 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.3 188.65.144.7:6801/20087 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:47.351211 7fa996c33700 10 mon.0@0(leader) e8 mesg 0xc29ba00 from 188.65.144.8:6789/0
2013-05-16 05:15:47.351238 7fa996c33700 10 mon.0@0(leader).pg v16089283 preprocess_query pg_stats(21 pgs tid 28155 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:47.351291 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_update pg_stats(21 pgs tid 28155 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:47.351298 7fa996c33700 10 mon.0@0(leader).pg v16089283 prepare_pg_stats pg_stats(21 pgs tid 28155 v 3959) v1 from osd.3
2013-05-16 05:15:47.351301 7fa996c33700 10 mon.0@0(leader).pg v16089283 got osd.3 osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]) (was osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]))
2013-05-16 05:15:47.387693 7fa997983700 10 mon.0@0(leader).pg v16089283 encode_pending v 16089284
2013-05-16 05:15:48.868753 7fa997983700 10 mon.0@0(leader).log v15758976 encode_full log v 15758976
2013-05-16 05:15:48.869194 7fa997983700 10 mon.0@0(leader).log v15758976 encode_pending v15758977
2013-05-16 05:15:48.869833 7fa996c33700 10 mon.0@0(leader).mds e4483 preprocess_query mdsbeacon(53566/0 up:active seq 34459 v4483) v2 from mds.0 188.65.144.4:6800/10309
2013-05-16 05:15:51.352237 7fa996c33700 7 mon.0@0(leader).pg v16089283 update_from_paxos applying incremental 16089284
2013-05-16 05:15:51.352680 7fa996c33700 10 mon.0@0(leader).pg v16089284 v16089284: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 4986KB/s rd, 4777KB/s wr, 654op/s
2013-05-16 05:15:51.352796 7fa996c33700 10 mon.0@0(leader).pg v16089284 map_pg_creates to 0 pgs
2013-05-16 05:15:51.352801 7fa996c33700 10 mon.0@0(leader).pg v16089284 send_pg_creates to 0 pgs
2013-05-16 05:15:51.352802 7fa996c33700 10 mon.0@0(leader).pg v16089284 update_logger
2013-05-16 05:15:51.352851 7fa996c33700 10 mon.0@0(leader).pg v16089284 create_pending v 16089285
2013-05-16 05:15:51.352862 7fa996c33700 7 mon.0@0(leader).pg v16089284 _updated_stats for osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:51.352890 7fa996c33700 7 mon.0@0(leader).pg v16089284 _updated_stats for osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:51.352906 7fa996c33700 7 mon.0@0(leader).pg v16089284 _updated_stats for osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:51.352922 7fa996c33700 7 mon.0@0(leader).pg v16089284 _updated_stats for osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:51.352959 7fa996c33700 7 mon.0@0(leader).pg v16089284 _updated_stats for osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:51.352979 7fa996c33700 7 mon.0@0(leader).pg v16089284 _updated_stats for osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:51.352999 7fa996c33700 7 mon.0@0(leader).pg v16089284 _updated_stats for osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:51.353023 7fa996c33700 7 mon.0@0(leader).pg v16089284 _updated_stats for osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:51.353042 7fa996c33700 7 mon.0@0(leader).pg v16089284 _updated_stats for osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:51.353098 7fa996c33700 7 mon.0@0(leader).pg v16089284 _updated_stats for osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:51.353116 7fa996c33700 7 mon.0@0(leader).pg v16089284 _updated_stats for osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:51.353137 7fa996c33700 7 mon.0@0(leader).pg v16089284 _updated_stats for osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:51.353164 7fa996c33700 7 mon.0@0(leader).pg v16089284 _updated_stats for osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:51.353420 7fa996c33700 10 mon.0@0(leader).pg v16089284 preprocess_query pg_stats(24 pgs tid 28066 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:51.353469 7fa996c33700 10 mon.0@0(leader).pg v16089284 prepare_update pg_stats(24 pgs tid 28066 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:51.353478 7fa996c33700 10 mon.0@0(leader).pg v16089284 prepare_pg_stats pg_stats(24 pgs tid 28066 v 3959) v1 from osd.1
2013-05-16 05:15:51.353482 7fa996c33700 10 mon.0@0(leader).pg v16089284 got osd.1 osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]) (was osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]))
2013-05-16 05:15:51.353625 7fa996c33700 10 mon.0@0(leader).pg v16089284 preprocess_query pg_stats(37 pgs tid 28090 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:51.353664 7fa996c33700 10 mon.0@0(leader).pg v16089284 prepare_update pg_stats(37 pgs tid 28090 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:51.353669 7fa996c33700 10 mon.0@0(leader).pg v16089284 prepare_pg_stats pg_stats(37 pgs tid 28090 v 3959) v1 from osd.2
2013-05-16 05:15:51.353672 7fa996c33700 10 mon.0@0(leader).pg v16089284 got osd.2 osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]) (was osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]))
2013-05-16 05:15:51.353745 7fa996c33700 10 mon.0@0(leader).pg v16089284 preprocess_query pg_stats(30 pgs tid 28185 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:51.353775 7fa996c33700 10 mon.0@0(leader).pg v16089284 prepare_update pg_stats(30 pgs tid 28185 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:51.353780 7fa996c33700 10 mon.0@0(leader).pg v16089284 prepare_pg_stats pg_stats(30 pgs tid 28185 v 3959) v1 from osd.4
2013-05-16 05:15:51.353783 7fa996c33700 10 mon.0@0(leader).pg v16089284 got osd.4 osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]) (was osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]))
2013-05-16 05:15:51.353846 7fa996c33700 10 mon.0@0(leader).pg v16089284 preprocess_query pg_stats(34 pgs tid 28075 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:51.353882 7fa996c33700 10 mon.0@0(leader).pg v16089284 prepare_update pg_stats(34 pgs tid 28075 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:51.353886 7fa996c33700 10 mon.0@0(leader).pg v16089284 prepare_pg_stats pg_stats(34 pgs tid 28075 v 3959) v1 from osd.0
2013-05-16 05:15:51.353889 7fa996c33700 10 mon.0@0(leader).pg v16089284 got osd.0 osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]) (was osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]))
2013-05-16 05:15:51.353953 7fa996c33700 10 mon.0@0(leader).pg v16089284 preprocess_query pg_stats(31 pgs tid 28182 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:51.353988 7fa996c33700 10 mon.0@0(leader).pg v16089284 prepare_update pg_stats(31 pgs tid 28182 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:51.353993 7fa996c33700 10 mon.0@0(leader).pg v16089284 prepare_pg_stats pg_stats(31 pgs tid 28182 v 3959) v1 from osd.5
2013-05-16 05:15:51.353996 7fa996c33700 10 mon.0@0(leader).pg v16089284 got osd.5 osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]) (was osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]))
2013-05-16 05:15:51.354035 7fa996c33700 10 mon.0@0(leader).pg v16089284 check_osd_map already seen 3959 >= 3959
2013-05-16 05:15:51.354038 7fa996c33700 10 mon.0@0(leader).pg v16089284 update_logger
2013-05-16 05:15:51.354048 7fa996c33700 0 log [INF] : pgmap v16089284: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 4986KB/s rd, 4777KB/s wr, 654op/s
2013-05-16 05:15:52.297012 7fa997983700 10 mon.0@0(leader).pg v16089284 encode_pending v 16089285
2013-05-16 05:15:52.297831 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.6 188.65.144.10:6800/23197 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:52.297838 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x61ee500 from 188.65.144.8:6789/0
2013-05-16 05:15:52.297990 7fa996c33700 10 mon.0@0(leader).mds e4483 preprocess_query mdsbeacon(53566/0 up:active seq 34460 v4483) v2 from mds.0 188.65.144.4:6800/10309
2013-05-16 05:15:52.298016 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.3 188.65.144.7:6801/20087 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:52.298021 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x629c000 from 188.65.144.8:6789/0
2013-05-16 05:15:52.336541 7fa997983700 10 mon.0@0(leader).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:52.336705 7fa997983700 10 mon.0@0(leader).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:52.336856 7fa997983700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:52.336974 7fa997983700 10 mon.0@0(leader).auth v8461 auth
2013-05-16 05:15:53.509549 7fa996c33700 10 mon.0@0(leader).data_health(12158) service_dispatch mon_health( service 1 op tell e 0 r 0 flags ) v1
2013-05-16 05:15:53.509556 7fa996c33700 10 mon.0@0(leader).data_health(12158) handle_tell mon_health( service 1 op tell e 0 r 0 flags ) v1
2013-05-16 05:15:53.842544 7fa996c33700 10 mon.0@0(leader).log v15758977 update_from_paxos
2013-05-16 05:15:53.842611 7fa996c33700 10 mon.0@0(leader).log v15758977 update_from_paxos version 15758977 summary v 15758976
2013-05-16 05:15:53.842657 7fa996c33700 10 mon.0@0(leader).log v15758977 update_from_paxos latest full 15758976
2013-05-16 05:15:53.842704 7fa996c33700 7 mon.0@0(leader).log v15758977 update_from_paxos applying incremental log 15758977 2013-05-16 05:15:40.070153 mon.0 188.65.144.4:6789/0 13192 : [INF] pgmap v16089282: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 10453KB/s rd, 3755KB/s wr, 1320op/s
2013-05-16 05:15:53.842750 7fa996c33700 7 mon.0@0(leader).log v15758977 update_from_paxos applying incremental log 15758977 2013-05-16 05:15:47.338721 mon.0 188.65.144.4:6789/0 13193 : [INF] pgmap v16089283: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 7797KB/s rd, 4129KB/s wr, 1105op/s
2013-05-16 05:15:53.842794 7fa996c33700 10 mon.0@0(leader).log v15758977 check_subs
2013-05-16 05:15:53.842851 7fa996c33700 10 mon.0@0(leader).log v15758977 create_pending v 15758978
2013-05-16 05:15:53.843096 7fa996c33700 7 mon.0@0(leader).log v15758977 _updated_log for mon.0 188.65.144.4:6789/0
2013-05-16 05:15:53.843338 7fa996c33700 10 mon.0@0(leader).log v15758977 update_from_paxos
2013-05-16 05:15:53.843371 7fa996c33700 10 mon.0@0(leader).log v15758977 update_from_paxos version 15758977 summary v 15758977
2013-05-16 05:15:53.843392 7fa996c33700 10 mon.0@0(leader).log v15758977 preprocess_query log(1 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:53.843420 7fa996c33700 10 mon.0@0(leader).log v15758977 preprocess_log log(1 entries) v1 from mon.0
2013-05-16 05:15:53.843457 7fa996c33700 10 mon.0@0(leader).log v15758977 nothing new
2013-05-16 05:15:57.236434 7fa996c33700 7 mon.0@0(leader).pg v16089284 update_from_paxos applying incremental 16089285
2013-05-16 05:15:57.236998 7fa996c33700 10 mon.0@0(leader).pg v16089285 v16089285: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6541KB/s rd, 4026KB/s wr, 761op/s
2013-05-16 05:15:57.237052 7fa996c33700 10 mon.0@0(leader).pg v16089285 map_pg_creates to 0 pgs
2013-05-16 05:15:57.237059 7fa996c33700 10 mon.0@0(leader).pg v16089285 send_pg_creates to 0 pgs
2013-05-16 05:15:57.237061 7fa996c33700 10 mon.0@0(leader).pg v16089285 update_logger
2013-05-16 05:15:57.237106 7fa996c33700 10 mon.0@0(leader).pg v16089285 create_pending v 16089286
2013-05-16 05:15:57.237118 7fa996c33700 7 mon.0@0(leader).pg v16089285 _updated_stats for osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:57.237138 7fa996c33700 7 mon.0@0(leader).pg v16089285 _updated_stats for osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:57.237154 7fa996c33700 7 mon.0@0(leader).pg v16089285 _updated_stats for osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:57.237168 7fa996c33700 7 mon.0@0(leader).pg v16089285 _updated_stats for osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:57.237185 7fa996c33700 7 mon.0@0(leader).pg v16089285 _updated_stats for osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:57.237251 7fa996c33700 10 mon.0@0(leader).pg v16089285 preprocess_query pg_stats(34 pgs tid 28222 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:57.237306 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_update pg_stats(34 pgs tid 28222 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:57.237312 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_pg_stats pg_stats(34 pgs tid 28222 v 3959) v1 from osd.6
2013-05-16 05:15:57.237317 7fa996c33700 10 mon.0@0(leader).pg v16089285 got osd.6 osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]) (was osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]))
2013-05-16 05:15:57.237540 7fa996c33700 10 mon.0@0(leader).pg v16089285 preprocess_query pg_stats(29 pgs tid 28156 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:57.237580 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_update pg_stats(29 pgs tid 28156 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:57.237586 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_pg_stats pg_stats(29 pgs tid 28156 v 3959) v1 from osd.3
2013-05-16 05:15:57.237589 7fa996c33700 10 mon.0@0(leader).pg v16089285 got osd.3 osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]) (was osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]))
2013-05-16 05:15:57.237663 7fa996c33700 10 mon.0@0(leader).pg v16089285 preprocess_query pg_stats(24 pgs tid 28067 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:57.237696 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_update pg_stats(24 pgs tid 28067 v 3959) v1 from osd.1 188.65.144.5:6801/29051
2013-05-16 05:15:57.237701 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_pg_stats pg_stats(24 pgs tid 28067 v 3959) v1 from osd.1
2013-05-16 05:15:57.237704 7fa996c33700 10 mon.0@0(leader).pg v16089285 got osd.1 osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]) (was osd_stat(418 GB used, 451 GB avail, 916 GB total, peers [0,2,3,4,5,6]/[]))
2013-05-16 05:15:57.237760 7fa996c33700 10 mon.0@0(leader).pg v16089285 preprocess_query pg_stats(33 pgs tid 28091 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:57.237793 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_update pg_stats(33 pgs tid 28091 v 3959) v1 from osd.2 188.65.144.6:6803/29392
2013-05-16 05:15:57.237797 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_pg_stats pg_stats(33 pgs tid 28091 v 3959) v1 from osd.2
2013-05-16 05:15:57.237801 7fa996c33700 10 mon.0@0(leader).pg v16089285 got osd.2 osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]) (was osd_stat(459 GB used, 410 GB avail, 916 GB total, peers [0,1,3,4,5,6]/[]))
2013-05-16 05:15:57.237867 7fa996c33700 10 mon.0@0(leader).pg v16089285 preprocess_query pg_stats(26 pgs tid 28186 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:57.237897 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_update pg_stats(26 pgs tid 28186 v 3959) v1 from osd.4 188.65.144.8:6802/20221
2013-05-16 05:15:57.237902 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_pg_stats pg_stats(26 pgs tid 28186 v 3959) v1 from osd.4
2013-05-16 05:15:57.237906 7fa996c33700 10 mon.0@0(leader).pg v16089285 got osd.4 osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]) (was osd_stat(455 GB used, 442 GB avail, 897 GB total, peers [0,1,2,3,5,6]/[]))
2013-05-16 05:15:57.237965 7fa996c33700 10 mon.0@0(leader).pg v16089285 preprocess_query pg_stats(33 pgs tid 28076 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:57.238000 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_update pg_stats(33 pgs tid 28076 v 3959) v1 from osd.0 188.65.144.4:6804/1876
2013-05-16 05:15:57.238005 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_pg_stats pg_stats(33 pgs tid 28076 v 3959) v1 from osd.0
2013-05-16 05:15:57.238008 7fa996c33700 10 mon.0@0(leader).pg v16089285 got osd.0 osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]) (was osd_stat(424 GB used, 446 GB avail, 916 GB total, peers [1,2,3,4,5,6]/[]))
2013-05-16 05:15:57.238051 7fa996c33700 10 mon.0@0(leader).pg v16089285 check_osd_map already seen 3959 >= 3959
2013-05-16 05:15:57.238054 7fa996c33700 10 mon.0@0(leader).pg v16089285 update_logger
2013-05-16 05:15:57.238065 7fa996c33700 0 log [INF] : pgmap v16089285: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6541KB/s rd, 4026KB/s wr, 761op/s
2013-05-16 05:15:57.238286 7fa996c33700 10 mon.0@0(leader).log v15758977 update_from_paxos
2013-05-16 05:15:57.238386 7fa996c33700 10 mon.0@0(leader).log v15758977 update_from_paxos version 15758977 summary v 15758977
2013-05-16 05:15:57.238411 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:57.238477 7fa996c33700 10 mon.0@0(leader).log v15758977 update_from_paxos
2013-05-16 05:15:57.238511 7fa996c33700 10 mon.0@0(leader).log v15758977 update_from_paxos version 15758977 summary v 15758977
2013-05-16 05:15:57.238523 7fa996c33700 10 mon.0@0(leader).log v15758977 preprocess_query log(2 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:57.238545 7fa996c33700 10 mon.0@0(leader).log v15758977 preprocess_log log(2 entries) v1 from mon.0
2013-05-16 05:15:57.238577 7fa996c33700 10 mon.0@0(leader).log v15758977 prepare_update log(2 entries) v1 from mon.0 188.65.144.4:6789/0
2013-05-16 05:15:57.238592 7fa996c33700 10 mon.0@0(leader).log v15758977 prepare_log log(2 entries) v1 from mon.0
2013-05-16 05:15:57.238610 7fa996c33700 10 mon.0@0(leader).log v15758977 logging 2013-05-16 05:15:51.354050 mon.0 188.65.144.4:6789/0 13194 : [INF] pgmap v16089284: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 4986KB/s rd, 4777KB/s wr, 654op/s
2013-05-16 05:15:57.238633 7fa996c33700 10 mon.0@0(leader).log v15758977 logging 2013-05-16 05:15:57.238066 mon.0 188.65.144.4:6789/0 13195 : [INF] pgmap v16089285: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6541KB/s rd, 4026KB/s wr, 761op/s
2013-05-16 05:15:57.238786 7fa996c33700 10 mon.0@0(leader).mds e4483 preprocess_query mdsbeacon(53566/0 up:active seq 34461 v4483) v2 from mds.0 188.65.144.4:6800/10309
2013-05-16 05:15:57.238835 7fa996c33700 10 mon.0@0(leader).pg v16089285 preprocess_query pg_stats(34 pgs tid 28183 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:57.238876 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_update pg_stats(34 pgs tid 28183 v 3959) v1 from osd.5 188.65.144.9:6801/5830
2013-05-16 05:15:57.238882 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_pg_stats pg_stats(34 pgs tid 28183 v 3959) v1 from osd.5
2013-05-16 05:15:57.238886 7fa996c33700 10 mon.0@0(leader).pg v16089285 got osd.5 osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]) (was osd_stat(420 GB used, 476 GB avail, 897 GB total, peers [0,1,2,3,4,6]/[]))
2013-05-16 05:15:57.242262 7fa996c33700 10 mon.0@0(leader).log v15758977 update_from_paxos
2013-05-16 05:15:57.242289 7fa996c33700 10 mon.0@0(leader).log v15758977 update_from_paxos version 15758977 summary v 15758977
2013-05-16 05:15:57.242312 7fa996c33700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:57.244814 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.6 188.65.144.10:6800/23197 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:57.244821 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x5e19a00 from 188.65.144.8:6789/0
2013-05-16 05:15:57.244869 7fa996c33700 10 mon.0@0(leader).pg v16089285 preprocess_query pg_stats(32 pgs tid 28223 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:57.244920 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_update pg_stats(32 pgs tid 28223 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:57.244926 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_pg_stats pg_stats(32 pgs tid 28223 v 3959) v1 from osd.6
2013-05-16 05:15:57.244930 7fa996c33700 10 mon.0@0(leader).pg v16089285 got osd.6 osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]) (was osd_stat(479 GB used, 417 GB avail, 897 GB total, peers [0,1,2,3,4,5]/[]))
2013-05-16 05:15:57.245234 7fa996c33700 10 mon.0@0(leader) e8 received forwarded message from osd.3 188.65.144.7:6801/20087 via mon.4 188.65.144.8:6789/0
2013-05-16 05:15:57.245240 7fa996c33700 10 mon.0@0(leader) e8 mesg 0x5e19780 from 188.65.144.8:6789/0
2013-05-16 05:15:57.245278 7fa996c33700 10 mon.0@0(leader).pg v16089285 preprocess_query pg_stats(25 pgs tid 28157 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:57.245321 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_update pg_stats(25 pgs tid 28157 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:57.245327 7fa996c33700 10 mon.0@0(leader).pg v16089285 prepare_pg_stats pg_stats(25 pgs tid 28157 v 3959) v1 from osd.3
2013-05-16 05:15:57.245330 7fa996c33700 10 mon.0@0(leader).pg v16089285 got osd.3 osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]) (was osd_stat(407 GB used, 462 GB avail, 916 GB total, peers [0,1,2,4,5,6]/[]))
2013-05-16 05:15:57.287520 7fa997983700 10 mon.0@0(leader).pg v16089285 encode_pending v 16089286
2013-05-16 05:15:59.128792 7fa997983700 10 mon.0@0(leader).log v15758977 encode_full log v 15758977
2013-05-16 05:15:59.129092 7fa997983700 10 mon.0@0(leader).log v15758977 encode_pending v15758978
2013-05-16 05:15:59.129237 7fa997983700 10 mon.0@0(leader).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:59.129270 7fa997983700 10 mon.0@0(leader).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:59.129521 7fa997983700 10 mon.0@0(leader).auth v8461 update_from_paxos
2013-05-16 05:15:59.129556 7fa997983700 10 mon.0@0(leader).auth v8461 auth
2013-05-16 05:16:04.180673 7fa996c33700 7 mon.0@0(leader).log v15758978 update_from_paxos applying incremental log 15758978 2013-05-16 05:15:51.354050 mon.0 188.65.144.4:6789/0 13194 : [INF] pgmap v16089284: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 4986KB/s rd, 4777KB/s wr, 654op/s
2013-05-16 05:16:04.180791 7fa996c33700 7 mon.0@0(leader).log v15758978 update_from_paxos applying incremental log 15758978 2013-05-16 05:15:57.238066 mon.0 188.65.144.4:6789/0 13195 : [INF] pgmap v16089285: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6541KB/s rd, 4026KB/s wr, 761op/s
Executing zcat /var/log/ceph/ceph-mon.\?.log.\?.gz \|grep \'2013-05-16 05:15\' on node02
2013-05-16 05:15:05.893590 7fc8666a8700 10 mon.1@1(peon) e8 ms_handle_reset 0xd4b3420 188.65.144.5:6801/29051
2013-05-16 05:15:05.893610 7fc8666a8700 10 mon.1@1(peon) e8 ms_handle_reset 0xd4b3b00 188.65.144.10:6800/23197
2013-05-16 05:15:05.893613 7fc8666a8700 10 mon.1@1(peon) e8 ms_handle_reset 0x13f0e9a0 188.65.144.6:6803/29392
2013-05-16 05:15:05.893616 7fc8666a8700 10 mon.1@1(peon) e8 ms_handle_reset 0x13f0e840 188.65.144.9:6801/5830
2013-05-16 05:15:05.898655 7fc866ea9700 10 mon.1@1(peon).data_health(12144) service_tick
2013-05-16 05:15:05.898666 7fc866ea9700 0 mon.1@1(peon).data_health(12144) update_stats avail 71% total 701854104 used 164029844 avail 502172064
2013-05-16 05:15:05.898669 7fc866ea9700 10 mon.1@1(peon).data_health(12144) share_stats
2013-05-16 05:15:05.898725 7fc8666a8700 -1 mon.1@1(peon).paxos(paxos recovering c 31846715..31865061) lease_expire from mon.0 188.65.144.4:6789/0 is 461.369059 seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
2013-05-16 05:15:05.899686 7fc8666a8700 7 mon.1@1(peon).pg v16088653 update_from_paxos loading latest full pgmap v16089048
2013-05-16 05:15:05.901363 7fc8666a8700 7 mon.1@1(peon).pg v16089048 update_from_paxos applying incremental 16089049
2013-05-16 05:15:05.901446 7fc8666a8700 10 mon.1@1(peon).pg v16089049 v16089049: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 16042KB/s rd, 6887KB/s wr, 499op/s
2013-05-16 05:15:05.901471 7fc8666a8700 7 mon.1@1(peon).pg v16089049 update_from_paxos applying incremental 16089050
2013-05-16 05:15:05.901498 7fc8666a8700 10 mon.1@1(peon).pg v16089050 v16089050: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 7029KB/s rd, 5736KB/s wr, 298op/s
2013-05-16 05:15:05.901524 7fc8666a8700 7 mon.1@1(peon).pg v16089050 update_from_paxos applying incremental 16089051
2013-05-16 05:15:05.901656 7fc8666a8700 10 mon.1@1(peon).pg v16089051 v16089051: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 3050KB/s rd, 19242KB/s wr, 228op/s
2013-05-16 05:15:05.901681 7fc8666a8700 7 mon.1@1(peon).pg v16089051 update_from_paxos applying incremental 16089052
2013-05-16 05:15:05.901754 7fc8666a8700 10 mon.1@1(peon).pg v16089052 v16089052: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 4535KB/s rd, 25051KB/s wr, 273op/s
2013-05-16 05:15:05.901785 7fc8666a8700 7 mon.1@1(peon).pg v16089052 update_from_paxos applying incremental 16089053
2013-05-16 05:15:05.902130 7fc8666a8700 10 mon.1@1(peon).pg v16089053 v16089053: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 4294KB/s rd, 6114KB/s wr, 174op/s
2013-05-16 05:15:05.902168 7fc8666a8700 7 mon.1@1(peon).pg v16089053 update_from_paxos applying incremental 16089054
2013-05-16 05:15:05.902361 7fc8666a8700 10 mon.1@1(peon).pg v16089054 v16089054: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 5746KB/s rd, 6644KB/s wr, 332op/s
2013-05-16 05:15:05.902395 7fc8666a8700 7 mon.1@1(peon).pg v16089054 update_from_paxos applying incremental 16089055
2013-05-16 05:15:05.902426 7fc8666a8700 10 mon.1@1(peon).pg v16089055 v16089055: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 6147KB/s rd, 6530KB/s wr, 275op/s
2013-05-16 05:15:05.902472 7fc8666a8700 7 mon.1@1(peon).pg v16089055 update_from_paxos applying incremental 16089056
2013-05-16 05:15:05.902663 7fc8666a8700 10 mon.1@1(peon).pg v16089056 v16089056: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 12650KB/s rd, 7490KB/s wr, 288op/s
2013-05-16 05:15:05.902696 7fc8666a8700 7 mon.1@1(peon).pg v16089056 update_from_paxos applying incremental 16089057
2013-05-16 05:15:05.902729 7fc8666a8700 10 mon.1@1(peon).pg v16089057 v16089057: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 13571KB/s rd, 9114KB/s wr, 395op/s
2013-05-16 05:15:05.902852 7fc8666a8700 7 mon.1@1(peon).pg v16089057 update_from_paxos applying incremental 16089058
2013-05-16 05:15:05.902895 7fc8666a8700 10 mon.1@1(peon).pg v16089058 v16089058: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 2669KB/s rd, 3035KB/s wr, 128op/s
2013-05-16 05:15:05.902926 7fc8666a8700 7 mon.1@1(peon).pg v16089058 update_from_paxos applying incremental 16089059
2013-05-16 05:15:05.903063 7fc8666a8700 10 mon.1@1(peon).pg v16089059 v16089059: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 15454KB/s rd, 5864KB/s wr, 483op/s
2013-05-16 05:15:05.903089 7fc8666a8700 7 mon.1@1(peon).pg v16089059 update_from_paxos applying incremental 16089060
2013-05-16 05:15:05.903169 7fc8666a8700 10 mon.1@1(peon).pg v16089060 v16089060: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 14420KB/s rd, 3386KB/s wr, 512op/s
2013-05-16 05:15:05.903188 7fc8666a8700 7 mon.1@1(peon).pg v16089060 update_from_paxos applying incremental 16089061
2013-05-16 05:15:05.903210 7fc8666a8700 10 mon.1@1(peon).pg v16089061 v16089061: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 5460KB/s rd, 3489KB/s wr, 175op/s
2013-05-16 05:15:05.903234 7fc8666a8700 7 mon.1@1(peon).pg v16089061 update_from_paxos applying incremental 16089062
2013-05-16 05:15:05.903356 7fc8666a8700 10 mon.1@1(peon).pg v16089062 v16089062: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 23364KB/s rd, 33421KB/s wr, 351op/s
2013-05-16 05:15:05.903375 7fc8666a8700 10 mon.1@1(peon).pg v16089062 map_pg_creates to 0 pgs
2013-05-16 05:15:05.903377 7fc8666a8700 10 mon.1@1(peon).pg v16089062 send_pg_creates to 0 pgs
2013-05-16 05:15:05.903378 7fc8666a8700 10 mon.1@1(peon).pg v16089062 update_logger
2013-05-16 05:15:05.903516 7fc8666a8700 10 mon.1@1(peon).pg v16089062 update_logger
2013-05-16 05:15:05.904191 7fc8666a8700 -1 mon/MDSMonitor.cc: In function 'virtual void MDSMonitor::update_from_paxos()' thread 7fc8666a8700 time 2013-05-16 05:15:05.903535
-47> 2013-05-16 05:15:05.893590 7fc8666a8700 10 mon.1@1(peon) e8 ms_handle_reset 0xd4b3420 188.65.144.5:6801/29051
-46> 2013-05-16 05:15:05.893610 7fc8666a8700 10 mon.1@1(peon) e8 ms_handle_reset 0xd4b3b00 188.65.144.10:6800/23197
-45> 2013-05-16 05:15:05.893613 7fc8666a8700 10 mon.1@1(peon) e8 ms_handle_reset 0x13f0e9a0 188.65.144.6:6803/29392
-44> 2013-05-16 05:15:05.893616 7fc8666a8700 10 mon.1@1(peon) e8 ms_handle_reset 0x13f0e840 188.65.144.9:6801/5830
-43> 2013-05-16 05:15:05.893637 7fc8666a8700 1 -- 188.65.144.5:6789/0 <== mon.0 188.65.144.4:6789/0 1801767837 ==== paxos(lease lc 31865061 fc 31852646 pn 0 opn 0) v3 ==== 80+0+0 (1652509130 0 0) 0x15549500 con 0x2137420
-42> 2013-05-16 05:15:05.898655 7fc866ea9700 10 mon.1@1(peon).data_health(12144) service_tick
-41> 2013-05-16 05:15:05.898666 7fc866ea9700 0 mon.1@1(peon).data_health(12144) update_stats avail 71% total 701854104 used 164029844 avail 502172064
-40> 2013-05-16 05:15:05.898669 7fc866ea9700 10 mon.1@1(peon).data_health(12144) share_stats
-39> 2013-05-16 05:15:05.898672 7fc866ea9700 1 -- 188.65.144.5:6789/0 --> mon.0 188.65.144.4:6789/0 -- mon_health( service 1 op tell e 0 r 0 flags ) v1 -- ?+0 0x19195200
-38> 2013-05-16 05:15:05.898682 7fc866ea9700 1 -- 188.65.144.5:6789/0 --> mon.2 188.65.144.6:6789/0 -- mon_health( service 1 op tell e 0 r 0 flags none ) v1 -- ?+0 0x19195000
-37> 2013-05-16 05:15:05.898690 7fc866ea9700 1 -- 188.65.144.5:6789/0 --> mon.3 188.65.144.7:6789/0 -- mon_health( service 1 op tell e 0 r 0 flags ) v1 -- ?+0 0x880aa00
-36> 2013-05-16 05:15:05.898695 7fc866ea9700 1 -- 188.65.144.5:6789/0 --> mon.4 188.65.144.8:6789/0 -- mon_health( service 1 op tell e 0 r 0 flags ) v1 -- ?+0 0x880ac00
-35> 2013-05-16 05:15:05.898725 7fc8666a8700 -1 mon.1@1(peon).paxos(paxos recovering c 31846715..31865061) lease_expire from mon.0 188.65.144.4:6789/0 is 461.369059 seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
-34> 2013-05-16 05:15:05.898750 7fc8666a8700 1 -- 188.65.144.5:6789/0 --> mon.0 188.65.144.4:6789/0 -- paxos(lease_ack lc 31865061 fc 31846715 pn 0 opn 0) v3 -- ?+0 0x191f9000
-33> 2013-05-16 05:15:05.899686 7fc8666a8700 7 mon.1@1(peon).pg v16088653 update_from_paxos loading latest full pgmap v16089048
-32> 2013-05-16 05:15:05.901363 7fc8666a8700 7 mon.1@1(peon).pg v16089048 update_from_paxos applying incremental 16089049
-31> 2013-05-16 05:15:05.901446 7fc8666a8700 10 mon.1@1(peon).pg v16089049 v16089049: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 16042KB/s rd, 6887KB/s wr, 499op/s
-30> 2013-05-16 05:15:05.901471 7fc8666a8700 7 mon.1@1(peon).pg v16089049 update_from_paxos applying incremental 16089050
-29> 2013-05-16 05:15:05.901498 7fc8666a8700 10 mon.1@1(peon).pg v16089050 v16089050: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 7029KB/s rd, 5736KB/s wr, 298op/s
-28> 2013-05-16 05:15:05.901524 7fc8666a8700 7 mon.1@1(peon).pg v16089050 update_from_paxos applying incremental 16089051
-27> 2013-05-16 05:15:05.901656 7fc8666a8700 10 mon.1@1(peon).pg v16089051 v16089051: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 3050KB/s rd, 19242KB/s wr, 228op/s
-26> 2013-05-16 05:15:05.901681 7fc8666a8700 7 mon.1@1(peon).pg v16089051 update_from_paxos applying incremental 16089052
-25> 2013-05-16 05:15:05.901754 7fc8666a8700 10 mon.1@1(peon).pg v16089052 v16089052: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 4535KB/s rd, 25051KB/s wr, 273op/s
-24> 2013-05-16 05:15:05.901785 7fc8666a8700 7 mon.1@1(peon).pg v16089052 update_from_paxos applying incremental 16089053
-23> 2013-05-16 05:15:05.902130 7fc8666a8700 10 mon.1@1(peon).pg v16089053 v16089053: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 4294KB/s rd, 6114KB/s wr, 174op/s
-22> 2013-05-16 05:15:05.902168 7fc8666a8700 7 mon.1@1(peon).pg v16089053 update_from_paxos applying incremental 16089054
-21> 2013-05-16 05:15:05.902361 7fc8666a8700 10 mon.1@1(peon).pg v16089054 v16089054: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 5746KB/s rd, 6644KB/s wr, 332op/s
-20> 2013-05-16 05:15:05.902395 7fc8666a8700 7 mon.1@1(peon).pg v16089054 update_from_paxos applying incremental 16089055
-19> 2013-05-16 05:15:05.902426 7fc8666a8700 10 mon.1@1(peon).pg v16089055 v16089055: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 6147KB/s rd, 6530KB/s wr, 275op/s
-18> 2013-05-16 05:15:05.902472 7fc8666a8700 7 mon.1@1(peon).pg v16089055 update_from_paxos applying incremental 16089056
-17> 2013-05-16 05:15:05.902663 7fc8666a8700 10 mon.1@1(peon).pg v16089056 v16089056: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 12650KB/s rd, 7490KB/s wr, 288op/s
-16> 2013-05-16 05:15:05.902696 7fc8666a8700 7 mon.1@1(peon).pg v16089056 update_from_paxos applying incremental 16089057
-15> 2013-05-16 05:15:05.902729 7fc8666a8700 10 mon.1@1(peon).pg v16089057 v16089057: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 13571KB/s rd, 9114KB/s wr, 395op/s
-14> 2013-05-16 05:15:05.902852 7fc8666a8700 7 mon.1@1(peon).pg v16089057 update_from_paxos applying incremental 16089058
-13> 2013-05-16 05:15:05.902895 7fc8666a8700 10 mon.1@1(peon).pg v16089058 v16089058: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 2669KB/s rd, 3035KB/s wr, 128op/s
-12> 2013-05-16 05:15:05.902926 7fc8666a8700 7 mon.1@1(peon).pg v16089058 update_from_paxos applying incremental 16089059
-11> 2013-05-16 05:15:05.903063 7fc8666a8700 10 mon.1@1(peon).pg v16089059 v16089059: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 15454KB/s rd, 5864KB/s wr, 483op/s
-10> 2013-05-16 05:15:05.903089 7fc8666a8700 7 mon.1@1(peon).pg v16089059 update_from_paxos applying incremental 16089060
-9> 2013-05-16 05:15:05.903169 7fc8666a8700 10 mon.1@1(peon).pg v16089060 v16089060: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 14420KB/s rd, 3386KB/s wr, 512op/s
-8> 2013-05-16 05:15:05.903188 7fc8666a8700 7 mon.1@1(peon).pg v16089060 update_from_paxos applying incremental 16089061
-7> 2013-05-16 05:15:05.903210 7fc8666a8700 10 mon.1@1(peon).pg v16089061 v16089061: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 5460KB/s rd, 3489KB/s wr, 175op/s
-6> 2013-05-16 05:15:05.903234 7fc8666a8700 7 mon.1@1(peon).pg v16089061 update_from_paxos applying incremental 16089062
-5> 2013-05-16 05:15:05.903356 7fc8666a8700 10 mon.1@1(peon).pg v16089062 v16089062: 768 pgs: 768 active+clean; 1533 GB data, 3065 GB used, 3107 GB / 6359 GB avail; 23364KB/s rd, 33421KB/s wr, 351op/s
-4> 2013-05-16 05:15:05.903375 7fc8666a8700 10 mon.1@1(peon).pg v16089062 map_pg_creates to 0 pgs
-3> 2013-05-16 05:15:05.903377 7fc8666a8700 10 mon.1@1(peon).pg v16089062 send_pg_creates to 0 pgs
-2> 2013-05-16 05:15:05.903378 7fc8666a8700 10 mon.1@1(peon).pg v16089062 update_logger
-1> 2013-05-16 05:15:05.903516 7fc8666a8700 10 mon.1@1(peon).pg v16089062 update_logger
0> 2013-05-16 05:15:05.904191 7fc8666a8700 -1 mon/MDSMonitor.cc: In function 'virtual void MDSMonitor::update_from_paxos()' thread 7fc8666a8700 time 2013-05-16 05:15:05.903535
2013-05-16 05:15:05.963954 7fc8666a8700 -1 *** Caught signal (Aborted) **
0> 2013-05-16 05:15:05.963954 7fc8666a8700 -1 *** Caught signal (Aborted) **
Executing zcat /var/log/ceph/ceph-mon.\?.log.\?.gz \|grep \'2013-05-16 05:15\' on node03
2013-05-16 05:15:01.322656 7f280816b700 1 mon.2@2(peon).log v15758970 check_sub sending message to client.? 188.65.144.10:0/17448 with 2 entries (version 15758970)
2013-05-16 05:15:10.157262 7f280896c700 1 mon.2@2(peon).log v15758971 check_sub sending message to client.? 188.65.144.10:0/17448 with 1 entries (version 15758971)
2013-05-16 05:15:13.637266 7f280635b700 0 -- 188.65.144.6:6789/0 >> 188.65.144.5:6789/0 pipe(0x320c280 sd=29 :6789 s=2 pgs=3428 cs=1 l=0).fault with nothing to send, going to standby
2013-05-16 05:15:14.398167 7f280816b700 1 mon.2@2(peon).log v15758972 check_sub sending message to client.? 188.65.144.10:0/17448 with 2 entries (version 15758972)
2013-05-16 05:15:17.479943 7f280816b700 1 mon.2@2(peon).log v15758973 check_sub sending message to client.? 188.65.144.10:0/17448 with 2 entries (version 15758973)
2013-05-16 05:15:21.432438 7f280896c700 1 mon.2@2(peon).log v15758974 check_sub sending message to client.? 188.65.144.10:0/17448 with 1 entries (version 15758974)
2013-05-16 05:15:35.593412 7f280816b700 1 mon.2@2(peon).log v15758975 check_sub sending message to client.? 188.65.144.10:0/17448 with 1 entries (version 15758975)
2013-05-16 05:15:42.353896 7f280816b700 1 mon.2@2(peon).log v15758976 check_sub sending message to client.? 188.65.144.10:0/17448 with 2 entries (version 15758976)
2013-05-16 05:15:52.968672 7f280896c700 0 mon.2@2(peon).data_health(12158) update_stats avail 80% total 701854104 used 104330704 avail 561871204
2013-05-16 05:15:54.896988 7f280816b700 1 mon.2@2(peon).log v15758977 check_sub sending message to client.? 188.65.144.10:0/17448 with 2 entries (version 15758977)
Executing zcat /var/log/ceph/ceph-mon.\?.log.\?.gz \|grep \'2013-05-16 05:15\' on node04
2013-05-16 05:15:00.920812 7ff6c8427700 10 mon.3@3(peon).log v15758970 update_from_paxos
2013-05-16 05:15:00.920847 7ff6c8427700 10 mon.3@3(peon).log v15758970 update_from_paxos version 15758970 summary v 15758969
2013-05-16 05:15:00.920880 7ff6c8427700 10 mon.3@3(peon).log v15758970 update_from_paxos latest full 15758969
2013-05-16 05:15:00.920941 7ff6c8427700 7 mon.3@3(peon).log v15758970 update_from_paxos applying incremental log 15758970 2013-05-16 05:14:58.035619 mon.0 188.65.144.4:6789/0 13181 : [INF] pgmap v16089271: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 15173KB/s rd, 2925KB/s wr, 1421op/s
2013-05-16 05:15:00.920972 7ff6c8427700 7 mon.3@3(peon).log v15758970 update_from_paxos applying incremental log 15758970 2013-05-16 05:14:59.586224 mon.0 188.65.144.4:6789/0 13182 : [INF] pgmap v16089272: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 9925KB/s rd, 2685KB/s wr, 814op/s
2013-05-16 05:15:00.921022 7ff6c8427700 10 mon.3@3(peon).log v15758970 check_subs
2013-05-16 05:15:00.921045 7ff6c8427700 10 mon.3@3(peon).log v15758970 check_sub client wants log-info ver 15758970
2013-05-16 05:15:00.921060 7ff6c8427700 10 mon.3@3(peon).log v15758970 _create_sub_incremental level 1 ver 15758970 cur summary ver 15758970
2013-05-16 05:15:00.921132 7ff6c8427700 10 mon.3@3(peon).log v15758970 _create_sub_incremental incremental message ready (2 entries)
2013-05-16 05:15:00.921146 7ff6c8427700 1 mon.3@3(peon).log v15758970 check_sub sending message to client.? 188.65.144.4:0/29243 with 2 entries (version 15758970)
2013-05-16 05:15:00.921217 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:02.068632 7ff6c8c28700 7 mon.3@3(peon).pg v16089272 update_from_paxos applying incremental 16089273
2013-05-16 05:15:02.068737 7ff6c8c28700 10 mon.3@3(peon).pg v16089273 v16089273: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3956KB/s rd, 1703KB/s wr, 374op/s
2013-05-16 05:15:02.068761 7ff6c8c28700 10 mon.3@3(peon).pg v16089273 map_pg_creates to 0 pgs
2013-05-16 05:15:02.068772 7ff6c8c28700 10 mon.3@3(peon).pg v16089273 send_pg_creates to 0 pgs
2013-05-16 05:15:02.068774 7ff6c8c28700 10 mon.3@3(peon).pg v16089273 update_logger
2013-05-16 05:15:02.068776 7ff6c8c28700 10 mon.3@3(peon).pg v16089273 v16089273: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3956KB/s rd, 1703KB/s wr, 374op/s
2013-05-16 05:15:02.068818 7ff6c8c28700 10 mon.3@3(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:02.068953 7ff6c8c28700 10 mon.3@3(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:02.068960 7ff6c8c28700 10 mon.3@3(peon).log v15758970 update_from_paxos
2013-05-16 05:15:02.068993 7ff6c8c28700 10 mon.3@3(peon).log v15758970 update_from_paxos version 15758970 summary v 15758970
2013-05-16 05:15:02.069006 7ff6c8c28700 10 mon.3@3(peon).log v15758970 log
2013-05-16 05:15:02.069032 7ff6c8c28700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:02.069058 7ff6c8c28700 10 mon.3@3(peon).auth v8461 auth
2013-05-16 05:15:02.137745 7ff6c8427700 10 mon.3@3(peon).log v15758970 update_from_paxos
2013-05-16 05:15:02.137776 7ff6c8427700 10 mon.3@3(peon).log v15758970 update_from_paxos version 15758970 summary v 15758970
2013-05-16 05:15:02.137803 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:05.209761 7ff6c8427700 7 mon.3@3(peon).pg v16089273 update_from_paxos applying incremental 16089274
2013-05-16 05:15:05.209995 7ff6c8427700 10 mon.3@3(peon).pg v16089274 v16089274: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 9890KB/s rd, 8302KB/s wr, 1338op/s
2013-05-16 05:15:05.210031 7ff6c8427700 10 mon.3@3(peon).pg v16089274 map_pg_creates to 0 pgs
2013-05-16 05:15:05.210038 7ff6c8427700 10 mon.3@3(peon).pg v16089274 send_pg_creates to 0 pgs
2013-05-16 05:15:05.210041 7ff6c8427700 10 mon.3@3(peon).pg v16089274 update_logger
2013-05-16 05:15:05.210110 7ff6c8427700 10 mon.3@3(peon).log v15758970 update_from_paxos
2013-05-16 05:15:05.210152 7ff6c8427700 10 mon.3@3(peon).log v15758970 update_from_paxos version 15758970 summary v 15758970
2013-05-16 05:15:05.210203 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:05.902882 7ff6c8427700 10 mon.3@3(peon).data_health(12158) service_dispatch mon_health( service 1 op tell e 0 r 0 flags data ) v1
2013-05-16 05:15:05.902912 7ff6c8427700 10 mon.3@3(peon).data_health(12158) handle_tell mon_health( service 1 op tell e 0 r 0 flags data ) v1
2013-05-16 05:15:07.069295 7ff6c8c28700 10 mon.3@3(peon).pg v16089274 v16089274: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 9890KB/s rd, 8302KB/s wr, 1338op/s
2013-05-16 05:15:07.069474 7ff6c8c28700 10 mon.3@3(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:07.069502 7ff6c8c28700 10 mon.3@3(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:07.069507 7ff6c8c28700 10 mon.3@3(peon).log v15758970 update_from_paxos
2013-05-16 05:15:07.069543 7ff6c8c28700 10 mon.3@3(peon).log v15758970 update_from_paxos version 15758970 summary v 15758970
2013-05-16 05:15:07.069567 7ff6c8c28700 10 mon.3@3(peon).log v15758970 log
2013-05-16 05:15:07.069597 7ff6c8c28700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:07.069623 7ff6c8c28700 10 mon.3@3(peon).auth v8461 auth
2013-05-16 05:15:08.727887 7ff6c8427700 10 mon.3@3(peon).log v15758971 update_from_paxos
2013-05-16 05:15:08.727915 7ff6c8427700 10 mon.3@3(peon).log v15758971 update_from_paxos version 15758971 summary v 15758970
2013-05-16 05:15:08.727940 7ff6c8427700 10 mon.3@3(peon).log v15758971 update_from_paxos latest full 15758970
2013-05-16 05:15:08.727974 7ff6c8427700 7 mon.3@3(peon).log v15758971 update_from_paxos applying incremental log 15758971 2013-05-16 05:15:02.134427 mon.0 188.65.144.4:6789/0 13183 : [INF] pgmap v16089273: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3956KB/s rd, 1703KB/s wr, 374op/s
2013-05-16 05:15:08.728017 7ff6c8427700 10 mon.3@3(peon).log v15758971 check_subs
2013-05-16 05:15:08.728039 7ff6c8427700 10 mon.3@3(peon).log v15758971 check_sub client wants log-info ver 15758971
2013-05-16 05:15:08.728067 7ff6c8427700 10 mon.3@3(peon).log v15758971 _create_sub_incremental level 1 ver 15758971 cur summary ver 15758971
2013-05-16 05:15:08.728112 7ff6c8427700 10 mon.3@3(peon).log v15758971 _create_sub_incremental incremental message ready (1 entries)
2013-05-16 05:15:08.728130 7ff6c8427700 1 mon.3@3(peon).log v15758971 check_sub sending message to client.? 188.65.144.4:0/29243 with 1 entries (version 15758971)
2013-05-16 05:15:08.728177 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:10.131895 7ff6c8c28700 10 mon.3@3(peon).data_health(12158) service_tick
2013-05-16 05:15:10.131911 7ff6c8c28700 0 mon.3@3(peon).data_health(12158) update_stats avail 80% total 701854104 used 99995260 avail 566206648
2013-05-16 05:15:10.131914 7ff6c8c28700 10 mon.3@3(peon).data_health(12158) share_stats
2013-05-16 05:15:10.819556 7ff6c8427700 7 mon.3@3(peon).pg v16089274 update_from_paxos applying incremental 16089275
2013-05-16 05:15:10.820228 7ff6c8427700 10 mon.3@3(peon).pg v16089275 v16089275: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6336KB/s rd, 6513KB/s wr, 967op/s
2013-05-16 05:15:10.820623 7ff6c8427700 10 mon.3@3(peon).pg v16089275 map_pg_creates to 0 pgs
2013-05-16 05:15:10.820627 7ff6c8427700 10 mon.3@3(peon).pg v16089275 send_pg_creates to 0 pgs
2013-05-16 05:15:10.820629 7ff6c8427700 10 mon.3@3(peon).pg v16089275 update_logger
2013-05-16 05:15:10.820755 7ff6c8427700 10 mon.3@3(peon).log v15758971 update_from_paxos
2013-05-16 05:15:10.820784 7ff6c8427700 10 mon.3@3(peon).log v15758971 update_from_paxos version 15758971 summary v 15758971
2013-05-16 05:15:10.820808 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:12.069829 7ff6c8c28700 10 mon.3@3(peon).pg v16089275 v16089275: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6336KB/s rd, 6513KB/s wr, 967op/s
2013-05-16 05:15:12.069885 7ff6c8c28700 10 mon.3@3(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:12.069925 7ff6c8c28700 10 mon.3@3(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:12.069931 7ff6c8c28700 10 mon.3@3(peon).log v15758971 update_from_paxos
2013-05-16 05:15:12.070077 7ff6c8c28700 10 mon.3@3(peon).log v15758971 update_from_paxos version 15758971 summary v 15758971
2013-05-16 05:15:12.070094 7ff6c8c28700 10 mon.3@3(peon).log v15758971 log
2013-05-16 05:15:12.070123 7ff6c8c28700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:12.070279 7ff6c8c28700 10 mon.3@3(peon).auth v8461 auth
2013-05-16 05:15:12.496988 7ff6c8427700 7 mon.3@3(peon).pg v16089275 update_from_paxos applying incremental 16089276
2013-05-16 05:15:12.497263 7ff6c8427700 10 mon.3@3(peon).pg v16089276 v16089276: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3626KB/s rd, 3249KB/s wr, 562op/s
2013-05-16 05:15:12.497291 7ff6c8427700 10 mon.3@3(peon).pg v16089276 map_pg_creates to 0 pgs
2013-05-16 05:15:12.497295 7ff6c8427700 10 mon.3@3(peon).pg v16089276 send_pg_creates to 0 pgs
2013-05-16 05:15:12.497296 7ff6c8427700 10 mon.3@3(peon).pg v16089276 update_logger
2013-05-16 05:15:12.497426 7ff6c8427700 10 mon.3@3(peon).log v15758971 update_from_paxos
2013-05-16 05:15:12.497461 7ff6c8427700 10 mon.3@3(peon).log v15758971 update_from_paxos version 15758971 summary v 15758971
2013-05-16 05:15:12.497554 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:13.637912 7ff6cd776700 0 -- 188.65.144.7:6789/0 >> 188.65.144.5:6789/0 pipe(0x1582280 sd=23 :50000 s=2 pgs=3434 cs=1 l=0).fault with nothing to send, going to standby
2013-05-16 05:15:13.799464 7ff6c8427700 10 mon.3@3(peon).log v15758972 update_from_paxos
2013-05-16 05:15:13.799508 7ff6c8427700 10 mon.3@3(peon).log v15758972 update_from_paxos version 15758972 summary v 15758971
2013-05-16 05:15:13.799549 7ff6c8427700 10 mon.3@3(peon).log v15758972 update_from_paxos latest full 15758971
2013-05-16 05:15:13.799581 7ff6c8427700 7 mon.3@3(peon).log v15758972 update_from_paxos applying incremental log 15758972 2013-05-16 05:15:05.138392 mon.0 188.65.144.4:6789/0 13184 : [INF] pgmap v16089274: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 9890KB/s rd, 8302KB/s wr, 1338op/s
2013-05-16 05:15:13.799678 7ff6c8427700 7 mon.3@3(peon).log v15758972 update_from_paxos applying incremental log 15758972 2013-05-16 05:15:10.816147 mon.0 188.65.144.4:6789/0 13185 : [INF] pgmap v16089275: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6336KB/s rd, 6513KB/s wr, 967op/s
2013-05-16 05:15:13.799713 7ff6c8427700 10 mon.3@3(peon).log v15758972 check_subs
2013-05-16 05:15:13.799800 7ff6c8427700 10 mon.3@3(peon).log v15758972 check_sub client wants log-info ver 15758972
2013-05-16 05:15:13.799815 7ff6c8427700 10 mon.3@3(peon).log v15758972 _create_sub_incremental level 1 ver 15758972 cur summary ver 15758972
2013-05-16 05:15:13.799996 7ff6c8427700 10 mon.3@3(peon).log v15758972 _create_sub_incremental incremental message ready (2 entries)
2013-05-16 05:15:13.800012 7ff6c8427700 1 mon.3@3(peon).log v15758972 check_sub sending message to client.? 188.65.144.4:0/29243 with 2 entries (version 15758972)
2013-05-16 05:15:13.800072 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:15.293315 7ff6c8427700 7 mon.3@3(peon).pg v16089276 update_from_paxos applying incremental 16089277
2013-05-16 05:15:15.293405 7ff6c8427700 10 mon.3@3(peon).pg v16089277 v16089277: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3262KB/s rd, 1441KB/s wr, 455op/s
2013-05-16 05:15:15.293437 7ff6c8427700 10 mon.3@3(peon).pg v16089277 map_pg_creates to 0 pgs
2013-05-16 05:15:15.293441 7ff6c8427700 10 mon.3@3(peon).pg v16089277 send_pg_creates to 0 pgs
2013-05-16 05:15:15.293442 7ff6c8427700 10 mon.3@3(peon).pg v16089277 update_logger
2013-05-16 05:15:15.293493 7ff6c8427700 10 mon.3@3(peon).log v15758972 update_from_paxos
2013-05-16 05:15:15.293525 7ff6c8427700 10 mon.3@3(peon).log v15758972 update_from_paxos version 15758972 summary v 15758972
2013-05-16 05:15:15.293556 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:16.873986 7ff6c8427700 10 mon.3@3(peon).log v15758973 update_from_paxos
2013-05-16 05:15:16.874051 7ff6c8427700 10 mon.3@3(peon).log v15758973 update_from_paxos version 15758973 summary v 15758972
2013-05-16 05:15:16.874099 7ff6c8427700 10 mon.3@3(peon).log v15758973 update_from_paxos latest full 15758972
2013-05-16 05:15:16.874271 7ff6c8427700 7 mon.3@3(peon).log v15758973 update_from_paxos applying incremental log 15758973 2013-05-16 05:15:12.426612 mon.0 188.65.144.4:6789/0 13186 : [INF] pgmap v16089276: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3626KB/s rd, 3249KB/s wr, 562op/s
2013-05-16 05:15:16.874305 7ff6c8427700 7 mon.3@3(peon).log v15758973 update_from_paxos applying incremental log 15758973 2013-05-16 05:15:15.226225 mon.0 188.65.144.4:6789/0 13187 : [INF] pgmap v16089277: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3262KB/s rd, 1441KB/s wr, 455op/s
2013-05-16 05:15:16.874341 7ff6c8427700 10 mon.3@3(peon).log v15758973 check_subs
2013-05-16 05:15:16.874364 7ff6c8427700 10 mon.3@3(peon).log v15758973 check_sub client wants log-info ver 15758973
2013-05-16 05:15:16.874379 7ff6c8427700 10 mon.3@3(peon).log v15758973 _create_sub_incremental level 1 ver 15758973 cur summary ver 15758973
2013-05-16 05:15:16.874433 7ff6c8427700 10 mon.3@3(peon).log v15758973 _create_sub_incremental incremental message ready (2 entries)
2013-05-16 05:15:16.874450 7ff6c8427700 1 mon.3@3(peon).log v15758973 check_sub sending message to client.? 188.65.144.4:0/29243 with 2 entries (version 15758973)
2013-05-16 05:15:16.874522 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:17.070465 7ff6c8c28700 10 mon.3@3(peon).pg v16089277 v16089277: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3262KB/s rd, 1441KB/s wr, 455op/s
2013-05-16 05:15:17.070519 7ff6c8c28700 10 mon.3@3(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:17.070551 7ff6c8c28700 10 mon.3@3(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:17.070556 7ff6c8c28700 10 mon.3@3(peon).log v15758973 update_from_paxos
2013-05-16 05:15:17.070689 7ff6c8c28700 10 mon.3@3(peon).log v15758973 update_from_paxos version 15758973 summary v 15758973
2013-05-16 05:15:17.070711 7ff6c8c28700 10 mon.3@3(peon).log v15758973 log
2013-05-16 05:15:17.070746 7ff6c8c28700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:17.070780 7ff6c8c28700 10 mon.3@3(peon).auth v8461 auth
2013-05-16 05:15:18.274102 7ff6c8427700 7 mon.3@3(peon).pg v16089277 update_from_paxos applying incremental 16089278
2013-05-16 05:15:18.274183 7ff6c8427700 10 mon.3@3(peon).pg v16089278 v16089278: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 830KB/s rd, 512KB/s wr, 186op/s
2013-05-16 05:15:18.274218 7ff6c8427700 10 mon.3@3(peon).pg v16089278 map_pg_creates to 0 pgs
2013-05-16 05:15:18.274221 7ff6c8427700 10 mon.3@3(peon).pg v16089278 send_pg_creates to 0 pgs
2013-05-16 05:15:18.274223 7ff6c8427700 10 mon.3@3(peon).pg v16089278 update_logger
2013-05-16 05:15:18.274298 7ff6c8427700 10 mon.3@3(peon).log v15758973 update_from_paxos
2013-05-16 05:15:18.274341 7ff6c8427700 10 mon.3@3(peon).log v15758973 update_from_paxos version 15758973 summary v 15758973
2013-05-16 05:15:18.274374 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:20.136297 7ff6c8427700 10 mon.3@3(peon).log v15758974 update_from_paxos
2013-05-16 05:15:20.136340 7ff6c8427700 10 mon.3@3(peon).log v15758974 update_from_paxos version 15758974 summary v 15758973
2013-05-16 05:15:20.136369 7ff6c8427700 10 mon.3@3(peon).log v15758974 update_from_paxos latest full 15758973
2013-05-16 05:15:20.136405 7ff6c8427700 7 mon.3@3(peon).log v15758974 update_from_paxos applying incremental log 15758974 2013-05-16 05:15:18.184056 mon.0 188.65.144.4:6789/0 13188 : [INF] pgmap v16089278: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 830KB/s rd, 512KB/s wr, 186op/s
2013-05-16 05:15:20.136452 7ff6c8427700 10 mon.3@3(peon).log v15758974 check_subs
2013-05-16 05:15:20.136472 7ff6c8427700 10 mon.3@3(peon).log v15758974 check_sub client wants log-info ver 15758974
2013-05-16 05:15:20.136503 7ff6c8427700 10 mon.3@3(peon).log v15758974 _create_sub_incremental level 1 ver 15758974 cur summary ver 15758974
2013-05-16 05:15:20.136567 7ff6c8427700 10 mon.3@3(peon).log v15758974 _create_sub_incremental incremental message ready (1 entries)
2013-05-16 05:15:20.136581 7ff6c8427700 1 mon.3@3(peon).log v15758974 check_sub sending message to client.? 188.65.144.4:0/29243 with 1 entries (version 15758974)
2013-05-16 05:15:20.136650 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:22.070994 7ff6c8c28700 7 mon.3@3(peon).pg v16089278 update_from_paxos applying incremental 16089279
2013-05-16 05:15:22.071088 7ff6c8c28700 10 mon.3@3(peon).pg v16089279 v16089279: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 245KB/s rd, 747KB/s wr, 148op/s
2013-05-16 05:15:22.071124 7ff6c8c28700 10 mon.3@3(peon).pg v16089279 map_pg_creates to 0 pgs
2013-05-16 05:15:22.071127 7ff6c8c28700 10 mon.3@3(peon).pg v16089279 send_pg_creates to 0 pgs
2013-05-16 05:15:22.071128 7ff6c8c28700 10 mon.3@3(peon).pg v16089279 update_logger
2013-05-16 05:15:22.071131 7ff6c8c28700 10 mon.3@3(peon).pg v16089279 v16089279: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 245KB/s rd, 747KB/s wr, 148op/s
2013-05-16 05:15:22.071163 7ff6c8c28700 10 mon.3@3(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:22.071192 7ff6c8c28700 10 mon.3@3(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:22.071197 7ff6c8c28700 10 mon.3@3(peon).log v15758974 update_from_paxos
2013-05-16 05:15:22.071225 7ff6c8c28700 10 mon.3@3(peon).log v15758974 update_from_paxos version 15758974 summary v 15758974
2013-05-16 05:15:22.071246 7ff6c8c28700 10 mon.3@3(peon).log v15758974 log
2013-05-16 05:15:22.071276 7ff6c8c28700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:22.071301 7ff6c8c28700 10 mon.3@3(peon).auth v8461 auth
2013-05-16 05:15:23.474214 7ff6c8427700 10 mon.3@3(peon).log v15758974 update_from_paxos
2013-05-16 05:15:23.474251 7ff6c8427700 10 mon.3@3(peon).log v15758974 update_from_paxos version 15758974 summary v 15758974
2013-05-16 05:15:23.474296 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:27.071439 7ff6c8c28700 10 mon.3@3(peon).pg v16089279 v16089279: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 245KB/s rd, 747KB/s wr, 148op/s
2013-05-16 05:15:27.071530 7ff6c8c28700 10 mon.3@3(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:27.071560 7ff6c8c28700 10 mon.3@3(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:27.071567 7ff6c8c28700 10 mon.3@3(peon).log v15758974 update_from_paxos
2013-05-16 05:15:27.071609 7ff6c8c28700 10 mon.3@3(peon).log v15758974 update_from_paxos version 15758974 summary v 15758974
2013-05-16 05:15:27.071623 7ff6c8c28700 10 mon.3@3(peon).log v15758974 log
2013-05-16 05:15:27.071663 7ff6c8c28700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:27.071699 7ff6c8c28700 10 mon.3@3(peon).auth v8461 auth
2013-05-16 05:15:28.340104 7ff6c8427700 10 mon.3@3(peon).data_health(12158) service_dispatch mon_health( service 1 op tell e 0 r 0 flags data ) v1
2013-05-16 05:15:28.340111 7ff6c8427700 10 mon.3@3(peon).data_health(12158) handle_tell mon_health( service 1 op tell e 0 r 0 flags data ) v1
2013-05-16 05:15:28.525464 7ff6c8427700 7 mon.3@3(peon).pg v16089279 update_from_paxos applying incremental 16089280
2013-05-16 05:15:28.525773 7ff6c8427700 10 mon.3@3(peon).pg v16089280 v16089280: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 2214KB/s rd, 1756KB/s wr, 421op/s
2013-05-16 05:15:28.525825 7ff6c8427700 10 mon.3@3(peon).pg v16089280 map_pg_creates to 0 pgs
2013-05-16 05:15:28.525829 7ff6c8427700 10 mon.3@3(peon).pg v16089280 send_pg_creates to 0 pgs
2013-05-16 05:15:28.525832 7ff6c8427700 10 mon.3@3(peon).pg v16089280 update_logger
2013-05-16 05:15:28.525900 7ff6c8427700 10 mon.3@3(peon).log v15758974 update_from_paxos
2013-05-16 05:15:28.525944 7ff6c8427700 10 mon.3@3(peon).log v15758974 update_from_paxos version 15758974 summary v 15758974
2013-05-16 05:15:28.525979 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:32.071904 7ff6c8c28700 10 mon.3@3(peon).pg v16089280 v16089280: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 2214KB/s rd, 1756KB/s wr, 421op/s
2013-05-16 05:15:32.071981 7ff6c8c28700 10 mon.3@3(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:32.072016 7ff6c8c28700 10 mon.3@3(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:32.072021 7ff6c8c28700 10 mon.3@3(peon).log v15758975 update_from_paxos
2013-05-16 05:15:32.072054 7ff6c8c28700 10 mon.3@3(peon).log v15758975 update_from_paxos version 15758975 summary v 15758974
2013-05-16 05:15:32.072084 7ff6c8c28700 10 mon.3@3(peon).log v15758975 update_from_paxos latest full 15758974
2013-05-16 05:15:32.072118 7ff6c8c28700 7 mon.3@3(peon).log v15758975 update_from_paxos applying incremental log 15758975 2013-05-16 05:15:23.470978 mon.0 188.65.144.4:6789/0 13189 : [INF] pgmap v16089279: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 245KB/s rd, 747KB/s wr, 148op/s
2013-05-16 05:15:32.072163 7ff6c8c28700 10 mon.3@3(peon).log v15758975 check_subs
2013-05-16 05:15:32.072186 7ff6c8c28700 10 mon.3@3(peon).log v15758975 check_sub client wants log-info ver 15758975
2013-05-16 05:15:32.072201 7ff6c8c28700 10 mon.3@3(peon).log v15758975 _create_sub_incremental level 1 ver 15758975 cur summary ver 15758975
2013-05-16 05:15:32.072248 7ff6c8c28700 10 mon.3@3(peon).log v15758975 _create_sub_incremental incremental message ready (1 entries)
2013-05-16 05:15:32.072584 7ff6c8c28700 1 mon.3@3(peon).log v15758975 check_sub sending message to client.? 188.65.144.4:0/29243 with 1 entries (version 15758975)
2013-05-16 05:15:32.072632 7ff6c8c28700 10 mon.3@3(peon).log v15758975 log
2013-05-16 05:15:32.072671 7ff6c8c28700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:32.072725 7ff6c8c28700 10 mon.3@3(peon).auth v8461 auth
2013-05-16 05:15:32.566560 7ff6c8427700 10 mon.3@3(peon).log v15758975 update_from_paxos
2013-05-16 05:15:32.566592 7ff6c8427700 10 mon.3@3(peon).log v15758975 update_from_paxos version 15758975 summary v 15758975
2013-05-16 05:15:32.566622 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:37.072879 7ff6c8c28700 10 mon.3@3(peon).pg v16089280 v16089280: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 2214KB/s rd, 1756KB/s wr, 421op/s
2013-05-16 05:15:37.072930 7ff6c8c28700 10 mon.3@3(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:37.072978 7ff6c8c28700 10 mon.3@3(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:37.072987 7ff6c8c28700 10 mon.3@3(peon).log v15758975 update_from_paxos
2013-05-16 05:15:37.073029 7ff6c8c28700 10 mon.3@3(peon).log v15758975 update_from_paxos version 15758975 summary v 15758975
2013-05-16 05:15:37.073053 7ff6c8c28700 10 mon.3@3(peon).log v15758975 log
2013-05-16 05:15:37.073087 7ff6c8c28700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:37.073114 7ff6c8c28700 10 mon.3@3(peon).auth v8461 auth
2013-05-16 05:15:37.773448 7ff6c8427700 7 mon.3@3(peon).pg v16089280 update_from_paxos applying incremental 16089281
2013-05-16 05:15:37.774060 7ff6c8427700 10 mon.3@3(peon).pg v16089281 v16089281: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 7640KB/s rd, 2360KB/s wr, 899op/s
2013-05-16 05:15:37.774102 7ff6c8427700 10 mon.3@3(peon).pg v16089281 map_pg_creates to 0 pgs
2013-05-16 05:15:37.774107 7ff6c8427700 10 mon.3@3(peon).pg v16089281 send_pg_creates to 0 pgs
2013-05-16 05:15:37.774109 7ff6c8427700 10 mon.3@3(peon).pg v16089281 update_logger
2013-05-16 05:15:37.774190 7ff6c8427700 10 mon.3@3(peon).log v15758975 update_from_paxos
2013-05-16 05:15:37.774244 7ff6c8427700 10 mon.3@3(peon).log v15758975 update_from_paxos version 15758975 summary v 15758975
2013-05-16 05:15:37.774502 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:37.957523 7ff6c8427700 10 mon.3@3(peon).data_health(12158) service_dispatch mon_health( service 1 op tell e 0 r 0 flags none ) v1
2013-05-16 05:15:37.957527 7ff6c8427700 10 mon.3@3(peon).data_health(12158) handle_tell mon_health( service 1 op tell e 0 r 0 flags none ) v1
2013-05-16 05:15:40.106796 7ff6c8427700 7 mon.3@3(peon).pg v16089281 update_from_paxos applying incremental 16089282
2013-05-16 05:15:40.107263 7ff6c8427700 10 mon.3@3(peon).pg v16089282 v16089282: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 10453KB/s rd, 3755KB/s wr, 1320op/s
2013-05-16 05:15:40.107459 7ff6c8427700 10 mon.3@3(peon).pg v16089282 map_pg_creates to 0 pgs
2013-05-16 05:15:40.107465 7ff6c8427700 10 mon.3@3(peon).pg v16089282 send_pg_creates to 0 pgs
2013-05-16 05:15:40.107467 7ff6c8427700 10 mon.3@3(peon).pg v16089282 update_logger
2013-05-16 05:15:40.107520 7ff6c8427700 10 mon.3@3(peon).log v15758975 update_from_paxos
2013-05-16 05:15:40.107555 7ff6c8427700 10 mon.3@3(peon).log v15758975 update_from_paxos version 15758975 summary v 15758975
2013-05-16 05:15:40.107578 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:40.283327 7ff6c8427700 10 mon.3@3(peon) e8 handle_subscribe mon_subscribe({monmap=9+,osdmap=3960}) v2
2013-05-16 05:15:40.283334 7ff6c8427700 10 mon.3@3(peon) e8 check_sub monmap next 9 have 8
2013-05-16 05:15:40.491357 7ff6c8427700 10 mon.3@3(peon) e8 handle_subscribe mon_subscribe({monmap=9+,osdmap=3960}) v2
2013-05-16 05:15:40.491362 7ff6c8427700 10 mon.3@3(peon) e8 check_sub monmap next 9 have 8
2013-05-16 05:15:41.947400 7ff6c8427700 10 mon.3@3(peon).log v15758976 update_from_paxos
2013-05-16 05:15:41.947432 7ff6c8427700 10 mon.3@3(peon).log v15758976 update_from_paxos version 15758976 summary v 15758975
2013-05-16 05:15:41.947463 7ff6c8427700 10 mon.3@3(peon).log v15758976 update_from_paxos latest full 15758975
2013-05-16 05:15:41.947511 7ff6c8427700 7 mon.3@3(peon).log v15758976 update_from_paxos applying incremental log 15758976 2013-05-16 05:15:28.522542 mon.0 188.65.144.4:6789/0 13190 : [INF] pgmap v16089280: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 2214KB/s rd, 1756KB/s wr, 421op/s
2013-05-16 05:15:41.947541 7ff6c8427700 7 mon.3@3(peon).log v15758976 update_from_paxos applying incremental log 15758976 2013-05-16 05:15:37.702942 mon.0 188.65.144.4:6789/0 13191 : [INF] pgmap v16089281: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 7640KB/s rd, 2360KB/s wr, 899op/s
2013-05-16 05:15:41.947577 7ff6c8427700 10 mon.3@3(peon).log v15758976 check_subs
2013-05-16 05:15:41.947600 7ff6c8427700 10 mon.3@3(peon).log v15758976 check_sub client wants log-info ver 15758976
2013-05-16 05:15:41.947619 7ff6c8427700 10 mon.3@3(peon).log v15758976 _create_sub_incremental level 1 ver 15758976 cur summary ver 15758976
2013-05-16 05:15:41.947668 7ff6c8427700 10 mon.3@3(peon).log v15758976 _create_sub_incremental incremental message ready (2 entries)
2013-05-16 05:15:41.947682 7ff6c8427700 1 mon.3@3(peon).log v15758976 check_sub sending message to client.? 188.65.144.4:0/29243 with 2 entries (version 15758976)
2013-05-16 05:15:41.947734 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:42.073245 7ff6c8c28700 10 mon.3@3(peon).pg v16089282 v16089282: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 10453KB/s rd, 3755KB/s wr, 1320op/s
2013-05-16 05:15:42.073605 7ff6c8c28700 10 mon.3@3(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:42.073638 7ff6c8c28700 10 mon.3@3(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:42.073642 7ff6c8c28700 10 mon.3@3(peon).log v15758976 update_from_paxos
2013-05-16 05:15:42.073669 7ff6c8c28700 10 mon.3@3(peon).log v15758976 update_from_paxos version 15758976 summary v 15758976
2013-05-16 05:15:42.073681 7ff6c8c28700 10 mon.3@3(peon).log v15758976 log
2013-05-16 05:15:42.073705 7ff6c8c28700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:42.073735 7ff6c8c28700 10 mon.3@3(peon).auth v8461 auth
2013-05-16 05:15:47.073991 7ff6c8c28700 10 mon.3@3(peon).pg v16089282 v16089282: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 10453KB/s rd, 3755KB/s wr, 1320op/s
2013-05-16 05:15:47.074041 7ff6c8c28700 10 mon.3@3(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:47.074071 7ff6c8c28700 10 mon.3@3(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:47.074077 7ff6c8c28700 10 mon.3@3(peon).log v15758976 update_from_paxos
2013-05-16 05:15:47.074103 7ff6c8c28700 10 mon.3@3(peon).log v15758976 update_from_paxos version 15758976 summary v 15758976
2013-05-16 05:15:47.074118 7ff6c8c28700 10 mon.3@3(peon).log v15758976 log
2013-05-16 05:15:47.074143 7ff6c8c28700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:47.074182 7ff6c8c28700 10 mon.3@3(peon).auth v8461 auth
2013-05-16 05:15:47.468045 7ff6c8427700 7 mon.3@3(peon).pg v16089282 update_from_paxos applying incremental 16089283
2013-05-16 05:15:47.468118 7ff6c8427700 10 mon.3@3(peon).pg v16089283 v16089283: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 7797KB/s rd, 4129KB/s wr, 1105op/s
2013-05-16 05:15:47.468139 7ff6c8427700 10 mon.3@3(peon).pg v16089283 map_pg_creates to 0 pgs
2013-05-16 05:15:47.468142 7ff6c8427700 10 mon.3@3(peon).pg v16089283 send_pg_creates to 0 pgs
2013-05-16 05:15:47.468143 7ff6c8427700 10 mon.3@3(peon).pg v16089283 update_logger
2013-05-16 05:15:47.468184 7ff6c8427700 10 mon.3@3(peon).log v15758976 update_from_paxos
2013-05-16 05:15:47.468209 7ff6c8427700 10 mon.3@3(peon).log v15758976 update_from_paxos version 15758976 summary v 15758976
2013-05-16 05:15:47.468346 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:51.356359 7ff6c8427700 7 mon.3@3(peon).pg v16089283 update_from_paxos applying incremental 16089284
2013-05-16 05:15:51.356795 7ff6c8427700 10 mon.3@3(peon).pg v16089284 v16089284: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 4986KB/s rd, 4777KB/s wr, 654op/s
2013-05-16 05:15:51.356836 7ff6c8427700 10 mon.3@3(peon).pg v16089284 map_pg_creates to 0 pgs
2013-05-16 05:15:51.356840 7ff6c8427700 10 mon.3@3(peon).pg v16089284 send_pg_creates to 0 pgs
2013-05-16 05:15:51.356843 7ff6c8427700 10 mon.3@3(peon).pg v16089284 update_logger
2013-05-16 05:15:51.356918 7ff6c8427700 10 mon.3@3(peon).log v15758976 update_from_paxos
2013-05-16 05:15:51.356960 7ff6c8427700 10 mon.3@3(peon).log v15758976 update_from_paxos version 15758976 summary v 15758976
2013-05-16 05:15:51.357001 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:52.074332 7ff6c8c28700 10 mon.3@3(peon).pg v16089284 v16089284: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 4986KB/s rd, 4777KB/s wr, 654op/s
2013-05-16 05:15:52.074390 7ff6c8c28700 10 mon.3@3(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:52.074433 7ff6c8c28700 10 mon.3@3(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:52.074439 7ff6c8c28700 10 mon.3@3(peon).log v15758976 update_from_paxos
2013-05-16 05:15:52.074477 7ff6c8c28700 10 mon.3@3(peon).log v15758976 update_from_paxos version 15758976 summary v 15758976
2013-05-16 05:15:52.074631 7ff6c8c28700 10 mon.3@3(peon).log v15758976 log
2013-05-16 05:15:52.074668 7ff6c8c28700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:52.074703 7ff6c8c28700 10 mon.3@3(peon).auth v8461 auth
2013-05-16 05:15:52.970116 7ff6c8427700 10 mon.3@3(peon).data_health(12158) service_dispatch mon_health( service 1 op tell e 0 r 0 flags none ) v1
2013-05-16 05:15:52.970122 7ff6c8427700 10 mon.3@3(peon).data_health(12158) handle_tell mon_health( service 1 op tell e 0 r 0 flags none ) v1
2013-05-16 05:15:53.847109 7ff6c8427700 10 mon.3@3(peon).log v15758977 update_from_paxos
2013-05-16 05:15:53.847137 7ff6c8427700 10 mon.3@3(peon).log v15758977 update_from_paxos version 15758977 summary v 15758976
2013-05-16 05:15:53.847262 7ff6c8427700 10 mon.3@3(peon).log v15758977 update_from_paxos latest full 15758976
2013-05-16 05:15:53.847306 7ff6c8427700 7 mon.3@3(peon).log v15758977 update_from_paxos applying incremental log 15758977 2013-05-16 05:15:40.070153 mon.0 188.65.144.4:6789/0 13192 : [INF] pgmap v16089282: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 10453KB/s rd, 3755KB/s wr, 1320op/s
2013-05-16 05:15:53.847336 7ff6c8427700 7 mon.3@3(peon).log v15758977 update_from_paxos applying incremental log 15758977 2013-05-16 05:15:47.338721 mon.0 188.65.144.4:6789/0 13193 : [INF] pgmap v16089283: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 7797KB/s rd, 4129KB/s wr, 1105op/s
2013-05-16 05:15:53.847372 7ff6c8427700 10 mon.3@3(peon).log v15758977 check_subs
2013-05-16 05:15:53.847389 7ff6c8427700 10 mon.3@3(peon).log v15758977 check_sub client wants log-info ver 15758977
2013-05-16 05:15:53.847403 7ff6c8427700 10 mon.3@3(peon).log v15758977 _create_sub_incremental level 1 ver 15758977 cur summary ver 15758977
2013-05-16 05:15:53.847663 7ff6c8427700 10 mon.3@3(peon).log v15758977 _create_sub_incremental incremental message ready (2 entries)
2013-05-16 05:15:53.847678 7ff6c8427700 1 mon.3@3(peon).log v15758977 check_sub sending message to client.? 188.65.144.4:0/29243 with 2 entries (version 15758977)
2013-05-16 05:15:53.847794 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:57.074841 7ff6c8c28700 10 mon.3@3(peon).pg v16089284 v16089284: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 4986KB/s rd, 4777KB/s wr, 654op/s
2013-05-16 05:15:57.075033 7ff6c8c28700 10 mon.3@3(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:57.075067 7ff6c8c28700 10 mon.3@3(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:57.075072 7ff6c8c28700 10 mon.3@3(peon).log v15758977 update_from_paxos
2013-05-16 05:15:57.075101 7ff6c8c28700 10 mon.3@3(peon).log v15758977 update_from_paxos version 15758977 summary v 15758977
2013-05-16 05:15:57.075115 7ff6c8c28700 10 mon.3@3(peon).log v15758977 log
2013-05-16 05:15:57.075141 7ff6c8c28700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:15:57.075188 7ff6c8c28700 10 mon.3@3(peon).auth v8461 auth
2013-05-16 05:15:57.319574 7ff6c8427700 7 mon.3@3(peon).pg v16089284 update_from_paxos applying incremental 16089285
2013-05-16 05:15:57.319914 7ff6c8427700 10 mon.3@3(peon).pg v16089285 v16089285: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6541KB/s rd, 4026KB/s wr, 761op/s
2013-05-16 05:15:57.319961 7ff6c8427700 10 mon.3@3(peon).pg v16089285 map_pg_creates to 0 pgs
2013-05-16 05:15:57.319964 7ff6c8427700 10 mon.3@3(peon).pg v16089285 send_pg_creates to 0 pgs
2013-05-16 05:15:57.319965 7ff6c8427700 10 mon.3@3(peon).pg v16089285 update_logger
2013-05-16 05:15:57.320044 7ff6c8427700 10 mon.3@3(peon).log v15758977 update_from_paxos
2013-05-16 05:15:57.320072 7ff6c8427700 10 mon.3@3(peon).log v15758977 update_from_paxos version 15758977 summary v 15758977
2013-05-16 05:15:57.320111 7ff6c8427700 10 mon.3@3(peon).auth v8461 update_from_paxos
2013-05-16 05:16:04.185011 7ff6c8427700 7 mon.3@3(peon).log v15758978 update_from_paxos applying incremental log 15758978 2013-05-16 05:15:51.354050 mon.0 188.65.144.4:6789/0 13194 : [INF] pgmap v16089284: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 4986KB/s rd, 4777KB/s wr, 654op/s
2013-05-16 05:16:04.185046 7ff6c8427700 7 mon.3@3(peon).log v15758978 update_from_paxos applying incremental log 15758978 2013-05-16 05:15:57.238066 mon.0 188.65.144.4:6789/0 13195 : [INF] pgmap v16089285: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6541KB/s rd, 4026KB/s wr, 761op/s
Executing zcat /var/log/ceph/ceph-mon.\?.log.\?.gz \|grep \'2013-05-16 05:15\' on node05
2013-05-16 05:15:00.417730 7faa3795f700 10 mon.4@4(peon).pg v16089272 v16089272: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 9925KB/s rd, 2685KB/s wr, 814op/s
2013-05-16 05:15:00.417851 7faa3795f700 10 mon.4@4(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:00.417941 7faa3795f700 10 mon.4@4(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:00.417952 7faa3795f700 10 mon.4@4(peon).log v15758969 update_from_paxos
2013-05-16 05:15:00.418001 7faa3795f700 10 mon.4@4(peon).log v15758969 update_from_paxos version 15758969 summary v 15758969
2013-05-16 05:15:00.418025 7faa3795f700 10 mon.4@4(peon).log v15758969 log
2013-05-16 05:15:00.418066 7faa3795f700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:00.418118 7faa3795f700 10 mon.4@4(peon).auth v8461 auth
2013-05-16 05:15:00.922475 7faa3715e700 10 mon.4@4(peon).pg v16089272 preprocess_query pg_stats(25 pgs tid 28212 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:00.922573 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10349 request pg_stats(25 pgs tid 28212 v 3959) v1
2013-05-16 05:15:00.922662 7faa3715e700 10 mon.4@4(peon).log v15758970 update_from_paxos
2013-05-16 05:15:00.922706 7faa3715e700 10 mon.4@4(peon).log v15758970 update_from_paxos version 15758970 summary v 15758969
2013-05-16 05:15:00.922746 7faa3715e700 10 mon.4@4(peon).log v15758970 update_from_paxos latest full 15758969
2013-05-16 05:15:00.922789 7faa3715e700 7 mon.4@4(peon).log v15758970 update_from_paxos applying incremental log 15758970 2013-05-16 05:14:58.035619 mon.0 188.65.144.4:6789/0 13181 : [INF] pgmap v16089271: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 15173KB/s rd, 2925KB/s wr, 1421op/s
2013-05-16 05:15:00.922833 7faa3715e700 7 mon.4@4(peon).log v15758970 update_from_paxos applying incremental log 15758970 2013-05-16 05:14:59.586224 mon.0 188.65.144.4:6789/0 13182 : [INF] pgmap v16089272: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 9925KB/s rd, 2685KB/s wr, 814op/s
2013-05-16 05:15:00.922893 7faa3715e700 10 mon.4@4(peon).log v15758970 check_subs
2013-05-16 05:15:00.922920 7faa3715e700 10 mon.4@4(peon).log v15758970 check_sub client wants log-info ver 15758970
2013-05-16 05:15:00.922953 7faa3715e700 10 mon.4@4(peon).log v15758970 _create_sub_incremental level 1 ver 15758970 cur summary ver 15758970
2013-05-16 05:15:00.923027 7faa3715e700 10 mon.4@4(peon).log v15758970 _create_sub_incremental incremental message ready (2 entries)
2013-05-16 05:15:00.923047 7faa3715e700 1 mon.4@4(peon).log v15758970 check_sub sending message to client.? 188.65.144.4:0/15705 with 2 entries (version 15758970)
2013-05-16 05:15:00.923077 7faa3715e700 10 mon.4@4(peon).log v15758970 check_sub client wants log-info ver 15758970
2013-05-16 05:15:00.923103 7faa3715e700 10 mon.4@4(peon).log v15758970 _create_sub_incremental level 1 ver 15758970 cur summary ver 15758970
2013-05-16 05:15:00.923168 7faa3715e700 10 mon.4@4(peon).log v15758970 _create_sub_incremental incremental message ready (2 entries)
2013-05-16 05:15:00.923185 7faa3715e700 1 mon.4@4(peon).log v15758970 check_sub sending message to client.? 188.65.144.6:0/26623 with 2 entries (version 15758970)
2013-05-16 05:15:00.923273 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:01.038587 7faa3715e700 10 mon.4@4(peon).pg v16089272 preprocess_query pg_stats(23 pgs tid 28146 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:01.038681 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10350 request pg_stats(23 pgs tid 28146 v 3959) v1
2013-05-16 05:15:02.139569 7faa3715e700 7 mon.4@4(peon).pg v16089272 update_from_paxos applying incremental 16089273
2013-05-16 05:15:02.139678 7faa3715e700 10 mon.4@4(peon).pg v16089273 v16089273: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3956KB/s rd, 1703KB/s wr, 374op/s
2013-05-16 05:15:02.139718 7faa3715e700 10 mon.4@4(peon).pg v16089273 map_pg_creates to 0 pgs
2013-05-16 05:15:02.139722 7faa3715e700 10 mon.4@4(peon).pg v16089273 send_pg_creates to 0 pgs
2013-05-16 05:15:02.139723 7faa3715e700 10 mon.4@4(peon).pg v16089273 update_logger
2013-05-16 05:15:02.139797 7faa3715e700 10 mon.4@4(peon).log v15758970 update_from_paxos
2013-05-16 05:15:02.139846 7faa3715e700 10 mon.4@4(peon).log v15758970 update_from_paxos version 15758970 summary v 15758970
2013-05-16 05:15:02.139896 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:05.146498 7faa3715e700 7 mon.4@4(peon).pg v16089273 update_from_paxos applying incremental 16089274
2013-05-16 05:15:05.146856 7faa3715e700 10 mon.4@4(peon).pg v16089274 v16089274: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 9890KB/s rd, 8302KB/s wr, 1338op/s
2013-05-16 05:15:05.146899 7faa3715e700 10 mon.4@4(peon).pg v16089274 map_pg_creates to 0 pgs
2013-05-16 05:15:05.146902 7faa3715e700 10 mon.4@4(peon).pg v16089274 send_pg_creates to 0 pgs
2013-05-16 05:15:05.146904 7faa3715e700 10 mon.4@4(peon).pg v16089274 update_logger
2013-05-16 05:15:05.146962 7faa3715e700 10 mon.4@4(peon).log v15758970 update_from_paxos
2013-05-16 05:15:05.146996 7faa3715e700 10 mon.4@4(peon).log v15758970 update_from_paxos version 15758970 summary v 15758970
2013-05-16 05:15:05.147157 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:05.147368 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(25 pgs tid 28212) v1 to unknown.0 :/0
2013-05-16 05:15:05.147392 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(23 pgs tid 28146) v1 to unknown.0 :/0
2013-05-16 05:15:05.418587 7faa3795f700 10 mon.4@4(peon).pg v16089274 v16089274: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 9890KB/s rd, 8302KB/s wr, 1338op/s
2013-05-16 05:15:05.418650 7faa3795f700 10 mon.4@4(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:05.418680 7faa3795f700 10 mon.4@4(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:05.418690 7faa3795f700 10 mon.4@4(peon).log v15758970 update_from_paxos
2013-05-16 05:15:05.418890 7faa3795f700 10 mon.4@4(peon).log v15758970 update_from_paxos version 15758970 summary v 15758970
2013-05-16 05:15:05.418908 7faa3795f700 10 mon.4@4(peon).log v15758970 log
2013-05-16 05:15:05.418939 7faa3795f700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:05.419089 7faa3795f700 10 mon.4@4(peon).auth v8461 auth
2013-05-16 05:15:05.799115 7faa3715e700 10 mon.4@4(peon).pg v16089274 preprocess_query pg_stats(25 pgs tid 28213 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:05.799341 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10351 request pg_stats(25 pgs tid 28213 v 3959) v1
2013-05-16 05:15:05.904548 7faa3715e700 10 mon.4@4(peon).data_health(12158) service_dispatch mon_health( service 1 op tell e 0 r 0 flags ) v1
2013-05-16 05:15:05.904557 7faa3715e700 10 mon.4@4(peon).data_health(12158) handle_tell mon_health( service 1 op tell e 0 r 0 flags ) v1
2013-05-16 05:15:06.144175 7faa3715e700 10 mon.4@4(peon).pg v16089274 preprocess_query pg_stats(18 pgs tid 28147 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:06.144267 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10352 request pg_stats(18 pgs tid 28147 v 3959) v1
2013-05-16 05:15:08.729646 7faa3715e700 10 mon.4@4(peon).log v15758971 update_from_paxos
2013-05-16 05:15:08.729853 7faa3715e700 10 mon.4@4(peon).log v15758971 update_from_paxos version 15758971 summary v 15758970
2013-05-16 05:15:08.730004 7faa3715e700 10 mon.4@4(peon).log v15758971 update_from_paxos latest full 15758970
2013-05-16 05:15:08.730075 7faa3715e700 7 mon.4@4(peon).log v15758971 update_from_paxos applying incremental log 15758971 2013-05-16 05:15:02.134427 mon.0 188.65.144.4:6789/0 13183 : [INF] pgmap v16089273: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3956KB/s rd, 1703KB/s wr, 374op/s
2013-05-16 05:15:08.730147 7faa3715e700 10 mon.4@4(peon).log v15758971 check_subs
2013-05-16 05:15:08.730174 7faa3715e700 10 mon.4@4(peon).log v15758971 check_sub client wants log-info ver 15758971
2013-05-16 05:15:08.730191 7faa3715e700 10 mon.4@4(peon).log v15758971 _create_sub_incremental level 1 ver 15758971 cur summary ver 15758971
2013-05-16 05:15:08.730329 7faa3715e700 10 mon.4@4(peon).log v15758971 _create_sub_incremental incremental message ready (1 entries)
2013-05-16 05:15:08.730348 7faa3715e700 1 mon.4@4(peon).log v15758971 check_sub sending message to client.? 188.65.144.4:0/15705 with 1 entries (version 15758971)
2013-05-16 05:15:08.730386 7faa3715e700 10 mon.4@4(peon).log v15758971 check_sub client wants log-info ver 15758971
2013-05-16 05:15:08.730405 7faa3715e700 10 mon.4@4(peon).log v15758971 _create_sub_incremental level 1 ver 15758971 cur summary ver 15758971
2013-05-16 05:15:08.730451 7faa3715e700 10 mon.4@4(peon).log v15758971 _create_sub_incremental incremental message ready (1 entries)
2013-05-16 05:15:08.730476 7faa3715e700 1 mon.4@4(peon).log v15758971 check_sub sending message to client.? 188.65.144.6:0/26623 with 1 entries (version 15758971)
2013-05-16 05:15:08.730535 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:10.133835 7faa3715e700 10 mon.4@4(peon).data_health(12158) service_dispatch mon_health( service 1 op tell e 0 r 0 flags ) v1
2013-05-16 05:15:10.133843 7faa3715e700 10 mon.4@4(peon).data_health(12158) handle_tell mon_health( service 1 op tell e 0 r 0 flags ) v1
2013-05-16 05:15:10.419647 7faa3795f700 7 mon.4@4(peon).pg v16089274 update_from_paxos applying incremental 16089275
2013-05-16 05:15:10.420114 7faa3795f700 10 mon.4@4(peon).pg v16089275 v16089275: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6336KB/s rd, 6513KB/s wr, 967op/s
2013-05-16 05:15:10.420162 7faa3795f700 10 mon.4@4(peon).pg v16089275 map_pg_creates to 0 pgs
2013-05-16 05:15:10.420165 7faa3795f700 10 mon.4@4(peon).pg v16089275 send_pg_creates to 0 pgs
2013-05-16 05:15:10.420167 7faa3795f700 10 mon.4@4(peon).pg v16089275 update_logger
2013-05-16 05:15:10.420170 7faa3795f700 10 mon.4@4(peon).pg v16089275 v16089275: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6336KB/s rd, 6513KB/s wr, 967op/s
2013-05-16 05:15:10.420339 7faa3795f700 10 mon.4@4(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:10.420390 7faa3795f700 10 mon.4@4(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:10.420395 7faa3795f700 10 mon.4@4(peon).log v15758971 update_from_paxos
2013-05-16 05:15:10.420431 7faa3795f700 10 mon.4@4(peon).log v15758971 update_from_paxos version 15758971 summary v 15758971
2013-05-16 05:15:10.420460 7faa3795f700 10 mon.4@4(peon).log v15758971 log
2013-05-16 05:15:10.420491 7faa3795f700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:10.420524 7faa3795f700 10 mon.4@4(peon).auth v8461 auth
2013-05-16 05:15:10.822298 7faa3715e700 10 mon.4@4(peon).pg v16089275 preprocess_query pg_stats(32 pgs tid 28214 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:10.822590 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10353 request pg_stats(32 pgs tid 28214 v 3959) v1
2013-05-16 05:15:10.822843 7faa3715e700 10 mon.4@4(peon).log v15758971 update_from_paxos
2013-05-16 05:15:10.822923 7faa3715e700 10 mon.4@4(peon).log v15758971 update_from_paxos version 15758971 summary v 15758971
2013-05-16 05:15:10.822968 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:10.823065 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(25 pgs tid 28213) v1 to unknown.0 :/0
2013-05-16 05:15:10.823104 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(18 pgs tid 28147) v1 to unknown.0 :/0
2013-05-16 05:15:11.144633 7faa3715e700 10 mon.4@4(peon).pg v16089275 preprocess_query pg_stats(8 pgs tid 28148 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:11.144778 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10354 request pg_stats(8 pgs tid 28148 v 3959) v1
2013-05-16 05:15:12.435778 7faa3715e700 7 mon.4@4(peon).pg v16089275 update_from_paxos applying incremental 16089276
2013-05-16 05:15:12.436372 7faa3715e700 10 mon.4@4(peon).pg v16089276 v16089276: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3626KB/s rd, 3249KB/s wr, 562op/s
2013-05-16 05:15:12.436433 7faa3715e700 10 mon.4@4(peon).pg v16089276 map_pg_creates to 0 pgs
2013-05-16 05:15:12.436437 7faa3715e700 10 mon.4@4(peon).pg v16089276 send_pg_creates to 0 pgs
2013-05-16 05:15:12.436440 7faa3715e700 10 mon.4@4(peon).pg v16089276 update_logger
2013-05-16 05:15:12.436509 7faa3715e700 10 mon.4@4(peon).log v15758971 update_from_paxos
2013-05-16 05:15:12.436551 7faa3715e700 10 mon.4@4(peon).log v15758971 update_from_paxos version 15758971 summary v 15758971
2013-05-16 05:15:12.436583 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:12.436650 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(32 pgs tid 28214) v1 to unknown.0 :/0
2013-05-16 05:15:13.639549 7faa3bd19700 0 -- 188.65.144.8:6789/0 >> 188.65.144.5:6789/0 pipe(0x3958280 sd=24 :34834 s=2 pgs=3433 cs=1 l=0).fault with nothing to send, going to standby
2013-05-16 05:15:13.755710 7faa3715e700 10 mon.4@4(peon).log v15758972 update_from_paxos
2013-05-16 05:15:13.756061 7faa3715e700 10 mon.4@4(peon).log v15758972 update_from_paxos version 15758972 summary v 15758971
2013-05-16 05:15:13.756096 7faa3715e700 10 mon.4@4(peon).log v15758972 update_from_paxos latest full 15758971
2013-05-16 05:15:13.756141 7faa3715e700 7 mon.4@4(peon).log v15758972 update_from_paxos applying incremental log 15758972 2013-05-16 05:15:05.138392 mon.0 188.65.144.4:6789/0 13184 : [INF] pgmap v16089274: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 9890KB/s rd, 8302KB/s wr, 1338op/s
2013-05-16 05:15:13.756179 7faa3715e700 7 mon.4@4(peon).log v15758972 update_from_paxos applying incremental log 15758972 2013-05-16 05:15:10.816147 mon.0 188.65.144.4:6789/0 13185 : [INF] pgmap v16089275: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6336KB/s rd, 6513KB/s wr, 967op/s
2013-05-16 05:15:13.756317 7faa3715e700 10 mon.4@4(peon).log v15758972 check_subs
2013-05-16 05:15:13.756349 7faa3715e700 10 mon.4@4(peon).log v15758972 check_sub client wants log-info ver 15758972
2013-05-16 05:15:13.756367 7faa3715e700 10 mon.4@4(peon).log v15758972 _create_sub_incremental level 1 ver 15758972 cur summary ver 15758972
2013-05-16 05:15:13.756434 7faa3715e700 10 mon.4@4(peon).log v15758972 _create_sub_incremental incremental message ready (2 entries)
2013-05-16 05:15:13.756453 7faa3715e700 1 mon.4@4(peon).log v15758972 check_sub sending message to client.? 188.65.144.4:0/15705 with 2 entries (version 15758972)
2013-05-16 05:15:13.756483 7faa3715e700 10 mon.4@4(peon).log v15758972 check_sub client wants log-info ver 15758972
2013-05-16 05:15:13.756502 7faa3715e700 10 mon.4@4(peon).log v15758972 _create_sub_incremental level 1 ver 15758972 cur summary ver 15758972
2013-05-16 05:15:13.756562 7faa3715e700 10 mon.4@4(peon).log v15758972 _create_sub_incremental incremental message ready (2 entries)
2013-05-16 05:15:13.756598 7faa3715e700 1 mon.4@4(peon).log v15758972 check_sub sending message to client.? 188.65.144.6:0/26623 with 2 entries (version 15758972)
2013-05-16 05:15:13.756869 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:15.234128 7faa3715e700 7 mon.4@4(peon).pg v16089276 update_from_paxos applying incremental 16089277
2013-05-16 05:15:15.234266 7faa3715e700 10 mon.4@4(peon).pg v16089277 v16089277: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3262KB/s rd, 1441KB/s wr, 455op/s
2013-05-16 05:15:15.234300 7faa3715e700 10 mon.4@4(peon).pg v16089277 map_pg_creates to 0 pgs
2013-05-16 05:15:15.234326 7faa3715e700 10 mon.4@4(peon).pg v16089277 send_pg_creates to 0 pgs
2013-05-16 05:15:15.234328 7faa3715e700 10 mon.4@4(peon).pg v16089277 update_logger
2013-05-16 05:15:15.234413 7faa3715e700 10 mon.4@4(peon).log v15758972 update_from_paxos
2013-05-16 05:15:15.234451 7faa3715e700 10 mon.4@4(peon).log v15758972 update_from_paxos version 15758972 summary v 15758972
2013-05-16 05:15:15.234517 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:15.234634 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(8 pgs tid 28148) v1 to unknown.0 :/0
2013-05-16 05:15:15.420840 7faa3795f700 10 mon.4@4(peon).pg v16089277 v16089277: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3262KB/s rd, 1441KB/s wr, 455op/s
2013-05-16 05:15:15.420918 7faa3795f700 10 mon.4@4(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:15.420982 7faa3795f700 10 mon.4@4(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:15.420991 7faa3795f700 10 mon.4@4(peon).log v15758972 update_from_paxos
2013-05-16 05:15:15.421026 7faa3795f700 10 mon.4@4(peon).log v15758972 update_from_paxos version 15758972 summary v 15758972
2013-05-16 05:15:15.421043 7faa3795f700 10 mon.4@4(peon).log v15758972 log
2013-05-16 05:15:15.421082 7faa3795f700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:15.421113 7faa3795f700 10 mon.4@4(peon).auth v8461 auth
2013-05-16 05:15:15.800353 7faa3715e700 10 mon.4@4(peon).pg v16089277 preprocess_query pg_stats(12 pgs tid 28215 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:15.800447 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10355 request pg_stats(12 pgs tid 28215 v 3959) v1
2013-05-16 05:15:16.794472 7faa3715e700 10 mon.4@4(peon).pg v16089277 preprocess_query pg_stats(10 pgs tid 28149 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:16.794554 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10356 request pg_stats(10 pgs tid 28149 v 3959) v1
2013-05-16 05:15:16.794661 7faa3715e700 10 mon.4@4(peon).log v15758973 update_from_paxos
2013-05-16 05:15:16.794720 7faa3715e700 10 mon.4@4(peon).log v15758973 update_from_paxos version 15758973 summary v 15758972
2013-05-16 05:15:16.794766 7faa3715e700 10 mon.4@4(peon).log v15758973 update_from_paxos latest full 15758972
2013-05-16 05:15:16.794895 7faa3715e700 7 mon.4@4(peon).log v15758973 update_from_paxos applying incremental log 15758973 2013-05-16 05:15:12.426612 mon.0 188.65.144.4:6789/0 13186 : [INF] pgmap v16089276: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3626KB/s rd, 3249KB/s wr, 562op/s
2013-05-16 05:15:16.794937 7faa3715e700 7 mon.4@4(peon).log v15758973 update_from_paxos applying incremental log 15758973 2013-05-16 05:15:15.226225 mon.0 188.65.144.4:6789/0 13187 : [INF] pgmap v16089277: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 3262KB/s rd, 1441KB/s wr, 455op/s
2013-05-16 05:15:16.794989 7faa3715e700 10 mon.4@4(peon).log v15758973 check_subs
2013-05-16 05:15:16.795014 7faa3715e700 10 mon.4@4(peon).log v15758973 check_sub client wants log-info ver 15758973
2013-05-16 05:15:16.795041 7faa3715e700 10 mon.4@4(peon).log v15758973 _create_sub_incremental level 1 ver 15758973 cur summary ver 15758973
2013-05-16 05:15:16.795117 7faa3715e700 10 mon.4@4(peon).log v15758973 _create_sub_incremental incremental message ready (2 entries)
2013-05-16 05:15:16.795136 7faa3715e700 1 mon.4@4(peon).log v15758973 check_sub sending message to client.? 188.65.144.4:0/15705 with 2 entries (version 15758973)
2013-05-16 05:15:16.795175 7faa3715e700 10 mon.4@4(peon).log v15758973 check_sub client wants log-info ver 15758973
2013-05-16 05:15:16.795271 7faa3715e700 10 mon.4@4(peon).log v15758973 _create_sub_incremental level 1 ver 15758973 cur summary ver 15758973
2013-05-16 05:15:16.795372 7faa3715e700 10 mon.4@4(peon).log v15758973 _create_sub_incremental incremental message ready (2 entries)
2013-05-16 05:15:16.795391 7faa3715e700 1 mon.4@4(peon).log v15758973 check_sub sending message to client.? 188.65.144.6:0/26623 with 2 entries (version 15758973)
2013-05-16 05:15:16.795608 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:18.191459 7faa3715e700 7 mon.4@4(peon).pg v16089277 update_from_paxos applying incremental 16089278
2013-05-16 05:15:18.191661 7faa3715e700 10 mon.4@4(peon).pg v16089278 v16089278: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 830KB/s rd, 512KB/s wr, 186op/s
2013-05-16 05:15:18.191696 7faa3715e700 10 mon.4@4(peon).pg v16089278 map_pg_creates to 0 pgs
2013-05-16 05:15:18.191699 7faa3715e700 10 mon.4@4(peon).pg v16089278 send_pg_creates to 0 pgs
2013-05-16 05:15:18.191702 7faa3715e700 10 mon.4@4(peon).pg v16089278 update_logger
2013-05-16 05:15:18.191759 7faa3715e700 10 mon.4@4(peon).log v15758973 update_from_paxos
2013-05-16 05:15:18.191805 7faa3715e700 10 mon.4@4(peon).log v15758973 update_from_paxos version 15758973 summary v 15758973
2013-05-16 05:15:18.191845 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:20.137953 7faa3715e700 10 mon.4@4(peon).log v15758974 update_from_paxos
2013-05-16 05:15:20.137990 7faa3715e700 10 mon.4@4(peon).log v15758974 update_from_paxos version 15758974 summary v 15758973
2013-05-16 05:15:20.138019 7faa3715e700 10 mon.4@4(peon).log v15758974 update_from_paxos latest full 15758973
2013-05-16 05:15:20.138062 7faa3715e700 7 mon.4@4(peon).log v15758974 update_from_paxos applying incremental log 15758974 2013-05-16 05:15:18.184056 mon.0 188.65.144.4:6789/0 13188 : [INF] pgmap v16089278: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 830KB/s rd, 512KB/s wr, 186op/s
2013-05-16 05:15:20.138136 7faa3715e700 10 mon.4@4(peon).log v15758974 check_subs
2013-05-16 05:15:20.138171 7faa3715e700 10 mon.4@4(peon).log v15758974 check_sub client wants log-info ver 15758974
2013-05-16 05:15:20.138189 7faa3715e700 10 mon.4@4(peon).log v15758974 _create_sub_incremental level 1 ver 15758974 cur summary ver 15758974
2013-05-16 05:15:20.138268 7faa3715e700 10 mon.4@4(peon).log v15758974 _create_sub_incremental incremental message ready (1 entries)
2013-05-16 05:15:20.138286 7faa3715e700 1 mon.4@4(peon).log v15758974 check_sub sending message to client.? 188.65.144.4:0/15705 with 1 entries (version 15758974)
2013-05-16 05:15:20.138317 7faa3715e700 10 mon.4@4(peon).log v15758974 check_sub client wants log-info ver 15758974
2013-05-16 05:15:20.138375 7faa3715e700 10 mon.4@4(peon).log v15758974 _create_sub_incremental level 1 ver 15758974 cur summary ver 15758974
2013-05-16 05:15:20.138438 7faa3715e700 10 mon.4@4(peon).log v15758974 _create_sub_incremental incremental message ready (1 entries)
2013-05-16 05:15:20.138456 7faa3715e700 1 mon.4@4(peon).log v15758974 check_sub sending message to client.? 188.65.144.6:0/26623 with 1 entries (version 15758974)
2013-05-16 05:15:20.138540 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:20.421627 7faa3795f700 10 mon.4@4(peon).pg v16089278 v16089278: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 830KB/s rd, 512KB/s wr, 186op/s
2013-05-16 05:15:20.421701 7faa3795f700 10 mon.4@4(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:20.421742 7faa3795f700 10 mon.4@4(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:20.421746 7faa3795f700 10 mon.4@4(peon).log v15758974 update_from_paxos
2013-05-16 05:15:20.421779 7faa3795f700 10 mon.4@4(peon).log v15758974 update_from_paxos version 15758974 summary v 15758974
2013-05-16 05:15:20.421796 7faa3795f700 10 mon.4@4(peon).log v15758974 log
2013-05-16 05:15:20.421835 7faa3795f700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:20.421876 7faa3795f700 10 mon.4@4(peon).auth v8461 auth
2013-05-16 05:15:20.801055 7faa3715e700 10 mon.4@4(peon).pg v16089278 preprocess_query pg_stats(28 pgs tid 28216 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:20.801149 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10357 request pg_stats(28 pgs tid 28216 v 3959) v1
2013-05-16 05:15:23.476117 7faa3715e700 7 mon.4@4(peon).pg v16089278 update_from_paxos applying incremental 16089279
2013-05-16 05:15:23.476238 7faa3715e700 10 mon.4@4(peon).pg v16089279 v16089279: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 245KB/s rd, 747KB/s wr, 148op/s
2013-05-16 05:15:23.476273 7faa3715e700 10 mon.4@4(peon).pg v16089279 map_pg_creates to 0 pgs
2013-05-16 05:15:23.476277 7faa3715e700 10 mon.4@4(peon).pg v16089279 send_pg_creates to 0 pgs
2013-05-16 05:15:23.476278 7faa3715e700 10 mon.4@4(peon).pg v16089279 update_logger
2013-05-16 05:15:23.476281 7faa3715e700 10 mon.4@4(peon).pg v16089279 preprocess_query pg_stats(18 pgs tid 28150 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:23.476378 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10358 request pg_stats(18 pgs tid 28150 v 3959) v1
2013-05-16 05:15:23.476460 7faa3715e700 10 mon.4@4(peon).log v15758974 update_from_paxos
2013-05-16 05:15:23.476514 7faa3715e700 10 mon.4@4(peon).log v15758974 update_from_paxos version 15758974 summary v 15758974
2013-05-16 05:15:23.476553 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:23.476613 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(12 pgs tid 28215) v1 to unknown.0 :/0
2013-05-16 05:15:23.476637 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(10 pgs tid 28149) v1 to unknown.0 :/0
2013-05-16 05:15:25.422110 7faa3795f700 10 mon.4@4(peon).pg v16089279 v16089279: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 245KB/s rd, 747KB/s wr, 148op/s
2013-05-16 05:15:25.422214 7faa3795f700 10 mon.4@4(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:25.422268 7faa3795f700 10 mon.4@4(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:25.422275 7faa3795f700 10 mon.4@4(peon).log v15758974 update_from_paxos
2013-05-16 05:15:25.422454 7faa3795f700 10 mon.4@4(peon).log v15758974 update_from_paxos version 15758974 summary v 15758974
2013-05-16 05:15:25.422477 7faa3795f700 10 mon.4@4(peon).log v15758974 log
2013-05-16 05:15:25.422510 7faa3795f700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:25.422550 7faa3795f700 10 mon.4@4(peon).auth v8461 auth
2013-05-16 05:15:25.801823 7faa3715e700 10 mon.4@4(peon).pg v16089279 preprocess_query pg_stats(33 pgs tid 28217 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:25.801916 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10359 request pg_stats(33 pgs tid 28217 v 3959) v1
2013-05-16 05:15:26.146704 7faa3715e700 10 mon.4@4(peon).pg v16089279 preprocess_query pg_stats(22 pgs tid 28151 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:26.146784 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10360 request pg_stats(22 pgs tid 28151 v 3959) v1
2013-05-16 05:15:28.292083 7faa3715e700 10 mon.4@4(peon).data_health(12158) service_dispatch mon_health( service 1 op tell e 0 r 0 flags none ) v1
2013-05-16 05:15:28.292088 7faa3715e700 10 mon.4@4(peon).data_health(12158) handle_tell mon_health( service 1 op tell e 0 r 0 flags none ) v1
2013-05-16 05:15:28.527480 7faa3715e700 7 mon.4@4(peon).pg v16089279 update_from_paxos applying incremental 16089280
2013-05-16 05:15:28.527970 7faa3715e700 10 mon.4@4(peon).pg v16089280 v16089280: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 2214KB/s rd, 1756KB/s wr, 421op/s
2013-05-16 05:15:28.528021 7faa3715e700 10 mon.4@4(peon).pg v16089280 map_pg_creates to 0 pgs
2013-05-16 05:15:28.528024 7faa3715e700 10 mon.4@4(peon).pg v16089280 send_pg_creates to 0 pgs
2013-05-16 05:15:28.528026 7faa3715e700 10 mon.4@4(peon).pg v16089280 update_logger
2013-05-16 05:15:28.528093 7faa3715e700 10 mon.4@4(peon).log v15758974 update_from_paxos
2013-05-16 05:15:28.528149 7faa3715e700 10 mon.4@4(peon).log v15758974 update_from_paxos version 15758974 summary v 15758974
2013-05-16 05:15:28.528229 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:28.528289 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(28 pgs tid 28216) v1 to unknown.0 :/0
2013-05-16 05:15:28.528314 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(18 pgs tid 28150) v1 to unknown.0 :/0
2013-05-16 05:15:30.422766 7faa3795f700 10 mon.4@4(peon).pg v16089280 v16089280: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 2214KB/s rd, 1756KB/s wr, 421op/s
2013-05-16 05:15:30.422844 7faa3795f700 10 mon.4@4(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:30.422888 7faa3795f700 10 mon.4@4(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:30.422897 7faa3795f700 10 mon.4@4(peon).log v15758974 update_from_paxos
2013-05-16 05:15:30.422941 7faa3795f700 10 mon.4@4(peon).log v15758974 update_from_paxos version 15758974 summary v 15758974
2013-05-16 05:15:30.422957 7faa3795f700 10 mon.4@4(peon).log v15758974 log
2013-05-16 05:15:30.422997 7faa3795f700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:30.423049 7faa3795f700 10 mon.4@4(peon).auth v8461 auth
2013-05-16 05:15:32.568329 7faa3715e700 10 mon.4@4(peon).pg v16089280 preprocess_query pg_stats(34 pgs tid 28218 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:32.568433 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10361 request pg_stats(34 pgs tid 28218 v 3959) v1
2013-05-16 05:15:32.568486 7faa3715e700 10 mon.4@4(peon).pg v16089280 preprocess_query pg_stats(19 pgs tid 28152 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:32.568542 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10362 request pg_stats(19 pgs tid 28152 v 3959) v1
2013-05-16 05:15:32.568617 7faa3715e700 10 mon.4@4(peon).log v15758975 update_from_paxos
2013-05-16 05:15:32.568665 7faa3715e700 10 mon.4@4(peon).log v15758975 update_from_paxos version 15758975 summary v 15758974
2013-05-16 05:15:32.568696 7faa3715e700 10 mon.4@4(peon).log v15758975 update_from_paxos latest full 15758974
2013-05-16 05:15:32.568733 7faa3715e700 7 mon.4@4(peon).log v15758975 update_from_paxos applying incremental log 15758975 2013-05-16 05:15:23.470978 mon.0 188.65.144.4:6789/0 13189 : [INF] pgmap v16089279: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 245KB/s rd, 747KB/s wr, 148op/s
2013-05-16 05:15:32.568798 7faa3715e700 10 mon.4@4(peon).log v15758975 check_subs
2013-05-16 05:15:32.568825 7faa3715e700 10 mon.4@4(peon).log v15758975 check_sub client wants log-info ver 15758975
2013-05-16 05:15:32.568842 7faa3715e700 10 mon.4@4(peon).log v15758975 _create_sub_incremental level 1 ver 15758975 cur summary ver 15758975
2013-05-16 05:15:32.569132 7faa3715e700 10 mon.4@4(peon).log v15758975 _create_sub_incremental incremental message ready (1 entries)
2013-05-16 05:15:32.569267 7faa3715e700 1 mon.4@4(peon).log v15758975 check_sub sending message to client.? 188.65.144.4:0/15705 with 1 entries (version 15758975)
2013-05-16 05:15:32.569303 7faa3715e700 10 mon.4@4(peon).log v15758975 check_sub client wants log-info ver 15758975
2013-05-16 05:15:32.569328 7faa3715e700 10 mon.4@4(peon).log v15758975 _create_sub_incremental level 1 ver 15758975 cur summary ver 15758975
2013-05-16 05:15:32.569375 7faa3715e700 10 mon.4@4(peon).log v15758975 _create_sub_incremental incremental message ready (1 entries)
2013-05-16 05:15:32.569402 7faa3715e700 1 mon.4@4(peon).log v15758975 check_sub sending message to client.? 188.65.144.6:0/26623 with 1 entries (version 15758975)
2013-05-16 05:15:32.569464 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:35.423312 7faa3795f700 10 mon.4@4(peon).pg v16089280 v16089280: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 2214KB/s rd, 1756KB/s wr, 421op/s
2013-05-16 05:15:35.423394 7faa3795f700 10 mon.4@4(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:35.423452 7faa3795f700 10 mon.4@4(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:35.423457 7faa3795f700 10 mon.4@4(peon).log v15758975 update_from_paxos
2013-05-16 05:15:35.423516 7faa3795f700 10 mon.4@4(peon).log v15758975 update_from_paxos version 15758975 summary v 15758975
2013-05-16 05:15:35.423540 7faa3795f700 10 mon.4@4(peon).log v15758975 log
2013-05-16 05:15:35.423578 7faa3795f700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:35.423858 7faa3795f700 10 mon.4@4(peon).auth v8461 auth
2013-05-16 05:15:35.803052 7faa3715e700 10 mon.4@4(peon).pg v16089280 preprocess_query pg_stats(38 pgs tid 28219 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:35.803152 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10363 request pg_stats(38 pgs tid 28219 v 3959) v1
2013-05-16 05:15:37.712281 7faa3715e700 7 mon.4@4(peon).pg v16089280 update_from_paxos applying incremental 16089281
2013-05-16 05:15:37.713640 7faa3715e700 10 mon.4@4(peon).pg v16089281 v16089281: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 7640KB/s rd, 2360KB/s wr, 899op/s
2013-05-16 05:15:37.713756 7faa3715e700 10 mon.4@4(peon).pg v16089281 map_pg_creates to 0 pgs
2013-05-16 05:15:37.713760 7faa3715e700 10 mon.4@4(peon).pg v16089281 send_pg_creates to 0 pgs
2013-05-16 05:15:37.713762 7faa3715e700 10 mon.4@4(peon).pg v16089281 update_logger
2013-05-16 05:15:37.713765 7faa3715e700 10 mon.4@4(peon).pg v16089281 preprocess_query pg_stats(23 pgs tid 28153 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:37.713857 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10364 request pg_stats(23 pgs tid 28153 v 3959) v1
2013-05-16 05:15:37.713935 7faa3715e700 10 mon.4@4(peon).log v15758975 update_from_paxos
2013-05-16 05:15:37.713971 7faa3715e700 10 mon.4@4(peon).log v15758975 update_from_paxos version 15758975 summary v 15758975
2013-05-16 05:15:37.714041 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:37.714160 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(33 pgs tid 28217) v1 to unknown.0 :/0
2013-05-16 05:15:37.714207 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(22 pgs tid 28151) v1 to unknown.0 :/0
2013-05-16 05:15:37.958782 7faa3795f700 10 mon.4@4(peon).data_health(12158) service_tick
2013-05-16 05:15:37.958812 7faa3795f700 0 mon.4@4(peon).data_health(12158) update_stats avail 54% total 30758848 used 12518828 avail 16677552
2013-05-16 05:15:37.958819 7faa3795f700 10 mon.4@4(peon).data_health(12158) share_stats
2013-05-16 05:15:40.079976 7faa3715e700 7 mon.4@4(peon).pg v16089281 update_from_paxos applying incremental 16089282
2013-05-16 05:15:40.080705 7faa3715e700 10 mon.4@4(peon).pg v16089282 v16089282: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 10453KB/s rd, 3755KB/s wr, 1320op/s
2013-05-16 05:15:40.080772 7faa3715e700 10 mon.4@4(peon).pg v16089282 map_pg_creates to 0 pgs
2013-05-16 05:15:40.080776 7faa3715e700 10 mon.4@4(peon).pg v16089282 send_pg_creates to 0 pgs
2013-05-16 05:15:40.080778 7faa3715e700 10 mon.4@4(peon).pg v16089282 update_logger
2013-05-16 05:15:40.080842 7faa3715e700 10 mon.4@4(peon).log v15758975 update_from_paxos
2013-05-16 05:15:40.080881 7faa3715e700 10 mon.4@4(peon).log v15758975 update_from_paxos version 15758975 summary v 15758975
2013-05-16 05:15:40.080916 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:40.080977 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(34 pgs tid 28218) v1 to unknown.0 :/0
2013-05-16 05:15:40.081007 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(19 pgs tid 28152) v1 to unknown.0 :/0
2013-05-16 05:15:40.081026 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(38 pgs tid 28219) v1 to unknown.0 :/0
2013-05-16 05:15:40.081065 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(23 pgs tid 28153) v1 to unknown.0 :/0
2013-05-16 05:15:40.424137 7faa3795f700 10 mon.4@4(peon).pg v16089282 v16089282: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 10453KB/s rd, 3755KB/s wr, 1320op/s
2013-05-16 05:15:40.424275 7faa3795f700 10 mon.4@4(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:40.424310 7faa3795f700 10 mon.4@4(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:40.424316 7faa3795f700 10 mon.4@4(peon).log v15758975 update_from_paxos
2013-05-16 05:15:40.424351 7faa3795f700 10 mon.4@4(peon).log v15758975 update_from_paxos version 15758975 summary v 15758975
2013-05-16 05:15:40.424368 7faa3795f700 10 mon.4@4(peon).log v15758975 log
2013-05-16 05:15:40.424412 7faa3795f700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:40.424459 7faa3795f700 10 mon.4@4(peon).auth v8461 auth
2013-05-16 05:15:40.803306 7faa3715e700 10 mon.4@4(peon).pg v16089282 preprocess_query pg_stats(29 pgs tid 28220 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:40.803411 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10365 request pg_stats(29 pgs tid 28220 v 3959) v1
2013-05-16 05:15:41.881974 7faa3715e700 10 mon.4@4(peon).pg v16089282 preprocess_query pg_stats(18 pgs tid 28154 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:41.882165 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10366 request pg_stats(18 pgs tid 28154 v 3959) v1
2013-05-16 05:15:41.882361 7faa3715e700 10 mon.4@4(peon).log v15758976 update_from_paxos
2013-05-16 05:15:41.882478 7faa3715e700 10 mon.4@4(peon).log v15758976 update_from_paxos version 15758976 summary v 15758975
2013-05-16 05:15:41.882518 7faa3715e700 10 mon.4@4(peon).log v15758976 update_from_paxos latest full 15758975
2013-05-16 05:15:41.883091 7faa3715e700 7 mon.4@4(peon).log v15758976 update_from_paxos applying incremental log 15758976 2013-05-16 05:15:28.522542 mon.0 188.65.144.4:6789/0 13190 : [INF] pgmap v16089280: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 2214KB/s rd, 1756KB/s wr, 421op/s
2013-05-16 05:15:41.883545 7faa3715e700 7 mon.4@4(peon).log v15758976 update_from_paxos applying incremental log 15758976 2013-05-16 05:15:37.702942 mon.0 188.65.144.4:6789/0 13191 : [INF] pgmap v16089281: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 7640KB/s rd, 2360KB/s wr, 899op/s
2013-05-16 05:15:41.883710 7faa3715e700 10 mon.4@4(peon).log v15758976 check_subs
2013-05-16 05:15:41.883737 7faa3715e700 10 mon.4@4(peon).log v15758976 check_sub client wants log-info ver 15758976
2013-05-16 05:15:41.884511 7faa3715e700 10 mon.4@4(peon).log v15758976 _create_sub_incremental level 1 ver 15758976 cur summary ver 15758976
2013-05-16 05:15:41.884823 7faa3715e700 10 mon.4@4(peon).log v15758976 _create_sub_incremental incremental message ready (2 entries)
2013-05-16 05:15:41.884849 7faa3715e700 1 mon.4@4(peon).log v15758976 check_sub sending message to client.? 188.65.144.4:0/15705 with 2 entries (version 15758976)
2013-05-16 05:15:41.885047 7faa3715e700 10 mon.4@4(peon).log v15758976 check_sub client wants log-info ver 15758976
2013-05-16 05:15:41.885090 7faa3715e700 10 mon.4@4(peon).log v15758976 _create_sub_incremental level 1 ver 15758976 cur summary ver 15758976
2013-05-16 05:15:41.885243 7faa3715e700 10 mon.4@4(peon).log v15758976 _create_sub_incremental incremental message ready (2 entries)
2013-05-16 05:15:41.885276 7faa3715e700 1 mon.4@4(peon).log v15758976 check_sub sending message to client.? 188.65.144.6:0/26623 with 2 entries (version 15758976)
2013-05-16 05:15:41.885405 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:45.424909 7faa3795f700 10 mon.4@4(peon).pg v16089282 v16089282: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 10453KB/s rd, 3755KB/s wr, 1320op/s
2013-05-16 05:15:45.424991 7faa3795f700 10 mon.4@4(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:45.425179 7faa3795f700 10 mon.4@4(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:45.425188 7faa3795f700 10 mon.4@4(peon).log v15758976 update_from_paxos
2013-05-16 05:15:45.425266 7faa3795f700 10 mon.4@4(peon).log v15758976 update_from_paxos version 15758976 summary v 15758976
2013-05-16 05:15:45.425291 7faa3795f700 10 mon.4@4(peon).log v15758976 log
2013-05-16 05:15:45.425345 7faa3795f700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:45.425391 7faa3795f700 10 mon.4@4(peon).auth v8461 auth
2013-05-16 05:15:47.355915 7faa3715e700 7 mon.4@4(peon).pg v16089282 update_from_paxos applying incremental 16089283
2013-05-16 05:15:47.356029 7faa3715e700 10 mon.4@4(peon).pg v16089283 v16089283: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 7797KB/s rd, 4129KB/s wr, 1105op/s
2013-05-16 05:15:47.356061 7faa3715e700 10 mon.4@4(peon).pg v16089283 map_pg_creates to 0 pgs
2013-05-16 05:15:47.356064 7faa3715e700 10 mon.4@4(peon).pg v16089283 send_pg_creates to 0 pgs
2013-05-16 05:15:47.356067 7faa3715e700 10 mon.4@4(peon).pg v16089283 update_logger
2013-05-16 05:15:47.356070 7faa3715e700 10 mon.4@4(peon).pg v16089283 preprocess_query pg_stats(32 pgs tid 28221 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:47.356159 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10367 request pg_stats(32 pgs tid 28221 v 3959) v1
2013-05-16 05:15:47.356240 7faa3715e700 10 mon.4@4(peon).pg v16089283 preprocess_query pg_stats(21 pgs tid 28155 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:47.356300 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10368 request pg_stats(21 pgs tid 28155 v 3959) v1
2013-05-16 05:15:47.356497 7faa3715e700 10 mon.4@4(peon).log v15758976 update_from_paxos
2013-05-16 05:15:47.356539 7faa3715e700 10 mon.4@4(peon).log v15758976 update_from_paxos version 15758976 summary v 15758976
2013-05-16 05:15:47.356583 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:50.425929 7faa3795f700 7 mon.4@4(peon).pg v16089283 update_from_paxos applying incremental 16089284
2013-05-16 05:15:50.426570 7faa3795f700 10 mon.4@4(peon).pg v16089284 v16089284: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 4986KB/s rd, 4777KB/s wr, 654op/s
2013-05-16 05:15:50.426808 7faa3795f700 10 mon.4@4(peon).pg v16089284 map_pg_creates to 0 pgs
2013-05-16 05:15:50.426816 7faa3795f700 10 mon.4@4(peon).pg v16089284 send_pg_creates to 0 pgs
2013-05-16 05:15:50.426818 7faa3795f700 10 mon.4@4(peon).pg v16089284 update_logger
2013-05-16 05:15:50.426821 7faa3795f700 10 mon.4@4(peon).pg v16089284 v16089284: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 4986KB/s rd, 4777KB/s wr, 654op/s
2013-05-16 05:15:50.426879 7faa3795f700 10 mon.4@4(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:50.426916 7faa3795f700 10 mon.4@4(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:50.426928 7faa3795f700 10 mon.4@4(peon).log v15758976 update_from_paxos
2013-05-16 05:15:50.426964 7faa3795f700 10 mon.4@4(peon).log v15758976 update_from_paxos version 15758976 summary v 15758976
2013-05-16 05:15:50.426981 7faa3795f700 10 mon.4@4(peon).log v15758976 log
2013-05-16 05:15:50.427133 7faa3795f700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:50.427186 7faa3795f700 10 mon.4@4(peon).auth v8461 auth
2013-05-16 05:15:51.358378 7faa3715e700 10 mon.4@4(peon).pg v16089284 preprocess_query pg_stats(34 pgs tid 28222 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:51.358527 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10369 request pg_stats(34 pgs tid 28222 v 3959) v1
2013-05-16 05:15:51.358593 7faa3715e700 10 mon.4@4(peon).pg v16089284 preprocess_query pg_stats(29 pgs tid 28156 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:51.358686 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10370 request pg_stats(29 pgs tid 28156 v 3959) v1
2013-05-16 05:15:51.358828 7faa3715e700 10 mon.4@4(peon).log v15758976 update_from_paxos
2013-05-16 05:15:51.359045 7faa3715e700 10 mon.4@4(peon).log v15758976 update_from_paxos version 15758976 summary v 15758976
2013-05-16 05:15:51.359248 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:51.359329 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(29 pgs tid 28220) v1 to unknown.0 :/0
2013-05-16 05:15:51.359389 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(18 pgs tid 28154) v1 to unknown.0 :/0
2013-05-16 05:15:51.359412 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(32 pgs tid 28221) v1 to unknown.0 :/0
2013-05-16 05:15:51.359448 7faa3715e700 10 mon.4@4(peon) e8 handle_route pg_stats_ack(21 pgs tid 28155) v1 to unknown.0 :/0
2013-05-16 05:15:52.971834 7faa3715e700 10 mon.4@4(peon).data_health(12158) service_dispatch mon_health( service 1 op tell e 0 r 0 flags ) v1
2013-05-16 05:15:52.971841 7faa3715e700 10 mon.4@4(peon).data_health(12158) handle_tell mon_health( service 1 op tell e 0 r 0 flags ) v1
2013-05-16 05:15:53.849082 7faa3715e700 10 mon.4@4(peon).log v15758977 update_from_paxos
2013-05-16 05:15:53.849120 7faa3715e700 10 mon.4@4(peon).log v15758977 update_from_paxos version 15758977 summary v 15758976
2013-05-16 05:15:53.849168 7faa3715e700 10 mon.4@4(peon).log v15758977 update_from_paxos latest full 15758976
2013-05-16 05:15:53.849252 7faa3715e700 7 mon.4@4(peon).log v15758977 update_from_paxos applying incremental log 15758977 2013-05-16 05:15:40.070153 mon.0 188.65.144.4:6789/0 13192 : [INF] pgmap v16089282: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 10453KB/s rd, 3755KB/s wr, 1320op/s
2013-05-16 05:15:53.849291 7faa3715e700 7 mon.4@4(peon).log v15758977 update_from_paxos applying incremental log 15758977 2013-05-16 05:15:47.338721 mon.0 188.65.144.4:6789/0 13193 : [INF] pgmap v16089283: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 7797KB/s rd, 4129KB/s wr, 1105op/s
2013-05-16 05:15:53.849368 7faa3715e700 10 mon.4@4(peon).log v15758977 check_subs
2013-05-16 05:15:53.849392 7faa3715e700 10 mon.4@4(peon).log v15758977 check_sub client wants log-info ver 15758977
2013-05-16 05:15:53.849418 7faa3715e700 10 mon.4@4(peon).log v15758977 _create_sub_incremental level 1 ver 15758977 cur summary ver 15758977
2013-05-16 05:15:53.849634 7faa3715e700 10 mon.4@4(peon).log v15758977 _create_sub_incremental incremental message ready (2 entries)
2013-05-16 05:15:53.849657 7faa3715e700 1 mon.4@4(peon).log v15758977 check_sub sending message to client.? 188.65.144.4:0/15705 with 2 entries (version 15758977)
2013-05-16 05:15:53.849696 7faa3715e700 10 mon.4@4(peon).log v15758977 check_sub client wants log-info ver 15758977
2013-05-16 05:15:53.849730 7faa3715e700 10 mon.4@4(peon).log v15758977 _create_sub_incremental level 1 ver 15758977 cur summary ver 15758977
2013-05-16 05:15:53.849808 7faa3715e700 10 mon.4@4(peon).log v15758977 _create_sub_incremental incremental message ready (2 entries)
2013-05-16 05:15:53.849826 7faa3715e700 1 mon.4@4(peon).log v15758977 check_sub sending message to client.? 188.65.144.6:0/26623 with 2 entries (version 15758977)
2013-05-16 05:15:53.849895 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:55.427451 7faa3795f700 10 mon.4@4(peon).pg v16089284 v16089284: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 4986KB/s rd, 4777KB/s wr, 654op/s
2013-05-16 05:15:55.427553 7faa3795f700 10 mon.4@4(peon).mds e4483 e4483: 1/1/1 up {0=0=up:active}
2013-05-16 05:15:55.427594 7faa3795f700 10 mon.4@4(peon).osd e3959 e3959: 7 osds: 7 up, 7 in
2013-05-16 05:15:55.427599 7faa3795f700 10 mon.4@4(peon).log v15758977 update_from_paxos
2013-05-16 05:15:55.427641 7faa3795f700 10 mon.4@4(peon).log v15758977 update_from_paxos version 15758977 summary v 15758977
2013-05-16 05:15:55.427666 7faa3795f700 10 mon.4@4(peon).log v15758977 log
2013-05-16 05:15:55.427867 7faa3795f700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:15:55.427901 7faa3795f700 10 mon.4@4(peon).auth v8461 auth
2013-05-16 05:15:57.249093 7faa3715e700 7 mon.4@4(peon).pg v16089284 update_from_paxos applying incremental 16089285
2013-05-16 05:15:57.249602 7faa3715e700 10 mon.4@4(peon).pg v16089285 v16089285: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6541KB/s rd, 4026KB/s wr, 761op/s
2013-05-16 05:15:57.249661 7faa3715e700 10 mon.4@4(peon).pg v16089285 map_pg_creates to 0 pgs
2013-05-16 05:15:57.249664 7faa3715e700 10 mon.4@4(peon).pg v16089285 send_pg_creates to 0 pgs
2013-05-16 05:15:57.249667 7faa3715e700 10 mon.4@4(peon).pg v16089285 update_logger
2013-05-16 05:15:57.249670 7faa3715e700 10 mon.4@4(peon).pg v16089285 preprocess_query pg_stats(32 pgs tid 28223 v 3959) v1 from osd.6 188.65.144.10:6800/23197
2013-05-16 05:15:57.250253 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10371 request pg_stats(32 pgs tid 28223 v 3959) v1
2013-05-16 05:15:57.250422 7faa3715e700 10 mon.4@4(peon).pg v16089285 preprocess_query pg_stats(25 pgs tid 28157 v 3959) v1 from osd.3 188.65.144.7:6801/20087
2013-05-16 05:15:57.250595 7faa3715e700 10 mon.4@4(peon) e8 forward_request 10372 request pg_stats(25 pgs tid 28157 v 3959) v1
2013-05-16 05:15:57.250677 7faa3715e700 10 mon.4@4(peon).log v15758977 update_from_paxos
2013-05-16 05:15:57.250738 7faa3715e700 10 mon.4@4(peon).log v15758977 update_from_paxos version 15758977 summary v 15758977
2013-05-16 05:15:57.250784 7faa3715e700 10 mon.4@4(peon).auth v8461 update_from_paxos
2013-05-16 05:16:04.187097 7faa3715e700 7 mon.4@4(peon).log v15758978 update_from_paxos applying incremental log 15758978 2013-05-16 05:15:51.354050 mon.0 188.65.144.4:6789/0 13194 : [INF] pgmap v16089284: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 4986KB/s rd, 4777KB/s wr, 654op/s
2013-05-16 05:16:04.187144 7faa3715e700 7 mon.4@4(peon).log v15758978 update_from_paxos applying incremental log 15758978 2013-05-16 05:15:57.238066 mon.0 188.65.144.4:6789/0 13195 : [INF] pgmap v16089285: 768 pgs: 768 active+clean; 1534 GB data, 3066 GB used, 3107 GB / 6359 GB avail; 6541KB/s rd, 4026KB/s wr, 761op/s