Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on 54.37.30.144+ceph_ansible_docker_centos7__6d58dfb0-9e5a-4a7a-84b6-770492124873 (libvirt vagrant centos7) in workspace /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt
Cloning the remote Git repository
Cloning repository https://github.com/ceph/ceph.git
 > git init /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt # timeout=10
Fetching upstream changes from https://github.com/ceph/ceph.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/ceph/ceph.git +refs/heads/*:refs/remotes/origin/* # timeout=20
 > git config remote.origin.url https://github.com/ceph/ceph.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/ceph/ceph.git # timeout=10
Fetching upstream changes from https://github.com/ceph/ceph.git
 > git fetch --tags --progress https://github.com/ceph/ceph.git +refs/heads/*:refs/remotes/origin/* # timeout=20
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 235f2119010484c12c5bd29421aeef7d44df38a1 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 235f2119010484c12c5bd29421aeef7d44df38a1 # timeout=20
Commit message: "Merge pull request #21002 from smithfarm/wip-23437"
 > git rev-list 235f2119010484c12c5bd29421aeef7d44df38a1 # timeout=10
[EnvInject] - Inject global passwords.
[EnvInject] - Mask passwords that will be passed as build parameters.
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
SUBCOMMAND=lvm
DISTRO=centos7
SCENARIO=dmcrypt
CEPH_BRANCH=master
OBJECTSTORE=bluestore
[EnvInject] - Variables injected successfully.
[ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt] $ /bin/bash /tmp/jenkins7208727664354420319.sh
++ mktemp -td venv.XXXXXXXXXX
+ TEMPVENV=/tmp/venv.nAG333WFaE
+ VENV=/tmp/venv.nAG333WFaE/bin
+ set -ex
++ mktemp -td tox.XXXXXXXXXX
+ WORKDIR=/tmp/tox.45zLJjIuoI
+ pkgs=("tox")
+ install_python_packages 'pkgs[@]'
+ virtualenv /tmp/venv.nAG333WFaE
New python executable in /tmp/venv.nAG333WFaE/bin/python
Installing Setuptools...done.
Installing Pip...done.
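The `mktemp -td venv.XXXXXXXXXX` / `mktemp -td tox.XXXXXXXXXX` calls traced above set up the job's throwaway virtualenv and tox work directories. A minimal standalone sketch of the same pattern (the `make_workdirs` and `cleanup_workdirs` names, and the cleanup step itself, are mine; the job script leaves the directories in place for the later build steps):

```shell
# mktemp -td creates a uniquely named directory under $TMPDIR (default /tmp)
# from the given template, e.g. /tmp/venv.nAG333WFaE.
make_workdirs() {
    TEMPVENV=$(mktemp -td venv.XXXXXXXXXX)
    WORKDIR=$(mktemp -td tox.XXXXXXXXXX)
    VENV="$TEMPVENV/bin"
}

# Added for local use: remove both directories again,
# e.g. via `trap cleanup_workdirs EXIT`.
cleanup_workdirs() {
    rm -rf "$TEMPVENV" "$WORKDIR"
}

make_workdirs
echo "venv dir: $TEMPVENV"
echo "tox workdir: $WORKDIR"
```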
+ PIP_SDIST_INDEX=/home/jenkins-build/.cache/pip
+ mkdir -p /home/jenkins-build/.cache/pip
+ echo 'Ensuring latest pip is installed'
Ensuring latest pip is installed
+ /tmp/venv.nAG333WFaE/bin/pip install --upgrade --exists-action=i --download=/home/jenkins-build/.cache/pip pip
Downloading/unpacking pip
  File was already downloaded /home/jenkins-build/.cache/pip/pip-9.0.3.tar.gz
  Running setup.py egg_info for package pip
    /usr/lib64/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'python_requires'
      warnings.warn(msg)
    warning: no previously-included files found matching '.coveragerc'
    warning: no previously-included files found matching '.mailmap'
    warning: no previously-included files found matching '.travis.yml'
    warning: no previously-included files found matching '.landscape.yml'
    warning: no previously-included files found matching 'pip/_vendor/Makefile'
    warning: no previously-included files found matching 'tox.ini'
    warning: no previously-included files found matching 'dev-requirements.txt'
    warning: no previously-included files found matching 'appveyor.yml'
    no previously-included directories found matching '.github'
    no previously-included directories found matching '.travis'
    no previously-included directories found matching 'docs/_build'
    no previously-included directories found matching 'contrib'
    no previously-included directories found matching 'tasks'
    no previously-included directories found matching 'tests'
Successfully downloaded pip
Cleaning up...
+ /tmp/venv.nAG333WFaE/bin/pip install --upgrade --exists-action=i --find-links=file:///home/jenkins-build/.cache/pip --no-index pip
Ignoring indexes: https://pypi.python.org/simple/
Unpacking /home/jenkins-build/.cache/pip/pip-9.0.3.tar.gz
  Running setup.py egg_info for package pip
    /usr/lib64/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'python_requires'
      warnings.warn(msg)
    (same 'no previously-included files' warnings as for the first pip run above)
Installing collected packages: pip
  Found existing installation: pip 1.4.1
    Uninstalling pip:
      Successfully uninstalled pip
  Running setup.py install for pip
    /usr/lib64/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'python_requires'
      warnings.warn(msg)
    (same 'no previously-included files' warnings as for the first pip run above)
    Installing pip script to /tmp/venv.nAG333WFaE/bin
    Installing pip2.7 script to /tmp/venv.nAG333WFaE/bin
    Installing pip2 script to /tmp/venv.nAG333WFaE/bin
Successfully installed pip
Cleaning up...
+ echo 'Updating setuptools'
Updating setuptools
+ /tmp/venv.nAG333WFaE/bin/pip install --upgrade --exists-action=i --download=/home/jenkins-build/.cache/pip setuptools
DEPRECATION: pip install --download has been deprecated and will be removed in the future. Pip now has a download command that should be used instead.
Collecting setuptools
  File was already downloaded /home/jenkins-build/.cache/pip/setuptools-39.0.1-py2.py3-none-any.whl
Successfully downloaded setuptools
+ pkgs=("${!1}")
+ for package in '${pkgs[@]}'
+ echo tox
tox
+ /tmp/venv.nAG333WFaE/bin/pip install --upgrade --exists-action=i --download=/home/jenkins-build/.cache/pip tox
DEPRECATION: pip install --download has been deprecated and will be removed in the future. Pip now has a download command that should be used instead.
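The pip invocations traced above follow a two-phase, cache-first pattern: each package is first downloaded into a shared cache directory, then installed with the index disabled so only the cache is consulted and repeated builds never hit PyPI. A dry-run sketch of that pattern (the `cached_pip_install` helper name is mine; `pip install --download` is the pip-9-era spelling used in this log, deprecated in favour of `pip download` as the DEPRECATION notices above say):

```shell
# Shared sdist/wheel cache directory, as in the job script.
PIP_SDIST_INDEX="$HOME/.cache/pip"

# Echo (rather than execute) the two pip phases the job performs per package.
cached_pip_install() {
    local pip="$1" package="$2"
    mkdir -p "$PIP_SDIST_INDEX"
    # Phase 1: populate the local cache (modern pip: `pip download -d ...`).
    echo "$pip install --upgrade --exists-action=i --download=$PIP_SDIST_INDEX $package"
    # Phase 2: install strictly from the cache, never from an index.
    echo "$pip install --upgrade --exists-action=i --find-links=file://$PIP_SDIST_INDEX --no-index $package"
}

cached_pip_install /tmp/venv.nAG333WFaE/bin/pip tox
```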
Collecting tox
  File was already downloaded /home/jenkins-build/.cache/pip/tox-2.9.1-py2.py3-none-any.whl
Collecting virtualenv>=1.11.2; python_version != "3.2" (from tox)
  File was already downloaded /home/jenkins-build/.cache/pip/virtualenv-15.2.0-py2.py3-none-any.whl
Collecting pluggy<1.0,>=0.3.0 (from tox)
  Saved /home/jenkins-build/.cache/pip/pluggy-0.6.0-py2.py3-none-any.whl
Collecting six (from tox)
  File was already downloaded /home/jenkins-build/.cache/pip/six-1.11.0-py2.py3-none-any.whl
Collecting py>=1.4.17 (from tox)
  File was already downloaded /home/jenkins-build/.cache/pip/py-1.5.3-py2.py3-none-any.whl
Successfully downloaded tox virtualenv pluggy six py
+ /tmp/venv.nAG333WFaE/bin/pip install --upgrade --exists-action=i --find-links=file:///home/jenkins-build/.cache/pip --no-index tox
Collecting tox
Collecting virtualenv>=1.11.2; python_version != "3.2" (from tox)
Collecting pluggy<1.0,>=0.3.0 (from tox)
Collecting six (from tox)
Collecting py>=1.4.17 (from tox)
Installing collected packages: virtualenv, pluggy, six, py, tox
Successfully installed pluggy-0.6.0 py-1.5.3 six-1.11.0 tox-2.9.1 virtualenv-15.2.0
+ delete_libvirt_vms
++ sudo virsh list --all --name
+ libvirt_vms=
+ sudo find /var/lib/libvirt/images/ -type f -delete
+ sudo virsh pool-refresh default
Pool default refreshed
+ clear_libvirt_networks
++ sudo virsh net-list --all --name
+ networks=
+ restart_libvirt_services
+ test -f /etc/redhat-release
+ sudo service libvirtd restart
Redirecting to /bin/systemctl restart libvirtd.service
+ sudo service libvirt-guests restart
Redirecting to /bin/systemctl restart libvirt-guests.service
+ update_vagrant_boxes
++ vagrant box outdated --global
++ grep 'is outdated'
++ awk '{ print $2 }'
++ tr -d ''\'''
+ outdated_boxes=
+ '[' -n '' ']'
+ cd src/ceph-volume/ceph_volume/tests/functional/lvm
+ VAGRANT_RELOAD_FLAGS='--debug --no-provision'
+ CEPH_DEV_BRANCH=master
+ /tmp/venv.nAG333WFaE/bin/tox --workdir=/tmp/tox.45zLJjIuoI -vre centos7-bluestore-dmcrypt -- --provider=libvirt
using tox.ini: /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/tox.ini
using tox-2.9.1 from /tmp/venv.nAG333WFaE/lib/python2.7/site-packages/tox/__init__.pyc
centos7-bluestore-dmcrypt create: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt
/tmp/tox.45zLJjIuoI$ /tmp/venv.nAG333WFaE/bin/python -m virtualenv --python /tmp/venv.nAG333WFaE/bin/python centos7-bluestore-dmcrypt >/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/log/centos7-bluestore-dmcrypt-0.log
centos7-bluestore-dmcrypt installdeps: ansible==2.4.1, testinfra==1.7.1, pytest-xdist
/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm$ /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/pip install ansible==2.4.1 testinfra==1.7.1 pytest-xdist >/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/log/centos7-bluestore-dmcrypt-1.log
/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm$ /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/pip freeze >/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/log/centos7-bluestore-dmcrypt-2.log
centos7-bluestore-dmcrypt installed: ansible==2.4.1.0,apipkg==1.4,asn1crypto==0.24.0,attrs==17.4.0,bcrypt==3.1.4,cffi==1.11.5,cryptography==2.2.1,enum34==1.1.6,execnet==1.5.0,funcsigs==1.0.2,idna==2.6,ipaddress==1.0.19,Jinja2==2.10,MarkupSafe==1.0,more-itertools==4.1.0,paramiko==2.4.1,pluggy==0.6.0,py==1.5.3,pyasn1==0.4.2,pycparser==2.18,PyNaCl==1.2.1,pytest==3.5.0,pytest-forked==0.2,pytest-xdist==1.22.2,PyYAML==3.12,six==1.11.0,testinfra==1.7.1
centos7-bluestore-dmcrypt runtests: PYTHONHASHSEED='3939781514'
centos7-bluestore-dmcrypt runtests: commands[0] | git clone -b master --single-branch https://github.com/ceph/ceph-ansible.git /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible
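Before tox runs, the trace above shows the node being scrubbed by three helpers: `delete_libvirt_vms`, `clear_libvirt_networks`, and `restart_libvirt_services`. A dry-run sketch reconstructed from the `+`/`++` trace lines (the `run` stub echoes instead of executing and drops the `sudo`, so it is safe without libvirt; what the real helpers do with a non-empty VM or network list is not visible in this run, since both lists were empty):

```shell
# Dry-run stub: print each command instead of executing it.
# On a real build node this would be: run() { sudo "$@"; }
run() { echo "+ $*"; }

delete_libvirt_vms() {
    # The job captures the VM list (empty in this run), then drops
    # leftover disk images and refreshes the default storage pool.
    libvirt_vms=$(run virsh list --all --name)
    run find /var/lib/libvirt/images/ -type f -delete
    run virsh pool-refresh default
}

clear_libvirt_networks() {
    networks=$(run virsh net-list --all --name)
}

restart_libvirt_services() {
    # The trace only shows the `service` restarts on Red Hat family hosts.
    if test -f /etc/redhat-release; then
        run service libvirtd restart
        run service libvirt-guests restart
    fi
}

delete_libvirt_vms
```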
/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt$ /usr/bin/git clone -b master --single-branch https://github.com/ceph/ceph-ansible.git /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible
Cloning into '/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible'...
centos7-bluestore-dmcrypt runtests: commands[1] | vagrant up --no-provision --provider=libvirt
/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt$ /usr/bin/vagrant up --no-provision --provider=libvirt
Bringing machine 'mon0' up with 'libvirt' provider...
Bringing machine 'osd0' up with 'libvirt' provider...
==> mon0: Box 'centos/7' could not be found. Attempting to find and install...
    mon0: Box Provider: libvirt
    mon0: Box Version: >= 0
==> mon0: Loading metadata for box 'centos/7'
    mon0: URL: https://vagrantcloud.com/centos/7
==> mon0: Adding box 'centos/7' (v1802.01) for provider: libvirt
    mon0: Downloading: https://vagrantcloud.com/centos/boxes/7/versions/1802.01/providers/libvirt.box
    mon0: Progress: 0% ... 99% (download progress lines elided)
==> mon0: Successfully added box 'centos/7' (v1802.01) for 'libvirt'!
==> osd0: Box 'centos/7' could not be found. Attempting to find and install...
    osd0: Box Provider: libvirt
    osd0: Box Version: >= 0
==> mon0: Uploading base box image as volume into libvirt storage...
==> osd0: Loading metadata for box 'centos/7'
    osd0: URL: https://vagrantcloud.com/centos/7
Progress: 0% ... 17% (base box image upload progress lines elided)
==> osd0: Adding box 'centos/7' (v1802.01) for provider: libvirt
Progress: 18% ... 36% (progress lines elided)
Progress: 36% Progress: 36% Progress: 36% Progress: 36% Progress: 36% Progress: 36% Progress: 36% Progress: 36% Progress: 36% Progress: 36% Progress: 36% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 37% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 38% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 39% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 40% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 
41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 41% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 42% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 43% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 44% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% 
Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 45% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 46% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 47% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 48% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 49% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 
50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 50% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 51% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 52% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 53% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% 
Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 54% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 55% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 56% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 57% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 58% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 
59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 59% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 60% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 61% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 62% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 63% 
Progress: 63% Progress: 63% Progress: 63% Progress: 63% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 64% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 65% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 66% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 67% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 
68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 68% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 69% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 70% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 71% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% Progress: 72% 
Progress: 72% Progress: 72% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 73% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 74% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 75% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 76% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 
77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 77% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 78% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 79% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 80% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% Progress: 81% 
Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 82% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 83% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 84% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 85% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 
86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 86% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 87% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 88% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 89% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 90% Progress: 91% Progress: 91% 
Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 91% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 92% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 93% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 94% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 
95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 95% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 96% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 97% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 98% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 99% Progress: 100% ==> osd0: Creating image (snapshot of base box 
volume).
==> mon0: Creating image (snapshot of base box volume).
==> mon0: Creating domain with the following settings...
==> mon0:  -- Name: dmcrypt_mon0_1522019828_ce3306baa6bbaadd1e3b
==> mon0:  -- Domain type: kvm
==> osd0: Creating domain with the following settings...
==> mon0:  -- Cpus: 1
==> mon0:
==> osd0:  -- Name: dmcrypt_osd0_1522019828_38bafda9cfacaafa6f5b
==> osd0:  -- Domain type: kvm
==> osd0:  -- Cpus: 1
==> osd0:
==> mon0:  -- Feature: acpi
==> osd0:  -- Feature: acpi
==> mon0:  -- Feature: apic
==> osd0:  -- Feature: apic
==> mon0:  -- Feature: pae
==> osd0:  -- Feature: pae
==> mon0:  -- Memory: 512M
==> osd0:  -- Memory: 512M
==> osd0:  -- Management MAC:
==> osd0:  -- Loader:
==> osd0:  -- Base box: centos/7
==> osd0:  -- Storage pool: default
==> mon0:  -- Management MAC:
==> osd0:  -- Image: /var/lib/libvirt/images/dmcrypt_osd0_1522019828_38bafda9cfacaafa6f5b.img (41G)
==> osd0:  -- Volume Cache: default
==> mon0:  -- Loader:
==> osd0:  -- Kernel:
==> mon0:  -- Base box: centos/7
==> osd0:  -- Initrd:
==> osd0:  -- Graphics Type: vnc
==> osd0:  -- Graphics Port: -1
==> osd0:  -- Graphics IP: 127.0.0.1
==> osd0:  -- Graphics Password: Not defined
==> osd0:  -- Video Type: cirrus
==> osd0:  -- Video VRAM: 9216
==> osd0:  -- Sound Type:
==> osd0:  -- Keymap: en-us
==> osd0:  -- TPM Path:
==> osd0:  -- Disks: hda(qcow2,12G), hdb(qcow2,12G), hdc(qcow2,12G), hdd(qcow2,12G)
==> osd0:  -- Disk(hda): /var/lib/libvirt/images/disk-0-0-1522019827.disk
==> osd0:  -- Disk(hdb): /var/lib/libvirt/images/disk-0-1-1522019827.disk
==> osd0:  -- Disk(hdc): /var/lib/libvirt/images/disk-0-2-1522019827.disk
==> osd0:  -- Disk(hdd): /var/lib/libvirt/images/disk-0-3-1522019827.disk
==> osd0:  -- INPUT: type=mouse, bus=ps2
==> mon0:  -- Storage pool: default
==> mon0:  -- Image: /var/lib/libvirt/images/dmcrypt_mon0_1522019828_ce3306baa6bbaadd1e3b.img (41G)
==> mon0:  -- Volume Cache: default
==> mon0:  -- Kernel:
==> mon0:  -- Initrd:
==> osd0: Creating shared folders metadata...
==> mon0:  -- Graphics Type: vnc
==> mon0:  -- Graphics Port: -1
==> mon0:  -- Graphics IP: 127.0.0.1
==> mon0:  -- Graphics Password: Not defined
==> mon0:  -- Video Type: cirrus
==> osd0: Starting domain.
==> mon0:  -- Video VRAM: 9216
==> mon0:  -- Sound Type:
==> mon0:  -- Keymap: en-us
==> osd0: Waiting for domain to get an IP address...
==> mon0:  -- TPM Path:
==> mon0:  -- INPUT: type=mouse, bus=ps2
==> mon0: Creating shared folders metadata...
==> mon0: Starting domain.
==> mon0: Waiting for domain to get an IP address...
==> osd0: Waiting for SSH to become available...
==> mon0: Waiting for SSH to become available...
==> mon0: Setting hostname...
==> osd0: Setting hostname...
==> mon0: Configuring and enabling network interfaces...
==> osd0: Configuring and enabling network interfaces...
    mon0: SSH address: 192.168.121.59:22
    mon0: SSH username: vagrant
    mon0: SSH auth method: private key
    osd0: SSH address: 192.168.121.134:22
    osd0: SSH username: vagrant
    osd0: SSH auth method: private key
==> mon0: Rsyncing folder: /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/ => /vagrant
==> osd0: Rsyncing folder: /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/ => /vagrant
==> mon0: Machine not provisioned because `--no-provision` is specified.
==> osd0: Machine not provisioned because `--no-provision` is specified.
centos7-bluestore-dmcrypt runtests: commands[2] | bash /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/../scripts/generate_ssh_config.sh /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt
/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt$ /usr/bin/bash /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/../scripts/generate_ssh_config.sh /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt
centos7-bluestore-dmcrypt runtests: commands[3] | ansible-playbook -vv -i /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/hosts /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/lvm_setup.yml
/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt$ /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/ansible-playbook -vv -i /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/hosts /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/lvm_setup.yml
ansible-playbook 2.4.1.0
  config file = None
  configured module search path = [u'/home/jenkins-build/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/lib/python2.7/site-packages/ansible
  executable location = /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/ansible-playbook
  python version = 2.7.5 (default, Aug 4 2017, 00:39:18) [GCC 4.8.5 20150623 (Red Hat 4.8.5-16)]
No config file found; using defaults
PLAYBOOK: lvm_setup.yml ********************************************************
1 plays in /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/lvm_setup.yml
PLAY [osds] ********************************************************************
META: ran handlers
TASK [create physical volume] **************************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/lvm_setup.yml:8
changed: [osd0] => { "changed": true, "cmd": [ "pvcreate", "/dev/sdb" ], "delta": "0:00:00.328110", "end": "2018-03-25 23:20:07.040718", "failed": false, "failed_when_result": false, "rc": 0, "start": "2018-03-25 23:20:06.712608" }
STDOUT:
Physical volume "/dev/sdb" successfully created.
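The "create physical volume" result above shows a plain command run (note "cmd" and "failed_when_result": false in the task result). A minimal sketch of what such a task could look like in lvm_setup.yml, reconstructed from the logged output only — the actual playbook may differ:

```yaml
# Hypothetical reconstruction from the logged task result;
# not the real ceph-ansible lvm_setup.yml.
- name: create physical volume
  command: pvcreate /dev/sdb
  # matches "failed_when_result": false in the log, so an
  # already-initialized device does not fail the play
  failed_when: false
```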
TASK [create volume group] *****************************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/lvm_setup.yml:12
changed: [osd0] => { "changed": true, "cmd": [ "vgcreate", "test_group", "/dev/sdb" ], "delta": "0:00:00.228792", "end": "2018-03-25 23:20:10.138134", "failed": false, "failed_when_result": false, "rc": 0, "start": "2018-03-25 23:20:09.909342" }
STDOUT:
Volume group "test_group" successfully created
TASK [create logical volume 1] *************************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/lvm_setup.yml:16
changed: [osd0] => { "changed": true, "cmd": [ "lvcreate", "--yes", "-l", "50%FREE", "-n", "data-lv1", "test_group" ], "delta": "0:00:00.194110", "end": "2018-03-25 23:20:13.253018", "failed": false, "failed_when_result": false, "rc": 0, "start": "2018-03-25 23:20:13.058908" }
STDOUT:
Logical volume "data-lv1" created.
TASK [create logical volume 2] *************************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/lvm_setup.yml:20
changed: [osd0] => { "changed": true, "cmd": [ "lvcreate", "--yes", "-l", "50%FREE", "-n", "data-lv2", "test_group" ], "delta": "0:00:00.298355", "end": "2018-03-25 23:20:16.331089", "failed": false, "failed_when_result": false, "rc": 0, "start": "2018-03-25 23:20:16.032734" }
STDOUT:
Logical volume "data-lv2" created.
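The volume group and logical volume tasks follow the same command-task pattern. A hedged sketch matching the logged commands (again a reconstruction, not the actual playbook); one LVM subtlety worth noting is that `-l 50%FREE` is relative to the space still free at that moment, so the second call allocates half of what remains:

```yaml
# Hypothetical reconstruction from the logged cmd lists.
- name: create volume group
  command: vgcreate test_group /dev/sdb
  failed_when: false

- name: create logical volume 1
  command: lvcreate --yes -l 50%FREE -n data-lv1 test_group
  failed_when: false

# 50%FREE is evaluated against the remaining free extents, so
# data-lv2 ends up roughly half the size of data-lv1 (a quarter
# of the VG), not an equal half.
- name: create logical volume 2
  command: lvcreate --yes -l 50%FREE -n data-lv2 test_group
  failed_when: false
```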
TASK [remove /dev/sdc1 if it exists] *******************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/lvm_setup.yml:27
ok: [osd0] => { "changed": false, "disk": { "dev": "/dev/sdc", "logical_block": 512, "model": "ATA QEMU HARDDISK", "physical_block": 512, "size": 12582912.0, "table": "unknown", "unit": "kib" }, "failed": false, "partitions": [], "script": "" }
TASK [remove /dev/sdc2 if it exists] *******************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/lvm_setup.yml:33
ok: [osd0] => { "changed": false, "disk": { "dev": "/dev/sdc", "logical_block": 512, "model": "ATA QEMU HARDDISK", "physical_block": 512, "size": 12582912.0, "table": "unknown", "unit": "kib" }, "failed": false, "partitions": [], "script": "" }
TASK [partition /dev/sdc for journals] *****************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/lvm_setup.yml:39
changed: [osd0] => { "changed": true, "disk": { "dev": "/dev/sdc", "logical_block": 512, "model": "ATA QEMU HARDDISK", "physical_block": 512, "size": 100.0, "table": "gpt", "unit": "%" }, "failed": false, "partitions": [ { "begin": 0.01, "end": 50.0, "flags": [], "fstype": "", "name": "primary", "num": 1, "size": 50.0, "unit": "%" } ], "script": "unit % mklabel gpt mkpart primary 0% 50%" }
TASK [partition /dev/sdc for journals] *****************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/lvm_setup.yml:49
changed: [osd0] => { "changed": true, "disk": { "dev": "/dev/sdc", "logical_block": 512, "model": "ATA QEMU HARDDISK", "physical_block": 512, "size": 100.0, "table": "gpt", "unit": "%" }, "failed": false, "partitions": [ { "begin": 0.01, "end": 50.0, "flags": [], "fstype": "", "name": "primary", "num": 1, "size": 50.0, "unit": "%" }, { "begin": 50.0, "end": 100.0, "flags": [], "fstype": "", "name": "primary", "num": 2, "size": 50.0, "unit": "%" } ], "script": "unit % mkpart primary 50% 100%" }
TASK [create journals vg from /dev/sdc2] ***************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/lvm_setup.yml:59
changed: [osd0] => { "changed": true, "failed": false }
TASK [create journal1 lv] ******************************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/lvm_setup.yml:64
changed: [osd0] => { "changed": true, "cmd": [ "lvcreate", "--yes", "-l", "100%FREE", "-n", "journal1", "journals" ], "delta": "0:00:00.128293", "end": "2018-03-25 23:20:36.032849", "failed": false, "failed_when_result": false, "rc": 0, "start": "2018-03-25 23:20:35.904556" }
STDOUT:
Logical volume "journal1" created.
META: ran handlers
META: ran handlers
PLAY RECAP *********************************************************************
osd0 : ok=10 changed=8 unreachable=0 failed=0
centos7-bluestore-dmcrypt runtests: commands[4] | ansible-playbook -vv -i /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/hosts /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/setup.yml
/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt$ /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/ansible-playbook -vv -i /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/hosts
/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/setup.yml
ansible-playbook 2.4.1.0
  config file = None
  configured module search path = [u'/home/jenkins-build/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/lib/python2.7/site-packages/ansible
  executable location = /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/ansible-playbook
  python version = 2.7.5 (default, Aug 4 2017, 00:39:18) [GCC 4.8.5 20150623 (Red Hat 4.8.5-16)]
No config file found; using defaults
PLAYBOOK: setup.yml ************************************************************
1 plays in /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/setup.yml
PLAY [osds] ********************************************************************
META: ran handlers
TASK [partition /dev/sdd for lvm data usage] ***********************************
task path: /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/setup.yml:8
changed: [osd0] => { "changed": true, "disk": { "dev": "/dev/sdd", "logical_block": 512, "model": "ATA QEMU HARDDISK", "physical_block": 512, "size": 100.0, "table": "gpt", "unit": "%" }, "failed": false, "partitions": [ { "begin": 0.01, "end": 50.0, "flags": [], "fstype": "", "name": "primary", "num": 1, "size": 50.0, "unit": "%" } ], "script": "unit % mklabel gpt mkpart primary 0% 50%" }
TASK [partition /dev/sdd lvm journals] *****************************************
task path: /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/setup.yml:18
changed: [osd0] => { "changed": true, "disk": { "dev": "/dev/sdd", "logical_block": 512, "model": "ATA QEMU HARDDISK", "physical_block": 512, "size": 100.0, "table": "gpt", "unit": "%" }, "failed": false, "partitions": [ { "begin": 0.01, "end": 50.0, "flags": [], "fstype": "", "name": "primary", "num": 1, "size": 50.0, "unit": "%" }, { "begin": 50.0, "end": 100.0, "flags": [], "fstype": "", "name": "primary", "num": 2, "size": 50.0, "unit": "%" } ], "script": "unit % mkpart primary 50% 100%" }
META: ran handlers
META: ran handlers
PLAY RECAP *********************************************************************
osd0 : ok=2 changed=2 unreachable=0 failed=0
centos7-bluestore-dmcrypt runtests: commands[5] | ansible-playbook -vv -i /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/hosts /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/site.yml.sample --extra-vars fetch_directory=/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch ceph_dev_branch=master ceph_dev_sha1=latest
/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt$ /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/ansible-playbook -vv -i /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/hosts /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/site.yml.sample --extra-vars fetch_directory=/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch ceph_dev_branch=master ceph_dev_sha1=latest
ansible-playbook 2.4.1.0
  config file = None
  configured module search path = [u'/home/jenkins-build/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/lib/python2.7/site-packages/ansible
  executable location = /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/ansible-playbook
  python version = 2.7.5 (default, Aug 4 2017, 00:39:18) [GCC 4.8.5 20150623 (Red Hat 4.8.5-16)]
No config file found; using defaults
[DEPRECATION WARNING]: The use of 'include' for tasks has been deprecated. Use 'import_tasks' for static inclusions or 'include_tasks' for dynamic inclusions. This feature will be removed in a future release. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
[DEPRECATION WARNING]: include is kept for backwards compatibility but usage is discouraged. The module documentation details page may explain more about this rationale.. This feature will be removed in a future release. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_running_cluster.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_running_containers.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml
[DEPRECATION WARNING]: The use of 'static' has been deprecated. Use 'import_tasks' for static inclusion, or 'include_tasks' for dynamic inclusion. This feature will be removed in a future release. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/release-rhcs.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/facts_mon_fsid.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/create_ceph_initial_dirs.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/create_rbd_client_dir.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/configure_cluster_name.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/configure_memory_allocator.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/check_mandatory_vars.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/deploy_monitors.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/start_monitor.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/secure_cluster.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/main.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/copy_configs.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/start_docker_monitor.yml
[DEPRECATION WARNING]: docker is kept for backwards compatibility but usage is
discouraged. The module documentation details page may explain more about this rationale.. This feature will be removed in a future release. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/configure_ceph_command_aliases.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/fetch_configs.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/crush_rules.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/set_osd_pool_default_pg_num.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/openstack_config.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/create_mds_filesystems.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/calamari.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mgr/tasks/pre_requisite.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mgr/tasks/docker/main.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mgr/tasks/docker/copy_configs.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mgr/tasks/docker/start_docker_mgr.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-agent/tasks/pre_requisite.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-agent/tasks/start_agent.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_mandatory_vars.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/ceph_disk_cli_options_facts.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/build_devices.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/copy_configs.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_gpt.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mds/tasks/non_containerized.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mds/tasks/containerized.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-nfs/tasks/pre_requisite_non_container.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-nfs/tasks/pre_requisite_container.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-nfs/tasks/create_rgw_nfs_user.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-nfs/tasks/ganesha_selinux_fix.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-nfs/tasks/start_nfs.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-restapi/tasks/pre_requisite.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-restapi/tasks/start_restapi.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-restapi/tasks/docker/main.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-restapi/tasks/docker/copy_configs.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-restapi/tasks/docker/start_docker_restapi.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-rbd-mirror/tasks/pre_requisite.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-rbd-mirror/tasks/start_rbd_mirror.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-rbd-mirror/tasks/configure_mirroring.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-rbd-mirror/tasks/docker/main.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-rbd-mirror/tasks/docker/copy_configs.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-rbd-mirror/tasks/docker/start_docker_rbd_mirror.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-client/tasks/pre_requisite.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-client/tasks/create_users_keys.yml
statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/release-rhcs.yml statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/facts_mon_fsid.yml statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/create_ceph_initial_dirs.yml statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/create_rbd_client_dir.yml statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/configure_cluster_name.yml statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/configure_memory_allocator.yml statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-iscsi-gw/tasks/check_mandatory_vars.yml statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-iscsi-gw/tasks/prerequisites.yml statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-iscsi-gw/tasks/deploy_ssl_keys.yml statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-iscsi-gw/tasks/generate_crt.yml statically imported: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-iscsi-gw/tasks/configure_iscsi.yml PLAYBOOK: 
site.yml.sample ****************************************************** 12 plays in /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/site.yml.sample
[WARNING]: Could not match supplied host pattern, ignoring: agents
[WARNING]: Could not match supplied host pattern, ignoring: mdss
[WARNING]: Could not match supplied host pattern, ignoring: rgws
[WARNING]: Could not match supplied host pattern, ignoring: nfss
[WARNING]: Could not match supplied host pattern, ignoring: restapis
[WARNING]: Could not match supplied host pattern, ignoring: rbdmirrors
[WARNING]: Could not match supplied host pattern, ignoring: clients
[WARNING]: Could not match supplied host pattern, ignoring: mgrs
[WARNING]: Could not match supplied host pattern, ignoring: iscsi-gws

PLAY [mons,agents,osds,mdss,rgws,nfss,restapis,rbdmirrors,clients,mgrs,iscsi-gws] ***
META: ran handlers

TASK [check for python2] *******************************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/site.yml.sample:28
ok: [mon0] => { "changed": false, "failed": false, "stat": { "atime": 1522019987.2515776, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1519840170.3801672, "dev": 64768, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 93962, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/usr/bin/python2.7", "lnk_target": "python2", "mimetype": "inode/symlink", "mode": "0777", "mtime": 1519840170.3801672, "nlink": 1, "path": "/usr/bin/python", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 7, "uid": 0, "version": null, "wgrp": true, "woth": true, "writeable": false, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } }
ok: [osd0] => { "changed": false, "failed": false, "stat": { "atime": 1522019986.832627, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1519840170.3801672, "dev": 64768, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 93962, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/usr/bin/python2.7", "lnk_target": "python2", "mimetype": "inode/symlink", "mode": "0777", "mtime": 1519840170.3801672, "nlink": 1, "path": "/usr/bin/python", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 7, "uid": 0, "version": null, "wgrp": true, "woth": true, "writeable": false, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } }

TASK [install python2 for debian based systems] ********************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/site.yml.sample:34
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [install python2 for fedora] **********************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/site.yml.sample:40
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [install python2 for opensuse] ********************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/site.yml.sample:46
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [gather facts] ************************************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/site.yml.sample:52
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [gather and delegate facts] ***********************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/site.yml.sample:57
ok: [mon0 -> osd0] => (item=osd0)
ok: [osd0 -> osd0] => (item=osd0)
ok: [mon0 -> mon0] => (item=mon0)
ok: [osd0 -> mon0] => (item=mon0)

TASK [install required packages for fedora > 23] *******************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/site.yml.sample:65
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }
META: ran handlers
META: ran handlers

PLAY [mons] ********************************************************************

TASK [set ceph monitor install 'In Progress'] **********************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/site.yml.sample:75
ok: [mon0] => { "ansible_stats": { "aggregate": true, "data": { "installer_phase_ceph_mon": { "start": "20180326012059Z", "status": "In Progress" } }, "per_host": false }, "changed": false, "failed": false }
META: ran handlers

TASK [ceph-defaults : check for a mon container] *******************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_running_containers.yml:2
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check for an osd container] ******************************
task path:
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_running_containers.yml:11
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check for a mds container] *******************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_running_containers.yml:20
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check for a rgw container] *******************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_running_containers.yml:29
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check for a mgr container] *******************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_running_containers.yml:38
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check for a rbd mirror container] ************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_running_containers.yml:47
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check for a nfs container] *******************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_running_containers.yml:56
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check for a ceph mon socket] *****************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:2
ok: [mon0] => { "changed": false, "cmd": "stat --printf=%n /var/run/ceph/ceph-mon*.asok", "delta": "0:00:00.008446", "end": "2018-03-25 23:21:03.066521", "failed": false, "failed_when_result": false, "rc": 1, "start": "2018-03-25 23:21:03.058075" }
STDERR:
stat: cannot stat ‘/var/run/ceph/ceph-mon*.asok’: No such file or directory
MSG:
non-zero return code

TASK [ceph-defaults : check if the ceph mon socket is in-use] ******************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:11
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : remove ceph mon socket if exists and not used by a process] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:21
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check for a ceph osd socket] *****************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:30
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check if the ceph osd socket is in-use] ******************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:40
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : remove ceph osd socket if exists and not used by a process] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:50
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check for a ceph mds socket] *****************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:59
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check if the ceph mds socket is in-use] ******************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:69
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : remove ceph mds socket if exists and not used by a process] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:79
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check for a ceph rgw socket] *****************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:88
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check if the ceph rgw socket is in-use] ******************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:98
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : remove ceph rgw socket if exists and not used by a process] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:108
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check for a ceph mgr socket] *****************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:117
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check if the ceph mgr socket is in-use] ******************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:127
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : remove ceph mgr socket if exists and not used by a process] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:137
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check for a ceph rbd mirror socket] **********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:146
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check if the ceph rbd mirror socket is in-use] ***********
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:156
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : remove ceph rbd mirror socket if exists and not used by a process] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:166
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check for a ceph nfs ganesha socket] *********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:175
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check if the ceph nfs ganesha socket is in-use] **********
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:184
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : remove ceph nfs ganesha socket if exists and not used by a process] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:194
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : check if it is atomic host] ******************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:2
ok: [mon0] => { "changed": false, "failed": false, "stat": { "exists": false } }

TASK [ceph-defaults : set_fact is_atomic] **************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:7
ok: [mon0] => { "ansible_facts": { "is_atomic": false }, "changed": false, "failed": false }

TASK [ceph-defaults : set_fact monitor_name ansible_hostname] ******************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:11
ok: [mon0] => { "ansible_facts": { "monitor_name": "mon0" }, "changed": false, "failed": false }

TASK [ceph-defaults : set_fact monitor_name ansible_fqdn] **********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:17
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : set_fact docker_exec_cmd] ********************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:23
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : is ceph running already?] ********************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:33
ok: [mon0 -> mon0] => { "changed": false, "cmd": [ "timeout", "5", "ceph", "--cluster", "ceph", "fsid" ], "delta": "0:00:00.004551", "end": "2018-03-25 23:21:09.979033", "failed": false, "failed_when_result": false, "rc": 127, "start": "2018-03-25 23:21:09.974482" }
STDERR:
timeout: failed to run command ‘ceph’: No such file or directory
MSG:
non-zero return code

TASK [ceph-defaults : check if /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch directory exists] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:45
ok: [mon0 -> localhost] => { "changed": false, "failed": false, "stat": { "exists": false } }

TASK [ceph-defaults : set_fact ceph_current_fsid rc 1] *************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:55
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : create a local fetch directory if it does not exist] *****
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:62
ok: [mon0 -> localhost] => { "changed": false, "failed": false, "gid": 1001, "group": "jenkins-build", "mode": "0775", "owner": "jenkins-build", "path": "/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch", "size": 4096, "state": "directory", "uid": 1001 }

TASK [ceph-defaults : set_fact fsid ceph_current_fsid.stdout] ******************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:73
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : set_fact ceph_release ceph_stable_release] ***************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:80
ok: [mon0] => { "ansible_facts": { "ceph_release": "dummy" }, "changed": false, "failed": false }

TASK [ceph-defaults : generate cluster fsid] ***********************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:84
changed: [mon0 -> localhost] => { "changed": true, "cmd": "python -c 'import uuid; print(str(uuid.uuid4()))' | tee /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch/ceph_cluster_uuid.conf", "delta": "0:00:00.063559", "end": "2018-03-26 01:21:11.631701", "failed": false, "rc": 0, "start": "2018-03-26 01:21:11.568142" }
STDOUT:
fad7031c-4bd3-4328-9716-4fbddc9c14b8

TASK [ceph-defaults : reuse cluster fsid when cluster is already running] ******
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:95
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : read cluster fsid if it already exists] ******************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:104
ok: [mon0 -> localhost] => { "changed": false, "cmd": [ "cat", "/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch/ceph_cluster_uuid.conf" ], "delta": "0:00:00.006761", "end": "2018-03-26 01:21:12.013536", "failed": false, "rc": 0, "start": "2018-03-26 01:21:12.006775" }
STDOUT:
fad7031c-4bd3-4328-9716-4fbddc9c14b8

TASK [ceph-defaults : set_fact fsid] *******************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:116
ok: [mon0] => { "ansible_facts": { "fsid": "fad7031c-4bd3-4328-9716-4fbddc9c14b8" }, "changed": false, "failed": false }

TASK [ceph-defaults : set_fact mds_name ansible_hostname] **********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:122
ok: [mon0] => { "ansible_facts": { "mds_name": "mon0" }, "changed": false, "failed": false }

TASK [ceph-defaults : set_fact mds_name ansible_fqdn] **************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:128
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : set_fact rbd_client_directory_owner ceph] ****************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:134
ok: [mon0] => { "ansible_facts": { "rbd_client_directory_owner": "ceph" }, "changed": false, "failed": false }

TASK [ceph-defaults : set_fact rbd_client_directory_group rbd_client_directory_group] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:141
ok: [mon0] => { "ansible_facts": { "rbd_client_directory_group": "ceph" }, "changed": false, "failed": false }

TASK [ceph-defaults : set_fact rbd_client_directory_mode 0770] *****************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:148
ok: [mon0] => { "ansible_facts": { "rbd_client_directory_mode": "0770" }, "changed": false, "failed": false }

TASK [ceph-defaults : resolve device link(s)] **********************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:155
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : set_fact build devices from resolved symlinks] ***********
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:165
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : set_fact build final devices list] ***********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:174
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : set_fact ceph_uid for Debian based system] ***************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:182
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : set_fact ceph_uid for Red Hat based system] **************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:189
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported system] ********************************
task path:
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:2
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported architecture] **************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:8
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported distribution] **************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:14
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported distribution for red hat ceph storage] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:20
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : determine if node is registered with subscription-manager] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:28
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unregistered red hat rhcs linux] *******************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:39
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported distribution for ubuntu cloud archive] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:48
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported openSUSE distribution] *****************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:55
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported ansible version] ***********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:62
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported ansible version] ***********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:68
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail if systemd is not present] ****************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:75
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported distribution for iscsi gateways] *******
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:81
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported distribution version for iscsi gateways] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:88
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : make sure an installation origin was chosen] ***************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:2
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : make sure a repository was chosen] *************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:10
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail if local scenario is enabled on debian] ***************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:19
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : make sure ceph_stable_release is set] **********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:26
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : make sure ceph_stable_release is correct] ******************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:36
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : verify that a repository type was chosen for ceph rhcs version] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:46
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : verify that ceph_rhcs_cdn_debian_repo url is valid for red hat storage] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:56
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : make sure monitor_interface, monitor_address or monitor_address_block is defined] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:68
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : make sure radosgw_interface, radosgw_address or radosgw_address_block is defined] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:77
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : include checks/check_firewall.yml] *************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:8
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : include misc/configure_firewall_rpm.yml] *******************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:15
included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml for mon0

TASK [ceph-common : check firewalld installation on redhat or suse] ************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:2
ok: [mon0] => { "changed": false, "cmd": [ "rpm", "-q", "firewalld" ], "delta": "0:00:00.053587", "end": "2018-03-25 23:21:16.874816", "failed": false, "rc": 0, "start": "2018-03-25 23:21:16.821229" }
STDOUT:
firewalld-0.4.4.4-6.el7.noarch
TASK
[ceph-common : open monitor ports] **************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:13 NOTIFIED HANDLER restart firewalld changed: [mon0] => { "changed": true, "failed": false } MSG: Permanent operation, Changed service ceph-mon to enabled, (offline operation: only on-disk configs were altered) TASK [ceph-common : open osd ports] ******************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:28 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : open rgw ports] ******************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:43 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : open mds ports] ******************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:58 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : open nfs ports] ******************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:73 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : open nfs ports (portmapper)] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:88 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", 
"skipped": true }

TASK [ceph-common : open restapi ports] ****************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:103
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : open rbdmirror ports] **************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:118
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : open iscsi ports] ******************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:133
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

RUNNING HANDLER [ceph-common : restart firewalld] ******************************
changed: [mon0] => { "changed": true, "enabled": true, "failed": false, "name": "firewalld", "state": "started", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dbus.service basic.target polkit.service system.slice", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "network-pre.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "BusName": "org.fedoraproject.FirewallD1", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "ConditionResult": "no", "ConditionTimestampMonotonic": "0",
"Conflicts": "iptables.service ip6tables.service shutdown.target ipset.service ebtables.service", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "firewalld - dynamic firewall daemon", "DevicePolicy": "auto", "Documentation": "man:firewalld(1)", "EnvironmentFile": "/etc/sysconfig/firewalld (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/firewalld.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "firewalld.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "mixed", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "1885", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "1885", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "firewalld.service", "NeedDaemonReload": "no", 
"Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Requires": "basic.target", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "null", "StandardInput": "null", "StandardOutput": "null", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "disabled", "Wants": "network-pre.target system.slice", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } }
META: ran handlers

TASK [ceph-common : include installs/install_on_redhat.yml] ********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:22
included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_on_redhat.yml for mon0

TASK [ceph-common : include configure_redhat_repository_installation.yml] ******
task path:
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_on_redhat.yml:2
included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/configure_redhat_repository_installation.yml for mon0

TASK [ceph-common : include redhat_community_repository.yml] *******************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/configure_redhat_repository_installation.yml:2
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : include redhat_rhcs_repository.yml] ************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/configure_redhat_repository_installation.yml:7
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : include redhat_dev_repository.yml] *************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/configure_redhat_repository_installation.yml:12
included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/redhat_dev_repository.yml for mon0

TASK [ceph-common : fetch ceph red hat development repository] *****************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/redhat_dev_repository.yml:2
ok: [mon0] => { "changed": false, "connection": "close", "content": "[ceph]\nname=ceph packages for \\$basearch\nbaseurl=https://3.chacra.ceph.com/r/ceph/master/235f2119010484c12c5bd29421aeef7d44df38a1/centos/7/flavors/default/\\$basearch\nenabled=1\ngpgcheck=0\ntype=rpm-md\n\n[ceph-noarch]\nname=ceph noarch packages\nbaseurl=https://3.chacra.ceph.com/r/ceph/master/235f2119010484c12c5bd29421aeef7d44df38a1/centos/7/flavors/default/noarch\nenabled=1\ngpgcheck=0\ntype=rpm-md\n\n[ceph-source]\nname=ceph source packages\nbaseurl=https://3.chacra.ceph.com/r/ceph/master/235f2119010484c12c5bd29421aeef7d44df38a1/centos/7/flavors/default/SRPMS\nenabled=1\ngpgcheck=0\ntype=rpm-md\n", "content_length": "588", "content_type": "text/plain; charset=UTF-8", "cookies": {}, "date": "Sun, 25 Mar 2018 23:21:30 GMT", "failed": false, "redirected": true, "server": "nginx", "status": 200, "strict_transport_security": "max-age=31536000", "url": "https://3.chacra.ceph.com/repos/ceph/master/235f2119010484c12c5bd29421aeef7d44df38a1/centos/7/flavors/default/repo" }
MSG:
OK (588 bytes)

TASK [ceph-common : configure ceph red hat development repository] *************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/redhat_dev_repository.yml:8
changed: [mon0] => { "changed": true, "checksum": "2ac7da5709e8b9f29f66731b0a6a0fd2382914d3", "dest": "/etc/yum.repos.d/ceph-dev.repo", "failed": false, "gid": 0, "group": "root", "md5sum": "4ef618a664a182aad7f6489c09e36c5c", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:system_conf_t:s0", "size": 588, "src": "/home/vagrant/.ansible/tmp/ansible-tmp-1522020090.86-150707586504524/source", "state": "file", "uid": 0 }

TASK [ceph-common : include redhat_custom_repository.yml] **********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/configure_redhat_repository_installation.yml:17
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : purge yum cache] *******************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/configure_redhat_repository_installation.yml:23
ok: [mon0] => { "changed": false, "cmd": [ "yum", "clean", "all" ], "delta": "0:00:00.226523", "end": "2018-03-25 23:21:39.328848", "failed": false, "rc": 0, "start": "2018-03-25 23:21:39.102325" }
STDOUT:
Loaded plugins: fastestmirror
Cleaning repos: base ceph ceph-noarch ceph-source extras updates
Cleaning up everything
Maybe you want: rm -rf /var/cache/yum, to also free up space taken by orphaned data from disabled or removed repos

TASK [ceph-common : include configure_redhat_local_installation.yml] ***********
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_on_redhat.yml:7
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : include install_redhat_packages.yml] ***********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_on_redhat.yml:12
included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml for mon0

TASK [ceph-common : install redhat dependencies] *******************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:2
skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : install centos dependencies] *******************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:9
changed: [mon0] => { "changed": true, "failed": false, "rc": 0, "results": [ "python-pycurl-7.19.0-19.el7.x86_64 providing python-pycurl is already installed", "libselinux-python-2.5-11.el7.x86_64 providing libselinux-python is already installed", "Loaded plugins: fastestmirror\nLoading mirror speeds from cached hostfile\n * base:
centos.mirror.ate.info\n * extras: mirror.guru\n * updates: mirror.guru\nResolving Dependencies\n--> Running transaction check\n---> Package epel-release.noarch 0:7-9 will be installed\n---> Package hdparm.x86_64 0:9.43-5.el7 will be installed\n---> Package python-setuptools.noarch 0:0.9.8-7.el7 will be installed\n--> Processing Dependency: python-backports-ssl_match_hostname for package: python-setuptools-0.9.8-7.el7.noarch\n--> Running transaction check\n---> Package python-backports-ssl_match_hostname.noarch 0:3.4.0.2-4.el7 will be installed\n--> Processing Dependency: python-backports for package: python-backports-ssl_match_hostname-3.4.0.2-4.el7.noarch\n--> Running transaction check\n---> Package python-backports.x86_64 0:1.0-8.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository\n Size\n================================================================================\nInstalling:\n epel-release noarch 7-9 extras 14 k\n hdparm x86_64 9.43-5.el7 base 83 k\n python-setuptools noarch 0.9.8-7.el7 base 397 k\nInstalling for dependencies:\n python-backports x86_64 1.0-8.el7 base 5.8 k\n python-backports-ssl_match_hostname noarch 3.4.0.2-4.el7 base 12 k\n\nTransaction Summary\n================================================================================\nInstall 3 Packages (+2 Dependent packages)\n\nTotal download size: 512 k\nInstalled size: 2.1 M\nDownloading packages:\nPublic key for epel-release-7-9.noarch.rpm is not installed\nPublic key for python-backports-1.0-8.el7.x86_64.rpm is not installed\n--------------------------------------------------------------------------------\nTotal 1.5 MB/s | 512 kB 00:00 \nRetrieving key from file:///etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-7\nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : 
python-backports-1.0-8.el7.x86_64 1/5 \n Installing : python-backports-ssl_match_hostname-3.4.0.2-4.el7.noarch 2/5 \n Installing : python-setuptools-0.9.8-7.el7.noarch 3/5 \n Installing : epel-release-7-9.noarch 4/5 \n Installing : hdparm-9.43-5.el7.x86_64 5/5 \n Verifying : python-backports-1.0-8.el7.x86_64 1/5 \n Verifying : hdparm-9.43-5.el7.x86_64 2/5 \n Verifying : python-setuptools-0.9.8-7.el7.noarch 3/5 \n Verifying : epel-release-7-9.noarch 4/5 \n Verifying : python-backports-ssl_match_hostname-3.4.0.2-4.el7.noarch 5/5 \n\nInstalled:\n epel-release.noarch 0:7-9 hdparm.x86_64 0:9.43-5.el7 \n python-setuptools.noarch 0:0.9.8-7.el7 \n\nDependency Installed:\n python-backports.x86_64 0:1.0-8.el7 \n python-backports-ssl_match_hostname.noarch 0:3.4.0.2-4.el7 \n\nComplete!\n" ] }
MSG:
warning: /var/cache/yum/x86_64/7/extras/packages/epel-release-7-9.noarch.rpm: Header V3 RSA/SHA256 Signature, key ID f4a80eb5: NOKEY
Importing GPG key 0xF4A80EB5:
 Userid : "CentOS-7 Key (CentOS 7 Official Signing Key) "
 Fingerprint: 6341 ab27 53d7 8a78 a7c2 7bb1 24c6 a8a7 f4a8 0eb5
 Package : centos-release-7-4.1708.el7.centos.x86_64 (@anaconda)
 From : /etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-7

TASK [ceph-common : install redhat ceph-common] ********************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:16
changed: [mon0] => { "changed": true, "failed": false, "rc": 0, "results": [ "Loaded plugins: fastestmirror\nLoading mirror speeds from cached hostfile\n * base: centos.mirror.ate.info\n * epel: mirrors.ircam.fr\n * extras: mirror.guru\n * updates: mirror.guru\nResolving Dependencies\n--> Running transaction check\n---> Package ceph-common.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n--> Processing Dependency: librados2 = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: python-rados = 2:13.0.1-3240.g235f211.el7
for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: python-rgw = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libcephfs2 = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: python-rbd = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: python-cephfs = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: librbd1 = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: python-requests for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: python-prettytable for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libibverbs.so.1()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libtcmalloc.so.4()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libleveldb.so.1()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: librbd.so.1()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libsnappy.so.1()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: librados.so.2()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libradosstriper.so.1()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libfuse.so.2()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libbabeltrace.so.1()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libbabeltrace-ctf.so.1()(64bit) for package: 
2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libcephfs.so.2()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libceph-common.so.0()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Running transaction check\n---> Package fuse-libs.x86_64 0:2.9.2-8.el7 will be installed\n---> Package gperftools-libs.x86_64 0:2.4-8.el7 will be installed\n--> Processing Dependency: libunwind.so.8()(64bit) for package: gperftools-libs-2.4-8.el7.x86_64\n---> Package leveldb.x86_64 0:1.12.0-11.el7 will be installed\n---> Package libbabeltrace.x86_64 0:1.2.4-3.el7 will be installed\n---> Package libcephfs2.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n---> Package libibverbs.x86_64 0:13-7.el7 will be installed\n--> Processing Dependency: rdma-core(x86-64) = 13-7.el7 for package: libibverbs-13-7.el7.x86_64\n--> Processing Dependency: perl(warnings) for package: libibverbs-13-7.el7.x86_64\n--> Processing Dependency: perl(strict) for package: libibverbs-13-7.el7.x86_64\n--> Processing Dependency: perl(Getopt::Long) for package: libibverbs-13-7.el7.x86_64\n--> Processing Dependency: perl(File::Basename) for package: libibverbs-13-7.el7.x86_64\n--> Processing Dependency: /usr/bin/perl for package: libibverbs-13-7.el7.x86_64\n---> Package librados2.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n--> Processing Dependency: liblttng-ust.so.0()(64bit) for package: 2:librados2-13.0.1-3240.g235f211.el7.x86_64\n---> Package libradosstriper1.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n---> Package librbd1.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n---> Package python-cephfs.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n---> Package python-prettytable.noarch 0:0.7.2-3.el7 will be installed\n---> Package python-rados.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n---> Package python-rbd.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n---> Package 
python-requests.noarch 0:2.6.0-1.el7_1 will be installed\n--> Processing Dependency: python-urllib3 >= 1.10.2-1 for package: python-requests-2.6.0-1.el7_1.noarch\n---> Package python-rgw.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n--> Processing Dependency: librgw2 = 2:13.0.1-3240.g235f211.el7 for package: 2:python-rgw-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: librgw.so.2()(64bit) for package: 2:python-rgw-13.0.1-3240.g235f211.el7.x86_64\n---> Package snappy.x86_64 0:1.1.0-3.el7 will be installed\n--> Running transaction check\n---> Package librgw2.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n---> Package libunwind.x86_64 2:1.2-2.el7 will be installed\n---> Package lttng-ust.x86_64 0:2.4.1-4.el7 will be installed\n--> Processing Dependency: liburcu-bp.so.1()(64bit) for package: lttng-ust-2.4.1-4.el7.x86_64\n--> Processing Dependency: liburcu-cds.so.1()(64bit) for package: lttng-ust-2.4.1-4.el7.x86_64\n---> Package perl.x86_64 4:5.16.3-292.el7 will be installed\n--> Processing Dependency: perl-libs = 4:5.16.3-292.el7 for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Socket) >= 1.3 for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Scalar::Util) >= 1.10 for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl-macros for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl-libs for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(threads::shared) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(threads) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(constant) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Time::Local) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Time::HiRes) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Storable) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing 
Dependency: perl(Socket) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Scalar::Util) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Pod::Simple::XHTML) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Pod::Simple::Search) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Filter::Util::Call) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(File::Temp) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(File::Spec::Unix) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(File::Spec::Functions) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(File::Spec) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(File::Path) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Exporter) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Cwd) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Carp) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: libperl.so()(64bit) for package: 4:perl-5.16.3-292.el7.x86_64\n---> Package perl-Getopt-Long.noarch 0:2.40-2.el7 will be installed\n--> Processing Dependency: perl(Pod::Usage) >= 1.14 for package: perl-Getopt-Long-2.40-2.el7.noarch\n--> Processing Dependency: perl(Text::ParseWords) for package: perl-Getopt-Long-2.40-2.el7.noarch\n---> Package python-urllib3.noarch 0:1.10.2-3.el7 will be installed\n--> Processing Dependency: python-six for package: python-urllib3-1.10.2-3.el7.noarch\n---> Package rdma-core.x86_64 0:13-7.el7 will be installed\n--> Processing Dependency: pciutils for package: rdma-core-13-7.el7.x86_64\n--> Running transaction check\n---> Package pciutils.x86_64 0:3.5.1-2.el7 will be installed\n---> Package perl-Carp.noarch 0:1.26-244.el7 will be installed\n---> Package perl-Exporter.noarch 0:5.68-3.el7 
will be installed\n---> Package perl-File-Path.noarch 0:2.09-2.el7 will be installed\n---> Package perl-File-Temp.noarch 0:0.23.01-3.el7 will be installed\n---> Package perl-Filter.x86_64 0:1.49-3.el7 will be installed\n---> Package perl-PathTools.x86_64 0:3.40-5.el7 will be installed\n---> Package perl-Pod-Simple.noarch 1:3.28-4.el7 will be installed\n--> Processing Dependency: perl(Pod::Escapes) >= 1.04 for package: 1:perl-Pod-Simple-3.28-4.el7.noarch\n--> Processing Dependency: perl(Encode) for package: 1:perl-Pod-Simple-3.28-4.el7.noarch\n---> Package perl-Pod-Usage.noarch 0:1.63-3.el7 will be installed\n--> Processing Dependency: perl(Pod::Text) >= 3.15 for package: perl-Pod-Usage-1.63-3.el7.noarch\n--> Processing Dependency: perl-Pod-Perldoc for package: perl-Pod-Usage-1.63-3.el7.noarch\n---> Package perl-Scalar-List-Utils.x86_64 0:1.27-248.el7 will be installed\n---> Package perl-Socket.x86_64 0:2.010-4.el7 will be installed\n---> Package perl-Storable.x86_64 0:2.45-3.el7 will be installed\n---> Package perl-Text-ParseWords.noarch 0:3.29-4.el7 will be installed\n---> Package perl-Time-HiRes.x86_64 4:1.9725-3.el7 will be installed\n---> Package perl-Time-Local.noarch 0:1.2300-2.el7 will be installed\n---> Package perl-constant.noarch 0:1.27-2.el7 will be installed\n---> Package perl-libs.x86_64 4:5.16.3-292.el7 will be installed\n---> Package perl-macros.x86_64 4:5.16.3-292.el7 will be installed\n---> Package perl-threads.x86_64 0:1.87-4.el7 will be installed\n---> Package perl-threads-shared.x86_64 0:1.43-6.el7 will be installed\n---> Package python-six.noarch 0:1.9.0-2.el7 will be installed\n---> Package userspace-rcu.x86_64 0:0.7.16-1.el7 will be installed\n--> Running transaction check\n---> Package perl-Encode.x86_64 0:2.51-7.el7 will be installed\n---> Package perl-Pod-Escapes.noarch 1:1.04-292.el7 will be installed\n---> Package perl-Pod-Perldoc.noarch 0:3.20-4.el7 will be installed\n--> Processing Dependency: perl(parent) for package: 
perl-Pod-Perldoc-3.20-4.el7.noarch\n--> Processing Dependency: perl(HTTP::Tiny) for package: perl-Pod-Perldoc-3.20-4.el7.noarch\n---> Package perl-podlators.noarch 0:2.5.1-3.el7 will be installed\n--> Running transaction check\n---> Package perl-HTTP-Tiny.noarch 0:0.033-3.el7 will be installed\n---> Package perl-parent.noarch 1:0.225-244.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository\n Size\n================================================================================\nInstalling:\n ceph-common x86_64 2:13.0.1-3240.g235f211.el7 ceph 13 M\nInstalling for dependencies:\n fuse-libs x86_64 2.9.2-8.el7 base 93 k\n gperftools-libs x86_64 2.4-8.el7 base 272 k\n leveldb x86_64 1.12.0-11.el7 epel 161 k\n libbabeltrace x86_64 1.2.4-3.el7 epel 147 k\n libcephfs2 x86_64 2:13.0.1-3240.g235f211.el7 ceph 423 k\n libibverbs x86_64 13-7.el7 base 194 k\n librados2 x86_64 2:13.0.1-3240.g235f211.el7 ceph 2.7 M\n libradosstriper1 x86_64 2:13.0.1-3240.g235f211.el7 ceph 332 k\n librbd1 x86_64 2:13.0.1-3240.g235f211.el7 ceph 1.2 M\n librgw2 x86_64 2:13.0.1-3240.g235f211.el7 ceph 1.9 M\n libunwind x86_64 2:1.2-2.el7 base 57 k\n lttng-ust x86_64 2.4.1-4.el7 epel 176 k\n pciutils x86_64 3.5.1-2.el7 base 93 k\n perl x86_64 4:5.16.3-292.el7 base 8.0 M\n perl-Carp noarch 1.26-244.el7 base 19 k\n perl-Encode x86_64 2.51-7.el7 base 1.5 M\n perl-Exporter noarch 5.68-3.el7 base 28 k\n perl-File-Path noarch 2.09-2.el7 base 26 k\n perl-File-Temp noarch 0.23.01-3.el7 base 56 k\n perl-Filter x86_64 1.49-3.el7 base 76 k\n perl-Getopt-Long noarch 2.40-2.el7 base 56 k\n perl-HTTP-Tiny noarch 0.033-3.el7 base 38 k\n perl-PathTools x86_64 3.40-5.el7 base 82 k\n perl-Pod-Escapes noarch 1:1.04-292.el7 base 51 k\n perl-Pod-Perldoc noarch 3.20-4.el7 base 87 k\n perl-Pod-Simple noarch 1:3.28-4.el7 base 216 k\n perl-Pod-Usage noarch 1.63-3.el7 base 27 k\n 
perl-Scalar-List-Utils x86_64 1.27-248.el7 base 36 k\n perl-Socket x86_64 2.010-4.el7 base 49 k\n perl-Storable x86_64 2.45-3.el7 base 77 k\n perl-Text-ParseWords noarch 3.29-4.el7 base 14 k\n perl-Time-HiRes x86_64 4:1.9725-3.el7 base 45 k\n perl-Time-Local noarch 1.2300-2.el7 base 24 k\n perl-constant noarch 1.27-2.el7 base 19 k\n perl-libs x86_64 4:5.16.3-292.el7 base 688 k\n perl-macros x86_64 4:5.16.3-292.el7 base 43 k\n perl-parent noarch 1:0.225-244.el7 base 12 k\n perl-podlators noarch 2.5.1-3.el7 base 112 k\n perl-threads x86_64 1.87-4.el7 base 49 k\n perl-threads-shared x86_64 1.43-6.el7 base 39 k\n python-cephfs x86_64 2:13.0.1-3240.g235f211.el7 ceph 82 k\n python-prettytable noarch 0.7.2-3.el7 base 37 k\n python-rados x86_64 2:13.0.1-3240.g235f211.el7 ceph 183 k\n python-rbd x86_64 2:13.0.1-3240.g235f211.el7 ceph 126 k\n python-requests noarch 2.6.0-1.el7_1 base 94 k\n python-rgw x86_64 2:13.0.1-3240.g235f211.el7 ceph 74 k\n python-six noarch 1.9.0-2.el7 base 29 k\n python-urllib3 noarch 1.10.2-3.el7 base 101 k\n rdma-core x86_64 13-7.el7 base 43 k\n snappy x86_64 1.1.0-3.el7 base 40 k\n userspace-rcu x86_64 0.7.16-1.el7 epel 73 k\n\nTransaction Summary\n================================================================================\nInstall 1 Package (+51 Dependent packages)\n\nTotal download size: 33 M\nInstalled size: 108 M\nDownloading packages:\nPublic key for leveldb-1.12.0-11.el7.x86_64.rpm is not installed\n--------------------------------------------------------------------------------\nTotal 7.4 MB/s | 33 MB 00:04 \nRetrieving key from file:///etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-7\nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : snappy-1.1.0-3.el7.x86_64 1/52 \n Installing : leveldb-1.12.0-11.el7.x86_64 2/52 \n Installing : 1:perl-parent-0.225-244.el7.noarch 3/52 \n Installing : perl-HTTP-Tiny-0.033-3.el7.noarch 4/52 \n Installing : perl-podlators-2.5.1-3.el7.noarch 5/52 \n 
Installing : perl-Pod-Perldoc-3.20-4.el7.noarch 6/52 \n Installing : perl-Text-ParseWords-3.29-4.el7.noarch 7/52 \n Installing : 1:perl-Pod-Escapes-1.04-292.el7.noarch 8/52 \n Installing : perl-Encode-2.51-7.el7.x86_64 9/52 \n Installing : perl-Pod-Usage-1.63-3.el7.noarch 10/52 \n Installing : 4:perl-macros-5.16.3-292.el7.x86_64 11/52 \n Installing : 4:perl-libs-5.16.3-292.el7.x86_64 12/52 \n Installing : perl-Storable-2.45-3.el7.x86_64 13/52 \n Installing : 4:perl-Time-HiRes-1.9725-3.el7.x86_64 14/52 \n Installing : perl-constant-1.27-2.el7.noarch 15/52 \n Installing : perl-Time-Local-1.2300-2.el7.noarch 16/52 \n Installing : perl-Socket-2.010-4.el7.x86_64 17/52 \n Installing : perl-Carp-1.26-244.el7.noarch 18/52 \n Installing : perl-PathTools-3.40-5.el7.x86_64 19/52 \n Installing : perl-Scalar-List-Utils-1.27-248.el7.x86_64 20/52 \n Installing : perl-Exporter-5.68-3.el7.noarch 21/52 \n Installing : perl-File-Temp-0.23.01-3.el7.noarch 22/52 \n Installing : perl-File-Path-2.09-2.el7.noarch 23/52 \n Installing : perl-threads-shared-1.43-6.el7.x86_64 24/52 \n Installing : perl-threads-1.87-4.el7.x86_64 25/52 \n Installing : perl-Filter-1.49-3.el7.x86_64 26/52 \n Installing : 1:perl-Pod-Simple-3.28-4.el7.noarch 27/52 \n Installing : perl-Getopt-Long-2.40-2.el7.noarch 28/52 \n Installing : 4:perl-5.16.3-292.el7.x86_64 29/52 \n Installing : userspace-rcu-0.7.16-1.el7.x86_64 30/52 \n Installing : lttng-ust-2.4.1-4.el7.x86_64 31/52 \n Installing : python-prettytable-0.7.2-3.el7.noarch 32/52 \n Installing : 2:libunwind-1.2-2.el7.x86_64 33/52 \n Installing : gperftools-libs-2.4-8.el7.x86_64 34/52 \n Installing : fuse-libs-2.9.2-8.el7.x86_64 35/52 \n Installing : pciutils-3.5.1-2.el7.x86_64 36/52 \n Installing : rdma-core-13-7.el7.x86_64 37/52 \n Installing : libibverbs-13-7.el7.x86_64 38/52 \n Installing : 2:librados2-13.0.1-3240.g235f211.el7.x86_64 39/52 \n Installing : 2:python-rados-13.0.1-3240.g235f211.el7.x86_64 40/52 \n Installing : 
2:libcephfs2-13.0.1-3240.g235f211.el7.x86_64 41/52 \n Installing : 2:librbd1-13.0.1-3240.g235f211.el7.x86_64 42/52 \n Installing : 2:python-rbd-13.0.1-3240.g235f211.el7.x86_64 43/52 \n Installing : 2:python-cephfs-13.0.1-3240.g235f211.el7.x86_64 44/52 \n Installing : 2:libradosstriper1-13.0.1-3240.g235f211.el7.x86_64 45/52 \n Installing : 2:librgw2-13.0.1-3240.g235f211.el7.x86_64 46/52 \n Installing : 2:python-rgw-13.0.1-3240.g235f211.el7.x86_64 47/52 \n Installing : python-six-1.9.0-2.el7.noarch 48/52 \n Installing : python-urllib3-1.10.2-3.el7.noarch 49/52 \n Installing : python-requests-2.6.0-1.el7_1.noarch 50/52 \n Installing : libbabeltrace-1.2.4-3.el7.x86_64 51/52 \n Installing : 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64 52/52 \n Verifying : perl-HTTP-Tiny-0.033-3.el7.noarch 1/52 \n Verifying : 2:python-rbd-13.0.1-3240.g235f211.el7.x86_64 2/52 \n Verifying : leveldb-1.12.0-11.el7.x86_64 3/52 \n Verifying : perl-Storable-2.45-3.el7.x86_64 4/52 \n Verifying : rdma-core-13-7.el7.x86_64 5/52 \n Verifying : 4:perl-Time-HiRes-1.9725-3.el7.x86_64 6/52 \n Verifying : 2:libradosstriper1-13.0.1-3240.g235f211.el7.x86_64 7/52 \n Verifying : 2:python-cephfs-13.0.1-3240.g235f211.el7.x86_64 8/52 \n Verifying : perl-constant-1.27-2.el7.noarch 9/52 \n Verifying : perl-PathTools-3.40-5.el7.x86_64 10/52 \n Verifying : 4:perl-macros-5.16.3-292.el7.x86_64 11/52 \n Verifying : python-urllib3-1.10.2-3.el7.noarch 12/52 \n Verifying : 2:librados2-13.0.1-3240.g235f211.el7.x86_64 13/52 \n Verifying : libbabeltrace-1.2.4-3.el7.x86_64 14/52 \n Verifying : 4:perl-5.16.3-292.el7.x86_64 15/52 \n Verifying : perl-File-Temp-0.23.01-3.el7.noarch 16/52 \n Verifying : 1:perl-Pod-Simple-3.28-4.el7.noarch 17/52 \n Verifying : perl-Time-Local-1.2300-2.el7.noarch 18/52 \n Verifying : perl-podlators-2.5.1-3.el7.noarch 19/52 \n Verifying : 4:perl-libs-5.16.3-292.el7.x86_64 20/52 \n Verifying : perl-Pod-Perldoc-3.20-4.el7.noarch 21/52 \n Verifying : python-six-1.9.0-2.el7.noarch 22/52 \n Verifying 
: 2:librgw2-13.0.1-3240.g235f211.el7.x86_64 23/52 \n Verifying : perl-Socket-2.010-4.el7.x86_64 24/52 \n Verifying : pciutils-3.5.1-2.el7.x86_64 25/52 \n Verifying : perl-Carp-1.26-244.el7.noarch 26/52 \n Verifying : 2:libcephfs2-13.0.1-3240.g235f211.el7.x86_64 27/52 \n Verifying : gperftools-libs-2.4-8.el7.x86_64 28/52 \n Verifying : perl-threads-shared-1.43-6.el7.x86_64 29/52 \n Verifying : 2:python-rados-13.0.1-3240.g235f211.el7.x86_64 30/52 \n Verifying : 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64 31/52 \n Verifying : lttng-ust-2.4.1-4.el7.x86_64 32/52 \n Verifying : perl-Scalar-List-Utils-1.27-248.el7.x86_64 33/52 \n Verifying : fuse-libs-2.9.2-8.el7.x86_64 34/52 \n Verifying : 2:libunwind-1.2-2.el7.x86_64 35/52 \n Verifying : perl-Pod-Usage-1.63-3.el7.noarch 36/52 \n Verifying : snappy-1.1.0-3.el7.x86_64 37/52 \n Verifying : perl-Encode-2.51-7.el7.x86_64 38/52 \n Verifying : perl-Exporter-5.68-3.el7.noarch 39/52 \n Verifying : python-prettytable-0.7.2-3.el7.noarch 40/52 \n Verifying : 2:python-rgw-13.0.1-3240.g235f211.el7.x86_64 41/52 \n Verifying : libibverbs-13-7.el7.x86_64 42/52 \n Verifying : perl-Getopt-Long-2.40-2.el7.noarch 43/52 \n Verifying : perl-File-Path-2.09-2.el7.noarch 44/52 \n Verifying : python-requests-2.6.0-1.el7_1.noarch 45/52 \n Verifying : perl-threads-1.87-4.el7.x86_64 46/52 \n Verifying : userspace-rcu-0.7.16-1.el7.x86_64 47/52 \n Verifying : 2:librbd1-13.0.1-3240.g235f211.el7.x86_64 48/52 \n Verifying : perl-Filter-1.49-3.el7.x86_64 49/52 \n Verifying : perl-Text-ParseWords-3.29-4.el7.noarch 50/52 \n Verifying : 1:perl-parent-0.225-244.el7.noarch 51/52 \n Verifying : 1:perl-Pod-Escapes-1.04-292.el7.noarch 52/52 \n\nInstalled:\n ceph-common.x86_64 2:13.0.1-3240.g235f211.el7 \n\nDependency Installed:\n fuse-libs.x86_64 0:2.9.2-8.el7 \n gperftools-libs.x86_64 0:2.4-8.el7 \n leveldb.x86_64 0:1.12.0-11.el7 \n libbabeltrace.x86_64 0:1.2.4-3.el7 \n libcephfs2.x86_64 2:13.0.1-3240.g235f211.el7 \n libibverbs.x86_64 0:13-7.el7 \n 
librados2.x86_64 2:13.0.1-3240.g235f211.el7 \n libradosstriper1.x86_64 2:13.0.1-3240.g235f211.el7 \n librbd1.x86_64 2:13.0.1-3240.g235f211.el7 \n librgw2.x86_64 2:13.0.1-3240.g235f211.el7 \n libunwind.x86_64 2:1.2-2.el7 \n lttng-ust.x86_64 0:2.4.1-4.el7 \n pciutils.x86_64 0:3.5.1-2.el7 \n perl.x86_64 4:5.16.3-292.el7 \n perl-Carp.noarch 0:1.26-244.el7 \n perl-Encode.x86_64 0:2.51-7.el7 \n perl-Exporter.noarch 0:5.68-3.el7 \n perl-File-Path.noarch 0:2.09-2.el7 \n perl-File-Temp.noarch 0:0.23.01-3.el7 \n perl-Filter.x86_64 0:1.49-3.el7 \n perl-Getopt-Long.noarch 0:2.40-2.el7 \n perl-HTTP-Tiny.noarch 0:0.033-3.el7 \n perl-PathTools.x86_64 0:3.40-5.el7 \n perl-Pod-Escapes.noarch 1:1.04-292.el7 \n perl-Pod-Perldoc.noarch 0:3.20-4.el7 \n perl-Pod-Simple.noarch 1:3.28-4.el7 \n perl-Pod-Usage.noarch 0:1.63-3.el7 \n perl-Scalar-List-Utils.x86_64 0:1.27-248.el7 \n perl-Socket.x86_64 0:2.010-4.el7 \n perl-Storable.x86_64 0:2.45-3.el7 \n perl-Text-ParseWords.noarch 0:3.29-4.el7 \n perl-Time-HiRes.x86_64 4:1.9725-3.el7 \n perl-Time-Local.noarch 0:1.2300-2.el7 \n perl-constant.noarch 0:1.27-2.el7 \n perl-libs.x86_64 4:5.16.3-292.el7 \n perl-macros.x86_64 4:5.16.3-292.el7 \n perl-parent.noarch 1:0.225-244.el7 \n perl-podlators.noarch 0:2.5.1-3.el7 \n perl-threads.x86_64 0:1.87-4.el7 \n perl-threads-shared.x86_64 0:1.43-6.el7 \n python-cephfs.x86_64 2:13.0.1-3240.g235f211.el7 \n python-prettytable.noarch 0:0.7.2-3.el7 \n python-rados.x86_64 2:13.0.1-3240.g235f211.el7 \n python-rbd.x86_64 2:13.0.1-3240.g235f211.el7 \n python-requests.noarch 0:2.6.0-1.el7_1 \n python-rgw.x86_64 2:13.0.1-3240.g235f211.el7 \n python-six.noarch 0:1.9.0-2.el7 \n python-urllib3.noarch 0:1.10.2-3.el7 \n rdma-core.x86_64 0:13-7.el7 \n snappy.x86_64 0:1.1.0-3.el7 \n userspace-rcu.x86_64 0:0.7.16-1.el7 \n\nComplete!\n" ] } MSG: warning: /var/cache/yum/x86_64/7/epel/packages/leveldb-1.12.0-11.el7.x86_64.rpm: Header V3 RSA/SHA256 Signature, key ID 352c64e5: NOKEY Importing GPG key 0x352C64E5: Userid : "Fedora 
EPEL (7) " Fingerprint: 91e9 7d7c 4a5e 96f1 7f3e 888f 6a2f aea2 352c 64e5 Package : epel-release-7-9.noarch (@extras) From : /etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-7 TASK [ceph-common : install redhat ceph-mon package] *************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:21 changed: [mon0] => { "changed": true, "failed": false, "rc": 0, "results": [ "Loaded plugins: fastestmirror\nLoading mirror speeds from cached hostfile\n * base: centos.mirror.ate.info\n * epel: mirror.switch.ch\n * extras: mirror.guru\n * updates: mirror.guru\nResolving Dependencies\n--> Running transaction check\n---> Package ceph-mon.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n--> Processing Dependency: ceph-base = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-mon-13.0.1-3240.g235f211.el7.x86_64\n--> Running transaction check\n---> Package ceph-base.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n--> Processing Dependency: ceph-selinux = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-base-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: psmisc for package: 2:ceph-base-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: cryptsetup for package: 2:ceph-base-13.0.1-3240.g235f211.el7.x86_64\n--> Running transaction check\n---> Package ceph-selinux.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n---> Package cryptsetup.x86_64 0:1.7.4-3.el7_4.1 will be installed\n---> Package psmisc.x86_64 0:22.20-15.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository Size\n================================================================================\nInstalling:\n ceph-mon x86_64 2:13.0.1-3240.g235f211.el7 ceph 3.6 M\nInstalling for dependencies:\n ceph-base x86_64 2:13.0.1-3240.g235f211.el7 ceph 4.6 M\n ceph-selinux x86_64 
2:13.0.1-3240.g235f211.el7 ceph 19 k\n cryptsetup x86_64 1.7.4-3.el7_4.1 updates 128 k\n psmisc x86_64 22.20-15.el7 base 141 k\n\nTransaction Summary\n================================================================================\nInstall 1 Package (+4 Dependent packages)\n\nTotal download size: 8.5 M\nInstalled size: 34 M\nDownloading packages:\n--------------------------------------------------------------------------------\nTotal 4.3 MB/s | 8.5 MB 00:01 \nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : cryptsetup-1.7.4-3.el7_4.1.x86_64 1/5 \n Installing : psmisc-22.20-15.el7.x86_64 2/5 \n Installing : 2:ceph-base-13.0.1-3240.g235f211.el7.x86_64 3/5 \n Installing : 2:ceph-selinux-13.0.1-3240.g235f211.el7.x86_64 4/5 \n/usr/lib/python2.7/site-packages/ceph_disk/main.py:5689: UserWarning: \n*******************************************************************************\nThis tool is now deprecated in favor of ceph-volume.\nIt is recommended to use ceph-volume for OSD deployments. For details see:\n\n http://docs.ceph.com/docs/master/ceph-volume/#migrating\n\n*******************************************************************************\n\n warnings.warn(DEPRECATION_WARNING)\n/usr/lib/python2.7/site-packages/ceph_disk/main.py:5721: UserWarning: \n*******************************************************************************\nThis tool is now deprecated in favor of ceph-volume.\nIt is recommended to use ceph-volume for OSD deployments. 
For details see:\n\n http://docs.ceph.com/docs/master/ceph-volume/#migrating\n\n*******************************************************************************\n\n warnings.warn(DEPRECATION_WARNING)\n Installing : 2:ceph-mon-13.0.1-3240.g235f211.el7.x86_64 5/5 \n Verifying : psmisc-22.20-15.el7.x86_64 1/5 \n Verifying : 2:ceph-mon-13.0.1-3240.g235f211.el7.x86_64 2/5 \n Verifying : cryptsetup-1.7.4-3.el7_4.1.x86_64 3/5 \n Verifying : 2:ceph-base-13.0.1-3240.g235f211.el7.x86_64 4/5 \n Verifying : 2:ceph-selinux-13.0.1-3240.g235f211.el7.x86_64 5/5 \n\nInstalled:\n ceph-mon.x86_64 2:13.0.1-3240.g235f211.el7 \n\nDependency Installed:\n ceph-base.x86_64 2:13.0.1-3240.g235f211.el7 \n ceph-selinux.x86_64 2:13.0.1-3240.g235f211.el7 \n cryptsetup.x86_64 0:1.7.4-3.el7_4.1 \n psmisc.x86_64 0:22.20-15.el7 \n\nComplete!\n" ] } TASK [ceph-common : install redhat ceph-osd package] *************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:28 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : install redhat ceph-fuse package] ************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:35 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : install redhat ceph-base package] ************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:42 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : install redhat ceph-test package] ************************** task path: 
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:49 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : install redhat ceph-radosgw package] *********************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:56 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : include installs/install_on_suse.yml] ********************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:31 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : include installs/install_on_debian.yml] ******************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:40 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : include installs/install_on_clear.yml] ********************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:49 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : include ntp debian setup tasks] **************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:58 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : include ntp rpm setup tasks] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:66 included: 
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/ntp_rpm.yml for mon0 TASK [ceph-common : install ntp] *********************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/ntp_rpm.yml:2 changed: [mon0] => { "changed": true, "failed": false, "rc": 0, "results": [ "Loaded plugins: fastestmirror\nLoading mirror speeds from cached hostfile\n * base: centos.mirror.ate.info\n * epel: mirror.switch.ch\n * extras: mirror.guru\n * updates: mirror.guru\nResolving Dependencies\n--> Running transaction check\n---> Package ntp.x86_64 0:4.2.6p5-25.el7.centos.2 will be installed\n--> Processing Dependency: ntpdate = 4.2.6p5-25.el7.centos.2 for package: ntp-4.2.6p5-25.el7.centos.2.x86_64\n--> Processing Dependency: libopts.so.25()(64bit) for package: ntp-4.2.6p5-25.el7.centos.2.x86_64\n--> Running transaction check\n---> Package autogen-libopts.x86_64 0:5.18-5.el7 will be installed\n---> Package ntpdate.x86_64 0:4.2.6p5-25.el7.centos.2 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository\n Size\n================================================================================\nInstalling:\n ntp x86_64 4.2.6p5-25.el7.centos.2 base 547 k\nInstalling for dependencies:\n autogen-libopts x86_64 5.18-5.el7 base 66 k\n ntpdate x86_64 4.2.6p5-25.el7.centos.2 base 86 k\n\nTransaction Summary\n================================================================================\nInstall 1 Package (+2 Dependent packages)\n\nTotal download size: 699 k\nInstalled size: 1.6 M\nDownloading packages:\n--------------------------------------------------------------------------------\nTotal 1.6 MB/s | 699 kB 00:00 \nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : 
autogen-libopts-5.18-5.el7.x86_64 1/3 \n Installing : ntpdate-4.2.6p5-25.el7.centos.2.x86_64 2/3 \n Installing : ntp-4.2.6p5-25.el7.centos.2.x86_64 3/3 \n Verifying : ntp-4.2.6p5-25.el7.centos.2.x86_64 1/3 \n Verifying : ntpdate-4.2.6p5-25.el7.centos.2.x86_64 2/3 \n Verifying : autogen-libopts-5.18-5.el7.x86_64 3/3 \n\nInstalled:\n ntp.x86_64 0:4.2.6p5-25.el7.centos.2 \n\nDependency Installed:\n autogen-libopts.x86_64 0:5.18-5.el7 ntpdate.x86_64 0:4.2.6p5-25.el7.centos.2 \n\nComplete!\n" ] } TASK [ceph-common : start the ntp service] ************************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/ntp_rpm.yml:7 changed: [mon0] => { "changed": true, "enabled": true, "failed": false, "name": "ntpd", "state": "started", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "-.mount ntpdate.service syslog.target basic.target system.slice systemd-journald.socket sntp.service tmp.mount", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "chronyd.service shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConflictedBy": "chronyd.service", "Conflicts": "shutdown.target", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "Network Time Service", "DevicePolicy": "auto", "EnvironmentFile": "/etc/sysconfig/ntpd (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", 
"ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/ntpd ; argv[]=/usr/sbin/ntpd -u ntp:ntp $OPTIONS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/ntpd.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "ntpd.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "1885", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "1885", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "ntpd.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "yes", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Requires": "-.mount basic.target", "RequiresMountsFor": "/var/tmp", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", 
"SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "forking", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "disabled", "Wants": "system.slice", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [ceph-common : get ceph version] ****************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:74 ok: [mon0] => { "changed": false, "cmd": [ "ceph", "--version" ], "delta": "0:00:00.173836", "end": "2018-03-25 23:23:27.288910", "failed": false, "rc": 0, "start": "2018-03-25 23:23:27.115074" } STDOUT: ceph version 13.0.1-3240-g235f211 (235f2119010484c12c5bd29421aeef7d44df38a1) mimic (dev) TASK [ceph-common : set_fact ceph_version] ************************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:80 ok: [mon0] => { "ansible_facts": { "ceph_version": "13.0.1-3240-g235f211" }, "changed": false, "failed": false } TASK [ceph-common : set_fact ceph_release jewel] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/release-rhcs.yml:2 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } 
TASK [ceph-common : set_fact ceph_release kraken] ****************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/release-rhcs.yml:8 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : set_fact ceph_release luminous] **************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/release-rhcs.yml:14 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : set_fact ceph_release mimic] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/release-rhcs.yml:20 ok: [mon0] => { "ansible_facts": { "ceph_release": "mimic" }, "changed": false, "failed": false } TASK [ceph-common : check if /var/lib/ceph/mon/ceph-mon0/keyring already exists] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/facts_mon_fsid.yml:2 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : fail if /var/lib/ceph/mon/ceph-mon0/keyring doesn't exist] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/facts_mon_fsid.yml:7 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : get existing initial mon keyring if it already exists but not monitor_keyring.conf in /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/facts_mon_fsid.yml:14 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result 
was False", "skipped": true } TASK [ceph-common : test existing initial mon keyring] ************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/facts_mon_fsid.yml:22 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : fail if initial mon keyring found doesn't work] ************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/facts_mon_fsid.yml:27 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : write initial mon keyring in /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch/monitor_keyring.conf if it doesn't exist] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/facts_mon_fsid.yml:33 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : put initial mon keyring in mon kv store] ******************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/facts_mon_fsid.yml:41 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : create ceph initial directories] *************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/create_ceph_initial_dirs.yml:2 changed: [mon0] => (item=/etc/ceph) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/etc/ceph", "mode": "0755", "owner": "ceph", "path": "/etc/ceph", "secontext": "system_u:object_r:etc_t:s0", "size": 20, "state": "directory", "uid": 167 } changed: [mon0] => (item=/var/lib/ceph/) => { "changed": true, "failed": 
false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/", "secontext": "system_u:object_r:ceph_var_lib_t:s0", "size": 133, "state": "directory", "uid": 167 } changed: [mon0] => (item=/var/lib/ceph/mon) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/mon", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/mon", "secontext": "system_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } changed: [mon0] => (item=/var/lib/ceph/osd) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/osd", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/osd", "secontext": "unconfined_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } changed: [mon0] => (item=/var/lib/ceph/mds) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/mds", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/mds", "secontext": "unconfined_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } changed: [mon0] => (item=/var/lib/ceph/tmp) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/tmp", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/tmp", "secontext": "system_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } changed: [mon0] => (item=/var/lib/ceph/radosgw) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/radosgw", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/radosgw", "secontext": "unconfined_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } changed: [mon0] => (item=/var/lib/ceph/bootstrap-rgw) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/bootstrap-rgw", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/bootstrap-rgw", "secontext": 
"system_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } changed: [mon0] => (item=/var/lib/ceph/bootstrap-mds) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/bootstrap-mds", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/bootstrap-mds", "secontext": "system_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } changed: [mon0] => (item=/var/lib/ceph/bootstrap-osd) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/bootstrap-osd", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/bootstrap-osd", "secontext": "system_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } changed: [mon0] => (item=/var/lib/ceph/bootstrap-rbd) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/bootstrap-rbd", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/bootstrap-rbd", "secontext": "system_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } TASK [ceph-common : create rbd client directory] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/create_rbd_client_dir.yml:2 ok: [mon0] => (item=/var/run/ceph) => { "changed": false, "failed": false, "gid": 167, "group": "ceph", "item": "/var/run/ceph", "mode": "0770", "owner": "ceph", "path": "/var/run/ceph", "secontext": "system_u:object_r:ceph_var_run_t:s0", "size": 40, "state": "directory", "uid": 167 } changed: [mon0] => (item=/var/log/ceph) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/log/ceph", "mode": "0770", "owner": "ceph", "path": "/var/log/ceph", "secontext": "system_u:object_r:ceph_log_t:s0", "size": 6, "state": "directory", "uid": 167 } TASK [ceph-common : configure cluster name] ************************************ task path: 
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/configure_cluster_name.yml:2 changed: [mon0] => { "backup": "", "changed": true, "failed": false } MSG: line added TASK [ceph-common : check /etc/default/ceph exist] ***************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/configure_cluster_name.yml:22 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : configure cluster name] ************************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/configure_cluster_name.yml:30 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : configure cluster name] ************************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/configure_cluster_name.yml:42 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : configure TCMALLOC_MAX_TOTAL_THREAD_CACHE_BYTES for debian] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/configure_memory_allocator.yml:2 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : configure TCMALLOC_MAX_TOTAL_THREAD_CACHE_BYTES for redhat] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/configure_memory_allocator.yml:15 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-config : create ceph conf directory] ******************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-config/tasks/main.yml:4 ok: [mon0] => { 
"changed": false, "failed": false, "gid": 167, "group": "ceph", "mode": "0755", "owner": "ceph", "path": "/etc/ceph", "secontext": "system_u:object_r:etc_t:s0", "size": 20, "state": "directory", "uid": 167 } TASK [ceph-config : generate ceph configuration file: ceph.conf] *************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-config/tasks/main.yml:12 [DEPRECATION WARNING]: ansible.utils.unicode.to_bytes is deprecated. Use ansible.module_utils._text.to_bytes instead. This feature will be removed in version 2.4. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. [DEPRECATION WARNING]: ansible.utils.unicode.to_unicode is deprecated. Use ansible.module_utils._text.to_text instead. This feature will be removed in version 2.4. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. NOTIFIED HANDLER ceph-defaults : set _mon_handler_called before restart NOTIFIED HANDLER ceph-defaults : copy mon restart script NOTIFIED HANDLER ceph-defaults : restart ceph mon daemon(s) - non container NOTIFIED HANDLER ceph-defaults : restart ceph mon daemon(s) - container NOTIFIED HANDLER ceph-defaults : set _mon_handler_called after restart NOTIFIED HANDLER ceph-defaults : set _osd_handler_called before restart NOTIFIED HANDLER ceph-defaults : copy osd restart script NOTIFIED HANDLER ceph-defaults : restart ceph osds daemon(s) - non container NOTIFIED HANDLER ceph-defaults : restart ceph osds daemon(s) - container NOTIFIED HANDLER ceph-defaults : set _osd_handler_called after restart NOTIFIED HANDLER ceph-defaults : set _mds_handler_called before restart NOTIFIED HANDLER ceph-defaults : copy mds restart script NOTIFIED HANDLER ceph-defaults : restart ceph mds daemon(s) - non container NOTIFIED HANDLER ceph-defaults : restart ceph mds daemon(s) - container NOTIFIED HANDLER ceph-defaults : set _mds_handler_called after restart NOTIFIED HANDLER ceph-defaults : set 
_rgw_handler_called before restart NOTIFIED HANDLER ceph-defaults : copy rgw restart script NOTIFIED HANDLER ceph-defaults : restart ceph rgw daemon(s) - non container NOTIFIED HANDLER ceph-defaults : restart ceph rgw daemon(s) - container NOTIFIED HANDLER ceph-defaults : set _rgw_handler_called after restart NOTIFIED HANDLER ceph-defaults : set _mgr_handler_called before restart NOTIFIED HANDLER ceph-defaults : copy mgr restart script NOTIFIED HANDLER ceph-defaults : restart ceph mgr daemon(s) - non container NOTIFIED HANDLER ceph-defaults : restart ceph mgr daemon(s) - container NOTIFIED HANDLER ceph-defaults : set _mgr_handler_called after restart NOTIFIED HANDLER ceph-defaults : set _rbdmirror_handler_called before restart NOTIFIED HANDLER ceph-defaults : copy rbd mirror restart script NOTIFIED HANDLER ceph-defaults : restart ceph rbd mirror daemon(s) - non container NOTIFIED HANDLER ceph-defaults : restart ceph rbd mirror daemon(s) - container NOTIFIED HANDLER ceph-defaults : set _rbdmirror_handler_called after restart changed: [mon0] => { "changed": true, "checksum": "cb71fd0cdb80058ca3f4d3d6ebff6b61ca8bcfe0", "dest": "/etc/ceph/ceph.conf", "failed": false, "gid": 167, "group": "ceph", "md5sum": "79bb8c78f52d233eb04c21b2afe2c1d4", "mode": "0644", "owner": "ceph", "secontext": "system_u:object_r:etc_t:s0", "size": 224, "src": "/home/vagrant/.ansible/tmp/ansible-tmp-1522020247.01-255047799063698/source", "state": "file", "uid": 167 } TASK [ceph-config : create a local fetch directory if it does not exist] ******* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-config/tasks/main.yml:38 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-config : generate cluster uuid] ************************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-config/tasks/main.yml:54 skipping: [mon0] => { "changed": false, 
"skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-config : read cluster uuid if it already exists] ******************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-config/tasks/main.yml:64 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-config : ensure /etc/ceph exists] *********************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-config/tasks/main.yml:76 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-config : generate ceph.conf configuration file] ********************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-config/tasks/main.yml:86 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-config : set fsid fact when generate_fsid = true] ******************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-config/tasks/main.yml:104 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : set_fact docker_exec_cmd] ************************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/main.yml:2 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : make sure monitor_interface or monitor_address or monitor_address_block is configured] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/check_mandatory_vars.yml:2 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : make sure pg num is set for cephfs pools] ********************* task path: 
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/check_mandatory_vars.yml:10 skipping: [mon0] => (item={u'name': u'cephfs_data', u'pgs': u''}) => { "changed": false, "item": { "name": "cephfs_data", "pgs": "" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'name': u'cephfs_metadata', u'pgs': u''}) => { "changed": false, "item": { "name": "cephfs_metadata", "pgs": "" }, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : generate monitor initial keyring] ***************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/deploy_monitors.yml:2 changed: [mon0 -> localhost] => { "changed": true, "cmd": "python2 -c \"import os ; import struct ; import time; import base64 ; key = os.urandom(16) ; header = struct.pack('[… truncated: the log renderer consumed everything between the "<" of the struct format string and the "->" of the next task's result, including the rest of this command and the following TASK header …] [mon0 -> localhost] => { "changed": false, "cmd": [ "cat", "/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch/monitor_keyring.conf" ], "delta": "0:00:00.007348", "end": "2018-03-26 01:24:13.071890", "failed": false, "rc": 0, "start": "2018-03-26 01:24:13.064542" } STDOUT: AQCcL7haAAAAABAAAxylitkxMuvnrU9S0foD/w== TASK [ceph-mon : create monitor initial keyring] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/deploy_monitors.yml:22 changed: [mon0] => { "changed": true, "cmd": [ "ceph-authtool", "/var/lib/ceph/tmp/keyring.mon.mon0", "--create-keyring", "--name=mon.", "--add-key=AQCcL7haAAAAABAAAxylitkxMuvnrU9S0foD/w==", "--cap", "mon", "allow *" ], "delta": "0:00:05.099617", "end": "2018-03-25 23:24:20.772074", "failed": false, "rc": 0, "start": "2018-03-25 23:24:15.672457" } STDOUT: creating /var/lib/ceph/tmp/keyring.mon.mon0 added entity mon. 
auth auth(auid = 18446744073709551615 key=AQCcL7haAAAAABAAAxylitkxMuvnrU9S0foD/w== with 0 caps) TASK [ceph-mon : set initial monitor key permissions] ************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/deploy_monitors.yml:28 changed: [mon0] => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "mode": "0600", "owner": "ceph", "path": "/var/lib/ceph/tmp/keyring.mon.mon0", "secontext": "unconfined_u:object_r:ceph_var_lib_t:s0", "size": 77, "state": "file", "uid": 167 } TASK [ceph-mon : create (and fix ownership of) monitor directory] ************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/deploy_monitors.yml:36 changed: [mon0] => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/mon/ceph-mon0", "secontext": "unconfined_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } TASK [ceph-mon : set_fact ceph_authtool_cap >= ceph_release_num.luminous] ****** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/deploy_monitors.yml:45 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : set_fact ceph_authtool_cap < ceph_release_num.luminous] ******* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/deploy_monitors.yml:53 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : create custom admin keyring] ********************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/deploy_monitors.yml:61 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : set ownership of admin keyring] 
******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/deploy_monitors.yml:70 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : import admin keyring into mon keyring] ************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/deploy_monitors.yml:81 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : ceph monitor mkfs with keyring] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/deploy_monitors.yml:88 changed: [mon0] => { "changed": true, "cmd": [ "ceph-mon", "--cluster", "ceph", "--setuser", "ceph", "--setgroup", "ceph", "--mkfs", "-i", "mon0", "--fsid", "fad7031c-4bd3-4328-9716-4fbddc9c14b8", "--keyring", "/var/lib/ceph/tmp/keyring.mon.mon0" ], "delta": "0:00:00.218715", "end": "2018-03-25 23:24:28.838096", "failed": false, "rc": 0, "start": "2018-03-25 23:24:28.619381" } TASK [ceph-mon : ceph monitor mkfs without keyring] **************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/deploy_monitors.yml:95 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : ensure systemd service override directory exists] ************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/start_monitor.yml:2 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : add ceph-mon systemd service overrides] *********************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/start_monitor.yml:10 skipping: [mon0] => { "changed": false, "skip_reason": 
"Conditional result was False", "skipped": true } TASK [ceph-mon : start the monitor service] ************************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/start_monitor.yml:20 ok: [mon0] => { "changed": false, "enabled": true, "failed": false, "name": "ceph-mon@mon0", "state": "started", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "time-sync.target system-ceph\\x5cx2dmon.slice -.mount basic.target tmp.mount network-online.target local-fs.target systemd-journald.socket", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073575333887", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "shutdown.target", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "Ceph cluster monitor daemon", "DevicePolicy": "closed", "Environment": "CLUSTER=ceph", "EnvironmentFile": "/etc/sysconfig/ceph (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/bin/ceph-mon ; argv[]=/usr/bin/ceph-mon -f --cluster ${CLUSTER} --id %i --setuser ceph --setgroup ceph ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
"FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/ceph-mon@.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "ceph-mon@mon0.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "1048576", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "1885", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "ceph-mon@mon0.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PartOf": "ceph-mon.target", "PermissionsStartOnly": "no", "PrivateDevices": "yes", "PrivateNetwork": "no", "PrivateTmp": "yes", "ProtectHome": "yes", "ProtectSystem": "full", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Requires": "basic.target -.mount", "RequiresMountsFor": "/var/tmp", "Restart": "on-failure", "RestartUSec": "10s", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-ceph\\x5cx2dmon.slice", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": 
"none", "StartLimitBurst": "5", "StartLimitInterval": "1800000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "simple", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "disabled", "Wants": "local-fs.target network-online.target time-sync.target system-ceph\\x5cx2dmon.slice", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [ceph-mon : enable the ceph-mon.target service] *************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/start_monitor.yml:29 ok: [mon0] => { "changed": false, "enabled": true, "failed": false, "name": "ceph-mon.target", "status": { "ActiveEnterTimestamp": "Sun 2018-03-25 23:23:11 UTC", "ActiveEnterTimestampMonotonic": "218736284", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "ceph-mon@mon0.service", "AllowIsolate": "no", "AssertResult": "yes", "AssertTimestamp": "Sun 2018-03-25 23:23:11 UTC", "AssertTimestampMonotonic": "218736282", "Before": "multi-user.target ceph.target", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "ConditionResult": "yes", "ConditionTimestamp": "Sun 2018-03-25 23:23:11 UTC", "ConditionTimestampMonotonic": "218736281", "Conflicts": "shutdown.target", "ConsistsOf": "ceph-mon@mon0.service", "DefaultDependencies": "yes", "Description": "ceph target allowing to start/stop all ceph-mon@.service instances at once", "FragmentPath": "/usr/lib/systemd/system/ceph-mon.target", "Id": "ceph-mon.target", 
"IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Sun 2018-03-25 23:23:11 UTC", "InactiveExitTimestampMonotonic": "218736284", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "LoadState": "loaded", "Names": "ceph-mon.target", "NeedDaemonReload": "no", "OnFailureJobMode": "replace", "PartOf": "ceph.target", "RefuseManualStart": "no", "RefuseManualStop": "no", "StopWhenUnneeded": "no", "SubState": "active", "Transient": "no", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "WantedBy": "multi-user.target ceph.target", "Wants": "ceph-mon@mon0.service" } } TASK [ceph-mon : include ceph_keys.yml] **************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/main.yml:19 included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/ceph_keys.yml for mon0 TASK [ceph-mon : collect admin and bootstrap keys] ***************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/ceph_keys.yml:2 ok: [mon0] => { "changed": false, "cmd": [ "ceph-create-keys", "--cluster", "ceph", "-i", "mon0", "-t", "30" ], "delta": "0:00:02.532670", "end": "2018-03-25 23:24:39.712432", "failed": false, "rc": 0, "start": "2018-03-25 23:24:37.179762" } STDERR: INFO:ceph-create-keys:Talking to monitor... Error ENOENT: failed to find client.admin in keyring INFO:ceph-create-keys:Talking to monitor... INFO:ceph-create-keys:Talking to monitor... INFO:ceph-create-keys:Talking to monitor... INFO:ceph-create-keys:Talking to monitor... 
TASK [ceph-mon : collect admin and bootstrap keys] ***************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/ceph_keys.yml:10 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : wait for ceph.client.admin.keyring exists] ******************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/ceph_keys.yml:21 ok: [mon0] => { "changed": false, "elapsed": 0, "failed": false, "gid": 167, "group": "ceph", "mode": "0600", "owner": "ceph", "path": "/etc/ceph/ceph.client.admin.keyring", "port": null, "search_regex": null, "secontext": "unconfined_u:object_r:etc_t:s0", "size": 63, "state": "file", "uid": 167 } TASK [ceph-mon : wait for ceph.client.admin.keyring exists] ******************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/ceph_keys.yml:31 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : test if initial mon keyring is in mon kv store] *************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/ceph_keys.yml:39 ok: [mon0] => { "changed": false, "cmd": [ "ceph", "--cluster", "ceph", "config-key", "get", "initial_mon_keyring" ], "delta": "0:00:00.282196", "end": "2018-03-25 23:24:45.567381", "failed": false, "failed_when_result": false, "rc": 2, "start": "2018-03-25 23:24:45.285185" } STDERR: Error ENOENT: error obtaining 'initial_mon_keyring': (2) No such file or directory MSG: non-zero return code TASK [ceph-mon : put initial mon keyring in mon kv store] ********************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/ceph_keys.yml:48 ok: [mon0] => { "changed": false, "cmd": [ "ceph", "--cluster", "ceph", "config-key", "put", "initial_mon_keyring", 
"AQCcL7haAAAAABAAAxylitkxMuvnrU9S0foD/w==" ], "delta": "0:00:00.316614", "end": "2018-03-25 23:24:48.395131", "failed": false, "rc": 0, "start": "2018-03-25 23:24:48.078517" } STDERR: set initial_mon_keyring TASK [ceph-mon : create ceph rest api keyring when mon is not containerized] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/ceph_keys.yml:57 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : create ceph mgr keyring(s) when mon is not containerized] ***** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/ceph_keys.yml:67 TASK [ceph-mon : find ceph keys] *********************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/ceph_keys.yml:79 ok: [mon0] => { "changed": false, "cmd": "ls -1 /etc/ceph/*.keyring", "delta": "0:00:00.009796", "end": "2018-03-25 23:24:51.177099", "failed": false, "rc": 0, "start": "2018-03-25 23:24:51.167303" } STDOUT: /etc/ceph/ceph.client.admin.keyring TASK [ceph-mon : set keys permissions] ***************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/ceph_keys.yml:87 ok: [mon0] => (item=/etc/ceph/ceph.client.admin.keyring) => { "changed": false, "failed": false, "gid": 167, "group": "ceph", "item": "/etc/ceph/ceph.client.admin.keyring", "mode": "0600", "owner": "ceph", "path": "/etc/ceph/ceph.client.admin.keyring", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 63, "state": "file", "uid": 167 } TASK [ceph-mon : set_fact bootstrap_rbd_keyring] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/ceph_keys.yml:98 ok: [mon0] => { "ansible_facts": { "bootstrap_rbd_keyring": "/var/lib/ceph/bootstrap-rbd/ceph.keyring" }, "changed": false, "failed": 
false } TASK [ceph-mon : copy keys to the ansible server] ****************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/ceph_keys.yml:104 changed: [mon0] => (item=/etc/ceph/ceph.client.admin.keyring) => { "changed": true, "checksum": "6d24ed5b75493c61626b6c530f15b0f0b42353f7", "dest": "/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch/fad7031c-4bd3-4328-9716-4fbddc9c14b8/etc/ceph/ceph.client.admin.keyring", "failed": false, "item": "/etc/ceph/ceph.client.admin.keyring", "md5sum": "7939e6c27be842a6a5d862bbab67d911", "remote_checksum": "6d24ed5b75493c61626b6c530f15b0f0b42353f7", "remote_md5sum": null } changed: [mon0] => (item=/var/lib/ceph/bootstrap-osd/ceph.keyring) => { "changed": true, "checksum": "35157895bacf561421cf9ab3a616dbe3adeac005", "dest": "/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch/fad7031c-4bd3-4328-9716-4fbddc9c14b8/var/lib/ceph/bootstrap-osd/ceph.keyring", "failed": false, "item": "/var/lib/ceph/bootstrap-osd/ceph.keyring", "md5sum": "02e40668450c887e3c7114a61f8d2951", "remote_checksum": "35157895bacf561421cf9ab3a616dbe3adeac005", "remote_md5sum": null } changed: [mon0] => (item=/var/lib/ceph/bootstrap-rgw/ceph.keyring) => { "changed": true, "checksum": "89e0282badf16d08cac55b13b7df325cc3211a9a", "dest": "/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch/fad7031c-4bd3-4328-9716-4fbddc9c14b8/var/lib/ceph/bootstrap-rgw/ceph.keyring", "failed": false, "item": "/var/lib/ceph/bootstrap-rgw/ceph.keyring", "md5sum": "97d117127c61b7777eb87fd5d362704c", "remote_checksum": "89e0282badf16d08cac55b13b7df325cc3211a9a", 
"remote_md5sum": null } changed: [mon0] => (item=/var/lib/ceph/bootstrap-mds/ceph.keyring) => { "changed": true, "checksum": "6f23b4dc1d2a424740e743a52f29334cc683d9fc", "dest": "/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch/fad7031c-4bd3-4328-9716-4fbddc9c14b8/var/lib/ceph/bootstrap-mds/ceph.keyring", "failed": false, "item": "/var/lib/ceph/bootstrap-mds/ceph.keyring", "md5sum": "07d8b7e415c33593a80f455b16c49f69", "remote_checksum": "6f23b4dc1d2a424740e743a52f29334cc683d9fc", "remote_md5sum": null } changed: [mon0] => (item=/var/lib/ceph/bootstrap-rbd/ceph.keyring) => { "changed": true, "checksum": "7e49f47156eb358cc054aeb04781a42239943738", "dest": "/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch/fad7031c-4bd3-4328-9716-4fbddc9c14b8/var/lib/ceph/bootstrap-rbd/ceph.keyring", "failed": false, "item": "/var/lib/ceph/bootstrap-rbd/ceph.keyring", "md5sum": "2bab908fb65bb3f19c7c07d9b5d1edae", "remote_checksum": "7e49f47156eb358cc054aeb04781a42239943738", "remote_md5sum": null } TASK [ceph-mon : drop in a motd script to report status when logging in] ******* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/ceph_keys.yml:119 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : collect all the pools] **************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/secure_cluster.yml:2 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : secure the cluster] ******************************************* task path: 
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/secure_cluster.yml:7 TASK [ceph-mon : set_fact ceph_config_keys] ************************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/copy_configs.yml:2 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : register rbd bootstrap key] *********************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/copy_configs.yml:12 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : merge rbd bootstrap key to config and keys paths] ************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/copy_configs.yml:18 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : set_fact tmp_ceph_mgr_keys add mgr keys to config and keys paths] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/copy_configs.yml:23 TASK [ceph-mon : set_fact ceph_mgr_keys convert mgr keys to an array] ********** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/copy_configs.yml:31 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : set_fact ceph_config_keys merge mgr keys to config and keys paths] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/copy_configs.yml:37 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : stat for ceph config and keys] ******************************** task path: 
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/copy_configs.yml:43 TASK [ceph-mon : try to copy ceph keys] **************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/copy_configs.yml:54 TASK [ceph-mon : try to copy ceph config] ************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/copy_configs.yml:68 TASK [ceph-mon : set selinux permissions] ************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/copy_configs.yml:83 skipping: [mon0] => (item=/etc/ceph) => { "changed": false, "item": "/etc/ceph", "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item=/var/lib/ceph) => { "changed": false, "item": "/var/lib/ceph", "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : populate kv_store with default ceph.conf] ********************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/start_docker_monitor.yml:2 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : populate kv_store with custom ceph.conf] ********************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/start_docker_monitor.yml:18 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : delete populate-kv-store docker] ****************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/start_docker_monitor.yml:36 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : generate 
systemd unit file] *********************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/start_docker_monitor.yml:43 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : systemd start mon container] ********************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/start_docker_monitor.yml:54 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : configure ceph profile.d aliases] ***************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/configure_ceph_command_aliases.yml:2 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : wait for monitor socket to exist] ***************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/main.yml:12 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : ipv4 - force peer addition as potential bootstrap peer for cluster bringup - monitor_interface] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/main.yml:19 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : ipv4 - force peer addition as potential bootstrap peer for cluster bringup - monitor_address] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/main.yml:29 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : ipv4 - force peer addition as potential bootstrap peer for cluster bringup - 
monitor_address_block] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/main.yml:39 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : ipv6 - force peer addition as potential bootstrap peer for cluster bringup - monitor_interface] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/main.yml:49 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : ipv6 - force peer addition as potential bootstrap peer for cluster bringup - monitor_address] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/main.yml:59 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : ipv6 - force peer addition as potential bootstrap peer for cluster bringup - monitor_address_block] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/main.yml:69 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : push ceph files to the ansible server] ************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/fetch_configs.yml:2 TASK [ceph-mon : create ceph rest api keyring when mon is containerized] ******* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/main.yml:83 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : create ceph mgr keyring(s) when mon is containerized] ********* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/main.yml:96 TASK [ceph-mon : stat for ceph mgr key(s)] 
************************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/main.yml:108 TASK [ceph-mon : fetch ceph mgr key(s)] **************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/docker/main.yml:120 TASK [ceph-mon : configure crush hierarchy] ************************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/crush_rules.yml:2 skipping: [mon0] => (item=osd0) => { "changed": false, "item": "osd0", "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : create configured crush rules] ******************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/crush_rules.yml:14 skipping: [mon0] => (item={u'default': False, u'type': u'host', u'name': u'HDD', u'root': u'HDD'}) => { "changed": false, "item": { "default": false, "name": "HDD", "root": "HDD", "type": "host" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'default': False, u'type': u'host', u'name': u'SSD', u'root': u'SSD'}) => { "changed": false, "item": { "default": false, "name": "SSD", "root": "SSD", "type": "host" }, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : get id for new default crush rule] **************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/crush_rules.yml:21 skipping: [mon0] => (item={u'default': False, u'type': u'host', u'name': u'HDD', u'root': u'HDD'}) => { "changed": false, "item": { "default": false, "name": "HDD", "root": "HDD", "type": "host" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'default': False, u'type': u'host', u'name': u'SSD', u'root': u'SSD'}) => { "changed": false, "item": { 
"default": false, "name": "SSD", "root": "SSD", "type": "host" }, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : set_fact info_ceph_default_crush_rule_yaml] ******************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/crush_rules.yml:33 skipping: [mon0] => (item={'_ansible_item_result': True, 'skipped': True, 'item': {u'default': False, u'root': u'HDD', u'name': u'HDD', u'type': u'host'}, 'skip_reason': u'Conditional result was False', 'changed': False, '_ansible_no_log': False}) => { "changed": false, "item": { "changed": false, "item": { "default": false, "name": "HDD", "root": "HDD", "type": "host" }, "skip_reason": "Conditional result was False", "skipped": true }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={'_ansible_item_result': True, 'skipped': True, 'item': {u'default': False, u'root': u'SSD', u'name': u'SSD', u'type': u'host'}, 'skip_reason': u'Conditional result was False', 'changed': False, '_ansible_no_log': False}) => { "changed": false, "item": { "changed": false, "item": { "default": false, "name": "SSD", "root": "SSD", "type": "host" }, "skip_reason": "Conditional result was False", "skipped": true }, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : set_fact osd_pool_default_crush_rule to osd_pool_default_crush_replicated_ruleset if release < luminous else osd_pool_default_crush_rule] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/crush_rules.yml:41 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : insert new default crush rule into daemon to prevent restart] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/crush_rules.yml:45 skipping: [mon0] => (item=mon0) => { "changed": false, "item": "mon0", "skip_reason": 
"Conditional result was False", "skipped": true } TASK [ceph-mon : add new default crush rule to ceph.conf] ********************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/crush_rules.yml:54 skipping: [mon0] => (item=mon0) => { "changed": false, "item": "mon0", "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : get default value for osd_pool_default_pg_num] **************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/set_osd_pool_default_pg_num.yml:5 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : set_fact osd_pool_default_pg_num with pool_default_pg_num (backward compatibility)] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/set_osd_pool_default_pg_num.yml:16 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : set_fact osd_pool_default_pg_num with default_pool_default_pg_num.stdout] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/set_osd_pool_default_pg_num.yml:21 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : set_fact osd_pool_default_pg_num ceph_conf_overrides.global.osd_pool_default_pg_num] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/set_osd_pool_default_pg_num.yml:27 ok: [mon0] => { "ansible_facts": { "osd_pool_default_pg_num": "8" }, "changed": false, "failed": false } TASK [ceph-mon : create openstack pool(s)] ************************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/openstack_config.yml:2 skipping: [mon0] => (item={u'type': u'replicated', u'rule_name': u'replicated_rule', 
u'pgp_num': u'8', u'erasure_profile': u'', u'size': u'', u'name': u'images', u'pg_num': u'8'}) => { "changed": false, "item": { "erasure_profile": "", "name": "images", "pg_num": "8", "pgp_num": "8", "rule_name": "replicated_rule", "size": "", "type": "replicated" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'type': u'replicated', u'rule_name': u'replicated_rule', u'pgp_num': u'8', u'erasure_profile': u'', u'size': u'', u'name': u'volumes', u'pg_num': u'8'}) => { "changed": false, "item": { "erasure_profile": "", "name": "volumes", "pg_num": "8", "pgp_num": "8", "rule_name": "replicated_rule", "size": "", "type": "replicated" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'type': u'replicated', u'rule_name': u'replicated_rule', u'pgp_num': u'8', u'erasure_profile': u'', u'size': u'', u'name': u'vms', u'pg_num': u'8'}) => { "changed": false, "item": { "erasure_profile": "", "name": "vms", "pg_num": "8", "pgp_num": "8", "rule_name": "replicated_rule", "size": "", "type": "replicated" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'type': u'replicated', u'rule_name': u'replicated_rule', u'pgp_num': u'8', u'erasure_profile': u'', u'size': u'', u'name': u'backups', u'pg_num': u'8'}) => { "changed": false, "item": { "erasure_profile": "", "name": "backups", "pg_num": "8", "pgp_num": "8", "rule_name": "replicated_rule", "size": "", "type": "replicated" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'type': u'replicated', u'rule_name': u'replicated_rule', u'pgp_num': u'8', u'erasure_profile': u'', u'size': u'', u'name': u'metrics', u'pg_num': u'8'}) => { "changed": false, "item": { "erasure_profile": "", "name": "metrics", "pg_num": "8", "pgp_num": "8", "rule_name": "replicated_rule", "size": "", "type": "replicated" }, "skip_reason": "Conditional result was False", "skipped": 
true } TASK [ceph-mon : assign rbd application to pool(s)] **************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/openstack_config.yml:17 skipping: [mon0] => (item={u'type': u'replicated', u'rule_name': u'replicated_rule', u'pgp_num': u'8', u'erasure_profile': u'', u'size': u'', u'name': u'images', u'pg_num': u'8'}) => { "changed": false, "item": { "erasure_profile": "", "name": "images", "pg_num": "8", "pgp_num": "8", "rule_name": "replicated_rule", "size": "", "type": "replicated" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'type': u'replicated', u'rule_name': u'replicated_rule', u'pgp_num': u'8', u'erasure_profile': u'', u'size': u'', u'name': u'volumes', u'pg_num': u'8'}) => { "changed": false, "item": { "erasure_profile": "", "name": "volumes", "pg_num": "8", "pgp_num": "8", "rule_name": "replicated_rule", "size": "", "type": "replicated" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'type': u'replicated', u'rule_name': u'replicated_rule', u'pgp_num': u'8', u'erasure_profile': u'', u'size': u'', u'name': u'vms', u'pg_num': u'8'}) => { "changed": false, "item": { "erasure_profile": "", "name": "vms", "pg_num": "8", "pgp_num": "8", "rule_name": "replicated_rule", "size": "", "type": "replicated" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'type': u'replicated', u'rule_name': u'replicated_rule', u'pgp_num': u'8', u'erasure_profile': u'', u'size': u'', u'name': u'backups', u'pg_num': u'8'}) => { "changed": false, "item": { "erasure_profile": "", "name": "backups", "pg_num": "8", "pgp_num": "8", "rule_name": "replicated_rule", "size": "", "type": "replicated" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'type': u'replicated', u'rule_name': u'replicated_rule', u'pgp_num': u'8', u'erasure_profile': 
u'', u'size': u'', u'name': u'metrics', u'pg_num': u'8'}) => { "changed": false, "item": { "erasure_profile": "", "name": "metrics", "pg_num": "8", "pgp_num": "8", "rule_name": "replicated_rule", "size": "", "type": "replicated" }, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : create openstack key(s)] ************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/openstack_config.yml:26 skipping: [mon0] => (item={u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=images', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.glance'}) => { "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.glance", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=images" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow rwx pool=vms, allow rx pool=images', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.cinder'}) => { "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.cinder", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow rwx pool=vms, allow rx pool=images" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=backups', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.cinder-backup'}) => { "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": 
"0600", "mon_cap": "allow r", "name": "client.cinder-backup", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=backups" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=metrics', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.gnocchi'}) => { "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.gnocchi", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=metrics" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=images, allow rwx pool=vms, allow rwx pool=volumes, allow rwx pool=backups', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.openstack'}) => { "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.openstack", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=images, allow rwx pool=vms, allow rwx pool=volumes, allow rwx pool=backups" }, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : check if openstack key(s) already exist(s)] ******************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/openstack_config.yml:34 skipping: [mon0] => (item={u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=images', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.glance'}) => { "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", 
"name": "client.glance", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=images" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow rwx pool=vms, allow rx pool=images', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.cinder'}) => { "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.cinder", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow rwx pool=vms, allow rx pool=images" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=backups', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.cinder-backup'}) => { "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.cinder-backup", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=backups" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=metrics', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.gnocchi'}) => { "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.gnocchi", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=metrics" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=images, 
allow rwx pool=vms, allow rwx pool=volumes, allow rwx pool=backups', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.openstack'}) => { "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.openstack", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=images, allow rwx pool=vms, allow rwx pool=volumes, allow rwx pool=backups" }, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : add openstack key(s) to ceph] ********************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/openstack_config.yml:41 skipping: [mon0] => (item=[{u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=images', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.glance'}, {'changed': False, 'skipped': True, 'item': {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=images', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.glance'}, 'skip_reason': u'Conditional result was False', '_ansible_item_result': True, '_ansible_no_log': False}]) => { "changed": false, "item": [ { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.glance", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=images" }, { "_ansible_item_result": true, "_ansible_no_log": false, "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.glance", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=images" }, "skip_reason": "Conditional result was False", "skipped": true } ], "skip_reason": "Conditional result 
was False", "skipped": true } skipping: [mon0] => (item=[{u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow rwx pool=vms, allow rx pool=images', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.cinder'}, {'changed': False, 'skipped': True, 'item': {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow rwx pool=vms, allow rx pool=images', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.cinder'}, 'skip_reason': u'Conditional result was False', '_ansible_item_result': True, '_ansible_no_log': False}]) => { "changed": false, "item": [ { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.cinder", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow rwx pool=vms, allow rx pool=images" }, { "_ansible_item_result": true, "_ansible_no_log": false, "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.cinder", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow rwx pool=vms, allow rx pool=images" }, "skip_reason": "Conditional result was False", "skipped": true } ], "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item=[{u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=backups', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.cinder-backup'}, {'changed': False, 'skipped': True, 'item': {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=backups', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.cinder-backup'}, 'skip_reason': u'Conditional 
result was False', '_ansible_item_result': True, '_ansible_no_log': False}]) => { "changed": false, "item": [ { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.cinder-backup", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=backups" }, { "_ansible_item_result": true, "_ansible_no_log": false, "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.cinder-backup", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=backups" }, "skip_reason": "Conditional result was False", "skipped": true } ], "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item=[{u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=metrics', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.gnocchi'}, {'changed': False, 'skipped': True, 'item': {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=metrics', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.gnocchi'}, 'skip_reason': u'Conditional result was False', '_ansible_item_result': True, '_ansible_no_log': False}]) => { "changed": false, "item": [ { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.gnocchi", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=metrics" }, { "_ansible_item_result": true, "_ansible_no_log": false, "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.gnocchi", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=metrics" }, "skip_reason": "Conditional result was False", "skipped": true } ], "skip_reason": "Conditional result was False", 
"skipped": true } skipping: [mon0] => (item=[{u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=images, allow rwx pool=vms, allow rwx pool=volumes, allow rwx pool=backups', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.openstack'}, {'changed': False, 'skipped': True, 'item': {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=images, allow rwx pool=vms, allow rwx pool=volumes, allow rwx pool=backups', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.openstack'}, 'skip_reason': u'Conditional result was False', '_ansible_item_result': True, '_ansible_no_log': False}]) => { "changed": false, "item": [ { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.openstack", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=images, allow rwx pool=vms, allow rwx pool=volumes, allow rwx pool=backups" }, { "_ansible_item_result": true, "_ansible_no_log": false, "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.openstack", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=images, allow rwx pool=vms, allow rwx pool=volumes, allow rwx pool=backups" }, "skip_reason": "Conditional result was False", "skipped": true } ], "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : fetch openstack key(s)] *************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/openstack_config.yml:49 skipping: [mon0] => (item={u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=images', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.glance'}) => 
{ "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.glance", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=images" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow rwx pool=vms, allow rx pool=images', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.cinder'}) => { "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.cinder", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow rwx pool=vms, allow rx pool=images" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=backups', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.cinder-backup'}) => { "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.cinder-backup", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=backups" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=metrics', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.gnocchi'}) => { "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.gnocchi", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=metrics" }, "skip_reason": "Conditional result was False", 
"skipped": true } skipping: [mon0] => (item={u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=images, allow rwx pool=vms, allow rwx pool=volumes, allow rwx pool=backups', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.openstack'}) => { "changed": false, "item": { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.openstack", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=images, allow rwx pool=vms, allow rwx pool=volumes, allow rwx pool=backups" }, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : copy to other mons the openstack key(s)] ********************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/openstack_config.yml:56 skipping: [mon0] => (item=[u'mon0', {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=images', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.glance'}]) => { "changed": false, "item": [ "mon0", { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.glance", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=images" } ], "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item=[u'mon0', {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow rwx pool=vms, allow rx pool=images', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.cinder'}]) => { "changed": false, "item": [ "mon0", { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.cinder", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow 
rwx pool=vms, allow rx pool=images" } ], "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item=[u'mon0', {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=backups', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.cinder-backup'}]) => { "changed": false, "item": [ "mon0", { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.cinder-backup", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=backups" } ], "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item=[u'mon0', {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=metrics', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.gnocchi'}]) => { "changed": false, "item": [ "mon0", { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.gnocchi", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=metrics" } ], "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item=[u'mon0', {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=images, allow rwx pool=vms, allow rwx pool=volumes, allow rwx pool=backups', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.openstack'}]) => { "changed": false, "item": [ "mon0", { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.openstack", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=images, allow rwx pool=vms, allow rwx pool=volumes, allow rwx pool=backups" } ], "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : chmod openstack key(s) on the 
other mons and this mon] ******** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/openstack_config.yml:69 skipping: [mon0] => (item=[u'mon0', {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=images', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.glance'}]) => { "changed": false, "item": [ "mon0", { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.glance", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=images" } ], "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item=[u'mon0', {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow rwx pool=vms, allow rx pool=images', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.cinder'}]) => { "changed": false, "item": [ "mon0", { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.cinder", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow rwx pool=vms, allow rx pool=images" } ], "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item=[u'mon0', {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=backups', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.cinder-backup'}]) => { "changed": false, "item": [ "mon0", { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.cinder-backup", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=backups" } ], "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item=[u'mon0', {u'osd_cap': u'allow 
class-read object_prefix rbd_children, allow rwx pool=metrics', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.gnocchi'}]) => { "changed": false, "item": [ "mon0", { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.gnocchi", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=metrics" } ], "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item=[u'mon0', {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=images, allow rwx pool=vms, allow rwx pool=volumes, allow rwx pool=backups', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.openstack'}]) => { "changed": false, "item": [ "mon0", { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.openstack", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=images, allow rwx pool=vms, allow rwx pool=volumes, allow rwx pool=backups" } ], "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : setfacl for openstack key(s) on the other mons and this mon] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/openstack_config.yml:81 skipping: [mon0] => (item=[u'mon0', {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=images', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.glance'}]) => { "changed": false, "item": [ "mon0", { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.glance", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=images" } ], "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => 
(item=[u'mon0', {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow rwx pool=vms, allow rx pool=images', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.cinder'}]) => { "changed": false, "item": [ "mon0", { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.cinder", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow rwx pool=vms, allow rx pool=images" } ], "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item=[u'mon0', {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=backups', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.cinder-backup'}]) => { "changed": false, "item": [ "mon0", { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.cinder-backup", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=backups" } ], "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item=[u'mon0', {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=metrics', u'mon_cap': u'allow r', u'mode': u'0600', u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.gnocchi'}]) => { "changed": false, "item": [ "mon0", { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.gnocchi", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=metrics" } ], "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item=[u'mon0', {u'osd_cap': u'allow class-read object_prefix rbd_children, allow rwx pool=images, allow rwx pool=vms, allow rwx pool=volumes, allow rwx pool=backups', u'mon_cap': u'allow r', u'mode': u'0600', 
u'acls': [], u'key': u'$(ceph-authtool --gen-print-key)', u'name': u'client.openstack'}]) => { "changed": false, "item": [ "mon0", { "acls": [], "key": "$(ceph-authtool --gen-print-key)", "mode": "0600", "mon_cap": "allow r", "name": "client.openstack", "osd_cap": "allow class-read object_prefix rbd_children, allow rwx pool=images, allow rwx pool=vms, allow rwx pool=volumes, allow rwx pool=backups" } ], "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : create filesystem pools] ************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/create_mds_filesystems.yml:2 skipping: [mon0] => (item={u'name': u'cephfs_data', u'pgs': u''}) => { "changed": false, "item": { "name": "cephfs_data", "pgs": "" }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item={u'name': u'cephfs_metadata', u'pgs': u''}) => { "changed": false, "item": { "name": "cephfs_metadata", "pgs": "" }, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : check if ceph filesystem already exists] ********************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/create_mds_filesystems.yml:8 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : create ceph filesystem] *************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/create_mds_filesystems.yml:14 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : allow multimds] *********************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/create_mds_filesystems.yml:20 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", 
"skipped": true } TASK [ceph-mon : set max_mds] ************************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/create_mds_filesystems.yml:27 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : install calamari server] ************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/calamari.yml:2 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : increase calamari logging level when debug is on] ************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/calamari.yml:9 skipping: [mon0] => (item=cthulhu) => { "changed": false, "item": "cthulhu", "skip_reason": "Conditional result was False", "skipped": true } skipping: [mon0] => (item=calamari_web) => { "changed": false, "item": "calamari_web", "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-mon : initialize the calamari server api] *************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-mon/tasks/calamari.yml:20 skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : set _mon_handler_called before restart] ******* ok: [mon0] => { "ansible_facts": { "_mon_handler_called": true }, "changed": false, "failed": false } RUNNING HANDLER [ceph-defaults : copy mon restart script] ********************** changed: [mon0] => { "changed": true, "checksum": "32d47d3004f6ca6721f2b9c273b2b30f8d101f67", "dest": "/tmp/restart_mon_daemon.sh", "failed": false, "gid": 0, "group": "root", "md5sum": "980a406b982147fbb12953728afa3d4b", "mode": "0750", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 1117, "src": 
"/home/vagrant/.ansible/tmp/ansible-tmp-1522020310.7-9284183994233/source", "state": "file", "uid": 0 } RUNNING HANDLER [ceph-defaults : restart ceph mon daemon(s) - non container] *** skipping: [mon0] => (item=mon0) => { "changed": false, "item": "mon0", "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : restart ceph mon daemon(s) - container] ******* skipping: [mon0] => (item=mon0) => { "changed": false, "item": "mon0", "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : set _mon_handler_called after restart] ******** ok: [mon0] => { "ansible_facts": { "_mon_handler_called": false }, "changed": false, "failed": false } RUNNING HANDLER [ceph-defaults : set _osd_handler_called before restart] ******* ok: [mon0] => { "ansible_facts": { "_osd_handler_called": true }, "changed": false, "failed": false } RUNNING HANDLER [ceph-defaults : copy osd restart script] ********************** skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : restart ceph osds daemon(s) - non container] *** skipping: [mon0] => (item=osd0) => { "changed": false, "item": "osd0", "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : restart ceph osds daemon(s) - container] ****** skipping: [mon0] => (item=osd0) => { "changed": false, "item": "osd0", "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : set _osd_handler_called after restart] ******** ok: [mon0] => { "ansible_facts": { "_osd_handler_called": false }, "changed": false, "failed": false } RUNNING HANDLER [ceph-defaults : set _mds_handler_called before restart] ******* ok: [mon0] => { "ansible_facts": { "_mds_handler_called": true }, "changed": false, "failed": false } RUNNING HANDLER [ceph-defaults : copy mds restart script] ********************** skipping: [mon0] => { 
"changed": false, "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : restart ceph mds daemon(s) - non container] *** skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : restart ceph mds daemon(s) - container] ******* skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : set _mds_handler_called after restart] ******** ok: [mon0] => { "ansible_facts": { "_mds_handler_called": false }, "changed": false, "failed": false } RUNNING HANDLER [ceph-defaults : set _rgw_handler_called before restart] ******* ok: [mon0] => { "ansible_facts": { "_rgw_handler_called": true }, "changed": false, "failed": false } RUNNING HANDLER [ceph-defaults : copy rgw restart script] ********************** skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : restart ceph rgw daemon(s) - non container] *** skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : restart ceph rgw daemon(s) - container] ******* skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : set _rgw_handler_called after restart] ******** ok: [mon0] => { "ansible_facts": { "_rgw_handler_called": false }, "changed": false, "failed": false } RUNNING HANDLER [ceph-defaults : set _rbdmirror_handler_called before restart] *** ok: [mon0] => { "ansible_facts": { "_rbdmirror_handler_called": true }, "changed": false, "failed": false } RUNNING HANDLER [ceph-defaults : copy rbd mirror restart script] *************** skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : restart ceph rbd mirror 
daemon(s) - non container] *** skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : restart ceph rbd mirror daemon(s) - container] *** skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : set _rbdmirror_handler_called after restart] *** ok: [mon0] => { "ansible_facts": { "_rbdmirror_handler_called": false }, "changed": false, "failed": false } RUNNING HANDLER [ceph-defaults : set _mgr_handler_called before restart] ******* ok: [mon0] => { "ansible_facts": { "_mgr_handler_called": true }, "changed": false, "failed": false } RUNNING HANDLER [ceph-defaults : copy mgr restart script] ********************** skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : restart ceph mgr daemon(s) - non container] *** skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : restart ceph mgr daemon(s) - container] ******* skipping: [mon0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } RUNNING HANDLER [ceph-defaults : set _mgr_handler_called after restart] ******** ok: [mon0] => { "ansible_facts": { "_mgr_handler_called": false }, "changed": false, "failed": false } META: ran handlers TASK [set ceph monitor install 'Complete'] ************************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/site.yml.sample:88 ok: [mon0] => { "ansible_stats": { "aggregate": true, "data": { "installer_phase_ceph_mon": { "end": "20180326012517Z", "status": "Complete" } }, "per_host": false }, "changed": false, "failed": false } META: ran handlers PLAY [mgrs] ******************************************************************** skipping: no hosts matched PLAY [agents] 
****************************************************************** skipping: no hosts matched PLAY [osds] ******************************************************************** TASK [set ceph osd install 'In Progress'] ************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/site.yml.sample:150 ok: [osd0] => { "ansible_stats": { "aggregate": true, "data": { "installer_phase_ceph_osd": { "start": "20180326012517Z", "status": "In Progress" } }, "per_host": false }, "changed": false, "failed": false } META: ran handlers TASK [ceph-defaults : check for a mon container] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_running_containers.yml:2 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check for an osd container] ****************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_running_containers.yml:11 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check for a mds container] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_running_containers.yml:20 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check for a rgw container] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_running_containers.yml:29 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check for a mgr container] ******************************* task path: 
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_running_containers.yml:38 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check for a rbd mirror container] ************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_running_containers.yml:47 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check for a nfs container] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_running_containers.yml:56 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check for a ceph mon socket] ***************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:2 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check if the ceph mon socket is in-use] ****************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:11 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : remove ceph mon socket if exists and not used by a process] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:21 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check for a ceph osd socket] ***************************** task path: 
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:30 ok: [osd0] => { "changed": false, "cmd": "stat --printf=%n /var/run/ceph/ceph-osd*.asok", "delta": "0:00:00.009724", "end": "2018-03-25 23:25:21.247498", "failed": false, "failed_when_result": false, "rc": 1, "start": "2018-03-25 23:25:21.237774" } STDERR: stat: cannot stat ‘/var/run/ceph/ceph-osd*.asok’: No such file or directory MSG: non-zero return code TASK [ceph-defaults : check if the ceph osd socket is in-use] ****************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:40 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : remove ceph osd socket if exists and not used by a process] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:50 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check for a ceph mds socket] ***************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:59 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check if the ceph mds socket is in-use] ****************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:69 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : remove ceph mds socket if exists and not used by a process] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:79 skipping: 
[osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check for a ceph rgw socket] ***************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:88 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check if the ceph rgw socket is in-use] ****************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:98 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : remove ceph rgw socket if exists and not used by a process] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:108 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check for a ceph mgr socket] ***************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:117 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check if the ceph mgr socket is in-use] ****************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:127 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : remove ceph mgr socket if exists and not used by a process] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:137 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result 
was False", "skipped": true } TASK [ceph-defaults : check for a ceph rbd mirror socket] ********************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:146 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check if the ceph rbd mirror socket is in-use] *********** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:156 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : remove ceph rbd mirror socket if exists and not used by a process] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:166 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check for a ceph nfs ganesha socket] ********************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:175 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : check if the ceph nfs ganesha socket is in-use] ********** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:184 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : remove ceph nfs ganesha socket if exists and not used by a process] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/check_socket_non_container.yml:194 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK 
[ceph-defaults : check if it is atomic host] ****************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:2 ok: [osd0] => { "changed": false, "failed": false, "stat": { "exists": false } } TASK [ceph-defaults : set_fact is_atomic] ************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:7 ok: [osd0] => { "ansible_facts": { "is_atomic": false }, "changed": false, "failed": false } TASK [ceph-defaults : set_fact monitor_name ansible_hostname] ****************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:11 ok: [osd0] => { "ansible_facts": { "monitor_name": "osd0" }, "changed": false, "failed": false } TASK [ceph-defaults : set_fact monitor_name ansible_fqdn] ********************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:17 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : set_fact docker_exec_cmd] ******************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:23 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : is ceph running already?] 
******************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:33 ok: [osd0 -> mon0] => { "changed": false, "cmd": [ "timeout", "5", "ceph", "--cluster", "ceph", "fsid" ], "delta": "0:00:00.307293", "end": "2018-03-25 23:25:28.177866", "failed": false, "failed_when_result": false, "rc": 0, "start": "2018-03-25 23:25:27.870573" } STDOUT: fad7031c-4bd3-4328-9716-4fbddc9c14b8 TASK [ceph-defaults : check if /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch directory exists] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:45 ok: [osd0 -> localhost] => { "changed": false, "failed": false, "stat": { "atime": 1522020253.0707698, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "7243c4e461ee2272d26aef9943885fb73b5708c6", "ctime": 1522020252.7007682, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 1001, "gr_name": "jenkins-build", "inode": 1408045, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "md5": "de22e57f0d392a335a7c931812f811a3", "mimetype": "text/plain", "mode": "0664", "mtime": 1522020252.7007682, "nlink": 1, "path": "/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch/monitor_keyring.conf", "pw_name": "jenkins-build", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 41, "uid": 1001, "version": "18446744072729388008", "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [ceph-defaults : set_fact ceph_current_fsid 
rc 1] ************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:55 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : create a local fetch directory if it does not exist] ***** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:62 ok: [osd0 -> localhost] => { "changed": false, "failed": false, "gid": 1001, "group": "jenkins-build", "mode": "0775", "owner": "jenkins-build", "path": "/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch", "size": 4096, "state": "directory", "uid": 1001 } TASK [ceph-defaults : set_fact fsid ceph_current_fsid.stdout] ****************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:73 ok: [osd0] => { "ansible_facts": { "fsid": "fad7031c-4bd3-4328-9716-4fbddc9c14b8" }, "changed": false, "failed": false } TASK [ceph-defaults : set_fact ceph_release ceph_stable_release] *************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:80 ok: [osd0] => { "ansible_facts": { "ceph_release": "dummy" }, "changed": false, "failed": false } TASK [ceph-defaults : generate cluster fsid] *********************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:84 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-defaults : reuse cluster fsid when cluster is already running] ****** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:95 ok: [osd0 -> localhost] => { "changed": false, "cmd": "echo 
fad7031c-4bd3-4328-9716-4fbddc9c14b8 | tee /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch/ceph_cluster_uuid.conf", "failed": false, "rc": 0 } STDOUT: skipped, since /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch/ceph_cluster_uuid.conf exists TASK [ceph-defaults : read cluster fsid if it already exists] ****************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:104 ok: [osd0 -> localhost] => { "changed": false, "cmd": [ "cat", "/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch/ceph_cluster_uuid.conf" ], "delta": "0:00:00.006497", "end": "2018-03-26 01:25:29.932182", "failed": false, "rc": 0, "start": "2018-03-26 01:25:29.925685" } STDOUT: fad7031c-4bd3-4328-9716-4fbddc9c14b8 TASK [ceph-defaults : set_fact fsid] ******************************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:116 ok: [osd0] => { "ansible_facts": { "fsid": "fad7031c-4bd3-4328-9716-4fbddc9c14b8" }, "changed": false, "failed": false } TASK [ceph-defaults : set_fact mds_name ansible_hostname] ********************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:122 ok: [osd0] => { "ansible_facts": { "mds_name": "osd0" }, "changed": false, "failed": false } TASK [ceph-defaults : set_fact mds_name ansible_fqdn] ************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:128 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional 
result was False", "skipped": true }

TASK [ceph-defaults : set_fact rbd_client_directory_owner ceph] ****************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:134
ok: [osd0] => { "ansible_facts": { "rbd_client_directory_owner": "ceph" }, "changed": false, "failed": false }

TASK [ceph-defaults : set_fact rbd_client_directory_group rbd_client_directory_group] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:141
ok: [osd0] => { "ansible_facts": { "rbd_client_directory_group": "ceph" }, "changed": false, "failed": false }

TASK [ceph-defaults : set_fact rbd_client_directory_mode 0770] *****************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:148
ok: [osd0] => { "ansible_facts": { "rbd_client_directory_mode": "0770" }, "changed": false, "failed": false }

TASK [ceph-defaults : resolve device link(s)] **********************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:155

TASK [ceph-defaults : set_fact build devices from resolved symlinks] ***********
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:165

TASK [ceph-defaults : set_fact build final devices list] ***********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:174
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : set_fact ceph_uid for Debian based system] ***************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:182
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-defaults : set_fact ceph_uid for Red Hat based system] **************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-defaults/tasks/facts.yml:189
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported system] ********************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:2
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported architecture] **************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:8
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported distribution] **************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:14
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported distribution for red hat ceph storage] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:20
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : determine if node is registered with subscription-manager] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:28
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unregistered red hat rhcs linux] *******************
task path:
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:39
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported distribution for ubuntu cloud archive] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:48
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported openSUSE distribution] *****************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:55
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported ansible version] ***********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:62
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported ansible version] ***********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:68
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail if systemd is not present] ****************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:75
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported distribution for iscsi gateways] *******
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:81
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail on unsupported distribution version for iscsi gateways] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_system.yml:88
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : make sure an installation origin was chosen] ***************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:2
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : make sure a repository was chosen] *************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:10
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : fail if local scenario is enabled on debian] ***************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:19
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : make sure ceph_stable_release is set] **********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:26
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : make sure ceph_stable_release is correct] ******************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:36
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : verify that a repository type was chosen for ceph rhcs version] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:46
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : verify that ceph_rhcs_cdn_debian_repo url is valid for red hat storage] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:56
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : make sure monitor_interface, monitor_address or monitor_address_block is defined] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:68
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : make sure radosgw_interface, radosgw_address or radosgw_address_block is defined] ***
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/checks/check_mandatory_vars.yml:77
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : include checks/check_firewall.yml] *************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:8
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : include misc/configure_firewall_rpm.yml] *******************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:15
included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml for osd0

TASK [ceph-common : check
firewalld installation on redhat or suse] ************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:2
ok: [osd0] => { "changed": false, "cmd": [ "rpm", "-q", "firewalld" ], "delta": "0:00:00.093412", "end": "2018-03-25 23:25:34.757169", "failed": false, "rc": 0, "start": "2018-03-25 23:25:34.663757" }
STDOUT:
firewalld-0.4.4.4-6.el7.noarch

TASK [ceph-common : open monitor ports] ****************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:13
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : open osd ports] ********************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:28
NOTIFIED HANDLER restart firewalld
changed: [osd0] => { "changed": true, "failed": false }
MSG:
Permanent operation, Changed service ceph to enabled, (offline operation: only on-disk configs were altered)

TASK [ceph-common : open rgw ports] ********************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:43
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : open mds ports] ********************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:58
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : open nfs ports] ********************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:73
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : open nfs ports (portmapper)] *******************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:88
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : open restapi ports] ****************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:103
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : open rbdmirror ports] **************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:118
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

TASK [ceph-common : open iscsi ports] ******************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/configure_firewall_rpm.yml:133
skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true }

RUNNING HANDLER [ceph-common : restart firewalld] ******************************
changed: [osd0] => { "changed": true, "enabled": true, "failed": false, "name": "firewalld", "state": "started", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dbus.service polkit.service system.slice basic.target", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before":
"network-pre.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "BusName": "org.fedoraproject.FirewallD1", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "ip6tables.service ipset.service ebtables.service iptables.service shutdown.target", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "firewalld - dynamic firewall daemon", "DevicePolicy": "auto", "Documentation": "man:firewalld(1)", "EnvironmentFile": "/etc/sysconfig/firewalld (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/firewalld.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "firewalld.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "mixed", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", 
"LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "1885", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "1885", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "firewalld.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Requires": "basic.target", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "null", "StandardInput": "null", "StandardOutput": "null", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "disabled", "Wants": "network-pre.target system.slice", "WatchdogTimestampMonotonic": 
"0", "WatchdogUSec": "0" } } META: ran handlers TASK [ceph-common : include installs/install_on_redhat.yml] ******************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:22 included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_on_redhat.yml for osd0 TASK [ceph-common : include configure_redhat_repository_installation.yml] ****** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_on_redhat.yml:2 included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/configure_redhat_repository_installation.yml for osd0 TASK [ceph-common : include redhat_community_repository.yml] ******************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/configure_redhat_repository_installation.yml:2 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : include redhat_rhcs_repository.yml] ************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/configure_redhat_repository_installation.yml:7 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : include redhat_dev_repository.yml] ************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/configure_redhat_repository_installation.yml:12 included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/redhat_dev_repository.yml for osd0 TASK [ceph-common : fetch ceph red hat development repository] ***************** task path: 
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/redhat_dev_repository.yml:2 ok: [osd0] => { "changed": false, "connection": "close", "content": "[ceph]\nname=ceph packages for \\$basearch\nbaseurl=https://3.chacra.ceph.com/r/ceph/master/235f2119010484c12c5bd29421aeef7d44df38a1/centos/7/flavors/default/\\$basearch\nenabled=1\ngpgcheck=0\ntype=rpm-md\n\n[ceph-noarch]\nname=ceph noarch packages\nbaseurl=https://3.chacra.ceph.com/r/ceph/master/235f2119010484c12c5bd29421aeef7d44df38a1/centos/7/flavors/default/noarch\nenabled=1\ngpgcheck=0\ntype=rpm-md\n\n[ceph-source]\nname=ceph source packages\nbaseurl=https://3.chacra.ceph.com/r/ceph/master/235f2119010484c12c5bd29421aeef7d44df38a1/centos/7/flavors/default/SRPMS\nenabled=1\ngpgcheck=0\ntype=rpm-md\n", "content_length": "588", "content_type": "text/plain; charset=UTF-8", "cookies": {}, "date": "Sun, 25 Mar 2018 23:25:47 GMT", "failed": false, "redirected": true, "server": "nginx", "status": 200, "strict_transport_security": "max-age=31536000", "url": "https://3.chacra.ceph.com/repos/ceph/master/235f2119010484c12c5bd29421aeef7d44df38a1/centos/7/flavors/default/repo" } MSG: OK (588 bytes) TASK [ceph-common : configure ceph red hat development repository] ************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/redhat_dev_repository.yml:8 changed: [osd0] => { "changed": true, "checksum": "2ac7da5709e8b9f29f66731b0a6a0fd2382914d3", "dest": "/etc/yum.repos.d/ceph-dev.repo", "failed": false, "gid": 0, "group": "root", "md5sum": "4ef618a664a182aad7f6489c09e36c5c", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:system_conf_t:s0", "size": 588, "src": "/home/vagrant/.ansible/tmp/ansible-tmp-1522020348.12-158893308976259/source", "state": "file", "uid": 0 } TASK [ceph-common : include redhat_custom_repository.yml] ********************** task path: 
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/configure_redhat_repository_installation.yml:17 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : purge yum cache] ******************************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/configure_redhat_repository_installation.yml:23 ok: [osd0] => { "changed": false, "cmd": [ "yum", "clean", "all" ], "delta": "0:00:00.318940", "end": "2018-03-25 23:25:56.740485", "failed": false, "rc": 0, "start": "2018-03-25 23:25:56.421545" } STDOUT: Loaded plugins: fastestmirror Cleaning repos: base ceph ceph-noarch ceph-source extras updates Cleaning up everything Maybe you want: rm -rf /var/cache/yum, to also free up space taken by orphaned data from disabled or removed repos TASK [ceph-common : include configure_redhat_local_installation.yml] *********** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_on_redhat.yml:7 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : include install_redhat_packages.yml] *********************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_on_redhat.yml:12 included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml for osd0 TASK [ceph-common : install redhat dependencies] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:2 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : install centos dependencies] 
******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:9 changed: [osd0] => { "changed": true, "failed": false, "rc": 0, "results": [ "python-pycurl-7.19.0-19.el7.x86_64 providing python-pycurl is already installed", "libselinux-python-2.5-11.el7.x86_64 providing libselinux-python is already installed", "Loaded plugins: fastestmirror\nLoading mirror speeds from cached hostfile\n * base: centos.mirrors.ovh.net\n * extras: mirrors.standaloneinstaller.com\n * updates: centos.mirrors.ovh.net\nResolving Dependencies\n--> Running transaction check\n---> Package epel-release.noarch 0:7-9 will be installed\n---> Package hdparm.x86_64 0:9.43-5.el7 will be installed\n---> Package python-setuptools.noarch 0:0.9.8-7.el7 will be installed\n--> Processing Dependency: python-backports-ssl_match_hostname for package: python-setuptools-0.9.8-7.el7.noarch\n--> Running transaction check\n---> Package python-backports-ssl_match_hostname.noarch 0:3.4.0.2-4.el7 will be installed\n--> Processing Dependency: python-backports for package: python-backports-ssl_match_hostname-3.4.0.2-4.el7.noarch\n--> Running transaction check\n---> Package python-backports.x86_64 0:1.0-8.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository\n Size\n================================================================================\nInstalling:\n epel-release noarch 7-9 extras 14 k\n hdparm x86_64 9.43-5.el7 base 83 k\n python-setuptools noarch 0.9.8-7.el7 base 397 k\nInstalling for dependencies:\n python-backports x86_64 1.0-8.el7 base 5.8 k\n python-backports-ssl_match_hostname noarch 3.4.0.2-4.el7 base 12 k\n\nTransaction Summary\n================================================================================\nInstall 3 Packages (+2 Dependent 
packages)\n\nTotal download size: 512 k\nInstalled size: 2.1 M\nDownloading packages:\nPublic key for epel-release-7-9.noarch.rpm is not installed\nPublic key for python-backports-1.0-8.el7.x86_64.rpm is not installed\n--------------------------------------------------------------------------------\nTotal 1.6 MB/s | 512 kB 00:00 \nRetrieving key from file:///etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-7\nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : python-backports-1.0-8.el7.x86_64 1/5 \n Installing : python-backports-ssl_match_hostname-3.4.0.2-4.el7.noarch 2/5 \n Installing : python-setuptools-0.9.8-7.el7.noarch 3/5 \n Installing : epel-release-7-9.noarch 4/5 \n Installing : hdparm-9.43-5.el7.x86_64 5/5 \n Verifying : python-backports-1.0-8.el7.x86_64 1/5 \n Verifying : hdparm-9.43-5.el7.x86_64 2/5 \n Verifying : python-setuptools-0.9.8-7.el7.noarch 3/5 \n Verifying : epel-release-7-9.noarch 4/5 \n Verifying : python-backports-ssl_match_hostname-3.4.0.2-4.el7.noarch 5/5 \n\nInstalled:\n epel-release.noarch 0:7-9 hdparm.x86_64 0:9.43-5.el7 \n python-setuptools.noarch 0:0.9.8-7.el7 \n\nDependency Installed:\n python-backports.x86_64 0:1.0-8.el7 \n python-backports-ssl_match_hostname.noarch 0:3.4.0.2-4.el7 \n\nComplete!\n" ] }
MSG:
warning: /var/cache/yum/x86_64/7/extras/packages/epel-release-7-9.noarch.rpm: Header V3 RSA/SHA256 Signature, key ID f4a80eb5: NOKEY
Importing GPG key 0xF4A80EB5:
 Userid : "CentOS-7 Key (CentOS 7 Official Signing Key) "
 Fingerprint: 6341 ab27 53d7 8a78 a7c2 7bb1 24c6 a8a7 f4a8 0eb5
 Package : centos-release-7-4.1708.el7.centos.x86_64 (@anaconda)
 From : /etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-7

TASK [ceph-common : install redhat ceph-common] ********************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:16
changed: [osd0] => { "changed": true, "failed": false, "rc": 0, "results": [
"Loaded plugins: fastestmirror\nLoading mirror speeds from cached hostfile\n * base: centos.mirrors.ovh.net\n * epel: mirror.switch.ch\n * extras: mirrors.standaloneinstaller.com\n * updates: centos.mirrors.ovh.net\nResolving Dependencies\n--> Running transaction check\n---> Package ceph-common.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n--> Processing Dependency: librados2 = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: python-rados = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: python-rgw = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libcephfs2 = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: python-rbd = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: python-cephfs = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: librbd1 = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: python-requests for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: python-prettytable for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libibverbs.so.1()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libtcmalloc.so.4()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libleveldb.so.1()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: librbd.so.1()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libsnappy.so.1()(64bit) for package: 
2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: librados.so.2()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libradosstriper.so.1()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libfuse.so.2()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libbabeltrace.so.1()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libbabeltrace-ctf.so.1()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libcephfs.so.2()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: libceph-common.so.0()(64bit) for package: 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64\n--> Running transaction check\n---> Package fuse-libs.x86_64 0:2.9.2-8.el7 will be installed\n---> Package gperftools-libs.x86_64 0:2.4-8.el7 will be installed\n--> Processing Dependency: libunwind.so.8()(64bit) for package: gperftools-libs-2.4-8.el7.x86_64\n---> Package leveldb.x86_64 0:1.12.0-11.el7 will be installed\n---> Package libbabeltrace.x86_64 0:1.2.4-3.el7 will be installed\n---> Package libcephfs2.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n---> Package libibverbs.x86_64 0:13-7.el7 will be installed\n--> Processing Dependency: rdma-core(x86-64) = 13-7.el7 for package: libibverbs-13-7.el7.x86_64\n--> Processing Dependency: perl(warnings) for package: libibverbs-13-7.el7.x86_64\n--> Processing Dependency: perl(strict) for package: libibverbs-13-7.el7.x86_64\n--> Processing Dependency: perl(Getopt::Long) for package: libibverbs-13-7.el7.x86_64\n--> Processing Dependency: perl(File::Basename) for package: libibverbs-13-7.el7.x86_64\n--> Processing Dependency: /usr/bin/perl for package: libibverbs-13-7.el7.x86_64\n---> Package librados2.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n--> Processing Dependency: 
liblttng-ust.so.0()(64bit) for package: 2:librados2-13.0.1-3240.g235f211.el7.x86_64\n---> Package libradosstriper1.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n---> Package librbd1.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n---> Package python-cephfs.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n---> Package python-prettytable.noarch 0:0.7.2-3.el7 will be installed\n---> Package python-rados.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n---> Package python-rbd.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n---> Package python-requests.noarch 0:2.6.0-1.el7_1 will be installed\n--> Processing Dependency: python-urllib3 >= 1.10.2-1 for package: python-requests-2.6.0-1.el7_1.noarch\n---> Package python-rgw.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n--> Processing Dependency: librgw2 = 2:13.0.1-3240.g235f211.el7 for package: 2:python-rgw-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: librgw.so.2()(64bit) for package: 2:python-rgw-13.0.1-3240.g235f211.el7.x86_64\n---> Package snappy.x86_64 0:1.1.0-3.el7 will be installed\n--> Running transaction check\n---> Package librgw2.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n---> Package libunwind.x86_64 2:1.2-2.el7 will be installed\n---> Package lttng-ust.x86_64 0:2.4.1-4.el7 will be installed\n--> Processing Dependency: liburcu-bp.so.1()(64bit) for package: lttng-ust-2.4.1-4.el7.x86_64\n--> Processing Dependency: liburcu-cds.so.1()(64bit) for package: lttng-ust-2.4.1-4.el7.x86_64\n---> Package perl.x86_64 4:5.16.3-292.el7 will be installed\n--> Processing Dependency: perl-libs = 4:5.16.3-292.el7 for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Socket) >= 1.3 for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Scalar::Util) >= 1.10 for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl-macros for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl-libs for package: 
4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(threads::shared) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(threads) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(constant) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Time::Local) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Time::HiRes) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Storable) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Socket) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Scalar::Util) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Pod::Simple::XHTML) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Pod::Simple::Search) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Filter::Util::Call) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(File::Temp) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(File::Spec::Unix) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(File::Spec::Functions) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(File::Spec) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(File::Path) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Exporter) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Cwd) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: perl(Carp) for package: 4:perl-5.16.3-292.el7.x86_64\n--> Processing Dependency: libperl.so()(64bit) for package: 4:perl-5.16.3-292.el7.x86_64\n---> Package perl-Getopt-Long.noarch 0:2.40-2.el7 will be installed\n--> Processing Dependency: perl(Pod::Usage) >= 1.14 for package: perl-Getopt-Long-2.40-2.el7.noarch\n--> Processing Dependency: 
perl(Text::ParseWords) for package: perl-Getopt-Long-2.40-2.el7.noarch\n---> Package python-urllib3.noarch 0:1.10.2-3.el7 will be installed\n--> Processing Dependency: python-six for package: python-urllib3-1.10.2-3.el7.noarch\n---> Package rdma-core.x86_64 0:13-7.el7 will be installed\n--> Processing Dependency: pciutils for package: rdma-core-13-7.el7.x86_64\n--> Running transaction check\n---> Package pciutils.x86_64 0:3.5.1-2.el7 will be installed\n---> Package perl-Carp.noarch 0:1.26-244.el7 will be installed\n---> Package perl-Exporter.noarch 0:5.68-3.el7 will be installed\n---> Package perl-File-Path.noarch 0:2.09-2.el7 will be installed\n---> Package perl-File-Temp.noarch 0:0.23.01-3.el7 will be installed\n---> Package perl-Filter.x86_64 0:1.49-3.el7 will be installed\n---> Package perl-PathTools.x86_64 0:3.40-5.el7 will be installed\n---> Package perl-Pod-Simple.noarch 1:3.28-4.el7 will be installed\n--> Processing Dependency: perl(Pod::Escapes) >= 1.04 for package: 1:perl-Pod-Simple-3.28-4.el7.noarch\n--> Processing Dependency: perl(Encode) for package: 1:perl-Pod-Simple-3.28-4.el7.noarch\n---> Package perl-Pod-Usage.noarch 0:1.63-3.el7 will be installed\n--> Processing Dependency: perl(Pod::Text) >= 3.15 for package: perl-Pod-Usage-1.63-3.el7.noarch\n--> Processing Dependency: perl-Pod-Perldoc for package: perl-Pod-Usage-1.63-3.el7.noarch\n---> Package perl-Scalar-List-Utils.x86_64 0:1.27-248.el7 will be installed\n---> Package perl-Socket.x86_64 0:2.010-4.el7 will be installed\n---> Package perl-Storable.x86_64 0:2.45-3.el7 will be installed\n---> Package perl-Text-ParseWords.noarch 0:3.29-4.el7 will be installed\n---> Package perl-Time-HiRes.x86_64 4:1.9725-3.el7 will be installed\n---> Package perl-Time-Local.noarch 0:1.2300-2.el7 will be installed\n---> Package perl-constant.noarch 0:1.27-2.el7 will be installed\n---> Package perl-libs.x86_64 4:5.16.3-292.el7 will be installed\n---> Package perl-macros.x86_64 4:5.16.3-292.el7 will be installed\n---> 
Package perl-threads.x86_64 0:1.87-4.el7 will be installed\n---> Package perl-threads-shared.x86_64 0:1.43-6.el7 will be installed\n---> Package python-six.noarch 0:1.9.0-2.el7 will be installed\n---> Package userspace-rcu.x86_64 0:0.7.16-1.el7 will be installed\n--> Running transaction check\n---> Package perl-Encode.x86_64 0:2.51-7.el7 will be installed\n---> Package perl-Pod-Escapes.noarch 1:1.04-292.el7 will be installed\n---> Package perl-Pod-Perldoc.noarch 0:3.20-4.el7 will be installed\n--> Processing Dependency: perl(parent) for package: perl-Pod-Perldoc-3.20-4.el7.noarch\n--> Processing Dependency: perl(HTTP::Tiny) for package: perl-Pod-Perldoc-3.20-4.el7.noarch\n---> Package perl-podlators.noarch 0:2.5.1-3.el7 will be installed\n--> Running transaction check\n---> Package perl-HTTP-Tiny.noarch 0:0.033-3.el7 will be installed\n---> Package perl-parent.noarch 1:0.225-244.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository\n Size\n================================================================================\nInstalling:\n ceph-common x86_64 2:13.0.1-3240.g235f211.el7 ceph 13 M\nInstalling for dependencies:\n fuse-libs x86_64 2.9.2-8.el7 base 93 k\n gperftools-libs x86_64 2.4-8.el7 base 272 k\n leveldb x86_64 1.12.0-11.el7 epel 161 k\n libbabeltrace x86_64 1.2.4-3.el7 epel 147 k\n libcephfs2 x86_64 2:13.0.1-3240.g235f211.el7 ceph 423 k\n libibverbs x86_64 13-7.el7 base 194 k\n librados2 x86_64 2:13.0.1-3240.g235f211.el7 ceph 2.7 M\n libradosstriper1 x86_64 2:13.0.1-3240.g235f211.el7 ceph 332 k\n librbd1 x86_64 2:13.0.1-3240.g235f211.el7 ceph 1.2 M\n librgw2 x86_64 2:13.0.1-3240.g235f211.el7 ceph 1.9 M\n libunwind x86_64 2:1.2-2.el7 base 57 k\n lttng-ust x86_64 2.4.1-4.el7 epel 176 k\n pciutils x86_64 3.5.1-2.el7 base 93 k\n perl x86_64 4:5.16.3-292.el7 base 8.0 M\n perl-Carp noarch 1.26-244.el7 base 19 k\n 
perl-Encode x86_64 2.51-7.el7 base 1.5 M\n perl-Exporter noarch 5.68-3.el7 base 28 k\n perl-File-Path noarch 2.09-2.el7 base 26 k\n perl-File-Temp noarch 0.23.01-3.el7 base 56 k\n perl-Filter x86_64 1.49-3.el7 base 76 k\n perl-Getopt-Long noarch 2.40-2.el7 base 56 k\n perl-HTTP-Tiny noarch 0.033-3.el7 base 38 k\n perl-PathTools x86_64 3.40-5.el7 base 82 k\n perl-Pod-Escapes noarch 1:1.04-292.el7 base 51 k\n perl-Pod-Perldoc noarch 3.20-4.el7 base 87 k\n perl-Pod-Simple noarch 1:3.28-4.el7 base 216 k\n perl-Pod-Usage noarch 1.63-3.el7 base 27 k\n perl-Scalar-List-Utils x86_64 1.27-248.el7 base 36 k\n perl-Socket x86_64 2.010-4.el7 base 49 k\n perl-Storable x86_64 2.45-3.el7 base 77 k\n perl-Text-ParseWords noarch 3.29-4.el7 base 14 k\n perl-Time-HiRes x86_64 4:1.9725-3.el7 base 45 k\n perl-Time-Local noarch 1.2300-2.el7 base 24 k\n perl-constant noarch 1.27-2.el7 base 19 k\n perl-libs x86_64 4:5.16.3-292.el7 base 688 k\n perl-macros x86_64 4:5.16.3-292.el7 base 43 k\n perl-parent noarch 1:0.225-244.el7 base 12 k\n perl-podlators noarch 2.5.1-3.el7 base 112 k\n perl-threads x86_64 1.87-4.el7 base 49 k\n perl-threads-shared x86_64 1.43-6.el7 base 39 k\n python-cephfs x86_64 2:13.0.1-3240.g235f211.el7 ceph 82 k\n python-prettytable noarch 0.7.2-3.el7 base 37 k\n python-rados x86_64 2:13.0.1-3240.g235f211.el7 ceph 183 k\n python-rbd x86_64 2:13.0.1-3240.g235f211.el7 ceph 126 k\n python-requests noarch 2.6.0-1.el7_1 base 94 k\n python-rgw x86_64 2:13.0.1-3240.g235f211.el7 ceph 74 k\n python-six noarch 1.9.0-2.el7 base 29 k\n python-urllib3 noarch 1.10.2-3.el7 base 101 k\n rdma-core x86_64 13-7.el7 base 43 k\n snappy x86_64 1.1.0-3.el7 base 40 k\n userspace-rcu x86_64 0.7.16-1.el7 epel 73 k\n\nTransaction Summary\n================================================================================\nInstall 1 Package (+51 Dependent packages)\n\nTotal download size: 33 M\nInstalled size: 108 M\nDownloading packages:\nPublic key for leveldb-1.12.0-11.el7.x86_64.rpm is not 
installed\n--------------------------------------------------------------------------------\nTotal 8.3 MB/s | 33 MB 00:04 \nRetrieving key from file:///etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-7\nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : snappy-1.1.0-3.el7.x86_64 1/52 \n Installing : leveldb-1.12.0-11.el7.x86_64 2/52 \n Installing : 1:perl-parent-0.225-244.el7.noarch 3/52 \n Installing : perl-HTTP-Tiny-0.033-3.el7.noarch 4/52 \n Installing : perl-podlators-2.5.1-3.el7.noarch 5/52 \n Installing : perl-Pod-Perldoc-3.20-4.el7.noarch 6/52 \n Installing : perl-Text-ParseWords-3.29-4.el7.noarch 7/52 \n Installing : 1:perl-Pod-Escapes-1.04-292.el7.noarch 8/52 \n Installing : perl-Encode-2.51-7.el7.x86_64 9/52 \n Installing : perl-Pod-Usage-1.63-3.el7.noarch 10/52 \n Installing : 4:perl-macros-5.16.3-292.el7.x86_64 11/52 \n Installing : 4:perl-libs-5.16.3-292.el7.x86_64 12/52 \n Installing : perl-Storable-2.45-3.el7.x86_64 13/52 \n Installing : 4:perl-Time-HiRes-1.9725-3.el7.x86_64 14/52 \n Installing : perl-constant-1.27-2.el7.noarch 15/52 \n Installing : perl-Time-Local-1.2300-2.el7.noarch 16/52 \n Installing : perl-Socket-2.010-4.el7.x86_64 17/52 \n Installing : perl-Carp-1.26-244.el7.noarch 18/52 \n Installing : perl-PathTools-3.40-5.el7.x86_64 19/52 \n Installing : perl-Scalar-List-Utils-1.27-248.el7.x86_64 20/52 \n Installing : perl-Exporter-5.68-3.el7.noarch 21/52 \n Installing : perl-File-Temp-0.23.01-3.el7.noarch 22/52 \n Installing : perl-File-Path-2.09-2.el7.noarch 23/52 \n Installing : perl-threads-shared-1.43-6.el7.x86_64 24/52 \n Installing : perl-threads-1.87-4.el7.x86_64 25/52 \n Installing : perl-Filter-1.49-3.el7.x86_64 26/52 \n Installing : 1:perl-Pod-Simple-3.28-4.el7.noarch 27/52 \n Installing : perl-Getopt-Long-2.40-2.el7.noarch 28/52 \n Installing : 4:perl-5.16.3-292.el7.x86_64 29/52 \n Installing : userspace-rcu-0.7.16-1.el7.x86_64 30/52 \n Installing : lttng-ust-2.4.1-4.el7.x86_64 
31/52 \n Installing : python-prettytable-0.7.2-3.el7.noarch 32/52 \n Installing : 2:libunwind-1.2-2.el7.x86_64 33/52 \n Installing : gperftools-libs-2.4-8.el7.x86_64 34/52 \n Installing : fuse-libs-2.9.2-8.el7.x86_64 35/52 \n Installing : pciutils-3.5.1-2.el7.x86_64 36/52 \n Installing : rdma-core-13-7.el7.x86_64 37/52 \n Installing : libibverbs-13-7.el7.x86_64 38/52 \n Installing : 2:librados2-13.0.1-3240.g235f211.el7.x86_64 39/52 \n Installing : 2:python-rados-13.0.1-3240.g235f211.el7.x86_64 40/52 \n Installing : 2:libcephfs2-13.0.1-3240.g235f211.el7.x86_64 41/52 \n Installing : 2:librbd1-13.0.1-3240.g235f211.el7.x86_64 42/52 \n Installing : 2:python-rbd-13.0.1-3240.g235f211.el7.x86_64 43/52 \n Installing : 2:python-cephfs-13.0.1-3240.g235f211.el7.x86_64 44/52 \n Installing : 2:libradosstriper1-13.0.1-3240.g235f211.el7.x86_64 45/52 \n Installing : 2:librgw2-13.0.1-3240.g235f211.el7.x86_64 46/52 \n Installing : 2:python-rgw-13.0.1-3240.g235f211.el7.x86_64 47/52 \n Installing : python-six-1.9.0-2.el7.noarch 48/52 \n Installing : python-urllib3-1.10.2-3.el7.noarch 49/52 \n Installing : python-requests-2.6.0-1.el7_1.noarch 50/52 \n Installing : libbabeltrace-1.2.4-3.el7.x86_64 51/52 \n Installing : 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64 52/52 \n Verifying : perl-HTTP-Tiny-0.033-3.el7.noarch 1/52 \n Verifying : 2:python-rbd-13.0.1-3240.g235f211.el7.x86_64 2/52 \n Verifying : leveldb-1.12.0-11.el7.x86_64 3/52 \n Verifying : perl-Storable-2.45-3.el7.x86_64 4/52 \n Verifying : rdma-core-13-7.el7.x86_64 5/52 \n Verifying : 4:perl-Time-HiRes-1.9725-3.el7.x86_64 6/52 \n Verifying : 2:libradosstriper1-13.0.1-3240.g235f211.el7.x86_64 7/52 \n Verifying : 2:python-cephfs-13.0.1-3240.g235f211.el7.x86_64 8/52 \n Verifying : perl-constant-1.27-2.el7.noarch 9/52 \n Verifying : perl-PathTools-3.40-5.el7.x86_64 10/52 \n Verifying : 4:perl-macros-5.16.3-292.el7.x86_64 11/52 \n Verifying : python-urllib3-1.10.2-3.el7.noarch 12/52 \n Verifying : 
2:librados2-13.0.1-3240.g235f211.el7.x86_64 13/52 \n Verifying : libbabeltrace-1.2.4-3.el7.x86_64 14/52 \n Verifying : 4:perl-5.16.3-292.el7.x86_64 15/52 \n Verifying : perl-File-Temp-0.23.01-3.el7.noarch 16/52 \n Verifying : 1:perl-Pod-Simple-3.28-4.el7.noarch 17/52 \n Verifying : perl-Time-Local-1.2300-2.el7.noarch 18/52 \n Verifying : perl-podlators-2.5.1-3.el7.noarch 19/52 \n Verifying : 4:perl-libs-5.16.3-292.el7.x86_64 20/52 \n Verifying : perl-Pod-Perldoc-3.20-4.el7.noarch 21/52 \n Verifying : python-six-1.9.0-2.el7.noarch 22/52 \n Verifying : 2:librgw2-13.0.1-3240.g235f211.el7.x86_64 23/52 \n Verifying : perl-Socket-2.010-4.el7.x86_64 24/52 \n Verifying : pciutils-3.5.1-2.el7.x86_64 25/52 \n Verifying : perl-Carp-1.26-244.el7.noarch 26/52 \n Verifying : 2:libcephfs2-13.0.1-3240.g235f211.el7.x86_64 27/52 \n Verifying : gperftools-libs-2.4-8.el7.x86_64 28/52 \n Verifying : perl-threads-shared-1.43-6.el7.x86_64 29/52 \n Verifying : 2:python-rados-13.0.1-3240.g235f211.el7.x86_64 30/52 \n Verifying : 2:ceph-common-13.0.1-3240.g235f211.el7.x86_64 31/52 \n Verifying : lttng-ust-2.4.1-4.el7.x86_64 32/52 \n Verifying : perl-Scalar-List-Utils-1.27-248.el7.x86_64 33/52 \n Verifying : fuse-libs-2.9.2-8.el7.x86_64 34/52 \n Verifying : 2:libunwind-1.2-2.el7.x86_64 35/52 \n Verifying : perl-Pod-Usage-1.63-3.el7.noarch 36/52 \n Verifying : snappy-1.1.0-3.el7.x86_64 37/52 \n Verifying : perl-Encode-2.51-7.el7.x86_64 38/52 \n Verifying : perl-Exporter-5.68-3.el7.noarch 39/52 \n Verifying : python-prettytable-0.7.2-3.el7.noarch 40/52 \n Verifying : 2:python-rgw-13.0.1-3240.g235f211.el7.x86_64 41/52 \n Verifying : libibverbs-13-7.el7.x86_64 42/52 \n Verifying : perl-Getopt-Long-2.40-2.el7.noarch 43/52 \n Verifying : perl-File-Path-2.09-2.el7.noarch 44/52 \n Verifying : python-requests-2.6.0-1.el7_1.noarch 45/52 \n Verifying : perl-threads-1.87-4.el7.x86_64 46/52 \n Verifying : userspace-rcu-0.7.16-1.el7.x86_64 47/52 \n Verifying : 2:librbd1-13.0.1-3240.g235f211.el7.x86_64 
48/52 \n Verifying : perl-Filter-1.49-3.el7.x86_64 49/52 \n Verifying : perl-Text-ParseWords-3.29-4.el7.noarch 50/52 \n Verifying : 1:perl-parent-0.225-244.el7.noarch 51/52 \n Verifying : 1:perl-Pod-Escapes-1.04-292.el7.noarch 52/52 \n\nInstalled:\n ceph-common.x86_64 2:13.0.1-3240.g235f211.el7 \n\nDependency Installed:\n fuse-libs.x86_64 0:2.9.2-8.el7 \n gperftools-libs.x86_64 0:2.4-8.el7 \n leveldb.x86_64 0:1.12.0-11.el7 \n libbabeltrace.x86_64 0:1.2.4-3.el7 \n libcephfs2.x86_64 2:13.0.1-3240.g235f211.el7 \n libibverbs.x86_64 0:13-7.el7 \n librados2.x86_64 2:13.0.1-3240.g235f211.el7 \n libradosstriper1.x86_64 2:13.0.1-3240.g235f211.el7 \n librbd1.x86_64 2:13.0.1-3240.g235f211.el7 \n librgw2.x86_64 2:13.0.1-3240.g235f211.el7 \n libunwind.x86_64 2:1.2-2.el7 \n lttng-ust.x86_64 0:2.4.1-4.el7 \n pciutils.x86_64 0:3.5.1-2.el7 \n perl.x86_64 4:5.16.3-292.el7 \n perl-Carp.noarch 0:1.26-244.el7 \n perl-Encode.x86_64 0:2.51-7.el7 \n perl-Exporter.noarch 0:5.68-3.el7 \n perl-File-Path.noarch 0:2.09-2.el7 \n perl-File-Temp.noarch 0:0.23.01-3.el7 \n perl-Filter.x86_64 0:1.49-3.el7 \n perl-Getopt-Long.noarch 0:2.40-2.el7 \n perl-HTTP-Tiny.noarch 0:0.033-3.el7 \n perl-PathTools.x86_64 0:3.40-5.el7 \n perl-Pod-Escapes.noarch 1:1.04-292.el7 \n perl-Pod-Perldoc.noarch 0:3.20-4.el7 \n perl-Pod-Simple.noarch 1:3.28-4.el7 \n perl-Pod-Usage.noarch 0:1.63-3.el7 \n perl-Scalar-List-Utils.x86_64 0:1.27-248.el7 \n perl-Socket.x86_64 0:2.010-4.el7 \n perl-Storable.x86_64 0:2.45-3.el7 \n perl-Text-ParseWords.noarch 0:3.29-4.el7 \n perl-Time-HiRes.x86_64 4:1.9725-3.el7 \n perl-Time-Local.noarch 0:1.2300-2.el7 \n perl-constant.noarch 0:1.27-2.el7 \n perl-libs.x86_64 4:5.16.3-292.el7 \n perl-macros.x86_64 4:5.16.3-292.el7 \n perl-parent.noarch 1:0.225-244.el7 \n perl-podlators.noarch 0:2.5.1-3.el7 \n perl-threads.x86_64 0:1.87-4.el7 \n perl-threads-shared.x86_64 0:1.43-6.el7 \n python-cephfs.x86_64 2:13.0.1-3240.g235f211.el7 \n python-prettytable.noarch 0:0.7.2-3.el7 \n python-rados.x86_64 
2:13.0.1-3240.g235f211.el7 \n python-rbd.x86_64 2:13.0.1-3240.g235f211.el7 \n python-requests.noarch 0:2.6.0-1.el7_1 \n python-rgw.x86_64 2:13.0.1-3240.g235f211.el7 \n python-six.noarch 0:1.9.0-2.el7 \n python-urllib3.noarch 0:1.10.2-3.el7 \n rdma-core.x86_64 0:13-7.el7 \n snappy.x86_64 0:1.1.0-3.el7 \n userspace-rcu.x86_64 0:0.7.16-1.el7 \n\nComplete!\n" ] } MSG: warning: /var/cache/yum/x86_64/7/epel/packages/leveldb-1.12.0-11.el7.x86_64.rpm: Header V3 RSA/SHA256 Signature, key ID 352c64e5: NOKEY Importing GPG key 0x352C64E5: Userid : "Fedora EPEL (7) " Fingerprint: 91e9 7d7c 4a5e 96f1 7f3e 888f 6a2f aea2 352c 64e5 Package : epel-release-7-9.noarch (@extras) From : /etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-7 TASK [ceph-common : install redhat ceph-mon package] *************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:21 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : install redhat ceph-osd package] *************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:28 changed: [osd0] => { "changed": true, "failed": false, "rc": 0, "results": [ "Loaded plugins: fastestmirror\nLoading mirror speeds from cached hostfile\n * base: centos.mirrors.ovh.net\n * epel: mirror.switch.ch\n * extras: mirrors.standaloneinstaller.com\n * updates: centos.mirrors.ovh.net\nResolving Dependencies\n--> Running transaction check\n---> Package ceph-osd.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n--> Processing Dependency: ceph-base = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-osd-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: gdisk for package: 2:ceph-osd-13.0.1-3240.g235f211.el7.x86_64\n--> Running transaction check\n---> Package ceph-base.x86_64 2:13.0.1-3240.g235f211.el7 will be 
installed\n--> Processing Dependency: ceph-selinux = 2:13.0.1-3240.g235f211.el7 for package: 2:ceph-base-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: psmisc for package: 2:ceph-base-13.0.1-3240.g235f211.el7.x86_64\n--> Processing Dependency: cryptsetup for package: 2:ceph-base-13.0.1-3240.g235f211.el7.x86_64\n---> Package gdisk.x86_64 0:0.8.6-5.el7 will be installed\n--> Processing Dependency: libicuuc.so.50()(64bit) for package: gdisk-0.8.6-5.el7.x86_64\n--> Processing Dependency: libicuio.so.50()(64bit) for package: gdisk-0.8.6-5.el7.x86_64\n--> Running transaction check\n---> Package ceph-selinux.x86_64 2:13.0.1-3240.g235f211.el7 will be installed\n---> Package cryptsetup.x86_64 0:1.7.4-3.el7_4.1 will be installed\n---> Package libicu.x86_64 0:50.1.2-15.el7 will be installed\n---> Package psmisc.x86_64 0:22.20-15.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository Size\n================================================================================\nInstalling:\n ceph-osd x86_64 2:13.0.1-3240.g235f211.el7 ceph 12 M\nInstalling for dependencies:\n ceph-base x86_64 2:13.0.1-3240.g235f211.el7 ceph 4.6 M\n ceph-selinux x86_64 2:13.0.1-3240.g235f211.el7 ceph 19 k\n cryptsetup x86_64 1.7.4-3.el7_4.1 updates 128 k\n gdisk x86_64 0.8.6-5.el7 base 187 k\n libicu x86_64 50.1.2-15.el7 base 6.9 M\n psmisc x86_64 22.20-15.el7 base 141 k\n\nTransaction Summary\n================================================================================\nInstall 1 Package (+6 Dependent packages)\n\nTotal download size: 24 M\nInstalled size: 86 M\nDownloading packages:\n--------------------------------------------------------------------------------\nTotal 9.4 MB/s | 24 MB 00:02 \nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : psmisc-22.20-15.el7.x86_64 1/7 \n 
Installing : cryptsetup-1.7.4-3.el7_4.1.x86_64 2/7 \n Installing : 2:ceph-base-13.0.1-3240.g235f211.el7.x86_64 3/7 \n Installing : 2:ceph-selinux-13.0.1-3240.g235f211.el7.x86_64 4/7 \n/usr/lib/python2.7/site-packages/ceph_disk/main.py:5689: UserWarning: \n*******************************************************************************\nThis tool is now deprecated in favor of ceph-volume.\nIt is recommended to use ceph-volume for OSD deployments. For details see:\n\n http://docs.ceph.com/docs/master/ceph-volume/#migrating\n\n*******************************************************************************\n\n warnings.warn(DEPRECATION_WARNING)\n/usr/lib/python2.7/site-packages/ceph_disk/main.py:5721: UserWarning: \n*******************************************************************************\nThis tool is now deprecated in favor of ceph-volume.\nIt is recommended to use ceph-volume for OSD deployments. For details see:\n\n http://docs.ceph.com/docs/master/ceph-volume/#migrating\n\n*******************************************************************************\n\n warnings.warn(DEPRECATION_WARNING)\n Installing : libicu-50.1.2-15.el7.x86_64 5/7 \n Installing : gdisk-0.8.6-5.el7.x86_64 6/7 \n Installing : 2:ceph-osd-13.0.1-3240.g235f211.el7.x86_64 7/7 \n Verifying : gdisk-0.8.6-5.el7.x86_64 1/7 \n Verifying : 2:ceph-selinux-13.0.1-3240.g235f211.el7.x86_64 2/7 \n Verifying : libicu-50.1.2-15.el7.x86_64 3/7 \n Verifying : cryptsetup-1.7.4-3.el7_4.1.x86_64 4/7 \n Verifying : psmisc-22.20-15.el7.x86_64 5/7 \n Verifying : 2:ceph-osd-13.0.1-3240.g235f211.el7.x86_64 6/7 \n Verifying : 2:ceph-base-13.0.1-3240.g235f211.el7.x86_64 7/7 \n\nInstalled:\n ceph-osd.x86_64 2:13.0.1-3240.g235f211.el7 \n\nDependency Installed:\n ceph-base.x86_64 2:13.0.1-3240.g235f211.el7 \n ceph-selinux.x86_64 2:13.0.1-3240.g235f211.el7 \n cryptsetup.x86_64 0:1.7.4-3.el7_4.1 \n gdisk.x86_64 0:0.8.6-5.el7 \n libicu.x86_64 0:50.1.2-15.el7 \n psmisc.x86_64 0:22.20-15.el7 \n\nComplete!\n" ] } TASK 
[ceph-common : install redhat ceph-fuse package] ************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:35 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : install redhat ceph-base package] ************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:42 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : install redhat ceph-test package] ************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:49 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : install redhat ceph-radosgw package] *********************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/installs/install_redhat_packages.yml:56 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : include installs/install_on_suse.yml] ********************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:31 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : include installs/install_on_debian.yml] ******************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:40 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : include installs/install_on_clear.yml] ********************* task path: 
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:49 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : include ntp debian setup tasks] **************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:58 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : include ntp rpm setup tasks] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:66 included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/ntp_rpm.yml for osd0 TASK [ceph-common : install ntp] *********************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/ntp_rpm.yml:2 changed: [osd0] => { "changed": true, "failed": false, "rc": 0, "results": [ "Loaded plugins: fastestmirror\nLoading mirror speeds from cached hostfile\n * base: centos.mirrors.ovh.net\n * epel: mirror.switch.ch\n * extras: mirrors.standaloneinstaller.com\n * updates: centos.mirrors.ovh.net\nResolving Dependencies\n--> Running transaction check\n---> Package ntp.x86_64 0:4.2.6p5-25.el7.centos.2 will be installed\n--> Processing Dependency: ntpdate = 4.2.6p5-25.el7.centos.2 for package: ntp-4.2.6p5-25.el7.centos.2.x86_64\n--> Processing Dependency: libopts.so.25()(64bit) for package: ntp-4.2.6p5-25.el7.centos.2.x86_64\n--> Running transaction check\n---> Package autogen-libopts.x86_64 0:5.18-5.el7 will be installed\n---> Package ntpdate.x86_64 0:4.2.6p5-25.el7.centos.2 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository\n 
Size\n================================================================================\nInstalling:\n ntp x86_64 4.2.6p5-25.el7.centos.2 base 547 k\nInstalling for dependencies:\n autogen-libopts x86_64 5.18-5.el7 base 66 k\n ntpdate x86_64 4.2.6p5-25.el7.centos.2 base 86 k\n\nTransaction Summary\n================================================================================\nInstall 1 Package (+2 Dependent packages)\n\nTotal download size: 699 k\nInstalled size: 1.6 M\nDownloading packages:\n--------------------------------------------------------------------------------\nTotal 1.6 MB/s | 699 kB 00:00 \nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : autogen-libopts-5.18-5.el7.x86_64 1/3 \n Installing : ntpdate-4.2.6p5-25.el7.centos.2.x86_64 2/3 \n Installing : ntp-4.2.6p5-25.el7.centos.2.x86_64 3/3 \n Verifying : ntp-4.2.6p5-25.el7.centos.2.x86_64 1/3 \n Verifying : ntpdate-4.2.6p5-25.el7.centos.2.x86_64 2/3 \n Verifying : autogen-libopts-5.18-5.el7.x86_64 3/3 \n\nInstalled:\n ntp.x86_64 0:4.2.6p5-25.el7.centos.2 \n\nDependency Installed:\n autogen-libopts.x86_64 0:5.18-5.el7 ntpdate.x86_64 0:4.2.6p5-25.el7.centos.2 \n\nComplete!\n" ] } TASK [ceph-common : start the ntp service] ************************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/misc/ntp_rpm.yml:7 changed: [osd0] => { "changed": true, "enabled": true, "failed": false, "name": "ntpd", "state": "started", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system.slice systemd-journald.socket tmp.mount ntpdate.service basic.target -.mount syslog.target sntp.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "shutdown.target chronyd.service", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", 
"CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConflictedBy": "chronyd.service", "Conflicts": "shutdown.target", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "Network Time Service", "DevicePolicy": "auto", "EnvironmentFile": "/etc/sysconfig/ntpd (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/ntpd ; argv[]=/usr/sbin/ntpd -u ntp:ntp $OPTIONS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/ntpd.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "ntpd.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "1885", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "1885", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": 
"18446744073709551615", "MountFlags": "0", "Names": "ntpd.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "yes", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Requires": "-.mount basic.target", "RequiresMountsFor": "/var/tmp", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "forking", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "disabled", "Wants": "system.slice", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [ceph-common : get ceph version] ****************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:74 ok: [osd0] => { "changed": false, "cmd": [ "ceph", "--version" ], "delta": "0:00:00.154162", "end": "2018-03-25 23:27:40.944735", "failed": false, "rc": 0, "start": "2018-03-25 
23:27:40.790573" } STDOUT: ceph version 13.0.1-3240-g235f211 (235f2119010484c12c5bd29421aeef7d44df38a1) mimic (dev) TASK [ceph-common : set_fact ceph_version] ************************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/main.yml:80 ok: [osd0] => { "ansible_facts": { "ceph_version": "13.0.1-3240-g235f211" }, "changed": false, "failed": false } TASK [ceph-common : set_fact ceph_release jewel] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/release-rhcs.yml:2 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : set_fact ceph_release kraken] ****************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/release-rhcs.yml:8 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : set_fact ceph_release luminous] **************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/release-rhcs.yml:14 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : set_fact ceph_release mimic] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/release-rhcs.yml:20 ok: [osd0] => { "ansible_facts": { "ceph_release": "mimic" }, "changed": false, "failed": false } TASK [ceph-common : check if /var/lib/ceph/mon/ceph-osd0/keyring already exists] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/facts_mon_fsid.yml:2 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : fail if /var/lib/ceph/mon/ceph-osd0/keyring 
doesn't exist] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/facts_mon_fsid.yml:7 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : get existing initial mon keyring if it already exists but not monitor_keyring.conf in /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/facts_mon_fsid.yml:14 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : test existing initial mon keyring] ************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/facts_mon_fsid.yml:22 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : fail if initial mon keyring found doesn't work] ************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/facts_mon_fsid.yml:27 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : write initial mon keyring in /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/fetch/monitor_keyring.conf if it doesn't exist] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/facts_mon_fsid.yml:33 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : put initial mon keyring in mon kv store] ******************* task path: 
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/facts_mon_fsid.yml:41 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : create ceph initial directories] *************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/create_ceph_initial_dirs.yml:2 changed: [osd0] => (item=/etc/ceph) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/etc/ceph", "mode": "0755", "owner": "ceph", "path": "/etc/ceph", "secontext": "system_u:object_r:etc_t:s0", "size": 20, "state": "directory", "uid": 167 } changed: [osd0] => (item=/var/lib/ceph/) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/", "secontext": "system_u:object_r:ceph_var_lib_t:s0", "size": 133, "state": "directory", "uid": 167 } changed: [osd0] => (item=/var/lib/ceph/mon) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/mon", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/mon", "secontext": "unconfined_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } changed: [osd0] => (item=/var/lib/ceph/osd) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/osd", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/osd", "secontext": "system_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } changed: [osd0] => (item=/var/lib/ceph/mds) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/mds", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/mds", "secontext": "unconfined_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } changed: [osd0] => (item=/var/lib/ceph/tmp) => { "changed": true, "failed": false, "gid": 167, 
"group": "ceph", "item": "/var/lib/ceph/tmp", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/tmp", "secontext": "system_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } changed: [osd0] => (item=/var/lib/ceph/radosgw) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/radosgw", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/radosgw", "secontext": "unconfined_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } changed: [osd0] => (item=/var/lib/ceph/bootstrap-rgw) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/bootstrap-rgw", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/bootstrap-rgw", "secontext": "system_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } changed: [osd0] => (item=/var/lib/ceph/bootstrap-mds) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/bootstrap-mds", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/bootstrap-mds", "secontext": "system_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } changed: [osd0] => (item=/var/lib/ceph/bootstrap-osd) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/bootstrap-osd", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/bootstrap-osd", "secontext": "system_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } changed: [osd0] => (item=/var/lib/ceph/bootstrap-rbd) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/bootstrap-rbd", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/bootstrap-rbd", "secontext": "system_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } TASK [ceph-common : create rbd client directory] ******************************* task path: 
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/create_rbd_client_dir.yml:2 ok: [osd0] => (item=/var/run/ceph) => { "changed": false, "failed": false, "gid": 167, "group": "ceph", "item": "/var/run/ceph", "mode": "0770", "owner": "ceph", "path": "/var/run/ceph", "secontext": "system_u:object_r:ceph_var_run_t:s0", "size": 40, "state": "directory", "uid": 167 } changed: [osd0] => (item=/var/log/ceph) => { "changed": true, "failed": false, "gid": 167, "group": "ceph", "item": "/var/log/ceph", "mode": "0770", "owner": "ceph", "path": "/var/log/ceph", "secontext": "system_u:object_r:ceph_log_t:s0", "size": 6, "state": "directory", "uid": 167 } TASK [ceph-common : configure cluster name] ************************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/configure_cluster_name.yml:2 changed: [osd0] => { "backup": "", "changed": true, "failed": false } MSG: line added TASK [ceph-common : check /etc/default/ceph exist] ***************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/configure_cluster_name.yml:22 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : configure cluster name] ************************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/configure_cluster_name.yml:30 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : configure cluster name] ************************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/configure_cluster_name.yml:42 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : configure TCMALLOC_MAX_TOTAL_THREAD_CACHE_BYTES for 
debian] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/configure_memory_allocator.yml:2 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-common : configure TCMALLOC_MAX_TOTAL_THREAD_CACHE_BYTES for redhat] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-common/tasks/configure_memory_allocator.yml:15 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-config : create ceph conf directory] ******************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-config/tasks/main.yml:4 ok: [osd0] => { "changed": false, "failed": false, "gid": 167, "group": "ceph", "mode": "0755", "owner": "ceph", "path": "/etc/ceph", "secontext": "system_u:object_r:etc_t:s0", "size": 20, "state": "directory", "uid": 167 } TASK [ceph-config : generate ceph configuration file: ceph.conf] *************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-config/tasks/main.yml:12 [DEPRECATION WARNING]: ansible.utils.unicode.to_bytes is deprecated. Use ansible.module_utils._text.to_bytes instead. This feature will be removed in version 2.4. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. [DEPRECATION WARNING]: ansible.utils.unicode.to_unicode is deprecated. Use ansible.module_utils._text.to_text instead. This feature will be removed in version 2.4. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 
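(The two deprecation warnings above come from ceph-ansible's templating step: Ansible 2.x replaced `ansible.utils.unicode.to_bytes`/`to_unicode` with `ansible.module_utils._text.to_bytes`/`to_text`. As a rough illustration of what such conversion helpers do, here is a minimal standalone sketch — a hypothetical shim, not Ansible's actual implementation:)

```python
def to_bytes(obj, encoding="utf-8", errors="strict"):
    # Coerce text to bytes, passing bytes through unchanged --
    # the general contract of Ansible's to_bytes helpers.
    if isinstance(obj, bytes):
        return obj
    if isinstance(obj, str):
        return obj.encode(encoding, errors)
    return str(obj).encode(encoding, errors)


def to_text(obj, encoding="utf-8", errors="strict"):
    # Coerce bytes to text, passing text through unchanged.
    if isinstance(obj, str):
        return obj
    if isinstance(obj, bytes):
        return obj.decode(encoding, errors)
    return str(obj)
```

(Within a real playbook run nothing needs to change; the warnings only signal that ceph-ansible itself should migrate to the `ansible.module_utils._text` imports, which it did in later releases.)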
NOTIFIED HANDLER ceph-defaults : set _mon_handler_called before restart NOTIFIED HANDLER ceph-defaults : copy mon restart script NOTIFIED HANDLER ceph-defaults : restart ceph mon daemon(s) - non container NOTIFIED HANDLER ceph-defaults : restart ceph mon daemon(s) - container NOTIFIED HANDLER ceph-defaults : set _mon_handler_called after restart NOTIFIED HANDLER ceph-defaults : set _osd_handler_called before restart NOTIFIED HANDLER ceph-defaults : copy osd restart script NOTIFIED HANDLER ceph-defaults : restart ceph osds daemon(s) - non container NOTIFIED HANDLER ceph-defaults : restart ceph osds daemon(s) - container NOTIFIED HANDLER ceph-defaults : set _osd_handler_called after restart NOTIFIED HANDLER ceph-defaults : set _mds_handler_called before restart NOTIFIED HANDLER ceph-defaults : copy mds restart script NOTIFIED HANDLER ceph-defaults : restart ceph mds daemon(s) - non container NOTIFIED HANDLER ceph-defaults : restart ceph mds daemon(s) - container NOTIFIED HANDLER ceph-defaults : set _mds_handler_called after restart NOTIFIED HANDLER ceph-defaults : set _rgw_handler_called before restart NOTIFIED HANDLER ceph-defaults : copy rgw restart script NOTIFIED HANDLER ceph-defaults : restart ceph rgw daemon(s) - non container NOTIFIED HANDLER ceph-defaults : restart ceph rgw daemon(s) - container NOTIFIED HANDLER ceph-defaults : set _rgw_handler_called after restart NOTIFIED HANDLER ceph-defaults : set _mgr_handler_called before restart NOTIFIED HANDLER ceph-defaults : copy mgr restart script NOTIFIED HANDLER ceph-defaults : restart ceph mgr daemon(s) - non container NOTIFIED HANDLER ceph-defaults : restart ceph mgr daemon(s) - container NOTIFIED HANDLER ceph-defaults : set _mgr_handler_called after restart NOTIFIED HANDLER ceph-defaults : set _rbdmirror_handler_called before restart NOTIFIED HANDLER ceph-defaults : copy rbd mirror restart script NOTIFIED HANDLER ceph-defaults : restart ceph rbd mirror daemon(s) - non container NOTIFIED HANDLER ceph-defaults : 
restart ceph rbd mirror daemon(s) - container NOTIFIED HANDLER ceph-defaults : set _rbdmirror_handler_called after restart changed: [osd0] => { "changed": true, "checksum": "028a0a3ceb5d8b25a3b35f83285a5e017c9e7652", "dest": "/etc/ceph/ceph.conf", "failed": false, "gid": 167, "group": "ceph", "md5sum": "c3cca8ee26b0646a4d9a7b7bbe079b7c", "mode": "0644", "owner": "ceph", "secontext": "system_u:object_r:etc_t:s0", "size": 197, "src": "/home/vagrant/.ansible/tmp/ansible-tmp-1522020500.02-247590564013112/source", "state": "file", "uid": 167 } TASK [ceph-config : create a local fetch directory if it does not exist] ******* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-config/tasks/main.yml:38 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-config : generate cluster uuid] ************************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-config/tasks/main.yml:54 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-config : read cluster uuid if it already exists] ******************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-config/tasks/main.yml:64 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-config : ensure /etc/ceph exists] *********************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-config/tasks/main.yml:76 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-config : generate ceph.conf configuration file] ********************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-config/tasks/main.yml:86 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was 
False", "skipped": true } TASK [ceph-config : set fsid fact when generate_fsid = true] ******************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-config/tasks/main.yml:104 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : set_fact docker_exec_cmd] ************************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/main.yml:2 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : make sure public_network configured] ************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_mandatory_vars.yml:2 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : make sure cluster_network configured] ************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_mandatory_vars.yml:8 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : make sure journal_size configured] **************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_mandatory_vars.yml:15 skipping: [osd0] => { "skip_reason": "Conditional result was False" } TASK [ceph-osd : make sure an osd scenario was chosen] ************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_mandatory_vars.yml:23 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : make sure a valid osd scenario was chosen] ******************** task path: 
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_mandatory_vars.yml:31 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : verify devices have been provided] **************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_mandatory_vars.yml:39 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : check if osd_scenario lvm is supported by the selected ceph version] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_mandatory_vars.yml:49 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : verify lvm_volumes have been provided] ************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_mandatory_vars.yml:59 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : make sure the lvm_volumes variable is a list] ***************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_mandatory_vars.yml:69 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : make sure the devices variable is a list] ********************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_mandatory_vars.yml:79 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : verify dedicated devices have been provided] ****************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_mandatory_vars.yml:88 skipping: [osd0] => { 
"changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : make sure the dedicated_devices variable is a list] *********** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_mandatory_vars.yml:98 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : check if bluestore is supported by the selected ceph version] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_mandatory_vars.yml:109 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : include system_tuning.yml] ************************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/main.yml:11 included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/system_tuning.yml for osd0 TASK [ceph-osd : disable osd directory parsing by updatedb] ******************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/system_tuning.yml:2 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : disable osd directory path in updatedb.conf] ****************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/system_tuning.yml:11 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : create tmpfiles.d directory] ********************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/system_tuning.yml:22 ok: [osd0] => { "changed": false, "failed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tmpfiles.d", "secontext": "system_u:object_r:etc_t:s0", 
"size": 6, "state": "directory", "uid": 0 } TASK [ceph-osd : disable transparent hugepage] ********************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/system_tuning.yml:33 changed: [osd0] => { "changed": true, "checksum": "c2fa1ea2403821dbbc293a7dfc0bc12be139bed4", "dest": "/etc/tmpfiles.d/ceph_transparent_hugepage.conf", "failed": false, "gid": 0, "group": "root", "md5sum": "cbe8f482777cfcb639343245d7cdb81e", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 79, "src": "/home/vagrant/.ansible/tmp/ansible-tmp-1522020509.11-120798986597315/source", "state": "file", "uid": 0 } TASK [ceph-osd : get default vm.min_free_kbytes] ******************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/system_tuning.yml:45 ok: [osd0] => { "changed": false, "cmd": [ "sysctl", "-b", "vm.min_free_kbytes" ], "delta": "0:00:00.017529", "end": "2018-03-25 23:28:36.443886", "failed": false, "failed_when_result": false, "rc": 0, "start": "2018-03-25 23:28:36.426357" } STDOUT: 2823 TASK [ceph-osd : set_fact vm_min_free_kbytes] ********************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/system_tuning.yml:52 ok: [osd0] => { "ansible_facts": { "vm_min_free_kbytes": "2823" }, "changed": false, "failed": false } TASK [ceph-osd : apply operating system tuning] ******************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/system_tuning.yml:56 changed: [osd0] => (item={u'value': 4194303, u'name': u'kernel.pid_max'}) => { "changed": true, "failed": false, "item": { "name": "kernel.pid_max", "value": 4194303 } } changed: [osd0] => (item={u'value': 26234859, u'name': u'fs.file-max'}) => { "changed": true, "failed": false, "item": { "name": "fs.file-max", "value": 26234859 } } TASK [ceph-osd : increase 
aio-max-nr for bluestore] **************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/system_tuning.yml:66 changed: [osd0] => { "changed": true, "failed": false } TASK [ceph-osd : include pre_requisite.yml] ************************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/main.yml:16 included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/pre_requisite.yml for osd0 TASK [ceph-osd : install dependencies] ***************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/pre_requisite.yml:2 ok: [osd0] => { "changed": false, "failed": false, "rc": 0, "results": [ "parted-3.1-28.el7.x86_64 providing parted is already installed" ] } TASK [ceph-osd : create bootstrap-osd and osd directories] ********************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/pre_requisite.yml:9 ok: [osd0] => (item=/var/lib/ceph/bootstrap-osd/) => { "changed": false, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/bootstrap-osd/", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/bootstrap-osd/", "secontext": "system_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } ok: [osd0] => (item=/var/lib/ceph/osd/) => { "changed": false, "failed": false, "gid": 167, "group": "ceph", "item": "/var/lib/ceph/osd/", "mode": "0755", "owner": "ceph", "path": "/var/lib/ceph/osd/", "secontext": "system_u:object_r:ceph_var_lib_t:s0", "size": 6, "state": "directory", "uid": 167 } TASK [ceph-osd : copy osd bootstrap key] *************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/pre_requisite.yml:22 changed: [osd0] => (item={u'copy_key': True, u'name': u'/var/lib/ceph/bootstrap-osd/ceph.keyring'}) => 
{ "changed": true, "checksum": "35157895bacf561421cf9ab3a616dbe3adeac005", "dest": "/var/lib/ceph/bootstrap-osd/ceph.keyring", "failed": false, "gid": 167, "group": "ceph", "item": { "copy_key": true, "name": "/var/lib/ceph/bootstrap-osd/ceph.keyring" }, "md5sum": "02e40668450c887e3c7114a61f8d2951", "mode": "0600", "owner": "ceph", "secontext": "system_u:object_r:ceph_var_lib_t:s0", "size": 71, "src": "/home/vagrant/.ansible/tmp/ansible-tmp-1522020533.19-165165665547537/source", "state": "file", "uid": 167 } changed: [osd0] => (item={u'copy_key': True, u'name': u'/etc/ceph/ceph.client.admin.keyring'}) => { "changed": true, "checksum": "6d24ed5b75493c61626b6c530f15b0f0b42353f7", "dest": "/etc/ceph/ceph.client.admin.keyring", "failed": false, "gid": 167, "group": "ceph", "item": { "copy_key": true, "name": "/etc/ceph/ceph.client.admin.keyring" }, "md5sum": "7939e6c27be842a6a5d862bbab67d911", "mode": "0600", "owner": "ceph", "secontext": "system_u:object_r:etc_t:s0", "size": 63, "src": "/home/vagrant/.ansible/tmp/ansible-tmp-1522020538.06-38583695745486/source", "state": "file", "uid": 167 } TASK [ceph-osd : set_fact ceph_disk_cli_options '--cluster ceph --bluestore'] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/ceph_disk_cli_options_facts.yml:2 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : set_fact ceph_disk_cli_options 'ceph_disk_cli_options'] ******* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/ceph_disk_cli_options_facts.yml:11 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : set_fact ceph_disk_cli_options '--cluster ceph'] ************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/ceph_disk_cli_options_facts.yml:20 skipping: [osd0] => { "changed": false, 
"skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : set_fact ceph_disk_cli_options '--cluster ceph --bluestore --dmcrypt'] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/ceph_disk_cli_options_facts.yml:29 ok: [osd0] => { "ansible_facts": { "ceph_disk_cli_options": "--cluster ceph --bluestore --dmcrypt" }, "changed": false, "failed": false } TASK [ceph-osd : set_fact ceph_disk_cli_options '--cluster ceph --filestore --dmcrypt'] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/ceph_disk_cli_options_facts.yml:38 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : set_fact ceph_disk_cli_options '--cluster ceph --dmcrypt'] **** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/ceph_disk_cli_options_facts.yml:47 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : set_fact docker_env_args '-e KV_TYPE=etcd -e KV_IP=127.0.0.1 -e KV_PORT=2379'] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/ceph_disk_cli_options_facts.yml:56 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : set_fact docker_env_args '-e OSD_BLUESTORE=0 -e OSD_FILESTORE=1 -e OSD_DMCRYPT=0'] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/ceph_disk_cli_options_facts.yml:62 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : set_fact docker_env_args '-e OSD_BLUESTORE=0 -e OSD_FILESTORE=1 -e OSD_DMCRYPT=1'] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/ceph_disk_cli_options_facts.yml:70 skipping: [osd0] => { 
"changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : set_fact docker_env_args '-e OSD_BLUESTORE=1 -e OSD_FILESTORE=0 -e OSD_DMCRYPT=0'] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/ceph_disk_cli_options_facts.yml:78 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : set_fact docker_env_args '-e OSD_BLUESTORE=1 -e OSD_FILESTORE=0 -e OSD_DMCRYPT=1'] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/ceph_disk_cli_options_facts.yml:86 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : set_fact devices generate device list when osd_auto_discovery] *** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/build_devices.yml:2 skipping: [osd0] => (item={'key': u'dm-4', 'value': {u'support_discard': u'512', u'partitions': {}, u'holders': [], u'host': u'', u'links': {u'labels': [], u'masters': [], u'uuids': [], u'ids': [u'dm-name-journals-journal1', u'dm-uuid-LVM-wwUdZtb9nz33ukSV8HrdMK2LgARa55ikH7sf8AFbTyisn3J1Xe0bdTL9se0ApudF']}, u'removable': u'0', u'model': None, u'size': u'6.00 GB', u'virtual': 1, u'sas_address': None, u'sas_device_handle': None, u'sectorsize': u'512', u'scheduler_mode': u'', u'rotational': u'1', u'sectors': u'12574720', u'vendor': None}}) => { "changed": false, "item": { "key": "dm-4", "value": { "holders": [], "host": "", "links": { "ids": [ "dm-name-journals-journal1", "dm-uuid-LVM-wwUdZtb9nz33ukSV8HrdMK2LgARa55ikH7sf8AFbTyisn3J1Xe0bdTL9se0ApudF" ], "labels": [], "masters": [], "uuids": [] }, "model": null, "partitions": {}, "removable": "0", "rotational": "1", "sas_address": null, "sas_device_handle": null, "scheduler_mode": "", "sectors": "12574720", "sectorsize": "512", "size": "6.00 GB", "support_discard": "512", 
"vendor": null, "virtual": 1 } }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [osd0] => (item={'key': u'sdd', 'value': {u'support_discard': u'512', u'partitions': {u'sdd2': {u'links': {u'labels': [], u'masters': [], u'uuids': [], u'ids': [u'ata-QEMU_HARDDISK_QM00004-part2']}, u'uuid': None, u'holders': [], u'size': u'6.00 GB', u'sectorsize': 512, u'start': u'12582912', u'sectors': u'12580864'}, u'sdd1': {u'links': {u'labels': [], u'masters': [], u'uuids': [], u'ids': [u'ata-QEMU_HARDDISK_QM00004-part1']}, u'uuid': None, u'holders': [], u'size': u'6.00 GB', u'sectorsize': 512, u'start': u'2048', u'sectors': u'12580864'}}, u'holders': [], u'host': u'', u'links': {u'labels': [], u'masters': [], u'uuids': [], u'ids': [u'ata-QEMU_HARDDISK_QM00004']}, u'removable': u'0', u'model': u'QEMU HARDDISK', u'size': u'12.00 GB', u'virtual': 1, u'sas_address': None, u'sas_device_handle': None, u'sectorsize': u'512', u'scheduler_mode': u'cfq', u'rotational': u'1', u'sectors': u'25165824', u'vendor': u'ATA'}}) => { "changed": false, "item": { "key": "sdd", "value": { "holders": [], "host": "", "links": { "ids": [ "ata-QEMU_HARDDISK_QM00004" ], "labels": [], "masters": [], "uuids": [] }, "model": "QEMU HARDDISK", "partitions": { "sdd1": { "holders": [], "links": { "ids": [ "ata-QEMU_HARDDISK_QM00004-part1" ], "labels": [], "masters": [], "uuids": [] }, "sectors": "12580864", "sectorsize": 512, "size": "6.00 GB", "start": "2048", "uuid": null }, "sdd2": { "holders": [], "links": { "ids": [ "ata-QEMU_HARDDISK_QM00004-part2" ], "labels": [], "masters": [], "uuids": [] }, "sectors": "12580864", "sectorsize": 512, "size": "6.00 GB", "start": "12582912", "uuid": null } }, "removable": "0", "rotational": "1", "sas_address": null, "sas_device_handle": null, "scheduler_mode": "cfq", "sectors": "25165824", "sectorsize": "512", "size": "12.00 GB", "support_discard": "512", "vendor": "ATA", "virtual": 1 } }, "skip_reason": "Conditional result was False", "skipped": 
true } skipping: [osd0] => (item={'key': u'dm-1', 'value': {u'support_discard': u'0', u'partitions': {}, u'holders': [], u'host': u'', u'links': {u'labels': [], u'masters': [], u'uuids': [u'4c9ed4bd-f9de-4040-a24b-00273f48bafa'], u'ids': [u'dm-name-VolGroup00-LogVol01', u'dm-uuid-LVM-5wSnGs2SfytkqLD4ux97trCY33RcU3BphP5VJPZ5F35Y3nnHDjLBowHIMqUiY21i']}, u'removable': u'0', u'model': None, u'size': u'1.50 GB', u'virtual': 1, u'sas_address': None, u'sas_device_handle': None, u'sectorsize': u'512', u'scheduler_mode': u'', u'rotational': u'1', u'sectors': u'3145728', u'vendor': None}}) => { "changed": false, "item": { "key": "dm-1", "value": { "holders": [], "host": "", "links": { "ids": [ "dm-name-VolGroup00-LogVol01", "dm-uuid-LVM-5wSnGs2SfytkqLD4ux97trCY33RcU3BphP5VJPZ5F35Y3nnHDjLBowHIMqUiY21i" ], "labels": [], "masters": [], "uuids": [ "4c9ed4bd-f9de-4040-a24b-00273f48bafa" ] }, "model": null, "partitions": {}, "removable": "0", "rotational": "1", "sas_address": null, "sas_device_handle": null, "scheduler_mode": "", "sectors": "3145728", "sectorsize": "512", "size": "1.50 GB", "support_discard": "0", "vendor": null, "virtual": 1 } }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [osd0] => (item={'key': u'dm-0', 'value': {u'support_discard': u'0', u'partitions': {}, u'holders': [], u'host': u'', u'links': {u'labels': [], u'masters': [], u'uuids': [u'b276e8a3-557b-43ca-9e88-a3af58c45856'], u'ids': [u'dm-name-VolGroup00-LogVol00', u'dm-uuid-LVM-5wSnGs2SfytkqLD4ux97trCY33RcU3BprTfTJTQFonBpt7ihvJF1yFJCfG14LBba']}, u'removable': u'0', u'model': None, u'size': u'37.47 GB', u'virtual': 1, u'sas_address': None, u'sas_device_handle': None, u'sectorsize': u'512', u'scheduler_mode': u'', u'rotational': u'1', u'sectors': u'78577664', u'vendor': None}}) => { "changed": false, "item": { "key": "dm-0", "value": { "holders": [], "host": "", "links": { "ids": [ "dm-name-VolGroup00-LogVol00", 
"dm-uuid-LVM-5wSnGs2SfytkqLD4ux97trCY33RcU3BprTfTJTQFonBpt7ihvJF1yFJCfG14LBba" ], "labels": [], "masters": [], "uuids": [ "b276e8a3-557b-43ca-9e88-a3af58c45856" ] }, "model": null, "partitions": {}, "removable": "0", "rotational": "1", "sas_address": null, "sas_device_handle": null, "scheduler_mode": "", "sectors": "78577664", "sectorsize": "512", "size": "37.47 GB", "support_discard": "0", "vendor": null, "virtual": 1 } }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [osd0] => (item={'key': u'dm-3', 'value': {u'support_discard': u'512', u'partitions': {}, u'holders': [], u'host': u'', u'links': {u'labels': [], u'masters': [], u'uuids': [], u'ids': [u'dm-name-test_group-data--lv2', u'dm-uuid-LVM-IhlLVZ3cCdF3sdAeMDOJxDh8pbXfTF9OuktWUgOTJehT0LkqJXq1Qk8m3gvmEtoY']}, u'removable': u'0', u'model': None, u'size': u'3.00 GB', u'virtual': 1, u'sas_address': None, u'sas_device_handle': None, u'sectorsize': u'512', u'scheduler_mode': u'', u'rotational': u'1', u'sectors': u'6291456', u'vendor': None}}) => { "changed": false, "item": { "key": "dm-3", "value": { "holders": [], "host": "", "links": { "ids": [ "dm-name-test_group-data--lv2", "dm-uuid-LVM-IhlLVZ3cCdF3sdAeMDOJxDh8pbXfTF9OuktWUgOTJehT0LkqJXq1Qk8m3gvmEtoY" ], "labels": [], "masters": [], "uuids": [] }, "model": null, "partitions": {}, "removable": "0", "rotational": "1", "sas_address": null, "sas_device_handle": null, "scheduler_mode": "", "sectors": "6291456", "sectorsize": "512", "size": "3.00 GB", "support_discard": "512", "vendor": null, "virtual": 1 } }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [osd0] => (item={'key': u'dm-2', 'value': {u'support_discard': u'512', u'partitions': {}, u'holders': [], u'host': u'', u'links': {u'labels': [], u'masters': [], u'uuids': [], u'ids': [u'dm-name-test_group-data--lv1', u'dm-uuid-LVM-IhlLVZ3cCdF3sdAeMDOJxDh8pbXfTF9OFxIj64DaiR1xtDx2jlNyf6Iit5bpFfPU']}, u'removable': u'0', u'model': None, u'size': u'6.00 GB', 
u'virtual': 1, u'sas_address': None, u'sas_device_handle': None, u'sectorsize': u'512', u'scheduler_mode': u'', u'rotational': u'1', u'sectors': u'12574720', u'vendor': None}}) => { "changed": false, "item": { "key": "dm-2", "value": { "holders": [], "host": "", "links": { "ids": [ "dm-name-test_group-data--lv1", "dm-uuid-LVM-IhlLVZ3cCdF3sdAeMDOJxDh8pbXfTF9OFxIj64DaiR1xtDx2jlNyf6Iit5bpFfPU" ], "labels": [], "masters": [], "uuids": [] }, "model": null, "partitions": {}, "removable": "0", "rotational": "1", "sas_address": null, "sas_device_handle": null, "scheduler_mode": "", "sectors": "12574720", "sectorsize": "512", "size": "6.00 GB", "support_discard": "512", "vendor": null, "virtual": 1 } }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [osd0] => (item={'key': u'vda', 'value': {u'support_discard': u'0', u'partitions': {u'vda2': {u'links': {u'labels': [], u'masters': [], u'uuids': [u'df1e67ab-46b0-4b49-a861-e7d10736d467'], u'ids': []}, u'uuid': u'df1e67ab-46b0-4b49-a861-e7d10736d467', u'holders': [], u'size': u'1.00 GB', u'sectorsize': 512, u'start': u'4096', u'sectors': u'2097152'}, u'vda3': {u'links': {u'labels': [], u'masters': [u'dm-0', u'dm-1'], u'uuids': [], u'ids': [u'lvm-pv-uuid-YzzC0u-LNYi-agLJ-WvyP-YaDX-BFxo-XDIm7e']}, u'uuid': None, u'holders': [u'VolGroup00-LogVol00', u'VolGroup00-LogVol01'], u'size': u'39.00 GB', u'sectorsize': 512, u'start': u'2101248', u'sectors': u'81784832'}, u'vda1': {u'links': {u'labels': [], u'masters': [], u'uuids': [], u'ids': []}, u'uuid': None, u'holders': [], u'size': u'1.00 MB', u'sectorsize': 512, u'start': u'2048', u'sectors': u'2048'}}, u'holders': [], u'host': u'', u'links': {u'labels': [], u'masters': [], u'uuids': [], u'ids': []}, u'removable': u'0', u'model': None, u'size': u'41.00 GB', u'virtual': 1, u'sas_address': None, u'sas_device_handle': None, u'sectorsize': u'512', u'scheduler_mode': u'', u'rotational': u'1', u'sectors': u'85983232', u'vendor': u'0x1af4'}}) => { "changed": 
false, "item": { "key": "vda", "value": { "holders": [], "host": "", "links": { "ids": [], "labels": [], "masters": [], "uuids": [] }, "model": null, "partitions": { "vda1": { "holders": [], "links": { "ids": [], "labels": [], "masters": [], "uuids": [] }, "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "start": "2048", "uuid": null }, "vda2": { "holders": [], "links": { "ids": [], "labels": [], "masters": [], "uuids": [ "df1e67ab-46b0-4b49-a861-e7d10736d467" ] }, "sectors": "2097152", "sectorsize": 512, "size": "1.00 GB", "start": "4096", "uuid": "df1e67ab-46b0-4b49-a861-e7d10736d467" }, "vda3": { "holders": [ "VolGroup00-LogVol00", "VolGroup00-LogVol01" ], "links": { "ids": [ "lvm-pv-uuid-YzzC0u-LNYi-agLJ-WvyP-YaDX-BFxo-XDIm7e" ], "labels": [], "masters": [ "dm-0", "dm-1" ], "uuids": [] }, "sectors": "81784832", "sectorsize": 512, "size": "39.00 GB", "start": "2101248", "uuid": null } }, "removable": "0", "rotational": "1", "sas_address": null, "sas_device_handle": null, "scheduler_mode": "", "sectors": "85983232", "sectorsize": "512", "size": "41.00 GB", "support_discard": "0", "vendor": "0x1af4", "virtual": 1 } }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [osd0] => (item={'key': u'sdc', 'value': {u'support_discard': u'512', u'partitions': {u'sdc2': {u'links': {u'labels': [], u'masters': [u'dm-4'], u'uuids': [], u'ids': [u'ata-QEMU_HARDDISK_QM00003-part2', u'lvm-pv-uuid-H1G5rs-q58W-CTQe-fdgX-V6db-PuYI-sJlnw7']}, u'uuid': None, u'holders': [u'journals-journal1'], u'size': u'6.00 GB', u'sectorsize': 512, u'start': u'12582912', u'sectors': u'12580864'}, u'sdc1': {u'links': {u'labels': [], u'masters': [], u'uuids': [], u'ids': [u'ata-QEMU_HARDDISK_QM00003-part1']}, u'uuid': None, u'holders': [], u'size': u'6.00 GB', u'sectorsize': 512, u'start': u'2048', u'sectors': u'12580864'}}, u'holders': [], u'host': u'', u'links': {u'labels': [], u'masters': [], u'uuids': [], u'ids': [u'ata-QEMU_HARDDISK_QM00003']}, u'removable': u'0', 
u'model': u'QEMU HARDDISK', u'size': u'12.00 GB', u'virtual': 1, u'sas_address': None, u'sas_device_handle': None, u'sectorsize': u'512', u'scheduler_mode': u'cfq', u'rotational': u'1', u'sectors': u'25165824', u'vendor': u'ATA'}}) => { "changed": false, "item": { "key": "sdc", "value": { "holders": [], "host": "", "links": { "ids": [ "ata-QEMU_HARDDISK_QM00003" ], "labels": [], "masters": [], "uuids": [] }, "model": "QEMU HARDDISK", "partitions": { "sdc1": { "holders": [], "links": { "ids": [ "ata-QEMU_HARDDISK_QM00003-part1" ], "labels": [], "masters": [], "uuids": [] }, "sectors": "12580864", "sectorsize": 512, "size": "6.00 GB", "start": "2048", "uuid": null }, "sdc2": { "holders": [ "journals-journal1" ], "links": { "ids": [ "ata-QEMU_HARDDISK_QM00003-part2", "lvm-pv-uuid-H1G5rs-q58W-CTQe-fdgX-V6db-PuYI-sJlnw7" ], "labels": [], "masters": [ "dm-4" ], "uuids": [] }, "sectors": "12580864", "sectorsize": 512, "size": "6.00 GB", "start": "12582912", "uuid": null } }, "removable": "0", "rotational": "1", "sas_address": null, "sas_device_handle": null, "scheduler_mode": "cfq", "sectors": "25165824", "sectorsize": "512", "size": "12.00 GB", "support_discard": "512", "vendor": "ATA", "virtual": 1 } }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [osd0] => (item={'key': u'sdb', 'value': {u'support_discard': u'512', u'partitions': {}, u'holders': [u'test_group-data--lv1', u'test_group-data--lv2'], u'host': u'', u'links': {u'labels': [], u'masters': [u'dm-2', u'dm-3'], u'uuids': [], u'ids': [u'ata-QEMU_HARDDISK_QM00002', u'lvm-pv-uuid-4FAB53-0CiD-z5jW-Z9sI-Ng01-8vc1-lg8HJG']}, u'removable': u'0', u'model': u'QEMU HARDDISK', u'size': u'12.00 GB', u'virtual': 1, u'sas_address': None, u'sas_device_handle': None, u'sectorsize': u'512', u'scheduler_mode': u'cfq', u'rotational': u'1', u'sectors': u'25165824', u'vendor': u'ATA'}}) => { "changed": false, "item": { "key": "sdb", "value": { "holders": [ "test_group-data--lv1", "test_group-data--lv2" 
], "host": "", "links": { "ids": [ "ata-QEMU_HARDDISK_QM00002", "lvm-pv-uuid-4FAB53-0CiD-z5jW-Z9sI-Ng01-8vc1-lg8HJG" ], "labels": [], "masters": [ "dm-2", "dm-3" ], "uuids": [] }, "model": "QEMU HARDDISK", "partitions": {}, "removable": "0", "rotational": "1", "sas_address": null, "sas_device_handle": null, "scheduler_mode": "cfq", "sectors": "25165824", "sectorsize": "512", "size": "12.00 GB", "support_discard": "512", "vendor": "ATA", "virtual": 1 } }, "skip_reason": "Conditional result was False", "skipped": true } skipping: [osd0] => (item={'key': u'sda', 'value': {u'support_discard': u'512', u'partitions': {}, u'holders': [], u'host': u'', u'links': {u'labels': [], u'masters': [], u'uuids': [], u'ids': [u'ata-QEMU_HARDDISK_QM00001']}, u'removable': u'0', u'model': u'QEMU HARDDISK', u'size': u'12.00 GB', u'virtual': 1, u'sas_address': None, u'sas_device_handle': None, u'sectorsize': u'512', u'scheduler_mode': u'cfq', u'rotational': u'1', u'sectors': u'25165824', u'vendor': u'ATA'}}) => { "changed": false, "item": { "key": "sda", "value": { "holders": [], "host": "", "links": { "ids": [ "ata-QEMU_HARDDISK_QM00001" ], "labels": [], "masters": [], "uuids": [] }, "model": "QEMU HARDDISK", "partitions": {}, "removable": "0", "rotational": "1", "sas_address": null, "sas_device_handle": null, "scheduler_mode": "cfq", "sectors": "25165824", "sectorsize": "512", "size": "12.00 GB", "support_discard": "512", "vendor": "ATA", "virtual": 1 } }, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : resolve dedicated device link(s)] ***************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/build_devices.yml:15 TASK [ceph-osd : set_fact build dedicated_devices from resolved symlinks] ****** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/build_devices.yml:24 TASK [ceph-osd : set_fact build final dedicated_devices list] 
****************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/build_devices.yml:32 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : check if a partition named 'ceph' exists] ********************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/main.yml:29 TASK [ceph-osd : set config and keys paths] ************************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/copy_configs.yml:2 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : wait for ceph.conf and keys] ********************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/copy_configs.yml:7 TASK [ceph-osd : stat for ceph config and keys] ******************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/copy_configs.yml:14 TASK [ceph-osd : try to copy ceph config and keys] ***************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/copy_configs.yml:25 TASK [ceph-osd : set selinux permissions] ************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/copy_configs.yml:35 skipping: [osd0] => (item=/etc/ceph) => { "changed": false, "item": "/etc/ceph", "skip_reason": "Conditional result was False", "skipped": true } skipping: [osd0] => (item=/var/lib/ceph) => { "changed": false, "item": "/var/lib/ceph", "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : check the partition status of the osd disks] ****************** task path: 
/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_gpt.yml:2 TASK [ceph-osd : create gpt disk label] **************************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/check_gpt.yml:11 TASK [ceph-osd : include scenarios/collocated.yml] ***************************** task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/main.yml:47 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : include scenarios/non-collocated.yml] ************************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/main.yml:54 skipping: [osd0] => { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } TASK [ceph-osd : include scenarios/lvm.yml] ************************************ task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/main.yml:62 included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/scenarios/lvm.yml for osd0 TASK [ceph-osd : use ceph-volume to create bluestore osds] ********************* task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/scenarios/lvm.yml:3 changed: [osd0] => (item={u'data_vg': u'test_group', u'crush_device_class': u'test', u'data': u'data-lv1'}) => { "changed": true, "cmd": [ "ceph-volume", "--cluster", "ceph", "lvm", "create", "--bluestore", "--data", "test_group/data-lv1", "--crush-device-class", "test", "--dmcrypt" ], "delta": "0:00:38.925761", "end": "2018-03-25 23:29:46.840573", "failed": false, "item": { "crush_device_class": "test", "data": "data-lv1", "data_vg": "test_group" }, "rc": 0, "start": "2018-03-25 23:29:07.914812" } STDOUT: Running command: /bin/ceph-authtool --gen-print-key Running command: /bin/ceph-authtool --gen-print-key 
Running command: /bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new f20169d2-fcb4-4b03-b8c5-a4bb422d7be7 Running command: /bin/ceph-authtool --gen-print-key Running command: /usr/sbin/cryptsetup --batch-mode --key-file - luksFormat /dev/test_group/data-lv1 Running command: /usr/sbin/cryptsetup --key-file - luksOpen /dev/test_group/data-lv1 FxIj64-DaiR-1xtD-x2jl-Nyf6-Iit5-bpFfPU Running command: /bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0 Running command: /bin/chown -R ceph:ceph /dev/dm-5 Running command: /bin/ln -s /dev/mapper/FxIj64-DaiR-1xtD-x2jl-Nyf6-Iit5-bpFfPU /var/lib/ceph/osd/ceph-0/block Running command: /bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap stderr: got monmap epoch 1 Running command: /bin/ceph-authtool /var/lib/ceph/osd/ceph-0/keyring --create-keyring --name osd.0 --add-key AQDEMLhaTyzIARAA9zgkkTN9M+qahmmqrhDeCQ== stdout: creating /var/lib/ceph/osd/ceph-0/keyring added entity osd.0 auth auth(auid = 18446744073709551615 key=AQDEMLhaTyzIARAA9zgkkTN9M+qahmmqrhDeCQ== with 0 caps) Running command: /bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring Running command: /bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/ Running command: /bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid f20169d2-fcb4-4b03-b8c5-a4bb422d7be7 --setuser ceph --setgroup ceph --> ceph-volume lvm prepare successful for: test_group/data-lv1 Running command: /bin/ceph-authtool /var/lib/ceph/osd/ceph-0/lockbox.keyring --create-keyring --name client.osd-lockbox.f20169d2-fcb4-4b03-b8c5-a4bb422d7be7 --add-key AQDJMLha2TRXBxAA+31wJuuhWxP1y7iQdjzzJQ== stdout: creating /var/lib/ceph/osd/ceph-0/lockbox.keyring added entity client.osd-lockbox.f20169d2-fcb4-4b03-b8c5-a4bb422d7be7 
auth auth(auid = 18446744073709551615 key=AQDJMLha2TRXBxAA+31wJuuhWxP1y7iQdjzzJQ== with 0 caps) Running command: /bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/lockbox.keyring Running command: /bin/ceph --cluster ceph --name client.osd-lockbox.f20169d2-fcb4-4b03-b8c5-a4bb422d7be7 --keyring /var/lib/ceph/osd/ceph-0/lockbox.keyring config-key get dm-crypt/osd/f20169d2-fcb4-4b03-b8c5-a4bb422d7be7/luks Running command: /usr/sbin/cryptsetup --key-file - luksOpen /dev/test_group/data-lv1 FxIj64-DaiR-1xtD-x2jl-Nyf6-Iit5-bpFfPU stderr: Device FxIj64-DaiR-1xtD-x2jl-Nyf6-Iit5-bpFfPU already exists. Running command: /bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/mapper/FxIj64-DaiR-1xtD-x2jl-Nyf6-Iit5-bpFfPU --path /var/lib/ceph/osd/ceph-0 Running command: /bin/ln -snf /dev/mapper/FxIj64-DaiR-1xtD-x2jl-Nyf6-Iit5-bpFfPU /var/lib/ceph/osd/ceph-0/block Running command: /bin/chown -R ceph:ceph /dev/dm-5 Running command: /bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Running command: /bin/systemctl enable ceph-volume@lvm-0-f20169d2-fcb4-4b03-b8c5-a4bb422d7be7 stderr: Created symlink from /etc/systemd/system/multi-user.target.wants/ceph-volume@lvm-0-f20169d2-fcb4-4b03-b8c5-a4bb422d7be7.service to /usr/lib/systemd/system/ceph-volume@.service. 
Running command: /bin/systemctl start ceph-osd@0 --> ceph-volume lvm activate successful for osd ID: 0 --> ceph-volume lvm create successful for: test_group/data-lv1 changed: [osd0] => (item={u'data_vg': u'test_group', u'db': u'journal1', u'db_vg': u'journals', u'data': u'data-lv2'}) => { "changed": true, "cmd": [ "ceph-volume", "--cluster", "ceph", "lvm", "create", "--bluestore", "--data", "test_group/data-lv2", "--block.db", "journals/journal1", "--dmcrypt" ], "delta": "0:00:41.484166", "end": "2018-03-25 23:30:31.592695", "failed": false, "item": { "data": "data-lv2", "data_vg": "test_group", "db": "journal1", "db_vg": "journals" }, "rc": 0, "start": "2018-03-25 23:29:50.108529" } STDOUT: Running command: /bin/ceph-authtool --gen-print-key Running command: /bin/ceph-authtool --gen-print-key Running command: /bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new d45ce6aa-8a19-42a3-8be4-2832d652cbc5 Running command: /bin/ceph-authtool --gen-print-key Running command: /usr/sbin/cryptsetup --batch-mode --key-file - luksFormat /dev/test_group/data-lv2 Running command: /usr/sbin/cryptsetup --key-file - luksOpen /dev/test_group/data-lv2 uktWUg-OTJe-hT0L-kqJX-q1Qk-8m3g-vmEtoY Running command: /usr/sbin/cryptsetup --batch-mode --key-file - luksFormat /dev/journals/journal1 Running command: /usr/sbin/cryptsetup --key-file - luksOpen /dev/journals/journal1 H7sf8A-FbTy-isn3-J1Xe-0bdT-L9se-0ApudF Running command: /bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-1 Running command: /bin/chown -R ceph:ceph /dev/dm-6 Running command: /bin/ln -s /dev/mapper/uktWUg-OTJe-hT0L-kqJX-q1Qk-8m3g-vmEtoY /var/lib/ceph/osd/ceph-1/block Running command: /bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-1/activate.monmap stderr: got monmap epoch 1 Running command: /bin/ceph-authtool /var/lib/ceph/osd/ceph-1/keyring --create-keyring --name 
osd.1 --add-key AQDuMLhaM1A1DRAAJvexOz8zvevay+tMktd2qA== stdout: creating /var/lib/ceph/osd/ceph-1/keyring stdout: added entity osd.1 auth auth(auid = 18446744073709551615 key=AQDuMLhaM1A1DRAAJvexOz8zvevay+tMktd2qA== with 0 caps) Running command: /bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/keyring Running command: /bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/ Running command: /bin/chown -R ceph:ceph /dev/dm-7 Running command: /bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 1 --monmap /var/lib/ceph/osd/ceph-1/activate.monmap --keyfile - --bluestore-block-db-path /dev/mapper/H7sf8A-FbTy-isn3-J1Xe-0bdT-L9se-0ApudF --osd-data /var/lib/ceph/osd/ceph-1/ --osd-uuid d45ce6aa-8a19-42a3-8be4-2832d652cbc5 --setuser ceph --setgroup ceph stderr: 2018-03-25 23:30:25.082 7f38b9406040 -1 bluestore(/var/lib/ceph/osd/ceph-1/) _read_fsid unparsable uuid stderr: /home/jenkins-build/build/workspace/ceph-dev-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos7/DIST/centos7/MACHINE_SIZE/huge/release/13.0.1-3240-g235f211/rpm/el7/BUILD/ceph-13.0.1-3240-g235f211/src/os/bluestore/StupidAllocator.cc: In function 'virtual void StupidAllocator::init_rm_free(uint64_t, uint64_t)' thread 7f38b9406040 time 2018-03-25 23:30:25.339214 stderr: /home/jenkins-build/build/workspace/ceph-dev-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos7/DIST/centos7/MACHINE_SIZE/huge/release/13.0.1-3240-g235f211/rpm/el7/BUILD/ceph-13.0.1-3240-g235f211/src/os/bluestore/StupidAllocator.cc: 336: FAILED assert(rm.empty()) stderr: ceph version 13.0.1-3240-g235f211 (235f2119010484c12c5bd29421aeef7d44df38a1) mimic (dev) stderr: 1: (ceph::__ceph_assert_fail(char const*, char const*, int, char const*)+0xff) [0x7f38b0847c8f] stderr: 2: (()+0x278e77) [0x7f38b0847e77] stderr: 3: (StupidAllocator::init_rm_free(unsigned long, unsigned long)+0x20c2) [0x563e13931a22] stderr: 4: (BlueFS::mount()+0x222) [0x563e13911172] stderr: 5: (BlueStore::_open_db(bool, bool)+0x1531) 
[0x563e138388d1] stderr: 6: (BlueStore::mkfs()+0x699) [0x563e1386e429] stderr: 7: (OSD::mkfs(CephContext*, ObjectStore*, std::string const&, uuid_d, int)+0x177) [0x563e134391a7] stderr: 8: (main()+0x29fe) [0x563e1331324e] stderr: 9: (__libc_start_main()+0xf5) [0x7f38ace3ec05] stderr: 10: (()+0x37ea80) [0x563e133efa80] stderr: NOTE: a copy of the executable, or `objdump -rdS <executable>` is needed to interpret this. stderr: 2018-03-25 23:30:25.343 7f38b9406040 -1 /home/jenkins-build/build/workspace/ceph-dev-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos7/DIST/centos7/MACHINE_SIZE/huge/release/13.0.1-3240-g235f211/rpm/el7/BUILD/ceph-13.0.1-3240-g235f211/src/os/bluestore/StupidAllocator.cc: In function 'virtual void StupidAllocator::init_rm_free(uint64_t, uint64_t)' thread 7f38b9406040 time 2018-03-25 23:30:25.339214 stderr: /home/jenkins-build/build/workspace/ceph-dev-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos7/DIST/centos7/MACHINE_SIZE/huge/release/13.0.1-3240-g235f211/rpm/el7/BUILD/ceph-13.0.1-3240-g235f211/src/os/bluestore/StupidAllocator.cc: 336: FAILED assert(rm.empty()) stderr: ceph version 13.0.1-3240-g235f211 (235f2119010484c12c5bd29421aeef7d44df38a1) mimic (dev) stderr: 1: (ceph::__ceph_assert_fail(char const*, char const*, int, char const*)+0xff) [0x7f38b0847c8f] stderr: 2: (()+0x278e77) [0x7f38b0847e77] stderr: 3: (StupidAllocator::init_rm_free(unsigned long, unsigned long)+0x20c2) [0x563e13931a22] stderr: 4: (BlueFS::mount()+0x222) [0x563e13911172] stderr: 5: (BlueStore::_open_db(bool, bool)+0x1531) [0x563e138388d1] stderr: 6: (BlueStore::mkfs()+0x699) [0x563e1386e429] stderr: 7: (OSD::mkfs(CephContext*, ObjectStore*, std::string const&, uuid_d, int)+0x177) [0x563e134391a7] stderr: 8: (main()+0x29fe) [0x563e1331324e] stderr: 9: (__libc_start_main()+0xf5) [0x7f38ace3ec05] stderr: 10: (()+0x37ea80) [0x563e133efa80] stderr: NOTE: a copy of the executable, or `objdump -rdS <executable>` is needed to interpret this. 
stderr: -21> 2018-03-25 23:30:25.082 7f38b9406040 -1 bluestore(/var/lib/ceph/osd/ceph-1/) _read_fsid unparsable uuid stderr: 0> 2018-03-25 23:30:25.343 7f38b9406040 -1 /home/jenkins-build/build/workspace/ceph-dev-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos7/DIST/centos7/MACHINE_SIZE/huge/release/13.0.1-3240-g235f211/rpm/el7/BUILD/ceph-13.0.1-3240-g235f211/src/os/bluestore/StupidAllocator.cc: In function 'virtual void StupidAllocator::init_rm_free(uint64_t, uint64_t)' thread 7f38b9406040 time 2018-03-25 23:30:25.339214 stderr: /home/jenkins-build/build/workspace/ceph-dev-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos7/DIST/centos7/MACHINE_SIZE/huge/release/13.0.1-3240-g235f211/rpm/el7/BUILD/ceph-13.0.1-3240-g235f211/src/os/bluestore/StupidAllocator.cc: 336: FAILED assert(rm.empty()) stderr: ceph version 13.0.1-3240-g235f211 (235f2119010484c12c5bd29421aeef7d44df38a1) mimic (dev) stderr: 1: (ceph::__ceph_assert_fail(char const*, char const*, int, char const*)+0xff) [0x7f38b0847c8f] stderr: 2: (()+0x278e77) [0x7f38b0847e77] stderr: 3: (StupidAllocator::init_rm_free(unsigned long, unsigned long)+0x20c2) [0x563e13931a22] stderr: 4: (BlueFS::mount()+0x222) [0x563e13911172] stderr: 5: (BlueStore::_open_db(bool, bool)+0x1531) [0x563e138388d1] stderr: 6: (BlueStore::mkfs()+0x699) [0x563e1386e429] stderr: 7: (OSD::mkfs(CephContext*, ObjectStore*, std::string const&, uuid_d, int)+0x177) [0x563e134391a7] stderr: 8: (main()+0x29fe) [0x563e1331324e] stderr: 9: (__libc_start_main()+0xf5) [0x7f38ace3ec05] stderr: 10: (()+0x37ea80) [0x563e133efa80] stderr: NOTE: a copy of the executable, or `objdump -rdS <executable>` is needed to interpret this. 
stderr: *** Caught signal (Aborted) ** stderr: in thread 7f38b9406040 thread_name:ceph-osd stderr: ceph version 13.0.1-3240-g235f211 (235f2119010484c12c5bd29421aeef7d44df38a1) mimic (dev) stderr: 1: (()+0x8e6e00) [0x563e13957e00] stderr: 2: (()+0xf5e0) [0x7f38ade295e0] stderr: 3: (gsignal()+0x37) [0x7f38ace521f7] stderr: 4: (abort()+0x148) [0x7f38ace538e8] stderr: 5: (ceph::__ceph_assert_fail(char const*, char const*, int, char const*)+0x25d) [0x7f38b0847ded] stderr: 6: (()+0x278e77) [0x7f38b0847e77] stderr: 7: (StupidAllocator::init_rm_free(unsigned long, unsigned long)+0x20c2) [0x563e13931a22] stderr: 8: (BlueFS::mount()+0x222) [0x563e13911172] stderr: 9: (BlueStore::_open_db(bool, bool)+0x1531) [0x563e138388d1] stderr: 10: (BlueStore::mkfs()+0x699) [0x563e1386e429] stderr: 11: (OSD::mkfs(CephContext*, ObjectStore*, std::string const&, uuid_d, int)+0x177) [0x563e134391a7] stderr: 12: (main()+0x29fe) [0x563e1331324e] stderr: 13: (__libc_start_main()+0xf5) [0x7f38ace3ec05] stderr: 14: (()+0x37ea80) [0x563e133efa80] stderr: 2018-03-25 23:30:25.349 7f38b9406040 -1 *** Caught signal (Aborted) ** stderr: in thread 7f38b9406040 thread_name:ceph-osd stderr: ceph version 13.0.1-3240-g235f211 (235f2119010484c12c5bd29421aeef7d44df38a1) mimic (dev) stderr: 1: (()+0x8e6e00) [0x563e13957e00] stderr: 2: (()+0xf5e0) [0x7f38ade295e0] stderr: 3: (gsignal()+0x37) [0x7f38ace521f7] stderr: 4: (abort()+0x148) [0x7f38ace538e8] stderr: 5: (ceph::__ceph_assert_fail(char const*, char const*, int, char const*)+0x25d) [0x7f38b0847ded] stderr: 6: (()+0x278e77) [0x7f38b0847e77] stderr: 7: (StupidAllocator::init_rm_free(unsigned long, unsigned long)+0x20c2) [0x563e13931a22] stderr: 8: (BlueFS::mount()+0x222) [0x563e13911172] stderr: 9: (BlueStore::_open_db(bool, bool)+0x1531) [0x563e138388d1] stderr: 10: (BlueStore::mkfs()+0x699) [0x563e1386e429] stderr: 11: (OSD::mkfs(CephContext*, ObjectStore*, std::string const&, uuid_d, int)+0x177) [0x563e134391a7] stderr: 12: (main()+0x29fe) 
[0x563e1331324e] stderr: 13: (__libc_start_main()+0xf5) [0x7f38ace3ec05] stderr: 14: (()+0x37ea80) [0x563e133efa80] stderr: NOTE: a copy of the executable, or `objdump -rdS <executable>` is needed to interpret this. stderr: 0> 2018-03-25 23:30:25.349 7f38b9406040 -1 *** Caught signal (Aborted) ** stderr: in thread 7f38b9406040 thread_name:ceph-osd stderr: ceph version 13.0.1-3240-g235f211 (235f2119010484c12c5bd29421aeef7d44df38a1) mimic (dev) stderr: 1: (()+0x8e6e00) [0x563e13957e00] stderr: 2: (()+0xf5e0) [0x7f38ade295e0] stderr: 3: (gsignal()+0x37) [0x7f38ace521f7] stderr: 4: (abort()+0x148) [0x7f38ace538e8] stderr: 5: (ceph::__ceph_assert_fail(char const*, char const*, int, char const*)+0x25d) [0x7f38b0847ded] stderr: 6: (()+0x278e77) [0x7f38b0847e77] stderr: 7: (StupidAllocator::init_rm_free(unsigned long, unsigned long)+0x20c2) [0x563e13931a22] stderr: 8: (BlueFS::mount()+0x222) [0x563e13911172] stderr: 9: (BlueStore::_open_db(bool, bool)+0x1531) [0x563e138388d1] stderr: 10: (BlueStore::mkfs()+0x699) [0x563e1386e429] stderr: 11: (OSD::mkfs(CephContext*, ObjectStore*, std::string const&, uuid_d, int)+0x177) [0x563e134391a7] stderr: 12: (main()+0x29fe) [0x563e1331324e] stderr: 13: (__libc_start_main()+0xf5) [0x7f38ace3ec05] stderr: 14: (()+0x37ea80) [0x563e133efa80] stderr: NOTE: a copy of the executable, or `objdump -rdS <executable>` is needed to interpret this. 
--> ceph-volume lvm prepare successful for: test_group/data-lv2 Running command: /bin/ceph-authtool /var/lib/ceph/osd/ceph-1/lockbox.keyring --create-keyring --name client.osd-lockbox.d45ce6aa-8a19-42a3-8be4-2832d652cbc5 --add-key AQDzMLha7ym6EhAAvBP/6nXBOhX3iOLwPI35ww== stdout: creating /var/lib/ceph/osd/ceph-1/lockbox.keyring added entity client.osd-lockbox.d45ce6aa-8a19-42a3-8be4-2832d652cbc5 auth auth(auid = 18446744073709551615 key=AQDzMLha7ym6EhAAvBP/6nXBOhX3iOLwPI35ww== with 0 caps) Running command: /bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/lockbox.keyring Running command: /bin/ceph --cluster ceph --name client.osd-lockbox.d45ce6aa-8a19-42a3-8be4-2832d652cbc5 --keyring /var/lib/ceph/osd/ceph-1/lockbox.keyring config-key get dm-crypt/osd/d45ce6aa-8a19-42a3-8be4-2832d652cbc5/luks Running command: /usr/sbin/cryptsetup --key-file - luksOpen /dev/test_group/data-lv2 uktWUg-OTJe-hT0L-kqJX-q1Qk-8m3g-vmEtoY stderr: Device uktWUg-OTJe-hT0L-kqJX-q1Qk-8m3g-vmEtoY already exists. Running command: /usr/sbin/cryptsetup --key-file - luksOpen /dev/journals/journal1 H7sf8A-FbTy-isn3-J1Xe-0bdT-L9se-0ApudF stderr: Device H7sf8A-FbTy-isn3-J1Xe-0bdT-L9se-0ApudF already exists. 
Running command: /bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/mapper/uktWUg-OTJe-hT0L-kqJX-q1Qk-8m3g-vmEtoY --path /var/lib/ceph/osd/ceph-1 Running command: /bin/ln -snf /dev/mapper/uktWUg-OTJe-hT0L-kqJX-q1Qk-8m3g-vmEtoY /var/lib/ceph/osd/ceph-1/block Running command: /bin/chown -R ceph:ceph /dev/dm-6 Running command: /bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 Running command: /bin/ln -snf /dev/mapper/H7sf8A-FbTy-isn3-J1Xe-0bdT-L9se-0ApudF /var/lib/ceph/osd/ceph-1/block.db Running command: /bin/chown -R ceph:ceph /dev/dm-7 Running command: /bin/systemctl enable ceph-volume@lvm-1-d45ce6aa-8a19-42a3-8be4-2832d652cbc5 stderr: Created symlink from /etc/systemd/system/multi-user.target.wants/ceph-volume@lvm-1-d45ce6aa-8a19-42a3-8be4-2832d652cbc5.service to /usr/lib/systemd/system/ceph-volume@.service. Running command: /bin/systemctl start ceph-osd@1 --> ceph-volume lvm activate successful for osd ID: 1 --> ceph-volume lvm create successful for: test_group/data-lv2 changed: [osd0] => (item={u'data': u'/dev/sdd1'}) => { "changed": true, "cmd": [ "ceph-volume", "--cluster", "ceph", "lvm", "create", "--bluestore", "--data", "/dev/sdd1", "--dmcrypt" ], "delta": "0:00:39.207612", "end": "2018-03-25 23:31:14.741986", "failed": false, "item": { "data": "/dev/sdd1" }, "rc": 0, "start": "2018-03-25 23:30:35.534374" } STDOUT: Running command: /bin/ceph-authtool --gen-print-key Running command: /bin/ceph-authtool --gen-print-key Running command: /bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new f7a7a109-6fe2-4bd0-89c0-70aea7b19d72 Running command: /usr/sbin/vgcreate --force --yes ceph-fad7031c-4bd3-4328-9716-4fbddc9c14b8 /dev/sdd1 stdout: Physical volume "/dev/sdd1" successfully created. 
 stdout: Volume group "ceph-fad7031c-4bd3-4328-9716-4fbddc9c14b8" successfully created
Running command: /usr/sbin/lvcreate --yes -l 100%FREE -n osd-block-f7a7a109-6fe2-4bd0-89c0-70aea7b19d72 ceph-fad7031c-4bd3-4328-9716-4fbddc9c14b8
 stdout: Logical volume "osd-block-f7a7a109-6fe2-4bd0-89c0-70aea7b19d72" created.
Running command: /bin/ceph-authtool --gen-print-key
Running command: /usr/sbin/cryptsetup --batch-mode --key-file - luksFormat /dev/ceph-fad7031c-4bd3-4328-9716-4fbddc9c14b8/osd-block-f7a7a109-6fe2-4bd0-89c0-70aea7b19d72
Running command: /usr/sbin/cryptsetup --key-file - luksOpen /dev/ceph-fad7031c-4bd3-4328-9716-4fbddc9c14b8/osd-block-f7a7a109-6fe2-4bd0-89c0-70aea7b19d72 3SpEPp-e8Ww-3Z8F-KSZC-Cwj9-D1G4-hgz1hs
Running command: /bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Running command: /bin/chown -R ceph:ceph /dev/dm-9
Running command: /bin/ln -s /dev/mapper/3SpEPp-e8Ww-3Z8F-KSZC-Cwj9-D1G4-hgz1hs /var/lib/ceph/osd/ceph-2/block
Running command: /bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
 stderr: got monmap epoch 1
Running command: /bin/ceph-authtool /var/lib/ceph/osd/ceph-2/keyring --create-keyring --name osd.2 --add-key AQAbMbhaYECeJhAAT2w75UwUOwmT5nj8Hg9omg==
 stdout: creating /var/lib/ceph/osd/ceph-2/keyring
added entity osd.2 auth auth(auid = 18446744073709551615 key=AQAbMbhaYECeJhAAT2w75UwUOwmT5nj8Hg9omg== with 0 caps)
Running command: /bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Running command: /bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Running command: /bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid f7a7a109-6fe2-4bd0-89c0-70aea7b19d72 --setuser ceph --setgroup ceph
--> ceph-volume lvm prepare successful for: /dev/sdd1
Running command: /bin/ceph-authtool /var/lib/ceph/osd/ceph-2/lockbox.keyring --create-keyring --name client.osd-lockbox.f7a7a109-6fe2-4bd0-89c0-70aea7b19d72 --add-key AQAgMbhaNzlJLBAARg8Romn2JMByiDpbKPP1nw==
 stdout: creating /var/lib/ceph/osd/ceph-2/lockbox.keyring
added entity client.osd-lockbox.f7a7a109-6fe2-4bd0-89c0-70aea7b19d72 auth auth(auid = 18446744073709551615 key=AQAgMbhaNzlJLBAARg8Romn2JMByiDpbKPP1nw== with 0 caps)
Running command: /bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/lockbox.keyring
Running command: /bin/ceph --cluster ceph --name client.osd-lockbox.f7a7a109-6fe2-4bd0-89c0-70aea7b19d72 --keyring /var/lib/ceph/osd/ceph-2/lockbox.keyring config-key get dm-crypt/osd/f7a7a109-6fe2-4bd0-89c0-70aea7b19d72/luks
Running command: /usr/sbin/cryptsetup --key-file - luksOpen /dev/ceph-fad7031c-4bd3-4328-9716-4fbddc9c14b8/osd-block-f7a7a109-6fe2-4bd0-89c0-70aea7b19d72 3SpEPp-e8Ww-3Z8F-KSZC-Cwj9-D1G4-hgz1hs
 stderr: Device 3SpEPp-e8Ww-3Z8F-KSZC-Cwj9-D1G4-hgz1hs already exists.
Running command: /bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/mapper/3SpEPp-e8Ww-3Z8F-KSZC-Cwj9-D1G4-hgz1hs --path /var/lib/ceph/osd/ceph-2
Running command: /bin/ln -snf /dev/mapper/3SpEPp-e8Ww-3Z8F-KSZC-Cwj9-D1G4-hgz1hs /var/lib/ceph/osd/ceph-2/block
Running command: /bin/chown -R ceph:ceph /dev/dm-9
Running command: /bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Running command: /bin/systemctl enable ceph-volume@lvm-2-f7a7a109-6fe2-4bd0-89c0-70aea7b19d72
 stderr: Created symlink from /etc/systemd/system/multi-user.target.wants/ceph-volume@lvm-2-f7a7a109-6fe2-4bd0-89c0-70aea7b19d72.service to /usr/lib/systemd/system/ceph-volume@.service.
Running command: /bin/systemctl start ceph-osd@2
--> ceph-volume lvm activate successful for osd ID: 2
--> ceph-volume lvm create successful for: /dev/sdd1

TASK [ceph-osd : include activate_osds.yml] ************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/main.yml:70
included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/activate_osds.yml for osd0

TASK [ceph-osd : activate osd(s) when device is a disk] ************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/activate_osds.yml:5

TASK [ceph-osd : activate osd(s) when device is a disk (dmcrypt)] **************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/activate_osds.yml:16

TASK [ceph-osd : set_fact combined_activate_osd_disk_results] ******************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/activate_osds.yml:29
ok: [osd0] => {
    "ansible_facts": {
        "combined_activate_osd_disk_results": {
            "changed": false,
            "results": [],
            "skipped": true,
            "skipped_reason": "No items in the list"
        }
    },
    "changed": false,
    "failed": false
}

TASK [ceph-osd : fail if ceph-disk cannot create an OSD] ***********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/activate_osds.yml:33

TASK [ceph-osd : include start_osds.yml] ***************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/main.yml:77
included: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/start_osds.yml for osd0

TASK [ceph-osd : get osd id] ***************************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/start_osds.yml:2
skipping: [osd0] => {
    "changed": false,
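The log above condenses the full `ceph-volume lvm create --bluestore --dmcrypt` flow for one OSD. A minimal sketch of that per-OSD sequence follows; all names (`DEV`, `FSID`, VG/LV names, `OSD_ID`) are hypothetical placeholders, and `run` only echoes each command so the sketch is safe to execute and does not touch real devices:

```shell
#!/bin/sh
# Sketch (not the real implementation) of the dmcrypt+bluestore steps
# ceph-volume logs above. `run` echoes instead of executing.
run() { printf '+ %s\n' "$*"; }   # swap for: eval "$@"  on a real system

DEV=/dev/sdX1                                    # hypothetical raw partition
FSID=00000000-0000-0000-0000-000000000000        # hypothetical OSD fsid
VG=ceph-$FSID
LV=osd-block-$FSID
OSD_ID=2

run vgcreate --force --yes "$VG" "$DEV"          # PV + VG on the partition
run lvcreate --yes -l 100%FREE -n "$LV" "$VG"    # one LV for the bluestore block
run cryptsetup --batch-mode --key-file - luksFormat "/dev/$VG/$LV"
run cryptsetup --key-file - luksOpen "/dev/$VG/$LV" "$LV"   # dm-crypt mapping
run mount -t tmpfs tmpfs "/var/lib/ceph/osd/ceph-$OSD_ID"   # OSD dir in tmpfs
run ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i "$OSD_ID" --osd-uuid "$FSID"
run ceph-bluestore-tool prime-osd-dir --dev "/dev/mapper/$LV" --path "/var/lib/ceph/osd/ceph-$OSD_ID"
run systemctl enable "ceph-volume@lvm-$OSD_ID-$FSID"        # re-activate on boot
run systemctl start "ceph-osd@$OSD_ID"
```

Note the activation detail visible in the log: the second `luksOpen` during activate reports "Device ... already exists" on stderr because prepare already opened the mapping, and ceph-volume treats that as benign.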
    "skip_reason": "Conditional result was False",
    "skipped": true
}

TASK [ceph-osd : ensure systemd service override directory exists] *************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/start_osds.yml:14
skipping: [osd0] => {
    "changed": false,
    "skip_reason": "Conditional result was False",
    "skipped": true
}

TASK [ceph-osd : add ceph-osd systemd service overrides] ***********************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/start_osds.yml:22
skipping: [osd0] => {
    "changed": false,
    "skip_reason": "Conditional result was False",
    "skipped": true
}

TASK [ceph-osd : ensure osd daemons are started] *******************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/start_osds.yml:32

TASK [ceph-osd : include docker/main.yml] **************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/roles/ceph-osd/tasks/main.yml:84
skipping: [osd0] => {
    "changed": false,
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : set _mon_handler_called before restart] *******
ok: [osd0] => {
    "ansible_facts": {
        "_mon_handler_called": true
    },
    "changed": false,
    "failed": false
}

RUNNING HANDLER [ceph-defaults : copy mon restart script] **********************
skipping: [osd0] => {
    "changed": false,
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : restart ceph mon daemon(s) - non container] ***
skipping: [osd0] => (item=mon0) => {
    "changed": false,
    "item": "mon0",
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : restart ceph mon daemon(s) - container] *******
skipping: [osd0] => (item=mon0) => {
    "changed": false,
    "item": "mon0",
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : set _mon_handler_called after restart] ********
ok: [osd0] => {
    "ansible_facts": {
        "_mon_handler_called": false
    },
    "changed": false,
    "failed": false
}

RUNNING HANDLER [ceph-defaults : set _osd_handler_called before restart] *******
ok: [osd0] => {
    "ansible_facts": {
        "_osd_handler_called": true
    },
    "changed": false,
    "failed": false
}

RUNNING HANDLER [ceph-defaults : copy osd restart script] **********************
changed: [osd0] => {
    "changed": true,
    "checksum": "1e0cef0a45ded980f0823af51755e6153be19cc7",
    "dest": "/tmp/restart_osd_daemon.sh",
    "failed": false,
    "gid": 0,
    "group": "root",
    "md5sum": "106d0b8a339451202f8ca87b368e97bc",
    "mode": "0750",
    "owner": "root",
    "secontext": "unconfined_u:object_r:user_home_t:s0",
    "size": 2790,
    "src": "/home/vagrant/.ansible/tmp/ansible-tmp-1522020676.83-178243302703101/source",
    "state": "file",
    "uid": 0
}

RUNNING HANDLER [ceph-defaults : restart ceph osds daemon(s) - non container] ***
skipping: [osd0] => (item=osd0) => {
    "changed": false,
    "item": "osd0",
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : restart ceph osds daemon(s) - container] ******
skipping: [osd0] => (item=osd0) => {
    "changed": false,
    "item": "osd0",
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : set _osd_handler_called after restart] ********
ok: [osd0] => {
    "ansible_facts": {
        "_osd_handler_called": false
    },
    "changed": false,
    "failed": false
}

RUNNING HANDLER [ceph-defaults : set _mds_handler_called before restart] *******
ok: [osd0] => {
    "ansible_facts": {
        "_mds_handler_called": true
    },
    "changed": false,
    "failed": false
}

RUNNING HANDLER [ceph-defaults : copy mds restart script] **********************
skipping: [osd0] => {
    "changed": false,
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : restart ceph mds daemon(s) - non container] ***
skipping: [osd0] => {
    "changed": false,
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : restart ceph mds daemon(s) - container] *******
skipping: [osd0] => {
    "changed": false,
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : set _mds_handler_called after restart] ********
ok: [osd0] => {
    "ansible_facts": {
        "_mds_handler_called": false
    },
    "changed": false,
    "failed": false
}

RUNNING HANDLER [ceph-defaults : set _rgw_handler_called before restart] *******
ok: [osd0] => {
    "ansible_facts": {
        "_rgw_handler_called": true
    },
    "changed": false,
    "failed": false
}

RUNNING HANDLER [ceph-defaults : copy rgw restart script] **********************
skipping: [osd0] => {
    "changed": false,
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : restart ceph rgw daemon(s) - non container] ***
skipping: [osd0] => {
    "changed": false,
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : restart ceph rgw daemon(s) - container] *******
skipping: [osd0] => {
    "changed": false,
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : set _rgw_handler_called after restart] ********
ok: [osd0] => {
    "ansible_facts": {
        "_rgw_handler_called": false
    },
    "changed": false,
    "failed": false
}

RUNNING HANDLER [ceph-defaults : set _rbdmirror_handler_called before restart] ***
ok: [osd0] => {
    "ansible_facts": {
        "_rbdmirror_handler_called": true
    },
    "changed": false,
    "failed": false
}

RUNNING HANDLER [ceph-defaults : copy rbd mirror restart script] ***************
skipping: [osd0] => {
    "changed": false,
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : restart ceph rbd mirror daemon(s) - non container] ***
skipping: [osd0] => {
    "changed": false,
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : restart ceph rbd mirror daemon(s) - container] ***
skipping: [osd0] => {
    "changed": false,
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : set _rbdmirror_handler_called after restart] ***
ok: [osd0] => {
    "ansible_facts": {
        "_rbdmirror_handler_called": false
    },
    "changed": false,
    "failed": false
}

RUNNING HANDLER [ceph-defaults : set _mgr_handler_called before restart] *******
ok: [osd0] => {
    "ansible_facts": {
        "_mgr_handler_called": true
    },
    "changed": false,
    "failed": false
}

RUNNING HANDLER [ceph-defaults : copy mgr restart script] **********************
skipping: [osd0] => {
    "changed": false,
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : restart ceph mgr daemon(s) - non container] ***
skipping: [osd0] => {
    "changed": false,
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : restart ceph mgr daemon(s) - container] *******
skipping: [osd0] => {
    "changed": false,
    "skip_reason": "Conditional result was False",
    "skipped": true
}

RUNNING HANDLER [ceph-defaults : set _mgr_handler_called after restart] ********
ok: [osd0] => {
    "ansible_facts": {
        "_mgr_handler_called": false
    },
    "changed": false,
    "failed": false
}
META: ran handlers

TASK [set ceph osd install 'Complete'] *****************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/site.yml.sample:163
ok: [osd0] => {
    "ansible_stats": {
        "aggregate": true,
        "data": {
            "installer_phase_ceph_osd": {
                "end": "20180326013123Z",
                "status": "Complete"
            }
        },
        "per_host": false
    },
    "changed": false,
    "failed": false
}
META: ran handlers

PLAY [mdss] ********************************************************************
skipping: no hosts matched

PLAY [rgws] ********************************************************************
skipping: no hosts matched

PLAY [nfss] ********************************************************************
skipping: no hosts matched

PLAY [restapis] ****************************************************************
skipping: no hosts matched

PLAY [rbdmirrors] **************************************************************
skipping: no hosts matched

PLAY [clients] *****************************************************************
skipping: no hosts matched

PLAY [iscsi-gws] ***************************************************************
skipping: no hosts matched

PLAY RECAP *********************************************************************
mon0                       : ok=75   changed=20   unreachable=0    failed=0
osd0                       : ok=75   changed=18   unreachable=0    failed=0

centos7-bluestore-dmcrypt runtests: commands[6] | ansible-playbook -vv -i /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/hosts /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/setup.yml
/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt$ /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/ansible-playbook -vv -i /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/hosts /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/setup.yml
ansible-playbook 2.4.1.0
  config file = None
  configured module search path = [u'/home/jenkins-build/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/lib/python2.7/site-packages/ansible
  executable location = /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/ansible-playbook
  python version = 2.7.5 (default, Aug 4 2017, 00:39:18) [GCC 4.8.5 20150623 (Red Hat 4.8.5-16)]
No config file found; using defaults

PLAYBOOK: setup.yml
************************************************************
1 plays in /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/setup.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
ok: [mon0]
ok: [osd0]
META: ran handlers

TASK [check if it is Atomic host] **********************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/setup.yml:8
ok: [osd0] => {
    "changed": false,
    "failed": false,
    "stat": {
        "exists": false
    }
}
ok: [mon0] => {
    "changed": false,
    "failed": false,
    "stat": {
        "exists": false
    }
}

TASK [set fact for using Atomic host] ******************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/setup.yml:13
ok: [osd0] => {
    "ansible_facts": {
        "is_atomic": false
    },
    "changed": false,
    "failed": false
}
ok: [mon0] => {
    "ansible_facts": {
        "is_atomic": false
    },
    "changed": false,
    "failed": false
}

TASK [install net-tools] *******************************************************
task path: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/setup.yml:19
changed: [mon0] => {
    "changed": true,
    "failed": false,
    "rc": 0,
    "results": [
        "Loaded plugins: fastestmirror\nLoading mirror speeds from cached hostfile\n * base: centos.mirror.ate.info\n * epel: mirror.switch.ch\n * extras: mirror.guru\n * updates: mirror.guru\nResolving Dependencies\n--> Running transaction check\n---> Package net-tools.x86_64 0:2.0-0.22.20131004git.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository Size\n================================================================================\nInstalling:\n net-tools x86_64 2.0-0.22.20131004git.el7 base 305 k\n\nTransaction Summary\n================================================================================\nInstall 1 Package\n\nTotal download size: 305 k\nInstalled size: 917 k\nDownloading packages:\nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : net-tools-2.0-0.22.20131004git.el7.x86_64 1/1 \n Verifying : net-tools-2.0-0.22.20131004git.el7.x86_64 1/1 \n\nInstalled:\n net-tools.x86_64 0:2.0-0.22.20131004git.el7 \n\nComplete!\n"
    ]
}
changed: [osd0] => {
    "changed": true,
    "failed": false,
    "rc": 0,
    "results": [
        "Loaded plugins: fastestmirror\nLoading mirror speeds from cached hostfile\n * base: centos.mirrors.ovh.net\n * epel: mirror.23media.de\n * extras: mirrors.standaloneinstaller.com\n * updates: centos.mirrors.ovh.net\nResolving Dependencies\n--> Running transaction check\n---> Package net-tools.x86_64 0:2.0-0.22.20131004git.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository Size\n================================================================================\nInstalling:\n net-tools x86_64 2.0-0.22.20131004git.el7 base 305 k\n\nTransaction Summary\n================================================================================\nInstall 1 Package\n\nTotal download size: 305 k\nInstalled size: 917 k\nDownloading packages:\nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : net-tools-2.0-0.22.20131004git.el7.x86_64 1/1 \n Verifying : net-tools-2.0-0.22.20131004git.el7.x86_64 1/1 \n\nInstalled:\n net-tools.x86_64 0:2.0-0.22.20131004git.el7 \n\nComplete!\n"
    ]
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
mon0                       : ok=4    changed=1    unreachable=0    failed=0
osd0                       : ok=4    changed=1    unreachable=0    failed=0
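The tox environment driving this job chains three phases per scenario: deploy the cluster with ceph-ansible, run a small setup playbook, then assert on the result with testinfra. A condensed, echo-guarded sketch of that sequence (nothing is executed; `WORKDIR` and the shortened inventory path are hypothetical placeholders reconstructed from the log):

```shell
#!/bin/sh
# Condensed view of the centos7-bluestore-dmcrypt tox scenario's command chain.
# echo-guarded: this prints the commands instead of running them.
SCENARIO=centos7-bluestore-dmcrypt
INVENTORY=tests/functional/lvm/centos7/bluestore/dmcrypt/hosts   # placeholder path
WORKDIR=/tmp/tox.XXXXXXXXXX/$SCENARIO                            # placeholder workdir

# Phase 1+2: ceph-ansible deploys the cluster, then test prerequisites
echo ansible-playbook -vv -i "$INVENTORY" "$WORKDIR/tmp/ceph-ansible/site.yml.sample"
echo ansible-playbook -vv -i "$INVENTORY" "$WORKDIR/tmp/ceph-ansible/tests/functional/setup.yml"
# Phase 3: testinfra runs the functional assertions over the ansible connection,
# 4 workers in parallel, against every host in the inventory
echo testinfra -n 4 --sudo -v --connection=ansible \
     --ansible-inventory="$INVENTORY" \
     "$WORKDIR/tmp/ceph-ansible/tests/functional/tests"
```

The `--connection=ansible` choice means each pytest assertion (file exists, service running, port listening) is executed remotely on mon0/osd0 via the same inventory the playbooks used.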
centos7-bluestore-dmcrypt runtests: commands[7] | testinfra -n 4 --sudo -v --connection=ansible --ansible-inventory=/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/hosts /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests
/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt$ /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/testinfra -n 4 --sudo -v --connection=ansible --ansible-inventory=/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/hosts /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests

============================= test session starts ==============================
platform linux2 -- Python 2.7.5, pytest-3.5.0, py-1.5.3, pluggy-0.6.0 -- /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/python
cachedir: ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/.pytest_cache
rootdir: /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests, inifile: pytest.ini
plugins: testinfra-1.7.1, xdist-1.22.2, forked-0.2
gw0 I / gw1 I / gw2 I / gw3 I
[gw0] linux2 Python 2.7.5 cwd: /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt
[gw1] linux2 Python 2.7.5 cwd: /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt
[gw2] linux2 Python 2.7.5 cwd: /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt
[gw3] linux2 Python 2.7.5 cwd: /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt
[gw1] Python 2.7.5 (default, Aug 4 2017, 00:39:18) -- [GCC 4.8.5 20150623 (Red Hat 4.8.5-16)]
[gw2] Python 2.7.5 (default, Aug 4 2017, 00:39:18) -- [GCC 4.8.5 20150623 (Red Hat 4.8.5-16)]
[gw0] Python 2.7.5 (default, Aug 4 2017, 00:39:18) -- [GCC 4.8.5 20150623 (Red Hat 4.8.5-16)]
[gw3] Python 2.7.5 (default, Aug 4 2017, 00:39:18) -- [GCC 4.8.5 20150623 (Red Hat 4.8.5-16)]
gw0 [122] / gw1 [122] / gw2 [122] / gw3 [122]
scheduling tests via LoadScheduling

../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_dir_is_a_directory[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_conf_is_a_file[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_dir_exists[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_conf_exists[ansible:/osd0]
[gw3] [  0%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_conf_exists[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestCephConf::test_mon_host_line_has_correct_value[ansible:/osd0]
[gw0] [  1%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_dir_is_a_directory[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestCephConf::test_ceph_config_has_mon_host_line[ansible:/osd0]
[gw1] [  2%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_dir_exists[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_command_exists[ansible:/osd0]
[gw2] [  3%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_conf_is_a_file[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_dir_exists[ansible:/mon0]
[gw2] [  4%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_dir_exists[ansible:/mon0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_command_exists[ansible:/mon0]
[gw3] [  4%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestCephConf::test_mon_host_line_has_correct_value[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_conf_is_a_file[ansible:/mon0]
[gw1] [  5%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_command_exists[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_dir_is_a_directory[ansible:/mon0]
[gw0] [  6%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestCephConf::test_ceph_config_has_mon_host_line[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_conf_exists[ansible:/mon0]
[gw2] [  7%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_command_exists[ansible:/mon0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_mds_service_is_running[ansible:/osd0]
[gw2] [  8%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_mds_service_is_running[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_mds_is_installed[ansible:/mon0]
[gw2] [  9%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_mds_is_installed[ansible:/mon0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_docker_mds_is_up[ansible:/mon0]
[gw2] [  9%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_docker_mds_is_up[ansible:/mon0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_mgr_is_up[ansible:/osd0]
[gw2] [ 10%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_mgr_is_up[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_mgr_service_is_running[ansible:/mon0]
[gw2] [ 11%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_mgr_service_is_running[ansible:/mon0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_mgr_service_is_enabled[ansible:/mon0]
[gw2] [ 12%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_mgr_service_is_enabled[ansible:/mon0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_mgr_is_up[ansible:/mon0]
[gw2] [ 13%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_mgr_is_up[ansible:/mon0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_docker_mgr_is_up[ansible:/mon0]
[gw2] [ 13%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_docker_mgr_is_up[ansible:/mon0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_ceph_mon_package_is_installed[ansible:/osd0]
[gw2] [ 14%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_ceph_mon_package_is_installed[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_mon_listens_on_6789[ansible:/osd0]
[gw2] [ 15%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_mon_listens_on_6789[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_mon_service_is_running[ansible:/osd0]
[gw2] [ 16%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_mon_service_is_running[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_mon_service_is_enabled[ansible:/osd0]
[gw2] [ 17%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_mon_service_is_enabled[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_can_get_cluster_health[ansible:/osd0]
[gw2] [ 18%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_can_get_cluster_health[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_ceph_config_has_inital_members_line[ansible:/osd0]
[gw2] [ 18%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_ceph_config_has_inital_members_line[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_initial_members_line_has_correct_value[ansible:/osd0]
[gw2] [ 19%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_initial_members_line_has_correct_value[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestOSDs::test_all_osds_are_up_and_in[ansible:/osd0]
[gw2] [ 20%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestOSDs::test_all_osds_are_up_and_in[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestOSDs::test_all_docker_osds_are_up_and_in[ansible:/osd0]
[gw2] [ 21%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestOSDs::test_all_docker_osds_are_up_and_in[ansible:/osd0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_ceph_mon_package_is_installed[ansible:/mon0]
[gw0] [ 22%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_conf_exists[ansible:/mon0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestCephConf::test_mon_host_line_has_correct_value[ansible:/mon0]
[gw1] [ 22%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_dir_is_a_directory[ansible:/mon0]
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestCephConf::test_ceph_config_has_mon_host_line[ansible:/mon0] [gw3] [ 23%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestInstall::test_ceph_conf_is_a_file[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_mds_is_installed[ansible:/osd0] [gw3] [ 24%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_mds_is_installed[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_docker_mds_is_up[ansible:/osd0] [gw3] [ 25%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_docker_mds_is_up[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_mds_is_up[ansible:/mon0] [gw3] [ 26%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_mds_is_up[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_mgr_service_is_enabled[ansible:/osd0] [gw3] [ 27%] SKIPPED 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_mgr_service_is_enabled[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestOSDs::test_all_osds_are_up_and_in[ansible:/mon0] [gw2] [ 27%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_ceph_mon_package_is_installed[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_mon_listens_on_6789[ansible:/mon0] [gw0] [ 28%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestCephConf::test_mon_host_line_has_correct_value[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_mds_is_up[ansible:/osd0] [gw0] [ 29%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_mds_is_up[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_mds_service_is_enabled[ansible:/mon0] [gw0] [ 30%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_mds_service_is_enabled[ansible:/mon0] 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_mgr_service_is_running[ansible:/osd0] [gw0] [ 31%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_mgr_service_is_running[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_mgr_is_installed[ansible:/mon0] [gw0] [ 31%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_mgr_is_installed[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_config_override[ansible:/osd0] [gw0] [ 32%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_config_override[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_is_up[ansible:/osd0] [gw0] [ 33%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_is_up[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_docker_nfs_is_up[ansible:/osd0] [gw0] [ 34%] SKIPPED 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_docker_nfs_is_up[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_ganesha_is_installed[ansible:/mon0] [gw0] [ 35%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_ganesha_is_installed[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_ganesha_rgw_package_is_installed[ansible:/mon0] [gw0] [ 36%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_ganesha_rgw_package_is_installed[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_services_are_running[ansible:/mon0] [gw0] [ 36%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_services_are_running[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_services_are_enabled[ansible:/mon0] [gw0] [ 37%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_services_are_enabled[ansible:/mon0] 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_config_override[ansible:/mon0] [gw0] [ 38%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_config_override[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_is_up[ansible:/mon0] [gw0] [ 39%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_is_up[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_docker_nfs_is_up[ansible:/mon0] [gw0] [ 40%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_docker_nfs_is_up[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_journal_collocation.py::TestOSD::test_osds_are_all_collocated[ansible:/osd0] [gw0] [ 40%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_journal_collocation.py::TestOSD::test_osds_are_all_collocated[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_journal_collocation.py::TestOSD::test_osds_are_all_collocated[ansible:/mon0] [gw0] [ 41%] SKIPPED 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_journal_collocation.py::TestOSD::test_osds_are_all_collocated[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_ceph_osd_package_is_installed[ansible:/osd0] [gw1] [ 42%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/test_install.py::TestCephConf::test_ceph_config_has_mon_host_line[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_mds_service_is_enabled[ansible:/osd0] [gw1] [ 43%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_mds_service_is_enabled[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_mds_service_is_running[ansible:/mon0] [gw1] [ 44%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mds/test_mds.py::TestMDSs::test_mds_service_is_running[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_mgr_is_installed[ansible:/osd0] [gw1] [ 45%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_mgr_is_installed[ansible:/osd0] 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_docker_mgr_is_up[ansible:/osd0] [gw1] [ 45%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mgr/test_mgr.py::TestMGRs::test_docker_mgr_is_up[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osd_services_are_running[ansible:/osd0] [gw3] [ 46%] FAILED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestOSDs::test_all_osds_are_up_and_in[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestOSDs::test_all_docker_osds_are_up_and_in[ansible:/mon0] [gw3] [ 47%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestOSDs::test_all_docker_osds_are_up_and_in[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_ganesha_is_installed[ansible:/osd0] [gw3] [ 48%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_ganesha_is_installed[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_ganesha_rgw_package_is_installed[ansible:/osd0] [gw3] [ 49%] SKIPPED 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_ganesha_rgw_package_is_installed[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_services_are_running[ansible:/osd0] [gw3] [ 50%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_services_are_running[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_services_are_enabled[ansible:/osd0] [gw3] [ 50%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/nfs/test_nfs_ganesha.py::TestNFSs::test_nfs_services_are_enabled[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_ceph_volume_systemd_is_installed[ansible:/osd0] [gw2] [ 51%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_mon_listens_on_6789[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_mon_service_is_running[ansible:/mon0] [gw2] [ 52%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_mon_service_is_running[ansible:/mon0] 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_mon_service_is_enabled[ansible:/mon0] [gw2] [ 53%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_mon_service_is_enabled[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_can_get_cluster_health[ansible:/mon0] [gw3] [ 54%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_ceph_volume_systemd_is_installed[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_ceph_osd_package_is_installed[ansible:/mon0] [gw3] [ 54%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_ceph_osd_package_is_installed[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osds_listen_on_public_network[ansible:/mon0] [gw3] [ 55%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osds_listen_on_public_network[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osds_listen_on_cluster_network[ansible:/mon0] [gw3] [ 56%] SKIPPED 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osds_listen_on_cluster_network[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osd_services_are_running[ansible:/mon0] [gw3] [ 57%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osd_services_are_running[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osd_services_are_enabled[ansible:/mon0] [gw3] [ 58%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osd_services_are_enabled[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osd_are_mounted[ansible:/mon0] [gw3] [ 59%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osd_are_mounted[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_ceph_volume_is_installed[ansible:/mon0] [gw3] [ 59%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_ceph_volume_is_installed[ansible:/mon0] 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_ceph_volume_systemd_is_installed[ansible:/mon0] [gw3] [ 60%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_ceph_volume_systemd_is_installed[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_is_installed[ansible:/osd0] [gw3] [ 61%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_is_installed[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_running_before_luminous[ansible:/osd0] [gw3] [ 62%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_running_before_luminous[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_running_docker_before_luminous[ansible:/osd0] [gw3] [ 63%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_running_docker_before_luminous[ansible:/osd0] 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_running_docker_from_luminous[ansible:/osd0] [gw3] [ 63%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_running_docker_from_luminous[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_running_from_luminous[ansible:/osd0] [gw3] [ 64%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_running_from_luminous[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_enabled_before_luminous[ansible:/osd0] [gw3] [ 65%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_enabled_before_luminous[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_enabled_docker_before_luminous[ansible:/osd0] [gw3] [ 66%] SKIPPED 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_enabled_docker_before_luminous[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_enabled_from_luminous[ansible:/osd0] [gw3] [ 67%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_enabled_from_luminous[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_is_up[ansible:/osd0] [gw3] [ 68%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_is_up[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_docker_rbd_mirror_is_up[ansible:/osd0] [gw3] [ 68%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_docker_rbd_mirror_is_up[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_is_installed[ansible:/mon0] [gw3] [ 69%] SKIPPED 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_is_installed[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_running_before_luminous[ansible:/mon0] [gw3] [ 70%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_running_before_luminous[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_running_docker_before_luminous[ansible:/mon0] [gw3] [ 71%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_running_docker_before_luminous[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_running_docker_from_luminous[ansible:/mon0] [gw3] [ 72%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_running_docker_from_luminous[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_running_from_luminous[ansible:/mon0] [gw3] [ 72%] 
SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_running_from_luminous[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_enabled_before_luminous[ansible:/mon0] [gw3] [ 73%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_enabled_before_luminous[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_enabled_docker_before_luminous[ansible:/mon0] [gw3] [ 74%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_enabled_docker_before_luminous[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_enabled_from_luminous[ansible:/mon0] [gw3] [ 75%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_service_is_enabled_from_luminous[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_is_up[ansible:/mon0] [gw3] [ 76%] SKIPPED 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_rbd_mirror_is_up[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_docker_rbd_mirror_is_up[ansible:/mon0] [gw3] [ 77%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rbd-mirror/test_rbd_mirror.py::TestRbdMirrors::test_docker_rbd_mirror_is_up[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_is_installed[ansible:/osd0] [gw3] [ 77%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_is_installed[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_service_is_running[ansible:/osd0] [gw3] [ 78%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_service_is_running[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_service_is_enabled[ansible:/osd0] [gw3] [ 79%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_service_is_enabled[ansible:/osd0] 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_is_up[ansible:/osd0] [gw3] [ 80%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_is_up[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_http_endpoint[ansible:/osd0] [gw3] [ 81%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_http_endpoint[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_docker_rgw_is_up[ansible:/osd0] [gw3] [ 81%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_docker_rgw_is_up[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_is_installed[ansible:/mon0] [gw3] [ 82%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_is_installed[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_service_is_running[ansible:/mon0] [gw3] [ 83%] SKIPPED 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_service_is_running[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_service_is_enabled[ansible:/mon0] [gw3] [ 84%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_service_is_enabled[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_is_up[ansible:/mon0] [gw3] [ 85%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_is_up[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_http_endpoint[ansible:/mon0] [gw3] [ 86%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_rgw_http_endpoint[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_docker_rgw_is_up[ansible:/mon0] [gw3] [ 86%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw.py::TestRGWs::test_docker_rgw_is_up[ansible:/mon0] 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw_tuning.py::TestRGWs::test_rgw_bucket_default_quota_is_set[ansible:/osd0] [gw3] [ 87%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw_tuning.py::TestRGWs::test_rgw_bucket_default_quota_is_set[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw_tuning.py::TestRGWs::test_rgw_bucket_default_quota_is_applied[ansible:/osd0] [gw3] [ 88%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw_tuning.py::TestRGWs::test_rgw_bucket_default_quota_is_applied[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw_tuning.py::TestRGWs::test_rgw_tuning_pools_are_set[ansible:/osd0] [gw3] [ 89%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw_tuning.py::TestRGWs::test_rgw_tuning_pools_are_set[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw_tuning.py::TestRGWs::test_rgw_bucket_default_quota_is_set[ansible:/mon0] [gw3] [ 90%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw_tuning.py::TestRGWs::test_rgw_bucket_default_quota_is_set[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw_tuning.py::TestRGWs::test_rgw_bucket_default_quota_is_applied[ansible:/mon0] 
[gw3] [ 90%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw_tuning.py::TestRGWs::test_rgw_bucket_default_quota_is_applied[ansible:/mon0] [gw0] [ 91%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_ceph_osd_package_is_installed[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osds_listen_on_public_network[ansible:/osd0] [gw2] [ 92%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_can_get_cluster_health[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_ceph_config_has_inital_members_line[ansible:/mon0] [gw1] [ 93%] FAILED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osd_services_are_running[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osd_services_are_enabled[ansible:/osd0] [gw1] [ 94%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osd_services_are_enabled[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osd_are_mounted[ansible:/osd0] [gw2] [ 95%] PASSED 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_ceph_config_has_inital_members_line[ansible:/mon0] [gw0] [ 95%] FAILED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osds_listen_on_public_network[ansible:/osd0] [gw1] [ 96%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osd_are_mounted[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osds_listen_on_cluster_network[ansible:/osd0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_initial_members_line_has_correct_value[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw_tuning.py::TestRGWs::test_rgw_tuning_pools_are_set[ansible:/mon0] ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_ceph_volume_is_installed[ansible:/osd0] [gw3] [ 97%] SKIPPED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/rgw/test_rgw_tuning.py::TestRGWs::test_rgw_tuning_pools_are_set[ansible:/mon0] [gw2] [ 98%] PASSED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py::TestMons::test_initial_members_line_has_correct_value[ansible:/mon0] [gw1] [ 99%] PASSED 
../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_ceph_volume_is_installed[ansible:/osd0]
[gw0] [100%] FAILED ../../../../../../../../../../../../../../tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py::TestOSDs::test_osds_listen_on_cluster_network[ansible:/osd0]

=================================== FAILURES ===================================
_____________ TestOSDs.test_all_osds_are_up_and_in[ansible://mon0] _____________
[gw3] linux2 -- Python 2.7.5 /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/python

self =
node = {'address': '192.168.3.10', 'ceph_stable_release': 'luminous', 'cluster_address': '', 'cluster_name': u'ceph', ...}
host =

    @pytest.mark.no_docker
    def test_all_osds_are_up_and_in(self, node, host):
        cmd = "sudo ceph --cluster={} --connect-timeout 5 -s".format(node["cluster_name"])
        output = host.check_output(cmd)
        phrase = "{num_osds} osds: {num_osds} up, {num_osds} in".format(num_osds=node["total_osds"])
>       assert phrase in output
E       AssertionError: assert '3 osds: 3 up, 3 in' in ' cluster:\n id: fad7031c-4bd3-4328-9716-4fbddc9c14b8\n health: HEALTH_WARN\n no active mgr\n \n ser...2 in\n \n data:\n pools: 0 pools, 0 pgs\n objects: 0 objects, 0\n usage: 0 used, 0 / 0 avail\n pgs: \n '

/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/mon/test_mons.py:48: AssertionError
____________ TestOSDs.test_osd_services_are_running[ansible://osd0] ____________
[gw1] linux2 -- Python 2.7.5 /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/python

self =
node = {'address': '192.168.3.100', 'ceph_stable_release': 'luminous', 'cluster_address': '192.168.4.200', 'cluster_name': u'ceph', ...}
host =

    def test_osd_services_are_running(self, node, host):
        # TODO: figure out way to paramaterize node['osds'] for this test
        for osd in node["osds"]:
>           assert host.service("ceph-osd@%s" % osd).is_running
E           AssertionError: assert False
E            +  where False = .is_running
E            +  where = (('ceph-osd@%s' % '1'))
E            +  where = .service

/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py:23: AssertionError
_________ TestOSDs.test_osds_listen_on_public_network[ansible://osd0] __________
[gw0] linux2 -- Python 2.7.5 /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/python

self =
node = {'address': '192.168.3.100', 'ceph_stable_release': 'luminous', 'cluster_address': '192.168.4.200', 'cluster_name': u'ceph', ...}
host =

    def test_osds_listen_on_public_network(self, node, host):
        # TODO: figure out way to paramaterize this test
        nb_port = (node["num_devices"] * 2)
>       assert host.check_output("netstat -lntp | grep ceph-osd | grep %s | wc -l" % (node["address"])) == str(nb_port)
E       AssertionError: assert '4' == '6'
E         - 4
E         + 6

/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py:13: AssertionError
_________ TestOSDs.test_osds_listen_on_cluster_network[ansible://osd0] _________
[gw0] linux2 -- Python 2.7.5 /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/python

self =
node = {'address': '192.168.3.100', 'ceph_stable_release': 'luminous', 'cluster_address': '192.168.4.200', 'cluster_name': u'ceph', ...}
host =

    def test_osds_listen_on_cluster_network(self, node, host):
        # TODO: figure out way to paramaterize this test
        nb_port = (node["num_devices"] * 2)
>       assert host.check_output("netstat -lntp | grep ceph-osd | grep %s | wc -l" % (node["cluster_address"])) == str(nb_port)
E       AssertionError: assert '4' == '6'
E         - 4
E         + 6

/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests/osd/test_osds.py:18: AssertionError
=============================== warnings summary ===============================
None
  Module already imported so cannot be rewritten: testinfra
  File fixture is deprecated. Use host fixture and get File module with host.file
  TestinfraBackend fixture is deprecated. Use host fixture and get backend with host.backend
  File fixture is deprecated. Use host fixture and get File module with host.file
  TestinfraBackend fixture is deprecated. Use host fixture and get backend with host.backend
  File fixture is deprecated. Use host fixture and get File module with host.file
  TestinfraBackend fixture is deprecated. Use host fixture and get backend with host.backend
  File fixture is deprecated. Use host fixture and get File module with host.file
  TestinfraBackend fixture is deprecated. Use host fixture and get backend with host.backend

-- Docs: http://doc.pytest.org/en/latest/warnings.html
======== 4 failed, 25 passed, 93 skipped, 9 warnings in 153.34 seconds =========
ERROR: InvocationError: '/tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/bin/testinfra -n 4 --sudo -v --connection=ansible --ansible-inventory=/home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/lvm/centos7/bluestore/dmcrypt/hosts /tmp/tox.45zLJjIuoI/centos7-bluestore-dmcrypt/tmp/ceph-ansible/tests/functional/tests'
___________________________________ summary ____________________________________
ERROR: centos7-bluestore-dmcrypt: commands failed
Build step 'Execute shell' marked build as failure
[PostBuildScript] - Executing post build scripts.
[ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt] $ /bin/bash /tmp/jenkins4713404269642026598.sh
++ mktemp -td venv.XXXXXXXXXX
+ TEMPVENV=/tmp/venv.IMIeemGNuK
+ VENV=/tmp/venv.IMIeemGNuK/bin
+ cd /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional
++ find .
++ grep Vagrantfile
++ xargs dirname
+ scenarios='./simple/centos7/bluestore/activate ./simple/centos7/bluestore/dmcrypt-luks ./simple/centos7/bluestore/dmcrypt-plain ./simple/centos7/filestore/activate ./simple/centos7/filestore/dmcrypt-luks ./simple/centos7/filestore/dmcrypt-plain ./simple/xenial/bluestore/activate ./simple/xenial/bluestore/dmcrypt-luks ./simple/xenial/bluestore/dmcrypt-plain ./simple/xenial/filestore/activate ./simple/xenial/filestore/dmcrypt-luks ./simple/xenial/filestore/dmcrypt-plain .
./lvm/centos7/bluestore/dmcrypt ./lvm/centos7/bluestore/create ./lvm/centos7/filestore/dmcrypt ./lvm/centos7/filestore/create ./lvm/xenial/bluestore/dmcrypt ./lvm/xenial/bluestore/create ./lvm/xenial/filestore/dmcrypt ./lvm/xenial/filestore/create' + for scenario in '$scenarios' + cd ./simple/centos7/bluestore/activate + collect_ceph_logs all + limit=all + '[' -f ./vagrant_ssh_config ']' + vagrant destroy -f ==> osd1: Remove stale volume... ==> osd1: Domain is not created. Please run `vagrant up` first. ==> osd0: Remove stale volume... ==> osd0: Domain is not created. Please run `vagrant up` first. ==> mon0: Remove stale volume... ==> mon0: Domain is not created. Please run `vagrant up` first. + cd - /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional + for scenario in '$scenarios' + cd ./simple/centos7/bluestore/dmcrypt-luks + collect_ceph_logs all + limit=all + '[' -f ./vagrant_ssh_config ']' + vagrant destroy -f ==> osd1: Remove stale volume... ==> osd1: Domain is not created. Please run `vagrant up` first. ==> osd0: Remove stale volume... ==> osd0: Domain is not created. Please run `vagrant up` first. ==> mon0: Remove stale volume... ==> mon0: Domain is not created. Please run `vagrant up` first. + cd - /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional + for scenario in '$scenarios' + cd ./simple/centos7/bluestore/dmcrypt-plain + collect_ceph_logs all + limit=all + '[' -f ./vagrant_ssh_config ']' + vagrant destroy -f ==> osd1: Remove stale volume... ==> osd1: Domain is not created. Please run `vagrant up` first. ==> osd0: Remove stale volume... ==> osd0: Domain is not created. Please run `vagrant up` first. ==> mon0: Remove stale volume... ==> mon0: Domain is not created. Please run `vagrant up` first. 
+ cd - /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional + for scenario in '$scenarios' + cd ./simple/centos7/filestore/activate + collect_ceph_logs all + limit=all + '[' -f ./vagrant_ssh_config ']' + vagrant destroy -f ==> osd1: Remove stale volume... ==> osd1: Domain is not created. Please run `vagrant up` first. ==> osd0: Remove stale volume... ==> osd0: Domain is not created. Please run `vagrant up` first. ==> mon0: Remove stale volume... ==> mon0: Domain is not created. Please run `vagrant up` first. + cd - /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional + for scenario in '$scenarios' + cd ./simple/centos7/filestore/dmcrypt-luks + collect_ceph_logs all + limit=all + '[' -f ./vagrant_ssh_config ']' + vagrant destroy -f ==> osd1: Remove stale volume... ==> osd1: Domain is not created. Please run `vagrant up` first. ==> osd0: Remove stale volume... ==> osd0: Domain is not created. Please run `vagrant up` first. ==> mon0: Remove stale volume... ==> mon0: Domain is not created. Please run `vagrant up` first. + cd - /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional + for scenario in '$scenarios' + cd ./simple/centos7/filestore/dmcrypt-plain + collect_ceph_logs all + limit=all + '[' -f ./vagrant_ssh_config ']' + vagrant destroy -f ==> osd1: Remove stale volume... ==> osd1: Domain is not created. Please run `vagrant up` first. ==> osd0: Remove stale volume... ==> osd0: Domain is not created. Please run `vagrant up` first. ==> mon0: Remove stale volume... ==> mon0: Domain is not created. Please run `vagrant up` first. 
+ cd - /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional + for scenario in '$scenarios' + cd ./simple/xenial/bluestore/activate + collect_ceph_logs all + limit=all + '[' -f ./vagrant_ssh_config ']' + vagrant destroy -f ==> osd1: Remove stale volume... ==> osd1: Domain is not created. Please run `vagrant up` first. ==> osd0: Remove stale volume... ==> osd0: Domain is not created. Please run `vagrant up` first. ==> mon0: Remove stale volume... ==> mon0: Domain is not created. Please run `vagrant up` first. + cd - /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional + for scenario in '$scenarios' + cd ./simple/xenial/bluestore/dmcrypt-luks + collect_ceph_logs all + limit=all + '[' -f ./vagrant_ssh_config ']' + vagrant destroy -f ==> osd1: Remove stale volume... ==> osd1: Domain is not created. Please run `vagrant up` first. ==> osd0: Remove stale volume... ==> osd0: Domain is not created. Please run `vagrant up` first. ==> mon0: Remove stale volume... ==> mon0: Domain is not created. Please run `vagrant up` first. + cd - /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional + for scenario in '$scenarios' + cd ./simple/xenial/bluestore/dmcrypt-plain + collect_ceph_logs all + limit=all + '[' -f ./vagrant_ssh_config ']' + vagrant destroy -f ==> osd1: Remove stale volume... ==> osd1: Domain is not created. Please run `vagrant up` first. ==> osd0: Remove stale volume... ==> osd0: Domain is not created. Please run `vagrant up` first. ==> mon0: Remove stale volume... ==> mon0: Domain is not created. Please run `vagrant up` first. 
+ cd - /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional + for scenario in '$scenarios' + cd ./simple/xenial/filestore/activate + collect_ceph_logs all + limit=all + '[' -f ./vagrant_ssh_config ']' + vagrant destroy -f ==> osd1: Remove stale volume... ==> osd1: Domain is not created. Please run `vagrant up` first. ==> osd0: Remove stale volume... ==> osd0: Domain is not created. Please run `vagrant up` first. ==> mon0: Remove stale volume... ==> mon0: Domain is not created. Please run `vagrant up` first. + cd - /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional + for scenario in '$scenarios' + cd ./simple/xenial/filestore/dmcrypt-luks + collect_ceph_logs all + limit=all + '[' -f ./vagrant_ssh_config ']' + vagrant destroy -f ==> osd1: Remove stale volume... ==> osd1: Domain is not created. Please run `vagrant up` first. ==> osd0: Remove stale volume... ==> osd0: Domain is not created. Please run `vagrant up` first. ==> mon0: Remove stale volume... ==> mon0: Domain is not created. Please run `vagrant up` first. + cd - /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional + for scenario in '$scenarios' + cd ./simple/xenial/filestore/dmcrypt-plain + collect_ceph_logs all + limit=all + '[' -f ./vagrant_ssh_config ']' + vagrant destroy -f ==> osd1: Remove stale volume... ==> osd1: Domain is not created. Please run `vagrant up` first. ==> osd0: Remove stale volume... ==> osd0: Domain is not created. Please run `vagrant up` first. ==> mon0: Remove stale volume... ==> mon0: Domain is not created. Please run `vagrant up` first. + cd - /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional + for scenario in '$scenarios' + cd . 
+ collect_ceph_logs all
+ limit=all
+ '[' -f ./vagrant_ssh_config ']'
+ vagrant destroy -f
There was an error loading a Vagrantfile. The file being loaded and the error message are shown below. This is usually caused by a syntax error.

Path: /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/Vagrantfile
Line number: 0
Message: Errno::ENOENT: No such file or directory @ rb_sysopen - /home/jenkins-build/build/workspace/ceph-volume-nightly-master-lvm-centos7-bluestore-dmcrypt/src/ceph-volume/ceph_volume/tests/functional/vagrant_variables.yml

Build step 'Execute Scripts' marked build as failure
Archiving artifacts
Sending e-mails to: aschoen@redhat.com adeza@redhat.com
[BFA] Scanning build for known causes...
[BFA] Found failure cause(s):
[BFA] No such file or directory from category Development
[BFA] Commands failed from category System
[BFA] Assert from category Development
[BFA] Error E{INVAL,PERM,ACCESS...} from category Development
[BFA] Caught Signal from category Development
[BFA] InvocationError from category Test
[BFA] STDERR from category Development
[BFA] Done. 1s
Finished: FAILURE
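Note on the failures above: the two checks that tripped can be reproduced outside testinfra. A minimal sketch follows; the helper names (`expected_osd_phrase`, `all_osds_up_and_in`, `expected_listen_ports`) are hypothetical and not part of the ceph-ansible suite, but the logic mirrors the assertions shown in the traceback (a substring match against `ceph -s` output, and an expected listening-socket count of `num_devices * 2`).

```python
# Standalone re-implementation of the failing checks (hypothetical
# helpers; the real tests live in ceph-ansible's test_osds.py/test_mons.py).


def expected_osd_phrase(total_osds):
    # The test builds the literal phrase "ceph -s" prints when every
    # OSD is up and in, e.g. "3 osds: 3 up, 3 in" for total_osds=3.
    return "{n} osds: {n} up, {n} in".format(n=total_osds)


def all_osds_up_and_in(status_output, total_osds):
    # Plain substring match, exactly as the failing assertion does.
    # Any OSD that is down (status shows "2 up, 2 in") makes the
    # expected phrase absent, so this returns False.
    return expected_osd_phrase(total_osds) in status_output


def expected_listen_ports(num_devices):
    # The network tests expect num_devices * 2 listening ceph-osd
    # sockets per address; the log's '4' == '6' mismatch means fewer
    # daemons were listening than devices configured.
    return num_devices * 2


if __name__ == "__main__":
    healthy = "  services:\n    osd: 3 osds: 3 up, 3 in\n"
    degraded = "  services:\n    osd: 3 osds: 2 up, 2 in\n"
    print(all_osds_up_and_in(healthy, 3))    # True
    print(all_osds_up_and_in(degraded, 3))   # False
    print(expected_listen_ports(3))          # 6
```

The numbers in this run are mutually consistent: with one of three `ceph-osd` services not running (the `ceph-osd@1` failure), only two daemons listen on each network, giving the observed 4 sockets instead of the expected 6, and `ceph -s` reports "2 up, 2 in" instead of the "3 osds: 3 up, 3 in" phrase the mon test looks for.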