Bug #12725

ansible fails on a CentOS 7 cloud image from centos.org

Added by Loïc Dachary over 8 years ago. Updated over 8 years ago.

Status: Resolved
Priority: Normal
Assignee: -
Category: -
% Done: 0%
Source: other
Tags:
Backport:
Regression: No
Severity: 3 - minor
Reviewed:
Affected Versions:
ceph-qa-suite:
Crash signature (v1):
Crash signature (v2):

Description

Running suite/ceph-deploy with centos_7.0 (after adding it to the supported distros), it fails with the following (see the full log attached):

  File "/home/ubuntu/teuthology/virtualenv/local/lib/python2.7/site-packages/yaml/parser.py", line 98, in check_event
    self.current_event = self.state()
  File "/home/ubuntu/teuthology/virtualenv/local/lib/python2.7/site-packages/yaml/parser.py", line 174, in parse_document_start
    self.peek_token().start_mark)
ParserError: expected '<document start>', but found '{'
  in "/tmp/teuth_ansible_failures_hp9sRK", line 3, column 1

Files

centos7.log (114 KB), Loïc Dachary, 08/18/2015 07:28 PM
ansible-2.log (104 KB), Loïc Dachary, 08/18/2015 08:03 PM
#1

Updated by Andrew Schoen over 8 years ago

The actual failure here is:

2015-08-18T19:04:10.052 INFO:teuthology.task.ansible.out:fatal: [target170122.teuthology] => error while evaluating conditional: replace_repos
#2

Updated by Zack Cerza over 8 years ago

Not according to the failure_reason. That's got to be fixed.

#3

Updated by Andrew Schoen over 8 years ago

The parsing error you put in the description was fixed by this: https://github.com/ceph/teuthology/commit/4695ea0cd50d48bccc6e17cce037a2cc820245a9

Are you using an up-to-date teuthology?

#4

Updated by Loïc Dachary over 8 years ago

  • Status changed from New to Won't Fix

Old teuthology, indeed, thanks for the quick comment :-)

#5

Updated by Zack Cerza over 8 years ago

Just a heads-up, Loic, that there might be something funny about the images you're using:

2015-08-18T19:04:09.005 INFO:teuthology.task.ansible.out:TASK: [common | Log the OS name, version and release] ************************* 

2015-08-18T19:04:09.056 INFO:teuthology.task.ansible.out:ok: [target170123.teuthology] => {
2015-08-18T19:04:09.058 INFO:teuthology.task.ansible.out:
    "msg": "Host target170123.teuthology is running RedHat 7.1.1503 (Core)"
}
ok: [target170122.teuthology] => {
    "msg": "Host target170122.teuthology is running RedHat 7.1.1503 (Core)"
}
#6

Updated by Loïc Dachary over 8 years ago

After rebasing and re-installing:

(virtualenv)ubuntu@teuthology:~/teuthology$ git log -20 --oneline
f8b090f upload teuthology archive on completion
c99a463 transparent OpenStack provisioning for teuthology-suite
5b80153 Merge pull request #599 from ceph/wip-archive-ansible-log
df6677e task.ansible: archive the ansible log even if parsing fails
44d58e6 Merge pull request #598 from ceph/wip-ansible-log-perms
ea53f95 task.ansible: set the ansible_failure.yaml file mode to 0664
...

And running again:
teuthology-openstack --simultaneous-jobs 10 -l1 --verbose --key-name myself --key-filename ~/Downloads/myself --suite ceph-deploy --suite-branch wip-11881-multipath --email loic@dachary.org --filter=centos_7 --ceph wip-11881-multipath 

I still see the failure:
2015-08-18T19:55:58.240 ERROR:teuthology.task.ansible:Failed to parse ansible failure log: /tmp/teuth_ansible_failures_DMzQDF
Traceback (most recent call last):
  File "/home/ubuntu/teuthology/teuthology/task/ansible.py", line 268, in _handle_failure
    failures = yaml.safe_load(fail_log)
  File "/home/ubuntu/teuthology/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 93, in safe_load
    return load(stream, SafeLoader)
  File "/home/ubuntu/teuthology/virtualenv/local/lib/python2.7/site-packages/yaml/__init__.py", line 71, in load
    return loader.get_single_data()
  File "/home/ubuntu/teuthology/virtualenv/local/lib/python2.7/site-packages/yaml/constructor.py", line 37, in get_single_data
    node = self.get_single_node()
  File "/home/ubuntu/teuthology/virtualenv/local/lib/python2.7/site-packages/yaml/composer.py", line 39, in get_single_node
    if not self.check_event(StreamEndEvent):
  File "/home/ubuntu/teuthology/virtualenv/local/lib/python2.7/site-packages/yaml/parser.py", line 98, in check_event
    self.current_event = self.state()
  File "/home/ubuntu/teuthology/virtualenv/local/lib/python2.7/site-packages/yaml/parser.py", line 174, in parse_document_start
    self.peek_token().start_mark)
ParserError: expected '<document start>', but found '{'
  in "/tmp/teuth_ansible_failures_DMzQDF", line 3, column 1
2015-08-18T19:55:58.241 INFO:teuthology.task.ansible:Archiving ansible failure log at: /usr/share/nginx/html/ubuntu-2015-08-18_19:53:06-ceph-deploy-wip-11881-multipath---basic-openstack/1/ansible_failures.yaml
2015-08-18T19:55:58.241 ERROR:teuthology.run_tasks:Saw exception from tasks.
Traceback (most recent call last):
  File "/home/ubuntu/teuthology/teuthology/run_tasks.py", line 56, in run_tasks
    manager.__enter__()
  File "/home/ubuntu/teuthology/teuthology/task/__init__.py", line 121, in __enter__
    self.begin()
  File "/home/ubuntu/teuthology/teuthology/task/ansible.py", line 231, in begin
    self.execute_playbook()
  File "/home/ubuntu/teuthology/teuthology/task/ansible.py", line 256, in execute_playbook
    self._handle_failure(command, status)
  File "/home/ubuntu/teuthology/teuthology/task/ansible.py", line 282, in _handle_failure
    raise CommandFailedError(command, status)
CommandFailedError: Command failed with status 3: 'ansible-playbook -v --extra-vars \'{"ansible_ssh_user": "ubuntu"}\' -i /etc/ansible/hosts --limit target170153.teuthology,target170152.teuthology /home/ubuntu/src/ceph-cm-ansible_master/cephlab.yml'
2015-08-18T19:55:58.242 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2015-08-18T19:55:58.242 INFO:teuthology.task.ansible:Skipping ansible cleanup...

See the full log attached.

#8

Updated by Zack Cerza over 8 years ago

To fix the parser issue, we'll need to see the yaml file that's failing to parse.

#9

Updated by Loïc Dachary over 8 years ago

Scheduled the same CentOS 7 job on sepia to verify whether or not it's a CentOS 7 image issue: http://pulpito.ceph.com/loic-2015-08-18_22:15:04-ceph-deploy-wip-11881-multipath---basic-vps/

#10

Updated by Andrew Schoen over 8 years ago

Zack Cerza wrote:

To fix the parser issue, we'll need to see the yaml file that's failing to parse.

The ParserError is being caught in that last exception; it's just that ansible apparently sometimes gives us output that cannot be made into yaml. In that case I chose to just trap the error and raise the CommandFailedError. Should I try to load the failure as plain text instead if the yaml parsing fails?
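
A plain-text fallback along those lines could look like this; a hypothetical sketch, not the actual teuthology code (load_failures and the returned key are made-up names):

    import yaml

    def load_failures(path):
        # Prefer YAML, but keep the raw text when the callback plugin
        # emitted something PyYAML cannot parse, so it still gets archived.
        with open(path) as f:
            text = f.read()
        try:
            return yaml.safe_load(text)
        except yaml.YAMLError:
            return {'unparsed_ansible_output': text}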

#11

Updated by Zack Cerza over 8 years ago

We've got to figure out how to generate safe yaml then. Having large tracebacks show up "sometimes" undoes a lot of the usefulness of using the callback plugin to generate that yaml file in the first place :-/
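
One way to guarantee parseable output would be for the callback plugin to serialize results with yaml.safe_dump rather than writing raw dict reprs, since safe_dump quotes multi-line tracebacks correctly. A hedged sketch (the failures mapping is illustrative, not the plugin's real data structure):

    import yaml

    failures = {'target170122.teuthology':
                {'msg': 'Traceback (most recent call last):\n  ...'}}
    with open('/tmp/teuth_ansible_failures', 'w') as f:
        yaml.safe_dump(failures, f, default_flow_style=False)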

#12

Updated by Loïc Dachary over 8 years ago

  • Subject changed from ansible fails on CentOS 7 to ansible fails on a CentOS 7 cloud image from centos.org
  • Status changed from New to 12

On http://cloud.centos.org/centos/7/images/CentOS-7-x86_64-GenericCloud.qcow2, logging in shows:

[centos@test ~]$ cat /etc/redhat-release 
Derived from Red Hat Enterprise Linux 7.1 (Source)

and on a VPS locked from the Sepia lab:
[ubuntu@vpm084 ~]$ cat /etc/redhat-release 
CentOS Linux release 7.0.1406 (Core) 

How should the cloud image from centos.org be modified so that ansible identifies it as a CentOS 7 image?

#13

Updated by Loïc Dachary over 8 years ago

  • Status changed from 12 to Won't Fix

Ansible requires lsb-release to work properly; it is the responsibility of whatever runs before ansible to install it.
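
For illustration, "whatever happens before ansible" could do something as simple as the following over ssh; a hypothetical sketch (install_lsb_release is a made-up helper; redhat-lsb-core is the CentOS 7 package that provides lsb_release):

    import subprocess

    def install_lsb_release(host, user='centos'):
        # Run on the freshly provisioned node before ansible ever touches it.
        subprocess.check_call([
            'ssh', '%s@%s' % (user, host),
            'sudo', 'yum', 'install', '-y', 'redhat-lsb-core',
        ])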

#14

Updated by Loïc Dachary over 8 years ago

<dmick> of course searching for ansible_distribution goes nowhere (sigh)
<loicd> dmick: zackc thanks for the help, installing lsb-release is kind of tricky but I know how to do that
<dmick> it may or may not be the answer
<dmick> what's this say on that cloud image?
<dmick> python -c "import platform; print platform.linux_distribution()" 
<dmick> and then
<dmick> what does strace show it looking at?
<dmick> on plana39 the last thing it looks at is /etc/centos-release
<dmick> loicd: ^
<loicd> [centos@test ~]$ python -c "import platform; print platform.linux_distribution()" 
<loicd> ('CentOS Linux', '7.1.1503', 'Core')
<dmick> so, that is far more confusing
<dmick> because it sure looks like that's what ansible is using
<loicd> [centos@test ~]$ cat /etc/redhat-release 
<loicd> Derived from Red Hat Enterprise Linux 7.1 (Source)
<dmick> yeah.  /etc/centos-release
<loicd> [centos@test ~]$ cat /etc/centos-release 
<loicd> CentOS Linux release 7.1.1503 (Core) 
<dmick> so I remain mystified.
<loicd> me too
<dmick> I haven't exhaustively searched the ansible identification code
<dmick> but I'd be tempted to spend some quality time with it and the python interpreter
<dmick> fwiw, /etc/redhat-release, /etc/system-release, /etc/os-release, and /etc/centos-release (along with others) are all in package centos-release
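
Putting dmick's suggestions together, a small diagnostic that compares the release files with what Python's platform module (and hence ansible's fact gathering) reports; written for the image's Python 2.7, and only a sketch of the probing, not ansible's actual code path:

    import platform

    for path in ('/etc/redhat-release', '/etc/system-release',
                 '/etc/os-release', '/etc/centos-release'):
        try:
            with open(path) as f:
                print('%s -> %s' % (path, f.read().strip()))
        except IOError:
            print('%s -> missing' % path)

    print('platform.linux_distribution() -> %r'
          % (platform.linux_distribution(),))
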
#15

Updated by Loïc Dachary over 8 years ago

ansible localhost -i /dev/null -m setup

localhost | success >> {
    "ansible_facts": {
        "ansible_all_ipv4_addresses": [
            "149.202.171.22" 
        ], 
        "ansible_all_ipv6_addresses": [
            "fe80::f816:3eff:fe93:af66" 
        ], 
        "ansible_architecture": "x86_64", 
        "ansible_bios_date": "01/01/2011", 
        "ansible_bios_version": "Bochs", 
        "ansible_cmdline": {
            "BOOT_IMAGE": "/boot/vmlinuz-3.10.0-229.el7.x86_64", 
            "LANG": "en_US.UTF-8", 
            "console": "ttyS0,115200", 
            "ro": true, 
            "root": "UUID=a78bb152-e525-4f0e-961a-bf6147ac7d3e", 
            "vconsole.font": "latarcyrheb-sun16", 
            "vconsole.keymap": "us" 
        }, 
        "ansible_date_time": {
            "date": "2015-08-19", 
            "day": "19", 
            "epoch": "1439944913", 
            "hour": "00", 
            "iso8601": "2015-08-19T00:41:53Z", 
            "iso8601_micro": "2015-08-19T00:41:53.845632Z", 
            "minute": "41", 
            "month": "08", 
            "second": "53", 
            "time": "00:41:53", 
            "tz": "UTC", 
            "tz_offset": "+0000", 
            "weekday": "mercredi", 
            "year": "2015" 
        }, 
        "ansible_default_ipv4": {
            "address": "149.202.171.22", 
            "alias": "eth0", 
            "gateway": "149.202.160.1", 
            "interface": "eth0", 
            "macaddress": "fa:16:3e:93:af:66", 
            "mtu": 1500, 
            "netmask": "255.255.255.255", 
            "network": "149.202.171.22", 
            "type": "ether" 
        }, 
        "ansible_default_ipv6": {}, 
        "ansible_devices": {
            "vda": {
                "holders": [], 
                "host": "", 
                "model": null, 
                "partitions": {
                    "vda1": {
                        "sectors": "20962777", 
                        "sectorsize": 512, 
                        "size": "10.00 GB", 
                        "start": "2048" 
                    }
                }, 
                "removable": "0", 
                "rotational": "1", 
                "scheduler_mode": "", 
                "sectors": "20971520", 
                "sectorsize": "512", 
                "size": "10.00 GB", 
                "support_discard": "0", 
                "vendor": "0x1af4" 
            }
        }, 
        "ansible_distribution": "RedHat", 
        "ansible_distribution_major_version": "7", 
        "ansible_distribution_release": "Core", 
        "ansible_distribution_version": "7.1.1503", 
        "ansible_domain": "", 
        "ansible_env": {
            "HISTCONTROL": "ignoredups", 
            "HISTSIZE": "1000", 
            "HOME": "/home/centos", 
            "HOSTNAME": "test", 
            "LANG": "en_US.UTF-8", 
            "LC_ADDRESS": "fr_FR.UTF-8", 
            "LC_CTYPE": "en_US.UTF-8", 
            "LC_IDENTIFICATION": "fr_FR.UTF-8", 
            "LC_MEASUREMENT": "fr_FR.UTF-8", 
            "LC_MONETARY": "fr_FR.UTF-8", 
            "LC_NAME": "fr_FR.UTF-8", 
            "LC_NUMERIC": "fr_FR.UTF-8", 
            "LC_PAPER": "fr_FR.UTF-8", 
            "LC_TELEPHONE": "fr_FR.UTF-8", 
            "LC_TIME": "fr_FR.UTF-8", 
            "LESSOPEN": "||/usr/bin/lesspipe.sh %s", 
            "LOGNAME": "centos", 
            "LS_COLORS": "rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=01;05;37;41:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=01;36:*.au=01;36:*.flac=01;36:*.mid=01;36:*.midi=01;36:*.mka=01;36:*.mp3=01;36:*.mpc=01;36:*.ogg=01;36:*.ra=01;36:*.wav=01;36:*.axa=01;36:*.oga=01;36:*.spx=01;36:*.xspf=01;36:", 
            "MAIL": "/var/spool/mail/centos", 
            "PATH": "/home/centos/v/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/home/centos/.local/bin:/home/centos/bin", 
            "PWD": "/home/centos", 
            "SELINUX_LEVEL_REQUESTED": "", 
            "SELINUX_ROLE_REQUESTED": "", 
            "SELINUX_USE_CURRENT_RANGE": "", 
            "SHELL": "/bin/bash", 
            "SHLVL": "2", 
            "SSH_AUTH_SOCK": "/tmp/ssh-SnmSY88SWC/agent.7880", 
            "SSH_CLIENT": "77.12.136.38 57662 22", 
            "SSH_CONNECTION": "77.12.136.38 57662 149.202.171.22 22", 
            "SSH_TTY": "/dev/pts/0", 
            "TERM": "xterm", 
            "USER": "centos", 
            "VIRTUAL_ENV": "/home/centos/v", 
            "XDG_RUNTIME_DIR": "/run/user/1000", 
            "XDG_SESSION_ID": "6", 
            "_": "/home/centos/v/bin/python" 
        }, 
        "ansible_eth0": {
            "active": true, 
            "device": "eth0", 
            "ipv4": {
                "address": "149.202.171.22", 
                "netmask": "255.255.255.255", 
                "network": "149.202.171.22" 
            }, 
            "ipv6": [
                {
                    "address": "fe80::f816:3eff:fe93:af66", 
                    "prefix": "64", 
                    "scope": "link" 
                }
            ], 
            "macaddress": "fa:16:3e:93:af:66", 
            "module": "virtio_net", 
            "mtu": 1500, 
            "promisc": false, 
            "type": "ether" 
        }, 
        "ansible_fips": false, 
        "ansible_form_factor": "Other", 
        "ansible_fqdn": "test", 
        "ansible_hostname": "test", 
        "ansible_interfaces": [
            "lo", 
            "eth0" 
        ], 
        "ansible_kernel": "3.10.0-229.el7.x86_64", 
        "ansible_lo": {
            "active": true, 
            "device": "lo", 
            "ipv4": {
                "address": "127.0.0.1", 
                "netmask": "255.0.0.0", 
                "network": "127.0.0.0" 
            }, 
            "ipv6": [
                {
                    "address": "::1", 
                    "prefix": "128", 
                    "scope": "host" 
                }
            ], 
            "mtu": 65536, 
            "promisc": false, 
            "type": "loopback" 
        }, 
        "ansible_machine": "x86_64", 
        "ansible_machine_id": "dae72fe0cc064eb0b7797f25bfaf69df", 
        "ansible_memfree_mb": 1433, 
        "ansible_memory_mb": {
            "nocache": {
                "free": 1798, 
                "used": 155
            }, 
            "real": {
                "free": 1433, 
                "total": 1953, 
                "used": 520
            }, 
            "swap": {
                "cached": 0, 
                "free": 0, 
                "total": 0, 
                "used": 0
            }
        }, 
        "ansible_memtotal_mb": 1953, 
        "ansible_mounts": [
            {
                "device": "/dev/vda1", 
                "fstype": "xfs", 
                "mount": "/", 
                "options": "rw,seclabel,relatime,attr2,inode64,noquota", 
                "size_available": 9673138176, 
                "size_total": 10722455552, 
                "uuid": "a78bb152-e525-4f0e-961a-bf6147ac7d3e" 
            }
        ], 
        "ansible_nodename": "test", 
        "ansible_os_family": "RedHat", 
        "ansible_pkg_mgr": "yum", 
        "ansible_processor": [
            "GenuineIntel", 
            "Intel Xeon E312xx (Sandy Bridge)" 
        ], 
        "ansible_processor_cores": 1, 
        "ansible_processor_count": 1, 
        "ansible_processor_threads_per_core": 1, 
        "ansible_processor_vcpus": 1, 
        "ansible_product_name": "OpenStack Nova", 
        "ansible_product_serial": "NA", 
        "ansible_product_uuid": "NA", 
        "ansible_product_version": "2014.2.3", 
        "ansible_python_version": "2.7.5", 
        "ansible_selinux": false, 
        "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHBj+KPNRpcKQKimeYY4Cvkd/G35We2s017AGLsSr+stoAS+cZ3vhnTNMxQOZtf4bCHSJDXPIr0wAWDPGxwO6oY=", 
        "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABAQDk7M7ld5ct3zFakLYAoasgOKngKcNb1+LCJRhcyr1A1FCNS28UAl39UD2wEgX8L7VXoFwfJgLfEWR1M+JcTtNq3SbFQCPMLuCcIyHZAM0CR/vfRn1oJSkDOAIWozDUbAwrz4BJZhiw4pO7ex/srbvh8UW8IaTJ8BOMlySaeallGA7Lk6EPF7TyeVkNi+7vQaWWR0vWaVbLekbDpp9HLoQSfBACksnHNti1lm1Z3r5uuL7VUrIVZZTbn4c+nXQ6IGctMlD65iJE00wjvLBnRJ+EQkwCgtH361iz3bPv7MEEKrOuc2gY8t06EEcqDcygjLIuBjoE1BraVZdhKz/uzOfR", 
        "ansible_swapfree_mb": 0, 
        "ansible_swaptotal_mb": 0, 
        "ansible_system": "Linux", 
        "ansible_system_vendor": "OpenStack Foundation", 
        "ansible_user_dir": "/home/centos", 
        "ansible_user_gecos": "Cloud User", 
        "ansible_user_gid": 1000, 
        "ansible_user_id": "centos", 
        "ansible_user_shell": "/bin/bash", 
        "ansible_user_uid": 1000, 
        "ansible_userspace_architecture": "x86_64", 
        "ansible_userspace_bits": "64", 
        "ansible_virtualization_role": "host", 
        "ansible_virtualization_type": "kvm", 
        "module_setup": true
    }, 
    "changed": false
}
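
Note ansible_distribution above: the image reports "RedHat", not "CentOS". That is what steers the play toward redhat_7.yml and the subscription-manager tasks seen in comment #18. A hedged illustration of the kind of vars-file selection involved (a simplification of what ceph-cm-ansible actually does):

    facts = {'ansible_distribution': 'RedHat',
             'ansible_distribution_major_version': '7'}
    vars_file = '%s_%s.yml' % (facts['ansible_distribution'].lower(),
                               facts['ansible_distribution_major_version'])
    print(vars_file)  # redhat_7.yml
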
#17

Updated by Loïc Dachary over 8 years ago

  • Status changed from Resolved to 12
#18

Updated by Loïc Dachary over 8 years ago

Despite lsb-release being installed, the run still fails:

2015-08-19T17:25:29.256 INFO:teuthology.task.ansible.out:
    "msg": "Host target173081.teuthology is running RedHat 7.1.1503 (Core)"
}

TASK: [common | Including major version specific variables.] ******************

2015-08-19T17:25:29.264 INFO:teuthology.task.ansible.out:ok: [target173081.teuthology] => (item=/home/ubuntu/src/ceph-cm-ansible_master/roles/common/vars/redhat_7.yml) => {"ansible_facts": {"rhsm_repos": ["rhel-7-server-rpms", "rhel-7-server-optional-rpms", "rhel-7-server-extras-rpms"]}, "item": "/home/ubuntu/src/ceph-cm-ansible_master/roles/common/vars/redhat_7.yml"}
2015-08-19T17:25:29.266 INFO:teuthology.task.ansible.out:

TASK: [common | Set entitlements_path] ****************************************

2015-08-19T17:25:29.274 INFO:teuthology.task.ansible.out:ok: [target173081.teuthology] => {"ansible_facts": {"entitlements_path": "/etc/ansible/secrets/entitlements.yml"}}
2015-08-19T17:25:29.275 INFO:teuthology.task.ansible.out:

TASK: [common | Include Red Hat encrypted variables.] *************************

2015-08-19T17:25:29.283 INFO:teuthology.task.ansible.out:ok: [target173081.teuthology] => {"censored": "results hidden due to no_log parameter"}
2015-08-19T17:25:29.284 INFO:teuthology.task.ansible.out:

TASK: [common | Set have_entitlements] ****************************************

2015-08-19T17:25:29.291 INFO:teuthology.task.ansible.out:ok: [target173081.teuthology] => {"ansible_facts": {"have_entitlements": "False"}}
2015-08-19T17:25:29.292 INFO:teuthology.task.ansible.out:

TASK: [common | Determine if node is registered with subscription-manager.] ***

2015-08-19T17:25:29.396 INFO:teuthology.task.ansible.out:failed: [target173081.teuthology] => {"censored": "results hidden due to no_log parameter", "changed": false, "failed": true, "rc": 2}
2015-08-19T17:25:29.398 INFO:teuthology.task.ansible.out:
...ignoring

TASK: [common | Set rhsm_registered] ******************************************

2015-08-19T17:25:29.409 INFO:teuthology.task.ansible.out:ok: [target173081.teuthology] => {"ansible_facts": {"rhsm_registered": "False"}}
2015-08-19T17:25:29.411 INFO:teuthology.task.ansible.out:

TASK: [common | Register with subscription-manager.] **************************

2015-08-19T17:25:29.419 INFO:teuthology.task.ansible.out:skipping: [target173081.teuthology] => (item={'skipped': True, 'censored': 'results hidden due to no_log parameter', 'changed': False})
2015-08-19T17:25:29.420 INFO:teuthology.task.ansible.out:

TASK: [common | Get list of enabled repos] ************************************

2015-08-19T17:25:29.427 INFO:teuthology.task.ansible.out:skipping: [target173081.teuthology]
2015-08-19T17:25:29.429 INFO:teuthology.task.ansible.out:

TASK: [common | Store list of enabled repos] **********************************

2015-08-19T17:25:29.437 INFO:teuthology.task.ansible.out:skipping: [target173081.teuthology]
2015-08-19T17:25:29.438 INFO:teuthology.task.ansible.out:

TASK: [common | Set replace_repos if rhsm_repos differs from repo_list] *******

2015-08-19T17:25:29.445 INFO:teuthology.task.ansible.out:skipping: [target173081.teuthology]
2015-08-19T17:25:29.446 INFO:teuthology.task.ansible.out:

TASK: [common | Set replace_repos if newly-subscribed] ************************

2015-08-19T17:25:29.455 INFO:teuthology.task.ansible.out:skipping: [target173081.teuthology]
2015-08-19T17:25:29.456 INFO:teuthology.task.ansible.out:

TASK: [common | Disable all rhsm repos] ***************************************

2015-08-19T17:25:29.463 INFO:teuthology.task.ansible.out:fatal: [target173081.teuthology] => error while evaluating conditional: replace_repos
2015-08-19T17:25:29.465 INFO:teuthology.task.ansible.out:

FATAL: all hosts have already failed -- aborting
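
The fatal at the end follows from the skips above: both "Set replace_repos ..." tasks were skipped, so the "Disable all rhsm repos" conditional evaluates a variable that was never defined. Ansible conditionals are Jinja2 expressions, and the failure class can be illustrated like this (a sketch, not ansible's actual evaluation code):

    from jinja2 import Environment, StrictUndefined
    from jinja2.exceptions import UndefinedError

    env = Environment(undefined=StrictUndefined)
    try:
        env.from_string('{{ replace_repos }}').render()
    except UndefinedError as e:
        print(e)  # 'replace_repos' is undefined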

#20

Updated by Loïc Dachary over 8 years ago

  • Status changed from 12 to Resolved