In addition to the changes to install.py, some more changes were needed -
- In the ~/.teuthology.yaml file, added - test_user: jenkins # On our test nodes the test user is "jenkins" instead of "ubuntu"
- gitbuilder_host: http://download.suse.de/SUSE/Products/ceph/images/iso/ # This is where the teuthology install task picks up
the ISO from and creates its zypper repo. This bit was easier for me because we were running teuthology tests against SLES12 only, using the ISO
from our master branch only.
- Changed the lab domain to suse.de
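Putting the entries above together, the relevant part of my ~/.teuthology.yaml looked roughly like this. This is a sketch from memory; in particular, lab_domain is my assumption for the key teuthology reads the lab domain from:

```yaml
# Sketch of the relevant ~/.teuthology.yaml entries (values from the notes
# above; the lab_domain key name is an assumption, not verified here)
test_user: jenkins        # our test nodes use "jenkins" instead of "ubuntu"
gitbuilder_host: http://download.suse.de/SUSE/Products/ceph/images/iso/
lab_domain: suse.de
```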
- Instead of fetching ceph-qa-suite at runtime (there was some complication involved that I cannot recall right now), I had ceph-qa-suite already cloned
on the same node where teuthology was deployed, and then passed the needed parameters when running teuthology-suite. Example -
teuthology-suite /home/jenkins/github.suse/automation/teuthology/examples/suites/common.yaml -c firefly -v -s rados/basic -m teuthida -d sles --suite-dir /home/jenkins/src/ceph-qa-suite_firefly/ -e ceph-bugs@suse.de
- The common.yaml file mentioned above looks like this -
check-locks: false
nuke-on-error: false
unlock_on_failure: true
os_version: 12
os_type: SLES
overrides:
  workunit:
    branch: firefly
I had some issues with the check-locks and nuke-on-error features, so I had to disable them.
- Make sure you pass the correct username when calling the canonicalize_hostname method in misc.py
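To illustrate why the username matters there, here is a minimal self-contained sketch of the idea behind canonicalize_hostname (this is not teuthology's actual code, and the LAB_DOMAIN constant is illustrative): it expands a short node name into "user@host.lab_domain", and its default user is "ubuntu", which is wrong for our nodes.

```python
# Illustrative sketch of what canonicalize_hostname in misc.py does;
# NOT the real teuthology implementation. LAB_DOMAIN stands in for the
# lab_domain configured in ~/.teuthology.yaml.
LAB_DOMAIN = "suse.de"

def canonicalize_hostname(hostname, user="ubuntu"):
    # Strip any existing user@ prefix so the caller-supplied user wins
    if "@" in hostname:
        hostname = hostname.split("@", 1)[1]
    # Append the lab domain if the name is not fully qualified
    if "." not in hostname:
        hostname = "%s.%s" % (hostname, LAB_DOMAIN)
    return "%s@%s" % (user, hostname)

# With the default user this would produce "ubuntu@node1.suse.de",
# hence the need to pass user="jenkins" on our nodes:
print(canonicalize_hostname("node1", user="jenkins"))
```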
Besides the install.py changes, most of the other changes were only needed when I wanted to run teuthology-suite.
Running individual teuthology tasks from a specific yaml file did not present many issues.
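For reference, running an individual task meant pointing the teuthology command at a single job yaml. A minimal sketch of such a file follows; the roles, target hostname, and host key placeholder are all illustrative, not taken from our actual runs:

```yaml
# Hypothetical minimal job yaml for a single teuthology run,
# invoked as: teuthology job.yaml
# Roles, target, and host key below are placeholders.
roles:
- [mon.a, osd.0, osd.1, osd.2, client.0]
targets:
  jenkins@node1.suse.de: <ssh host key>
tasks:
- install:
- ceph:
```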