Testing changes when contributing to Ansible

This is by no means a detailed post about testing Ansible.  The main reason I'm writing this down is that I know I'll have forgotten it all in a few months' time.

Goal:

  • Create an Ansible plugin filter to calculate the distance between two coordinates (haversine)
  • Write and test the required unit tests (and integration?) for the code to be accepted into Ansible
  • Raise a PR to the main project

Background

I was writing an Ansible playbook to check the local traffic site for accidents.  The site returns a list of incidents state-wide, which is great, but I was only interested in a 10 km radius.  For example:

 .....
 with_items: "{{ traffic.json.features }}"
 when: item | distance(myLong, myLat, item.geometry.coordinates.1, item.geometry.coordinates.0)|int < 10

This looped over a variable with with_items and then passed some coordinates through a custom filter plugin called distance.  (Note: distance was a custom filter plugin which I stored in the filter_plugins directory.  I figured I'd go a little further and try to submit it to Ansible core.)
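For context, a standalone filter plugin like that lives in a filter_plugins/ directory next to the playbook and exposes its filters through a FilterModule class.  The snippet below is a rough, hypothetical reconstruction of such a distance.py rather than the exact plugin I used, but the maths is the same haversine formula that ends up in the core submission further down:

# filter_plugins/distance.py -- hypothetical sketch of the standalone plugin
from math import radians, sin, cos, sqrt, asin

from ansible.errors import AnsibleFilterError


def distance(item, lat1, lon1, lat2, lon2):
    # The piped-in item is ignored; only the four coordinates matter.
    try:
        lat1, lon1, lat2, lon2 = (float(x) for x in (lat1, lon1, lat2, lon2))
    except (ValueError, TypeError) as e:
        raise AnsibleFilterError('distance() only accepts floats: %s' % e)
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return round(6371 * 2 * asin(sqrt(a)), 2)  # 6371 km: mean Earth radius


class FilterModule(object):
    def filters(self):
        return {'distance': distance}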

Fork

  • I’ve forked ansible/ansible from github into my own repo.
  • I’ve cloned the repo to my development machine
  • Changed directory into the freshly cloned ansible directory and ran `source ./hacking/env-setup`.

Finding where to add the filter code

Haversine (Wikipedia) is a maths formula, so it probably belongs alongside the other Ansible maths filters such as power, logarithm, min, max and friends.

Grepping through the source for the filters above brought up this:

./lib/ansible/plugins/filter/mathstuff.py

Perfect!  Hopefully.

The code…

Added a function to this file:

def haversine(measurement, lat1, lon1, lat2, lon2):
    from math import radians, sin, cos, sqrt, asin

    # Earth's mean diameter in miles ('m') and kilometres ('km')
    diameter = {
        'm': 7917.5,
        'km': 12742}

    try:
        dlat = radians(float(lat2) - float(lat1))
        dlon = radians(float(lon2) - float(lon1))
        lat1 = radians(float(lat1))
        lat2 = radians(float(lat2))
        a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
        c = 2 * asin(sqrt(a))
    except (ValueError, TypeError) as e:
        raise errors.AnsibleFilterError('haversine() only accepts floats: %s' % str(e))

    if measurement in diameter:
        # radius (diameter / 2) * central angle c = great-circle distance
        return round(diameter[measurement] / 2 * c, 2)
    else:
        raise errors.AnsibleFilterError('haversine() can only be called with km or m')
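For the new function to actually be available as a haversine filter, it also needs an entry in the filters() map of the FilterModule class at the bottom of mathstuff.py.  Something along these lines (the neighbouring entries here are illustrative, not an exact copy of the file):

class FilterModule(object):
    ''' Ansible math jinja2 filters '''

    def filters(self):
        filters = {
            # ... existing entries such as 'min', 'max', 'pow' and 'log' ...
            'haversine': haversine,
        }

        return filters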

Testing

Ansible has a great test setup.  I started with the unit tests.

Once again, I needed to find out where the maths tests hung out.  Grepping through the test directory found:  test/units/plugins/filter/test_mathstuff.py

Copying the format of existing tests I added a few tests:

# pytest, AnsibleFilterError and the `ms` alias for mathstuff are already imported at the top of test_mathstuff.py
class TestHaversine:
    def test_haversine_non_number(self):
        with pytest.raises(AnsibleFilterError, match=r'haversine\(\) only accepts floats'):
            ms.haversine('km', 'a', 'b', 'c', 'd')
        with pytest.raises(AnsibleFilterError, match=r'haversine\(\) only accepts floats'):
            ms.haversine('m', 'a', 'b', 'c', 'd')
        with pytest.raises(AnsibleFilterError, match=r'haversine\(\) can only be called with km or m'):
            ms.haversine('z', '35.9914928', '-78.907046', '-33.8523063', '151.2085984')

    def test_km(self):
        assert ms.haversine('km', '35.9914928', '-78.907046', '-33.8523063', '151.2085984') == 15490.46

    def test_m(self):
        assert ms.haversine('m', '35.9914928', '-78.907046', '-33.8523063', '151.2085984') == 9625.31

This was to test:

  • That proper float values were passed through to the function
  • That a proper measure was used (km or m)
  • That two known distances were calculated and rounded correctly (see the quick sanity check below)
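Before handing over to ansible-test, the new function can also be poked at from a Python shell at the root of the checkout (with the hacking env-setup sourced).  This is a quick way to eyeball the rounding; the expected values in the comments are simply the ones asserted in the unit tests above:

# Quick interactive sanity check from the root of the ansible checkout
import ansible.plugins.filter.mathstuff as ms

print(ms.haversine('km', '35.9914928', '-78.907046', '-33.8523063', '151.2085984'))  # 15490.46
print(ms.haversine('m', '35.9914928', '-78.907046', '-33.8523063', '151.2085984'))   # 9625.31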

The quickest way to test after making the change was:

ansible-test units --tox --python 2.7 test/units/plugins/filter/test_mathstuff.py

Other types of testing available in Ansible are covered in the project's testing documentation.

Test playbook

Of course, the better test is to ensure it's actually usable in a playbook:

- hosts: localhost
  tasks:
    - name: Haversine distance between two lon/lat co-ordinates
      debug:
        msg: "{{ 'km'|haversine('35.9914928', '-78.907046', '-33.8523063', '151.2085984') }}"

    - name: Haversine distance between two lon/lat co-ordinates
      debug:
        msg: "{{ 'm'|haversine('35.9914928', '-78.907046', '-33.8523063', '151.2085984') }}"

    - name: Haversine distance between two lon/lat co-ordinates
      debug:
        msg: "{{ 'm'|haversine('a35.9914928', '-78.907046', '-33.8523063', '151.2085984') }}"

Which resulted in:

$ ansible-playbook only_filter.yml

PLAY [localhost] *********************************************************************************************************************************************

TASK [Gathering Facts] ***************************************************************************************************************************************
ok: [localhost]

TASK [Haversine distance between two lon/lat co-ordinates] ***************************************************************************************************
ok: [localhost] => {
    "msg": "15490.46"
}

TASK [Haversine distance between two lon/lat co-ordinates] ***************************************************************************************************
ok: [localhost] => {
    "msg": "9625.31"
}

TASK [Haversine distance between two lon/lat co-ordinates] ***************************************************************************************************
fatal: [localhost]: FAILED! => {"msg": "haversine() only accepts floats: could not convert string to float: a35.9914928"}
	to retry, use: --limit @/home/johni/ansible/hacking/ansible/only_filter.retry

PLAY RECAP ***************************************************************************************************************************************************
localhost                  : ok=3    changed=0    unreachable=0    failed=1

Documentation as Code System (DACS)

A few months ago I wrote a post about Documentation as Code, which was an extension of a proof of concept (POC) I put together at work.  Simply put: build some awesome gold templates for your documentation in a markup/markdown format of your choice [1], check them into the SCM [2] of your choice, and then have a system collect information for you and fill in the blanks.

There seems to be a bit of interest in automatically updating documentation in the #cicd channel on https://networktocode.slack.com, so I thought I'd post a few more musings.

[1] POC was using AsciiDoc and AsciiDoctor
[2] POC was using Git

Documentation as Code System (DACS)

Lacking a more creative name, DACS seems to portray what I was trying to achieve.

The idea was to create a few containers/microservices to carry out the tasks needed to produce documentation.  Some distinct functions that come to mind:

  • Webfront end (basic, to start a new project and start builds)
  • Source Control (external or interchangeable). Contains:
    • Gold templates
    • Clone templates which become live artifacts
  • CMDB – Define systems to be documented
    • Hostname
    • IP details
    • Credentials / access method
  • Data gathering
    • Filling the gaps – substituting gathered data into the templates (and checking the results into the repo); see the sketch after this list
    • Could also build the learned CI CMDB and the relationships between configuration items here
  • Documentation builders (AsciiDoctor in the POC, but interchangeable if other markup/markdown formats are desired)
    • Building the HTML/PDF/desired format
  • Delivering the outputted documentation
    • Uploading to various locations (SCP/FTP/Sharepoint/Documentation store)
    • Emailing to end user
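To make the "filling the gaps" step a little more concrete, here's a minimal sketch of what the substitution stage could look like, assuming an AsciiDoc gold template with simple placeholders and a dict of facts handed over by the data-gathering step.  The template text, placeholder names and example facts are all hypothetical, not lifted from the POC:

# Hypothetical DACS substitution step
from string import Template

# Gold template snippet (AsciiDoc); in the POC these live in the Git repo.
GOLD_TEMPLATE = """\
== Server: ${hostname}

* Management IP: ${mgmt_ip}
* OS version: ${os_version}
"""

def render_document(template_text, facts):
    # Substitute gathered facts into a gold template, leaving the layout untouched.
    return Template(template_text).safe_substitute(facts)

# Facts would normally come from the CMDB / data-gathering microservice.
facts = {'hostname': 'web01', 'mgmt_ip': '10.0.0.5', 'os_version': 'Ubuntu 18.04'}
print(render_document(GOLD_TEMPLATE, facts))

# The rendered .adoc would then be committed back to the repo and handed to the
# documentation builder (AsciiDoctor in the POC) to produce HTML/PDF.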

 

The Dream

Architects building a solution pick from gold templates and create a new solution.

  • Consider building gold templates for Business Service Catalogues (BSC) and Technical Service Catalogues (TSC) items.

Build engineers build the solution and add the devices to DACS.  DACS audits the environment and fills in the documentation blanks.  Repositories and end users are then updated with freshly minted documentation.

Future changes in the environment are then captured and the documentation updated by:

  • Scheduled periodic tasks to ensure documentation is up-to-date
  • Triggered by CI/CD pipelines when a change is made in the environment
  • Manually triggered by the change implementer

Diagrams could also be built from the CMDB and relationship information gathered during the data-gathering step, and inserted into the documentation repository/end document.

No more decaying documentation lying around. 🙂

Data flow

Trying to map out the data flow….