Ansible triggered by a consumed RabbitMQ message

At work we settled on Ansible Tower to take care of scheduling and triggering Ansible playbook runs. During the evaluation of Tower, though, the question was always there: “Do we need Tower? Can we trigger Ansible playbooks another way with existing systems?”

Tower is the right choice for us at the moment due to all the features it brings, such as role-based access control (RBAC), API interfaces, logging and much more.

At home the question still intrigued me: how to trigger an Ansible playbook run in an elegant manner?

Whilst it would be simple and effective enough to SSH into a box and execute an ansible-playbook command line (ssh user@box ‘ansible-playbook playbook.yml’), it didn’t feel elegant.

Under the hood, Tower uses RabbitMQ, Celery, PostgreSQL and Django.  I wondered what it would take to trigger an Ansible playbook run via a RabbitMQ message.

Ansible-runner to the rescue

https://github.com/ansible/ansible-runner states:

A tool and python library that helps when interfacing with Ansible directly or as part of another system whether that be through a container image interface, as a standalone tool, or as a Python module that can be imported. The goal is to provide a stable and consistent interface abstraction to Ansible.

Fantastic!  From the documentation site: https://ansible-runner.readthedocs.io/en/latest/

Ansible Runner represents the modularization of the part of Ansible Tower/AWX that is responsible for running ansible and ansible-playbook tasks and gathers the output from it. It does this by presenting a common interface that doesn’t change, even as Ansible itself grows and evolves.

Workflow

Breaking down the requirement, I wanted to:

  1. Connect to a RabbitMQ server and subscribe to a queue.
  2. Upon receiving a message on the queue, trigger an Ansible Runner playbook execution or ad-hoc command.
  3. Print out results to console.
  4. Sit and wait for the next message on the queue.

Python packages required

The following packages make this very easy.

pika is a Python package for talking to RabbitMQ (https://pika.readthedocs.io/):

Pika is a pure-Python implementation of the AMQP 0-9-1 protocol that tries to stay fairly independent of the underlying network support library.

ansible-runner is as discussed above; check out the documentation at https://ansible-runner.readthedocs.io/en/latest/.

The script below basically smashes together two examples:

  • https://ansible-runner.readthedocs.io/en/latest/python_interface.html#usage-examples
  • https://pika.readthedocs.io/en/stable/examples/blocking_consume.html

You obviously need a RabbitMQ server and a queue set up to run this.
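If you need to create the queue, a minimal pika sketch along the following lines will declare it and publish a test message. This assumes a local broker with the default guest/guest credentials and the ‘hello’ queue that the consumer script below subscribes to.

import pika

# Assumes a local RabbitMQ broker with the default guest/guest credentials
connection = pika.BlockingConnection(
    pika.URLParameters('amqp://guest:guest@localhost:5672/%2F'))
channel = connection.channel()

# Declare the queue the consumer script subscribes to (idempotent)
channel.queue_declare(queue='hello')

# Publish a test message via the default exchange
channel.basic_publish(exchange='', routing_key='hello', body='run it!')
connection.close()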


Using pika and ansible-runner to execute ansible via a message queue

The example code is available here: https://github.com/Im0/ansible-runner-rabbitmq

import ansible_runner
import pika

'''
Just a rough and ready example of combining ansible_runner with pika.  Pika subscribes to a
queue and, when a message arrives, grabs the message and fires ansible-runner.

Basically smashing together examples from:
* https://ansible-runner.readthedocs.io/en/latest/python_interface.html#usage-examples
* https://pika.readthedocs.io/en/stable/examples/blocking_consume.html

Consider:
* If content-type of the rabbitmq message is JSON... load that into a dict.
* Using data from the queue in the ansible-runner (extravars).
* How to pass large data to ansible-runner.
* Security considerations
'''

def on_message(channel, method_frame, header_frame, body):
    # Fires for each message consumed from the queue; acks once the run completes
    print(method_frame.delivery_tag)
    print(body)
    print()
    exec_ansible_runner()
    channel.basic_ack(delivery_tag=method_frame.delivery_tag)

def exec_ansible_runner():
    # Use private_data_dir if you want the output of the ansible run saved
    #r = ansible_runner.run(private_data_dir='/tmp/demo', host_pattern='localhost', module='shell', module_args='whoami')
    r = ansible_runner.run(json_mode=True, host_pattern='localhost', module='shell', module_args='whoami')
    print("{}: {}".format(r.status, r.rc))
    # successful: 0
    for each_host_event in r.events:
        print(each_host_event['event'])
    print("Final status:")
    print(r.stats)

def main():
    url = 'amqp://guest:guest@localhost:5672/%2F'
    parameters = pika.URLParameters(url)
    connection = pika.BlockingConnection(parameters)
    channel = connection.channel()
    # pika >= 1.0 expects the queue name first; keywords make the order explicit
    channel.basic_consume(queue='hello', on_message_callback=on_message)
    try:
        channel.start_consuming()
    except KeyboardInterrupt:
        channel.stop_consuming()
    connection.close()

if __name__ == "__main__":
    main()


This was only intended as a basic example, which could be extended; a couple of the ‘Consider’ items from the docstring are sketched below.
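For instance, here is a rough sketch of the first two ideas: load a JSON message body into a dict and feed it to ansible-runner as extravars. This is a hypothetical drop-in replacement for the on_message handler above; the private_data_dir, playbook name and message shape are all assumptions.

import json

import ansible_runner

def on_message(channel, method_frame, header_frame, body):
    # Hypothetical extension: JSON message bodies become Ansible extra vars
    extravars = {}
    if header_frame.content_type == 'application/json':
        extravars = json.loads(body)
    # Assumes /tmp/demo/project/playbook.yml exists (ansible-runner's directory layout)
    r = ansible_runner.run(private_data_dir='/tmp/demo',
                           playbook='playbook.yml',
                           extravars=extravars)
    print("{}: {}".format(r.status, r.rc))
    channel.basic_ack(delivery_tag=method_frame.delivery_tag)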


Documentation as Code System (DACS)

A few months ago I wrote a post about Documentation as Code, which was an extension of a proof of concept (POC) I put together at work.  Simply put: build some awesome gold templates for your documentation in a markup/markdown format of your choice [1], check them into the SCM [2] of your choice, and then have a system collect information for you and fill in the blanks.

There seems to be a bit of interest in automatically updating documentation in the #cicd channel on https://networktocode.slack.com, so I thought I’d post a few more musings.

[1] POC was using AsciiDoc and AsciiDoctor
[2] POC was using Git

Documentation as Code System (DACS)

Lacking a more creative name, DACS seemed to portray what I was trying to achieve.

The idea was to create a few containers/microservices to carry out the tasks needed to produce documentation.  Some distinct functions come to mind:

  • Web front end (basic; to start a new project and start builds)
  • Source control (external, or interchangeable).  Contains:
    • Gold templates
    • Clone templates which become live artifacts
  • CMDB – Define systems to be documented
    • Hostname
    • IP details
    • Credentials / access method
  • Data gathering
    • Filling the gaps by substituting data into the templates and checking the results into the repo (see the sketch after this list)
    • Could also build the learnt CI CMDB and relationships between configuration items here
  • Documentation builders (Asciidoctor in the POC, but interchangeable if other markup/markdown formats are desired)
    • Building the HTML/PDF/desired format
  • Delivering the generated documentation
    • Uploading to various locations (SCP/FTP/SharePoint/documentation store)
    • Emailing to end user
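As a rough illustration of the ‘filling the gaps’ step, here is a minimal Python sketch. The gold template, facts and output filename are all hypothetical (the POC used AsciiDoc, rendered with Asciidoctor), and real facts would come from the data gathering service rather than being hard-coded.

from string import Template

# Hypothetical gold template with $-placeholders; the POC used AsciiDoc
GOLD_TEMPLATE = """\
= $hostname
:revdate: $gathered_at

== Network
IP address: $ip_address
"""

# Facts would come from the data gathering service; hard-coded for the sketch
facts = {
    'hostname': 'web01',
    'gathered_at': '2019-02-01',
    'ip_address': '192.0.2.10',
}

# Substitute the gaps and write the live artifact, ready to check into the repo
with open('web01.adoc', 'w') as f:
    f.write(Template(GOLD_TEMPLATE).substitute(facts))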


The Dream

Architects building a solution pick from gold templates and create a new solution.

  • Consider building gold templates for Business Service Catalogues (BSC) and Technical Service Catalogues (TSC) items.

Build engineers build the solution and add the devices to DACS.  DACS audits the environment and fills in the documentation blanks.  Repositories and end users are then updated with freshly minted documentation.

Future changes in the environment are then captured and documentation updated by:

  • Scheduled periodic tasks to ensure documentation is up-to-date
  • Triggers from CI/CD pipelines when a change is made in the environment
  • Manual triggers by the change implementer

Diagrams could also be built from the CMDB and relationship information gathered during data gathering, with the resulting diagrams inserted into the documentation repository/end document.
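As a minimal sketch of that idea (the relationships and output file are hypothetical), the learnt configuration item pairs could be written out as a Graphviz DOT file for a diagram builder to render:

# Hypothetical CMDB relationships learnt during data gathering
relationships = [
    ('lb01', 'web01'),
    ('web01', 'db01'),
]

# Emit a Graphviz DOT file; render with e.g. `dot -Tpng topology.dot -o topology.png`
with open('topology.dot', 'w') as f:
    f.write('digraph cmdb {\n')
    for source, target in relationships:
        f.write('  "{}" -> "{}";\n'.format(source, target))
    f.write('}\n')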

No more decaying documentation lying around. 🙂

Data flow

Trying to map out the data flow…