Bootstrap and automate server and desktop configs using Ansible Pull

Last updated on 2021-03-14 Tagged under  # ansible  # linux

On remote hosts, set up Ansible to automatically run and update the host using the latest changes stored in a git repository.

Let's go!

In the past I used a shell script I wrote to set up devices after a minimal install of Debian. Ansible can also bootstrap and provision a new device, and do so much more. I discovered its awesomeness in a couple of getting-started YouTube tutorials.

Ansible commonly operates in push mode. A central server has an inventory file that lists the devices it manages. Ansible connects to each device and the server pushes out changes to those devices.

However, Ansible can also operate in pull mode. In this case there is no need for a central server or device list. All the details are stored in a git repository and the device pulls down the changes and configures itself.

After running ansible-pull on the localhost for the first time, all further changes are automatically applied.

The above tutorials are enough to get started. Below are a few additional notes I made about getting everything set up for the first time.


This repository contains a copy of the GitLab-hosted Ansible configuration that I use for my laptops and servers.

PLEASE DON'T DIRECTLY USE THIS AGAINST YOUR OWN DEVICES, as it is something I developed for myself and may not translate to your use-case. It even configures OpenSSH, so if you connect remotely and run it you will get locked out. I've provided this as an example you can use to build your own, and compare syntax.


Debian Buster has ansible packages both in main and buster-backports but they are quite old (same with Ubuntu and Linux Mint). Instead of doing apt install ansible, I use Python's pip3 package tool to install the latest version.

On all (Debian/Ubuntu) devices that are going to run Ansible, first install git and python3-pip ...

$ sudo apt install git python3-pip

... then install ansible itself ...

$ sudo pip3 install ansible
$ ansible --version
ansible 2.10.6
  config file = None
  configured module search path = ['/home/dwa/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.7/dist-packages/ansible
  executable location = /usr/local/bin/ansible
  python version = 3.7.3 (default, Jul 25 2020, 13:03:44) [GCC 8.3.0]

Query the localhost device variables with ansible localhost -m setup. Example: Display the localhost distribution details ...

$ ansible localhost -m setup | grep distribution
[WARNING]: No inventory was parsed, only implicit localhost is available
        "ansible_distribution": "Debian",
        "ansible_distribution_file_parsed": true,
        "ansible_distribution_file_path": "/etc/os-release",
        "ansible_distribution_file_variety": "Debian",
        "ansible_distribution_major_version": "testing",
        "ansible_distribution_release": "bullseye",
        "ansible_distribution_version": "testing",

Multiple playbooks

A playbook is a list of automated tasks to be run by Ansible. When running Ansible in push mode, different roles can be specified for each device in the inventory file, and different tasks will be executed based on those roles.

One thing that tripped me up in the beginning with pull mode is that all the changes are made on the localhost. There is no inventory file, and no need for remote (via SSH) connections. So how do I mark a device as a server, a desktop, a workstation, etc., and only run the tasks designated for that particular role?

Solution: I create a local-DEVICE.yml playbook for each role. Example: For a server, I create local-server.yml, log in to that remote device, and run ...

$ sudo ansible-pull -U <git-repo-url> local-server.yml

Note that the -U option takes the URL of the git repository; the playbook filename is a separate argument.

When the device to be configured is a laptop or desktop, I swap local-server.yml for local-desktop.yml.
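A role-specific playbook can then pull in the shared "base" task files plus its own. A minimal sketch of what local-server.yml might look like, built from the task files mentioned elsewhere in this post (the exact layout is an assumption):

```yaml
# local-server.yml (sketch): runs against localhost only, importing
# shared base tasks plus server-specific ones.
- hosts: localhost
  connection: local
  become: true
  tasks:
    - import_tasks: tasks/base_sources.yml
    - import_tasks: tasks/base_users.yml
    - import_tasks: tasks/server_cron.yml
```

A local-desktop.yml would import the same base files but swap the server tasks for desktop ones.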


Tasks can be assigned tags. Example: This base_sources.yml task has three tags ...

- name: copy sources.list for debian buster
  tags: apt,settings,base

One advantage of using tags is that, instead of running the complete playbook each time you want to test/perform a subset of tasks, a tag can be specified and only tasks with the matching tag will be run.

Example: On the device where I'm developing my playbook, list all the tags created ...

$ ansible-playbook --list-tags local-server.yml

playbook: local-server.yml

  play #1 (localhost): localhost	TAGS: []
      TASK TAGS: [always, apt, base, cron, git, packages, server, settings, ssh, sudo, systemd, users]

Any [WARNING]: provided hosts list is empty ... messages can be safely ignored. Pull mode on localhost will only be running Ansible on itself.

List the tasks associated with a tag (which also includes any task tagged with always) ...

$ ansible-playbook --list-tasks --tags ssh local-server.yml

playbook: local-server.yml

  play #1 (localhost): localhost    TAGS: []
      update/upgrade system   TAGS: [always, apt]
      generate sshd_config from template    TAGS: [base, settings, ssh]

Run a playbook on a remote device, but limited to tasks with the packages tag ...

$ sudo ansible-pull --tags packages -U <git-repo-url> local-server.yml

Automate updates

After running ansible-pull on a device for the first time, the real power of Ansible is in automating all further changes. I create tasks/base_users.yml to add a user ansible that is tasked with running Ansible playbooks, and (for servers) tasks/server_cron.yml to add a cron job that runs the playbook every hour in the background.
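A sketch of what tasks/base_users.yml might contain, using Ansible's user module (the tags and home directory follow what this post describes; the details are assumptions):

```yaml
# tasks/base_users.yml (sketch): create a dedicated system account
# that will run the scheduled ansible-pull jobs.
- name: create ansible user
  tags: always,users
  user:
    name: ansible
    system: true
    create_home: true
    home: /home/ansible
```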

After the playbook's first run, /home/ansible is created, and a system user is added ...

$ grep ansible /etc/group

View crontab for ansible ...

$ sudo crontab -u ansible -l
#Ansible: ansible provision
@hourly /usr/local/bin/ansible-pull -o -U <git-repo-url> local-server.yml > /home/ansible/ansible.log

With the -o (--only-if-changed) option, the playbook local-server.yml is only run if there have been actual changes to the git repository. If nothing has changed, the entire job is skipped. Resource-friendly!
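The crontab entry above can itself be created by a task using Ansible's cron module (whose name parameter produces the "#Ansible: ansible provision" comment line). A sketch, with the repository URL left as a placeholder:

```yaml
# tasks/server_cron.yml (sketch): schedule hourly ansible-pull runs
# for the ansible user. Replace <git-repo-url> with your repository.
- name: ansible provision
  tags: cron,server
  cron:
    name: ansible provision
    user: ansible
    special_time: hourly
    job: "/usr/local/bin/ansible-pull -o -U <git-repo-url> local-server.yml > /home/ansible/ansible.log"
```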

» Later: Automatic security and other upgrades in Debian

« Earlier: Secure remote access using SSH keys