This post appeared originally in our sysadvent series and has been moved here following the discontinuation of the sysadvent microsite.

While dropping root account passwords completely in favour of sudo is an option in many cases, we prefer keeping root passwords around for when we need direct console access. We keep these passwords in an encrypted password-store (we will write about this in a later blog post this season), and change them when someone should no longer have access or the passwords approach three months in age.

We prefer “horse” passwords for ease of communicating verbally, and use “diceware” with the EFF large word list to generate the password, but changing the playbook to use “apg” or similar should be simple. For Ubuntu, the diceware .deb (dead) from newer releases can also be installed on Xenial without issues.
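
To illustrate, generating such a passphrase by hand uses the same flags the playbook uses further down (the output below is a made-up example; every run is random):

# Generate a lowercase, space-separated passphrase from the EFF large word list
$ diceware --no-caps -d ' ' -w en_eff
crispy outbid sulfate jaws unrest cubicle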

Ansible

We use Ansible extensively for day to day operations, so it was only natural to create a playbook for the password change. While Ansible does have a rather fragile “interface” (playbooks you write can sometimes stop working even on non-major Ansible upgrades), the small effort of keeping playbooks up to date is more than worth it considering how much work and time Ansible saves. A task like replacing the root password can easily be done on hundreds of hosts with a single command, as long as you’ve got a playbook for it.

Ansible inventory

I won’t go into much detail, but we’re using a dynamic inventory pulling information directly from our CMDB, which contains a master list of all nodes for every rig (project) we manage. The method described in this post works just as well with a static inventory file.
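
For reference, a minimal static inventory file (hypothetical group and host names, matching the example commands at the end of this post) could look like this:

# inventory.ini -- hypothetical hostgroups and hosts
[acme]
web01.example.com
db01.example.com

[ancme]
app01.example.com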

The process

The procedure we want to execute for each host group is basically:

  1. Generate a new password
  2. Update the password-store entry for the host
  3. Update the password on the server

When procedures are automated this way, new features tend to appear over time (“hey, why don’t we add this bit as well?”), and the full playbook we’ve ended up with does a lot of extra checking/verification: ensuring that the required tools are present, pre-hashing the password before transferring it to the server, forcing a recheck of the Icinga checks that monitor password age, etc.

The playbook

Let’s do a breakdown of the playbook.

First is a prep section, where we check for the required software and versions. This is executed locally (on “localhost”).


- hosts: localhost # run locally
  any_errors_fatal: true # stop if anything is wrong
  become: false # do as myself, not root

  tasks:

We need to ensure that we’ve been given the “-e hostlist=$something” argument.


- name: Assert hostlist variable is set
  ansible.builtin.assert:
    that:
      - "false" # always fails, but only evaluated when the 'when' condition below matches
    msg: playbook must be executed with '-e hostlist=$hosts', where $hosts can be hostgroup or fqdn
  when: (hostlist is undefined) or (hostlist | trim == '')

Next, make sure that our locally installed Ansible is fresh enough. Ansible supplies its own version number in the “ansible_version” variable (a dict with a “string” key, among others), so we just check that.


- name: Assert ansible version
  ansible.builtin.assert:
    that:
      - "ansible_version.string is version('2.2', '>=')"
    msg: Ansible 2.2 or above is required

In addition to the base Ansible installation, we also need the Ansible “expect” module version 3.3 or above. This module is installed through an additional package. Debian and RedHat based systems have different checks.

Note that “ansible_os_family” returns what the distribution is based on, not the distribution itself. “Debian” is Debian-based systems (including Ubuntu), and “RedHat” is RedHat-based systems (including CentOS and Fedora).

To check versions, we first run dpkg-query or rpm, and store (“register”) the result of the execution for later use. The resulting dict/hash contains keys like stdout (all the stdout in a string), stdout_lines (array of strings, one per line of output), stderr, stderr_lines, rc (exit value), and more. We then ensure that the stdout and rc of the command are what we need.


- name: Look up python-pexpect version (osfamily=debian)
  ansible.builtin.command:
    cmd: "dpkg-query --showformat='${Version}' --show python-pexpect"
  register: pexpect_version
  failed_when: false
  when: ansible_os_family == 'Debian'

- name: Assert python-pexpect version (osfamily=debian)
  ansible.builtin.assert:
    that:
      - "pexpect_version.rc == 0"
      - "pexpect_version.stdout is version('3.3', '>=')"
    msg: python-pexpect 3.3 or above is required
  when: ansible_os_family == 'Debian'

- name: Look up pexpect version (osfamily=redhat)
  ansible.builtin.command: rpm -q --qf "%{VERSION}" pexpect
  register: pexpect_version
  failed_when: false
  when: ansible_os_family == 'RedHat'

- name: Assert pexpect version (osfamily=redhat)
  ansible.builtin.assert:
    that:
      - "pexpect_version.rc == 0"
      - "pexpect_version.stdout is version('3.3', '>=')"
    msg: pexpect 3.3 or above is required
  when: ansible_os_family == 'RedHat'
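
If you want to see what ends up in the registered variable, running the Debian-based lookup by hand shows exactly what Ansible captures as stdout and rc (the version will of course differ on your system):

$ dpkg-query --showformat='${Version}' --show python-pexpect
4.6.0-1
$ echo $?
0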

We also need to ensure that diceware is installed. We do this by just running “diceware --version”, and making sure that it had an exit value of 0, through the same “register” mechanic as above. If you prefer another password generator, just replace “diceware” with something else.


- name: Probe diceware
  ansible.builtin.command: diceware --version
  register: diceware_present
  failed_when: false

- name: Assert diceware installed
  ansible.builtin.assert:
    that:
      - diceware_present.rc == 0
    msg: diceware password generator is required

We’ve now got all the prerequisites needed to do the password change. As a final prep we do a git-pull of the password-store Git repository, to avoid merge conflicts when we update passwords in it.


- name: Git pull password-store
  ansible.builtin.command: "pass git pull --rebase"

All the prep work is now finished. It’s time to start the main task-list that will loop once for each host in the supplied hostlist(s), generating a unique password for each host, setting it in the password-store, and setting it on the host in question.


- name: Update root password on {{ hostlist }}
  hosts: "{{ hostlist }}" # run this list of tasks once for each host in the
                          # supplied hostlist
  become: false # do it as my own user (not root) by default
  any_errors_fatal: true # stop the playbook completely if something fails
  serial: 1 # stop "pass insert" from tripping over its own feet
  vars:
    # Settings for a Nagios/Icinga server with a password age check that needs a
    # forced refresh after setting the password.
    icinga_server: monitoring.example.com
    icinga_check: Root password age

  tasks:

Our dynamic Ansible inventory defines some variables for each customer project, which are only used in the password-store file path. Remove these or replace them with your own; if you do, also change the password path in the password-store further down.


- name: Assert required variables are defined
  delegate_to: localhost # run it on local workstation instead of remote server
  ansible.builtin.assert:
    that:
      - "inventory_hostname is defined"
      - "project_team is defined"
      - "project_handle is defined"
    msg: not all of project_team, project_handle and inventory_hostname are defined
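
If you use a static inventory instead, these variables could come from host_vars; a hypothetical example with made-up values:

# host_vars/someserver.example.com.yml -- hypothetical example values
project_team:
  - ops
project_handle: acme

With these values the password for this host would end up at ops/acme/someserver.example.com/root in the password-store.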

Now we can finally generate a password. Swap out diceware with your password generator of choice. Output should be one password (and only that); “apg -n 1” is a good alternative. The “register: dice” stores stdout (amongst other keys) in the “dice” hash/dict for use in the following tasks.


- name: Generate new password
  delegate_to: localhost
  ansible.builtin.command: diceware --no-caps -d ' ' -w en_eff
  register: dice

The next task is to insert the password into the password-store by running “pass insert”.

We use the Ansible “expect” module to submit the password when “pass insert” asks for it. The “expect” module searches command output for a given regex, and submits a given string as response. In this case “(?i)password” is the regex (a case insensitive search for “password”), which will prompt Ansible to respond with the stdout from the above password generation command.

Some Ansible inventory variables for the hosts in question are used to construct the path to the password in the password-store. Modify the path in the command to suit your environment and password-store hierarchy.


- name: Update password in password-store
  delegate_to: localhost # run it on local workstation instead of remote server
  ansible.builtin.expect:
    command: "pass insert -f {{ project_team.0 }}/{{ project_handle }}/{{ inventory_hostname }}/root"
    responses:
      (?i)password: "{{ dice.stdout }}"
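
For context, this is roughly what “pass insert -f” looks like when run interactively (path taken from the hypothetical host_vars example above; prompt wording may differ slightly between pass versions):

$ pass insert -f ops/acme/someserver.example.com/root
Enter password for ops/acme/someserver.example.com/root:
Retype password for ops/acme/someserver.example.com/root:

Both prompts contain the word “password”, so the case-insensitive regex answers each of them with the generated passphrase.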

Next we generate the password hash. We want to do this locally to avoid sending the password in clear-text to the host in question, as that might end up in Ansible temp-files. We again use the “expect” module to submit the earlier generated password when mkpasswd asks for it. We again use “register” to store the result for later.


- name: Generate password hash
  delegate_to: localhost # run it on local workstation instead of remote server
  ansible.builtin.expect:
    command: mkpasswd --method=sha-512
    responses:
      (?i)password: "{{ dice.stdout }}"
  register: password_hash

This is the task that actually sets the password on the remote host. The Ansible “user” module takes a sha512-hashed password and updates it on the host in question (note the absence of a “delegate_to”).

We use the last line of the stdout of the previous “mkpasswd” command, since some versions of mkpasswd are chatty on stdout, stating generic info before the password hash itself.


- name: Change password for root user
  become: true # do this as root
  ansible.builtin.user:
    name: root
    password: "{{ password_hash.stdout_lines[-1] }}"

Finally, we have an Icinga check that alerts us when a root password is due for change. The Icinga check is only executed once a day, so we force a recheck to immediately get it back to an OK state. Just delete this block if it is not relevant for your needs.


- name: "Recheck icinga {{ icinga_check }} check"
  delegate_to: "{{ icinga_server }}" # do this on the monitoring server
  become: true # do this as root
  community.general.nagios:
    action: command
    command: SCHEDULE_FORCED_SVC_CHECK;{{ inventory_hostname }};{{ icinga_check }};{{ ansible_date_time.epoch }}
  changed_when: 0 # never trigger changed flag
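
The nagios module essentially just writes an Icinga/Nagios external command to the command file on the monitoring server. Done by hand it would look roughly like this (the command file path varies between installations, so treat this as a hypothetical example):

# Hypothetical manual equivalent, run on the monitoring server
$ now=$(date +%s)
$ printf '[%s] SCHEDULE_FORCED_SVC_CHECK;someserver.example.com;Root password age;%s\n' \
    "$now" "$now" > /var/lib/icinga/rw/icinga.cmd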

The above task was the last in the main task-list that does the actual password change. After the above list of tasks has been executed for each host in the host-list, the new task-list below is run once.

The only thing we need to do here, is git-push the password-store we’ve been adding our new passwords to.


- name: Git push password-store
  hosts: localhost
  gather_facts: no
  become: false

  tasks:
    - name: Git push password-store
      ansible.builtin.command: pass git push

The playbook is used with an -e argument to say which host-list/hosts should be updated. Multiple host-lists can be supplied, separated by colons. Some examples:

# Change password on all hosts in the "acme" hostgroup
$ ansible-playbook root-password-change.yml -e hostlist=acme

# Change password on all hosts in the "acme" and "ancme" hostgroups
$ ansible-playbook root-password-change.yml -e hostlist=acme:ancme

# Change password on someserver.example.com
$ ansible-playbook root-password-change.yml -e hostlist=someserver.example.com

Result

The playbook makes it easy to keep root passwords fresh even on projects with a lot of hosts, since you can just start it and let it do its work while you do other tasks. Ideally we’d remove the “serial: 1” and change passwords on all the hosts in parallel, but “pass insert” sometimes failed when we did that.
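
One possible middle ground (untested in this setup) is to keep the play parallel but put “throttle: 1” on just the password-store task, so only the local “pass insert” invocations are serialized; the “throttle” keyword is available from Ansible 2.9:

# Hypothetical sketch: serialize only this task instead of the whole play
- name: Update password in password-store
  delegate_to: localhost
  throttle: 1 # one host at a time for this task, the rest of the play stays parallel
  ansible.builtin.expect:
    command: "pass insert -f {{ project_team.0 }}/{{ project_handle }}/{{ inventory_hostname }}/root"
    responses:
      (?i)password: "{{ dice.stdout }}"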

The full playbook can be downloaded here.


Update

  • 2024-01-29 Marked link to download diceware as dead.

Jimmy Olsen

at Redpill Linpro
