Installing and using the AWS CLI and aws-shell in an agile fashion

In our work we often encounter AWS installations under various accounts. Using the web-based console is straightforward, but for repetitive tasks and more advanced interaction we often rely on Amazon’s CLI for AWS.

The first question people often have when needing to use the CLI is how to go about installing it. It’s distributed as a Python package, and Amazon recommends using pip, a Python package manager, to install it system-wide. Pip itself is not a part of most Python distributions, so it must first be installed. At this point some developers might feel a bit like Alice falling down the rabbit hole, and if you want your whole team to use the CLI then getting past this step translates to quite a few hours of time in total. Additionally, many developers, this one included, think that making system-wide modifications to one’s computer to satisfy the working environment of a single software project is a really, really bad way of working.
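For context, the system-wide route looks something like this. The exact commands vary by platform, and these are shown purely as an illustration of the global changes involved:

```
# Bootstrap pip (if your Python distribution doesn't ship it)...
curl -O https://bootstrap.pypa.io/get-pip.py
python get-pip.py
# ...then install the CLI system-wide -- exactly the kind of
# machine-global modification many developers would rather avoid.
pip install awscli
aws --version
```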


Once you’ve managed to get pip and the CLI installed system-wide, you can run it: you’ll run “aws configure”, enter your creds, and you’re set. Now what happens when the next project comes along that requires you to use the CLI? The short story is that you can configure multiple accounts under ~/.aws and switch between them using your system-wide setup. I’ll leave out the details but will confess to thinking this dynamic is a tad hairy.
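For the record, the multiple-account setup amounts to named profiles in ~/.aws/credentials, roughly like this (the profile names and key placeholders here are made up):

```ini
# ~/.aws/credentials -- one section per account, all lumped together
[default]
aws_access_key_id     = AKIA...ONE
aws_secret_access_key = ...

[projectb]
aws_access_key_id     = AKIA...TWO
aws_secret_access_key = ...
```

You then pick an account per invocation with “aws s3 ls --profile projectb”, or for the whole session with “export AWS_PROFILE=projectb”. Workable, but every project’s credentials live in one global place on your machine.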

Python virtual environments

A while ago I started using virtualenv, a kind of virtualized container for Python code, to isolate the CLI installation and thus not make it system-wide. The big thing this approach solves is that you can include the CLI setup as part of your source code repo: clone it, enter the virtualenv, do the “aws configure” thing, and go. It was much quicker and simpler to get other team members started with doing deploys and the like. Unfortunately this approach still has some drawbacks. Sometimes it wouldn’t work because of incompatible Python versions, and of course all the credentials still got lumped under ~/.aws. I also think committing a virtualenv instance to version control is bad. Still, this approach was a big step in the right direction. Now, how to improve on it?
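In outline, that earlier approach looked like the following. This is a sketch using Python 3’s built-in venv module, which behaves the same as the standalone virtualenv tool for our purposes:

```shell
# Create an isolated environment next to the project...
python3 -m venv awsenv
# ...and install the CLI into it, leaving the system Python untouched.
# (Requires network access. Credentials entered via "aws configure"
# afterwards still end up in the shared ~/.aws directory.)
awsenv/bin/pip install awscli
```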

Ephemeral, account-specific CLI instances

Here’s what I’m using nowadays:

#!/bin/bash -e

THISDIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
TEMPDIR="$( mktemp -q -d -t tmp.XXXXXX )"
CONFFILE="$THISDIR/awsenv.conf"

if [ -s "$CONFFILE" ]; then
        source "$CONFFILE"
else
        printf "Enter an identifier for PS1, blank if unsure: "
        read PS1_PREFIX
        printf "Enter default AWS region: "
        read AWS_DEFAULT_REGION
        printf "Enter AWS access key ID: "
        read AWS_ACCESS_KEY_ID
        printf "Enter AWS secret access key (will not echo): "
        read -s AWS_SECRET_ACCESS_KEY
        echo
        printf "Additional PyPI packages to include, space separated: "
        read PYPI_PACKAGES
        {
                echo "PS1_PREFIX=\"$PS1_PREFIX\""
                echo "AWS_DEFAULT_REGION=\"$AWS_DEFAULT_REGION\""
                echo "AWS_ACCESS_KEY_ID=\"$AWS_ACCESS_KEY_ID\""
                echo "AWS_SECRET_ACCESS_KEY=\"$AWS_SECRET_ACCESS_KEY\""
                echo "PYPI_PACKAGES=\"$PYPI_PACKAGES\""
        } > "$CONFFILE"
fi

INITSTR="python $TEMPDIR/virtualenv/virtualenv.py $TEMPDIR/awsenv --prompt=\"$PS1_PREFIX \" && \
        source $TEMPDIR/awsenv/bin/activate && \
        pip install awscli $PYPI_PACKAGES && \
        alias deactivate=exit && \
        export PATH=$THISDIR:$TEMPDIR/awsenv/bin:$PATH && \
        export AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID && \
        export AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY && \
        export AWS_DEFAULT_REGION=$AWS_DEFAULT_REGION && \
        if [ -s ~/.bashrc ]; then source ~/.bashrc; fi && \
        if [ \$SHELL == '/bin/bash' ]; then complete -C '$TEMPDIR/awsenv/bin/aws_completer' aws; fi"

git clone --depth 1 https://github.com/pypa/virtualenv.git $TEMPDIR/virtualenv

bash --init-file <(echo "$INITSTR")

rm -rf "$TEMPDIR"

The above shell script can be had at
So what does it do? Well, it solves the gripes I’ve had with installing the CLI and using it for multiple projects under separate AWS accounts.

The script requires Python, git, and bash to run, all of which are typically already installed on a developer’s system. Let’s look through the important parts of the script:

  1. It tries to read “awsenv.conf” from the dir that it resides in, prompting you for creds and creating the file if it doesn’t exist.
  2. It downloads the latest version of virtualenv to a temporary directory.
  3. It enters that virtualenv and installs awscli within it.
  4. On exit, it removes the temporary dir.

Pretty simple, right? Absolutely nothing is installed or left behind except for the conf file, which can be manually edited or removed. The biggest win for me is that I can embed this into the local repository of some project I’m working on, and by simply running it I’m instantly using the CLI under the correct account, and so can everyone else working on the same repository.
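In practice it looks something like this. The repository URL, script name, and prompt prefix here are hypothetical:

```
git clone git@example.com:team/some-project.git
cd some-project
./awsenv.sh              # prompts for creds on the first run only
(someproject ) aws s3 ls # subshell under the correct account
(someproject ) exit      # everything but awsenv.conf is cleaned up
```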

On a side note...

Amazon is currently developing a thing it calls aws-shell, which is kind of like the CLI on steroids. If you want to run it under awsenv then all you need to do is add “aws-shell” to the additional PyPI packages section of the configuration, and then type “aws-shell” while in the awsenv.
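Concretely, assuming the prompts from the script above (the prompt prefix is hypothetical):

```
$ ./awsenv.sh
...
Additional PyPI packages to include, space separated: aws-shell
...
(someproject ) aws-shell
aws> s3 ls
```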
