Using the Docker API to Automate Deployment

We would like to share a Docker migration carried out for one of our clients, a software vendor. The context is a full-stack JavaScript application (D3, AngularJS, Node.js, MongoDB), but the approach applies to any technology stack running on Linux.

Context

To automate builds and deployments to QA and production, we had previously written Node.js scripts. The application's code lives in private GitHub repositories, and the road to QA was to produce an archive file, copy it to staging, reinstall the application, test (and re-test), and only then tag the repository with a version number and ship the archive to our clients. We could certainly have improved our scripts with deployment tools like Puppet or Chef, but our solution was easy to write, efficient, and reliable.

Had we never needed to deploy the application at our clients' sites, we might have kept this solution in place. Unfortunately, client deployments were rarely successful.

Our application consists of roughly ten services, usually bundled into 3 LXC containers. Even though only one or two services change in a typical release, deploying a complete running context is painful. At the top of the list of problems our clients faced were, first, approximate execution of our release instructions, and second, modifications to the client's OS environment made without notifying us.

So we decided to review our target production platform and deliver our application as Docker containers.

About Docker

Virtualization isolates you from the hardware, whereas Docker isolates you from the operating system and helps you manage services. It is a new abstraction layer: no more painful installation steps, just push and pull, and everything runs smoothly.

Docker Stack (Source : docker.com)

PROS:

  • Simple Deployment: installing all components directly on a VM or in LXC containers is sometimes tricky, mainly because of the diversity of components (Node.js, RabbitMQ, MongoDB, ZeroMQ). With Docker, this installation becomes our responsibility: images are pushed to Docker Hub, and clients only have to pull them into their datacenter and run them. All containers are isolated from each other and linked only by their functional connections.

  • Simple Command Line Interface: all containers are managed the same way (run, start, stop, rm) regardless of their content and service. No need to study and learn hundreds of dedicated CLIs and write a ton of documentation.

  • Simple Upgrade: to upgrade, just pull the new image, stop the existing container, and run the new one. The new image ships with its whole execution environment, so nothing has to be tuned. If something goes wrong, it is our responsibility, not the client's deployment team's.

CONS:

We need to manage all the services as a whole:

  • Application: Docker alone cannot define the whole application as a composition of several containers; we are stuck at the container level.

  • Orchestration: there is no built-in way to automatically deploy all of the application's containers across a cluster of hosts; we still have to do it manually.

Many providers are working hard on these gaps, and solutions should be available shortly (e.g. Kubernetes, Docker Swarm).

Code to Deploy

Our first attempt at scripting Docker was to replace the shell commands in our Node.js deployment script with Docker commands:

// spawn the Docker CLI: docker build --rm=true -t <imageName> .
function buildImage(context, cb){
    doSpawn(
        'docker',
        ['build', '--rm=true', '-t', context.imageName, '.'],
        {cwd: context.dockerDir},
        cb
    );
}

Of course it works, but correctly handling return statuses is sometimes difficult, and the code is plain ugly.

This is the wrong way to do it. Docker exposes a remote HTTP API to communicate with the Docker daemon; the native Docker CLI itself appears to be built on this API. Thanks to Pedro Dias and Dockerode, Node.js developers can use this API at a good level of abstraction.

Let's rewrite buildImage:

var Docker = require('dockerode');
var docker = new Docker(); // connects to /var/run/docker.sock by default

function buildImage(context, cb){
    // configuration for the container to create from the freshly built image
    var config = {
        Image: context.imageName,
        name: context.containerName,
        Hostname: context.containerName,
        AttachStdin: false,
        AttachStdout: false,
        AttachStderr: false,
        Tty: false
    };

    docker.createContainer(config, cb);
}

The Git repository dockerized provides a full code sample to deploy an application with 2 commands:

  • build --hash 96b7d64: set up a Dockerfile, create a Docker image based on a specific commit, check whether a running container already exists, remove it, and then run the new one.

  • push --hash 96b7d64 --tag 0.1.6: clone the git repository, check out the given commit, tag it with the new version, and push the image to Docker Hub with the new tag.

Using the Docker API is a huge change compared to spawning Linux commands: our data domain now includes Image, Container, and all of Docker's object types. Our full-stack JavaScript setup is no longer limited to development; it now covers QA and production as well. It definitely breaks down the old-fashioned border between development and production teams, with a positive impact on organisation, quality and costs. We are clearly entering a new age of DevOps.