Why use Docker?

Introduction

Containers and Docker have been among the key buzzwords in the IT world over the last few years, driven in part by the aggressive growth of cloud services such as Amazon AWS and Microsoft Azure. It all started with the release of Docker 1.0 in 2014, which aimed to package container-based development in a much more convenient and efficient way.

This effort has taken the industry by storm, and companies are now adopting Docker at a rapid pace. Let us see why Docker technology is so popular and why one should use it.

Virtualisation

To better understand Docker and its convenience, it helps to glance into the past to see what existed earlier and what Docker is replacing. Before Docker, the standard solution was to use virtual machines, which relied on virtualisation technology.

Virtualisation runs guest operating systems on top of a host operating system. This technique soon became popular among developers, as it allowed multiple operating systems to run in separate virtual machines, all on a single host. Apart from the ability to run multiple operating systems, the virtualisation approach was a huge success for two reasons:

1. Cost effectiveness

Virtualisation saved money on infrastructure: you need just one physical machine instead of three or four to run multiple operating systems.

2. Better failure handling

Maintenance and recovery were easier when failures occurred, as everything for the multiple systems could be managed through the host OS and host hardware.

The following diagram depicts how virtualisation works on top of a host machine:


As the diagram shows, three applications are deployed, each with its own guest operating system. All three applications and their respective operating systems sit on top of the host operating system and host hardware.

But as time went by, the virtualisation technique too proved inefficient. Running multiple operating systems on a single host OS turned out to be a sluggish affair: each guest operating system carried its own kernel files, libraries and other dependencies, which were bulky and resource intensive, especially in terms of main memory and processing power.

On top of this, there was another bottleneck: boot time. With virtualisation, each virtual machine takes more than a minute to boot, which makes it difficult to bring instances online for real-time applications.

Containerisation to the rescue

As discussed in the previous section, the demerits of virtualisation led to an active search for solutions, and the containerisation approach emerged as a result. Containerisation is a form of virtualisation that brings the abstraction to the operating system level. Unlike virtualisation, which abstracts at the hardware level, containerisation runs the app directly on the host operating system, eliminating the guest operating system altogether. A container isolates the application it runs together with all the dependencies and OS-level interactions the application needs.

The diagram below shows containerisation on a host server:


Here the container engine layer is responsible for allocating host OS resources to the respective apps.

The advantages of containerisation are:

1. The primary advantage of this technique is that the host resources which were otherwise consumed by guest operating systems are freed up entirely; the OS resources the app needs are drawn from the host OS itself. This makes containers faster.

2. More applications can be hosted on the same host machine compared to virtual machines. This is possible because a container is lightweight and smaller than a virtual machine running the same application. A larger number of apps can therefore be hosted per machine, which is a huge saving on the infrastructure and monetary side of deployment.

3. The startup time of a container is significantly lower than that of a virtual machine. It usually takes only a few seconds for a container to come up, compared to the minute-long boot times of virtual machines.
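The startup difference is easy to observe informally. A minimal sketch, assuming Docker is installed, the daemon is running, and the small public `alpine` image is available:

```shell
# Time how long it takes to start a container, run a no-op, and tear it down.
# Once the alpine image is cached locally, this typically completes in well
# under a second -- compare that with booting a full virtual machine.
time docker run --rm alpine true
```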

Docker

The benefits that containerisation brought were huge, but initially they could not be harnessed to the fullest. This was primarily because no container engine existed that could interact well with the wide variety of host operating systems, and this bottleneck stalled the advance of container technology. The wild success of Docker is often attributed to the fact that it delivered a container engine that achieved platform independence, while also providing a simple and efficient way to port the application containers it creates. In other words, Docker brought container technology to the masses. Let us have a brief look at the Docker components and the Docker workflow in the next sections.

Docker components

1. Docker daemon

The Docker daemon is the Docker engine that runs on the host operating system. Currently, the daemon runs natively only in a Linux environment.

2. Docker CLI

Communication with the Docker engine/daemon happens through the Docker command line interface (CLI). From here we can type in commands and manage container building in Docker.
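As a sketch of this client/daemon interaction, here are a few everyday commands the CLI sends to the daemon (all require Docker installed and the daemon running):

```shell
docker version          # show the CLI (client) and daemon (server) versions
docker images           # ask the daemon to list locally stored images
docker ps -a            # list containers, both running and stopped
docker run hello-world  # pull a tiny test image and run it as a container
```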

The following figure shows the two components on a native Linux OS and in Windows/Mac OS X environments. In a non-Linux environment we need to set up a virtual Linux host on top of the host OS in order to use Docker, but this is a simple and easy process with tools such as boot2docker.


Docker workflow components

A simplified version of the Docker workflow can be visualised in the diagram below:


1. Docker image

A Docker image is a self-contained file system that includes all the requirements and metadata for running a Docker container. Only the resources specified in the image are allotted to the container.

2. Docker containers

A running instance of a Docker image is called a Docker container. Multiple instances of the same image can be run simultaneously to create multiple containers.
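As a sketch, two independent containers can be started from one image (assuming Docker is running and using the public `nginx` image as an example):

```shell
# Start two detached containers from the same image
docker run -d --name web1 nginx
docker run -d --name web2 nginx

# Both containers share the same parent image but run in isolation
docker ps --filter "ancestor=nginx"
```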

3. Docker registry

A Docker registry allows the storage of images in both public and private repositories, much like GitHub does for code.
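A sketch of typical interactions with the default public registry, Docker Hub; `<username>` is a placeholder for a Docker Hub account, and pushing requires `docker login` first:

```shell
docker pull alpine                      # download an image from the registry
docker tag alpine <username>/alpine:v1  # tag it under your own repository
docker push <username>/alpine:v1        # upload it to the registry
```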

4. Dockerfile

A Dockerfile is a file that documents how an image is built, and it can be used to reproduce that image on any system.
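As a sketch, here is a minimal Dockerfile for a hypothetical Node.js app (the file name `app.js` and image tag `my-app` are assumptions for illustration):

```shell
# Write a minimal Dockerfile for a hypothetical single-file Node.js app
cat > Dockerfile <<'EOF'
FROM node:18-alpine
WORKDIR /app
COPY app.js .
CMD ["node", "app.js"]
EOF

# Anywhere this Dockerfile exists, the identical image can be rebuilt
# (requires a running Docker daemon):
# docker build -t my-app .
```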

Why use Docker?

Let us now move on to the benefits of using Docker in light of the discussion so far.

1. Performance improvement due to the container architecture, as discussed in the containerisation section.

2. Higher density of applications

Using the same host server and OS, the number of apps that can be packaged using Docker is considerably higher than with virtual machine based app deployments.

3. Portability and scaling

Another important benefit of Docker is easy portability and scalability. In a VM based setup, each set of packages has to be installed in every VM instance, whereas in the Docker world we just instantiate the same image again. This saves a lot of time and resources when porting and scaling.

4. Standard environments across teams

Dockerisation enables us to implement standard environments across different teams. For example, the development environment can be made exactly the same as the production environment, so an app working in the dev environment will also work in production.

5. Suppose we want to run a Node.js script on our system and we don't have Node installed. All we have to do is pull the Node image from Docker Hub and run the script inside it. In effect, we need only Docker on the host operating system for our development process, without installing other packages.
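A minimal sketch of this, assuming a hypothetical script named `hello.js`; the official `node` image comes from Docker Hub, and the `docker run` step needs a running daemon:

```shell
# A throwaway script; Node itself is never installed on the host
echo 'console.log("hello from a container");' > hello.js

# Mount the current directory into the container and run the script there
# (requires a running Docker daemon):
# docker run --rm -v "$PWD":/work -w /work node:18 node hello.js
```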

6. Another important benefit is a solution to one of the most frequent issues: handling different versions of a language. Suppose we have a piece of code we wish to run under both Python 2 and Python 3. In most cases this creates trouble, as installing the two versions of Python on one machine can cause them to interfere with each other, so we might need a few hacks to make it run. And if this code is run frequently, the trouble multiplies, because we have to perform the hacks each time. With Docker we can instead instantiate separate images as containers, one per language version, and run the piece of code in the respective container.
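A minimal sketch of the two-version setup, assuming a hypothetical `script.py`; the version-tagged `python` images come from Docker Hub, and the `docker run` steps need a running daemon:

```shell
# A script that reports which interpreter is running it
echo 'import sys; print(sys.version.split()[0])' > script.py

# Run the same file under both language versions, no host installs needed
# (requires a running Docker daemon):
# docker run --rm -v "$PWD":/work -w /work python:2.7  python /work/script.py
# docker run --rm -v "$PWD":/work -w /work python:3.11 python /work/script.py
```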

Conclusion

In this article, we have discussed why Docker is a good choice for adoption in IT infrastructure. We looked at the differences between the virtualisation and containerisation approaches, and at how Docker uses the containerisation technique and its workflow to make deployments friendlier.
