What is Docker? An Introduction to the World of Containerization

Jan 12, 2019 | Docker - Build, Ship and Deploy

Docker is gaining popularity, and the small buzz has become a roar as companies big and small move toward it. The reason for Docker's growing popularity is the way it resolves the challenges IT companies face in managing multiple technology stacks across multiple environments, along with the complexity involved in moving an application from a lower to a higher environment. So the first questions that come to mind are: why Docker, and what makes it so popular?

Why Docker?

Let’s imagine you are a developer working on a Python application which requires a specific version of Python and certain libraries to run correctly. On your laptop or in your development environment, the application does just fine, as it has all the dependencies it needs.

Now imagine development is complete and we have to move this application to a higher environment. We push the code to the repository or hand it to the DevOps team for deployment. Unfortunately, when the DevOps team deploys it on a server, it fails to run.

The reason behind the failure is that the application requires a runtime environment just like the one it had in development. To get the application working, you have to provide not only the application code but also the specific version of Python and the libraries that need to be installed on the server.

For the DevOps team, there are a few more challenges beyond installing the required software, as there could be other applications running on the same infrastructure or server:

  • Make sure the new software or any new library does not conflict with any other running application.
  • If there is a conflict, check whether there is any way to provide isolation, which could mean provisioning another application server or a virtual machine.

Well, provisioning a new server incurs additional cost, and a VM running on the same server is not a clean solution either, since a VM includes much more than just the application and is heavy to run on the host.

If there is no way to get around this, the one remaining option is to check whether code changes can make the application compatible with the others, which is not always possible.

In simple words, "it is the challenge of packaging any application, irrespective of language, frameworks, or dependencies, so that it can run anywhere, irrespective of the underlying OS, hardware, or infrastructure" — and Docker is the answer to this challenge.

Solution: A self-contained application container

The solution to the above problem comes from another problem we faced years ago, when the global economy was held back by the expense of ocean-going freight: goods came in different sizes and shapes and required different mechanisms to ship them.

Shipping containers answered this problem by providing a standard way to pack and ship goods. Because they come in a standard shape and size, they are easy to ship and accept worldwide, which significantly reduced the cost involved and boosted the world economy.

Similarly, a Docker container is a standard unit of software that packages up the code and all its dependencies so the application runs quickly and reliably from one computing environment to another.

Put simply, developers can now focus on bundling applications and dependencies as containers, without worrying about the underlying hardware or infrastructure, while administrators and DevOps teams can concentrate on managing containers without worrying about their contents.
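As a sketch of what this bundling looks like in practice, here is a minimal Dockerfile for the Python application from the earlier example. The base image tag, file names, and start command are illustrative assumptions, not a definitive setup:

```dockerfile
# Start from an image that already contains the exact Python version we need
FROM python:3.7-slim

WORKDIR /app

# Install the pinned library versions the application depends on
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Add the application code itself
COPY . .

# The command the container runs on start
CMD ["python", "app.py"]
```

Built with `docker build -t myapp .`, the resulting image carries the code, the interpreter, and the libraries together, so the same unit runs on a laptop and on a server.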

Unlike virtual machines, containers do not have the high overhead and hence enable more efficient usage of the underlying system and resources.

More than that!

Docker has several other advantages and benefits, which is why large companies like PayPal, Netflix, and Spotify keep using it and why it's growing so fast:

Rapid application deployment: containers include only the minimal runtime requirements of the application, reducing their size and allowing them to be deployed quickly.

Component reuse: containers reuse components from preceding image layers, which makes them noticeably lightweight and saves the bandwidth and storage required to store and transfer them.

Sharing: you can use a remote repository, much like GitHub, to share your container images with others. Docker Hub provides a registry for this purpose, and it is also possible to configure your own private registry.
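The sharing workflow boils down to a few commands. A hedged sketch, assuming a Docker Hub account named `myuser` and a locally built image called `myapp` (both hypothetical names):

```shell
# Log in to Docker Hub (prompts for credentials)
docker login

# Tag the local image with the registry namespace
docker tag myapp myuser/myapp:1.0

# Push it to the registry so others can pull it
docker push myuser/myapp:1.0

# On any other machine, pull the same image and run it
docker pull myuser/myapp:1.0
docker run --rm myuser/myapp:1.0
```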

Lightweight footprint and minimal overhead: Docker containers are lightweight executable packages that run directly on the host, avoiding the overhead of running a guest OS and leaving a smaller resource footprint.

So now that we understand why Docker, and how it makes life better for developers and system administrators, it's time for you to decide whether you really need Docker. And if you have decided to use it, let's talk about what it is.

What is Docker?

Docker is an open-source containerization platform which provides tools and services to package and run your application and all its dependencies in small, lightweight, standalone software units called containers. Docker is also the company that promotes and evolves this technology, working in collaboration with cloud, Linux, and Windows vendors.

To understand this a little deeper, let's see what software containers are.

What is a Container? And VM vs Containers

We have already touched on this definition several times, so let's state it once more: "A container is a standard unit of software that packages up code and all its dependencies so the application runs quickly and reliably from one computing environment to another."

Okay! Let's think about it one more time: what would we do if we had to package an entire application along with its runtime? The one simple answer is a VM.

VMs, or virtual machines, are a great tool for packaging an entire application along with its runtime, including the OS itself, and they isolate it well: other applications or VMs running on the same host are unlikely to affect the application running inside the VM. However, this comes at a great cost, as running a guest OS carries the computational overhead of virtualizing hardware.

Containers, on the other hand, use the host system's kernel: instead of virtualizing the entire hardware stack, they virtualize only the OS. Each container shares the host kernel to communicate with the hardware, and sometimes binaries and libraries too, which significantly reduces resource overhead. This makes containers very lightweight (megabytes in size), and they take only seconds to start, while a VM can take minutes to do the same.
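You can see this startup speed for yourself once Docker is installed. A small sketch, assuming the public `alpine` image (a few megabytes in size) is reachable:

```shell
# Pull a tiny Linux image and run a single command in it;
# after the first pull, the container starts in well under a second
docker run --rm alpine echo "hello from a container"
```

Compare that with booting a full virtual machine, which has to bring up an entire guest OS before it can run anything.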

Docker Architecture

We will be talking more on different components of Docker architecture and client commands as we progress with docker. For now, as shown in diagram Docker uses a client-server architecture. The Docker client talks to the Docker daemon, which does the heavy lifting of a building, running, and distributing your Docker containers. Docker client and daemon can run on the same system, or you can connect a Docker client to a remote Docker daemon.
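A quick way to see the client-server split on your own machine, assuming Docker is installed (the remote host address below is a placeholder, not a real endpoint):

```shell
# Prints two sections: Client (the CLI you invoke) and Server (the daemon)
docker version

# The same client can also talk to a daemon on another machine,
# for example by pointing DOCKER_HOST at it
DOCKER_HOST=tcp://remote-host:2375 docker ps
```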


What’s Next?

In the next article, we will install Docker on our system, run a container on it, and walk through the basic Docker components and commands.