Docker is a tool designed to make it easier to create, deploy, and run applications by using containers. Containers allow a developer to package up an application with all of the parts it needs, such as libraries and other dependencies, and ship it all out as one package.
It's a computer program that performs operating-system-level virtualization, and it's also known as "containerization". It was first released in 2013 and is developed by Docker, Inc. Solomon Hykes started Docker in France as an internal project within dotCloud, a platform-as-a-service company, with initial contributions by other dotCloud engineers including Andrea Luzzardi and Francois-Xavier Bourlet.
Jeff Lindsay also became involved as an independent collaborator. Docker represents an evolution of dotCloud proprietary technology, which is itself built on earlier open-source projects such as Cloudlets.
Docker runs software packages called "containers", which are isolated from each other and bundle their own application, tools, libraries, and configuration files. They can communicate with each other through well-defined channels. All containers share a single operating-system kernel and are therefore more lightweight than virtual machines.
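As a sketch of those well-defined channels, containers placed on the same user-defined Docker network can reach one another by container name while remaining isolated from everything else. The network, container, and image names below are illustrative:

```shell
# Create an isolated bridge network (name is hypothetical).
docker network create app-net

# Start two containers on that network; they can resolve each
# other by name ("db", "web") but are isolated from the host network.
docker run -d --name db --network app-net postgres:16
docker run -d --name web --network app-net my-web-app

# Inside "web", the database is reachable at the hostname "db".
```

Containers not attached to `app-net` cannot reach either service, which is exactly the kind of explicit, well-defined channel the paragraph describes.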
So, why should you use Docker in development?
Docker allows you to wrangle dependencies, from the operating system down to details such as R and LaTeX package versions. This helps make your analyses reproducible.
Since a Docker container can easily be sent to another machine, you can set up everything on your own computer and then run the analyses on e.g. a more powerful machine. You can send the Docker container to anyone who knows how to operate it.
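Moving your work to another machine can be as simple as exporting the image and loading it on the other side (strictly speaking, it is the image, i.e. the container's blueprint, that you ship). The image name and host below are placeholders:

```shell
# Export the image to a tarball (image name is hypothetical).
docker save my-analysis:latest -o my-analysis.tar

# Copy it to the more powerful machine, e.g. via scp.
scp my-analysis.tar user@bigmachine:/tmp/

# On the remote machine, load the image and run the analysis.
docker load -i /tmp/my-analysis.tar
docker run --rm my-analysis:latest
```

Pushing to a registry with `docker push` and pulling with `docker pull` achieves the same thing without copying tarballs by hand.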
Containers are created from "images" that specify their precise contents. Images are often created by combining and modifying standard images downloaded from public repositories.
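A minimal Dockerfile shows this pattern: start from a standard public image and layer your modifications on top. The file name, base image tag, and package versions below are illustrative assumptions, not a prescribed setup:

```dockerfile
# Illustrative sketch: versions and file names are placeholders.
# Start from a public base image with a pinned R version.
FROM rocker/r-ver:4.3.1

# Pin a package version so every build produces the same environment.
RUN R -e "install.packages('remotes'); remotes::install_version('dplyr', version = '1.1.2')"

# Add your own code on top of the standard image.
COPY analysis.R /home/analysis.R
CMD ["Rscript", "/home/analysis.R"]
```

Running `docker build -t my-analysis .` in the directory containing this Dockerfile produces an image with precisely these contents.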
Imagine you are working on an analysis in R and you send your code to a friend. Your friend runs exactly this code on exactly the same data set but gets a slightly different result. This can have various causes, such as a different operating system or a different version of an R package. Docker tries to solve problems like that: because a container bundles the entire environment, you can send it to your friends, and when they run your code inside it, they will get exactly the same results as you did.
PHP, Ruby, Java, and Node are among the languages and frameworks most commonly run in containers.
Because Docker containers are lightweight, a single server or virtual machine can run several containers simultaneously.
This allows the deployment of nodes to be performed as the resources become available or when more nodes are needed, allowing a platform as a service (PaaS)-style of deployment and scaling for systems such as Apache Cassandra, MongoDB, and Riak.
Docker implements a high-level API to provide lightweight containers that run processes in isolation.
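That per-process isolation is visible from inside a container: each `docker run` starts one process in its own namespace, which sees only its own process tree. A quick illustrative check (image tag is an assumption):

```shell
# Start a throwaway container and list its processes.
docker run --rm alpine:3.19 ps aux
# The output shows only the container's own processes,
# not those of the host or of other containers.
```

The `--rm` flag removes the container as soon as the process exits, underlining how cheap creation and teardown are.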
Docker reduces deployment to seconds, because it creates a container for every process rather than booting a full OS. Containers can be created and destroyed without worrying that the cost of bringing one up again will be prohibitive.
Docker ensures consistent environments from development to production. Docker containers are configured to maintain all configurations and dependencies internally.
Docker ensures your applications and resources are isolated and segregated. Docker makes sure each container has its own resources that are isolated from other containers.
You can have various containers for separate applications running completely different stacks.
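One common way to express this is a Compose file, where each service runs its own stack side by side. A hedged sketch, with service names and images chosen purely for illustration:

```yaml
# docker-compose.yml sketch; names and images are illustrative.
services:
  blog:
    image: php:8.2-apache    # one application on a PHP stack
  api:
    image: node:20-alpine    # another on a Node.js stack
    command: ["node", "server.js"]
  cache:
    image: redis:7           # a supporting service on yet another stack
```

`docker compose up` then starts all three containers together, each with its own isolated filesystem and dependencies.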
Docker helps you ensure clean app removal since each application runs on its own container. If you no longer need an application, you can simply delete its container. It won’t leave any temporary or configuration files on your host OS.
Docker also ensures that each application only uses resources that have been assigned to them.
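Those resource assignments are set when a container is launched. For example, memory and CPU caps can be passed as flags (the name, image, and limit values here are arbitrary examples):

```shell
# Limit this container to 512 MB of RAM and 1.5 CPU cores.
docker run -d --name worker --memory=512m --cpus=1.5 my-worker-image
```

A container that exceeds its memory limit is killed by the kernel rather than being allowed to starve its neighbours.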
From a security point of view, Docker ensures that applications that are running on containers are completely segregated and isolated from each other, granting you complete control over traffic flow and management. No Docker container can look into processes running inside another container.
Real-world uses of Docker
A 2016 analysis found that a typical Docker use case involves running five containers per host, but that many organizations run 10 or more.
Nanobox uses Docker (natively and with VirtualBox) containers as a core part of its software development platform.
Docker can be integrated into various infrastructure tools, including Amazon Web Services, Ansible, CFEngine, Chef, Google Cloud Platform, IBM Bluemix, HPE Helion Stackato, Jelastic, Jenkins, Kubernetes, Microsoft Azure, OpenStack Nova, OpenSVC, Oracle Container Cloud Service, Puppet, ProGet, Salt, Vagrant, and VMware vSphere Integrated Containers.
Red Hat's OpenShift PaaS integrates Docker with related projects (Kubernetes, Geard, Project Atomic and others) since v3 (June 2015). The Cloud Foundry Diego project integrates Docker into the Cloud Foundry PaaS. The Apprenda PaaS integrates Docker containers in version 6.0 of its product.
Jelastic PaaS provides managed multi-tenant Docker containers with full compatibility to the native ecosystem.
The Tsuru PaaS integrated Docker containers into its product in 2013, making it the first PaaS to use Docker in a production environment.
On October 15, 2014, Microsoft announced the integration of the Docker engine into the next Windows Server release, along with native support for the Docker client role in Windows. Since June 8, 2016, Docker can run natively on Windows 10 via Hyper-V Containers, to build, ship, and run containers using the Windows Server 2016 Technical Preview 5 Nano Server container OS image. Since then, a feature known as Windows Containers has been available for Windows 10 and Windows Server 2016.
Among the most serious issues developers point to when adopting the technology is the vulnerability of the target software. For instance, if you expose containers through web servers via an API, you will have to think through the parameter-verification process carefully. In particular, you must make sure an attacker cannot pass manipulated data along with a request, which could trigger the creation of new containers.
You can share your comments with us in the comment section. Thank you!
Photograph by hafakot