What is Docker?
Docker represents a revolution in the field of virtualization. If you're already a virtualization expert, you might want to skip some of the next sections. However, if you are new to this concept, a basic understanding of virtualization will be essential to help you grasp the value and potential of Docker.
Docker is a powerful open-source tool for containerization, an innovation that's revolutionizing the way we build, deploy, and run applications. With Docker, your applications are enclosed in isolated environments called "containers", allowing them to operate independently of the underlying operating system and ensure consistent performance, regardless of the environment in which they run.
Docker containers use advanced operating system-level virtualization technologies, including Linux cgroups and namespaces. These tools isolate critical system resources such as memory and processes, enabling the creation of lightweight and portable execution environments. Unlike traditional virtual machines, Docker containers don't require the installation of a full operating system or hypervisor, making them more efficient and streamlined.
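You can see these mechanisms at work from the command line. As a small sketch (the container name demo is just a placeholder), the following commands cap a container's memory and CPU, which Docker enforces through cgroups, and show that the container lives in its own process namespace:
# Start a container whose memory and CPU usage are capped by the kernel via cgroups
docker run -d --name demo --memory 256m --cpus 0.5 alpine sleep 300
# The container has its own PID namespace, so it only sees its own processes
docker exec demo ps
# Clean up the example container
docker rm -f demo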
One of the main advantages of Docker lies in its ability to overcome compatibility problems between different environments, making it easier to deploy applications on any operating system that supports Docker. Docker also makes it easy to set up development and test environments that closely mirror production, since developers can work with the same containers used in production. This strategic use of containers can also help reduce infrastructure costs, since many containers can run on a single machine, whether physical or virtual, eliminating the need for a dedicated machine for each application.
What is virtualization?
Let's begin to unravel the concept of virtualization with a simple metaphor: imagine you own a house and you have a friend who needs a place to stay. You have several possibilities to help your friend:
- You could invite your friend to share your bedroom, but that option could get uncomfortable really fast.
- You could build a new home for your friend on your property, but that might be too expensive.
- You could offer your friend to stay in the spare room, thus keeping your lives separate, while sharing some common resources like the kitchen and living room.
The third option represents the essence of virtualization. In computing, virtualization refers to the process of creating a virtual (or simulated) version of a hardware resource, such as a server, storage device, or network.
Now, let's say you want to run a web server on your computer, but you want to keep it separate from your existing operating system and applications. The solution? Virtualization. You can create a virtual machine (VM) on your system, which will host the web server. The VM works like a separate computer, with its own operating system and applications, but uses your computer's hardware resources, such as processor and RAM.
When you start the VM, you will see a completely new OS pop up in a window inside your current OS. This is the equivalent of inviting your friend to stay in your spare room: you share resources (in this case, your computer hardware resources), but maintain separation and independence. This powerful technology makes it possible to make the most of available hardware resources, reducing costs and improving the efficiency and flexibility of computer systems.
What's different about Docker? How is it different from traditional virtualization?
Docker represents an innovative and different approach to virtualization. While a traditional virtual machine encapsulates the entire operating system along with the running application, Docker uses a sharing approach, maximizing the common use of resources between virtualized systems. This strategy allows Docker to consume fewer resources when running and makes Docker containers easier to deploy, both for developers and for the production environment.
Traditional virtualization, offered by hypervisors such as VMware or Hyper-V, generates a separate execution environment known as a "virtual machine" (VM). In a VM, a complete operating system runs, allowing several operating systems to run concurrently on a single physical machine. However, this approach results in a significant overhead of system resources, as each VM needs its own memory, CPU, and disk space.
Instead, Docker employs a technology called containerization to create isolated execution environments, called "containers". These containers share the operating system kernel of the host machine. This means that, unlike VMs, Docker containers don't need an entire operating system to work, but use the shared resources of the host operating system. This feature allows Docker to create execution environments that are lighter and more portable than VMs. In fact, containers can be moved seamlessly between different physical or virtual machines, without the need for modifications.
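You can see this kernel sharing for yourself. On a Linux host (on macOS or Windows, Docker actually runs containers inside a small Linux VM, so the output differs), both of the following commands print the same kernel release, because the container has no kernel of its own:
# The host's kernel version
uname -r
# The same kernel, reported from inside a throwaway Alpine container
docker run --rm alpine uname -r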
In summary, while traditional virtualization leverages hypervisors to create completely separate virtual machines, Docker leverages containerization technology to generate isolated execution environments that share the kernel of the host operating system. This unique approach to virtualization makes Docker an efficient and flexible solution for developing, testing and deploying applications.
Docker plays a vital role for web developers, providing powerful tools and facilitating many day-to-day operations.
One of the main advantages of Docker is the ease of sharing development environments. If you and I were collaborating on a Node app, for example, we'd want to make sure we both had Node installed and were running the same version, so that our environments stayed consistent. Version inconsistencies can cause hard-to-find problems, since libraries and our own code may behave differently across different versions of Node.
A possible solution is to install the same version of Node for both, but if we already have other projects on our systems that require different versions of Node, we should consider installing NVM, a tool that allows us to switch versions easily. At this point, we could add an .nvmrc file to the project, specifying the version we intend to use. This process, however, can be quite labor intensive, and despite implementing all of these steps, we cannot guarantee that the environment will be the same for all developers.
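For reference, that .nvmrc file is nothing more than a one-line text file with the version the project needs (11.6.0 here, matching the Dockerfile shown later in this article):
# Create a hypothetical .nvmrc pinning the project's Node version
echo "11.6.0" > .nvmrc
# Every developer then switches to that version with:
nvm use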
Docker offers us a solution to these challenges by allowing us to provide the same development environment for all developers. With Docker, the process becomes:
- Install Docker.
- Write a Dockerfile.
- Run docker build -t <image-name> .
- Run docker run -p 3000:3000 <image-name>
This process may not seem much simpler than configuring Node/NVM, but it offers one major advantage: installing Docker is a one-time operation, regardless of the technology stack you intend to use. With Docker, instead of having to install specific software for each stack, you simply write a different Dockerfile (or Docker Compose file, depending on the complexity of your app).
A Dockerfile is a simple text file, with no extension, which defines the configuration of a Docker environment. For example, here's what a Dockerfile for a Node app might look like:
# This Docker image will be based on the Node 11.6 image
FROM node:11.6.0
# Work from the /app directory inside the image
WORKDIR /app
# Install dependencies
COPY package*.json ./
RUN npm install
# Copy the node app from the host into the image at /app
COPY . .
# Expose port 3000 and start the app
EXPOSE 3000
CMD npm start
This Dockerfile is for a Node app that listens on port 3000 and is started with the command npm start. By placing it in your project's repository, onboarding new developers becomes simple and 100% consistent: every developer always gets the same environment. In essence, Docker is a powerful tool that makes developers' lives easier, increases efficiency, and fosters consistency across development environments.
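Concretely, with a placeholder image name of my-node-app, the two commands from the list above become:
# Build the image from the Dockerfile in the current directory
docker build -t my-node-app .
# Run it, mapping port 3000 on the host to port 3000 in the container
docker run -p 3000:3000 my-node-app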
Develop in the same environment as production
Once the app runs in a Docker development environment, you can ship the entire container straight to production. If you think the inconsistencies between two developers' machines are a problem, just wait until you write code that works on your machine only to find out that it doesn't work in production. It's extremely frustrating.
You have tons of options for deploying Docker containers to production. Here are some of them:
- AWS ECS (official tutorial)
- DigitalOcean (tutorial)
- Heroku (official tutorial)
- io (official tutorial)
I like Heroku's approach because it's the only one that lets you simply push up your project with a Dockerfile in it and have it run. The others require extra steps, such as pushing the Docker image to a registry. The extra steps aren't the end of the world, but they aren't necessary.
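For comparison, those extra steps usually look something like this (the image and account names are placeholders): tag the local image under your registry account and push it so the production host can pull it.
# Tag the local image under your registry account
docker tag my-node-app myaccount/my-node-app:1.0
# Push it to the registry (Docker Hub by default) so your host can pull it
docker push myaccount/my-node-app:1.0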
What about more complex apps?
Due to Docker's philosophy of one process per container, most apps will require multiple containers. For example, a WordPress site would consist of a container for the web server running PHP and a container for the MySQL database. This means you need a way for the containers to talk to each other. This is called container orchestration.
If you can run all of your containers on a single host, Docker Compose will probably meet your orchestration needs. It's included when you install Docker and it's easy to learn. It lets you launch multiple containers at the same time and networks them together so they can talk to each other. This is the fastest and easiest way to orchestrate multiple containers.
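As a rough sketch (the image tags, ports, and passwords here are illustrative, not recommendations), a docker-compose.yml for the WordPress example above might look like this:
# docker-compose.yml - one container for WordPress/PHP, one for MySQL
version: "3"
services:
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: example
      MYSQL_DATABASE: wordpress
      MYSQL_USER: wordpress
      MYSQL_PASSWORD: wordpress
  wordpress:
    image: wordpress
    depends_on:
      - db
    ports:
      - "8080:80"
    environment:
      WORDPRESS_DB_HOST: db
      WORDPRESS_DB_USER: wordpress
      WORDPRESS_DB_PASSWORD: wordpress
      WORDPRESS_DB_NAME: wordpress
Running docker compose up (or docker-compose up on older installs) starts both containers on a shared network, where the WordPress container can reach the database simply by the service name db.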
If you have to orchestrate containers spread across multiple hosts, Kubernetes is the prevailing solution. Many hosting providers that support Docker deployments also offer Kubernetes for orchestration.
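To give a flavor of what that looks like (reusing the hypothetical my-node-app image from earlier), a minimal Kubernetes Deployment manifest asks the cluster to keep three replicas of a container running, wherever there is capacity:
# deployment.yaml - Kubernetes keeps 3 replicas of the container running across the cluster
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-node-app
spec:
  replicas: 3
  selector:
    matchLabels:
      app: my-node-app
  template:
    metadata:
      labels:
        app: my-node-app
    spec:
      containers:
        - name: my-node-app
          image: myaccount/my-node-app:1.0
          ports:
            - containerPort: 3000
You would apply it with kubectl apply -f deployment.yaml, and Kubernetes takes care of scheduling the containers onto the cluster's nodes.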
Quick benefits of understanding Docker
It might not seem relevant right now, but keep this information in mind for the first time you run into a problem caused by differences in development environments. You don't want it to happen again. By learning to use Docker, you will be able to ensure a consistent environment for your application, regardless of where it runs or who is managing it. That means consistent, reliable results that you, your clients and your employers can count on.
Understanding Docker offers a number of immediate benefits or, as we call them in the business world, “quick wins”. Let's look at some of the more significant ones:
- Consistent development environment: Docker allows you to easily create and deploy uniform development environments across different machines and different platforms. This significantly reduces problems related to differences in development environments, a common problem in many development teams.
- Application portability: With Docker, your applications can be easily moved from one environment to another without compatibility issues. This means your applications can be developed locally, tested on a staging environment, then moved to production without any changes to the environment.
- Application isolation: Docker allows you to run applications in isolated containers, ensuring they don't interfere with each other. This can be especially useful when working with applications that require different versions of the same dependencies.
- Resource efficiency: Docker containers are notoriously resource efficient, using only the resources needed to run the application they contain. This can lead to greater resource efficiency, particularly when working with resource constrained machines.
- Replicability: With Docker, application building and deployment processes are fully automated and replicable. This means that each team member can run the application in exactly the same way, eliminating the issues of differences in local configurations.
In short, understanding and using Docker can lead to a number of immediate benefits, making it a valuable tool for any developer.