Docker: what it is and why you should use it in your company
*In collaboration with José Manuel Blanco.
Agility, flexibility, scalability… All these terms can be attributed to Docker.
In this post, we explain what you need to know about Docker and why you should consider running it in your company.
What Docker is
Docker is a technology for encapsulating applications as software containers, so they can be distributed without dependency or incompatibility problems across operating systems (such as Windows or Linux) and across local or cloud environments such as Azure or AWS. The different files that make up our applications, together with their dependencies, are packaged into a Docker image, ready to be deployed on countless systems.
In addition, the Docker tooling allows us to create, run and, when necessary, update and stop containers.
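As a rough sketch of that lifecycle, assuming a hypothetical image called my-company/web-app, the basic Docker CLI commands look like this:

```bash
# Build an image from the Dockerfile in the current directory
docker build -t my-company/web-app:1.0 .

# Run it as a container in the background, mapping port 8080 on the host
docker run -d --name web-app -p 8080:80 my-company/web-app:1.0

# Stop and remove the container when it is no longer needed
docker stop web-app
docker rm web-app

# "Update" by building a new image version and starting a fresh container from it
docker build -t my-company/web-app:1.1 .
docker run -d --name web-app -p 8080:80 my-company/web-app:1.1
```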
Behind Docker is a company of the same name that works alongside the Linux community, Microsoft, and the cloud providers. In an environment where more and more organizations are opting for cloud development, Docker increases work efficiency.
How it works
We will start with the concept of the Docker image. An image defines everything needed to run an application: code, runtime, libraries, and configuration.
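As a minimal, hypothetical sketch, a Dockerfile for a simple Node.js web application could describe all of those pieces (the application files and names here are assumptions for illustration):

```dockerfile
# Runtime: start from an official Node.js base image
FROM node:20-alpine

# Code and libraries: copy the application and install its dependencies
WORKDIR /app
COPY package*.json ./
RUN npm install --omit=dev
COPY . .

# Configuration: default environment and exposed port
ENV NODE_ENV=production
EXPOSE 80

# Command that runs when a container is started from this image
CMD ["node", "server.js"]
```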
Plenty of Docker images are ready to use in both public and private repositories. Among the former, Docker Hub stands out as the most extensive public repository in the world, with more than 100,000 images. Do you need to install a database, a monitoring system, or a cache? There you will find official images for each of them, a history of their versions, and notes to help you get started.
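For example, an official database or cache image from Docker Hub can be up and running in a couple of commands (the version tags here are only examples):

```bash
# Download and start an official PostgreSQL database
docker pull postgres:16
docker run -d --name my-db -e POSTGRES_PASSWORD=secret -p 5432:5432 postgres:16

# Download and start an official Redis cache
docker pull redis:7
docker run -d --name my-cache -p 6379:6379 redis:7
```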
We can do the same with our company's applications, publishing their images in a public repository with open access or in a private one restricted to our employees and customers.
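A simplified sketch of publishing our own image, assuming a hypothetical Docker Hub account my-company and a private registry at registry.example.com:

```bash
# Log in and push the image to Docker Hub (public or private repository)
docker login
docker push my-company/web-app:1.0

# Or tag the same image and push it to a private registry
docker tag my-company/web-app:1.0 registry.example.com/web-app:1.0
docker push registry.example.com/web-app:1.0
```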
Beyond the definition captured in a Docker image, the next key concept is the Docker container. A container is simply a running instance of an image. The same image can be run multiple times, creating different containers, each potentially with minor configuration differences, provided the image allows them and they are specified when the container is launched.
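For instance, the same hypothetical image can be launched several times as separate containers, each with its own name, port mapping, or environment variables (REGION is an illustrative setting the application might read):

```bash
# Two containers from the same image, with different ports and settings
docker run -d --name web-app-eu -p 8080:80 -e REGION=eu my-company/web-app:1.0
docker run -d --name web-app-us -p 8081:80 -e REGION=us my-company/web-app:1.0

# List the running containers created from the image
docker ps
```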
Because Docker is the de facto standard for containerization, our applications can be deployed from their Docker images in a wide variety of scenarios: on-premises or in the cloud, in isolation or together with other containers, using the runtime environment provided by Docker itself or more complex and advanced systems such as Kubernetes.
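As a simplified illustration, the same image could be run directly with the Docker runtime on a single machine or handed to a Kubernetes cluster (the names are assumptions):

```bash
# Locally, using the Docker runtime
docker run -d --name web-app -p 8080:80 my-company/web-app:1.0

# On a Kubernetes cluster, creating a deployment from the same image
kubectl create deployment web-app --image=my-company/web-app:1.0
kubectl scale deployment web-app --replicas=3
```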
Why should your company use Docker?
Like virtual machines before it, Docker allows us to run the same application on machines of very different natures. Compared to virtual machines, however, containers are smaller and faster to deploy.
Thanks to Docker, some operations become easier than with other tools: shipping software faster, simplifying code execution on multiple servers, or moving applications between environments. In short, work is delivered and executed more quickly, and tasks are streamlined. That is why flexibility is one of Docker's most outstanding features.
Moreover, thanks to Docker, it is possible to unify public and private cloud environments to improve productivity and performance.
At Plain Concepts, our artificial intelligence team uses this technology to load their models in Azure. If you want to know more about its possibilities, our colleague Luis Fraile talks in the following video about functional testing in .NET Core applications with the help of Docker, both in local systems and in Azure Pipelines.
For many years we have been betting on application containerization as part of our value delivery, both in the applications we develop for our clients and in the solution modernization, support, and consulting services we offer to help them get there. We also bring our experience in deploying these images to a huge variety of systems, seeking to maximize value for our customers through fast, secure deployments that take advantage of all the scalability and resilience possibilities available.