With Docker, the agility and adaptability to the customer, as well as the assurance that the changes implemented in each development cycle will work in production, are much greater.
Docker technology began to be adopted by the major technology companies (Red Hat first, then Microsoft, Amazon, IBM and others) in 2013, the same year its code was open-sourced. Today its use is massive: it is hard to find a company or organization that is not, knowingly or not, using Docker in some of its areas.
Why is there so much buzz about it in the tech world? How has it evolved, and where does it stand today? How does it affect industrial companies? How does it relate to IoT and edge computing?
In a series of articles we are going to explore the technological, strategic and business impact of Docker in organizations, as well as some of its most interesting applications in the market.
Docker is a software platform that allows you to create applications packaged in containers, which include everything necessary for the application to run.
According to Docker's own website: "A container is a standard unit of software that packages an application's code and dependencies so that it runs quickly and consistently from one environment to another. A container is a lightweight, standalone, executable software package with everything an application needs: code, system tools, libraries and configuration."
So what? Three aspects of this definition are key to understanding the Docker revolution in the development industry.
Containers standardise applications at the deployment level and homogenise their execution across environments, whether cloud or on-premises. These small pieces of software composed by the development team are easy for the deployment team to run and give consistency to the solution.
As product managers at Barbara IoT, we decided to containerise our applications from the beginning, and it has proved to be the right decision. The adaptability to the customer, and the assurance that the changes implemented in each development cycle (typically two weeks) will work in production, are much greater.
Containers also allow us to cut deployment costs and offer customers versatile options, whether on-premises or across different cloud providers, since this standardization makes migration cheap and secure.
A container is self-contained: it includes code, libraries, configurations and dependencies.
Consider the complexity of developing an application on the developer's computer, testing it on the QA machine, testing it again on the company's server, and finally deploying it to whichever cloud provider the customer happens to use.
Container technology makes this process consistent. The software has everything it needs to run inside the container, at the versions the developer has decided on. No more versioning hell with libraries, operating systems or file systems. The container image runs the same wherever you launch it, from the local computer to an edge server out in the field.
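As a minimal sketch of how this pinning works (the base image, file names and application are hypothetical examples, not our actual stack), a Dockerfile fixes every layer the application depends on:

```dockerfile
# Hypothetical minimal Dockerfile; names and versions are illustrative.
# Pinning the base image fixes the OS layer for every environment.
FROM python:3.11-slim

WORKDIR /app

# Install dependencies at exactly the versions listed by the developer.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code into the image.
COPY app.py .

# The same command runs on a laptop, a QA server or an edge device.
CMD ["python", "app.py"]
```

Building once with `docker build -t myapp:1.0 .` produces an image that behaves identically wherever `docker run myapp:1.0` is executed.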
Why not use virtual machines? Virtualization is much heavier: it emulates the system's entire hardware in addition to running a full guest operating system. Containers are fast and lightweight, and allow correspondingly fast iteration and deployment. Product teams can build and maintain capabilities easily in more modular architectures, saving costs and improving development and deployment times.
As with any technology, with Docker you initially have to spend time preparing the servers and adapting the applications. But that time is not wasted: as the following points show, in every project it saves time in ALL departments.
Because it does not have to boot an operating system as a VM does, a Docker container starts almost immediately, in a matter of seconds at most.
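This is easy to verify on any machine with Docker installed (the timing below is illustrative and depends on the host; `alpine` is just a conveniently small image):

```shell
# Start a tiny container, print a message and remove the container again.
$ time docker run --rm alpine echo "hello from a container"
hello from a container

real    0m0.6s    # typical on a warm host with the image already pulled
```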
If so configured, each deployment (typically one or more containers) can run on its own subnet, isolated from the others.
If some kind of virus or malware does get in, it will only affect that container stack, not the rest. With Docker Compose, each deployment is automatically placed on its own independent network.
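A sketch of what this looks like in practice (the service and image names are invented for illustration): Compose attaches every service in a project to a project-specific bridge network, so containers from different deployments cannot reach each other unless explicitly connected.

```yaml
# docker-compose.yml — hypothetical example; service and image names are invented.
# `docker compose up` creates a network named <project>_default and attaches
# both services to it, isolated from other Compose projects on the same host.
services:
  web:
    image: nginx:1.25
    ports:
      - "8080:80"   # only explicitly published ports are reachable from outside
  api:
    image: myorg/api:1.0
    # reachable from `web` by the service name `api`, invisible to other projects
```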
The same code can be shared between different containers, and each container can run a different version of, for example, Apache.
This way, we can see how the same code reacts when that one factor changes in our environment. It simplifies build tracking, branching and version control for multi-service applications, while letting us iterate the code independently without breaking the application.
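As a hedged example of this pattern (the version tags, names and paths are illustrative), the same site code can be mounted into two containers running different Apache httpd versions side by side:

```shell
# Same code, two Apache versions, two ports. Tags are examples only.
$ docker run -d --name web-old -p 8081:80 \
    -v "$PWD/site:/usr/local/apache2/htdocs" httpd:2.4.54
$ docker run -d --name web-new -p 8082:80 \
    -v "$PWD/site:/usr/local/apache2/htdocs" httpd:2.4.62

# Compare how the identical code behaves under each version.
$ curl -s http://localhost:8081/
$ curl -s http://localhost:8082/
```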
Docker can make QA testing easier, faster and more effective. Containers can be configured to include only the part or parts of the overall environment that a given test needs.
Containers are created either from the command line or from a template file (with Docker Compose). Either way, everything behaves the same under Docker: if we launch the same command or use the same template (taking into account the volumes and a couple of hardware-specific details, such as RAM and CPU limits, if specified), the container running on machine A will be exactly the same as the container running on machine B. There will therefore be no variation between development, pre-production and production environments.
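A sketch of the two equivalent routes (the image name, volume and resource limits are arbitrary examples; `mem_limit` and `cpus` are the Compose attributes we assume here for capping RAM and CPU):

```shell
# Two equivalent ways to launch the same container.

# 1. Directly from the command line:
$ docker run -d --name api -v api-data:/data \
    --memory 512m --cpus 1.5 myorg/api:1.0

# 2. From a template. Given this docker-compose.yml ...
$ cat docker-compose.yml
services:
  api:
    image: myorg/api:1.0
    volumes:
      - api-data:/data
    mem_limit: 512m
    cpus: 1.5
volumes:
  api-data:

# ... bringing it up yields the same container on any machine:
$ docker compose up -d
```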
Now we know what Docker and containers entail. In the following articles we will focus on the different applications of this technology and the value it generates in different areas of the enterprise, from cloud servers to IoT field devices.
If you want to know more about Docker, and how Barbara IoT software can help you, please contact us.