Docker really did change the world

In 2013, Docker was the “it” company. Docker made headlines for the pivotal role it played in bringing containers to the mainstream, and in many ways displaced PaaS as the fad of the day (Heroku anyone?). Now the company is back in the press with the introduction of a new model for Docker Desktop that requires larger organizations to purchase a paid subscription for the tools. There has been a vocal reaction to this announcement, one that reminds me of the important role that Docker played in popularizing a model we know, love, and now use across the board: containers.

Docker didn’t invent containers, but it made the technology accessible through an open source tool and reusable images. With Docker, developers could build software once and run it the same way locally or on a production server.
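
To make that concrete, here is a minimal, hypothetical sketch of that workflow: a Dockerfile describes the application and its runtime once, and the resulting image runs the same way wherever Docker is installed. The base image, file names, and tag below are illustrative assumptions, not details from this article.

# Dockerfile (hypothetical): describe the runtime environment once
FROM python:3.11-slim
WORKDIR /app
# the application code; app.py is just an assumed example
COPY app.py .
CMD ["python", "app.py"]

# Build the image once, then run that same image locally or on a server
docker build -t my-app:1.0 .
docker run --rm my-app:1.0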

The fact that the Docker command line tool has displaced years of attractive web interfaces is perhaps a comment on what developers really want. But to really understand the impact of Docker, it’s important to go back a bit before Docker container technology made its debut.

Looking for the next big thing

In 2009, the value of virtualization was well understood and the technology was widely deployed. Most organizations had already realized the benefits of virtualization or had a roadmap to get there. The marketing machine had grown tired of virtualization, and people were eager for the next innovation in IT and software development. It came in the form of Heroku. In fact, PaaS in general, and Heroku specifically, became wildly popular, so much so that it seemed PaaS was going to take over the world.

At that time, Heroku was huge. Just go to a portal, develop your applications, and deliver them as a service? What’s not to like? Why wouldn’t you develop apps on Heroku?

As it turned out, there were a couple of good reasons not to use Heroku and PaaS platforms of its kind. For example, applications created on Heroku were not portable; they ran only within Heroku. Developers had to work remotely on the PaaS platform if they wanted to collaborate. And unlike with Netflix, it turns out, developers love to work locally. If a developer wanted to work on their local machine, they still had to build the environment by hand.

Also, while the Heroku model was extremely powerful if you used what was provided out of the box, it was complex behind the scenes. As soon as your team created something more complex than a simple web application, or needed to customize the infrastructure for security or performance reasons, it became a very “real” and difficult engineering problem.

It was great … until it wasn’t. But, in typical IT fashion, many people did their best to make it work before realizing that platforms like Heroku have their place but are not the right tool for every job.

The Docker difference

Containers, on the other hand, solved many of the challenges of PaaS, and Docker was the company that made developers, IT managers, and business managers see and understand that. In fact, when Docker came along, its value was staggeringly obvious: everything that was hard with Heroku was easy with Docker, and everything that was easy with Heroku was just as easy with Docker. With Docker, you can quickly and easily start a prebuilt service, but you can also develop locally and customize services to do exactly what you need.
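
As a hypothetical illustration of that point, the commands below start a stock nginx image as a ready-made service and then customize it by mounting a local configuration file over the default. The port numbers, file paths, and container names are assumptions for the sketch, not details from this article.

# Start a prebuilt service straight from a public image
docker run --detach --publish 8080:80 --name web nginx
# Customize the same service by supplying your own configuration
docker run --detach --publish 8081:80 --name web-custom \
  --volume "$PWD/nginx.conf:/etc/nginx/nginx.conf:ro" nginx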

That is not to say that Docker was pretty. It actually took advantage of a user experience that first became popular in the 1970s on Unix. Docker was just a command running on a Linux terminal, a far cry from the fancy graphical interfaces of most PaaS platforms. But the Docker command line interface (CLI) was really elegant. In fact, I would say that the Docker CLI in particular showed the world that when we bring a modern sense of UX to the development of a CLI, it can change the world.
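
For readers who never used it, the flavor of that CLI looks roughly like this: a handful of short, consistent commands covering the whole container life cycle. The container name and image here are illustrative assumptions.

docker run -d --name web nginx      # start a container in the background
docker ps                           # list what is running
docker logs web                     # read the container's output
docker exec -it web sh              # open a shell inside the running container
docker stop web && docker rm web    # shut it down and clean up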

Docker, and containers in general, provided the underlying technology for developing cloud-native applications. They worked, and continue to work, on highly distributed architectures and within the devops and CI/CD (continuous integration and continuous delivery) models required today to meet constant customer demand for enhancements free of regressions (also known as bugs, security issues, and the like).

Containers allow developers to change applications quickly, without breaking the functionality that users trust. Additionally, the ecosystem that has evolved around containers, including the seemingly unstoppable Kubernetes orchestration platform, has enabled organizations to scale and manage their growing collections of containers effectively.

Developers quickly understood the value of containers. Operations teams quickly understood it, and so did Silicon Valley investors. But it took a bit more work to convince managers, CIOs, and CEOs, who often see clever demos, that a command-line tool was better than all the bells and whistles of PaaS.

Life in a containerized world

And here we are in 2021 with a command line tool that is still making waves. That’s pretty remarkable, to say the least. There even seems to be room for two players in this market for container CLIs (see “Red Hat Enterprise Linux Targets Edge Computing” and “When Is Docker Used Versus Podman: A Developer’s Perspective?”).

Now, thanks to the highway paved with container technology, developers can work locally or in the cloud much more easily than before. CIOs and CEOs can expect shorter development cycles, lower risk of outages, and even reduced cost to manage applications throughout the lifecycle.

Docker isn’t perfect, and neither are containers. Arguably, it is more work to migrate applications to containers than to virtual machines, but the benefits last the entire life cycle of the application, so the investment is worth it. This is especially true for newly developed applications, but it also applies to lift-and-shift migrations and even refactoring work.

Docker brought container technology front and center and made it a top priority, displacing PaaS as the reigning fad, and for that reason alone it really did change the world.

At Red Hat, Scott McCarty helps educate IT professionals, customers, and partners on all aspects of Linux containers, from organizational transformation to technical implementation, and works to advance Red Hat’s go-to-market strategy around containers and related technologies.

New Tech Forum provides a place to explore and discuss emerging enterprise technology in unprecedented depth and breadth. The selection is subjective, based on our pick of the technologies we believe to be important and of greatest interest to InfoWorld readers. InfoWorld does not accept marketing material for publication and reserves the right to edit all contributed content. Please send all inquiries to newtechforum@infoworld.com.

Copyright © 2021 IDG Communications, Inc.