Reality sank in this week for software developers who built applications using Docker containers over the past few years: The free ride is over.
Earlier this year, Docker announced new limits on how often free users of its Docker Hub service can pull public container images from that repository, and those changes started rolling out Monday. The move comes almost exactly a year after Docker sold the enterprise version of its software business and 75% of its employees to Mirantis, leaving a much smaller company behind.
Developers had time to prepare, but many — especially users of the Kubernetes container orchestration system — were still caught off guard this week by the new limits. In response, several vendors offered tips for working around the new restrictions, and AWS announced a plan to offer its own public container registry in the near future.
Containers were a huge step forward for modern software development. They allow developers to package all the building blocks needed to run an application into, well, a container, which can be deployed across a wide variety of cloud or self-managed servers. Docker raised more than $300 million in funding after it created a developer-friendly way to use containers in the mid-2010s, but despite wide use of its container format, the company has struggled to find a business model.
Container images are essentially blueprints for the container, and they are usually stored somewhere readily accessible for developers to grab when they are updating their applications with new code. Developers can keep their images private if they prefer, but over the past few years lots of developers and companies opted to publish public container images to boost adoption and awareness of their software products. The convenience of those publicly available building blocks meant that many people built applications using them.
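To make that workflow concrete, here is a minimal sketch using the Docker SDK for Python (the docker package) to pull a public image from Docker Hub and run a throwaway container from it. The image name and tag are arbitrary examples, and a local Docker daemon is assumed.

    # Minimal sketch: pull a public image from Docker Hub and run it.
    # Assumes a local Docker daemon and the "docker" Python package.
    import docker

    client = docker.from_env()  # connect to the local Docker daemon

    # Pull a public image by name and tag; "python:3.11-slim" is just an example.
    image = client.images.pull("python", tag="3.11-slim")
    print("pulled", image.tags)

    # Run a short-lived container from that image and capture its output.
    output = client.containers.run(
        "python:3.11-slim",
        ["python", "-c", "print('hello from a container')"],
        remove=True,
    )
    print(output.decode().strip())

Each pull in a sketch like this is exactly the kind of request the new Docker Hub limits now meter.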
Docker Hub offers certified images that Docker builds itself, along with certified images published by third-party developers and a trove of community-generated images. When it was a fast-growing enterprise tech unicorn, Docker offered those services for free, but the company can't afford such largesse at this point in its history.
That entire repository is pretty big — more than 15 petabytes' worth of container images — and storing that much data is not cheap. Earlier this year Docker said it would delay a plan to delete inactive images after a community uproar, but as of Nov. 2 it imposed new limits on how many images free users of Docker Hub can pull in any six-hour window: 100 for anonymous users and 200 for authenticated free accounts. The bandwidth costs associated with serving those images are also not cheap.
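Docker documented a way for clients to see where they stand against those limits without spending a pull: the registry reports the current allowance in response headers on a special rate-limit preview repository. Here is a rough sketch of that check in Python with the requests library, assuming the documented ratelimitpreview/test endpoint; a HEAD request reads the headers without counting as a download.

    # Rough sketch: check remaining Docker Hub pulls for an anonymous client,
    # using the rate-limit preview endpoint Docker documented at the time.
    import requests

    # 1. Get an anonymous pull token scoped to the rate-limit test repository.
    token = requests.get(
        "https://auth.docker.io/token",
        params={
            "service": "registry.docker.io",
            "scope": "repository:ratelimitpreview/test:pull",
        },
    ).json()["token"]

    # 2. A HEAD request for a manifest returns the current limits in headers
    #    without counting as an actual image pull.
    resp = requests.head(
        "https://registry-1.docker.io/v2/ratelimitpreview/test/manifests/latest",
        headers={"Authorization": f"Bearer {token}"},
    )

    # Header values look like "100;w=21600": 100 pulls per 21,600-second
    # (six-hour) window.
    print("limit:    ", resp.headers.get("ratelimit-limit"))
    print("remaining:", resp.headers.get("ratelimit-remaining"))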
The rise of automated continuous integration services provided by companies like CircleCI and JFrog exacerbated the problem, said Donnie Berkholz, vice president of products for Docker. Those services automatically check container images for updates when deploying changes to software, which is great for their users but puts a heavy load on Docker's infrastructure.
"On the order of 30% of our traffic was coming from 1% of our users, and that's not sustainable when those users are free," Berkholz said.
Users of Docker's paid services — which also include features designed for teams and large software organizations — will not face the rate limits, and Docker worked out a deal that will lift the limits for most of CircleCI's customers, too.
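For everyone else, the simplest workaround is to authenticate: pulls made while logged in to a Docker Hub account count against that account's allowance (higher for free accounts, unlimited on paid plans) rather than the shared anonymous per-IP limit. Here is a brief sketch of an authenticated pull using the Docker SDK for Python; the environment variable names are placeholders for real credentials.

    # Sketch: authenticate pulls against a Docker Hub account so that account's
    # allowance applies instead of the anonymous per-IP limit.
    import os
    import docker

    client = docker.from_env()
    client.login(
        username=os.environ["DOCKERHUB_USERNAME"],
        password=os.environ["DOCKERHUB_TOKEN"],  # an access token works as a password
    )

    # Subsequent pulls through this client are attributed to the logged-in account.
    client.images.pull("library/nginx", tag="stable")

The same idea applies in CI pipelines: running the equivalent of a registry login before the build step keeps automated pulls from being lumped into the anonymous pool.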
Deep-pocketed cloud providers see a different opportunity. Microsoft's GitHub announced plans for its own free public container registry in September, and on Monday AWS announced vague plans for a public container registry that it will likely outline during its upcoming re:Invent virtual conference.
The storage and bandwidth costs associated with hosting container images are rounding errors for companies such as Microsoft and AWS, and developer goodwill is a valuable commodity. AWS will likely encourage users of its public container service to run those containers on AWS, and while GitHub still operates at arm's length from Microsoft, similar suggestions for Azure users wouldn't be surprising.
In the end, Docker's move is a signal that a relatively permissive and free-wheeling era of cloud computing is winding down as it becomes an enormous business. It also highlights the importance of the software supply chain: Modern software applications pull code from a wide variety of places, and disruptions to those supply chains can have profound effects on application performance or availability.