Protocol | Enterprise

Software developers scramble as Docker puts limits on container use

New cost-saving limits for free users of a Docker service that's become central to a lot of modern software have forced developers to assess their options.


Docker says it can no longer afford to offer one of its most widely used container services for free.

Image: Clayton Shonkwiler and Protocol

Reality sank in this week for software developers who built applications using Docker containers over the past few years: The free ride is over.

Earlier this year, Docker announced that it was going to impose new limits on how free users of its Docker Hub service would be able to access public container images stored in that repository, and those changes started rolling out Monday. The move comes almost exactly a year after Docker sold the enterprise version of its software business and 75% of its employees to Mirantis, leaving a much smaller company behind.

Developers had time to prepare, but many — especially users of the Kubernetes container orchestration system — were still caught off guard this week by the new limits. In response, several vendors offered tips for getting around the issues involved, and AWS announced a plan to offer its own public container registry in the near future.
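One commonly suggested workaround is to stop pulling images anonymously: authenticated Docker Hub accounts get a higher pull quota than anonymous clients. For Kubernetes users, that typically means storing Docker Hub credentials in a registry secret (created with `kubectl create secret docker-registry`) and referencing it from the pod spec. A sketch of that configuration — the secret name, pod name, and image below are placeholders, not names from any vendor's guidance:

```yaml
# Pod spec fragment: reference Docker Hub credentials stored in a
# Kubernetes secret so image pulls count against an authenticated
# account's (higher) rate-limit quota instead of the anonymous one.
apiVersion: v1
kind: Pod
metadata:
  name: example-app
spec:
  imagePullSecrets:
    - name: dockerhub-creds   # created via `kubectl create secret docker-registry`
  containers:
    - name: app
      image: example-user/example-app:1.0  # pulled from Docker Hub
```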

Containers were a huge step forward for modern software development. They allow developers to package all the building blocks needed to run an application into, well, a container, which can be deployed across a wide variety of cloud or self-managed servers. Docker raised more than $300 million in funding after it created a developer-friendly way to use containers in the mid-2010s, but despite wide use of its container format, the company has struggled to find a business model.

Container images are essentially blueprints for the container, and they are usually stored somewhere readily accessible for developers to grab when they are updating their applications with new code. Developers can store their images privately if they prefer, but over the past few years many developers and companies opted to publish public container images to boost adoption and awareness of their software products. The convenience of those publicly available building blocks meant that many people built applications using them.
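To make that concrete, a container image is typically defined by a short manifest such as a Dockerfile: building it produces the image, and pulling it from a public registry like Docker Hub is the operation the new limits meter. A minimal, hypothetical sketch — the application files and base-image tag are placeholders:

```dockerfile
# Start from a public base image hosted on Docker Hub.
# Fetching this base layer is exactly the kind of pull
# that now counts against the free-tier rate limit.
FROM python:3.9-slim

# Copy the application code into the image.
WORKDIR /app
COPY . .

# Bake dependencies into the image at build time.
RUN pip install --no-cache-dir -r requirements.txt

# The command the container runs when deployed.
CMD ["python", "app.py"]
```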

Docker builds its own certified images for Docker Hub users to employ, along with certified images published by third-party developers and a trove of community-generated images. When it was a fast-growing enterprise tech unicorn, Docker offered those services for free, but the company can't afford such largess at this point in its history.

That entire repository is pretty big — over 15 petabytes' worth of container images — and storing that much data is not cheap. Earlier this year Docker said it would delay a plan to delete inactive images after a community uproar, but as of Nov. 2 it imposed new limits on how many times free users of Docker Hub could pull images over a six-hour period (100 pulls for anonymous users and 200 for authenticated free accounts), given that the bandwidth costs associated with serving those images are also not cheap.

The rise of automated continuous integration services provided by companies like CircleCI and JFrog exacerbated the problem, said Donnie Berkholz, vice president of products for Docker. Those services automatically check container images for updates when deploying changes to software, which is great for their users but a load on Docker.

"On the order of 30% of our traffic was coming from 1% of our users, and that's not sustainable when those users are free," Berkholz said.

Users of Docker's paid services — which also include features designed for teams and large software organizations — will not face the rate limits, and Docker worked out a deal that will lift the limits for most of CircleCI's customers, too.

Deep-pocketed cloud providers see a different opportunity. Microsoft's GitHub announced plans for its own free public container registry in September, and on Monday AWS announced vague plans for a public container registry that it will likely outline during its upcoming re:Invent virtual conference.

The storage and bandwidth costs associated with hosting container images are rounding errors for companies such as Microsoft and AWS, and developer goodwill is a valuable commodity. AWS will likely encourage users of its public container service to run those containers on AWS, and while GitHub still operates at an arm's length from Microsoft, similar suggestions for Azure users wouldn't be surprising.

In the end, Docker's move is a signal that a relatively permissive and free-wheeling era of cloud computing is winding down as it becomes an enormous business. It also highlights the importance of the software supply chain: Modern software applications pull code from a wide variety of places, and disruptions to those supply chains can have profound effects on application performance or availability.

Protocol | Fintech

Jack Dorsey is so money: What Tidal and banking do for Square

Teaming up with Jay-Z's music streaming service may seem like a move done for flash, but it's ultimately all about the money (and Cash).

Jay-Z performs at the Tidal-X concert at the Barclays Center in Brooklyn in 2017.

Photo: Theo Wargo/Getty Images

It was a big week for Jack Dorsey, who started by turning heads on Wall Street, and then went Hollywood with an unexpected music-streaming deal.

Dorsey's payments company, Square, announced Monday that it now has an actual bank, Square Financial Services, which just got a charter approved. On Thursday, Dorsey announced Square was taking a majority stake in Tidal, the music-streaming service backed by Jay-Z, for $297 million.

Benjamin Pimentel

Benjamin Pimentel (@benpimentel) covers fintech from San Francisco. He has reported on many of the biggest tech stories over the past 20 years for the San Francisco Chronicle, Dow Jones MarketWatch and Business Insider, from the dot-com crash, the rise of cloud computing, social networking and AI to the impact of the Great Recession and the COVID crisis on Silicon Valley and beyond. He can be reached at bpimentel@protocol.com or via Signal at (510) 731-8429.

Sponsored Content

The future of computing at the edge: an interview with Intel’s Tom Lantzsch

An interview with Tom Lantzsch, SVP and GM, Internet of Things Group at Intel


Edge computing has been on the rise over the last 18 months – and accelerated amid the need for new applications to solve challenges created by the COVID-19 pandemic. Tom Lantzsch, Senior Vice President and General Manager of the Internet of Things Group (IoT) at Intel Corp., thinks there are more innovations to come – and wants technology leaders to think of data and algorithms as equally critical differentiators.

In his role at Intel, Lantzsch leads the worldwide group of solutions architects across IoT market segments, including retail, banking, hospitality, education, industrial, transportation, smart cities and healthcare. And he's seen first-hand how artificial intelligence run at the edge can have a big impact on customers' success.

Protocol sat down with Lantzsch to talk about the challenges faced by companies seeking to move from the cloud to the edge, some of the surprising ways that Intel has found to help customers, and the next big breakthrough in this space.

What are the biggest trends you are seeing with edge computing and IoT?

A few years ago, there was a notion that the edge was going to be a simplistic model, where we were going to have everything connected up into the cloud and all the compute was going to happen in the cloud. At Intel, we had a bit of a contrarian view. We thought much of the interesting compute was going to happen closer to where data was created. And we believed, at that time, that camera technology was going to be the driving force – that just the sheer amount of content that was created would be overwhelming to ship to the cloud – so we'd have to do compute at the edge. A few years later – that hypothesis is in action and we're seeing edge compute happen in a big way.

Saul Hudson
Saul Hudson has deep knowledge of creating brand voice identity, especially in understanding and targeting messages around cutting-edge technologies. He enjoys commissioning, editing, writing and business development, helping companies build passionate audiences and accelerate their growth. Hudson has reported from more than 30 countries, from war zones to boardrooms to presidential palaces. He has led multinational, multilingual teams and managed operations for hundreds of journalists. Hudson is a Managing Partner at Angle42, a strategic communications consultancy.
Protocol | Enterprise

Can we talk? Microsoft unveils voice and text-chat service for developers.

Web and mobile developers will be able to use Azure Communication Services to let customers chat with service reps directly from their apps or websites.

Microsoft is adding more communication services to Azure.

Photo: Microsoft

One year after the pandemic forced businesses to adapt in countless ways, the race to overhaul how they interact with their customers is starting to heat up.

Microsoft said Tuesday it would release Azure Communication Services into the wild this week, kicking off the first day of its Ignite virtual conference. The service, first introduced at last September's edition of Ignite, allows developers to embed voice, text chat, SMS or video capabilities into their applications.

Tom Krazit

Tom Krazit (@tomkrazit) is a senior reporter at Protocol, covering cloud computing and enterprise technology out of the Pacific Northwest. He has written and edited stories about the technology industry for almost two decades for publications such as IDG, CNET, paidContent and GeekWire, and served as executive editor of Gigaom and Structure.

Transforming 2021

Blockchain, QR codes and your phone: the race to build vaccine passports

Digital verification systems could give people the freedom to work and travel. Here's how they could actually happen.

One day, you might not need to carry that physical passport around, either.

Photo: CommonPass

There will come a time, hopefully in the near future, when you'll feel comfortable getting on a plane again. You might even stop at the lounge at the airport, head to the regional office when you land and maybe even see a concert that evening. This seemingly distant reality will depend upon vaccine rollouts continuing on schedule, an open-source digital verification system and, amazingly, the blockchain.

Several countries around the world have begun to prepare for what comes after vaccinations. Swaths of the population will be vaccinated before others, but that hasn't stopped industries decimated by the pandemic from pioneering ways to get some people back to work and play. One of the most promising efforts is the idea of a "vaccine passport," which would allow individuals to show proof that they've been vaccinated against COVID-19 in a way that could be verified by businesses to allow them to travel, work or relax in public without a great fear of spreading the virus.

Mike Murphy

Mike Murphy (@mcwm) is the director of special projects at Protocol, focusing on the industries being rapidly upended by technology and the companies disrupting incumbents. Previously, Mike was the technology editor at Quartz, where he frequently wrote on robotics, artificial intelligence, and consumer electronics.

Protocol | Fintech

IBM’s huge bet on building a cloud for banks

Howard Boville left his post as Bank of America's CTO to lead Big Blue's bold cloud offensive. Can he make it work?

IBM is embarking on an ambitious bid to build a cloud banking platform.

Image: Scott Eells/Getty Images

Moving to the cloud can be burdensome for a bank: If you don't know exactly how and where your company's data is being stored, meeting regulations that control it can be almost impossible. Howard Boville is betting he can solve that problem.

Last spring, Boville left his post as chief technology officer at Bank of America to lead IBM's ambitious bid to build a cloud banking platform. The concept: that any bank or fintech would automatically be following the rules in any part of the world in which it operates, as soon as it started using the platform.

Benjamin Pimentel

