Enterprise

Software developers scramble as Docker puts limits on container use

New cost-saving limits for free users of a Docker service that's become central to a lot of modern software have forced developers to assess their options.


Docker says it can no longer afford to offer one of its most widely used container services for free.

Image: Clayton Shonkwiler and Protocol

Reality sank in this week for software developers who built applications using Docker containers over the past few years: The free ride is over.

Earlier this year, Docker announced that it was going to impose new limits on how free users of its Docker Hub service would be able to access public container images stored in that repository, and those changes started rolling out Monday. The move comes almost exactly a year after Docker sold the enterprise version of its software business and 75% of its employees to Mirantis, leaving a much smaller company behind.

Developers had time to prepare, but many — especially users of the Kubernetes container orchestration system — were still caught off guard this week by the new limits. In response, several vendors offered workarounds, and AWS announced a plan to offer its own public container registry in the near future.

Containers were a huge step forward for modern software development. They allow developers to package all the building blocks needed to run an application into, well, a container, which can be deployed across a wide variety of cloud or self-managed servers. Docker raised more than $300 million in funding after it created a developer-friendly way to use containers in the mid-2010s, but despite wide use of its container format, the company has struggled to find a business model.

Container images are essentially blueprints for the container, and they are usually stored somewhere readily accessible for developers to grab when they are updating their applications with new code. Developers can keep their images private if they prefer, but over the past few years lots of developers and companies opted to publish public container images to boost adoption and awareness of their software products. The convenience of those building blocks of code being publicly available meant that many people built applications using them.
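Part of why so many builds depend on Docker Hub without ever naming it is that short image names are silently expanded to fully qualified references pointing at that registry. A minimal sketch of the expansion, using Docker's documented defaults (the docker.io registry, the "library" namespace for official images, and the "latest" tag); the function name is ours, and the logic skips digests and other edge cases:

```python
def normalize_image_reference(ref: str) -> str:
    """Expand a short Docker image reference to its fully qualified form.

    Docker Hub (docker.io) is the default registry, 'library' is the
    default namespace for official images, and 'latest' is the default tag.
    """
    # Split off the tag if the last path component contains one
    # (content digests are ignored in this sketch).
    if ":" in ref.rsplit("/", 1)[-1]:
        name, tag = ref.rsplit(":", 1)
    else:
        name, tag = ref, "latest"

    parts = name.split("/")
    # A first component containing '.' or ':' (or 'localhost') is a registry host.
    if len(parts) > 1 and ("." in parts[0] or ":" in parts[0] or parts[0] == "localhost"):
        registry, remainder = parts[0], "/".join(parts[1:])
    else:
        registry, remainder = "docker.io", name
        if "/" not in remainder:
            remainder = "library/" + remainder  # official image

    return f"{registry}/{remainder}:{tag}"
```

That default is why a one-word base image in a build file is a Docker Hub pull, and why the new limits touched so many builds at once.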

Docker builds its own certified images for Docker Hub users to employ, and it hosts certified images published by third-party developers along with a trove of community-generated images. When it was a fast-growing enterprise tech unicorn, Docker offered those services for free, but the company can't afford such largesse at this point in its history.

That entire repository is pretty big — more than 15 petabytes' worth of container images — and storing that much data is not cheap. Earlier this year Docker said it would delay a plan to delete inactive images after a community uproar, but as of Nov. 2 it imposed new limits on how many times free users of Docker Hub can pull images over a six-hour period (100 pulls for anonymous users and 200 for those signed in to free accounts), given that the bandwidth costs associated with serving those images are also not cheap.
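The mechanics of a per-six-hour cap can be illustrated with a simple sliding-window counter. This is not Docker's implementation (Docker Hub enforces its limits server-side, per anonymous IP address or per account); it is just a sketch of the counting logic:

```python
from collections import deque


class SlidingWindowLimiter:
    """Allow at most `limit` pulls in any rolling `window_seconds` span.

    Illustrative only; the six-hour default mirrors the window Docker
    Hub applies to image pulls.
    """

    def __init__(self, limit: int, window_seconds: int = 6 * 3600):
        self.limit = limit
        self.window = window_seconds
        self.pulls = deque()  # timestamps of pulls still inside the window

    def allow(self, now: float) -> bool:
        # Drop timestamps that have aged out of the rolling window.
        while self.pulls and now - self.pulls[0] >= self.window:
            self.pulls.popleft()
        if len(self.pulls) >= self.limit:
            return False  # over the cap: the pull would be rejected
        self.pulls.append(now)
        return True
```

Because the window slides rather than resetting on a schedule, a burst of pulls keeps counting against a user until each pull individually ages out, which is why automated pipelines that pull in tight loops hit the cap quickly.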

The rise of automated continuous integration services provided by companies like CircleCI and JFrog exacerbated the problem, said Donnie Berkholz, vice president of products for Docker. Those services automatically check container images for updates when deploying changes to software, which is great for their users but a load on Docker.

"On the order of 30% of our traffic was coming from 1% of our users, and that's not sustainable when those users are free," Berkholz said.

Users of Docker's paid services — which also include features designed for teams and large software organizations — will not face the rate limits, and Docker worked out a deal that will lift the limits for most of CircleCI's customers, too.
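Free users who want to know where they stand can inspect the rate-limit headers Docker Hub returns on image manifest requests; Docker documents `ratelimit-limit` and `ratelimit-remaining` values in the form `100;w=21600`, where `w` is the window length in seconds. A small helper to parse that format (the function name is ours):

```python
def parse_ratelimit_header(value: str) -> tuple[int, int]:
    """Parse a Docker Hub rate-limit header value like '100;w=21600'.

    Returns (count, window_seconds). Docker Hub sends this format in its
    'ratelimit-limit' and 'ratelimit-remaining' response headers.
    """
    count_part, _, rest = value.partition(";")
    window = 21600  # Docker Hub's six-hour window, assumed if w= is absent
    for field in rest.split(";"):
        key, _, val = field.partition("=")
        if key.strip() == "w":
            window = int(val)
    return int(count_part), window
```

For example, a `ratelimit-remaining` of `76;w=21600` means 76 pulls left in the current six-hour window.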

Deep-pocketed cloud providers see a different opportunity. Microsoft's GitHub announced plans for its own free public container registry in September, and on Monday AWS announced vague plans for a public container registry that it will likely outline during its upcoming re:Invent virtual conference.

The storage and bandwidth costs associated with hosting container images are rounding errors for companies such as Microsoft and AWS, and developer goodwill is a valuable commodity. AWS will likely encourage users of its public container service to run those containers on AWS, and while GitHub still operates at arm's length from Microsoft, similar suggestions for Azure users wouldn't be surprising.

In the end, Docker's move is a signal that a relatively permissive and free-wheeling era of cloud computing is winding down as it becomes an enormous business. It also highlights the importance of the software supply chain: Modern software applications pull code from a wide variety of places, and disruptions to those supply chains can have profound effects on application performance or availability.

Enterprise

Why foundation models in AI need to be released responsibly

Foundation models like GPT-3 and DALL-E are changing AI forever. We urgently need to develop community norms that guarantee research access and help guide the future of AI responsibly.

Releasing new foundation models doesn’t have to be an all-or-nothing proposition.

Illustration: sorbetto/DigitalVision Vectors

Percy Liang is director of the Center for Research on Foundation Models, a faculty affiliate at the Stanford Institute for Human-Centered AI and an associate professor of Computer Science at Stanford University.

Humans are not very good at forecasting the future, especially when it comes to technology.

Percy Liang

Every day, millions of us press the “order” button on our favorite coffee store's mobile application, and our chosen brew is on the counter when we arrive. It’s a personalized, seamless experience that we have all come to expect. What we don’t see is what’s happening behind the scenes: The mobile application sources data from a database that stores information about each customer and their favorite coffee drinks. It also leverages event-streaming data in real time to ensure the ingredients for your personalized coffee are in stock at your local store.

Applications like this power our daily lives, and if they can’t access massive amounts of data stored in a database as well as stream data “in motion” instantaneously, you — and millions of customers — won’t have these in-the-moment experiences.

Jennifer Goforth Gregory
Jennifer Goforth Gregory has worked in the B2B technology industry for over 20 years. As a freelance writer she writes for top technology brands, including IBM, HPE, Adobe, AT&T, Verizon, Epson, Oracle, Intel and Square. She specializes in a wide range of technology, such as AI, IoT, cloud, cybersecurity, and CX. Jennifer also wrote a bestselling book The Freelance Content Marketing Writer to help other writers launch a high earning freelance business.
Climate

The West’s drought could bring about a data center reckoning

When it comes to water use, data centers are the tech industry’s secret water hogs — and they could soon come under increased scrutiny.

Lake Mead, North America's largest artificial reservoir, has dropped to about 1,052 feet above sea level, the lowest it's been since being filled in 1937.

Photo: Mario Tama/Getty Images

The West is parched, and getting more so by the day. Lake Mead — the country’s largest reservoir — is nearing “dead pool” levels, meaning it may soon be too low to flow downstream. The entirety of the Four Corners plus California is mired in megadrought.

Amid this desiccation, hundreds of the country’s data centers use vast amounts of water to hum along. Dozens cluster around major metro centers, including those with mandatory or voluntary water restrictions in place to curtail residential and agricultural use.

Lisa Martine Jenkins

Lisa Martine Jenkins is a senior reporter at Protocol covering climate. Lisa previously wrote for Morning Consult, Chemical Watch and the Associated Press. Lisa is currently based in Brooklyn, and is originally from the Bay Area. Find her on Twitter ( @l_m_j_) or reach out via email (ljenkins@protocol.com).

Workplace

Indeed is hiring 4,000 workers despite industry layoffs

Indeed’s new CPO, Priscilla Koranteng, spoke to Protocol about her first 100 days in the role and the changing nature of HR.

"[Y]ou are serving the people. And everything that's happening around us in the world is … impacting their professional lives."

Image: Protocol

Priscilla Koranteng's plans are ambitious. Koranteng, who was appointed chief people officer of Indeed in June, has already enhanced the company’s abortion travel policies and reinforced its goal to hire 4,000 people in 2022.

She joined the HR tech company at a time when many other tech companies are enacting layoffs and cutbacks, but said she sees this precarious moment as an opportunity for growth companies to get ahead. Koranteng, who comes from an HR and diversity VP role at Kellogg, is working on embedding her hybrid set of expertise in her new role at Indeed.

Amber Burton

Amber Burton (@amberbburton) is a reporter at Protocol. Previously, she covered personal finance and diversity in business at The Wall Street Journal. She earned an M.S. in Strategic Communications from Columbia University and B.A. in English and Journalism from Wake Forest University. She lives in North Carolina.

Climate

New Jersey could become an ocean energy hub

A first-in-the-nation bill would support wave and tidal energy as a way to meet the Garden State's climate goals.

Technological challenges mean wave and tidal power remain generally more expensive than other renewables, but government support could help spur the innovation needed to bring costs down.

Photo: Jeremy Bishop via Unsplash

Move over, solar and wind. There’s a new kid on the renewable energy block: waves and tides.

Harnessing the ocean’s power is still in its early stages, but the industry is poised for a big legislative boost, with the potential for real investment down the line.

Lisa Martine Jenkins

