By Kevin McAllister
Data-centric compute processing, neuromorphic computing and zero trust security are focal points for members of Protocol's Braintrust.
Chief Strategy Officer at Marvell
We are already seeing a compute paradigm shift within hyperscale data centers, and over the next several years we will start to see this shift replicate within the expanded enterprise. Until now, enterprise networks have been bound to controlled premises, running very specific applications and dealing with specific data. In general, this has been a reasonable volume of defined, structured and uniform data, from homogeneous sources, that can be easily understood by the compute processor running applications. Mobility and cloud have extended the boundaries of the traditional campus environment to create a borderless enterprise.
Now, we are faced with two issues: First, the structure and format of the data coming into the enterprise is very different from location to location and generated from a variety of sources. This data needs to traverse various physical media, requiring multiple types of protocols and encoding, and often cross boundaries owned by different entities, necessitating encryption. Second, the volume of data flow is enormous. As a result, the application-centric compute of today's enterprise networks is no longer sufficient. We now need to augment that application-centric compute with data-centric compute processing to convert that data into information that is easily understood by the applications to which it is sent.
Additionally, with the advent of 5G, we will begin to see a lot more data being generated in a distributed manner. Instead of blindly sending an entire load of data to a central location for processing, we will start to see compute processing at each node of the network. These nodes will be able to extract and parse through copious amounts of data and convert it into a unified format to then pass on to a central policy management system. The result will be much more meaningful data being sent to the appropriate applications instead of high volumes of data overwhelming the network.
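The node-level processing described above — parsing heterogeneous data at the edge and converting it into a unified format before passing it to a central policy manager — can be sketched in a few lines. This is an illustrative toy, not Marvell's implementation; the device formats, field names and unit conversions are hypothetical.

```python
import json

# Hypothetical raw readings from heterogeneous edge sources:
# each device reports in its own format and units.
RAW_READINGS = [
    {"src": "sensor-a", "payload": {"tempC": 21.4, "ts": 1700000000}},
    {"src": "sensor-b", "payload": {"temperature_f": 70.5, "time": "1700000060"}},
]

def normalize(reading):
    """Convert a device-specific reading into one unified record format."""
    src, p = reading["src"], reading["payload"]
    if "tempC" in p:
        return {"source": src, "temp_c": p["tempC"], "ts": int(p["ts"])}
    if "temperature_f" in p:
        return {"source": src,
                "temp_c": round((p["temperature_f"] - 32) * 5 / 9, 2),
                "ts": int(p["time"])}
    raise ValueError(f"unknown format from {src}")

def to_policy_manager(readings):
    """Ship only the unified, parsed records upstream instead of raw data."""
    return [normalize(r) for r in readings]

print(json.dumps(to_policy_manager(RAW_READINGS), indent=2))
```

The point of the sketch is the division of labor: the edge node absorbs the format diversity, and the central system receives compact, uniform records instead of the full raw stream.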
COO at VMware
The pandemic and its aftermath are changing the nature of work and the workplace, and the future of enterprise computing is tied to that shift now more than ever. The technologies that will come to dominate the landscape make that transition possible by empowering IT to be flexible, smart and secure.
Enterprises need to be able to build, run, manage, connect and secure any application on any cloud anywhere in the world. That is the future, and the technologies that underpin that — multi-cloud, Kubernetes and a microservices architecture — are the foundational support for IT to meet the business challenge of this time. In addition to these technologies, a major priority in the next five years will be adopting a modern approach to security that is cloud-delivered and built to protect modern infrastructure.
Finally, it is impossible to discuss the future without discussing artificial intelligence. In the next five years, I see the enterprise rearchitecting itself to prepare for the power of AI, because it is not just about AI technology itself, it's about having the compute and networking resources available to support next-generation applications.
Dr. Rich Uhlig
Senior Intel Fellow, VP and Director of Intel Labs
Computing over the next five years will feel the dramatic effects of data's exponential growth, as it becomes more embedded in our lives. To harness value and insights from data rapidly and securely, the industry needs innovations that are orders of magnitude greater than what we are able to do today. We need innovations across multiple fronts: from new computing architectures to breakthrough advances in high-speed, energy-efficient networking to new security protocols that further tighten the seals around data while still enabling us to derive value from it.
A few promising innovations are underway driven by research at organizations such as Intel Labs, providing an early glimpse of technologies that could make their way into the enterprise within five years. Neuromorphic computing, a new brain-inspired architecture, is finding early applications for edge usages in robotics, manufacturing and health care. Optical technologies in data centers are addressing the limits of electrical I/O. The field of integrated silicon photonics research is showing tremendous potential for future data centers that will be connected by light. Security innovations such as secure federated learning and fully homomorphic encryption will mature over the next few years, addressing many of the barriers around secure data sharing that limit us today.
At Intel Labs, these are just a few of the many computing challenges our team of researchers is focused on with a goal to commercialization, so technologies born in our Labs can ultimately scale for broadest societal impact.
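The federated learning mentioned above rests on a simple idea: each site trains on its own data and shares only model parameters with an aggregator, never the raw data. A toy sketch of that aggregation step, with made-up site data and a deliberately trivial model (a least-squares slope), might look like this; real secure federated learning layers secure aggregation or homomorphic encryption on top of this exchange.

```python
# Toy sketch of federated averaging: each site fits a model locally and
# shares only the fitted parameter (not the underlying data points).
# Site names and data are hypothetical; the model is a one-parameter
# least-squares slope through the origin, chosen for brevity.

def local_fit(xs, ys):
    """Slope through the origin, computed entirely on the site's own data."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def federated_average(local_params, weights):
    """Aggregator sees only parameters, weighted by each site's data size."""
    total = sum(weights)
    return sum(p * w for p, w in zip(local_params, weights)) / total

# Two hypothetical sites, each holding private data drawn from y ~ 2x.
site_a = ([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
site_b = ([1.0, 4.0], [2.1, 7.9])

params = [local_fit(*site_a), local_fit(*site_b)]
model = federated_average(params, weights=[3, 2])
print(round(model, 2))  # close to the true slope of 2
```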
SVP of Corporate Strategy and Communications at Micron Technology
The data-centric era has ushered in a new opportunity to tap data for business growth, but many companies continue to struggle to transform mounting data stores into competitive advantage. While much of this struggle relates to implementing a fundamental organizational data strategy, companies have also been limited by underlying infrastructure impediments to effective analysis of those large amounts of data. The computing architecture that ushered in the client server era and was honed for the agility and scale of the cloud has relied on a balance between compute, memory, storage and I/O that has served traditional enterprise applications well. At Micron, we have long foreseen this fundamental architectural challenge. So, we have turned to our own technology innovation across traditional DRAM and NAND storage products, as well as emerging technologies, to break through with new levels of capacity, performance and latency reductions.
We've been delighted to see the real value of this innovation emerge as we've worked with leading partners to test application performance with the capacity and speed delivered by a 3D XPoint-based Micron X100 SSD. For example, Micron engineers have tested X100 performance running MongoDB, the leading cloud database. The results have been outstanding: at least a 30% performance improvement on an X100-fueled platform. In the coming five years and beyond, we can't wait to deliver this next wave of data-centric innovation and see how enterprises use this technology to deliver business growth.
Co-founder & COO at Cloudflare
The hybrid work model has accelerated a huge shift from legacy, on-premises hardware to cloud-based services that can power and secure the work-from-anywhere economy. Increasingly, we'll see companies adopting a Zero Trust framework to secure their remote workforces, meaning organizations do not automatically trust any request for corporate data or resources and instead verify every attempt to connect to corporate systems before allowing access.
The old castle and moat scenario for business security — "Here's our castle, we're going to protect it with a moat around it" — doesn't work when your users are located everywhere. Companies must take a user-first model and understand who each user is to evaluate whether or not they can gain access.
Businesses of all sizes now understand the need to migrate to a more secure approach. According to a recent Forrester report, 76% of businesses today say their organization's security approach is "antiquated" and that they need to shift to a Zero Trust framework. With a globally distributed workforce, companies now have as many "offices" as people — so clunky on-premises systems designed to protect now-empty offices no longer cut it.
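The Zero Trust principle described above — verify every request on its own merits rather than trusting anything inside the network perimeter — reduces to a per-request check of identity, device posture and policy. A minimal sketch, with hypothetical resource names and policy rules:

```python
# Minimal Zero Trust authorization sketch: every request is evaluated
# individually; nothing is trusted for coming from "inside" the network.
# The resources, groups and policy table below are hypothetical.

from dataclasses import dataclass

@dataclass
class Request:
    user: str
    group: str              # asserted by the identity provider
    device_compliant: bool  # result of a device-posture check
    resource: str

# Policy: which identity groups may reach which resources.
ACCESS_POLICY = {
    "finance-db": {"finance"},
    "source-code": {"engineering"},
}

def authorize(req: Request) -> bool:
    """Allow only if identity, device posture and policy all check out."""
    allowed_groups = ACCESS_POLICY.get(req.resource, set())
    return req.device_compliant and req.group in allowed_groups

print(authorize(Request("ana", "finance", True, "finance-db")))       # True
print(authorize(Request("bob", "finance", True, "source-code")))      # False: wrong group
print(authorize(Request("eve", "engineering", False, "source-code"))) # False: non-compliant device
```

Note that network location never appears in the decision — that is the contrast with the castle-and-moat model, where being inside the moat was itself the credential.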
Managing Partner at Menlo Ventures
Enterprise computing will become less reliant on Moore's Law in the coming years but will continue to benefit from it. Software, on the other hand, will define and pace the innovation race going forward.
First, there will be cheaper, better and more diverse cloud services that will continue to allow developers to quickly and cheaply experiment, iterate and scale. We'll move beyond the myriad storage, compute and database services to more specific services that best suit the needs of a particular application. There is also a host of new application-specific stacks and frameworks emerging, such as JAMstack, led by Netlify, for simple, performant websites; platforms like ML Engine and Clarifai for consumable AI; and flexible infrastructure platforms like Render to build environments quickly. Along with this variety of services and stacks will come better tools that allow companies to deploy, operate and manage complexity faster and with more certainty. Harness is a good example of such advances on the deployment side, and HashiCorp on the secure, operate and manage side.
While computing power will certainly advance in the coming years, most of the hardware/silicon advancement will be to optimize AI processing power and speed in TPUs. Software will broadly trump hardware as the key area of innovation in enterprise computing and further commoditize hardware to overcome underlying scalability and reliability issues. The years ahead will be exciting, as the platforms described above are about to hit the mainstream market and make it incredibly easy to build amazing applications on simple-to-manage infrastructure.
CTO at Puppet
I predict that automation and edge computing will define enterprise computing over the next five years — though I should note that it's almost impossible to narrow emerging technologies to what exists today. Our industry moves at lightning speed, and there are certainly countless technologies that are nascent — or even nonexistent — that will be considered indispensable five years from now. Even automation and edge computing will evolve in unpredictable ways to meet new use cases that emerge in the coming half-decade.
Automation will no doubt define the next layer of capability for the enterprise. As more and more enterprises automate their infrastructure and business workflows, they increase the time employees have to focus on innovation and human-centric challenges. Enterprises will see an explosion of productivity as toil is automated away and employees pivot to higher-level work. Automation establishes a streamlined process for companies that is far less prone to error — meaning enterprises will decrease costly mistakes as they simultaneously ramp up productivity. More enterprises are integrating automation into their organizations as we speak, and over the next five years automation will expand to touch nearly every workflow — from infrastructure to simple business processes to customer engagement.
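The error-reducing property of infrastructure automation comes largely from declaring a desired state and converging toward it idempotently: applying the same declaration twice changes nothing the second time. A toy sketch of that idea (not Puppet's actual DSL or agent; the services and states below are hypothetical):

```python
# Toy desired-state automation: declare what the system should look like,
# then apply that declaration repeatedly -- only drift triggers a change.
# Idempotence is what makes repeated runs safe and mistakes rare.
# Service names and the "system" dict here are hypothetical.

desired = {"ntp": "running", "telnet": "stopped"}

def converge(desired_state, actual_state):
    """Apply only the changes needed to bring actual in line with desired."""
    actions = []
    for service, want in desired_state.items():
        if actual_state.get(service) != want:
            actions.append(f"set {service} -> {want}")
            actual_state[service] = want
    return actions

system = {"ntp": "stopped", "telnet": "running"}
print(converge(desired, system))  # first run: ['set ntp -> running', 'set telnet -> stopped']
print(converge(desired, system))  # second run is a no-op: []
```

Because the second run reports nothing to do, the same declaration can be applied on a schedule across thousands of machines, and only genuine drift ever produces a change.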
However, we can't talk about the future of enterprise computing without talking about edge. Edge computing is truly an emerging technology, and I think it will improve dramatically over the next five years. More high-intensity workloads will move to the edge as it becomes more mature, and as 5G rolls out around the world. Edge will also change the way we think about data: how we use, access, exploit and leverage it to scale and grow businesses. The shifts in automation as well as the expansion of the infrastructure estate to include edge will push organizations to be more intentional about data — where it sits, how you use it, how you access it and so on.
The work being done currently with automation will eventually scale to a hybrid estate that includes on-prem/private cloud, public cloud and the edge. Unsurprisingly, these big enterprise influencers have a lot of potential for overlap as each technology evolves to adapt to the rate of business.
See who's who in Protocol's Braintrust (updated Dec. 2, 2020).
Questions, comments or suggestions? Email email@example.com.