The disruption of a global pandemic, and its initial impact on the momentum of enterprise digital transformation, is behind us. But the rapid acceleration, the sudden shift to remote work, and the new reliance on digital services should remind us what disruptive events can do as we eye the arrival of generative AI.
The acceleration of enterprise digital transformation can be seen everywhere from DoorDash to the dominance of streaming entertainment to the establishment of a hybrid workforce, something that was unthinkable before 2020.
And we’re seeing it happen again with generative AI, sort of.
It’s not that generative AI isn’t cool—it is. And it isn’t that generative AI isn’t going to change a lot of things—from the way we work to the way we learn to the way we live—it is. But on its own, generative AI isn’t any more useful than analytics. Both fail to produce value without a question in need of an answer. Its real impact comes when it intersects with existing technologies.
Generative AI is a catalyst: its most significant impact comes when it accelerates an existing trend.
Modern Applications
For example, modern applications were already set to overtake traditional applications in the next few years. But AI threw fuel on that fire, and modern apps are already approaching dominance in the enterprise portfolio, because AI is a modern app, and so are the applications being built to take advantage of it.
APIs
APIs were already racing toward the top of the priority stack for both delivery and security. AI has made everything about APIs a critical priority, one that’s likely to push general security off the throne, because most organizations are building modern apps, which rely on APIs, and integrating AI services using, you guessed it, APIs.
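To make that pattern concrete, here’s a minimal sketch of a modern app calling a generative AI service over a plain REST API. The endpoint, model name, and response shape are hypothetical placeholders for illustration, not any particular vendor’s API.

```python
import os
import requests

# Hypothetical AI service endpoint and model name, for illustration only.
AI_API_URL = "https://ai.example.com/v1/completions"
API_KEY = os.environ.get("AI_API_KEY", "")

def summarize(text: str) -> str:
    """Send text to a generative AI service over its API and return the result."""
    response = requests.post(
        AI_API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": "example-llm", "prompt": f"Summarize: {text}"},
        timeout=30,
    )
    response.raise_for_status()
    # Assumes the service returns JSON with a top-level "text" field.
    return response.json()["text"]
```

The point isn’t the specific call; it’s that every one of these integrations is yet another API that has to be delivered and secured.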
Hybrid and Multicloud
Generative AI relies on significant compute, storage, and network resources, the kind of resources that are going to amplify the existing hybrid IT operating model and exacerbate the challenges of multicloud estates. The brains behind generative AI—LLMs—are likely to live in a public cloud, but some will stay on premises. And the apps being built to use those LLMs? They’ll be multicloud too. If you weren’t certain hybrid IT was here to stay, the resources required for training and inference, along with the need to keep private data private, will solidify the hybrid IT operating model as the norm.
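One way that hybrid pattern tends to show up in practice is simple routing logic: prompts that touch private data go to an on-premises model, everything else goes to a public cloud endpoint. The sketch below is a hypothetical illustration of that idea; the URLs and response format are assumptions, not a reference architecture.

```python
import requests

# Hypothetical endpoints: an on-premises model for private data,
# a public cloud model for everything else.
ON_PREM_URL = "https://llm.internal.example.com/v1/generate"
CLOUD_URL = "https://llm.cloud-provider.example.com/v1/generate"

def generate(prompt: str, contains_private_data: bool) -> str:
    """Route inference on-premises or to the public cloud based on data sensitivity."""
    url = ON_PREM_URL if contains_private_data else CLOUD_URL
    response = requests.post(url, json={"prompt": prompt}, timeout=60)
    response.raise_for_status()
    return response.json()["text"]
```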
AIOps
Generative AI is accelerating the shift to AIOps as well. It’s the tool AIOps was waiting for, and there’s no dearth of solutions already finding ways to take advantage of this technology’s ability to generate content, code, and queries. In fact, generative AI will take us beyond today’s most mature method—automated scripts—to a state in which the system is able to not only execute the scripts but generate them, and the correct policies to boot. It moves the needle for automation from “automated” to “autonomous.” The impact on operations will be profound; it won’t be fully felt for years, but it’s coming.
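A rough sketch of that “generate, then execute” loop might look like the following: a model drafts a remediation script from an alert, a policy or human approval step gates it, and only then does it run. The endpoint, alert fields, and approval hook are all hypothetical, and a real system would need far stronger guardrails than shown here.

```python
import subprocess
import requests

LLM_URL = "https://llm.example.com/v1/generate"  # hypothetical generation endpoint

def draft_remediation(alert: dict) -> str:
    """Ask a generative model to draft a shell script that remediates the alert."""
    prompt = (
        "Write a bash script that remediates this alert. "
        f"Alert: {alert['summary']} on host {alert['host']}."
    )
    response = requests.post(LLM_URL, json={"prompt": prompt}, timeout=60)
    response.raise_for_status()
    return response.json()["text"]

def remediate(alert: dict, approved_by_policy) -> None:
    """Generate a script, gate it behind a policy or approval check, then execute it."""
    script = draft_remediation(alert)
    if approved_by_policy(script):
        subprocess.run(["bash", "-c", script], check=True)
```

The shift from “automated” to “autonomous” is essentially the shift from humans writing the scripts the system runs to the system drafting, validating, and running them itself.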
All of these factors will drive rapid changes to accommodate the needs of apps that leverage AI, as well as the organizations that build and operate them. Privacy, security, and responsibility will drive innovation across every enterprise domain, but especially data, app delivery, and security.
But all of these—modern apps, APIs, multicloud, hybrid IT, and AIOps—were already trending upward before OpenAI introduced ChatGPT. Generative AI simply accelerated the pace at which they were already moving. Which is pretty much what COVID did to enterprise digital transformation, except with AI we’re going to see a lot more change.
AI’s biggest impact is not going to come from its mere existence, but from how it impacts people, processes, and products.