- Akamai launched a new Inference Cloud product for edge AI
- The service uses Nvidia's Blackwell 6000 GPUs in 17 cities to start
- The GPUs enable AI applications to run closer to where users are
When Akamai acquired Linode in 2022, it knew it was taking a transformational leap into the cloud computing game. But the company couldn’t have known at the time just how much AI would inflate the opportunity in front of it. Now, with the launch of Akamai Inference Cloud, the company is poised to capitalize on an edge AI market that is expected to grow from $54 billion in revenue in 2024 to $157 billion by 2030.
CEO Tom Leighton told Fierce that reaching this point has been a multi-year journey. But he said the company’s experience in the content delivery network realm gave it a leg up in its quest to build an edge cloud network robust enough to run AI applications close to users.
Akamai’s network already spanned 4,300 points of presence in over 700 cities across the globe. To build its Inference Cloud, the company deployed Nvidia’s Blackwell 6000 GPUs in 17 cities, with more on the way. Leighton said the GPUs will be added in more cities based on demand. In theory, the technology could be extended to all 700-plus cities it operates in, he said.
The company has long been running CPUs and low-end GPUs on its network. The difference with the new GPUs, Leighton said, is what kind of workloads they enable.
“We’ve been doing this for delivery for a long time and for lighter-weight compute applications, JavaScript kinds of applications,” Leighton said. “What’s new is that now we can run your containers there, heavier duty logic can be supported near the user.”
That means Inference Cloud can handle everything from 8K video flows and video intelligence to recommendation engines and VR experiences (think personalized retail agents and virtual fitting rooms). Akamai said it is also working with a top toymaker to enable AI-powered toys.
“Our base, they care about latency, they care about cost and they’re already running pretty much everything else to do with the applications on our platform,” including security and delivery, Leighton said. With the new service, users can actually run applications on the platform.
According to Synergy Research Group’s latest data, Akamai has around 1% market share in the cloud arena. That puts it on the same level as CoreWeave, Databricks, Snowflake and SAP, Synergy’s tally showed.
Asked whether he’s worried about the oft-discussed AI bubble, Leighton said no. The kinds of applications that run on Akamai’s servers aren’t going away, bubble or not, he said. After all, Akamai isn’t in the business of building sprawling data centers that need the latest GPUs to train new foundation models.
“For what we do, we’re buying workhorse equipment and it lasts over six years,” Leighton concluded. “These applications we’re looking at are just very basic things that commerce sites are going to want to do, media sites are going to want to do, whether or not there’s some part of the AI ecosystem where there’s a bubble that burst.”