The Future of the Internet in the Age of AI
In this article, Cloudflare CEO Matthew Prince discusses AI's impact on internet infrastructure, highlighting the need for AI-capable edge computing and local inference to reduce network latency. He emphasizes the importance of regionalization in AI services due to regulatory complexities and outlines Cloudflare's approach to building a connectivity-focused network. Cloudflare aims to make internet connectivity faster, more secure, and more efficient, aligning closely with AI advancements.
First, the role of AI in internet infrastructure design.
A lot of the attention right now in the AI space is around training, which makes sense to do in traditional hyperscale public clouds. You put a bunch of machines very close to each other, have them hyper-network together, and use that to do training. Those clusters of machines are similar to what you need to predict the weather or model nuclear blasts. They're traditional high-performance computing clusters, which is effectively what the Amazons, Googles, and Microsofts of the world built.
The next part of AI, though, is going to be around inference, which I think will be pushed out as close to users as possible, with more than fifty percent happening on end devices. Whether that's an iPhone or a driverless car, you don't want to have to worry about network latency.
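As a rough back-of-the-envelope illustration of that latency argument, the sketch below compares end-to-end latency for on-device, edge, and distant-cloud inference. Every number here is an illustrative assumption chosen for the example, not a measurement from Cloudflare or any particular device.

```python
# Illustrative comparison of on-device vs. remote inference latency.
# All figures are assumptions for the sake of the example.

ON_DEVICE_INFERENCE_MS = 30   # small model on a phone's NPU (assumed)
EDGE_RTT_MS = 20              # round trip to a nearby edge location (assumed)
CLOUD_RTT_MS = 120            # round trip to a distant hyperscale region (assumed)
SERVER_INFERENCE_MS = 10      # faster accelerator on the server side (assumed)

def total_latency(rtt_ms: float, inference_ms: float) -> float:
    """End-to-end latency: network round trip plus model execution time."""
    return rtt_ms + inference_ms

print(f"on-device: {total_latency(0, ON_DEVICE_INFERENCE_MS):.0f} ms")
print(f"edge:      {total_latency(EDGE_RTT_MS, SERVER_INFERENCE_MS):.0f} ms")
print(f"cloud:     {total_latency(CLOUD_RTT_MS, SERVER_INFERENCE_MS):.0f} ms")
```

Under these assumed numbers, even a much faster server-side accelerator can lose to a modest on-device model once the network round trip is added, which is the intuition behind pushing inference toward end devices and the edge.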