SAN FRANCISCO, Sept 29 (Bernama-BUSINESS WIRE) — Cloudflare, Inc. (NYSE: NET), the leading connectivity cloud company, today announced that it is collaborating with Microsoft to make it easier for companies to run AI in the location best suited to their needs. As inference tasks become increasingly distributed, this collaboration will enable businesses to seamlessly deploy AI models across a computing continuum that spans device, network edge, and cloud environments, maximizing the benefits of both centralized and distributed computing models. By leveraging ONNX Runtime across these three tiers, from the hyperscale cloud to the hyper-distributed network edge to devices themselves, Cloudflare and Microsoft can ensure that AI models run wherever processing best addresses the bandwidth, latency, connectivity, processing, battery/energy, and data sovereignty and localization demands of a given application or service.
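What makes this portability possible is that ONNX Runtime exposes the same inference API regardless of where a model runs. The following is a minimal sketch of that pattern using ONNX Runtime's Python bindings; the model file name, input shape, and execution providers shown are illustrative assumptions and are not drawn from the announcement.

```python
import numpy as np
import onnxruntime as ort

# The same .onnx artifact can be executed on a device, an edge node, or in the
# cloud; only the execution provider (hardware backend) needs to change.
session = ort.InferenceSession(
    "model.onnx",  # hypothetical model file
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)

# Feed a dummy input matching an assumed 1x3x224x224 image-shaped tensor.
input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)

outputs = session.run(None, {input_name: dummy_input})
print(outputs[0].shape)
```

Because the model artifact and calling code stay the same, only the execution provider and the deployment target change between device, edge, and cloud.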
AI model training requires significant computational and storage resources in close proximity to one another, making centralized cloud platforms the best environment for the intensive calculations involved. While training will remain centralized, inference tasks will increasingly be performed in more distributed locations, specifically on devices themselves and on edge networks. For example, some inference tasks (e.g., an autonomous vehicle braking at the sight of a pedestrian) will run on the physical device for the lowest possible latency. However, to work around device limitations such as compute, storage, and battery power, more and more tasks will need to run on edge networks. Edge networks, in close geographical proximity to end users and devices, will provide an optimal balance of computational resources, speed, and data privacy. Some applications may need to move through all three tiers of this computing continuum, with device, edge network, and cloud environments working together to bring the best experience to the end user.
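To make the placement logic described above concrete, the sketch below encodes those trade-offs as a simple heuristic. The thresholds, field names, and function are hypothetical illustrations, not Cloudflare or Microsoft code.

```python
from dataclasses import dataclass

@dataclass
class InferenceRequest:
    latency_budget_ms: float    # how quickly a response is needed
    fits_on_device: bool        # compute, storage, and battery allow an on-device run
    data_must_stay_local: bool  # data sovereignty / localization constraint

def choose_tier(req: InferenceRequest) -> str:
    """Pick device, edge, or cloud for an inference task (assumed thresholds)."""
    if req.fits_on_device and req.latency_budget_ms < 10:
        return "device"   # e.g. autonomous-vehicle braking: lowest possible latency
    if req.data_must_stay_local or req.latency_budget_ms < 100:
        return "edge"     # close to users; balances compute, speed, and privacy
    return "cloud"        # no tight constraints: use centralized compute

print(choose_tier(InferenceRequest(5, True, False)))     # -> device
print(choose_tier(InferenceRequest(50, False, True)))    # -> edge
print(choose_tier(InferenceRequest(500, False, False)))  # -> cloud
```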