SAN FRANCISCO, May 17 (Bernama-BUSINESS WIRE) — Cloudflare, Inc. (NYSE: NET), the security, performance, and reliability company helping to build a better Internet, today announced that Cloudflare R2 Storage, the distributed object storage that eliminates egress costs, is providing essential infrastructure for leading generative artificial intelligence (AI) companies. Cloudflare is also announcing several partnerships with AI infrastructure companies to help AI businesses avoid vendor lock-in and to make training generative AI models accessible and affordable.
Generative AI companies require massive amounts of computing power and rely on graphics processing units (GPUs) to quickly and efficiently process the enormous amounts of data needed to train the large language models (LLMs) at the core of their offerings. With the sudden acceleration of generative AI companies coming to market, these companies now face a scarcity of processing power from their cloud providers. That scarcity forces them to move data across multiple clouds or regions to find available GPUs, driving up the fees cloud providers typically charge for data transfer, especially transfers that are frequent or span long distances. In addition, as new GPU chips optimized for AI workloads are released, AI startups want the flexibility to use the best technology available rather than be locked into a single ecosystem. Cloudflare R2 Storage addresses both challenges with zero-cost egress, making it simple to migrate large volumes of data across clouds and to adopt best-in-class technology.
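As a minimal sketch of what that looks like in practice: R2 exposes an S3-compatible API, so existing S3 tooling can read and write objects stored in R2 without per-gigabyte egress fees. The example below uses Python and boto3 to pull a training shard out of a hypothetical R2 bucket; the bucket name, object key, account ID, and credentials are placeholders, not values from this announcement.

```python
# Sketch: copying training data out of an R2 bucket with boto3.
# R2 speaks the S3 API, so the standard AWS SDK works once it is pointed
# at the account-specific R2 endpoint. All names and credentials below
# are placeholders for illustration.
import boto3

ACCOUNT_ID = "<your-account-id>"            # from the Cloudflare dashboard
ACCESS_KEY_ID = "<r2-access-key-id>"        # R2 API token credentials
SECRET_ACCESS_KEY = "<r2-secret-access-key>"

r2 = boto3.client(
    "s3",
    endpoint_url=f"https://{ACCOUNT_ID}.r2.cloudflarestorage.com",
    aws_access_key_id=ACCESS_KEY_ID,
    aws_secret_access_key=SECRET_ACCESS_KEY,
    region_name="auto",
)

# Download a training shard to whichever cloud or region currently has
# spare GPU capacity; R2 does not charge egress fees for the transfer.
r2.download_file("training-data", "shards/shard-0001.tar", "/tmp/shard-0001.tar")
```

Because the data sits behind a standard S3-style interface rather than inside one provider's ecosystem, the same script can run from any cloud where GPUs happen to be available.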