Cloudflare announced that Cloudflare R2 Storage, its distributed object storage that eliminates egress costs, is providing essential infrastructure for leading generative artificial intelligence (AI) companies. Cloudflare is also announcing several partnerships with AI infrastructure companies to help businesses avoid vendor lock-in and make training generative AI models accessible and affordable.
Generative AI requires massive amounts of computing power and relies on graphics processing units (GPUs) to quickly and efficiently process the enormous amounts of data needed to train the large language models (LLMs) at the core of these offerings. With the sudden influx of generative AI companies coming to market, these companies now face a scarcity of processing power from their cloud providers. This forces them to move data across multiple clouds or regions to find available GPUs, incurring the skyrocketing fees that cloud providers typically charge for data transfer – especially transfers that are frequent or span large distances. In addition, as new GPU chips optimised for AI workloads are released, AI startups want the flexibility to use the best technology available rather than being locked into a single ecosystem. Cloudflare R2 Storage addresses both of these challenges with zero-cost egress – making it simple to migrate large volumes of data across clouds and easily adopt best-in-class technology.
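Because R2 exposes an S3-compatible API, the same training data can be read from whichever provider or region has spare GPU capacity, without egress fees on each read. As a rough illustration only (the account ID, bucket name, object keys, and credentials below are placeholders, not values from this announcement), a training workload could write and fetch data with any standard S3 client:

```python
# Minimal sketch: using Cloudflare R2 as shared training-data storage across clouds
# via its S3-compatible API. All identifiers and credentials here are placeholders.
import boto3

# R2 exposes an S3-compatible endpoint of the form
# https://<ACCOUNT_ID>.r2.cloudflarestorage.com
s3 = boto3.client(
    "s3",
    endpoint_url="https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
    aws_access_key_id="<R2_ACCESS_KEY_ID>",
    aws_secret_access_key="<R2_SECRET_ACCESS_KEY>",
)

# Upload a training shard once, then pull it down from whichever cloud or region
# currently has available GPUs; reads from R2 incur no egress charges.
s3.upload_file("shard-0001.tar", "training-data", "shards/shard-0001.tar")
s3.download_file("training-data", "shards/shard-0001.tar", "/tmp/shard-0001.tar")
```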
“Cloudflare is providing the first developer platform built for the age of AI,” said Matthew Prince, co-founder and CEO of Cloudflare. “AI companies understand that Cloudflare is designed for speed and efficiency, enabling them to be competitive in this rapidly changing market. As they grow in popularity and traffic, we’re there to help with storage, security, and performance as costs and bots sneak up on them. Our global network and zero egress fees mean these developers can focus on innovating fast and worry less about the cost and technical decisions that sneak up on them.”
“AI has always put a high demand on storage needs, and Generative AI will only accelerate that trend,” said Dave McCarthy, Research Vice President at IDC. “While the cloud has become a logical place to store data, 64% of technology decision makers claim they are spending more than their budgets. The secret is out – data transfer fees are a big factor in overall cloud costs.”
Cloudflare Partners with AI Infrastructure Companies
Cloudflare is announcing several new partnerships to support generative AI companies using R2 Storage as part of the infrastructure for their training models. These partnerships will ensure that the innovation around AI-specialised and distributed GPUs works the way it was intended, eliminating vendor lock-in and making training generative AI models accessible and affordable:
- CoreWeave is a specialised cloud provider, delivering a massive scale of GPUs on top of the industry’s fastest and most flexible infrastructure. “At CoreWeave, we build infrastructure for compute intensive use cases, empowering businesses to transform how we engage with technology through Generative AI and LLMs. This specialisation unlocks access to the scale and variety of GPUs that our clients require, on an infrastructure purpose-built for performance, agility, and efficiency that AI workloads rely on,” said Max Hjelm, VP of Sales, CoreWeave. “By partnering with Cloudflare’s R2 Storage, we’re able to further alleviate data lock-in driven by ballooning egress fees on the hyperscalers and empower multi-cloud for businesses that benefit from it in a meaningful way.”
- Lambda, Inc. is the world’s best deep learning cloud and the only public cloud designed for and focused on training LLMs & foundation models. “Lambda provides the easiest way to launch training and inference jobs on NVIDIA GPUs – and with our latest on-demand NVIDIA H100 launch at the world’s best price – you can go from signing up to the cloud platform to training or fine-tuning a Gen AI/LLM model within minutes,” said Mitesh Agrawal, Lambda COO and head of cloud. “Our partnership with Cloudflare R2 will help joint customers extend this amazing solution with a best-of-breed object storage solution without having to worry about egress costs, which is a very common issue in the multi-cloud world.”
- MosaicML helps customers easily train and deploy large AI models with full data privacy and model ownership and will now support R2 in their open-source libraries and training platform. “With the MosaicML training platform, customers can efficiently use Cloudflare R2 as the durable storage backend for training LLMs on any compute provider with zero egress fees,” said Naveen Rao, CEO and co-founder, MosaicML. “AI companies are facing outrageous cloud costs, and they are on the hunt for the tools that can provide them with the speed and flexibility to train their best model at the best price.”
AI Companies Rely on Cloudflare
AI companies like Character.ai, Leonardo.ai, Lexica.art, and SiteGPT.ai rely on Cloudflare to provide the tools needed to serve up real-time inferences, images, conversations, and more with users around the world.
- “Character AI is building the next generation of dialog agents across industries spanning entertainment, education, and more. R2 has been the glue behind our multi-cloud architecture for training and processing requests. We are now able to store our training and production data in R2 for access by any cloud, without egress fees, and get the best prices and performance across multiple cloud providers. Cloudflare is helping us rapidly expand our infrastructure and envision the future of conversational AI.” – Myle Ott, Founding Researcher, Character AI
- “Leonardo.ai brings AI-powered creative tools to users like indie game developers and creators. After launching, our explosive growth was a near existential crisis – the exorbitant data transfer costs from other providers were not sustainable for an early-stage, high-growth startup in a data-intensive space. Cloudflare’s R2 saved the day. The API compatibility meant switching over our apps was a breeze, with minimal engineering effort. The dashboard tool for migrating our existing data worked flawlessly. And the pricing is unbeatable, the best we found both on a per-GB and egress cost basis.” – Pete Werner, Head of AI, Leonardo.ai
- “Lexica’s users are generating millions of images per month, letting them unleash their creativity and express themselves in new ways. Lexica.art wouldn’t be possible without R2 and Workers. Using these products has given us a lot of control over the images we serve and helps us keep the website snappy and responsive. With the number of requests we serve, it would easily cost 100x more on Amazon.” – Sharif Shameem, founder, Lexica.art
- “SiteGPT is working to make personalised chatbots accessible to every website. We use Cloudflare for everything – storage, cache, queues, and most importantly for training data and deploying the app on the edge, so I can ensure the product is reliable and fast. It’s also been the most affordable option, with competitors costing more for a single day’s worth of requests than Cloudflare costs in a month. The power of the developer platform and performance tools made it an easy choice to build with Cloudflare.” – Bhanu Teja Pachipulusu, founder, SiteGPT.ai