These startups are building cutting-edge AI models without the need for a data center
Researchers have trained a new kind of large language model (LLM) using GPUs scattered around the world and fed with private as well as public data, a move that suggests the dominant way of building artificial intelligence could be disrupted.
Two unconventional AI-building startups, Flower AI and Vana, collaborated to develop this new model, named Collective-1.
Flower AI developed the technology, which allows training to be spread across hundreds of computers connected over the internet. The company's tech is already used by some firms to train AI models without needing to pool computing resources or data centrally. Vana, for its part, provided sources of data, including private messages from X, Reddit, and Telegram.
By modern standards, Collective-1 is relatively small, with 7 billion parameters (the values that collectively give the model its abilities), compared with the hundreds of billions of parameters behind today's most advanced models, such as those powering ChatGPT, Claude, and Gemini.
Nic Lane, a computer scientist at the University of Cambridge and cofounder of Flower AI, says the distributed approach should scale well beyond Collective-1. Lane adds that Flower AI is currently training a 300-billion-parameter model on conventional data and plans to train a 1-trillion-parameter model later this year, approaching the scale offered by industry leaders. "This could fundamentally change people's perception of AI, so we're going all-in," Lane says. He adds that the startup is also incorporating images and audio into training to create multimodal models.
Distributed model building may also shake up the power dynamics shaping the AI industry.
Currently, AI companies construct models by combining massive training data with large-scale computing resources centralized in data centers. These data centers are equipped with cutting-edge GPUs and interconnected via ultra-high-speed fiber-optic cables. They also heavily rely on datasets created by scraping public (though sometimes copyrighted) materials such as websites and books.
This approach means that only the wealthiest companies, and nations with access to large quantities of powerful chips, can feasibly develop the most powerful and valuable models. Even open-source models, like Meta's Llama and DeepSeek's R1, are built by companies with large data centers. A distributed approach could allow smaller companies and universities to build advanced AI by pooling disparate resources. Or it could allow countries that lack conventional infrastructure to network together several data centers to build a more powerful model.
Lane believes the AI industry will increasingly look toward new training methods that break out of a single data center. The distributed approach "allows you to scale computation in a more elegant way than a data center model," he says.
Helen Toner, an expert on AI governance at the Center for Security and Emerging Technology, says Flower AI's approach is "interesting and potentially quite relevant" to AI competition and governance. "It may be hard to keep up at the cutting edge, but it could be an interesting fast-follower approach," Toner says.
Divide and Conquer
Distributed AI training requires rethinking how the computation used to build powerful AI systems is divided up. Creating an LLM involves feeding a model huge amounts of text while adjusting its parameters so that it produces useful responses to prompts. Inside a data center, the training process is split up so that parts of the work run on different GPUs, with the results periodically consolidated into a single master model.
The new approach allows work typically done in large data centers to be performed on hardware potentially miles apart and connected by relatively slow or unreliable internet connections.
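The pattern described here resembles a well-known technique sometimes called local SGD or federated averaging: each worker trains its own copy of the model for a stretch, and the copies are only occasionally merged by averaging their weights, so slow links matter less. The sketch below is purely illustrative, not Flower AI's Photon or Google's DiPaCo (whose internals aren't detailed in this article); the worker count, step counts, and the tiny stand-in model are all invented for the example.

```python
# Minimal sketch of periodic weight averaging ("local SGD" style), assuming
# PyTorch. Workers are simulated sequentially here; in a real deployment each
# would run on a separate machine, possibly connected over a slow link.

import copy
import torch
import torch.nn as nn

torch.manual_seed(0)

NUM_WORKERS = 4    # machines that could be miles apart (illustrative)
LOCAL_STEPS = 20   # steps each worker runs between synchronizations
ROUNDS = 5         # number of averaging rounds

master = nn.Linear(10, 1)  # tiny stand-in for a real LLM

def local_batch():
    """Fake per-worker batch: inputs with noisy linear targets."""
    x = torch.randn(32, 10)
    y = x.sum(dim=1, keepdim=True) + 0.1 * torch.randn(32, 1)
    return x, y

for round_idx in range(ROUNDS):
    worker_states = []
    for _ in range(NUM_WORKERS):
        # Each worker starts from the current master weights and trains
        # independently, with no communication during local steps.
        model = copy.deepcopy(master)
        opt = torch.optim.SGD(model.parameters(), lr=0.01)
        for _ in range(LOCAL_STEPS):
            x, y = local_batch()
            loss = nn.functional.mse_loss(model(x), y)
            opt.zero_grad()
            loss.backward()
            opt.step()
        worker_states.append(model.state_dict())

    # Periodic aggregation: average the workers' weights into the master.
    # Over a slow internet connection, this is the only traffic required.
    avg_state = {
        key: torch.stack([s[key] for s in worker_states]).mean(dim=0)
        for key in worker_states[0]
    }
    master.load_state_dict(avg_state)
    print(f"round {round_idx}: synchronized {NUM_WORKERS} workers")
```

The key trade-off is visible in the loop structure: the fewer synchronization rounds relative to local steps, the less bandwidth training needs, which is what makes unreliable internet connections tolerable, at the cost of the slower convergence Lane describes below.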
Some major companies are also exploring distributed learning. Last year, Google researchers demonstrated a new scheme called DIstributed PAth COmposition (DiPaCo) that divides and consolidates computation so that distributed learning runs more efficiently.
To build Collective-1 and other LLMs, Lane collaborated with academic partners in the UK and China to develop a new tool called Photon that makes distributed training more efficient. Lane says Photon improves on Google's approach with a more efficient way of representing the data in a model and a more efficient scheme for sharing and consolidating training. The process is slower than conventional training but more flexible, allowing new hardware to be added to speed up training, Lane says.
Photon was developed through a collaboration between researchers at Beijing University of Posts and Telecommunications and Zhejiang University. The team released the tool under an open-source license last month, allowing anyone to use this approach.
As part of the effort to build Collective-1, Flower AI's partner Vana developed a new way for users to share their personal data with AI builders. Vana's software lets users contribute private data from platforms like X and Reddit to the training of a large language model, specify what end uses are permitted, and even benefit financially from their contributions.
Anna Kazlauskas, cofounder of Vana, says the idea is to make untapped data available for AI training while giving users more control over how their information is used in AI. "This data usually can't be included in AI models because it isn't public," Kazlauskas says. "This is the first time data contributed directly by users has been used to train a foundation model, with users owning the AI model created from their data."
Mirco Musolesi, a computer scientist at University College London, says a key benefit of the distributed approach to AI training may be that it unlocks new kinds of data. "Extending this to cutting-edge models would allow the AI industry to leverage vast amounts of distributed and privacy-sensitive data, such as in healthcare and finance, for training without the risks of centralization," he says.