OpenAI commits $20B to Cerebras over 3 years — up from $10B, equity warrants for up to 10% stake
Source: The Information
OpenAI doubled its Cerebras commitment to more than $20 billion over three years, expanding a January deal that was already worth $10 billion for 750 megawatts of compute capacity. Under the new terms, OpenAI receives warrants for a minority stake that could reach 10% as spending scales, with total outlay potentially hitting $30 billion. OpenAI also earmarked roughly $1 billion to help Cerebras build dedicated data centers for its workloads. The deal is explicitly positioned to reduce OpenAI's dependence on Nvidia and to lock in non-GPU wafer-scale silicon for inference at ChatGPT scale. First reported by The Information on April 17.
Tags: openai · cerebras · compute · nvidia · infrastructure
Why it matters
This is the largest single commitment OpenAI has ever made to a non-Nvidia accelerator vendor, and the equity-warrant structure is a first-of-its-kind financing template: OpenAI becomes a partial owner of its own inference supplier. If Cerebras ships on the 750 MW ramp, OpenAI's inference unit economics become structurally decoupled from Nvidia list pricing, the single biggest variable cost in ChatGPT operations. Expect Anthropic and Google to accelerate their own custom-silicon economics (Trainium, TPU) in response, and Nvidia's 90%+ training share to finally start eroding in the inference tier.
Impact scorecard: 8.4/10
- Stakes: 9.0
- Novelty: 8.0
- Authority: 8.5
- Coverage: 8.5
- Concreteness: 9.5
- Social: 7.5
- FUD risk: 2.0
Coverage: 24 outlets · 4 tier-1
Outlets: The Information, Bloomberg, Reuters, SiliconAngle, CryptoBriefing, Meyka, …
X / Twitter: 9,800 mentions · @sama · 14,000 likes
Reddit: 1,900 upvotes on r/OpenAI · also r/singularity, r/technology
Trust check: high
The Information's primary scoop was widely replicated by Bloomberg and Reuters within hours, and OpenAI published a corroborating partnership page. The key numbers (over $20B total, ~$1B for data centers, ~10% warrant stake) are attributed to named sources at The Information.
Kronos (accepted at AAAI 2026, arXiv:2508.02739) is the first open-source foundation model pre-trained on financial candlestick (K-line) sequences. A specialized tokenizer quantizes multi-dimensional OHLCV data into hierarchical discrete tokens; a decoder-only autoregressive transformer is then pre-trained on 12 billion K-line records from 45 global exchanges. Against the leading time-series foundation model (TSFM) and the best non-pretrained baseline, Kronos posts 93% and 87% higher RankIC, respectively, on price-series forecasting; 9% lower MAE on volatility forecasting; and a 22% improvement in generative fidelity for synthetic K-line sequences. Model code, weights, and a demo are open on GitHub (shiyu-coder/Kronos), and the repo is currently trending on GitHub.
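The tokenizer is the key design move: continuous OHLCV bars become a discrete token sequence that an autoregressive transformer can model like text. As a rough illustration of that idea only (Kronos's actual tokenizer is hierarchical and learned; the uniform binning and token packing below are assumptions for the sketch):

```python
# Toy sketch: quantize OHLCV bars into discrete tokens via per-channel
# uniform binning. NOT Kronos's real tokenizer; it only illustrates how
# continuous K-line data can be mapped to a finite vocabulary.

def tokenize_ohlcv(bars, n_bins=16):
    """Map each (open, high, low, close, volume) bar to one integer token.

    Each channel is min-max scaled over the window and binned into
    n_bins levels; the five bin indices are packed into a single id
    (base-n_bins encoding), so the vocabulary size is n_bins ** 5.
    """
    channels = list(zip(*bars))            # transpose to 5 channel series
    lo = [min(c) for c in channels]
    hi = [max(c) for c in channels]
    tokens = []
    for bar in bars:
        token = 0
        for i, x in enumerate(bar):
            span = hi[i] - lo[i]
            # bin index, clamped so x == hi still lands in the top bin
            b = 0 if span == 0 else min(int((x - lo[i]) / span * n_bins), n_bins - 1)
            token = token * n_bins + b     # pack 5 bin indices into one id
        tokens.append(token)
    return tokens
```

A decoder-only model would then be trained to predict the next token id in such a sequence; the real system adds hierarchy so coarse and fine price structure get separate codes.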
Google Research published Simula in Transactions on Machine Learning Research (April 16, 2026): a framework that reframes synthetic data generation as mechanism design, using reasoning-driven construction rather than sample-level optimization. The team (Tim R. Davidson, Benoit Seguin, Enrico Bacis, Cesar Ilharco, Hamza Harkous) generated datasets of up to 512,000 data points across five domains: cybersecurity (CTI-MCQ, CTI-RCM), legal reasoning (LEXam), math (GSM8k), and multilingual knowledge (Global MMLU). The results show that "better data scales better", including a 10% accuracy gain on math reasoning with Gemini 2.5 Flash as teacher and Gemma-3 4B as student. The four-step recipe is global diversification → local diversification → complexification → quality checks. Complexification helped math but hurt legal reasoning; the paper warns that mechanism design is domain-dependent.
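The four-step recipe can be pictured as a staged pipeline over a growing dataset. The stage functions below are toy stand-ins invented for this sketch (Simula's actual mechanisms are reasoning-driven constructions executed by a teacher model, not these callables):

```python
# Sketch of a Simula-style four-stage recipe as a pipeline. Each stage
# transforms the whole dataset; the concrete stage bodies here are
# placeholders, not the paper's mechanisms.

def run_recipe(seed_topics, stages):
    """Apply each stage to the growing dataset in order."""
    data = list(seed_topics)
    for stage in stages:
        data = stage(data)
    return data

# Toy stand-ins for the four steps:
globally_diversify = lambda topics: [(t, v) for t in topics for v in ("easy", "hard")]
locally_diversify  = lambda items: [(t, v, i) for (t, v) in items for i in range(2)]
complexify         = lambda items: [it + ("multi-step",) for it in items]
quality_check      = lambda items: [it for it in items if it]   # drop empties

dataset = run_recipe(
    ["gsm8k-style arithmetic"],
    [globally_diversify, locally_diversify, complexify, quality_check],
)
```

The ordering matters: diversification widens coverage before complexification deepens it, and the quality gate runs last, which mirrors the paper's finding that complexification is not uniformly beneficial and needs a domain-aware check behind it.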
coleam00/Archon is a TypeScript open-source workflow harness that makes AI coding deterministic and repeatable through YAML-defined development processes. It has hit 18.8k GitHub stars and is trending weekly; the latest release, v0.3.6 (April 12, 2026), sits atop 1,265 commits on the dev branch. It ships 17 default workflows covering issue fixes, feature development, PR reviews, and refactoring. Core features: isolated execution (each run gets its own git worktree for parallel, conflict-free processing), composable workflows (mix deterministic nodes like bash/tests/git with AI-powered steps like planning/code-gen/review), multi-platform delivery (CLI, Web UI, Slack, Telegram, Discord, GitHub webhooks), and human gates (interactive approval steps). MIT licensed; requires Bun + Claude Code + GitHub CLI.
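Archon itself is TypeScript driven by YAML workflow files. As a language-agnostic sketch of the composable-node idea only (the node kinds, field names, and runner below are invented for illustration and are not Archon's API), a workflow mixing deterministic shell steps, AI steps, and a human gate could be modeled as:

```python
# Conceptual sketch of a composable workflow: deterministic shell nodes
# interleaved with AI-powered nodes and a human approval gate, executed
# in order. Illustrative only; not Archon's actual schema or runtime.
import subprocess

def run_workflow(steps, approve=lambda msg: True):
    """Run steps in order. 'bash' nodes shell out, 'ai' nodes call a
    provided function with the results so far, 'gate' nodes stop the
    workflow unless the approver accepts."""
    results = []
    for step in steps:
        if step["kind"] == "bash":
            out = subprocess.run(step["cmd"], shell=True,
                                 capture_output=True, text=True)
            results.append(out.stdout.strip())
        elif step["kind"] == "ai":
            results.append(step["fn"](results))   # e.g. planning / code-gen
        elif step["kind"] == "gate" and not approve(step["msg"]):
            break                                  # human rejected: halt here
    return results

results = run_workflow([
    {"kind": "bash", "cmd": "echo checking out worktree"},
    {"kind": "ai", "fn": lambda ctx: "plan: fix the failing test"},
    {"kind": "gate", "msg": "apply the plan?"},
    {"kind": "bash", "cmd": "echo running tests"},
])
```

The determinism claim comes from the structure, not the AI: the bash/test/git nodes always do the same thing, so only the explicitly AI-marked steps are nondeterministic, and the gate bounds their blast radius. Isolated git worktrees per run (not shown) are what make parallel workflows conflict-free.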