AI

Gemma 4 crosses 10M downloads in one week; Gemma family at 500M total

Sundar Pichai confirmed that Gemma 4 has been downloaded 10M+ times in its first week, and that the full Gemma open-weights family has now crossed 500M lifetime downloads across Hugging Face and Kaggle. Gemma 4 ships with 9B and 31B dense variants plus a 27B MoE version, all under a license permitting commercial use. Speculative-decoding benchmarks on r/LocalLLaMA report +29% average throughput, and +50% on code, with an E2B draft model. The release reinforces Google's open-weights-parity strategy against Llama and Mistral and positions Gemma as a default choice for teams optimizing latency on open models.
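
The cited speedups come from speculative decoding, where a small draft model proposes several tokens per step and the large target model verifies them in a single forward pass. As a rough illustration only, here is a minimal sketch of that setup using Hugging Face transformers' assisted generation; the model IDs are placeholders, since the actual Gemma 4 and E2B repo names are not confirmed in this post.

    # Minimal speculative-decoding sketch via transformers' assisted generation.
    # Model IDs below are hypothetical placeholders, not confirmed repo names.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    TARGET = "google/gemma-4-27b-it"  # hypothetical large target model
    DRAFT = "google/gemma-4-e2b-it"   # hypothetical small draft model

    # Draft and target must share a tokenizer/vocabulary for assisted generation.
    tokenizer = AutoTokenizer.from_pretrained(TARGET)
    target = AutoModelForCausalLM.from_pretrained(
        TARGET, torch_dtype=torch.bfloat16, device_map="auto"
    )
    draft = AutoModelForCausalLM.from_pretrained(
        DRAFT, torch_dtype=torch.bfloat16, device_map="auto"
    )

    inputs = tokenizer("Write a binary search in Python.", return_tensors="pt").to(target.device)

    # Passing assistant_model enables speculative decoding: the draft proposes
    # tokens, the target verifies them in one pass, and accepted runs of tokens
    # are where the reported throughput gains come from.
    out = target.generate(**inputs, assistant_model=draft, max_new_tokens=128)
    print(tokenizer.decode(out[0], skip_special_tokens=True))

Code-heavy prompts tend to benefit most because draft and target agree more often on boilerplate-like token sequences, which is consistent with the larger reported gain on code.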

Google · Gemma · Open Weights · Hugging Face · Download Milestone

Why it matters

A 500M-download lifetime milestone makes Gemma the most-adopted open-weights family after Llama. 10M in one week for Gemma 4 specifically indicates strong practitioner adoption, not just curiosity; it is enough that downstream tooling, finetunes, and quantized variants will stabilize around it within 4-6 weeks. Expect a wave of Gemma-4-based agent and coding products to launch over the next quarter, and renewed pressure on Meta to ship a Llama refresh.

Impact scorecard

Overall: 7.4/10
Stakes: 7.0
Novelty: 6.5
Authority: 9.0
Coverage: 7.5
Concreteness: 9.0
Social: 8.0
FUD risk: 1.5
Coverage: 18 outlets · 3 tier-1
X, The Verge, TechCrunch, Ars Technica, Hugging Face blog, Google AI blog, …
X / Twitter: 9,200 mentions
@sundarpichai · 5,807 likes
@GoogleDeepMind · 3,100 likes
Reddit: 4,100 upvotes
r/LocalLLaMA, r/MachineLearning

Trust check

High

First-party announcement from the Google CEO; download counts are verifiable on Hugging Face model pages and Kaggle. The speculative-decoding numbers are Reddit community results: directionally reliable but not peer-reviewed.

Primary source ↗