Baidu released Ernie 5.1 over the weekend, pitching the model as an efficiency win rather than a scale-up. The company's release post puts pre-training cost at roughly 6% of what comparable models cost. That figure is self-reported, with no independent accounting of which models count as comparable.
Ernie 5.1 carries about one-third the total parameters of Ernie 5.0 and roughly half the active parameters at inference. Baidu got there by pulling an optimal sub-network out of the elastic matrix it already trained for Ernie 5.0, then continuing to train on top. The technique is called Once-For-All elastic training: a single pre-training pass produces a family of sub-models at varying depths and expert capacities.
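The extraction step can be sketched in toy form. Everything below is invented for illustration (layer counts, expert counts, the ratios); it is not Baidu's configuration or code, just the shape of the idea: one trained supernet, many sub-models carved out by keeping a fraction of the depth and a fraction of the expert capacity.

```python
# Toy sketch of the Once-For-All idea: a single trained "supernet" yields a
# family of smaller sub-models by slicing depth (number of layers) and MoE
# expert capacity. All names and numbers are invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class SubModelConfig:
    num_layers: int         # depth of the extracted sub-network
    experts_per_layer: int  # expert capacity kept in each retained layer

def extract_submodel(supernet_layers: int, supernet_experts: int,
                     depth_ratio: float, expert_ratio: float) -> SubModelConfig:
    """Carve a sub-network out of the supernet by keeping a fraction of the
    layers and a fraction of the experts per layer."""
    return SubModelConfig(
        num_layers=max(1, int(supernet_layers * depth_ratio)),
        experts_per_layer=max(1, int(supernet_experts * expert_ratio)),
    )

def total_expert_slots(cfg: SubModelConfig) -> int:
    # crude proxy for parameter count: layers x experts per layer
    return cfg.num_layers * cfg.experts_per_layer

# One supernet, a family of sub-models at varying depth and capacity:
SUPERNET = (48, 64)  # 48 layers, 64 experts per layer (invented numbers)
family = [extract_submodel(*SUPERNET, d, e)
          for d, e in [(1.0, 1.0), (0.75, 0.5), (0.5, 0.5)]]
for cfg in family:
    print(cfg, "expert slots:", total_expert_slots(cfg))
```

The point of the technique is that the smaller configs inherit trained weights from the supernet, so continued training starts from a strong initialization instead of scratch.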
On LMArena's search leaderboard, Ernie 5.1 scored 1,223 to land fourth globally and first among Chinese models. That's the strongest external data point. On agent benchmarks like τ³-bench and SpreadsheetBench-Verified, Baidu says the model beats DeepSeek-V4-Pro. AIME26 with tool use came in at 99.6, second to Gemini 3.1 Pro.
Post-training uses a four-stage pipeline: supervised fine-tuning, parallel training of domain expert models, on-policy distillation from those experts into a single student, and a final RL phase for open-ended tasks. Baidu says the split fixes the "seesaw" problem where reasoning gains erode creative writing.
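The distillation stage (stage three above) can be illustrated on a toy scale. "On-policy" means the student learns from samples it draws itself, with the teacher scoring them, rather than imitating a fixed teacher-generated dataset. The sketch below uses a three-token vocabulary and a REINFORCE-style estimate of the reverse-KL gradient; all numbers are invented and this is not Baidu's pipeline, just a minimal instance of the idea.

```python
# Toy on-policy distillation: the student samples tokens from its own
# distribution, the teacher scores those samples, and the student's logits
# are nudged toward the teacher via a Monte Carlo reverse-KL gradient.
# Three-token vocabulary and all numbers are invented for illustration.
import math
import random

random.seed(0)

TEACHER = [0.7, 0.2, 0.1]          # expert model's token distribution
student_logits = [0.0, 0.0, 0.0]   # student starts out uniform

def softmax(logits):
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    z = sum(exps)
    return [e / z for e in exps]

def distill_step(logits, lr=0.3, n_samples=1000):
    """One on-policy step: sample from the *student*, then move the student's
    log-probs toward the teacher's on those samples (REINFORCE estimate of
    the gradient of KL(student || teacher))."""
    probs = softmax(logits)
    grad = [0.0] * len(logits)
    for _ in range(n_samples):
        tok = random.choices(range(len(probs)), weights=probs)[0]
        # per-sample signal: how much the student over-weights this token
        signal = math.log(probs[tok]) - math.log(TEACHER[tok])
        for i in range(len(logits)):
            indicator = 1.0 if i == tok else 0.0
            grad[i] += signal * (indicator - probs[i]) / n_samples
    return [l - lr * g for l, g in zip(logits, grad)]

for _ in range(300):
    student_logits = distill_step(student_logits)
print([round(p, 2) for p in softmax(student_logits)])
```

Because the gradient is estimated only on states the student actually visits, on-policy distillation targets the student's own failure modes, which is the usual argument for it over plain supervised imitation of teacher outputs.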
The model is live on the Ernie Bot site and Baidu AI Studio. Baidu's Create 2026 developer conference runs May 13-14 in Beijing, where founder Robin Li is expected to share more. No standalone technical paper for 5.1 yet, so independent verification of the parameter and cost claims will have to wait.
Bottom Line
Ernie 5.1 scored 1,223 on LMArena's search leaderboard, ranking fourth globally and first among Chinese models.
Quick Facts
- Pre-training cost: about 6% of comparable models (Baidu-reported)
- Total parameters: roughly one-third of Ernie 5.0
- Active parameters: about half of Ernie 5.0
- LMArena Search score: 1,223, fourth globally
- AIME26 with tool use: 99.6, second to Gemini 3.1 Pro