by Mistral AI
Mistral's official instruct fine-tuned version of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). It uses 39B active parameters out of 141B, offering unparalleled cost efficiency for its size. Its strengths include:

- strong math, coding, and reasoning
- large context length (64k)
- fluency in English, French, Italian, German, and Spanish

See benchmarks on the launch announcement [here](https://mistral.ai/news/mixtral-8x22b/). #moe
| Signal | Strength | Weight | Impact |
|---|---|---|---|
| Context Window | 76 | 15% | +11.5 |
| Capabilities | 43 | 25% | +10.7 |
| Versatility | 33 | 10% | +3.3 |
| Output Capacity | 20 | 10% | +2.0 |
| Pricing Tier | 6 | 25% | +1.5 |
| Recency | 8 | 15% | +1.2 |
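The table above implies a simple weighted-sum score: each signal's impact appears to be its strength multiplied by its weight. This is a minimal sketch of that computation, assuming the formula `impact = strength × weight`; small differences from the table (e.g. +11.5 vs a computed 11.4) likely come from rounding of the underlying strength values.

```python
# Hypothetical reconstruction of the signal-weighting shown in the table.
# Signal names, strengths, and weights come from the table rows; the
# impact formula itself is an inferred assumption, not documented.

signals = {
    "Context Window":  (76, 0.15),
    "Capabilities":    (43, 0.25),
    "Versatility":     (33, 0.10),
    "Output Capacity": (20, 0.10),
    "Pricing Tier":    (6,  0.25),
    "Recency":         (8,  0.15),
}

def impact(strength: float, weight: float) -> float:
    # Weighted contribution of one signal to the overall score.
    return round(strength * weight, 1)

total = sum(impact(s, w) for s, w in signals.values())
print(f"total score: {total}")
```

Note that the weights sum to 100%, so the total is a convex combination of the per-signal strengths.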