Mistral's official instruct fine-tuned version of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). It uses 39B active parameters out of 141B, offering unparalleled cost efficiency for its size. Its strengths include:

- strong math, coding, and reasoning
- large context length (64k)
- fluency in English, French, Italian, German, and Spanish

See benchmarks on the launch announcement [here](https://mistral.ai/news/mixtral-8x22b/).

#moe
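A minimal sketch of querying this model through an OpenAI-compatible chat completions endpoint. The base URL, environment variable name, and model slug below are assumptions, not values taken from this page; adjust them for your provider.

```python
# Minimal sketch: calling Mixtral 8x22B Instruct via an OpenAI-compatible API.
# Base URL, env var name, and model slug are assumptions -- adjust as needed.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",   # assumed provider endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],  # assumed env var name
)

response = client.chat.completions.create(
    model="mistralai/mixtral-8x22b-instruct",  # assumed model slug
    messages=[
        {"role": "user", "content": "Summarize the key ideas of mixture-of-experts models in three bullet points."},
    ],
    max_tokens=256,
)
print(response.choices[0].message.content)
```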
| Signal | Strength | Weight | Impact |
|---|---|---|---|
| Capabilities | 50 | 30% | +15.0 |
| Context Window | 76 | 15% | +11.4 |
| Output Capacity | 20 | 15% | +3.0 |
| Pricing | 6 | 25% | +1.5 |
| Recency | 6 | 15% | +0.9 |
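Each Impact value is the signal's strength scaled by its weight. A short sketch of that weighted-sum scoring, using the values from the table above; treating the overall score as the plain sum of per-signal impacts is an assumption.

```python
# Weighted scoring as shown in the table above.
# Signal strengths and weights are copied from the table; summing the
# per-signal impacts into one overall score is an assumption.
signals = {
    "Capabilities":    {"strength": 50, "weight": 0.30},
    "Context Window":  {"strength": 76, "weight": 0.15},
    "Output Capacity": {"strength": 20, "weight": 0.15},
    "Pricing":         {"strength": 6,  "weight": 0.25},
    "Recency":         {"strength": 6,  "weight": 0.15},
}

overall = 0.0
for name, s in signals.items():
    impact = s["strength"] * s["weight"]  # e.g. 50 * 0.30 = 15.0
    overall += impact
    print(f"{name:<16} impact = {impact:+.1f}")

print(f"{'Overall score':<16} = {overall:.1f}")
```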
Community and practitioner feedback adds real-world signal on top of benchmarks and pricing.
Share your experience with Mixtral 8x22B Instruct and help the community make better decisions.
Cost Estimator
You save $11.14/month vs category average
From verified sources.
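A rough sketch of how a monthly-cost comparison like the estimator above could be computed from per-token prices and a usage volume. Every number below (prices, token counts, category-average price) is an illustrative placeholder, not a value from this page; only the $11.14/month savings figure appears above.

```python
# Rough sketch of a monthly cost comparison like the Cost Estimator above.
# All prices and usage figures below are hypothetical placeholders.

def monthly_cost(prompt_tokens: int, completion_tokens: int,
                 prompt_price_per_m: float, completion_price_per_m: float) -> float:
    """Cost in USD for one month of usage, given per-million-token prices."""
    return (prompt_tokens / 1e6) * prompt_price_per_m \
         + (completion_tokens / 1e6) * completion_price_per_m

# Hypothetical monthly usage.
usage = {"prompt_tokens": 5_000_000, "completion_tokens": 1_000_000}

model_cost = monthly_cost(**usage, prompt_price_per_m=0.90, completion_price_per_m=0.90)
category_avg = monthly_cost(**usage, prompt_price_per_m=2.50, completion_price_per_m=2.50)

print(f"Model cost:        ${model_cost:.2f}/month")
print(f"Category average:  ${category_avg:.2f}/month")
print(f"You save:          ${category_avg - model_cost:.2f}/month")
```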