Mistral: Mixtral 8x22B Instruct
Mistral's official instruct fine-tuned version of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). It uses 39B active parameters out of 141B, offering unparalleled cost efficiency for its size. Its strengths include:

- strong math, coding, and reasoning
- large context length (64k)
- fluency in English, French, Italian, German, and Spanish

See benchmarks in the launch announcement [here](https://mistral.ai/news/mixtral-8x22b/).

#moe
ANALYSIS STATUS
No analysis yet
This model exists in our database, but we publish political alignment results only once its full analysis run is complete.
A full run currently requires 114 completed bill analyses.