Mistral: Mixtral 8x7B Instruct


Mixtral 8x7B Instruct is a pretrained generative Sparse Mixture of Experts model from Mistral AI, fine-tuned by Mistral for chat and instruction use. It combines 8 experts (feed-forward networks) for a total of 47 billion parameters, of which only a fraction is active for any given token. #moe
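To illustrate the sparse routing that the description refers to, here is a minimal PyTorch sketch of a top-2 Mixture of Experts layer. It is not Mistral's implementation; the class name, dimensions, and top-2 choice are illustrative assumptions that show why only part of the total parameter count is used per token.

```python
# Illustrative sketch of sparse top-2 MoE routing (not Mistral's code).
# Each token is routed to 2 of 8 expert feed-forward networks, so only a
# fraction of the layer's parameters is active per token. Dimensions are
# placeholders, not Mixtral's actual sizes.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoELayer(nn.Module):
    def __init__(self, d_model: int = 64, d_ff: int = 256,
                 n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (n_tokens, d_model)
        logits = self.router(x)                         # (n_tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # choose 2 experts per token
        weights = F.softmax(weights, dim=-1)            # renormalise over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                # tokens sent to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out


if __name__ == "__main__":
    layer = SparseMoELayer()
    tokens = torch.randn(16, 64)
    print(layer(tokens).shape)  # torch.Size([16, 64])
```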

ANALYSIS STATUS

No analysis yet

This model exists in our database, but we publish political alignment results only after its full analysis run is complete.

A full run currently requires 114 completed bill analyses.
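As a minimal sketch of the publication rule stated above: results become publishable only once the required number of bill analyses (currently 114) is complete. The constant and function names below are hypothetical, not this site's actual code.

```python
# Hypothetical gate on publishing alignment results (names are illustrative).
REQUIRED_BILL_ANALYSES = 114


def is_publishable(completed_analyses: int) -> bool:
    """Return True once a model's full analysis run is complete."""
    return completed_analyses >= REQUIRED_BILL_ANALYSES


if __name__ == "__main__":
    print(is_publishable(87))   # False: run still in progress
    print(is_publishable(114))  # True: full run complete
```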