Codestral Mamba

Mistral

A 7.3B-parameter Mamba-based model designed for code and reasoning tasks. A minimal usage sketch follows the list.

- Linear-time inference, allowing for theoretically infinite sequence lengths
- 256k-token context window
- Optimized for quick responses, especially beneficial for code productivity
- Performs comparably to state-of-the-art transformer models on code and reasoning tasks
- Available under the Apache 2.0 license for free use, modification, and distribution
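As context for the code-productivity claim, here is a minimal sketch of querying the model through Mistral's hosted API. It assumes the `mistralai` Python client (v1) and the `open-codestral-mamba` model identifier used in Mistral's documentation; verify both against the current docs before relying on them.

```python
import os

from mistralai import Mistral

# Assumes MISTRAL_API_KEY is set in the environment and that the model
# is exposed under the "open-codestral-mamba" identifier (unverified here).
client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

response = client.chat.complete(
    model="open-codestral-mamba",
    messages=[
        {
            "role": "user",
            "content": "Write a function that checks whether a string is a palindrome.",
        }
    ],
)

print(response.choices[0].message.content)
```

The same model weights are also published under the Apache 2.0 license, so local inference is an alternative to the hosted endpoint shown above.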

ANALYSIS STATUS

No analysis yet

This model exists in our database, but we only publish political alignment results once its full analysis run is complete.

A full run currently requires 114 completed bill analyses.