Tencent: Hunyuan A13B Instruct


Hunyuan-A13B is a Mixture-of-Experts (MoE) language model developed by Tencent with 80B total parameters, of which roughly 13B are active per token, and support for reasoning via Chain-of-Thought. It offers competitive benchmark performance across mathematics, science, coding, and multi-turn reasoning tasks, while maintaining high inference efficiency through Grouped Query Attention (GQA) and quantization support (FP8, GPTQ, etc.).
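As a rough illustration of how an MoE instruct model like this is typically loaded and queried, here is a minimal sketch using Hugging Face transformers. The repository id "tencent/Hunyuan-A13B-Instruct", the dtype choice, and the chat-template call are assumptions for illustration, not details taken from this page.

```python
# Minimal sketch: loading an MoE instruct model with Hugging Face transformers.
# The repo id below is an assumption; adjust it to the actual published checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tencent/Hunyuan-A13B-Instruct"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # only ~13B of the 80B parameters are active per token
    device_map="auto",
    trust_remote_code=True,
)

# Standard chat-style prompting; the exact template is defined by the checkpoint.
messages = [{"role": "user", "content": "Explain Grouped Query Attention in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Serving frameworks that support quantized weights (e.g. FP8 or GPTQ variants, where published) follow the same pattern with a different checkpoint id.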

ANALYSIS STATUS

No analysis yet

This model exists in our database, but we only publish political alignment results once its full analysis run is complete.

A full run currently requires 114 completed bill analyses.