Tag: apache-2 · 2 articles

Trinity-Large-Thinking: 400B U.S.-Made Open Reasoning Model
Arcee AI just released Trinity-Large-Thinking, a 400B-parameter sparse MoE open-weights model trained on 2,048 NVIDIA B300 GPUs for $20M. Apache 2.0, agent-tuned, and 96% cheaper than Claude Opus 4.6.
By Aisha Patel · 7 min · Apr 30, 2026

Mistral Small 4: One Open-Source Model Replaces Three Separate AI Products
Mistral Small 4 unifies instruct, reasoning, and vision in one 119B MoE model under Apache 2.0.
By Marcus Rivera · 4 min · Mar 30, 2026