Open Source · Trinity-Large-Thinking: 400B U.S.-Made Open Reasoning Model
Arcee AI has released Trinity-Large-Thinking, a 400B-parameter sparse MoE open-weights model trained on 2,048 NVIDIA B300 GPUs for $20M. It is Apache 2.0-licensed, agent-tuned, and 96% cheaper than Claude Opus 4.6.
By Aisha Patel · 7 min read · Apr 30, 2026