Arcee AI has launched Trinity, a new general-purpose foundation model with 400 billion parameters, making it one of the largest open-source models developed by a U.S. company. The 30-employee startup aims to challenge the dominance of major tech firms such as Google, Meta, and Microsoft in the AI model market.
While Trinity is optimized for coding and multi-step processes, it currently supports only text. CTO Lucas Atkins has confirmed that future enhancements include a vision model and a speech-to-text version. Despite its size, Trinity is not yet a state-of-the-art competitor; Meta's Llama 4 Maverick, for instance, is already multimodal.
Benchmark tests indicate that Trinity is competitive in coding, math, common sense, knowledge, and reasoning, sometimes outperforming Llama. Arcee AI hopes to attract U.S. developers and academics with superior open-weight models, steering them away from Chinese alternatives. The company previously released two smaller models, the 26B-parameter Trinity Mini and the 6B-parameter Trinity Nano, in December.