Arcee AI: Coder Large

Coder‑Large is a 32 B‑parameter offspring of Qwen 2.5‑Instruct that has been further trained on permissively‑licensed GitHub, CodeSearchNet and synthetic bug‑fix corpora. It supports a 32k context window, enabling multi‑file refactoring or long diff review in a single call, and understands 30‑plus programming languages with special attention to TypeScript, Go and Terraform. Internal benchmarks show 5–8 pt gains over CodeLlama‑34 B‑Python on HumanEval and competitive BugFix scores thanks to a reinforcement pass that rewards compilable output. The model emits structured explanations alongside code blocks by default, making it suitable for educational tooling as well as production copilot scenarios. Cost‑wise, Together AI prices it well below proprietary incumbents, so teams can scale interactive coding without runaway spend.

Input Cost: $0.50 per 1M tokens
Output Cost: $0.80 per 1M tokens
Context Window: 32,768 tokens
Developer ID: arcee-ai/coder-large
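
The developer ID above is the string you pass as the model name when calling the model through an OpenAI-compatible chat-completions API. The sketch below is illustrative only: it assumes Together AI's public endpoint (https://api.together.xyz/v1) and the openai Python client, and the prompt and per-call cost arithmetic are examples, not values taken from this listing beyond the quoted rates.

from openai import OpenAI

# Minimal sketch: query Coder-Large through an OpenAI-compatible endpoint.
# The base URL is assumed to be Together AI's public API; adjust for your provider.
client = OpenAI(
    base_url="https://api.together.xyz/v1",
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="arcee-ai/coder-large",  # developer ID from the listing above
    messages=[
        {"role": "system", "content": "You are a concise senior code reviewer."},
        {"role": "user", "content": "Refactor this Go function to return errors instead of panicking: ..."},
    ],
    max_tokens=1024,
    temperature=0.2,
)

print(response.choices[0].message.content)

# Rough per-call cost at the listed rates ($0.50 in / $0.80 out per 1M tokens).
usage = response.usage
cost = usage.prompt_tokens / 1e6 * 0.50 + usage.completion_tokens / 1e6 * 0.80
print(f"approximate cost: ${cost:.4f}")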

Related Models

Arcee AI: Virtuoso Large
arcee-ai · $0.75/1M · 131,072 ctx
Virtuoso‑Large is Arcee's top‑tier general‑purpose LLM at 72 B parameters, tuned t...

Arcee AI: Spotlight
arcee-ai · $0.18/1M · 131,072 ctx
Spotlight is a 7‑billion‑parameter vision‑language model derived from Qwen 2.5‑V...

Arcee AI: Trinity Mini (free)
arcee-ai · Free · 131,072 ctx
Trinity Mini is a 26B-parameter (3B active) sparse mixture-of-experts language model featu...