Overview
ArgentOS supports multiple model providers, allowing you to mix local and cloud models across different tiers. Each provider has its own authentication, endpoint configuration, and model catalog.

- Anthropic: Primary provider. Claude Haiku, Sonnet, and Opus.
- Ollama: Local open-source models. Zero cost.
- MiniMax: Alternative cloud provider with competitive pricing.
- OpenRouter: Meta-provider routing to hundreds of models.
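The tier-to-provider mapping above can be sketched as a small routing table. This is a hypothetical illustration, not ArgentOS's actual configuration format: the `TIER_PROVIDERS` dict, the `FAST` and `SMART` tier names, and the `resolve` helper are all assumptions; only the LOCAL tier and its `qwen3:30b-a3b` default come from this document.

```python
# Hypothetical tier-to-provider routing table (illustration only).
# LOCAL and its qwen3:30b-a3b default are documented; the other
# tier names and assignments here are invented for the sketch.
TIER_PROVIDERS = {
    "LOCAL": {"provider": "ollama", "model": "qwen3:30b-a3b"},
    "FAST": {"provider": "anthropic", "model": "claude-haiku-4-20250514"},
    "SMART": {"provider": "anthropic", "model": "claude-opus-4-20250514"},
}

def resolve(tier: str) -> tuple[str, str]:
    """Return (provider, model) for a tier, falling back to LOCAL."""
    entry = TIER_PROVIDERS.get(tier, TIER_PROVIDERS["LOCAL"])
    return entry["provider"], entry["model"]
```

The fallback-to-LOCAL choice keeps the system usable (at zero cost) even when a tier is misconfigured.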
Anthropic
The primary provider for ArgentOS. Supports Claude Haiku, Sonnet, and Opus.

- Endpoint: https://api.anthropic.com/v1
- Auth: API keys (sk-ant-api...) or setup tokens (sk-ant-oat01-...)
- Models: claude-haiku-4-20250514, claude-sonnet-4-20250514, claude-opus-4-20250514
- Features: Streaming, tool use, prompt caching (API keys only)

Prompt caching only works with standard API keys, not with Max subscription setup tokens.
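As a minimal sketch of what a request against this endpoint looks like, the helper below assembles (but does not send) an Anthropic Messages API call. The Messages API requires the `x-api-key` and `anthropic-version` headers; the `build_anthropic_request` function name and the `max_tokens` value are this sketch's own choices.

```python
def build_anthropic_request(api_key: str, model: str, prompt: str) -> dict:
    """Assemble an Anthropic Messages API request (not sent here)."""
    return {
        "url": "https://api.anthropic.com/v1/messages",
        "headers": {
            "x-api-key": api_key,               # sk-ant-api... style key
            "anthropic-version": "2023-06-01",  # required version header
            "content-type": "application/json",
        },
        "body": {
            "model": model,
            "max_tokens": 1024,
            "messages": [{"role": "user", "content": prompt}],
        },
    }
```

Pass the returned `url`, `headers`, and `body` to any HTTP client; streaming is enabled by adding `"stream": True` to the body.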
Ollama (Local)
Run open-source models locally with zero cost.

- Endpoint: http://localhost:11434
- Auth: None required
- Models: Any model available in Ollama (Qwen3, Llama, Mistral, etc.)
- Default: qwen3:30b-a3b for the LOCAL tier
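Because Ollama needs no authentication, a request is just a JSON body against the local endpoint. The sketch below targets Ollama's `/api/chat` route; the `build_ollama_request` helper name is an assumption for illustration.

```python
def build_ollama_request(model: str, prompt: str) -> dict:
    """Assemble a request for a local Ollama chat endpoint (not sent here)."""
    return {
        "url": "http://localhost:11434/api/chat",
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,  # return a single JSON object instead of a stream
        },
    }
```

No headers beyond Content-Type are needed, which is what "Auth: None required" means in practice.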
MiniMax
An alternative cloud provider with competitive pricing.

- Endpoint: https://api.minimax.io/v1 (portal API)
- Auth: Coding Plan key (sk-cp-...)
- Models: MiniMax-M2.1 (general), M2-her (roleplay/character), MiniMax-VL-01 (vision)
Z.AI
GLM-series models from Zhipu AI.

- Endpoint: Z.AI API
- Auth: Z.AI API key
- Models: GLM-4.7, GLM-4.7-FlashX, GLM-4.7-Flash
OpenRouter
A meta-provider that routes to many model providers.

- Endpoint: https://openrouter.ai/api/v1
- Auth: OpenRouter API key
- Models: Access to hundreds of models from multiple providers
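OpenRouter exposes an OpenAI-compatible chat completions route under the endpoint above, authenticated with a Bearer token. The sketch below builds such a request; the `build_openrouter_request` name is an assumption, and the example model slug in the comment is illustrative only.

```python
def build_openrouter_request(api_key: str, model: str, prompt: str) -> dict:
    """Assemble an OpenRouter chat completions request (not sent here)."""
    return {
        "url": "https://openrouter.ai/api/v1/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": {
            # OpenRouter model IDs are provider-prefixed slugs,
            # e.g. "anthropic/..." or "meta-llama/..." (illustrative).
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }
```

Because the request shape is the same for every model behind OpenRouter, switching among its hundreds of models only changes the `model` string.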
