Meerkat is provider-agnostic. Your tools, sessions, hooks, and configuration work identically across Anthropic, OpenAI, and Gemini. Switch providers by changing the model name; no code changes are required. Set an API key for at least one provider to get started.
Provider setup
```bash
export ANTHROPIC_API_KEY="sk-ant-..."
```
| Model | Context | Max output | Best for |
|---|---|---|---|
| `claude-opus-4-6` | 200K / 1M (beta) | 128K | Complex reasoning, highest quality |
| `claude-sonnet-4-5` | 200K / 1M (beta) | 64K | Balanced performance and cost |
| `claude-opus-4-5` | 200K | 64K | Legacy Opus (still supported) |
| `claude-haiku-4-5` | 200K | 64K | Fast, simple tasks |
config.toml (active realm)
```toml
[agent]
model = "claude-opus-4-6"
max_tokens_per_turn = 16384
```
```bash
export OPENAI_API_KEY="sk-..."
```
| Model | Context | Best for |
|---|---|---|
| `gpt-5.2` | 1M | General purpose, advanced reasoning |
| `gpt-5.2-pro` | 1M | Highest quality reasoning |
config.toml (active realm)
```toml
[agent]
model = "gpt-5.2"
max_tokens_per_turn = 8192
```
```bash
export GOOGLE_API_KEY="AIza..."
```
| Model | Context | Best for |
|---|---|---|
| `gemini-3-pro-preview` | 1M | Advanced reasoning, complex tasks |
| `gemini-3-flash-preview` | 1M | Fast, balanced performance |
config.toml (active realm)
```toml
[agent]
model = "gemini-3-pro-preview"
max_tokens_per_turn = 8192
```
Environment variables
| Variable | Fallback | Provider |
|---|---|---|
| `RKAT_ANTHROPIC_API_KEY` | `ANTHROPIC_API_KEY` | Anthropic Claude |
| `RKAT_OPENAI_API_KEY` | `OPENAI_API_KEY` | OpenAI GPT |
| `RKAT_GEMINI_API_KEY` | `GEMINI_API_KEY`, `GOOGLE_API_KEY` | Google Gemini |
The RKAT_* variants take precedence over provider-native names, so you can run Meerkat with dedicated keys separate from other tools.
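The precedence rule can be sketched as follows. This is an illustrative sketch only, not Meerkat's actual implementation; `resolve_api_key` is a hypothetical name, and a plain map stands in for the process environment:

```rust
use std::collections::HashMap;

// Sketch of the lookup order described above: the RKAT_-prefixed
// variable wins, then each provider-native fallback is tried in order.
fn resolve_api_key(
    env: &HashMap<&str, &str>,
    rkat_var: &str,
    fallbacks: &[&str],
) -> Option<String> {
    std::iter::once(rkat_var)
        .chain(fallbacks.iter().copied())
        .find_map(|name| env.get(name).map(|v| v.to_string()))
}

fn main() {
    let mut env = HashMap::new();
    env.insert("ANTHROPIC_API_KEY", "sk-ant-shared");
    env.insert("RKAT_ANTHROPIC_API_KEY", "sk-ant-meerkat");
    env.insert("GOOGLE_API_KEY", "AIza-example");

    // The dedicated Meerkat key shadows the shared provider-native one.
    assert_eq!(
        resolve_api_key(&env, "RKAT_ANTHROPIC_API_KEY", &["ANTHROPIC_API_KEY"]),
        Some("sk-ant-meerkat".to_string())
    );
    // Gemini falls back through both native names.
    assert_eq!(
        resolve_api_key(&env, "RKAT_GEMINI_API_KEY", &["GEMINI_API_KEY", "GOOGLE_API_KEY"]),
        Some("AIza-example".to_string())
    );
    println!("precedence ok");
}
```

Checking the dedicated name first is what lets a machine-wide `ANTHROPIC_API_KEY` coexist with a Meerkat-specific key.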
SDK feature flags
When using Meerkat as a Rust library, enable only the providers you need:
| Feature | Description | Default |
|---|---|---|
| `anthropic` | Anthropic Claude support | Yes |
| `openai` | OpenAI GPT support | Yes |
| `gemini` | Google Gemini support | Yes |
| `all-providers` | All LLM providers (convenience alias) | No |
Anthropic only (smallest binary). Because all three provider features are on by default, the default features must be disabled for the feature list to actually exclude the others:

```toml
meerkat = { version = "0.5", default-features = false, features = ["anthropic", "jsonl-store"] }
```

All providers:
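Assuming the same crate version as the Anthropic-only snippet, a dependency line using the `all-providers` convenience alias from the feature table would look like:

```toml
meerkat = { version = "0.5", features = ["all-providers"] }
```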
Provider parameters
Provider-specific options can be passed via the `--param` CLI flag or `provider_params` in the SDK:
| Parameter | Description |
|---|---|
| `thinking_budget` | Token budget for extended thinking (integer) |
| `top_k` | Top-k sampling parameter (integer) |
```bash
rkat run --model claude-sonnet-4-5 --param thinking_budget=10000 "Solve this problem"
```
| Parameter | Values | Description |
|---|---|---|
| `reasoning_effort` | `low`, `medium`, `high` | Reasoning effort for o-series models |
| `seed` | Integer | Seed for deterministic outputs |
```bash
rkat run --model gpt-5.2 --param reasoning_effort=high "Prove this theorem"
```
| Parameter | Description |
|---|---|
| `thinking_budget` | Token budget for extended thinking (integer) |
| `top_k` | Top-k sampling parameter (integer) |
```bash
rkat run --model gemini-3-flash-preview --param thinking_budget=5000 "Analyze this data"
```
Model catalog
The meerkat-models crate maintains a curated catalog of all supported models with their capabilities, context windows, output limits, and provider profile rules. This catalog is the single source of truth for model defaults, allowlists, and capability detection.
Query the catalog programmatically from any surface:
- CLI: `rkat models catalog`
- RPC: `models/catalog`
- REST: `GET /models/catalog`
- MCP: `meerkat_models_catalog`
Auto-detection
The provider is automatically inferred from the model name:
- `claude-*` models use Anthropic
- `gpt-*`, `o1-*`, `o3-*`, `chatgpt-*` models use OpenAI
- `gemini-*` models use Gemini
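The prefix rules above amount to a first-match lookup, sketched here for illustration (this is not the crate's actual detection code, and `infer_provider` is a hypothetical name):

```rust
// Illustrative sketch of prefix-based provider inference.
// First matching prefix wins; unknown names yield None.
fn infer_provider(model: &str) -> Option<&'static str> {
    const RULES: &[(&str, &str)] = &[
        ("claude-", "anthropic"),
        ("gpt-", "openai"),
        ("o1-", "openai"),
        ("o3-", "openai"),
        ("chatgpt-", "openai"),
        ("gemini-", "gemini"),
    ];
    RULES
        .iter()
        .find(|(prefix, _)| model.starts_with(*prefix))
        .map(|(_, provider)| *provider)
}

fn main() {
    assert_eq!(infer_provider("claude-opus-4-6"), Some("anthropic"));
    assert_eq!(infer_provider("gpt-5.2"), Some("openai"));
    assert_eq!(infer_provider("gemini-3-flash-preview"), Some("gemini"));
    // A name matching no rule needs an explicit provider override.
    assert_eq!(infer_provider("mystery-model"), None);
    println!("inference ok");
}
```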
You can override this with `--provider` on the CLI or `provider` in API requests.