Model providers

- This page covers LLM/model providers (not chat channels like WhatsApp/Telegram).
- For model selection rules, see /concepts/models.
Quick rules

- Model references use the `provider/model` format (example: `opencode/claude-opus-4-6`).
- If you set `agents.defaults.models`, it becomes an allowlist.
- CLI helpers: `openclaw onboard`, `openclaw models list`, `openclaw models set <provider/model>`.
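The allowlist rule above can be sketched as config. The model choices and aliases here are illustrative; the entry shape (`{ alias: ... }`) follows the local-proxy example later on this page:

```json5
{
  agents: {
    defaults: {
      model: { primary: "opencode/claude-opus-4-6" },
      // Once this map is set, only the refs listed here remain selectable:
      models: {
        "opencode/claude-opus-4-6": { alias: "Opus" },
        "openai/gpt-5.1-codex": { alias: "Codex" },
      },
    },
  },
}
```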
Built-in providers (pi-ai catalog)

OpenClaw ships with the pi-ai catalog. These providers need no `models.providers` configuration; just set up authentication and pick a model.
OpenAI

- Provider: `openai`
- Auth: `OPENAI_API_KEY`
- Example model: `openai/gpt-5.1-codex`
- CLI: `openclaw onboard --auth-choice openai-api-key`
```json5
{
  agents: { defaults: { model: { primary: "openai/gpt-5.1-codex" } } },
}
```
Anthropic

- Provider: `anthropic`
- Auth: `ANTHROPIC_API_KEY` or `claude setup-token`
- Example model: `anthropic/claude-opus-4-6`
- CLI: `openclaw onboard --auth-choice token` (paste the setup-token) or `openclaw models auth paste-token --provider anthropic`
```json5
{
  agents: { defaults: { model: { primary: "anthropic/claude-opus-4-6" } } },
}
```
OpenAI Code (Codex)

- Provider: `openai-codex`
- Auth: OAuth (ChatGPT)
- Example model: `openai-codex/gpt-5.3-codex`
- CLI: `openclaw onboard --auth-choice openai-codex` or `openclaw models auth login --provider openai-codex`
```json5
{
  agents: { defaults: { model: { primary: "openai-codex/gpt-5.3-codex" } } },
}
```
OpenCode Zen

- Provider: `opencode`
- Auth: `OPENCODE_API_KEY` (or `OPENCODE_ZEN_API_KEY`)
- Example model: `opencode/claude-opus-4-6`
- CLI: `openclaw onboard --auth-choice opencode-zen`
```json5
{
  agents: { defaults: { model: { primary: "opencode/claude-opus-4-6" } } },
}
```
Google Gemini (API key)

- Provider: `google`
- Auth: `GEMINI_API_KEY`
- Example model: `google/gemini-3-pro-preview`
- CLI: `openclaw onboard --auth-choice gemini-api-key`
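Following the same pattern shown for the providers above, the default model can be pinned in config:

```json5
{
  agents: { defaults: { model: { primary: "google/gemini-3-pro-preview" } } },
}
```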
Google Vertex, Antigravity, and Gemini CLI

- Providers: `google-vertex`, `google-antigravity`, `google-gemini-cli`
- Auth: Vertex uses gcloud ADC; Antigravity and Gemini CLI use their own authentication flows
- Antigravity OAuth ships as a bundled plugin (`google-antigravity-auth`, disabled by default).
  - Enable: `openclaw plugins enable google-antigravity-auth`
  - Log in: `openclaw models auth login --provider google-antigravity --set-default`
- Gemini CLI OAuth ships as a bundled plugin (`google-gemini-cli-auth`, disabled by default).
  - Enable: `openclaw plugins enable google-gemini-cli-auth`
  - Log in: `openclaw models auth login --provider google-gemini-cli --set-default`
- Note: you never put a client id or secret into `openclaw.json`. The CLI login flow stores tokens in auth profiles on the gateway host.
Z.AI (GLM)

- Provider: `zai`
- Auth: `ZAI_API_KEY`
- Example model: `zai/glm-4.7`
- CLI: `openclaw onboard --auth-choice zai-api-key`
- Aliases: `z.ai/*` and `z-ai/*` are normalized to `zai/*`
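As with the other built-in providers, a minimal config sets the default model:

```json5
{
  agents: { defaults: { model: { primary: "zai/glm-4.7" } } },
}
```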
Vercel AI Gateway

- Provider: `vercel-ai-gateway`
- Auth: `AI_GATEWAY_API_KEY`
- Example model: `vercel-ai-gateway/anthropic/claude-opus-4.6`
- CLI: `openclaw onboard --auth-choice ai-gateway-api-key`
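The same config pattern applies here, using the gateway-prefixed model reference:

```json5
{
  agents: { defaults: { model: { primary: "vercel-ai-gateway/anthropic/claude-opus-4.6" } } },
}
```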
Other built-in providers

- OpenRouter: `openrouter` (`OPENROUTER_API_KEY`)
  - Example model: `openrouter/anthropic/claude-sonnet-4-5`
- xAI: `xai` (`XAI_API_KEY`)
- Groq: `groq` (`GROQ_API_KEY`)
- Cerebras: `cerebras` (`CEREBRAS_API_KEY`)
  - GLM models on Cerebras use the `zai-glm-4.7` and `zai-glm-4.6` identifiers.
  - OpenAI-compatible base URL: `https://api.cerebras.ai/v1`.
- Mistral: `mistral` (`MISTRAL_API_KEY`)
- GitHub Copilot: `github-copilot` (`COPILOT_GITHUB_TOKEN` / `GH_TOKEN` / `GITHUB_TOKEN`)
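These providers follow the same config pattern as the sections above; for example, with OpenRouter:

```json5
{
  agents: { defaults: { model: { primary: "openrouter/anthropic/claude-sonnet-4-5" } } },
}
```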
Providers via models.providers (custom/base URL)

Use `models.providers` (or `models.json`) to add custom providers or OpenAI/Anthropic-compatible proxies.
Moonshot AI (Kimi)

Moonshot uses OpenAI-compatible endpoints, so configure it as a custom provider:

- Provider: `moonshot`
- Auth: `MOONSHOT_API_KEY`
- Example model: `moonshot/kimi-k2.5`

Kimi K2 model IDs:

- `moonshot/kimi-k2.5`
- `moonshot/kimi-k2-0905-preview`
- `moonshot/kimi-k2-turbo-preview`
- `moonshot/kimi-k2-thinking`
- `moonshot/kimi-k2-thinking-turbo`
```json5
{
  agents: {
    defaults: { model: { primary: "moonshot/kimi-k2.5" } },
  },
  models: {
    mode: "merge",
    providers: {
      moonshot: {
        baseUrl: "https://api.moonshot.ai/v1",
        apiKey: "${MOONSHOT_API_KEY}",
        api: "openai-completions",
        models: [{ id: "kimi-k2.5", name: "Kimi K2.5" }],
      },
    },
  },
}
```
Kimi Coding

Kimi Coding uses Moonshot AI's Anthropic-compatible endpoint:

- Provider: `kimi-coding`
- Auth: `KIMI_API_KEY`
- Example model: `kimi-coding/k2p5`
```json5
{
  env: { KIMI_API_KEY: "sk-..." },
  agents: {
    defaults: { model: { primary: "kimi-coding/k2p5" } },
  },
}
```
Qwen OAuth (free tier)

Qwen provides OAuth login via a device-code flow for Qwen Coder + Vision. Enable the bundled plugin, then log in:

```shell
openclaw plugins enable qwen-portal-auth
openclaw models auth login --provider qwen-portal --set-default
```

Model references:

- `qwen-portal/coder-model`
- `qwen-portal/vision-model`

See /providers/qwen for setup details and notes.
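If you prefer pinning the default in config rather than via `--set-default`, the same pattern used for the other providers should apply:

```json5
{
  agents: { defaults: { model: { primary: "qwen-portal/coder-model" } } },
}
```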
Synthetic

Synthetic offers Anthropic-compatible models behind the `synthetic` provider:

- Provider: `synthetic`
- Auth: `SYNTHETIC_API_KEY`
- Example model: `synthetic/hf:MiniMaxAI/MiniMax-M2.1`
- CLI: `openclaw onboard --auth-choice synthetic-api-key`
```json5
{
  agents: {
    defaults: { model: { primary: "synthetic/hf:MiniMaxAI/MiniMax-M2.1" } },
  },
  models: {
    mode: "merge",
    providers: {
      synthetic: {
        baseUrl: "https://api.synthetic.new/anthropic",
        apiKey: "${SYNTHETIC_API_KEY}",
        api: "anthropic-messages",
        models: [{ id: "hf:MiniMaxAI/MiniMax-M2.1", name: "MiniMax M2.1" }],
      },
    },
  },
}
```
MiniMax

MiniMax is configured via `models.providers` because it uses custom endpoints:

- MiniMax (Anthropic-compatible): `--auth-choice minimax-api`
- Auth: `MINIMAX_API_KEY`

See /providers/minimax for setup details, model options, and sample configurations.
Ollama

Ollama is a local LLM runtime that provides an OpenAI-compatible API:

- Provider: `ollama`
- Auth: none required (local server)
- Example model: `ollama/llama3.3`
- Installation: https://ollama.ai
```shell
# Install Ollama, then pull a model:
ollama pull llama3.3
```
```json5
{
  agents: {
    defaults: { model: { primary: "ollama/llama3.3" } },
  },
}
```
Ollama is automatically detected when running locally at http://127.0.0.1:11434/v1. See /providers/ollama for model recommendations and custom configuration.
Local proxies (LM Studio, vLLM, LiteLLM, etc.)

Example (OpenAI-compatible):
```json5
{
  agents: {
    defaults: {
      model: { primary: "lmstudio/minimax-m2.1-gs32" },
      models: { "lmstudio/minimax-m2.1-gs32": { alias: "Minimax" } },
    },
  },
  models: {
    providers: {
      lmstudio: {
        baseUrl: "http://localhost:1234/v1",
        apiKey: "LMSTUDIO_KEY",
        api: "openai-completions",
        models: [
          {
            id: "minimax-m2.1-gs32",
            name: "MiniMax M2.1",
            reasoning: false,
            input: ["text"],
            cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },
            contextWindow: 200000,
            maxTokens: 8192,
          },
        ],
      },
    },
  },
}
```
Notes:

- For custom providers, `reasoning`, `input`, `cost`, `contextWindow`, and `maxTokens` are optional. When omitted, OpenClaw defaults to:
  - `reasoning: false`
  - `input: ["text"]`
  - `cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 }`
  - `contextWindow: 200000`
  - `maxTokens: 8192`
- Recommended: set explicit values that match your proxy/model limits.
CLI examples

```shell
openclaw onboard --auth-choice opencode-zen
openclaw models set opencode/claude-opus-4-6
openclaw models list
```
See also: /gateway/configuration for full configuration examples.