
In the current source, AI provider configuration lives in Settings → Community Plugins → I18N → 语言模型 (Language Model). This page covers provider profiles, connectivity testing, and the global llmResponseFormat setting.

Supported providers

| # | Provider | Platform |
|---|----------|----------|
| 1 | OpenAI-compatible (custom) | platform.openai.com |
| 2 | Gemini | aistudio.google.com |
| 3 | Ollama (local) | ollama.com |
| 4 | DeepSeek | platform.deepseek.com |
| 5 | Zhipu AI (GLM) | open.bigmodel.cn |
| 6 | Moonshot (Kimi) | platform.moonshot.cn |
| 7 | Aliyun (Qwen / Tongyi) | dashscope.console.aliyun.com |
| 8 | Baidu (ERNIE / Qianfan) | console.bce.baidu.com/qianfan |
| 9 | ByteDance (Doubao / Ark) | console.volcengine.com/ark |
| 10 | Groq | console.groq.com |
| 11 | SiliconFlow | siliconflow.cn |
| 12 | OpenRouter | openrouter.ai |
| 13 | DeepInfra | deepinfra.com |
| 14 | Mistral AI | console.mistral.ai |
| 15 | MiniMax | platform.minimaxi.com |
| 16 | StepFun | platform.stepfun.com |
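A provider profile bundles the connection settings for one of the platforms above. The following TypeScript sketch illustrates the general shape; the interface name, field names, and defaults map are assumptions for illustration, not the plugin's actual settings schema (only the Ollama default URL is documented).

```typescript
// Illustrative sketch of a provider profile; field names are assumptions,
// not the plugin's actual settings schema.
interface ProviderProfile {
  provider: string;        // one of the platforms in the table above
  baseUrl: string;         // API endpoint, pre-filled with a provider default
  apiKey: string;          // filled in by the user
  model: string;           // model name, pre-filled with a provider default
  inputPriceCNY: number;   // CNY per million input tokens
  outputPriceCNY: number;  // CNY per million output tokens
}

// Hypothetical defaults map. The Ollama URL is documented below;
// the others are common public endpoints, shown only as examples.
const defaultBaseUrl: Record<string, string> = {
  "Ollama (local)": "http://localhost:11434",
  "OpenAI-compatible (custom)": "https://api.openai.com/v1",
  "DeepSeek": "https://api.deepseek.com",
};

// Creating a profile writes the provider's default URL (if known)
// and leaves the user-supplied fields empty.
function newProfile(provider: string): ProviderProfile {
  return {
    provider,
    baseUrl: defaultBaseUrl[provider] ?? "",
    apiKey: "",
    model: "",
    inputPriceCNY: 0,
    outputPriceCNY: 0,
  };
}
```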

Add a provider profile

1. Open the `语言模型` tab: go to Settings → Community Plugins → I18N → 语言模型.
2. Choose a provider: select the target platform from the provider dropdown.
3. Create or switch a profile: the current UI supports adding, switching between, and deleting profiles.
4. Fill in connection fields: when a profile is created, the plugin pre-fills that provider's default URL and default model; enter the API key, model name, and price fields yourself.
5. Run a connection test: click the test button to run connectivity diagnostics.
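Conceptually, a connectivity test sends a minimal request to the profile's endpoint and checks for a valid response. The sketch below shows one plausible way to build such a request; the endpoint path, payload shape (OpenAI-compatible chat completions), and function name are assumptions, not the plugin's actual implementation.

```typescript
// Illustrative sketch of building a connectivity-test request.
// Endpoint path and payload are assumed OpenAI-compatible.
interface TestRequest {
  url: string;
  headers: Record<string, string>;
  body: string;
}

function buildTestRequest(baseUrl: string, apiKey: string, model: string): TestRequest {
  return {
    // Strip a trailing slash so the path joins cleanly.
    url: `${baseUrl.replace(/\/$/, "")}/chat/completions`,
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${apiKey}`,
    },
    // A one-token "ping" prompt keeps the diagnostic cheap.
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: "ping" }],
      max_tokens: 1,
    }),
  };
}
```

The test then succeeds if the endpoint returns an HTTP 200 with a parseable completion, and fails on auth errors, unreachable hosts, or timeouts.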

Fields on this page

  • The connection fields (default URL, API key, model name) belong to the active provider profile and are stored independently per profile.
  • The price fields drive cost estimation and are stored as CNY per million tokens.
  • llmResponseFormat is a global field edited on the 语言模型 page. Its current default is `text`.
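Since prices are stored as CNY per million tokens, estimating a request's cost is a simple proration. The helper below is an illustrative sketch of that arithmetic; the function name and signature are not the plugin's API.

```typescript
// Cost estimation from prices stored as CNY per million tokens.
// Illustrative helper; not the plugin's actual API.
function estimateCostCNY(
  inputTokens: number,
  outputTokens: number,
  inputPricePerM: number,   // CNY per 1,000,000 input tokens
  outputPricePerM: number,  // CNY per 1,000,000 output tokens
): number {
  const M = 1_000_000;
  return (inputTokens / M) * inputPricePerM + (outputTokens / M) * outputPricePerM;
}
```

For example, at 2 CNY/M input and 8 CNY/M output, a request using 500k input tokens and 250k output tokens costs 1 + 2 = 3 CNY.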

Fields edited elsewhere

These global translation fields are read and written from the AST / Regex / Theme AI panels:
  • llmLanguage
  • llmStyle
  • llmBatchSize
  • llmConcurrencyLimit
  • llmTimeout
Note: Ollama defaults to `http://localhost:11434`.
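Of the fields above, llmBatchSize and llmConcurrencyLimit plausibly work together: texts are grouped into batches of llmBatchSize, with at most llmConcurrencyLimit batches in flight at once. The sketch below shows that pattern under those assumptions; it is not the plugin's actual translation loop.

```typescript
// Split items into batches of at most batchSize (cf. llmBatchSize).
function chunkBatches<T>(items: T[], batchSize: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

// Translate all texts with a bounded worker pool (cf. llmConcurrencyLimit).
// `translateBatch` stands in for one LLM call; hypothetical signature.
async function translateAll(
  texts: string[],
  batchSize: number,
  concurrency: number,
  translateBatch: (batch: string[]) => Promise<string[]>,
): Promise<string[]> {
  const batches = chunkBatches(texts, batchSize);
  const results: string[][] = new Array(batches.length);
  let next = 0;
  // Each worker repeatedly claims the next unprocessed batch index.
  const workers = Array.from(
    { length: Math.min(concurrency, batches.length) },
    async () => {
      while (next < batches.length) {
        const i = next++;
        results[i] = await translateBatch(batches[i]);
      }
    },
  );
  await Promise.all(workers);
  return results.flat();
}
```

llmTimeout would bound each `translateBatch` call, and llmLanguage / llmStyle would shape the prompt sent inside it.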