OpenClaw
To use TokenDock with OpenClaw, configure it as an LLM provider in the openclaw.json file.
Configuration
Open your ~/.openclaw/openclaw.json file and add TokenDock as an OpenAI-compatible provider:
```json
{
  "providers": {
    "tokendock": {
      "type": "openai",
      "baseURL": "https://tokendock.ai/v1",
      "apiKey": "your-tokendock-api-key",
      "models": ["Qwen3.6-Plus-256k", "DeepSeek-V3"]
    }
  },
  "agents": {
    "defaults": {
      "provider": "tokendock",
      "model": "Qwen3.6-Plus-256k"
    }
  }
}
```
Restart OpenClaw for the changes to take effect. You can now use TokenDock models for your AI agent tasks.
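Because the provider is declared with `"type": "openai"`, OpenClaw talks to TokenDock over the standard OpenAI-compatible HTTP API. As a sanity check that your base URL and key are valid, you can build the same kind of request OpenClaw would send. The sketch below is an assumption-laden illustration: it presumes TokenDock serves the usual `/v1/models` listing endpoint with `Bearer` authentication, and that your key is exported as a `TOKENDOCK_API_KEY` environment variable (neither is confirmed by this page).

```python
import json
import os
import urllib.request

# Must match the "baseURL" value in openclaw.json.
BASE_URL = "https://tokendock.ai/v1"

def build_models_request(api_key: str) -> urllib.request.Request:
    """Build a GET request to the models-listing endpoint with the
    Bearer auth header an OpenAI-compatible provider expects."""
    return urllib.request.Request(
        f"{BASE_URL}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

if __name__ == "__main__":
    # Hypothetical: assumes the key is exported as TOKENDOCK_API_KEY.
    req = build_models_request(os.environ["TOKENDOCK_API_KEY"])
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # An OpenAI-compatible /models response lists model ids under "data";
    # the ids printed here should include the ones in your "models" array.
    print([m["id"] for m in data.get("data", [])])
```

If this request fails with an authentication error, fix the `apiKey` value in openclaw.json before debugging OpenClaw itself.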