Sign up, grab an API key, and point OpenClaw at General Compute. Same models, purpose-built ASICs, 7x faster output.
Create an account at app.generalcompute.com and copy your API key — new accounts get $5 in free credit. Or hand this prompt to your agent:
Sign me up for a General Compute API account and get an API key. Fetch instructions from https://docs.generalcompute.com/agent-signup and follow them.
The agent signup flow sends a verification code to your email, then returns an API key. Save it immediately — it's only shown once.
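One common way to keep the key out of your code and shell history is an environment variable. A minimal sketch (`GC_API_KEY` is an illustrative variable name, not one mandated by these docs; the placeholder fallback just keeps the snippet runnable):

```python
import os

# Read the key from the environment rather than hard-coding it.
# "gc_example_key" is a placeholder so the snippet runs standalone.
api_key = os.environ.get("GC_API_KEY", "gc_example_key")

# Bearer auth header, as used by the curl examples below.
auth_header = {"Authorization": f"Bearer {api_key}"}
print(auth_header["Authorization"])
```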
Check what's available. No auth required:
curl https://api.generalcompute.com/v1/public/models
Full pricing and specs at docs.generalcompute.com/models
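If you want to filter the model list programmatically, a sketch like the following works against an OpenAI-style response. The response shape below (a top-level `data` array of objects with an `id` field) is an assumption based on the OpenAI-compatible API type; the sample payload stands in for a live call to `/v1/public/models`:

```python
import json

# Sample payload mirroring an assumed OpenAI-style model listing.
sample_response = json.loads("""
{"data": [
  {"id": "deepseek-v3.2"},
  {"id": "minimax-m2.5"},
  {"id": "gpt-oss-120b"}
]}
""")

# Pull out just the model identifiers.
model_ids = [m["id"] for m in sample_response["data"]]
print(model_ids)
```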
Copy the prompt below and hand it to your OpenClaw agent, or manually add the provider to your config:
Add General Compute as a custom provider to my OpenClaw config:

- Provider name: generalcompute
- Base URL: https://api.generalcompute.com/v1
- API key: <paste your GC API key>
- API type: openai-completions

Then set my default model to one of these General Compute models:

- deepseek-v3.2 (best reasoning, 8k context)
- minimax-m2.5 (best general-purpose, 160k context)
- gpt-oss-120b (fast + cheap, 128k context)
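For the manual route, the provider entry might look something like the JSON fragment below. The field names and file layout here are assumptions inferred from the prompt above — check OpenClaw's own configuration docs for the exact schema before copying it:

```json
{
  "providers": {
    "generalcompute": {
      "base_url": "https://api.generalcompute.com/v1",
      "api_key": "<paste your GC API key>",
      "api_type": "openai-completions"
    }
  },
  "default_model": "minimax-m2.5"
}
```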
Quick reference
Run a quick test to confirm everything is connected:
curl https://api.generalcompute.com/v1/chat/completions \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "deepseek-v3.2",
"messages": [{"role": "user", "content": "Hello"}],
"stream": true
}'

Replace YOUR_API_KEY with the key from step 1. You should see streamed tokens immediately.
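With `"stream": true`, the response arrives as Server-Sent-Events lines. A minimal sketch of assembling the streamed text, assuming the OpenAI-compatible chunk shape (`choices[0].delta.content` per `data:` line, terminated by a `[DONE]` sentinel) — the sample lines stand in for a live response:

```python
import json

# Sample SSE lines mirroring an assumed OpenAI-style streaming response.
sample_stream = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]

def collect_text(lines):
    """Concatenate the content deltas from SSE 'data:' lines."""
    parts = []
    for line in lines:
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        parts.append(chunk["choices"][0]["delta"].get("content", ""))
    return "".join(parts)

print(collect_text(sample_stream))  # Hello
```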
Reach out at founders@generalcompute.com or check the full docs.