Route Claude API Through a Custom Endpoint with ANTHROPIC_BASE_URL
If you want to run Claude Code or a macOS AI agent through a corporate proxy, a GitHub Copilot Business bridge, or any other Anthropic-compatible gateway, you only need one environment variable: ANTHROPIC_BASE_URL. This post walks through what it does, when you need it, and how Fazm ships it as a first-class setting so users never have to edit config files.
Why You Would Want a Custom Endpoint
There are three realistic reasons to route Claude API calls through something other than the default api.anthropic.com:
- Corporate network constraints. Your company allows outbound traffic only through a specific gateway that logs, audits, or rate-limits every API call. Hitting Anthropic directly is blocked at the firewall.
- Existing AI budget. Your team already pays for GitHub Copilot Business, AWS Bedrock, or another provider that speaks the Anthropic protocol through a local translator. You would rather use that spend than buy Anthropic credits separately.
- Self-hosted or regional gateways. You run a Claude-compatible proxy for caching, observability, or compliance with a data residency requirement.
In all three cases, the application does not need to know anything about the upstream. It just needs a URL to send requests to.
What ANTHROPIC_BASE_URL Actually Does
The official Anthropic SDKs (Python, TypeScript, and the @agentclientprotocol/claude-agent-acp package that Claude Code is built on) all honor the ANTHROPIC_BASE_URL environment variable. When it is set, the SDK uses that URL as the root for every API call instead of https://api.anthropic.com. The request format, auth header, and streaming protocol all stay the same. Your proxy just needs to accept the same shape of request on the other end.
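The resolution order is worth making explicit: an explicitly configured base URL wins, then the environment variable, then the public default. This sketch shows the idea in plain JavaScript (it is not the SDK's actual source, just the shape of the logic):

```javascript
// Conceptual sketch of base-URL resolution: explicit option wins,
// then ANTHROPIC_BASE_URL, then the public default endpoint.
function resolveBaseURL(explicit) {
  return explicit
    ?? process.env.ANTHROPIC_BASE_URL
    ?? 'https://api.anthropic.com';
}

process.env.ANTHROPIC_BASE_URL = 'http://127.0.0.1:8766';
console.log(resolveBaseURL());                      // http://127.0.0.1:8766
console.log(resolveBaseURL('https://gw.internal')); // explicit option wins
```

Everything downstream of this one value stays untouched, which is why a single environment variable is enough to redirect an entire tool.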
Setting It on the Command Line
For Claude Code and any script that uses the anthropic Python or Node package, one-shot usage is a single export:
export ANTHROPIC_BASE_URL="http://127.0.0.1:8766"
claude
That points Claude Code at a proxy running on localhost port 8766. The proxy is whatever you want it to be: a GitHub Copilot translator, a Bedrock adapter, a logging middleware. The client does not care.
To make it persistent, add the export to your shell profile:
echo 'export ANTHROPIC_BASE_URL="http://127.0.0.1:8766"' >> ~/.zshrc
To unset it for one command (force the default Anthropic API) without touching the shell config:
env -u ANTHROPIC_BASE_URL claude
Setting It Inside a GUI App
Environment variables work fine for terminal tools but break down for GUI apps. macOS GUI apps launched from the Dock, Spotlight, or Finder do not inherit your shell environment. Setting ANTHROPIC_BASE_URL in ~/.zshrc will have zero effect on a running app unless it reads the variable from somewhere else.
This is why Fazm exposes the setting directly in the UI. Go to Settings > Advanced > AI Chat > Custom API Endpoint. Toggle it on, paste your proxy URL, and the app writes it to UserDefaults. The next time the AI subprocess starts, Fazm injects it as ANTHROPIC_BASE_URL in the child process environment. No shell magic, no plist editing, no restart of the whole app required.
Note
The setting lives in Settings > Advanced > AI Chat. It only affects the AI chat subprocess, not any other network call the app makes. Analytics, crash reports, and auto-updater traffic still go to their default endpoints.
When to Use Which Approach
| Scenario | Best approach | Why |
|---|---|---|
| Local CLI scripts, Claude Code | Shell export in ~/.zshrc | Lowest friction, works with every tool that uses the Anthropic SDK |
| macOS GUI app (Fazm) | In-app setting | GUI apps do not inherit shell env, need an explicit UI |
| Corporate rollout across many laptops | Managed profile or in-app setting | Shell exports are fragile across user accounts and update cycles |
| Temporary debugging | env VAR=... command one-shot | Does not pollute the shell or persist past the command |
| CI runners | Pipeline secret + env injection | Keeps the URL out of source but in the process environment at runtime |
The GitHub Copilot Bridge Pattern
One user, Iván, built a local Node proxy that accepts Anthropic-format requests and translates them into GitHub Copilot Business chat completions. He set ANTHROPIC_BASE_URL=http://127.0.0.1:8766, pointed Fazm at it, and tool calling worked end to end without any Fazm code changes. This is the pattern in practice:
- Run a local HTTP server that speaks the Anthropic /v1/messages shape
- On each incoming request, translate the messages array and tools definition into whatever format the upstream expects
- Stream the response back in the Anthropic SSE format the client is expecting
- Forward auth however the upstream wants (Copilot uses GitHub tokens; Bedrock uses AWS signatures)
The translator is small because the two protocols are structurally similar. The hard part is tool calling: tool definitions, tool_use blocks, and tool_result blocks all need to survive the round trip. Iván reported that agentic flows, multi-turn tool calling, and streaming all worked through his bridge.
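The surviving-the-round-trip rule has a concrete coding discipline behind it: copy the whole block and override only the fields the upstream format requires. This sketch shows the idea; the upstream field names here are hypothetical, not any real provider's schema:

```javascript
// Sketch of field preservation when translating a tool_use block:
// spread the unknown remainder first so the id (and anything else we
// do not recognize) survives, then override only what must change.
function translateToolUse(block) {
  // block: { type: 'tool_use', id, name, input, ...anything else }
  const { type, name, input, ...rest } = block;
  return {
    ...rest,                           // keeps id and unrecognized fields
    role: 'tool_call',                 // hypothetical upstream field names
    function: name,
    arguments: JSON.stringify(input),
  };
}

const anthropicBlock = { type: 'tool_use', id: 'toolu_01', name: 'read_file', input: { path: '/tmp/x' } };
const out = translateToolUse(anthropicBlock);
console.log(out.id); // 'toolu_01' — preserved, so a later tool_result can match it
```

The reverse translation (upstream response back to an Anthropic tool_use block) should follow the same rule in the other direction.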
Warning
Protocol translators are unofficial by definition. Features that are specific to the Claude API (computer use, cache control, extended thinking) may not round-trip cleanly through a proxy that targets a different upstream. Test the exact workflow you care about before relying on it.
A Minimal Working Proxy
Here is the smallest useful Anthropic-compatible proxy in Node. It logs every request and passes the rest through to api.anthropic.com. You can use this as a starting point for any translator:
import http from 'node:http';
import https from 'node:https';
const UPSTREAM = 'api.anthropic.com';
const PORT = 8766;
http.createServer((req, res) => {
console.log(`[proxy] ${req.method} ${req.url}`);
const options = {
hostname: UPSTREAM,
port: 443,
path: req.url,
method: req.method,
headers: { ...req.headers, host: UPSTREAM },
};
const upstreamReq = https.request(options, (upstreamRes) => {
res.writeHead(upstreamRes.statusCode, upstreamRes.headers);
upstreamRes.pipe(res);
});
req.pipe(upstreamReq);
upstreamReq.on('error', (err) => {
console.error('[proxy] upstream error:', err);
res.writeHead(502);
res.end(JSON.stringify({ error: err.message }));
});
}).listen(PORT, () => {
console.log(`[proxy] listening on http://127.0.0.1:${PORT}`);
});
Run it, then:
export ANTHROPIC_BASE_URL="http://127.0.0.1:8766"
claude
Every Claude Code request now flows through your log line. From here, you can swap the upstream, add a cache, rewrite headers, or translate the body.
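A natural first modification is the auth rewrite a Copilot- or Bedrock-style bridge needs. The Anthropic-side header names below are the real ones; the upstream host and token are placeholders for whatever your gateway expects:

```javascript
// Sketch of rewriting auth before forwarding: drop the Anthropic-style
// x-api-key and substitute the upstream's bearer-token scheme.
function rewriteHeaders(incoming, upstreamHost, upstreamToken) {
  const headers = { ...incoming, host: upstreamHost };
  delete headers['x-api-key'];                          // Anthropic auth out
  headers['authorization'] = `Bearer ${upstreamToken}`; // upstream scheme in
  return headers;
}

const rewritten = rewriteHeaders(
  { 'x-api-key': 'sk-ant-xxx', 'content-type': 'application/json' },
  'example-upstream.internal',
  'ghu_example_token',
);
console.log(rewritten.authorization); // Bearer ghu_example_token
```

In the pass-through proxy above, you would call this where the options object builds its headers, instead of spreading req.headers directly.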
Common Pitfalls
- Trailing slash mismatch. Some proxies treat /v1/messages and /v1/messages/ as different routes. If you see 404s after setting ANTHROPIC_BASE_URL, try both with and without a trailing slash on the base URL.
- Streaming gets swallowed. If your proxy buffers the response body, SSE streaming will not work. Always pipe the upstream response straight to the client; never collect it with .on('data') and send it at the end.
- Tool-use IDs dropped on the floor. Claude tool calls use an id field to match results back to calls. A naive translator that strips unknown fields will break multi-turn tool use. Preserve every field you do not understand.
- Auth header swapped at the wrong layer. Do not remove x-api-key at the proxy if the upstream still expects it. Only rewrite auth if your upstream uses a different scheme (bearer token, HMAC, etc.).
- GUI app does not pick up the change. If you set ANTHROPIC_BASE_URL in your shell and relaunch a GUI app from the Dock, it will not see the variable. Use the in-app setting instead, or launch the app from Terminal with open -a "Fazm Dev.app" after exporting the variable.
- Cert pinning. If your proxy uses a self-signed TLS cert, the Anthropic SDK will reject the connection. Either use http:// on localhost, install your cert into the system keychain, or set NODE_EXTRA_CA_CERTS for Node-based clients.
Checklist Before You Ship This to Users
- /v1/messages with the same body shape as Anthropic

Wrapping Up
ANTHROPIC_BASE_URL is one of those environment variables that almost nobody reads in the docs until they need it. Once you do need it, it unlocks a surprisingly wide range of deployments: corporate proxies, cached gateways, cross-provider translators, and observability layers. For CLI tools, a shell export is enough. For GUI apps like Fazm, shipping it as an in-app setting removes the last bit of friction and makes the feature usable by everyone, not just people who live in a terminal.
Fazm is an open source macOS AI agent. The Custom API Endpoint setting is available in Settings > Advanced > AI Chat. Open source on GitHub.