AI Optimizer gives you a single local point through which OpenAI-bound traffic can be observed, cached, and routed more efficiently.
Paste in the license key from your welcome email to activate the app on your machine.
Save your API key so AI Optimizer can route and cache OpenAI-powered requests locally.
Once the proxy is running, AI Optimizer listens on http://localhost:3000/v1.
In many setups, the only change needed is updating your OpenAI base URL so requests flow through AI Optimizer before reaching the API.
The app shows request totals, cache hits, and hit rate so you can confirm the optimizer is working while you use your normal workflow.
For OpenAI-compatible tools, that means pointing the base URL at the local proxy:
OPENAI_BASE_URL=http://localhost:3000/v1
With that in place, requests hit the local optimizer first instead of going straight to OpenAI on every call.
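As a minimal sketch of what that variable does, here is how an OPENAI_BASE_URL-aware tool typically resolves its endpoint. The fallback URL is the standard public OpenAI API base; the resolution logic shown is an illustration of the convention, not AI Optimizer's own code:

```python
import os

# Simulate the environment change described above (illustrative only).
os.environ["OPENAI_BASE_URL"] = "http://localhost:3000/v1"

# Tools that honor OPENAI_BASE_URL resolve their endpoint roughly like this,
# falling back to the public OpenAI API when the variable is unset.
base = os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1")
endpoint = base.rstrip("/") + "/chat/completions"

print(endpoint)  # → http://localhost:3000/v1/chat/completions
```

The official OpenAI SDKs read OPENAI_BASE_URL automatically and also accept a base_url option on the client, so either approach routes traffic through the proxy.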