BYOK lets you use your own LLM provider API keys through a gateway platform, keeping direct provider pricing while gaining routing and orchestration features.
You only pay credits per request. No monthly subscription. Paid credits never expire.
Replace multiple AI subscriptions with one wallet that includes routing, failover, and optimization.
Bring Your Own Key (BYOK) is a pattern where you provide your own API keys for LLM providers (OpenAI, Anthropic, Google, etc.) to a gateway platform instead of using the gateway's pooled access. The gateway routes your requests using your keys, giving you direct provider pricing, your own rate limits, and your existing billing relationship — while still providing routing, failover, comparison, and other gateway features.
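From the client's point of view, a BYOK request looks the same as any other gateway request; the gateway resolves your stored provider key server-side. The sketch below illustrates that idea only. The endpoint URL, payload fields, and header names are assumptions for illustration, not the documented LLMWise API.

```python
# Illustrative sketch: a BYOK request is indistinguishable from any other
# gateway request on the client side. All names here are hypothetical.
import requests

GATEWAY_URL = "https://api.example-gateway.com/v1/chat"  # hypothetical endpoint

response = requests.post(
    GATEWAY_URL,
    headers={"Authorization": "Bearer <your-gateway-api-key>"},
    json={
        "model": "gpt-4o",  # routed to OpenAI using your own stored key
        "messages": [{"role": "user", "content": "Summarize BYOK in one sentence."}],
    },
    timeout=30,
)
print(response.json())
```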
BYOK solves a key tension in LLM infrastructure: you want the operational benefits of a gateway (unified API, failover, comparison) without paying a markup on token costs. With BYOK, you keep your negotiated enterprise pricing, your own rate limit allocation, and your direct billing relationship with each provider. The gateway adds value through routing and orchestration, not through token markup.
Trusting a gateway with your provider keys requires strong security. LLMWise encrypts all BYOK keys using Fernet (AES-128) with a dedicated encryption key. Keys are stored encrypted at rest and only decrypted in memory for the duration of a request. When BYOK keys are available, requests route directly to the provider rather than through a shared pool, and credit charges are skipped entirely since you pay the provider directly.
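As a rough sketch of that encryption flow, the snippet below uses the Fernet recipe from the Python cryptography package (AES-128-CBC with HMAC-SHA256). Variable names and storage details are assumptions, not LLMWise internals.

```python
# Minimal sketch of encrypt-at-rest / decrypt-in-memory for a BYOK key.
from cryptography.fernet import Fernet

encryption_key = Fernet.generate_key()  # in practice, a dedicated key loaded from secret storage
fernet = Fernet(encryption_key)

# At configuration time: encrypt the provider key before persisting it.
provider_api_key = b"sk-your-openai-key"
stored_ciphertext = fernet.encrypt(provider_api_key)

# At request time: decrypt in memory only for the duration of the call.
plaintext_key = fernet.decrypt(stored_ciphertext).decode()
# ... use plaintext_key to call the provider, then let it fall out of scope
```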
Add your provider keys in the LLMWise dashboard under Settings. Once configured, requests to that provider's models automatically use your key. You still get LLMWise features — Auto routing, Compare mode, failover, usage analytics — but pay the provider directly for token usage instead of using LLMWise credits. You can mix BYOK and credit-based access: use your own OpenAI key for GPT requests while using LLMWise credits for Anthropic models.
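The mixed BYOK/credit behavior described above amounts to a simple per-provider decision: if a user key exists, route with it and skip credit charges; otherwise fall back to the shared pool and bill credits. The function below is a hedged illustration of that logic; the names and data structures are hypothetical, not the actual LLMWise implementation.

```python
# Hedged sketch of the BYOK-vs-credits routing decision. Illustrative only.
from typing import Optional

def resolve_key_and_billing(provider: str,
                            byok_keys: dict[str, str],
                            pooled_keys: dict[str, str]) -> tuple[str, bool]:
    """Return (api_key, charge_credits) for a request to `provider`."""
    byok_key: Optional[str] = byok_keys.get(provider)
    if byok_key is not None:
        # BYOK: route directly with the user's own key, no credit charge.
        return byok_key, False
    # No BYOK key configured: use the shared pool and bill credits.
    return pooled_keys[provider], True

# Example: OpenAI via your own key, Anthropic via LLMWise credits.
byok = {"openai": "sk-user-owned-key"}
pool = {"openai": "sk-pool", "anthropic": "sk-ant-pool"}
print(resolve_key_and_billing("openai", byok, pool))     # ('sk-user-owned-key', False)
print(resolve_key_and_billing("anthropic", byok, pool))  # ('sk-ant-pool', True)
```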
LLMWise gives you five orchestration modes — Chat, Compare, Blend, Judge, and Mesh — along with a built-in optimization policy, failover routing, and a replay lab. No monthly subscription is required, and paid credits do not expire.
Start free with 40 credits.