Glossary

What Is BYOK (Bring Your Own Key)?

BYOK lets you use your own LLM provider API keys through a gateway platform, keeping direct provider pricing while gaining routing and orchestration features.

You only pay credits per request. No monthly subscription. Paid credits never expire.

Replace multiple AI subscriptions with one wallet that includes routing, failover, and optimization.

Why teams start here first

- No monthly subscription: pay-as-you-go credits. Start with trial credits, then buy only what you consume.
- Failover safety: production-ready routing. Auto fallback across providers when latency, quality, or reliability changes.
- Data control: your policy, your choice. BYOK and zero-retention mode keep training and storage scope explicit.
- Single API experience: one key, multi-provider access. Use Chat/Compare/Blend/Judge/Failover from one dashboard.
Definition

Bring Your Own Key (BYOK) is a pattern where you provide your own API keys for LLM providers (OpenAI, Anthropic, Google, etc.) to a gateway platform instead of using the gateway's pooled access. The gateway routes your requests using your keys, giving you direct provider pricing, your own rate limits, and your existing billing relationship — while still providing routing, failover, comparison, and other gateway features.

Why BYOK matters

BYOK solves a key tension in LLM infrastructure: you want the operational benefits of a gateway (unified API, failover, comparison) without paying a markup on token costs. With BYOK, you keep your negotiated enterprise pricing, your own rate limit allocation, and your direct billing relationship with each provider. The gateway adds value through routing and orchestration, not through token markup.

BYOK security considerations

Trusting a gateway with your provider keys requires strong security. LLMWise encrypts all BYOK keys using Fernet (AES-128) with a dedicated encryption key. Keys are stored encrypted at rest and only decrypted in memory for the duration of a request. When BYOK keys are available, requests route directly to the provider rather than through a shared pool, and credit charges are skipped entirely since you pay the provider directly.
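The at-rest handling described above can be sketched with the `cryptography` library's Fernet primitive. This is an illustrative sketch, not LLMWise's actual code; the function names and the `BYOK_ENCRYPTION_KEY` variable are assumptions, though the pattern (a dedicated key, encrypt before storage, decrypt only in memory) matches the description.

```python
import os
from cryptography.fernet import Fernet

# Dedicated encryption key, kept separate from application secrets.
# Falls back to a generated key here only so the sketch is runnable.
fernet_key = os.environ.get("BYOK_ENCRYPTION_KEY") or Fernet.generate_key()
cipher = Fernet(fernet_key)

def store_byok_key(provider_api_key: str) -> bytes:
    """Encrypt a provider key before it is written to storage."""
    return cipher.encrypt(provider_api_key.encode())

def load_byok_key(encrypted: bytes) -> str:
    """Decrypt in memory, only for the duration of a request."""
    return cipher.decrypt(encrypted).decode()
```

Fernet authenticates as well as encrypts (AES-128-CBC plus an HMAC), so a tampered ciphertext fails to decrypt rather than yielding a corrupted key.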

Using BYOK with LLMWise

Add your provider keys in the LLMWise dashboard under Settings. Once configured, requests to that provider's models automatically use your key. You still get LLMWise features — Auto routing, Compare mode, failover, usage analytics — but pay the provider directly for token usage instead of using LLMWise credits. You can mix BYOK and credit-based access: use your own OpenAI key for GPT requests while using LLMWise credits for Anthropic models.
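The mixed BYOK-plus-credits behaviour amounts to a per-provider lookup at request time. The sketch below is hypothetical (the names and structures are not LLMWise's API); it only illustrates the rule stated above: a request uses your own key and skips credit charges when one is configured, and falls back to pooled, credit-billed access otherwise.

```python
# Hypothetical keys a user added via the dashboard; names are illustrative.
BYOK_KEYS = {"openai": "sk-your-own-key"}

def resolve_request(provider: str) -> dict:
    """Choose the caller's own key when one exists for this provider;
    otherwise route through the shared pool and charge credits."""
    if provider in BYOK_KEYS:
        return {"api_key": BYOK_KEYS[provider], "charge_credits": False}
    return {"api_key": "pooled-platform-key", "charge_credits": True}
```

With this setup, a GPT request resolves to your own OpenAI key with no credit charge, while an Anthropic request uses pooled access and credits.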

How LLMWise implements this

LLMWise gives you five orchestration modes — Chat, Compare, Blend, Judge, and Mesh — with built-in optimization policy, failover routing, and replay lab. No monthly subscription is required and paid credits do not expire.


Common questions

Is BYOK cheaper than using a gateway's built-in access?
Usually yes, especially at scale. With BYOK you pay the provider directly at your negotiated rate, with no gateway markup on tokens. LLMWise does not charge credits for BYOK requests. The savings increase with volume — enterprise customers with negotiated provider discounts see the most benefit.
Are my API keys secure with BYOK?
LLMWise encrypts all BYOK keys using Fernet (AES-128) encryption. Keys are stored encrypted at rest and only decrypted in memory during request processing. They are never logged, never included in error reports, and never shared across users. The encryption key is a mandatory environment variable separate from the application secrets.
What is BYOK in AI?
BYOK (Bring Your Own Key) is a pattern where you provide your own LLM provider API keys to a gateway platform. This lets you keep direct provider pricing and your own rate limits while gaining gateway features like routing, failover, and model comparison. LLMWise supports BYOK with Fernet-encrypted key storage.
Can I use BYOK for some providers and credits for others?
Yes. LLMWise lets you mix BYOK and credit-based access. For example, you might use your own OpenAI key for GPT requests while using LLMWise credits for Anthropic and Google models. Requests using BYOK keys skip credit charges entirely.

One wallet, enterprise AI controls built in


Chat, Compare, Blend, Judge, Mesh. Policy routing + replay lab. Failover without extra subscriptions.