Safe LLM Gateway: self-hosted model access, credentials, and usage control

Gateway Operations

Manage provider credentials, virtual keys, model access, and usage from one local control surface.

OpenAI-compatible · Virtual keys · Usage control
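Because the gateway is OpenAI-compatible, any OpenAI-style client can talk to it by swapping the base URL and using a virtual key as the bearer token. A minimal sketch of what such a request looks like; the base URL `http://localhost:8080/v1` and the key value are assumptions for illustration, not documented defaults:

```python
import json

# Assumed values for illustration only.
GATEWAY_BASE = "http://localhost:8080/v1"      # not a documented default
VIRTUAL_KEY = "vk-example-not-a-real-key"      # a virtual key issued by the gateway

def build_chat_request(model: str, prompt: str) -> tuple[str, dict, bytes]:
    """Build an OpenAI-compatible chat completion request aimed at the gateway."""
    url = f"{GATEWAY_BASE}/chat/completions"
    headers = {
        "Authorization": f"Bearer {VIRTUAL_KEY}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,  # the public alias configured in the gateway, not the upstream name
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, headers, body

url, headers, body = build_chat_request("my-gpt-alias", "Hello")
```

The client never sees the upstream provider key; it authenticates with the virtual key, and the gateway substitutes the stored provider credential on the way out.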

Admin Session

Use the same bearer token configured in the container environment.
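An admin call then simply presents that token as a bearer header. A sketch using only the standard library; the `/admin/status` path and the `ADMIN_TOKEN` environment variable name are hypothetical stand-ins for whatever is configured in the container:

```python
import os
import urllib.request

# Hypothetical variable name; use whatever the container environment defines.
token = os.environ.get("ADMIN_TOKEN", "change-me")

# Hypothetical admin endpoint, shown only to illustrate the header shape.
req = urllib.request.Request(
    "http://localhost:8080/admin/status",
    headers={"Authorization": f"Bearer {token}"},
)
```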

Gateway Status

A real-time snapshot of the entire gateway.

Requests: 0
Tokens: 0
Cost: $0
Models: 0
Not connected.

Usage By Key

Quick comparison of spend and traffic across your virtual keys.

Key | Requests | Cost (USD)

Recent Requests

Last recorded events across all keys.

No data yet.

Add Model

Add a single provider model with your own public alias.

Create Virtual Key

The plaintext key is shown only once and is never stored unencrypted.
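The show-once pattern usually means persisting only a short prefix (for display) and a digest (for verification), and discarding the plaintext after it is returned to the user. A stdlib sketch of that pattern, as one way to satisfy "never stored unencrypted"; it is not the gateway's actual code:

```python
import hashlib
import secrets

def mint_virtual_key() -> tuple[str, str, str]:
    """Return (plaintext, prefix, digest). Only prefix and digest are persisted."""
    plaintext = "vk-" + secrets.token_urlsafe(24)
    prefix = plaintext[:8]  # enough to identify the key in tables and logs
    digest = hashlib.sha256(plaintext.encode()).hexdigest()
    return plaintext, prefix, digest

def verify(presented: str, stored_digest: str) -> bool:
    """Check an incoming key against the stored digest; plaintext is never needed again."""
    return hashlib.sha256(presented.encode()).hexdigest() == stored_digest

plaintext, prefix, digest = mint_virtual_key()
```

If a key is lost, it cannot be recovered from the stored digest; the only remedy is minting a new one, which is exactly the property the show-once warning describes.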

Provider Credentials

Store upstream service keys encrypted in SQLite and mark one default per provider.
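The "one default per provider" rule can be enforced directly in SQLite with a partial unique index. A self-contained sketch under assumed table and column names (the ciphertext here is a placeholder byte string; the real gateway would store an actual encrypted blob):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE credentials (
    id         INTEGER PRIMARY KEY,
    name       TEXT NOT NULL,
    provider   TEXT NOT NULL,
    ciphertext BLOB NOT NULL,   -- placeholder; a real cipher's output goes here
    is_default INTEGER NOT NULL DEFAULT 0
);
-- At most one row per provider may have is_default = 1.
CREATE UNIQUE INDEX one_default_per_provider
    ON credentials(provider) WHERE is_default = 1;
""")

def add_credential(name, provider, ciphertext, make_default=False):
    if make_default:  # demote any existing default first so the index stays satisfied
        db.execute("UPDATE credentials SET is_default = 0 WHERE provider = ?", (provider,))
    db.execute(
        "INSERT INTO credentials (name, provider, ciphertext, is_default) VALUES (?, ?, ?, ?)",
        (name, provider, ciphertext, int(make_default)),
    )

add_credential("main", "openai", b"<encrypted>", make_default=True)
add_credential("backup", "openai", b"<encrypted>")
default = db.execute(
    "SELECT name FROM credentials WHERE provider = 'openai' AND is_default = 1"
).fetchone()
```

With this layout, "mark one default per provider" is a constraint the database itself upholds, so upstream calls can always resolve exactly one credential per provider.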

Stored Credentials

Default credentials are used automatically for upstream calls.

Name | Provider | Preview | Default | Status | Action

Configured Models

Toggle models on or off without deleting them.

Public Name | Provider | Upstream Model | Status | Price | Action

Virtual Keys

Disable a key instantly without changing client code elsewhere.
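Instant disable works because clients authenticate by key while the gateway checks an `enabled` flag on every request; flipping the flag revokes access without rotating anything on the client side. A minimal SQLite sketch of that flag check, with assumed table and column names:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE virtual_keys ("
    "prefix TEXT PRIMARY KEY, digest TEXT NOT NULL, enabled INTEGER NOT NULL DEFAULT 1)"
)
db.execute("INSERT INTO virtual_keys VALUES ('vk-abc12', '<sha256 digest>', 1)")

def set_enabled(prefix: str, enabled: bool) -> None:
    """Flip the flag; no client-side change is required."""
    db.execute("UPDATE virtual_keys SET enabled = ? WHERE prefix = ?", (int(enabled), prefix))

def key_is_usable(prefix: str) -> bool:
    """Gate every incoming request on the current flag value."""
    row = db.execute("SELECT enabled FROM virtual_keys WHERE prefix = ?", (prefix,)).fetchone()
    return bool(row and row[0])

set_enabled("vk-abc12", False)  # takes effect on the very next request
```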

Name | Prefix | Status | Limits | Allowed Models | Last Used | Action

Test Keys

Verify upstream provider credentials and test a virtual key against a specific model.

Provider Check

Validate upstream credential

Use this first to confirm the provider key is alive before testing a virtual key on top of it.

Virtual Key Check

Run one model test

Pick a model from the registry, send a tiny prompt, and inspect the live gateway response.
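The "tiny prompt" idea is to keep the probe cheap and easy to inspect: a one-line instruction, a low output cap, and zero temperature. A sketch of such a probe body; the exact fields the gateway forwards are assumed to match the OpenAI chat-completions shape:

```python
import json

def build_probe(model: str) -> bytes:
    """A minimal 'is this model alive' probe: tiny prompt, capped output."""
    return json.dumps({
        "model": model,  # the public alias picked from the registry
        "messages": [{"role": "user", "content": "Reply with the single word: ok"}],
        "max_tokens": 5,   # keep the test cheap
        "temperature": 0,  # make the live response easy to eyeball
    }).encode()

probe = build_probe("my-gpt-alias")
```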

No tests yet.

Usage By Model

Model | Provider | Requests | Tokens | Cost (USD)
