Bifrost

Open-source LLM gateway with MCP support and governance


Bifrost is an open-source LLM gateway that routes requests across multiple language model providers with a dynamic plugin architecture and built-in governance controls. It claims 40x faster performance than LiteLLM and integrates with Maxim for end-to-end evaluation and observability. It targets AI engineers and teams building production AI products who need unified model access with monitoring.
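The core idea of a gateway like this is routing each request to a provider that serves the requested model, with failover when a provider is down. The sketch below is a hypothetical illustration of that pattern only; `Provider` and `route_request` are invented names, not Bifrost's actual API.

```python
# Minimal sketch of gateway-style provider routing (illustrative, not Bifrost's API).
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    models: frozenset[str]
    healthy: bool = True

def route_request(model: str, providers: list[Provider]) -> Provider:
    """Return the first healthy provider that serves the requested model."""
    for p in providers:
        if p.healthy and model in p.models:
            return p
    raise LookupError(f"no healthy provider for model {model!r}")

providers = [
    Provider("openai", frozenset({"gpt-4o", "gpt-4o-mini"})),
    Provider("anthropic", frozenset({"claude-sonnet"})),
]

print(route_request("claude-sonnet", providers).name)  # anthropic
```

A real gateway layers retries, rate limits, and per-key governance on top of this lookup, but the routing decision itself reduces to a health-aware model-to-provider match like the one above.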

At a glance

Company: Bifrost
Pricing: Free
API available: Yes
Self-hostable: Yes
Launched: 2025-08
Last verified: 2026-05-11

Capabilities

multi-model-routing, observability, plugin-architecture, governance, mcp-support, llm-proxy


For AI agents: machine-readable markdown version of this page at /tools/bifrost-2.md, or send Accept: text/markdown.