Providers

Prysm supports multiple LLM providers through a unified OpenAI-compatible interface.

OpenAI

Default provider. No special configuration needed — just use OpenAI model names (gpt-4o, gpt-4o-mini, o1, etc.).
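
A minimal request against the default provider is just a standard OpenAI client call. The model name below is illustrative, and the client is assumed to read `OPENAI_API_KEY` from the environment:

```python
from openai import OpenAI

# Standard OpenAI client; picks up OPENAI_API_KEY from the environment.
client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```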

Anthropic

Prysm includes an automatic translation layer that converts OpenAI-format requests to Anthropic's native format and back. Use Claude model names directly:

from openai import OpenAI

# Assumes your OpenAI client is configured to point at your Prysm deployment.
client = OpenAI()

response = client.chat.completions.create(
    model="claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Hello!"}],
)

Prysm detects the claude-* prefix and routes to your Anthropic provider key automatically.

Google Gemini

Google's Gemini models support the OpenAI-compatible endpoint natively. Set the provider base URL to https://generativelanguage.googleapis.com/v1beta/openai and use your Google AI API key.
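
As a sketch, that configuration looks like the following. The base URL comes from the paragraph above; the API key and model name (`gemini-2.0-flash`) are placeholders:

```python
from openai import OpenAI

# Gemini's OpenAI-compatible endpoint; supply your Google AI API key.
client = OpenAI(
    base_url="https://generativelanguage.googleapis.com/v1beta/openai",
    api_key="YOUR_GOOGLE_AI_KEY",  # placeholder
)

# Requests then use Gemini model names, e.g.:
# client.chat.completions.create(model="gemini-2.0-flash", messages=[...])
```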

Custom / Self-Hosted Models

Any endpoint that speaks the OpenAI API format works with Prysm. This includes vLLM, Ollama, Together AI, Fireworks, and any other OpenAI-compatible server.
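
The same pattern applies to any of these servers: point the client's base URL at the endpoint. As one illustration, a local Ollama instance serves an OpenAI-compatible API at `/v1` on its default port; the model name is whatever you have pulled locally, and the API key is required by the client but ignored by Ollama:

```python
from openai import OpenAI

# Point the client at a local Ollama server (default port 11434).
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama3.1",  # any locally pulled model
    messages=[{"role": "user", "content": "Hello!"}],
)
```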