Getting Started

This is the shortest path to sending your first request through PrysmAI.

1. Create your project

Sign in to PrysmAI, then:

  • create a new project
  • connect at least one upstream provider key

If you are self-hosting PrysmAI locally, see the Self-Hosted Proxy section later in this guide for stack setup and local base URLs.

2. Generate a Prysm API key

Go to API Keys and create a project key. Keys look like:

sk-prysm-...
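Rather than hardcoding the key in source, you can keep it in an environment variable. This is a small sketch; the PRYSM_API_KEY variable name is a convention of this example, not something the SDK requires:

```python
import os

# Read the project key from the environment; PRYSM_API_KEY is an
# illustrative variable name, and the fallback is a placeholder.
prysm_key = os.environ.get("PRYSM_API_KEY", "sk-prysm-...")
```

You can then pass prysm_key to the client in step 4 instead of an inline literal.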

3. Install the SDK

pip install prysmai

4. Send your first request

from prysmai import PrysmClient

client = PrysmClient(prysm_key="sk-prysm-...").llm()

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello from Prysm"}],
)
print(response.choices[0].message.content)

Change only the model name to route to another connected provider like Anthropic or Gemini.
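As a sketch of what that means in practice: the request body stays identical across providers, and only the model string changes. The Anthropic model name below is illustrative; use one your connected provider actually exposes:

```python
# The request body is identical across providers; only "model" changes.
# "claude-3-5-haiku-latest" is illustrative -- substitute a model name
# your connected provider actually offers.
base_request = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello from Prysm"}],
}
anthropic_request = {**base_request, "model": "claude-3-5-haiku-latest"}
```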

5. Check the dashboard

After your first request, you should see:

  • the request in Activity
  • the trace in Requests
  • latency, token, and cost data attached to the record

6. Choose your runtime path

Proxy path

The Python SDK's llm() and async_llm() helpers automatically point at Prysm's OpenAI-compatible proxy at:

https://prysmai.io/api/v1
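Because the proxy is OpenAI-compatible, any OpenAI-style client, or even a raw HTTP request, can target that base URL. A minimal sketch, assuming the standard OpenAI wire format (/chat/completions path, Bearer auth header):

```python
import json
import urllib.request

# Base URL from this guide; the /chat/completions path and Bearer header
# follow the OpenAI wire format. Swap in your real project key.
BASE_URL = "https://prysmai.io/api/v1"

payload = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello from Prysm"}],
}
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer sk-prysm-...",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req) would send the request; it is left out so
# this sketch runs offline.
```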

MCP path

Connect your agent runtime to Prysm MCP:

https://prysmai.io/api/mcp

MCP docs:

https://prysmai.io/api/mcp/docs
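How you register the endpoint depends on your MCP client. Many runtimes accept a JSON config entry along these lines; the server name, schema, and header field here are illustrative, and only the URL comes from this guide:

```json
{
  "mcpServers": {
    "prysm": {
      "url": "https://prysmai.io/api/mcp",
      "headers": {
        "Authorization": "Bearer sk-prysm-..."
      }
    }
  }
}
```

Check your runtime's MCP documentation for its exact configuration schema and supported auth mechanisms.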