## Error Handling

The proxy preserves upstream error types. If the LLM provider returns an error, you get the same exception you'd get without Prysm.

```python
import openai

# `client` is an openai.OpenAI instance configured with your sk-prysm-* key
# and pointed at the Prysm proxy base URL.
client = openai.OpenAI()

try:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "test"}],
    )
except openai.AuthenticationError:
    print("Invalid upstream API key (check project settings)")
except openai.RateLimitError:
    print("Provider rate limited")
except openai.APIError as e:
    print(f"API error: {e}")
```

## Prysm-Specific Errors

| Status | Error | Cause |
|--------|-------|-------|
| 401 | Invalid API key | The `sk-prysm-*` key is missing, malformed, or revoked |
| 403 | Request blocked | Security scan detected a high-threat request (injection, PII, policy violation) |
| 404 | Project not found | The API key doesn't belong to any active project |
| 429 | Rate limited | Free tier limit reached (10K requests/month) |
| 502 | Provider error | Upstream provider returned an error or is unreachable |
| 504 | Timeout | Request to the upstream provider timed out |
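Of these, 429, 502, and 504 are transient and worth retrying, while 401, 403, and 404 indicate a configuration or policy problem that a retry won't fix. A minimal client-side retry policy might look like this (a sketch, not part of the Prysm SDK; `send` and the helper names are illustrative, and the status split mirrors the table above):

```python
import time

# Statuses from the table above that a retry can plausibly resolve.
RETRYABLE = {429, 502, 504}

def call_with_retries(send, max_attempts=3, base_delay=1.0):
    """Call `send()` (which returns a (status, body) pair) and retry transient
    failures with exponential backoff. Non-retryable statuses raise immediately."""
    for attempt in range(max_attempts):
        status, body = send()
        if status < 400:
            return body
        if status not in RETRYABLE or attempt == max_attempts - 1:
            raise RuntimeError(f"Prysm returned {status}: {body}")
        time.sleep(base_delay * 2 ** attempt)  # 1s, 2s, 4s, ...
```

Capping attempts matters for 429 in particular: on the free tier the limit is monthly, so aggressive retries only burn more of the quota.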

## Troubleshooting

| Symptom | Cause | Fix |
|---------|-------|-----|
| 401 on every request | Wrong API key format | Ensure the key starts with `sk-prysm-` |
| 502 on first request | Provider not configured | Add the provider API key in Settings |
| Traces missing in dashboard | Key mismatch | Verify the key in your code matches the dashboard |
| High latency | Cold start or provider slowness | Check the provider status page; the first request is slower |
| 403 unexpected blocks | Security too aggressive | Lower the threat threshold in Security settings |
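Since the most common 401 is a malformed or truncated key, a cheap preflight check at application startup catches it before the first request fails. A sketch (the helper name is illustrative; the `sk-prysm-` prefix is the only format rule documented above):

```python
def looks_like_prysm_key(key: str) -> bool:
    """Cheap sanity check for a Prysm API key: it must start with
    'sk-prysm-' and have a non-empty suffix. Catches copy-paste
    truncation before the first request returns a 401."""
    prefix = "sk-prysm-"
    return key.startswith(prefix) and len(key) > len(prefix)
```

Run this once when loading configuration and fail fast with a clear message, rather than letting the first live request surface the problem.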