Advanced Features

Tool Calling / Function Calling

Prysm captures tool calls from OpenAI, Anthropic, and Google Gemini models. When a model returns tool calls, they're stored in the trace and displayed in the Request Explorer.

from openai import OpenAI

client = OpenAI()  # assumes a client already configured to send requests through Prysm

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What's the weather in London?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"]
            }
        }
    }],
)
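Prysm records whatever tool calls the model returns; on the client side you still handle them yourself. A minimal sketch of pulling call names and parsed arguments out of an OpenAI-style response payload (the sample payload below is illustrative, not a real capture):

```python
import json

def extract_tool_calls(response_payload):
    """Return (name, parsed-arguments) pairs from an OpenAI-style
    chat completion payload; [] when the model answered in plain text."""
    message = response_payload["choices"][0]["message"]
    calls = message.get("tool_calls") or []
    return [
        (call["function"]["name"], json.loads(call["function"]["arguments"]))
        for call in calls
    ]

# Illustrative payload shaped like an OpenAI chat completion response
sample = {
    "choices": [{
        "message": {
            "role": "assistant",
            "content": None,
            "tool_calls": [{
                "id": "call_1",
                "type": "function",
                "function": {"name": "get_weather", "arguments": '{"city": "London"}'},
            }],
        },
        "finish_reason": "tool_calls",
    }]
}

print(extract_tool_calls(sample))  # [('get_weather', {'city': 'London'})]
```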

Logprobs

Request logprobs from OpenAI and they'll be captured in the trace.

# Reuses a client configured to route through Prysm, as above
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "The capital of France is"}],
    logprobs=True,
    top_logprobs=3,
)
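Log probabilities arrive as natural logs, so exponentiating gives per-token probabilities. A small sketch of ranking alternatives for one token (the entry shape mirrors one element of OpenAI's `logprobs.content`; the values here are illustrative):

```python
import math

def top_token_probs(logprob_entry):
    """Convert one token's top_logprobs list into (token, probability)
    pairs, sorted most-likely first."""
    pairs = [
        (alt["token"], math.exp(alt["logprob"]))
        for alt in logprob_entry["top_logprobs"]
    ]
    return sorted(pairs, key=lambda p: p[1], reverse=True)

# Illustrative entry shaped like response.choices[0].logprobs.content[i]
entry = {
    "token": " Paris",
    "logprob": -0.01,
    "top_logprobs": [
        {"token": " Paris", "logprob": -0.01},
        {"token": " paris", "logprob": -5.2},
        {"token": " the", "logprob": -7.1},
    ],
}

for token, prob in top_token_probs(entry):
    print(f"{token!r}: {prob:.4f}")
```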

What Gets Captured

Model: Which model was called
Provider: The upstream provider
Latency: Total request duration in milliseconds
TTFT: Time to first token (streaming only)
Prompt tokens: Input token count
Completion tokens: Output token count
Cost: Calculated cost based on model pricing (USD)
Status: success, error, or timeout
Request body: Full messages array, tools, and parameters
Response body: Complete model response
Tool calls: Function/tool calls returned by the model
Logprobs: Token log probabilities (if requested)
User ID: From header or prysm_context
Session ID: From header or prysm_context
Custom metadata: From header or prysm_context
Finish reason: stop, length, tool_calls, or content_filter
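The cost field is derived from the captured token counts and per-model pricing. A back-of-envelope sketch of that calculation (the per-million-token rates below are placeholders for illustration, not Prysm's actual pricing table):

```python
def estimate_cost_usd(prompt_tokens, completion_tokens,
                      input_per_million, output_per_million):
    """Estimate request cost in USD from token counts and
    per-million-token input/output rates."""
    return (prompt_tokens * input_per_million
            + completion_tokens * output_per_million) / 1_000_000

# Placeholder rates, for illustration only
cost = estimate_cost_usd(prompt_tokens=1200, completion_tokens=300,
                         input_per_million=0.15, output_per_million=0.60)
print(f"${cost:.6f}")  # $0.000360
```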