1. Install the SDK

pip install "pandaprobe[openai]"
2. Set environment variables

export PANDAPROBE_API_KEY="your-api-key"
export PANDAPROBE_PROJECT_NAME="my-first-project"
export PANDAPROBE_ENDPOINT="https://your-pandaprobe-instance.com"
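If you prefer configuring from Python (for example in a notebook, where shell exports are inconvenient), the same variables can be set before the SDK is imported; a minimal sketch, using placeholder values:

```python
import os

# Equivalent to the shell exports above. setdefault leaves any value
# already present in the environment untouched, so a real export from
# the shell still wins.
os.environ.setdefault("PANDAPROBE_API_KEY", "your-api-key")
os.environ.setdefault("PANDAPROBE_PROJECT_NAME", "my-first-project")
os.environ.setdefault("PANDAPROBE_ENDPOINT", "https://your-pandaprobe-instance.com")
```

Set these before importing pandaprobe so the SDK picks them up at initialization.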
3. Wrap your OpenAI client

import pandaprobe
from pandaprobe.wrappers import wrap_openai
from openai import OpenAI

client = wrap_openai(OpenAI())

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What is PandaProbe?"}],
)

print(response.choices[0].message.content)
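For intuition, a wrapper like wrap_openai typically acts as a thin proxy: it forwards each call to the real client, then records the inputs and output as a span. The sketch below illustrates that pattern only; FakeCompletions, traced, and record_span are hypothetical stand-ins, not PandaProbe's actual implementation:

```python
from functools import wraps

# Hypothetical stand-in for a tracing backend call.
def record_span(name, inputs, output):
    print(f"span={name} inputs={inputs} output={output!r}")

# Hypothetical stand-in for the OpenAI completions API.
class FakeCompletions:
    def create(self, model, messages):
        return f"echo: {messages[-1]['content']}"

def traced(method):
    @wraps(method)
    def wrapper(*args, **kwargs):
        output = method(*args, **kwargs)    # call the wrapped API
        record_span("llm", kwargs, output)  # then record the span
        return output
    return wrapper

client = FakeCompletions()
client.create = traced(client.create)

result = client.create(model="gpt-4o",
                       messages=[{"role": "user", "content": "hi"}])
```

Because the wrapper returns the original response unchanged, your application code stays exactly as it was before instrumentation.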
4. View your trace

Open the PandaProbe dashboard. You should see a trace with an LLM span containing the input messages, output, model name, and token usage, all captured automatically.

What’s next?

Wrappers: auto-trace OpenAI, Anthropic, and Gemini
Integrations: trace LangGraph, CrewAI, and more
Manual Instrumentation: full control with decorators and context managers

For short-lived scripts, call pandaprobe.flush() before the process exits to ensure all traces are sent. Long-running services flush automatically.
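One way to guarantee that flush happens in a short script is to register it at interpreter shutdown with the standard library's atexit module. A sketch, using a stand-in function since pandaprobe itself is not importable here:

```python
import atexit

def flush():
    # Stand-in for pandaprobe.flush(); in a real script you would
    # register the SDK's own function: atexit.register(pandaprobe.flush)
    print("flushing pending traces")

# atexit runs registered callables when the interpreter exits normally,
# so traces are sent even if you forget an explicit flush() call.
atexit.register(flush)
```

Note that atexit handlers do not run if the process is killed abruptly (e.g. SIGKILL), so an explicit flush() remains the safest option at critical exit points.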