Before you begin, make sure you have:
  • A PandaProbe account. Sign up at app.pandaprobe.com.
  • A PandaProbe API key and project name.
  • An API key for the LLM provider you want to trace: OpenAI, Gemini, or Anthropic.

1. Install the SDK

pip install "pandaprobe[openai,gemini,anthropic]"

2. Set environment variables

export PANDAPROBE_API_KEY="your-api-key"
export PANDAPROBE_PROJECT_NAME="your-project-name"

Self-hosting PandaProbe? Set PANDAPROBE_ENDPOINT="your-endpoint".
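
If you can't export shell variables (for example, in a notebook), you can set the same variables from Python instead. A minimal sketch, assuming the SDK reads them when the wrapped client is created:

import os

# Same variables as above; set them before creating any wrapped clients.
os.environ["PANDAPROBE_API_KEY"] = "your-api-key"
os.environ["PANDAPROBE_PROJECT_NAME"] = "your-project-name"
# Only needed for self-hosted deployments:
# os.environ["PANDAPROBE_ENDPOINT"] = "your-endpoint"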

3. Wrap your LLM client

Choose the provider you want to trace and make sure the matching provider API key (for example, OPENAI_API_KEY) is available in your environment.

from pandaprobe.wrappers import wrap_openai
from openai import OpenAI

# wrap_openai returns a drop-in replacement for the OpenAI client;
# every call made through it is traced automatically.
client = wrap_openai(OpenAI())

response = client.chat.completions.create(
    model="gpt-5.4",
    messages=[{"role": "user", "content": "What is PandaProbe?"}],
)

print(response.choices[0].message.content)
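
Tracing Anthropic or Gemini works the same way: wrap the provider's client and use it as usual. Here is a sketch for Anthropic, assuming the SDK exposes a wrap_anthropic counterpart to wrap_openai; check the Wrappers page for the exact import:

from pandaprobe.wrappers import wrap_anthropic  # assumed counterpart to wrap_openai
from anthropic import Anthropic

client = wrap_anthropic(Anthropic())

response = client.messages.create(
    model="claude-sonnet-4-5",
    max_tokens=256,
    messages=[{"role": "user", "content": "What is PandaProbe?"}],
)

print(response.content[0].text)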

4. View your trace

Open the PandaProbe dashboard. You should see a trace with an LLM span containing the input messages, output, model name, and token usage, all captured automatically.

What’s next?

  • Wrappers: for OpenAI, Anthropic, and Gemini
  • Integrations: for LangGraph, CrewAI, and more
  • Manual Instrumentation: full control with decorators and context managers

For short-lived scripts, call pandaprobe.flush() before the process exits to ensure all traces are sent; long-running services flush automatically.
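
The flush note matters most in one-off scripts, where the process can exit before buffered traces are exported. A minimal sketch combining the wrapper from step 3 with the pandaprobe.flush() call mentioned above:

import pandaprobe
from pandaprobe.wrappers import wrap_openai
from openai import OpenAI

client = wrap_openai(OpenAI())
client.chat.completions.create(
    model="gpt-5.4",
    messages=[{"role": "user", "content": "Hello from a short-lived script"}],
)

# Block until all buffered traces are exported before the process exits.
pandaprobe.flush()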