## Comparison
| Wrapper | Install Extra | Provider Methods Traced |
|---|---|---|
| `wrap_openai` | `pandaprobe[openai]` | `chat.completions.create`, `responses.create` |
| `wrap_anthropic` | `pandaprobe[anthropic]` | `messages.create`, `messages.stream` |
| `wrap_gemini` | `pandaprobe[gemini]` | `models.generate_content`, `models.generate_content_stream` |
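Each wrapper ships as an optional extra, so you install only the provider integrations you use. A sketch of the install commands, assuming the extras are published under the names shown in the table:

```
pip install "pandaprobe[openai]"     # enables wrap_openai
pip install "pandaprobe[anthropic]"  # enables wrap_anthropic
pip install "pandaprobe[gemini]"     # enables wrap_gemini
```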
## Quick example
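A minimal sketch of wrapping an OpenAI client and nesting its calls under a manual trace. This assumes `wrap_openai` is importable from the top-level `pandaprobe` package and that `start_trace()` takes a trace name; both are assumptions based on the API names used on this page, and the trace name and prompt are illustrative.

```python
import pandaprobe
from pandaprobe import wrap_openai  # assumed import path; requires pandaprobe[openai]
from openai import OpenAI

# The wrapped client exposes the same interface as the plain OpenAI client,
# but chat.completions.create and responses.create calls are traced.
client = wrap_openai(OpenAI())

with pandaprobe.start_trace("checkout-flow"):  # trace name is illustrative
    # Because this call happens inside start_trace(), the resulting
    # LLM span is nested as a child of the "checkout-flow" trace.
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Hello"}],
    )
```

Outside of any `start_trace()` or `@pandaprobe.trace` context, the same call would produce a standalone LLM span.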
Wrappers work seamlessly with manual instrumentation. If a wrapper call happens inside a
`pandaprobe.start_trace()` or `@pandaprobe.trace` context, the LLM span is automatically nested as a child span.

## Provider guides
- **OpenAI**: Chat Completions and Responses API, streaming, and tool spans
- **Anthropic**: Messages API, streaming patterns, and extended thinking
- **Google Gemini**: `generate_content`, async, streaming, and thinking mode