Integrations

Works with your stack

One SDK connects to all major LLM providers and agent frameworks. Auto-instrumentation means you're up and running in minutes.

LLM Providers

Every major provider supported

OpenAI

stable

GPT-4, GPT-3.5, and embeddings

Anthropic

stable

Claude 3 Opus, Sonnet, Haiku

Google AI

stable

Gemini Pro and Ultra

Cohere

stable

Command and Embed models

Mistral

stable

Mistral Large, Medium, Small

AWS Bedrock

stable

Multiple foundation models

Azure OpenAI

stable

Enterprise OpenAI deployments

Ollama

beta

Local model deployments

Frameworks

Native support for agent frameworks

LangChain

stable

Callback handlers for chains and agents

LlamaIndex

stable

Query engine and index observability

AutoGPT

beta

Autonomous agent tracing

CrewAI

beta

Multi-agent workflow visibility

Haystack

stable

Pipeline component tracing

Semantic Kernel

beta

Microsoft SK integration

SDKs

Your language, your way

Native SDKs for Python, TypeScript, and Go. Install with a single command and start tracing immediately.

Python 3.8+
pip install reactorjet
TypeScript ES2020+
npm install @reactorjet/core
JavaScript ES2020+
npm install @reactorjet/core
Go 1.19+
go get github.com/reactorjet/core-go
quickstart.py
import openai

from reactorjet import ReactorCore

# Auto-instrument OpenAI
reactor = ReactorCore(api_key="your-key")
reactor.instrument_openai()

# Your existing code works unchanged
response = openai.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}]
)
# Traces automatically captured
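Auto-instrumentation of this kind typically works by wrapping the client's request method so every call is recorded as a trace before being forwarded. A simplified, self-contained sketch of that pattern — the `FakeChatClient`, `traces` list, and `instrument` helper below are illustrative stand-ins, not part of the reactorjet API:

```python
import functools
import time

class FakeChatClient:
    """Stand-in for a provider client; illustrative only."""
    def create(self, model, messages):
        return {"model": model, "content": "Hello back!"}

traces = []

def instrument(client):
    """Wrap client.create so each call is captured as a trace."""
    original = client.create

    @functools.wraps(original)
    def traced_create(*args, **kwargs):
        start = time.time()
        response = original(*args, **kwargs)
        # Record model, latency, and response, then pass the result through
        traces.append({
            "model": kwargs.get("model"),
            "latency_s": time.time() - start,
            "response": response,
        })
        return response

    client.create = traced_create
    return client

client = instrument(FakeChatClient())
client.create(model="gpt-4", messages=[{"role": "user", "content": "Hello!"}])
```

Because the wrapper forwards arguments and return values unchanged, existing call sites keep working — which is why the quickstart above can instrument OpenAI without modifying the calling code.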

Ready to connect your stack?

Get started in under 5 minutes. No configuration required.