LLM Providers
Every major provider supported
OpenAI (stable)
GPT-4, GPT-3.5, and embeddings
Anthropic (stable)
Claude 3 Opus, Sonnet, and Haiku
Google AI (stable)
Gemini Pro and Ultra
Cohere (stable)
Command and Embed models
Mistral (stable)
Mistral Large, Medium, and Small
AWS Bedrock (stable)
Multiple foundation models
Azure OpenAI (stable)
Enterprise OpenAI deployments
Ollama (beta)
Local model deployments
Frameworks
Native support for agent frameworks
LangChain (stable)
Callback handlers for chains and agents
LlamaIndex (stable)
Query engine and index observability
AutoGPT (beta)
Autonomous agent tracing
CrewAI (beta)
Multi-agent workflow visibility
Haystack (stable)
Pipeline component tracing
Semantic Kernel (beta)
Microsoft SK integration
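Framework integrations like these typically hook in as callback handlers: the framework emits lifecycle events (chain start, LLM end, and so on) and the handler forwards them to the tracing backend. A minimal sketch of that pattern in plain Python — the `TraceCollector` and `ToyChain` classes and their event names are illustrative stand-ins, not the actual LangChain or ReactorJet interfaces:

```python
class TraceCollector:
    """Collects lifecycle events emitted during a framework run."""

    def __init__(self):
        self.events = []

    def on_chain_start(self, chain_name, inputs):
        self.events.append(("chain_start", chain_name, inputs))

    def on_llm_end(self, chain_name, output):
        self.events.append(("llm_end", chain_name, output))


class ToyChain:
    """Stand-in for a framework chain that notifies registered handlers."""

    def __init__(self, name, handlers):
        self.name = name
        self.handlers = handlers

    def run(self, prompt):
        for h in self.handlers:
            h.on_chain_start(self.name, prompt)
        output = prompt.upper()  # pretend this is the LLM call
        for h in self.handlers:
            h.on_llm_end(self.name, output)
        return output


collector = TraceCollector()
chain = ToyChain("greeting-chain", [collector])
result = chain.run("hello")
# collector.events now holds one chain_start and one llm_end event
```

Because the handler only observes events, the chain's own logic and return value are unchanged — which is why these integrations can be added to existing pipelines without code changes.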
SDKs
Your language, your way
Native SDKs for Python, TypeScript, and Go. Install with a single command and start tracing immediately.
Python 3.8+
pip install reactorjet

TypeScript ES2020+
npm install @reactorjet/core

JavaScript ES2020+
npm install @reactorjet/core

Go 1.19+
go get github.com/reactorjet/core-go

quickstart.py
from reactorjet import ReactorCore
import openai

# Auto-instrument OpenAI
reactor = ReactorCore(api_key="your-key")
reactor.instrument_openai()

# Your existing code works unchanged
response = openai.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}],
)
# Traces are captured automatically
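The `instrument_openai()` call works by wrapping the client's request methods so every call is recorded without touching application code. The following is a minimal sketch of that auto-instrumentation pattern in plain Python — the `traced` decorator and `fake_create` stand-in are hypothetical, not ReactorJet's actual implementation:

```python
import functools
import time

def traced(spans, name):
    """Wrap a function so each call is recorded as a span in `spans`."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                spans.append({
                    "name": name,
                    "duration_ms": (time.perf_counter() - start) * 1000,
                    "model": kwargs.get("model"),
                })
        return wrapper
    return decorator

spans = []

# Stand-in for openai.chat.completions.create
@traced(spans, "chat.completions.create")
def fake_create(model, messages):
    return {"choices": [{"message": {"content": "Hi!"}}]}

fake_create(model="gpt-4", messages=[{"role": "user", "content": "Hello!"}])
# spans[0] now records the call name, latency, and model
```

In a real integration the wrapper would also capture token counts and the response payload, and ship spans to the backend asynchronously rather than appending to a local list.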
Ready to connect your stack?
Get started in under 5 minutes. No configuration required.