Supported frameworks

Provider patches, framework patches, and OTel-native integrations.

TokenJam supports three integration tiers, listed from least to most opinionated:

  1. Native OTel: agents that already emit OpenTelemetry. No SDK install needed.
  2. Provider patches: intercept at the LLM API level (Anthropic, OpenAI, Bedrock, etc.).
  3. Framework patches: instrument higher-level abstractions (LangChain, CrewAI, AutoGen).

Native OTel

| Framework | Status | Notes |
| --- | --- | --- |
| Claude Code | Built-in | `tj onboard --claude-code` |
| OpenClaw | Built-in | diagnostics-otel plugin |
| OpenAI Agents SDK | Built-in | Native OTel exporter |
| Google ADK | Built-in | Native OTel exporter |
| Strands Agent SDK (AWS) | Built-in | Native OTel exporter |
| LlamaIndex | Built-in | opentelemetry-instrumentation-llama-index |
| Haystack | Built-in | Native OTel exporter |
| Pydantic AI | Built-in | Native OTel exporter |
| Semantic Kernel | Built-in | Native OTel exporter |

Just point OTEL_EXPORTER_OTLP_ENDPOINT at http://127.0.0.1:7391 and start your agent.
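For example, assuming a TokenJam collector is listening locally on that port (the agent command is a placeholder):

```shell
# Point the OTLP exporter at the local TokenJam collector — no SDK needed.
export OTEL_EXPORTER_OTLP_ENDPOINT="http://127.0.0.1:7391"
# Then launch the agent as usual; `python my_agent.py` is a placeholder.
python my_agent.py
```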

Provider patches (Python)

These patches intercept the API client directly; they are framework-agnostic and work inside any orchestrator.

```python
from tokenjam.sdk.integrations.anthropic import patch_anthropic
from tokenjam.sdk.integrations.openai import patch_openai
from tokenjam.sdk.integrations.gemini import patch_gemini
from tokenjam.sdk.integrations.bedrock import patch_bedrock
from tokenjam.sdk.integrations.litellm import patch_litellm
```

patch_litellm() covers all providers LiteLLM routes to: OpenAI, Anthropic, Bedrock, Vertex, Cohere, Mistral, Ollama, and more. If you use LiteLLM, you don’t need the individual patches.

OpenAI-compatible providers (Groq, Together, Fireworks, xAI, Azure OpenAI) work via patch_openai(base_url=...).
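To make the mechanism concrete, here is a minimal sketch of the monkey-patch pattern a provider patch relies on: replace the client's request method with a wrapper that records token usage before returning. `FakeClient`, `patch_client`, and `observed` are illustrative stand-ins, not TokenJam APIs.

```python
class FakeClient:
    def create(self, prompt):
        # A real client would call the provider's HTTP API here.
        return {"text": "ok", "usage": {"input_tokens": 3, "output_tokens": 1}}

observed = []

def patch_client(cls):
    original = cls.create

    def wrapped(self, prompt):
        response = original(self, prompt)
        observed.append(response["usage"])  # a real patch would emit a span here
        return response

    cls.create = wrapped

patch_client(FakeClient)
FakeClient().create("hello")
```

Because the patch wraps the class method, every client instance created afterwards is instrumented, regardless of which framework constructed it.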

Framework patches (Python)

| Framework | Patch function | Instruments |
| --- | --- | --- |
| LangChain | patch_langchain | BaseLLM, BaseTool |
| LangGraph | patch_langgraph | CompiledGraph |
| CrewAI | patch_crewai | Task, Agent |
| AutoGen | patch_autogen | ConversableAgent |

Import and call once at startup:

```python
from tokenjam.sdk.integrations.langchain import patch_langchain

patch_langchain()
```

Spans nest naturally. A CrewAI Task that calls a LangChain tool produces a parent-child span tree, with the tool’s underlying API call as a leaf.
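A toy illustration of that nesting, using a stand-in `span` context manager rather than the TokenJam tracer:

```python
import contextlib

spans = []

@contextlib.contextmanager
def span(name, parent=None):
    # Record each span with a reference to its parent, like a real tracer.
    spans.append({"name": name, "parent": parent})
    yield name

with span("crewai.task") as task:
    with span("langchain.tool", parent=task) as tool:
        with span("anthropic.messages.create", parent=tool):
            pass  # the underlying API call is the leaf span
```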

TypeScript

The TypeScript SDK currently ships with the manual SpanBuilder interface. Framework patches for LangChain JS, OpenAI Agents SDK, Vercel AI SDK, and Mastra are on the Roadmap.

NemoClaw integration

NemoClaw isn’t a framework you instrument; it’s a sandbox runtime. TokenJam connects to the OpenShell Gateway WebSocket and turns sandbox events into spans and alerts. See NemoClaw integration.