See every LLM call, replay from any point, and let humans approve high-stakes actions. Open-source observability for AI agents.
```shell
docker compose up
# API: http://localhost:5185
# Frontend: http://localhost:5173
```
Every prompt, model response, tool call, and error — captured as a structured DAG with full payload, latency, and cost per step.
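To make that concrete, a captured step carries its payload plus latency and cost, linked to its parent in the DAG. The field names below are an illustrative sketch, not Tracewire's actual wire format:

```python
# Hypothetical shape of one captured step; field names are
# illustrative, not Tracewire's real schema.
event = {
    "id": "evt_42",
    "parent_id": "evt_41",  # DAG edge to the preceding step
    "type": "ModelResponse",
    "payload": {"content": "Hello! How can I help?"},
    "latency_ms": 450,
    "cost_usd": 0.0021,
}

# Per-trace totals roll up by walking the captured steps.
events = [
    {"type": "Prompt", "latency_ms": 12, "cost_usd": 0.0},
    event,
]
total_cost = sum(e["cost_usd"] for e in events)
total_latency = sum(e["latency_ms"] for e in events)
```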
Click any event, modify its payload, and fork into a new branch. Tracewire warns you about irreversible side-effects before replaying.
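Forking can be pictured as copying the event chain up to the edited step and branching from there, leaving the original trace intact. A plain-Python sketch of the idea (not the replay engine's API):

```python
def fork_at(events, index, new_payload):
    """Copy events up to `index`, swap in an edited payload, and
    return the new branch. Illustrative sketch only."""
    branch = [dict(e) for e in events[: index + 1]]
    branch[index]["payload"] = new_payload
    return branch

events = [
    {"type": "Prompt", "payload": {"content": "hello"}},
    {"type": "ModelResponse", "payload": {"content": "hi there"}},
    {"type": "ToolCall", "payload": {"tool": "send_email"}},
]

# Edit the model response and replay from that point onward;
# the original trace is untouched.
branch = fork_at(events, 1, {"content": "bonjour"})
```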
Agents pause and wait for human approval before high-stakes actions. Approve, reject, or escalate — with full audit trail.
Drop in an adapter for LangChain, Vercel AI SDK, Semantic Kernel, or any LLM client. Your agent code stays unchanged.
Flag emails, database writes, and API calls as side-effects. Tracewire surfaces them during replay so you know what can't be undone.
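Conceptually, flagging marks an event as irreversible so the replay engine can surface it before anything runs. A minimal sketch, with a hypothetical `side_effect` field:

```python
# `side_effect` is a hypothetical flag for this sketch, not a
# documented Tracewire field.
events = [
    {"type": "Prompt", "payload": {"content": "draft a refund email"}},
    {"type": "ToolCall", "payload": {"tool": "send_email"}, "side_effect": True},
    {"type": "ToolCall", "payload": {"tool": "db_write"}, "side_effect": True},
]

# Before replaying, collect everything that can't be undone.
irreversible = [e["payload"]["tool"] for e in events if e.get("side_effect")]
```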
Organizations, workspaces, scoped API keys, and workspace-level replay policies. Built for teams from day one.
Instrument your agent in the language you already use.
```python
from tracewire import trace

async with trace("my-agent", api_key="your-key") as t:
    t.log_event("Prompt", {"content": "hello"})

    # `response` is the text returned by your LLM call
    t.log_event("ModelResponse", {"content": response}, latency_ms=450)

    # Agent pauses — reviewer sees Approve/Reject in the UI
    decision = await t.pause_for_human(timeout=120)
```
```typescript
import { trace } from "tracewire-sdk";
import { wrapLanguageModel } from "tracewire-sdk/adapters/ai-sdk";
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

// Wrap your Vercel AI SDK model — everything auto-captured
await trace("my-agent", async (t) => {
  const model = wrapLanguageModel(openai("gpt-4o"), t);
  const { text } = await generateText({ model, prompt: "Hello!" });
}, { apiKey: "your-key" });
```
```csharp
using Tracewire.Sdk;
using Tracewire.Sdk.Adapters;

await using var t = await TracewireTrace.StartAsync("my-agent", apiKey: "your-key");
var llm = new ChatClientAdapter(t, "gpt-4o");

var response = await llm.CallAsync("Hello!", async prompt =>
{
    var result = await openAiClient.CompleteChatAsync([new UserChatMessage(prompt)]);
    return result.Value.Content[0].Text;
});
```
| Framework | Language | Adapter |
|---|---|---|
| LangChain | Python | `TracewireCallbackHandler` |
| AutoGen | Python | `TracewireAutoGenMiddleware` |
| CrewAI | Python | `TracewireCrewCallback` |
| Vercel AI SDK | TypeScript | `wrapLanguageModel()` |
| LangChain.js | TypeScript | `createLangChainCallback()` |
| Semantic Kernel | .NET | `SemanticKernelAdapter` |
| Any LLM client | .NET | `ChatClientAdapter` |
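The adapters above share one shape: translate a framework's callbacks into trace events, so your agent code stays unchanged. A framework-agnostic sketch of that pattern (nothing here is Tracewire's real adapter code; the hook names mimic LangChain-style callbacks):

```python
class SketchCallbackHandler:
    """Illustrative only: how a callback adapter could map
    framework hooks onto trace events."""

    def __init__(self, log_event):
        # `log_event` would be the trace's logging method.
        self.log_event = log_event

    def on_llm_start(self, prompts):
        for p in prompts:
            self.log_event("Prompt", {"content": p})

    def on_llm_end(self, text, latency_ms):
        self.log_event("ModelResponse", {"content": text}, latency_ms)

# Stand-in for a trace: just record what the adapter emits.
captured = []
handler = SketchCallbackHandler(lambda kind, payload, *rest: captured.append((kind, payload)))
handler.on_llm_start(["hello"])
handler.on_llm_end("hi!", 450)
```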