
AI Observer: Open-Source Local Observability for AI Coding Assistants

TL;DR

Self-hosted OpenTelemetry backend for tracking token usage, costs, and performance across Claude Code, Gemini CLI, and OpenAI Codex: a single binary with zero external dependencies.

Key Points

  • Single ~54MB binary with embedded frontend; ~97MB Docker images for linux/amd64 and linux/arm64
  • DuckDB-powered analytics with real-time WebSocket dashboard and historical JSONL/JSON import
  • Cost tracking for 67+ models across Claude, Codex, and Gemini with pricing data embedded
  • OTLP-native ingestion (HTTP/JSON and HTTP/Protobuf) with local-only data persistence
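Because ingestion is OTLP-native, assistants that speak OpenTelemetry can point their exporters at the local backend with standard environment variables. As a sketch, Claude Code exposes its telemetry through the usual OTel exporter settings; the endpoint and port below (4318, the conventional OTLP/HTTP port) are assumptions about AI Observer's default listener, not values confirmed by this article:

```shell
# Enable Claude Code's built-in OpenTelemetry export and direct it
# at a locally running collector over OTLP HTTP/JSON.
export CLAUDE_CODE_ENABLE_TELEMETRY=1
export OTEL_METRICS_EXPORTER=otlp
export OTEL_LOGS_EXPORTER=otlp
export OTEL_EXPORTER_OTLP_PROTOCOL=http/json

# Assumed AI Observer listen address -- check the project README
# for the actual default port.
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
```

With these set, metric and log signals flow to the local endpoint instead of a hosted vendor, which is the whole point of the local-only persistence model.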

Why It Matters

Developers using AI coding assistants gain visibility into token consumption, API costs, and performance without shipping telemetry to third-party services. This addresses a gap in local development workflows, where cost tracking and AI tool behavior have been opaque, enabling teams to optimize spending and understand how their tools actually perform.

Source: github.com