
Boring Semantic Layer Enables LLM-Native Data Querying Without SQL

TL;DR

BSL combines dlt's schema discovery with LLM inference to auto-generate semantic models that serve dashboards, APIs, and AI agents from a single source of truth.

Key Points

  • An LLM generates roughly 80% of the semantic model automatically from a normalized database schema; engineers review and validate the rest
  • Python-native library (pip install) runs in-process without external servers or infrastructure overhead
  • Native Model Context Protocol support lets LLMs query semantic layer directly instead of writing SQL, reducing hallucinations
  • Single semantic definition propagates to Streamlit dashboards, FastAPI endpoints, and MCP chatbots via git-based governance
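The core idea behind these points can be sketched in plain Python. This is an illustrative sketch only, not BSL's actual API: the names `SemanticModel` and `compile_query` are hypothetical. It shows how one governed definition (dimensions and measures) can be rendered deterministically into SQL, so an agent querying over MCP only selects named fields and never writes free-form SQL:

```python
# Hypothetical sketch of a semantic layer; not BSL's real API.
from dataclasses import dataclass


@dataclass
class SemanticModel:
    """One governed definition: a table plus named dimensions and measures."""
    table: str
    dimensions: dict[str, str]  # name -> column expression
    measures: dict[str, str]    # name -> aggregate expression


def compile_query(model: SemanticModel, dims: list[str], measures: list[str]) -> str:
    """Deterministically render a semantic query as SQL.

    A consumer (dashboard, API endpoint, or MCP agent) only picks
    dimension/measure names; the SQL itself is generated, not hallucinated.
    """
    select = [f"{model.dimensions[d]} AS {d}" for d in dims]
    select += [f"{model.measures[m]} AS {m}" for m in measures]
    sql = f"SELECT {', '.join(select)} FROM {model.table}"
    if dims:
        sql += f" GROUP BY {', '.join(model.dimensions[d] for d in dims)}"
    return sql


# The same definition serves every consumer.
orders = SemanticModel(
    table="orders",
    dimensions={"month": "date_trunc('month', ordered_at)"},
    measures={"revenue": "sum(amount)"},
)

print(compile_query(orders, ["month"], ["revenue"]))
# SELECT date_trunc('month', ordered_at) AS month, sum(amount) AS revenue FROM orders GROUP BY date_trunc('month', ordered_at)
```

Because the definition is ordinary Python data, it can live in git and be reviewed like any other code, which is the governance model the article describes.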

Why It Matters

This solves the fragmentation problem where BI tools lock semantic definitions inside proprietary formats. Multiple consumption modes (dashboards, APIs, AI agents) can now share one governed definition, eliminating definition drift and reducing modeling work from days to minutes. The approach prioritizes deterministic, debuggable SQL over complex inference—a philosophical shift toward production-grade reliability.
View the demo on GitHub

Source: dlthub.com