
Cube D3 vs Mitzu: Agentic BI on a Universal Semantic Layer vs Agentic Product Analytics on the Warehouse

Two takes on the agentic analytics stack — one built on a hand-authored universal semantic layer, the other on an auto-built product-analytics semantic layer and a deterministic query engine.


István Mészáros

Co-founder & CEO

May 14, 2026
10 min read

TL;DR

Cube D3 is an agentic analytics platform built on Cube's universal semantic layer — three AI agents (Semantic Model, Workbook, Analytics Chat) sit on top of cubes and views that are hand-authored in YAML, JavaScript or Python. Mitzu is an agentic product analytics platform. The Analytics Agent assembles funnel, retention, segmentation, journey, and cohort specifications; a deterministic query engine turns them into SQL. Both run on the customer's warehouse. Cube fans out across many warehouses and many consumers (BI tools, spreadsheets, embedded, AI agents). Mitzu specialises in product, growth and marketing behavioural questions.

Use this comparison to evaluate tools through an agentic analytics lens: which platform fits a universal-BI-semantic-layer surface, and which fits product analytics methodology where a deterministic engine — not the LLM — has to own the query.

Cube calls itself "the agentic analytics platform built on a semantic layer," and was named in the 2026 Gartner Market Guide for Agentic Analytics. Cube D3 is their agentic product, layering AI agents on top of the universal semantic layer that already powers BI, embedded analytics and dashboards. Mitzu sits in an adjacent but narrower category — agentic product analytics — with an auto-built semantic layer and a deterministic query engine that owns the SQL. Both are warehouse-native; the architectures and the use cases are different.

What is Cube D3?

Cube started as an open-source semantic layer (cube-js/cube) and grew into a commercial platform — Cube Cloud — covering BI, embedded analytics, and now agentic analytics under the D3 banner. The product page positions Cube as "Universal Semantic Layer. Business Intelligence. Embedded Analytics."

Cube D3 ("data in cube") was announced in June 2025 and ships three AI agents on top of the semantic layer:

  • Semantic Model Agent — proposes and edits cubes, views, measures and dimensions; helps build the semantic layer from scratch or extend an existing one.
  • Workbook Agent — assembles reports and visualisations inside the Workbooks surface, with point-and-click, SQL, and Python paths alongside the AI flow.
  • Analytics Chat Agent — answers business questions in natural language, running multiple queries against the semantic layer and summarising the result.

The platform exposes its agents via MCP (available in Claude Desktop) and A2A so external agents — Claude, ChatGPT, custom — can call into Cube the way they would any tool. Data access is standardised on REST, GraphQL and SQL. See Cube's MCP write-up for the protocol details.
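MCP is JSON-RPC 2.0 under the hood, so an external agent's call into either platform has a predictable shape. The sketch below shows a minimal `tools/call` request as a Python dict; the tool name and arguments are hypothetical placeholders, not the actual tool names either vendor's MCP server exposes.

```python
import json

# Minimal MCP tool-invocation request, per the JSON-RPC 2.0 framing MCP uses.
# The tool name and argument schema below are illustrative assumptions.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",      # standard MCP method for invoking a server tool
    "params": {
        "name": "run_query",      # hypothetical tool name
        "arguments": {
            "question": "Signups by channel, last 30 days",
        },
    },
}

print(json.dumps(request, indent=2))
```

The same envelope works whether the server behind it is Cube's MCP endpoint or Mitzu's remote MCP server; only the tool names and argument schemas differ.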

Underneath, the semantic layer is the same one Cube has shipped for years. Data models are authored in YAML, JavaScript or Python. Cubes represent business entities; views compose cubes into the surfaces consumers actually query; measures, dimensions, segments, joins and pre-aggregations live inside cubes. It is a BI-shaped model — strong for descriptive analytics, embedded dashboards, and metric APIs — and the model is hand-authored upfront, then maintained as the warehouse evolves.

What is Mitzu?

Mitzu is an agentic product analytics platform that runs on your data warehouse and answers behavioural questions through natural-language conversation, without writing SQL. The category is narrower than Cube's — Mitzu is specialised for product, growth and marketing behavioural questions on event data, not for general BI or embedded dashboarding.

Mitzu meets users in three places: the in-app Analytics Agent, the Slack Agent in any public or private channel, and a remote MCP server that exposes Mitzu's capabilities to any MCP-compatible agent (Claude, Cursor, ChatGPT, custom). Setup is handled by a Configuration Agent that scans the warehouse, recognises common event schemas (Segment, Snowplow, Firebase, GA4, custom), maps user and group identifiers, and builds the semantic layer automatically. Supported warehouses include Snowflake, BigQuery, Databricks, Redshift, ClickHouse, Athena, Trino/Presto, Postgres, Firebolt, Starburst and MS Fabric.
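To give a feel for what "recognises common event schemas" can mean in practice, here is a toy heuristic that matches warehouse column names against known schema signatures. The signature sets and the function are illustrative assumptions, not Mitzu's actual detection logic.

```python
# Toy sketch: recognising a common event schema from column names alone.
# The signature column sets below are illustrative, not Mitzu's internals.
SCHEMA_SIGNATURES = {
    "segment":  {"anonymous_id", "event", "timestamp"},
    "snowplow": {"event_name", "domain_userid", "collector_tstamp"},
    "ga4":      {"event_name", "user_pseudo_id", "event_timestamp"},
}

def detect_schema(columns):
    """Return the first known schema whose signature columns are all present."""
    cols = {c.lower() for c in columns}
    for name, signature in SCHEMA_SIGNATURES.items():
        if signature <= cols:
            return name
    return "custom"

print(detect_schema(["anonymous_id", "event", "timestamp", "context_ip"]))  # prints "segment"
```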

The trust differentiator: Mitzu's agent does not write SQL. It assembles structured analysis specifications — funnel steps with a conversion window, retention cohorts and return events, segmentation filters with sampled property values, journey definitions — and a deterministic query engine turns those specifications into SQL. The same specification produces the same SQL every time. Methodology errors that LLMs reliably make (a funnel without a window, a retention chart that double-counts, a cohort defined wrong) are guard-railed by the engine, not by prompt engineering.
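The specification-to-SQL contract can be sketched in a few lines of Python. Everything below (the class name, field names, and the rendered comment shape) is an illustrative assumption rather than Mitzu's internal representation; the point is that rendering is a pure function of a typed spec, so the same spec always yields byte-identical SQL.

```python
from dataclasses import dataclass
from typing import Tuple

# Illustrative typed analysis specification. Field names are assumptions.
@dataclass(frozen=True)
class FunnelSpec:
    first_event: str
    subsequent_events: Tuple[str, ...]
    conversion_window_days: int
    breakdown: str
    lookback_days: int

def render_sql(spec: FunnelSpec) -> str:
    # A pure function of the spec: no model call, no randomness, so the
    # same spec produces the same SQL string every time.
    return (
        f"-- funnel: {spec.first_event} -> {', '.join(spec.subsequent_events)}\n"
        f"-- window: {spec.conversion_window_days}d, "
        f"breakdown: {spec.breakdown}, range: last {spec.lookback_days} days\n"
        "..."  # full query body elided in this sketch
    )

spec = FunnelSpec("signup", ("activated",), 7, "channel", 30)
assert render_sql(spec) == render_sql(spec)  # deterministic by construction
```

The LLM's job stops at populating the spec; everything after that boundary is ordinary, testable code.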

Cube D3 vs Mitzu: side-by-side

| | Cube D3 | Mitzu |
|---|---|---|
| Category | Agentic BI on a universal semantic layer | Agentic product analytics on the warehouse |
| Who writes the SQL | LLM grounded in the hand-authored semantic model | Deterministic query engine, from a typed analysis specification |
| Semantic layer shape | Cubes, views, measures, dimensions, segments, joins, pre-aggregations | Events, event properties, entities, dimension properties, sampled values |
| Semantic layer authoring | Hand-authored in YAML, JavaScript or Python; maintained over time | Auto-built by the Configuration Agent from the warehouse; analyst reviews and refines |
| Methodology primitives | None native — funnels, retention and journeys are composed in SQL or model code | Funnel, retention, segmentation, journey, cohort as first-class primitives |
| Warehouses | Snowflake, BigQuery, Databricks, Redshift, ClickHouse, Postgres, MySQL, Trino and others | Snowflake, BigQuery, Databricks, Redshift, ClickHouse, Athena, Trino/Presto, Postgres, Firebolt, Starburst, MS Fabric |
| Consumers | BI tools, spreadsheets, embedded apps, custom AI agents — many surfaces, one model | In-app Analytics Agent, Slack Agent, remote MCP server — three product-analytics-shaped surfaces |
| Agent surfaces | Semantic Model Agent, Workbook Agent, Analytics Chat Agent; MCP + A2A access | Analytics Agent, Configuration Agent, Slack Agent; remote MCP server |
| Source model | Cube Core is open source; Cube Cloud / Cube D3 are commercial | Commercial SaaS; self-hosted deployment available on Enterprise |
| Best for | Universal semantic layer feeding BI, embedded analytics and AI agents from one model | Product, growth and marketing behavioural questions where methodology must be right |

SQL examples: a funnel, two architectures

Take a typical product analytics question: "What is our 7-day signup-to-activation conversion rate, broken down by acquisition channel, for the last 30 days?"

Cube D3: a semantic model and an LLM-authored query

In Cube you'd model the underlying events as a cube, then expose a view the agents can reach. A minimal model in YAML:

# model/cubes/events.yml — hand-authored
cubes:
  - name: events
    sql_table: analytics.events
    dimensions:
      - name: user_id
        sql: user_id
        type: string
      - name: event_name
        sql: event_name
        type: string
      - name: channel
        sql: "{CUBE}.properties:channel::text"
        type: string
      - name: event_time
        sql: event_time
        type: time
    measures:
      - name: count
        type: count
      - name: distinct_users
        sql: user_id
        type: count_distinct

views:
  - name: activity
    cubes:
      - join_path: events
        includes: "*"

With the model in place, the Analytics Chat agent grounds an LLM query against the view. A plausible LLM output for the funnel question:

-- Plausible LLM output against the Cube view.
-- Methodology depends on the prompt, the grounding, and the day.
WITH signups AS (
  SELECT user_id,
         min(event_time) AS signup_at,
         any_value(channel) AS channel
  FROM activity
  WHERE event_name = 'signup'
    AND event_time >= now() - interval '30 days'
  GROUP BY user_id
),
activations AS (
  SELECT user_id, min(event_time) AS activated_at
  FROM activity
  WHERE event_name = 'activated'
    AND event_time >= now() - interval '37 days'
  GROUP BY user_id
)
SELECT s.channel,
       count(*) AS signups,
       count(*) FILTER (WHERE a.activated_at <= s.signup_at + interval '7 days') AS activated_in_7d,
       round(100.0 * count(*) FILTER (WHERE a.activated_at <= s.signup_at + interval '7 days')
             / nullif(count(*), 0), 1) AS conv_pct
FROM signups s
LEFT JOIN activations a USING (user_id)
GROUP BY s.channel
ORDER BY signups DESC;

The semantic layer keeps the join graph and column names consistent. The conversion window, the anchor row for channel attribution, and the distinct-user logic are still authored by the LLM in the prompt. A different prompt, or a slightly different schema, can return: a window measured against the wrong anchor, an activation that pre-dates the signup counted as a conversion, or channel attribution joined off the wrong row when a user has multiple signups. None of these are SQL bugs — they are methodology choices an LLM is making implicitly, every time.
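To make the second pitfall concrete, a toy check (with assumed timestamps) compares the query's upper-bound-only condition against a strict conversion window. One user's activation fires before their signup, which a plain `activated_at <= signup_at + 7 days` test happily counts as a conversion:

```python
from datetime import datetime, timedelta

# Toy data: one user whose 'activated' event fires BEFORE their 'signup'
# (e.g. a re-signup after account deletion). Timestamps are illustrative.
signup_at = datetime(2026, 5, 1, 12, 0)
activated_at = signup_at - timedelta(days=1)  # activation pre-dates signup

# Naive condition, mirroring the LLM-authored join: only an upper bound.
naive_converted = activated_at <= signup_at + timedelta(days=7)

# Windowed condition: strictly after the anchor AND within 7 days of it.
window_converted = (
    activated_at > signup_at
    and activated_at <= signup_at + timedelta(days=7)
)

print(naive_converted, window_converted)  # prints "True False"
```

One missing lower bound, one silently inflated conversion rate; that is the class of error a deterministic engine exists to rule out.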

Mitzu: SQL from a deterministic engine

The Mitzu agent does not write the SQL. It assembles a funnel specification — roughly: { first_event: "signup", subsequent_events: ["activated"], conversion_window: "7d", breakdown: "channel", date_range: "last_30_days" } — and the deterministic engine emits the same SQL every time:

-- Engine output for a 2-step funnel with a 7-day conversion window,
-- broken down by channel, for the last 30 days. Same spec → same SQL.
WITH step_1 AS (
  SELECT user_id,
         min(event_time)             AS step_1_at,
         any_value(channel)          AS channel
  FROM events
  WHERE event_name = 'signup'
    AND event_time >= now() - interval '30 days'
    AND event_time <  now()
  GROUP BY user_id
),
step_2 AS (
  SELECT s1.user_id,
         s1.channel,
         min(e.event_time) AS step_2_at
  FROM step_1 s1
  INNER JOIN events e
    ON e.user_id = s1.user_id
   AND e.event_name = 'activated'
   AND e.event_time >  s1.step_1_at
   AND e.event_time <= s1.step_1_at + interval '7 days'
  GROUP BY s1.user_id, s1.channel
)
SELECT s1.channel                                AS channel,
       count(DISTINCT s1.user_id)                AS step_1_users,
       count(DISTINCT s2.user_id)                AS step_2_users,
       round(100.0 * count(DISTINCT s2.user_id)
             / nullif(count(DISTINCT s1.user_id), 0), 1) AS conv_pct
FROM step_1 s1
LEFT JOIN step_2 s2 USING (user_id)
GROUP BY channel
ORDER BY step_1_users DESC;

The conversion window is enforced strictly (activation must be after signup and within 7 days). Distinct users prevent double-counting. Channel comes from the signup row, so attribution is consistent. The engine has been generating this shape of SQL in production for years; the agent's job is to assemble the specification, not to author the query.

The SQL is shown to the analyst as a verification artifact — not the agent's authored work.

UI surfaces — where the work happens

Cube D3 centres on a workspace UI where the Semantic Model Agent and an analyst co-author cubes, views and metrics; Workbooks combine point-and-click, SQL, Python and AI flows for analyst-led report building; Analytics Chat is the natural-language entry point for business users. The same semantic layer also feeds BI tools (Tableau, Power BI, Looker), spreadsheets (Excel, Google Sheets), embedded analytics, and external agents over MCP.

Mitzu puts the agent in the surfaces non-analysts already use. Inside the app, an Analytics Agent chat sits next to the existing point-and-click insights — funnels, retention, segmentation, journeys — so the agent's output and the manual exploration share the same semantic layer and the same engine. In Slack, the Mitzu agent (@mitzu) reads thread context and answers in-channel; this is the surface heavily used by PMs, growth leads and founders who never open the app. The remote MCP server exposes the same capabilities to any external agent. See the product analytics page for screenshots of the in-app surfaces.

Advantages and trade-offs

Cube D3

| Strengths | Trade-offs |
|---|---|
| Universal semantic layer — one model feeds BI tools, embedded analytics, spreadsheets and AI agents over the same definitions. | Model is hand-authored. Cubes, views, measures and dimensions need to be written and maintained as the warehouse evolves. |
| Open-source roots (Cube Core) plus a commercial Cloud offering — full visibility into the modelling layer. | LLM authors SQL on top of the semantic layer. Methodology errors on funnels, retention and journeys are still possible. |
| Strong fit for BI and embedded analytics — caching, pre-aggregations and access policies designed for high-throughput dashboarding. | Semantic shape is BI-flavoured (measures, dimensions, joins). It does not natively express funnels, retention windows, cohort definitions or sampled filter values. |
| MCP and A2A endpoints let external agents reach the same semantic layer used by humans. | Cube D3 is a commercial product; the AI surface lives in Cube Cloud rather than the open-source core. |
| Broad warehouse coverage and broad consumer coverage — one stop for many analytics surfaces. | Cross-domain reach comes with breadth, not depth, for product analytics specifically. |

Mitzu

| Strengths | Trade-offs |
|---|---|
| The agent does not write SQL — a deterministic query engine does, from a typed specification. Same input, same SQL, same answer. | Narrower scope — Mitzu is built for product, growth and marketing behavioural questions, not BI dashboarding, embedded analytics or financial reporting. |
| Auto-built semantic layer specialised for product analytics — events, event properties, entities, dimension properties and sampled filter values. No hand-authored YAML. | Requires event data already in the warehouse. Companies without a warehouse, or with events trapped in a third-party tool that will not export, are not the fit. |
| Funnel, retention, segmentation, journey and cohort are first-class primitives. | Open-ended statistical exploration belongs in a notebook (Hex, Deepnote, Jupyter), not in Mitzu. |
| Warehouse-agnostic — runs on Snowflake, BigQuery, Databricks, Redshift, ClickHouse, Athena, Trino/Presto, Postgres, Firebolt, Starburst and MS Fabric. | Self-hosted deployment is available on the Enterprise tier; lower tiers are SaaS. |
| Three surfaces share one semantic layer: in-app Analytics Agent, Slack Agent, and a remote MCP server for any external agent. | |
| Per-editor seat pricing with unlimited events; warehouse compute stays under the customer's control. | |

Capability scorecard

Where each tool stands on the capabilities that matter for product analytics work.

| Capability | Cube D3 | Mitzu |
|---|---|---|
| Runs on the customer's warehouse | ✅ | ✅ |
| Multi-warehouse support | ✅ | ✅ |
| Open-source semantic layer core | ✅ Cube Core | ❌ |
| Self-hosted deployment | ✅ Cube Cloud / self-host options | ✅ Enterprise tier |
| Universal semantic layer for BI + embedded + AI | ✅ | ❌ |
| Auto-built semantic layer (no hand-authored YAML) | ❌ | ✅ |
| Deterministic SQL engine (agent does not write SQL) | ❌ | ✅ |
| Native funnel methodology | ❌ | ✅ |
| Native retention methodology | ❌ | ✅ |
| Native segmentation, journey and cohort primitives | ❌ | ✅ |
| Sampled property values for filters | ❌ | ✅ |
| Reviewable SQL surfaced for every answer | — | ✅ |
| MCP server for external agents | ✅ MCP + A2A | ✅ Remote MCP |
| Slack agent | ❌ | ✅ |
| Pre-aggregations / caching layer for BI workloads | ✅ | ❌ |
| Embedded analytics surface for end customers | ✅ | ❌ |

When to choose Cube D3, Mitzu, or both?

These are not substitutes. Cube D3 is a universal semantic layer with an agentic layer on top — strong when many surfaces share one model. Mitzu is an agentic product analytics platform with a deterministic engine — strong when methodology has to be right on funnels, retention and behavioural questions. The right choice depends on which shape of work dominates the team's analytics workload.

  • Choose Cube D3 when you need one semantic layer feeding BI tools, spreadsheets, embedded analytics and AI agents from a single model — and you have the engineering cycles to author and maintain that model in YAML, JavaScript or Python.
  • Choose Mitzu when product, growth or marketing teams need to ask diagnostic behavioural questions (why did week-2 retention drop, did the new pricing page move trial-to-paid, which onboarding step has the highest drop-off) and you want methodology guard-rails the LLM cannot break.
  • Run both when Cube already serves BI and embedded analytics across the organisation and product analytics is a separate workload — let Cube D3 keep owning the broad semantic surface and let Mitzu specialise in the behavioural layer. External agents can reach both via MCP.

FAQ

Can Cube D3 replace a product analytics tool?

For descriptive product metrics — DAU, weekly events by feature, signups by channel — Cube D3 will answer well, especially once the underlying cubes are modelled. For diagnostic product analytics — funnels with strict conversion windows, retention cohorts, journey trees, churn analysis, sampled filter values, impact analysis on a release — methodology lives in whatever SQL the LLM authors against the semantic layer. A deterministic engine that owns the methodology removes a class of error that a BI-shaped semantic layer alone cannot.

Does Mitzu have a universal semantic layer like Cube?

Mitzu's semantic layer is specialised, not universal. It stores events, event properties, entities (users, sessions, accounts, teams), dimension properties on those entities, and sampled property values — the things product analytics questions actually need. It does not aim to replace a BI semantic layer or feed embedded customer-facing analytics. If those are the workloads, a universal semantic layer like Cube is the right shape.

Is Cube open source? Is Mitzu open source?

Cube has an open-source core (cube-js/cube) plus a commercial Cube Cloud where Cube D3 lives. Mitzu is a commercial SaaS product; the Enterprise tier supports self-hosted deployment in the customer's own infrastructure. Pricing is per editor seat with unlimited events on every tier — see the pricing page for current details.

Can I use Cube D3 and Mitzu together?

Yes. Both expose MCP endpoints. A team using Cube as the universal semantic layer for BI and embedded analytics can run Mitzu in parallel for product analytics, with both layers reading from the same warehouse. An external agent (Claude, Cursor, ChatGPT, custom) can be wired to call Cube for BI questions and Mitzu for behavioural questions, picking the right backend per question type.
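A host agent's routing decision can be as simple as a keyword heuristic, sketched below. The backend names and the hint list are illustrative assumptions; a production router would more likely let the LLM choose based on the two MCP servers' tool descriptions.

```python
# Hedged sketch: routing a question to one of two MCP backends.
# Backend names and keyword hints are illustrative, not real identifiers.
BEHAVIOURAL_HINTS = ("funnel", "retention", "cohort", "journey", "drop-off")

def pick_backend(question: str) -> str:
    q = question.lower()
    if any(hint in q for hint in BEHAVIOURAL_HINTS):
        return "mitzu-mcp"   # behavioural / product analytics question
    return "cube-mcp"        # descriptive BI question

print(pick_backend("Why did week-2 retention drop?"))   # prints "mitzu-mcp"
print(pick_backend("Revenue by region, last quarter"))  # prints "cube-mcp"
```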

Where does the data live in either tool?

In the customer's warehouse. Cube reads the warehouse through its semantic layer; Mitzu reads the warehouse through its deterministic engine. Neither moves data into a vendor silo. Compliance, data residency and cost control stay on the customer's side of the line.


Key Takeaways

  • The architectural distinction is who writes the SQL: an LLM grounded in a hand-authored semantic layer (Cube D3) or a deterministic engine driven by a typed analysis specification (Mitzu).
  • Cube's semantic layer is BI-shaped — cubes, views, measures, dimensions, joins, segments — authored upfront and maintained over time. Mitzu's is product-analytics-shaped — events, properties, entities, sampled values — built automatically by a Configuration Agent.
  • Funnels, retention, journeys and cohorts are first-class primitives in Mitzu, not SQL the LLM has to compose correctly each time.
  • Both architectures keep the data in the customer's warehouse and neither requires data egress.

About the Author

István Mészáros

Co-founder & CEO

LinkedIn: https://www.linkedin.com/in/imeszaros/

Co-founder and CEO of Mitzu. Passionate about product analytics and helping companies make data-driven decisions.


How to get started with Mitzu

Start analyzing your product data in three simple steps

Connect your data warehouse

Securely connect Mitzu to your existing data warehouse in minutes.

Define your events

Map your product events and user properties with our intuitive interface.

Start analyzing

Create funnels, retention charts, and user journeys without writing SQL.