Embedded Analytics in Python: A Developer's Guide (2026)

Embed dashboards in Python apps with FastAPI, Django, or Flask. Guest-token pattern for multi-tenant row-level security, three frontend rendering options (web component, Streamlit iframe, Dash iframe), and how Databrain, Metabase, Cube, Superset, and Lightdash compare for Python teams.

Rahul Pattamatta
Co‑Founder and CEO of DataBrain
Published On:
April 28, 2026
Updated On:
April 28, 2026

Key Takeaways

  • Embedding analytics into a Python app from a FastAPI / Django / Flask backend is a 1–5 day project; building the same capability in Streamlit + custom auth + multi-tenancy is 4–8 weeks of plumbing.
  • The integration shape is identical across vendors: your Python backend exchanges a long-lived API key for a short-lived guest token scoped to a tenant ID; the frontend renders a component (web component, JS SDK, or iframe) using only the guest token.
  • The frontend does not have to be React. Web components (<dbn-dashboard>) work in plain HTML, inside a Streamlit app via st.components.v1.html, inside a Dash app via html.Iframe, or inside any framework you choose.
  • Token-scoped row-level security is the killer feature: pass clientId when minting the token, every query for that token gets a WHERE tenant_id = $clientId filter - provided you've defined your metrics correctly. This eliminates 90% of the "build multi-tenancy" plumbing.
  • Build vs. embed in Python is asymmetric. Streamlit and Dash are excellent for internal dashboards (one tenant, one team) but a poor fit for customer-facing multi-tenant analytics. The crossover point comes faster than most teams expect.

If you'd rather build the dashboard from scratch in Streamlit / Dash / Gradio, we have a separate guide: Python Dashboard: The Complete 2026 Guide. This article assumes you've decided to embed.

You'll learn:

  • Three architectural patterns for embedding analytics in a Python-backed app (and which to avoid)
  • How to install and call an embed API from Python
  • Generating guest tokens from FastAPI, Django, and Flask
  • Implementing row-level security via tokens (so tenant A can't see tenant B's data)
  • Three frontend rendering options (plain HTML web component, Streamlit iframe / components.v1.html, Dash html.Iframe)
  • Triggering dashboard refreshes from a Python data pipeline (Airflow, Prefect, Dagster)
  • How Databrain, Metabase, Cube, Superset, Preset, and Lightdash compare for Python teams

Reference code: the Node.js equivalent of this pattern is in dbn-demos-updated/dbn-demo-react/backend/server.js. The Python translations below follow the same /api/v2/guest-token/create contract, line-by-line.

Most Python-backed SaaS products eventually need customer-facing analytics - a dashboard inside the product where each tenant sees their own data, can filter it, drill in, and export. The honest engineering math: building this from scratch in Streamlit or Dash takes 4–8 weeks once you account for multi-tenancy, RBAC, exports, scheduled email reports, theming, and the long tail of "can you also add a Sankey?". Embedding a pre-built analytics layer is a 1–5 day project from a Python backend.

This guide covers what "embedded analytics in Python" actually looks like in 2026: the three integration patterns from a Python backend, what the FastAPI / Django / Flask code looks like end-to-end, how guest tokens give you multi-tenant row-level security without you building it, three frontend rendering options (web component, Streamlit iframe, Dash iframe), and an honest comparison of the platforms Python teams actually use - Databrain, Metabase, Cube, Superset, Preset, Lightdash.

When to Embed vs. Build From Scratch

Before any code, the honest decision:

| Approach | Time to ship | Multi-tenant out of the box? | Best for |
| --- | --- | --- | --- |
| Custom build in Streamlit / Dash + your own auth + RBAC + multi-tenancy + exports | 4–8 weeks | No - you build it | Internal-only dashboards, single-tenant tools |
| Pure-Python full-stack (Reflex, NiceGUI) | 4–8 weeks | No | Pure-Python teams who'd rather not embed JS |
| Custom Flask/FastAPI + React | 6 weeks – 6 months | No - you build it (this is the hardest path) | When the dashboard is the product |
| Embedded analytics (Databrain, Metabase, Cube, Lightdash) | 1–5 days | Yes - token-scoped RLS | Customer-facing dashboards in a SaaS product |

Build custom when:

  • The dashboard is internal only (no customer access, no multi-tenancy, no exports needed)
  • You have a dedicated analytics team and a multi-year horizon
  • The dashboard is the product itself (you're Mixpanel, Amplitude, Observable)

Embed when:

  • Analytics is a feature inside a broader SaaS product, not the product itself
  • You have multi-tenant data (each customer sees only their rows)
  • You need more than 5–6 chart types
  • You need exports, scheduled email reports, drill-downs, and filter bars out of the box
  • You want to ship this quarter, not next year

Most Python-backed SaaS teams we've worked with started by building in Streamlit or Dash, hit the multi-tenancy wall, then switched to embedding. The maintenance cost of 20+ chart types with drill-downs and per-tenant access control was the breaking point. BerryBox, Freightify, and SpotDraft all went through this pattern - see their stories.

If you're still deciding, our embedded analytics build vs. buy guide goes into the full engineering-cost math (multi-tenancy and AI infrastructure are where most build estimates blow up). For the broader context on what embedded analytics means, see the complete embedded analytics guide.

The Three Architectural Patterns

There are three ways analytics platforms integrate into a Python-backed app. The Python piece is the same in all three (your backend mints a token); the difference is how the frontend renders the dashboard.

1. iframes (Streamlit / Dash apps embedded as iframes)

Your existing Streamlit or Dash app, embedded in your customer-facing web app via <iframe src="https://your-streamlit-app.com/?embed=true&tenant=42">.

  • Pros: zero frontend integration; full style isolation; trivial to implement; reuses Streamlit/Dash skills your data team already has.
  • Cons: awkward auth (cross-origin cookies, token-in-URL is leak-prone); can't style beyond what Streamlit/Dash exposes; resizing is fiddly; breaks down at multi-tenant scale because each request still hits your Streamlit/Dash backend.
  • Who uses this: internal teams that have a Streamlit dashboard they want to "expose to customers" without rebuilding it.

iframe-embedding a Streamlit app is fine for internal use. For customer-facing SaaS at any scale, you usually want one of the other two.

2. Hosted analytics URL (Metabase, DataBrain hosted dashboard URL)

The vendor hosts the dashboard; your Python backend mints a signed URL scoped to the tenant; you embed via <iframe src="https://analytics.vendor.com/embed/dashboards/123?token=...">.

  • Pros: zero infrastructure on your side; multi-tenancy via the signed URL; works with any frontend.
  • Cons: still iframe-based (same UX caveats); per-vendor signing convention you have to follow exactly.
  • Who uses this: Metabase Pro / Cloud, Looker Studio, Superset Embedded.

3. Web component served from a Python backend (Databrain <dbn-dashboard>, Embeddable)

The vendor ships an HTML custom element. Your Python backend mints a guest token. The frontend renders <dbn-dashboard token="..." dashboard-id="..."> - this works in plain HTML, inside Streamlit, inside Dash, or inside any framework.

  • Pros: Shadow DOM isolates styles (no CSS conflicts); works in React, Vue, Streamlit, Dash, plain HTML - one integration code path; smaller surface area than full SDKs; per-tenant auth via the token, not the URL.
  • Cons: slightly less idiomatic in heavily-typed React (need to declare JSX types); peer-dependency footprint when the component is React-under-the-hood.
  • Who uses this: Databrain, Embeddable, some newer platforms.

This is the approach we'll use in the code below. If your team builds in more than one frontend framework - or if your "frontend" is sometimes a Streamlit app and sometimes a customer-facing React app - web components are the cleanest option because the Python backend is the same code in every case.

Step 1: The Databrain API Contract (Same Across Vendors in Spirit)

Every embed SDK works on the same principle: your backend exchanges your long-lived API key for a short-lived guest token scoped to a specific tenant. The frontend only ever sees the guest token. The Databrain endpoint is POST /api/v2/guest-token/create; Metabase, Cube, and Lightdash have analogous endpoints. The Python code for all of them follows the same shape - only the URL and the body keys differ.

The Databrain request body:

{
  "clientId": "tenant-42",
  "dataAppName": "your-data-app-name",
  "permissions": {
    "isEnableManageMetrics": false,
    "isEnableCustomizeLayout": false,
    "isEnableUnderlyingData": false,
    "isEnableDownloadMetrics": true
  },
  "expiryTime": 3600
}
  • clientId is the tenant boundary. Whatever you pass here determines what data the token can see. In a SaaS app, pass your own tenant ID or workspace ID here - and make sure every metric definition in your dashboard filters on this field. Miss it in one metric and you've shipped a cross-tenant leak. Treat metric definitions with the same rigor as SQL parameter binding in your main app.
  • permissions is your embed RBAC. Different user roles (viewer, editor, admin) get different guest tokens with different permissions.
  • expiryTime in seconds (default 3600 = 1 hour). The frontend refreshes the token before expiry.
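
The permissions block maps naturally onto your app's user roles. A sketch of that mapping - the role names here are hypothetical, the flag names come from the request body above; unknown roles fall back to the most restrictive set:

```python
# Hypothetical role names; permission flags are from the Databrain request body.
ROLE_PERMISSIONS = {
    "viewer": {
        "isEnableDownloadMetrics": False,
        "isEnableManageMetrics": False,
        "isEnableCustomizeLayout": False,
    },
    "editor": {
        "isEnableDownloadMetrics": True,
        "isEnableManageMetrics": False,
        "isEnableCustomizeLayout": True,
    },
    "admin": {
        "isEnableDownloadMetrics": True,
        "isEnableManageMetrics": True,
        "isEnableCustomizeLayout": True,
    },
}


def permissions_for(role: str) -> dict:
    # Unknown or missing roles get the most restrictive permission set.
    return ROLE_PERMISSIONS.get(role, ROLE_PERMISSIONS["viewer"])
```

Drop the returned dict into the `permissions` key of the guest-token payload when minting a token for that user.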

The response:

{ "token": "eyJhbGciOiJIUzI1NiI...", "expiresAt": "2026-04-28T08:00:00Z" }
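
If you forward expiresAt to callers, a small stdlib helper can compute when to mint the next token. A sketch - the five-minute margin is an arbitrary choice, and the `Z`-suffix normalization keeps it working on Python < 3.11:

```python
from datetime import datetime, timedelta


def refresh_at(expires_at: str, margin_minutes: int = 5) -> datetime:
    # Parse the ISO-8601 expiry from the response; normalize the "Z" suffix
    # so datetime.fromisoformat accepts it on Python < 3.11.
    expiry = datetime.fromisoformat(expires_at.replace("Z", "+00:00"))
    # Back off by a safety margin so the frontend never holds an expired token.
    return expiry - timedelta(minutes=margin_minutes)
```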

Now let's wire this from three Python backends.

Step 2a: FastAPI Guest-Token Endpoint

# main.py
import os
from contextlib import asynccontextmanager
from typing import Annotated
import httpx
from fastapi import Depends, FastAPI, HTTPException, Request
from pydantic import BaseModel

DATABRAIN_API_BASE = os.environ.get("DATABRAIN_API_BASE_URL", "https://api.usedatabrain.com")
DATABRAIN_API_TOKEN = os.environ["DATABRAIN_API_TOKEN"]
DATABRAIN_DATA_APP_NAME = os.environ["DATABRAIN_DATA_APP_NAME"]


# Reuse a single httpx.AsyncClient across all requests via FastAPI's
# lifespan. The naive `async with httpx.AsyncClient(...) as client:`
# pattern inside the request handler creates a new TCP connection pool
# per request - at any meaningful traffic volume that exhausts file
# descriptors and adds 50-200ms of TCP/TLS handshake to every call.
@asynccontextmanager
async def lifespan(app: FastAPI):
    app.state.http = httpx.AsyncClient(timeout=10.0)
    yield
    await app.state.http.aclose()


app = FastAPI(lifespan=lifespan)


class GuestTokenRequest(BaseModel):
    permissions: dict | None = None


# In production, replace this stub with your real auth dependency
# (JWT decoding, session lookup, OAuth introspection, etc.).
async def get_current_tenant_id(request: Request) -> str:
    authorization = request.headers.get("Authorization")
    if not authorization:
        raise HTTPException(status_code=401, detail="Missing Authorization header")
    # ... validate the token and resolve tenant_id from your DB ...
    return "tenant-42"


@app.post("/api/guest-token")
async def create_guest_token(
    request: Request,
    body: GuestTokenRequest,
    tenant_id: Annotated[str, Depends(get_current_tenant_id)],
):
    payload = {
        "clientId": tenant_id,
        "dataAppName": DATABRAIN_DATA_APP_NAME,
        "permissions": body.permissions or {
            "isEnableDownloadMetrics": True,
            "isEnableManageMetrics": False,
            "isEnableCustomizeLayout": False,
        },
        "expiryTime": 3600,
    }

    resp = await request.app.state.http.post(
        f"{DATABRAIN_API_BASE}/api/v2/guest-token/create",
        headers={
            "Authorization": f"Bearer {DATABRAIN_API_TOKEN}",
            "Content-Type": "application/json",
        },
        json=payload,
    )

    # Check the status before parsing so a non-JSON error body can't raise.
    data = resp.json() if resp.status_code < 400 else {}
    if "token" not in data:
        raise HTTPException(status_code=502, detail="Failed to mint guest token")

    return {"token": data["token"]}

Run with uvicorn main:app --reload. The endpoint at POST /api/guest-token returns a token scoped to the authenticated tenant - that's the entire backend integration.

Three things to notice:

  • Tenant ID comes from the auth dependency, never from the request body. This is the single most important security rule when embedding analytics. Trust your own auth, not anything the frontend sends.
  • The Databrain API token lives in environment variables. Never expose it to the frontend.
  • The httpx.AsyncClient is shared across all requests via lifespan. Most FastAPI tutorials show async with httpx.AsyncClient(...) as client: inside the handler, which spins up a fresh TCP connection pool on every call - fine for tutorials, a footgun in production. The lifespan pattern above is the same code with the pool reused.
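
One optional refinement, not in the demo code: if the dashboard page is hot, you can cache minted tokens server-side per (tenant, role) until shortly before expiry - assuming your vendor allows reusing a guest token within its expiryTime, which bearer-style tokens generally do. A framework-agnostic sketch; `GuestTokenCache` and the `mint` callable are our names, standing in for the HTTP call in the handlers above:

```python
import time


class GuestTokenCache:
    """Cache guest tokens per (tenant_id, role), re-minting 5 minutes early."""

    def __init__(self, mint, ttl_seconds=3600, margin_seconds=300):
        self._mint = mint                  # callable: (tenant_id, role) -> token str
        self._ttl = ttl_seconds - margin_seconds
        self._cache = {}                   # (tenant_id, role) -> (token, deadline)

    def get(self, tenant_id: str, role: str = "viewer") -> str:
        key = (tenant_id, role)
        hit = self._cache.get(key)
        if hit and hit[1] > time.monotonic():
            return hit[0]                  # still fresh: reuse without an HTTP call
        token = self._mint(tenant_id, role)
        self._cache[key] = (token, time.monotonic() + self._ttl)
        return token
```

Because the cache key includes the role, a viewer and an admin in the same tenant never share a token (their permission flags differ).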

Step 2b: Django Guest-Token View

# views.py
import os
import httpx
from django.http import JsonResponse
from django.views.decorators.http import require_POST
from django.contrib.auth.decorators import login_required

DATABRAIN_API_BASE = os.environ.get("DATABRAIN_API_BASE_URL", "https://api.usedatabrain.com")
DATABRAIN_API_TOKEN = os.environ["DATABRAIN_API_TOKEN"]
DATABRAIN_DATA_APP_NAME = os.environ["DATABRAIN_DATA_APP_NAME"]


@login_required
@require_POST
def guest_token(request):
    # Resolve tenant_id from the authenticated user.
    # If you use django-tenants or a similar multi-tenant package, this is one line.
    tenant_id = request.user.tenant_id  # adapt to your model

    payload = {
        "clientId": str(tenant_id),
        "dataAppName": DATABRAIN_DATA_APP_NAME,
        "permissions": {
            "isEnableDownloadMetrics": True,
            "isEnableManageMetrics": False,
            "isEnableCustomizeLayout": False,
        },
        "expiryTime": 3600,
    }

    # A throwaway Client per request is fine at typical Django view traffic;
    # for hot paths, hoist a module-level httpx.Client to reuse the pool.
    with httpx.Client(timeout=10.0) as client:
        resp = client.post(
            f"{DATABRAIN_API_BASE}/api/v2/guest-token/create",
            headers={
                "Authorization": f"Bearer {DATABRAIN_API_TOKEN}",
                "Content-Type": "application/json",
            },
            json=payload,
        )

    # Check the status before parsing so a non-JSON error body can't raise.
    data = resp.json() if resp.status_code < 400 else {}
    if "token" not in data:
        return JsonResponse({"error": "Failed to mint guest token"}, status=502)

    return JsonResponse({"token": data["token"]})

URL config:

# urls.py
from django.urls import path
from . import views

urlpatterns = [
    path("api/guest-token", views.guest_token, name="guest-token"),
]

Step 2c: Flask Guest-Token Route

# app.py
import os
import httpx
from flask import Flask, jsonify, request
from functools import wraps

DATABRAIN_API_BASE = os.environ.get("DATABRAIN_API_BASE_URL", "https://api.usedatabrain.com")
DATABRAIN_API_TOKEN = os.environ["DATABRAIN_API_TOKEN"]
DATABRAIN_DATA_APP_NAME = os.environ["DATABRAIN_DATA_APP_NAME"]

app = Flask(__name__)


def require_auth(f):
    @wraps(f)
    def wrapper(*args, **kwargs):
        # Replace with your real auth: Flask-Login, Flask-JWT-Extended, etc.
        token = request.headers.get("Authorization")
        if not token:
            return jsonify({"error": "Unauthorized"}), 401
        # ... validate token, resolve tenant_id ...
        request.tenant_id = "tenant-42"
        return f(*args, **kwargs)
    return wrapper


@app.post("/api/guest-token")
@require_auth
def guest_token():
    payload = {
        "clientId": request.tenant_id,
        "dataAppName": DATABRAIN_DATA_APP_NAME,
        "permissions": {
            "isEnableDownloadMetrics": True,
            "isEnableManageMetrics": False,
            "isEnableCustomizeLayout": False,
        },
        "expiryTime": 3600,
    }

    with httpx.Client(timeout=10.0) as client:
        resp = client.post(
            f"{DATABRAIN_API_BASE}/api/v2/guest-token/create",
            headers={
                "Authorization": f"Bearer {DATABRAIN_API_TOKEN}",
                "Content-Type": "application/json",
            },
            json=payload,
        )

    # Check the status before parsing so a non-JSON error body can't raise.
    data = resp.json() if resp.status_code < 400 else {}
    if "token" not in data:
        return jsonify({"error": "Failed to mint guest token"}), 502

    return jsonify({"token": data["token"]})


if __name__ == "__main__":
    app.run(port=5001)

Three frameworks, identical contract. The hard part of "embedded analytics in Python" is not Python - it's defining your metrics so they all filter on the tenant ID. The Python backend is ~25 lines.

Step 3: Row-Level Security via Guest Tokens

Row-level security is the hardest thing to build from scratch and the biggest reason teams switch from custom Streamlit / Dash to embedded.

In a multi-tenant SaaS, tenant A must never see tenant B's rows, even if they know each other's URLs. Building this yourself in Python means:

  1. A tenant-aware middleware on every API route
  2. WHERE clauses injected into every query (or PostgreSQL RLS policies, if you're careful)
  3. Audit logs for access attempts
  4. UI that hides/shows features based on tenant plan
  5. Integration tests for every tenant-scoping bug you'll create

With embed SDKs, this is reduced to passing the right clientId when minting the token. The token itself carries the scope:

payload = {"clientId": request.user.tenant_id, ...}

The analytics platform resolves every query for that token with a WHERE tenant_id = $clientId filter - as long as every metric definition in your dashboard references that field. If one metric skips the filter, that metric leaks across tenants. The platform reduces the surface area a lot, but it doesn't eliminate the need to review metric definitions when you add new ones.

This is also how you do user-level access on top of tenant-level access. Two patterns:

  • Soft filtering via token context: pass extra context like { "userId": ..., "role": ... } in the token (Databrain supports this via params) and have your metric definitions filter on it.
  • Permission flags in the token: the permissions object controls what features show up (metric creation, layout customisation, downloads).

The shorthand: you define your row-scoping rules once in the platform, and every embed for every customer honors them. No leaky middleware, no forgotten WHERE clause across 30 routes.
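
The two patterns combine into one payload builder. A sketch - the `params` key is the Databrain mechanism mentioned above, the role logic is illustrative, and `dataAppName` is a placeholder:

```python
def guest_token_payload(tenant_id: str, user_id: str, role: str) -> dict:
    # clientId is the hard tenant boundary: row-level scope for every query.
    # params is soft user context your metric definitions can filter on.
    return {
        "clientId": tenant_id,
        "dataAppName": "your-data-app-name",
        "params": {"userId": user_id, "role": role},
        "permissions": {
            # Illustrative role logic: only admins manage metrics,
            # admins and editors may customize layout.
            "isEnableManageMetrics": role == "admin",
            "isEnableCustomizeLayout": role in ("admin", "editor"),
            "isEnableDownloadMetrics": True,
        },
        "expiryTime": 3600,
    }
```

Whichever backend mints the token, `tenant_id` must come from your auth layer, never from the request body - the same rule as in Step 2.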

Step 4: Three Frontend Rendering Options

The Python backend is the same; the frontend depends on what you're rendering inside.

Option A: Plain HTML (no JS framework)

The simplest path. Works in any HTML page - Jinja, Django templates, FastAPI's Jinja support, even a static HTML file.

<!-- templates/dashboard.html -->
<!DOCTYPE html>
<html>
<head>
  <title>Dashboard</title>
  <script type="module" src="https://cdn.usedatabrain.com/web-components/dbn-dashboard.js"></script>
</head>
<body>
  <div id="dashboard"></div>

  <script>
    async function render() {
      const res = await fetch('/api/guest-token', {
        method: 'POST',
        credentials: 'include',
      });
      const { token } = await res.json();

      const container = document.getElementById('dashboard');
      container.replaceChildren(); // drop the stale component before re-rendering
      const el = document.createElement('dbn-dashboard');
      el.setAttribute('token', token);
      el.setAttribute('dashboard-id', 'revenue-overview');
      container.appendChild(el);
    }

    render();
    // Mint a fresh token and re-render before the 1-hour expiry. Keep the
    // interval outside render() so each refresh doesn't schedule another one.
    setInterval(render, 50 * 60 * 1000);
  </script>
</body>
</html>

That's the entire frontend. No npm install, no React, no build step. Works in Django templates, Flask Jinja, or FastAPI's Jinja2Templates.

Option B: Inside a Streamlit app

If your customer-facing app is itself a Streamlit dashboard (not common, but happens for AI / data-team tools that started internal and grew customer-facing), you can embed via st.components.v1.html:

# pages/Analytics.py
import streamlit as st
import streamlit.components.v1 as components
import httpx

st.title("Analytics")

# Mint a token from the same Python backend that mints them for your web frontend.
token_resp = httpx.post(
    "https://api.your-app.com/api/guest-token",
    headers={"Authorization": f"Bearer {st.session_state.access_token}"},
)
token = token_resp.json()["token"]

components.html(
    f"""
    <script type="module" src="https://cdn.usedatabrain.com/web-components/dbn-dashboard.js"></script>
    <dbn-dashboard token="{token}" dashboard-id="revenue-overview"></dbn-dashboard>
    """,
    height=900,
)

The token is minted server-side by your Python backend, passed to Streamlit, then handed off to the web component. Same auth model as Option A.

Option C: Inside a Dash app

For Dash, use html.Iframe pointing at a small HTML page your Python backend serves with the web component already wired up. This is the path the Dash maintainers themselves recommend - there's no first-party way to render arbitrary HTML inside a Dash component, and the long-standing third-party workaround (dash-dangerously-set-inner-html) has been archived since 2018 and shouldn't be used in new code.

# pages/dashboard.py
from dash import register_page, html

register_page(__name__, path="/dashboard")

def layout(token=None, **kwargs):
    # Dash Pages forwards query-string parameters to layout(),
    # so /dashboard?token=... arrives here as `token`.
    return html.Div([
        html.H1("Analytics"),
        html.Iframe(
            src=f"/embed/dashboard?token={token}",
            style={
                "width": "100%",
                "height": "900px",
                "border": "none",
            },
        ),
    ])

The /embed/dashboard endpoint is a tiny route in the same Flask app underneath Dash (app.server.route("/embed/dashboard")) that returns the same HTML from Option A - <script> tag plus <dbn-dashboard> element. The iframe boundary keeps the web component's runtime out of Dash's React tree and avoids the deprecated raw-HTML path entirely.
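
A minimal sketch of the HTML that route returns, written as a plain helper so it stays framework-neutral - wire it to `app.server.route("/embed/dashboard")` under Dash, or any Flask route. `embed_dashboard_html` and the escaping step are ours; the element and CDN URL are the ones from Option A:

```python
import html


def embed_dashboard_html(token: str, dashboard_id: str = "revenue-overview") -> str:
    # Escape both values before interpolating into HTML attributes so a
    # malformed token can't break out of the attribute.
    return (
        '<script type="module" '
        'src="https://cdn.usedatabrain.com/web-components/dbn-dashboard.js">'
        "</script>\n"
        f'<dbn-dashboard token="{html.escape(token)}" '
        f'dashboard-id="{html.escape(dashboard_id)}"></dbn-dashboard>'
    )
```

The route body then reduces to reading `token` from the query string and returning this string.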

Don't use dash-dangerously-set-inner-html. PyPI marks it as archived; the last release was 0.0.2 in December 2018. Older Dash tutorials still reference it - those tutorials are stale.

Step 5: Triggering Dashboard Refreshes from a Python Data Pipeline

If your data lands via Airflow / Prefect / Dagster, you can poke the analytics layer to refresh after each pipeline run:

# airflow_dag.py
from airflow import DAG
from airflow.operators.python import PythonOperator
from datetime import datetime
import httpx, os

DATABRAIN_API_BASE = "https://api.usedatabrain.com"
DATABRAIN_API_TOKEN = os.environ["DATABRAIN_API_TOKEN"]


def refresh_dashboards():
    # After your ETL completes, invalidate cached aggregations in the analytics layer.
    # Each vendor has a slightly different endpoint; this is the Databrain shape.
    resp = httpx.post(
        f"{DATABRAIN_API_BASE}/api/v2/data-app/refresh-cache",
        headers={"Authorization": f"Bearer {DATABRAIN_API_TOKEN}"},
        timeout=30.0,
    )
    resp.raise_for_status()


with DAG("daily_analytics_refresh", start_date=datetime(2026, 1, 1), schedule="@daily") as dag:
    refresh = PythonOperator(task_id="refresh", python_callable=refresh_dashboards)

Same pattern for Prefect tasks (@flow + @task), Dagster ops, or a plain cron + python script.py. The point: the Python backend that mints tokens is the same Python codebase that owns the data pipeline, so you can wire them together cleanly.
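
For the plain-cron variant, a stdlib-only sketch so it runs anywhere without installing an orchestrator - same endpoint and env var as the DAG above; `build_refresh_request` is our helper name:

```python
# refresh_dashboards.py - run from cron, e.g. `5 6 * * * python refresh_dashboards.py`
import os
import urllib.request

DATABRAIN_API_BASE = "https://api.usedatabrain.com"


def build_refresh_request(api_token: str) -> urllib.request.Request:
    # POST with the bearer token; this endpoint takes no request body.
    return urllib.request.Request(
        f"{DATABRAIN_API_BASE}/api/v2/data-app/refresh-cache",
        method="POST",
        headers={"Authorization": f"Bearer {api_token}"},
    )


if __name__ == "__main__":
    req = build_refresh_request(os.environ["DATABRAIN_API_TOKEN"])
    with urllib.request.urlopen(req, timeout=30) as resp:
        print(resp.status)
```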

How Embedded Analytics Platforms for Python Compare

The embedded-analytics-from-Python market in 2026 is a small handful of platforms with very different shapes. Honest breakdown:

| Platform | Integration | Strength | Weakness | Starts at |
| --- | --- | --- | --- | --- |
| Databrain | Web components (<dbn-dashboard>) + REST API | Full analytics app with 48 chart types, drill-downs, RLS, exports, scheduled reports, multi-framework via web components | Commercial-only - no free tier or OSS option for evaluation; smaller OSS community than Metabase | $999–$1,995/mo, unlimited viewers |
| Metabase Embedding (Cloud / Pro) | Signed URL (static embeds) or Modular Embedding SDK for React (interactive) | Mature OSS product, huge community, granular React components | Interactive embedding (SDK), tenant isolation, white-label, RLS all require Pro at $575/mo (10 users included, then per-user) - Starter only does static embeds with a "Powered by Metabase" badge | Free OSS (static, badged) / Starter $100/mo (static, badged) / Pro $575/mo for interactive + RLS |
| Cube | REST/GraphQL API + JS SDK | Best if you've adopted Cube as a semantic layer; headless-first | No dashboard UI out of the box - you build the charts | Free (OSS) / $300+/mo (Cloud) |
| Apache Superset | iframe + signed URLs | OSS, Python-native, large chart library | Multi-tenancy is DIY; embedding requires careful setup | Free (OSS), self-hosted |
| Preset (managed Superset) | iframe + signed URLs | Managed Superset with multi-tenant guest tokens built in | Embedded dashboards are not in the Free tier - they're a paid add-on on Professional, billed by viewer-license capacity | Free Starter (no embeds) / Professional $20/user/mo + Embedded add-on $500/mo per 50 viewer licenses |
| Lightdash | React SDK (@lightdash/sdk) | Strong dbt integration, headless API for Python backends | Supports React 18 and 19, Next.js 15+ only; CORS configuration required on your Lightdash instance (Cloud customers must contact Lightdash support to set the env vars) | Free (OSS) / $50+/mo (Cloud, per-user) |

Pricing disclaimer: starts-at numbers are as of April 2026 and move frequently - always cross-check with each vendor's current pricing page before relying on these for a build-vs-buy decision.

Quick picking guide:

  • If you want the fastest path to a multi-tenant customer-facing dashboard with many chart types, drill-downs, exports, and frontend-framework flexibility (web component works in HTML, Streamlit, Dash, React): Databrain
  • If you're already using Metabase internally and want to extend to customers: Metabase Embedding
  • If your team has already invested in Cube as a semantic layer: Cube (and build the chart UI yourself)
  • If you want pure OSS with no vendor lock-in and you're willing to do the multi-tenancy plumbing: Superset
  • If you want managed Superset with embedded guest tokens out of the box: Preset
  • If your analytics are driven from dbt: Lightdash

What You Get Out of the Box vs. Building It

If you skipped the custom build guide because you've already decided to embed, here's the concrete feature gap you're not rebuilding:

| Capability | Custom Build Cost (Streamlit / Dash) | Embedded Cost |
| --- | --- | --- |
| Chart types | ~1 week per new type past Plotly's basics | 30–48 built-in (varies by vendor) |
| Drill-downs with cross-filtering | 2–4 weeks (callback graphs in Dash, session-state acrobatics in Streamlit) | Declarative config |
| Data connectors | 1 week per source (Postgres, Snowflake, BigQuery…) | 15–20 built-in |
| Multi-tenancy / row-level security | 2–4 weeks + a permanent maintenance tax | Token-scoped RLS |
| CSV / PDF export | 1–2 weeks + server-side PDF rendering | Built-in |
| Scheduled email reports | 2–3 weeks (Celery / RQ + templating + deliverability) | Built-in |
| Dashboard builder UI | Rarely worth building - multi-quarter project | Included (for admins) |
| Auth / SSO / RBAC | 2–3 weeks if you need anything beyond basic password | Token-scoped, configured once |
| Time to production | 4–8 weeks | 1–5 days |

The interesting column isn't time-to-ship - it's maintenance tax. Every one of those capabilities needs ongoing engineering forever. Embedded analytics moves that off your roadmap.

When to Build Anyway

Embedding isn't always right. Build custom in Streamlit / Dash / Gradio when:

  • The dashboard is internal only. No customers, no multi-tenancy, no exports, no RBAC. Streamlit ships in a day.
  • The dashboard is the product. If you're Mixpanel, you don't embed Metabase.
  • You need UI that no vendor exposes. Interactive simulations, custom drawing tools, dashboards with real-time collaboration cursors.
  • You have an analytics team. If you have three engineers whose full-time job is dashboards, building gives them leverage.
  • The data is ultra-sensitive and can't leave your VPC. Many embedded platforms offer self-hosted deployment on Docker / Kubernetes / VMs (Databrain self-hosted, Metabase Enterprise on-prem, Superset OSS), but if your compliance posture forbids any vendor anywhere in the data path, build.

If any of those describe you, start with our guide on creating a dashboard in Python from scratch - the runnable code is at github.com/databrainhq/dbn-demos-updated/tree/main/python-tutorial-scratch. And before you commit, work through the build vs. buy cost breakdown - the multi-tenancy and AI-infrastructure cost lines are where homegrown Python dashboard estimates consistently miss.

For everyone else, embed. The labor math is rarely close.

Next Steps

This guide covers FastAPI, Django, Flask, httpx, and the Databrain API v2 guest-token endpoint. Last updated April 2026.

Rahul Pattamatta is co-founder of Databrain, an embedded analytics platform for SaaS.

Frequently Asked Questions

Can I embed a Streamlit dashboard in my SaaS app?

Yes via iframe (<iframe src="https://your-streamlit-app.com/?embed=true">), but the constraints are real: cross-origin auth is awkward, you can't style beyond what Streamlit exposes, resizing is fiddly, and multi-tenancy means you need a tenant param in the URL - which is leak-prone. For internal use, fine. For customer-facing SaaS analytics, an embedded analytics platform with token-scoped multi-tenancy beats a Streamlit iframe almost every time.

How do I add multi-tenancy to a Python dashboard?

If you're building from scratch in Streamlit / Dash, you have to wire it yourself: tenant-aware middleware on every route, WHERE clauses injected into every query, audit logs, integration tests. Budget 2–4 weeks plus a permanent maintenance tax. If you embed, multi-tenancy is one parameter (clientId) in the guest-token request - the analytics platform handles the rest. See the "Row-Level Security" section above.

Does this work with Django?

Yes - the Django guest-token view is in Step 2b above. Use @login_required as your auth dependency, resolve tenant_id from request.user, and call the same POST /api/v2/guest-token/create endpoint. The frontend (whether your Django templates serve HTML, or you have a separate React frontend) renders the web component using only the returned token.

Do I need React to embed analytics in a Python app?

No. Web component embeds (<dbn-dashboard>) work in plain HTML, inside Django/Flask/Jinja templates, inside a Streamlit app via st.components.v1.html, inside a Dash app via an html.Iframe pointing at a tiny route in your backend that serves the component, or inside Vue / Angular / Svelte. The Python backend is the same in every case.

Can I generate guest tokens from FastAPI?

Yes - the FastAPI guest-token endpoint is in Step 2a above. Use a shared httpx.AsyncClient (instantiated in FastAPI's lifespan and reused across requests - not a fresh async with httpx.AsyncClient(...) per call, which exhausts file descriptors at scale), your existing auth dependency to resolve tenant_id, and POST /api/v2/guest-token/create against the analytics API. ~30 lines including the lifespan setup.

Make analytics your competitive advantage

Get in touch with us and see how Databrain can take your customer-facing analytics to the next level.
