# SQL templates and dbt models

Analyst-ready warehouse roll-ups for Fabric Experiments — SQL templates, a dbt project, and R notebooks.
The `@fabricorg/experiments-notebooks` package ships three layers of
analyst-ready material so you can plug Fabric Experiments into your
existing analytics stack without writing the boilerplate yourself:
- Standalone SQL templates for ad-hoc analysis (Databricks SQL, Snowflake, Postgres, BigQuery).
- A dbt project that turns raw exposure / conversion events into materialized marts.
- R notebooks that read the dbt outputs and render lift / posterior / SRM reports.
Source: `packages/notebooks/`.
## Warehouse contract
These layers all assume the standard Fabric Experiments event schema, written to your warehouse by the Cloudflare exposure-worker → R2 → Auto Loader path:
```sql
experiments_exposure(
  experiment_id    STRING,
  subject_id       STRING,
  variant_key      STRING,
  manifest_version BIGINT,
  at               TIMESTAMP,
  failure          STRING NULL  -- non-null when variant code threw
)

experiments_conversion(
  subject_id STRING,
  event_name STRING,
  value      DOUBLE NULL,
  at         TIMESTAMP
)
```

If you publish via `fx publish --static` instead of using the manifest-worker /
exposure-worker pair, write your own ingest into tables that match this
schema and the templates below work unchanged.
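As a quick sanity check against this contract, a sample-ratio-mismatch statistic can be computed straight off `experiments_exposure`. This is a minimal sketch, assuming a two-variant 50/50 split and a hypothetical `homepage-cta` experiment — the shipped `03-srm-check.sql` template is the real implementation:

```sql
-- SRM χ² for an assumed 50/50 two-variant split (sketch, Databricks SQL flavor).
WITH counts AS (
  SELECT variant_key, COUNT(DISTINCT subject_id) AS n
  FROM experiments_exposure
  WHERE experiment_id = 'homepage-cta'  -- hypothetical experiment id
  GROUP BY variant_key
),
total AS (
  SELECT SUM(n) AS total_n FROM counts
)
SELECT
  -- compare against 3.84 (χ², df = 1, α = 0.05) to flag a mismatch
  SUM(POWER(n - total_n * 0.5, 2) / (total_n * 0.5)) AS chi_sq
FROM counts CROSS JOIN total;
```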
## SQL templates

Run any of these directly via `databricks sql query --query-file <path>`,
the Databricks SQL Warehouse UI, or any JDBC client. Each template documents
its substitution variables in a header comment.
| File | Computes |
|---|---|
| `01-conversion-rate-lift.sql` | Per-variant conversion rate + lift vs. control |
| `02-bayesian-posterior.sql` | Beta-binomial posterior mean + 95% credible interval |
| `03-srm-check.sql` | Sample-ratio mismatch (SRM) χ² |
| `04-cuped-adjusted-lift.sql` | CUPED variance reduction |
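To give a flavor of the shape these templates take, a stripped-down conversion-rate-lift query over the contract tables might look like the following. Here `:experiment_id` / `:event_name` stand in for the templates' substitution variables, and `'control'` as the control variant key is an assumption, not something the contract mandates:

```sql
-- Per-variant conversion rate and absolute lift vs. an assumed 'control'
-- variant (sketch, Databricks SQL flavor).
WITH per_variant AS (
  SELECT
    e.variant_key,
    COUNT(DISTINCT e.subject_id) AS exposed_n,
    COUNT(DISTINCT c.subject_id) AS converted_n
  FROM experiments_exposure e
  LEFT JOIN experiments_conversion c
    ON c.subject_id = e.subject_id
   AND c.event_name = :event_name
  WHERE e.experiment_id = :experiment_id
  GROUP BY e.variant_key
)
SELECT
  variant_key,
  exposed_n,
  converted_n,
  converted_n / CAST(exposed_n AS DOUBLE) AS conversion_rate,
  converted_n / CAST(exposed_n AS DOUBLE)
    - MAX(CASE WHEN variant_key = 'control'
               THEN converted_n / CAST(exposed_n AS DOUBLE) END) OVER ()
    AS lift_vs_control
FROM per_variant
ORDER BY variant_key;
```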
## dbt project
`packages/notebooks/dbt/` is a self-contained dbt project that materializes
the warehouse contract into analyst-ready marts. It mirrors the role
`mojito-snowplow-storage` plays for Mojito users.
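To make the staging layer concrete, a deduping model in the spirit of `stg_exposures.sql` might look roughly like this. It is a sketch, not the shipped model: the `source('fx', ...)` name and the choice to filter crashed-variant rows here (rather than only in `stg_recipe_failures`) are assumptions:

```sql
-- Sketch of a stg_exposures-style dbt model: keep the first exposure per
-- (experiment_id, subject_id) so repeat exposures can't flip a subject's variant.
WITH ranked AS (
  SELECT
    experiment_id,
    subject_id,
    variant_key,
    manifest_version,
    at,
    ROW_NUMBER() OVER (
      PARTITION BY experiment_id, subject_id
      ORDER BY at ASC
    ) AS rn
  FROM {{ source('fx', 'experiments_exposure') }}  -- source name is hypothetical
  WHERE failure IS NULL  -- drop rows where the variant code threw
)
SELECT experiment_id, subject_id, variant_key, manifest_version, at
FROM ranked
WHERE rn = 1
```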
```
dbt/
  dbt_project.yml
  models/
    staging/
      _sources.yml              # raw tables exposed by the exposure-worker path
      stg_exposures.sql         # deduped, one row per (exp, subject)
      stg_conversions.sql       # subject-distinct per (event, day)
      stg_recipe_failures.sql   # variant-code crashes
    marts/
      fct_experiment_results.sql # per (exp, variant, event) aggregate
      fct_experiment_daily.sql   # per (exp, variant, day) ramp/SRM
      _marts.yml                 # tests + docs
```

### Setup
Configure `~/.dbt/profiles.yml` (Databricks shown — the Snowflake, BigQuery, and
Postgres adapters all work):

```yaml
fx:
  target: prod
  outputs:
    prod:
      type: databricks
      catalog: fx_prod
      schema: marts
      host: <workspace>.cloud.databricks.com
      http_path: /sql/1.0/warehouses/<id>
      token: "{{ env_var('DATABRICKS_TOKEN') }}"
```

Then:
```shell
cd packages/notebooks/dbt
dbt build   # parse + run + test all models
```

### Marts
`fct_experiment_results` — per (experiment, variant, event) aggregate.
Refresh nightly. The Bayesian posterior + SRM χ² in
`@fabricorg/experiments-stats` consume these counts directly via the
hosted API:

```sql
SELECT * FROM fx_prod.marts.fct_experiment_results
WHERE experiment_id = 'homepage-cta'
ORDER BY event_name, variant_key;
```

`fct_experiment_daily` — per (experiment, variant, day) exposure
counts. Powers ramp-up and SRM-monitoring dashboards:
```sql
SELECT day, variant_key, exposed_n
FROM fx_prod.marts.fct_experiment_daily
WHERE experiment_id = 'homepage-cta'
ORDER BY day, variant_key;
```

## R notebooks
Best-effort `.Rmd` ports of the most-used reports from
`mojito-r-analytics`:

- `01-conversion-rate-lift.Rmd` — per-variant conversion rate with bar chart
- `02-bayesian-posterior.Rmd` — Beta-binomial posterior densities + Monte Carlo P(beats control)
- `03-srm-check.Rmd` — χ² with configurable alarm threshold
Connect via DBI + odbc to your warehouse:

```shell
export DATABRICKS_JDBC_URL=...
export DATABRICKS_TOKEN=...
Rscript -e "rmarkdown::render('packages/notebooks/r/01-conversion-rate-lift.Rmd')"
```

These notebooks are not validated in CI and are intended as a starting point — adapt the warehouse connection block for your driver of choice.
## See also
- Web SDK reference — emitting exposure events from the browser.
- YAML schema — declaring the experiments these tables describe.
- Migrating from Mojito — for teams moving off `mojito-r-analytics`.