Data Warehouse
The Data Warehouse integration is currently in active development as part of Experiments v2. This page describes the flow as it stands in the current prototype — naming, scope, and behaviour may change before general availability.
Connect Flagsmith to your data warehouse (Snowflake, BigQuery, or Databricks) to stream flag evaluation and custom event data for experimentation and downstream analysis. Once connected, Flagsmith writes every flag evaluation and custom event to the configured warehouse, where your team can query it, join it with existing business data, and use it to compute experiment metrics.
The Data Warehouse integration is required to run Experiments — Flagsmith reads from the warehouse to compute metric values per variation. Outside of experiments, it's also useful as a durable store of flag-evaluation history for audit and analysis.
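As a sketch of the kind of downstream analysis this enables, a metric-per-variation query might look like the following. The table and column names here (`flagsmith_evaluations`, `identity`, `flag_name`, `variation`, `evaluated_at`, and the `orders` business table) are illustrative assumptions — the actual managed schema may differ, and the cast/interval syntax shown is Snowflake/Postgres-flavoured:

```sql
-- Hypothetical schema: table and column names are illustrative only.
-- Join Flagsmith evaluation records with an existing orders table to
-- compute a conversion-style metric per variation.
SELECT
    e.variation,
    COUNT(DISTINCT e.identity)  AS users,
    COUNT(DISTINCT o.user_id)   AS converted_users,
    COUNT(DISTINCT o.user_id)::FLOAT
        / NULLIF(COUNT(DISTINCT e.identity), 0) AS conversion_rate
FROM flagsmith_evaluations e
LEFT JOIN orders o
    ON o.user_id = e.identity
   AND o.created_at >= e.evaluated_at
WHERE e.flag_name = 'new_checkout'
GROUP BY e.variation;
```

Because the data lives in your own warehouse, queries like this can join against any business table you already have.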
Scope
- The connection is organisation-scoped: configure it once, and every project in the organisation inherits it. There is one connection per organisation.
Integration setup
Step 1: Open Organisation Integrations
Screenshot placeholder — Organisation Integrations page with the Data Warehouse card highlighted. Target path:
/img/integrations/data-warehouse/integrations-list.png
- Go to Organisation Integrations from the organisation nav.
- Locate the Data Warehouse card.
- Click Add Integration.
Step 2: Choose a warehouse
Screenshot placeholder — Configuration form showing the Snowflake / BigQuery / Databricks selector cards. Target path:
/img/integrations/data-warehouse/config-form.png
- Pick your warehouse provider — Snowflake, BigQuery, or Databricks.
- Fill in the connection details (account URL, database, schema, warehouse, user, authentication method).
Step 3: Test and connect
Screenshot placeholder — Test-passed state showing the inline success banner above the action row. Target path:
/img/integrations/data-warehouse/test-passed.png
- Click Test Connection to verify your credentials without saving them. Flagsmith attempts to authenticate and reports the result inline.
- If the test passes, click Connect to save the configuration and start streaming data.
- If the test fails, review the error, correct your credentials, and re-run the test.
Editing any connection field clears the last test result. Re-run the test before connecting so you aren't saving untested credentials.
Step 4: Verify data is flowing
Screenshot placeholder — Connected state — live stats card showing 24h flag evaluations and custom events, plus connection details grid. Target path:
/img/integrations/data-warehouse/connected.png
Once connected, the warehouse page shows:
- 24-hour flag evaluation count — confirms Flagsmith is writing to your warehouse
- 24-hour custom event count — confirms your app is writing events Flagsmith can read for metric computation
- Connection details — read-only summary of what's configured, with Edit and Disconnect controls
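Alongside the in-app stats, you can spot-check ingestion directly in the warehouse. The table and column names below are assumptions for illustration (Flagsmith manages the actual schema), and the interval syntax varies by warehouse (e.g. BigQuery uses `TIMESTAMP_SUB`):

```sql
-- Hypothetical table and column names; Flagsmith manages the real schema.
SELECT COUNT(*) AS evaluations_last_24h
FROM flagsmith_evaluations
WHERE evaluated_at >= CURRENT_TIMESTAMP - INTERVAL '24 hours';
```

If this count roughly tracks the 24-hour figure shown in the app, data is flowing end to end.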
Managing the connection
- Edit — opens the configuration form with existing values prefilled. Any change clears the last test result; re-run the test before saving.
- Disconnect — stops data streaming and clears the configuration. Historical data already written to your warehouse is unaffected.
How it works
Flagsmith writes to the warehouse asynchronously, so flag-evaluation latency for your app is unchanged. Evaluations queue and flush in background workers — expect ingestion lag measured in minutes, not seconds.
- Every call to Get Identity Flags results in one evaluation record per flag being written to the configured warehouse.
- Custom events sent via the SDK's analytics endpoint are written to a sibling table in the same warehouse.
- Table schemas are managed by Flagsmith — you don't need to create or migrate tables yourself.
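The queue-and-flush pattern described above can be sketched as follows. This is an illustration of asynchronous batching in general, not Flagsmith's implementation — all names here are hypothetical:

```python
import queue
import threading
import time

class EvaluationWriter:
    """Illustrative sketch: evaluations are enqueued on the hot path and
    flushed to the warehouse in batches by a background worker, so flag
    evaluation latency is unaffected. (Hypothetical, not Flagsmith's code.)"""

    def __init__(self, flush_interval=0.05, batch_size=100):
        self._queue = queue.Queue()
        self.batches = []  # stands in for warehouse INSERTs
        self._interval = flush_interval
        self._batch_size = batch_size
        self._stop = threading.Event()
        self._worker = threading.Thread(target=self._run, daemon=True)
        self._worker.start()

    def record(self, identity, flag, value):
        # Hot path: enqueue only, never block on I/O.
        self._queue.put({"identity": identity, "flag": flag, "value": value})

    def _run(self):
        # Drain the queue into batches; keep going until stopped AND empty.
        while not self._stop.is_set() or not self._queue.empty():
            batch = []
            deadline = time.monotonic() + self._interval
            while len(batch) < self._batch_size and time.monotonic() < deadline:
                try:
                    batch.append(self._queue.get(timeout=0.01))
                except queue.Empty:
                    pass
            if batch:
                self.batches.append(batch)  # real code would write to the warehouse

    def close(self):
        self._stop.set()
        self._worker.join()

writer = EvaluationWriter()
for i in range(5):
    writer.record(f"user_{i}", "new_checkout", True)
writer.close()
total_written = sum(len(b) for b in writer.batches)
```

In a real pipeline the flush interval and batch size trade ingestion lag against write amplification, which is why lag is measured in minutes rather than seconds.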
Next steps
- Learn how to run Experiments on top of the data you're now streaming.
- Explore other analytics integrations if you also want to stream to a SaaS platform.