Learn how to export Analyze/Optimize analytics data to Delta Lake.
Need to know
This feature is only available on specific Enterprise plans.
Connect Analyze/Optimize to your data warehouse so your analytics data flows straight into your existing infrastructure. Exporting your data gives you more control over reporting and lets your team run deeper analysis by keeping everything in one place. Once your developers configure the integration, you'll plug in the credentials they provide to start sending newly collected site data to Delta Lake automatically.
Step 1: Configure Delta Lake (developers)
Before you can export data to Delta Lake, your developers need to configure it and provide you with some info that you'll enter in your Analyze/Optimize settings during step 2.
Provide this document to your developers: Configure Delta Lake as a data export destination
Step 2: Configure Analyze/Optimize
Use the following steps only after your developers have confirmed they've configured your data warehouse and provided you with the required authentication info.
Open your site in Webflow, then:
- Go to the Insights tab > Integrations > Data Export integrations tab
- Click Connect data destination under "Data Exports"
- Select Delta Lake
- Enter the info your developers provided (the fields shown are context-sensitive based on what you select):
  - Storage — select the storage type (e.g., S3, GCS)
  - Bucket name — the name your developers gave the bucket
  - S3 bucket region — the region your developers selected for the S3 bucket
  - Folder name — the name of the folder to create and write data to
  - Bucket host — the host address for the bucket
  - Bucket port — the port number for the bucket
  - Storage account name — the name of the Azure storage account that contains the bucket
  - Auth method — choose between an IAM role and user/password authentication
  - Destination service account email — the email address associated with the IAM role
  - JSON Token — the credentials for the user/password account
  - Retention window (days) — the number of days a deleted file is retained before it can be vacuumed
  - Column mapping mode — select None, ID, or Name
  - Disable deletion vectors — select Yes or No
  - Disable change data feed — select Yes or No
- Click Test connection (review the troubleshooting info if the test fails)
- Click Save if the test was successful
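Because the fields are context-sensitive, which ones you'll actually fill in depends on the storage type and auth method you select. As a rough illustration only, here's a hypothetical sketch of how those groupings might break down — the field groupings and names below are our assumptions, not Webflow's actual validation logic:

```python
# Hypothetical grouping of Delta Lake destination fields per storage type
# and auth method. These groupings are illustrative assumptions only.
COMMON_FIELDS = {"storage", "bucket_name", "folder_name"}

STORAGE_SPECIFIC = {
    "S3": {"s3_bucket_region", "bucket_host", "bucket_port"},
    "GCS": {"bucket_host", "bucket_port"},
    "Azure": {"storage_account_name"},
}

AUTH_FIELDS = {
    "iam_role": {"destination_service_account_email"},
    "user_password": {"json_token"},
}

def required_fields(storage: str, auth_method: str) -> set[str]:
    """Return the set of fields a user would fill in for this combination."""
    return COMMON_FIELDS | STORAGE_SPECIFIC[storage] | AUTH_FIELDS[auth_method]
```

For example, selecting Azure storage with an IAM role would surface the storage account name and service account email fields, while S3 with user/password would surface the region, host, port, and JSON token fields instead.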
Newly collected data is exported nightly (every 24 hours). Historical data isn't exported.
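The retention window field mentioned above follows standard Delta Lake vacuum semantics: a file that's been removed from the table is only physically deleted once it has been gone for at least the retention period. A minimal sketch of that eligibility check (the function name is ours for illustration, not part of any API):

```python
from datetime import datetime, timedelta

def eligible_for_vacuum(deleted_at: datetime, retention_days: int,
                        now: datetime) -> bool:
    """A file removed from the Delta table is only physically deleted by
    vacuuming once it has been deleted for at least the retention window."""
    return now - deleted_at >= timedelta(days=retention_days)
```

With a 7-day retention window, a file deleted on January 1 would become eligible for vacuuming on January 8, but not before.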
Good to know
It's not yet possible to disconnect a data destination from settings. Contact support to disconnect one.
Troubleshooting
If Test connection produces an error, either the authentication info you entered is invalid or your developers didn't configure the data warehouse according to the provided documentation.
If the authentication info was copied from certain sources (e.g., a web page or email), extra HTML formatting from that source can introduce hidden characters when you paste into Webflow. Try the following to remove any hidden characters:
- Paste the info into a plain text editor (e.g., Notepad on Windows or TextEdit on Mac)
- Copy the info directly from that plain text editor
- Paste the "clean" version of the info directly into the Webflow fields
- Click Test connection again
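The cleanup the steps above do by hand can also be sketched programmatically. Here's a small example (our own helper, not a Webflow feature) that strips common invisible characters that rich-text sources can smuggle into a copied credential:

```python
def clean_pasted_value(value: str) -> str:
    """Strip whitespace and common invisible characters that rich-text
    sources (web pages, email) can introduce into a copied credential."""
    hidden = {
        "\u200b": "",   # zero-width space
        "\u200c": "",   # zero-width non-joiner
        "\ufeff": "",   # byte-order mark
        "\u00a0": " ",  # non-breaking space -> regular space
    }
    for char, replacement in hidden.items():
        value = value.replace(char, replacement)
    return value.strip()
```

A value like a bucket name with a leading zero-width space would come out as the plain name, which is what the plain-text-editor round trip achieves manually.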
If the issue persists, check with your developers to confirm the info they provided is accurate, and share the Delta Lake configuration doc with them so they can verify everything is configured correctly.