Snowplow is a popular choice for event instrumentation. If you are using Snowplow, you can share your event data with Kubit in a couple of ways:
- If you already store your Snowplow event tables in Snowflake, BigQuery, Redshift, or Databricks, the recommended integration approach is to share them with Kubit by following the corresponding Secure Data Sharing guide.
- If you haven't built your data warehouse yet, you can configure a Snowplow destination targeting the Kubit Snowflake account.
- The Kubit team will provide you with the Snowflake credentials for the destination.
- Follow the Snowflake destination guide that corresponds to your Snowplow plan.
We recommend the Spark transformer because it deduplicates events before they are loaded into the data warehouse.
Snowplow does not store any data, so there is no built-in way to replay historical events: events start streaming only from the moment the destination is configured. If you need some historical tail and are already storing the event stream in a data warehouse, we suggest sharing your data through a Secure Data Share instead of using a destination.
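For the Secure Data Share route on Snowflake, the setup on your side might look roughly like the sketch below. The database name (`snowplow_db`) and the account locator (`kubit_account_locator`) are placeholders; the `atomic.events` schema and table names follow the usual Snowplow loader layout, but you should confirm them against your own warehouse and follow Kubit's Secure Data Sharing guide for the authoritative steps.

```sql
-- Create a share and expose the Snowplow atomic events table to Kubit.
-- All object names and the Kubit account locator are placeholders.
CREATE SHARE IF NOT EXISTS kubit_share;
GRANT USAGE ON DATABASE snowplow_db TO SHARE kubit_share;
GRANT USAGE ON SCHEMA snowplow_db.atomic TO SHARE kubit_share;
GRANT SELECT ON TABLE snowplow_db.atomic.events TO SHARE kubit_share;

-- Attach the Kubit Snowflake account (provided by the Kubit team).
ALTER SHARE kubit_share ADD ACCOUNTS = kubit_account_locator;
```

Because a share exposes live tables rather than copies, Kubit sees your full event history as soon as the share is accepted, which is what makes this route preferable when you need data from before the integration date.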