Receive events from Snowplow


Snowplow is a popular choice for event instrumentation. If you are using Snowplow, you can share your event data with Kubit in one of two ways:

  1. If you already store the Snowplow event tables in Snowflake, BigQuery, Redshift, or Databricks, the recommended approach is to share them with Kubit by following the corresponding Secure Data Sharing guide.
  2. If you have not yet built your own data warehouse, you can configure a Snowplow destination that targets the Kubit Snowflake account.
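If your warehouse is Snowflake, option 1 comes down to a handful of standard Snowflake Secure Data Sharing statements. The sketch below only composes those statements as strings so you can review them; the database, schema, share, and account names are placeholders, and the exact objects to share (plus the Kubit account identifier) come from the Secure Data Sharing guide and the Kubit team.

```python
# Illustrative only: compose the standard Snowflake Secure Data Sharing
# statements. All identifiers below are placeholders.
def build_share_statements(database: str, schema: str,
                           share: str, kubit_account: str) -> list[str]:
    return [
        f"CREATE SHARE {share};",
        f"GRANT USAGE ON DATABASE {database} TO SHARE {share};",
        f"GRANT USAGE ON SCHEMA {database}.{schema} TO SHARE {share};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {database}.{schema} TO SHARE {share};",
        # Kubit provides the account identifier to add here.
        f"ALTER SHARE {share} ADD ACCOUNTS = {kubit_account};",
    ]

for stmt in build_share_statements("ANALYTICS", "ATOMIC",
                                   "KUBIT_SHARE", "ORG.KUBIT_ACCOUNT"):
    print(stmt)
```

Run the generated statements in a Snowflake worksheet as a role with the required privileges (for example, ACCOUNTADMIN).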

Snowflake destination integration steps

  1. The Kubit team will provide you with the Snowflake credentials for the destination.
  2. Follow the Snowflake destination guide that matches your Snowplow plan:
    1. Cloud
    2. Enterprise/Open source:
      1. AWS
      2. GCP
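The credentials Kubit provides map onto the standard Snowflake connection parameters you will enter in the loader configuration. As an illustration of the pieces to expect (all names and defaults below are placeholders, not real Kubit values):

```python
import os

# Illustrative only: the Kubit-provided credentials correspond to the
# standard Snowflake connection parameters. Values are placeholders.
def kubit_destination_params() -> dict[str, str]:
    return {
        "account":   os.environ.get("KUBIT_SF_ACCOUNT", "kubit-xy12345"),
        "user":      os.environ.get("KUBIT_SF_USER", "SNOWPLOW_LOADER"),
        "password":  os.environ.get("KUBIT_SF_PASSWORD", ""),
        "warehouse": os.environ.get("KUBIT_SF_WAREHOUSE", "LOAD_WH"),
        "database":  os.environ.get("KUBIT_SF_DATABASE", "SNOWPLOW"),
        "schema":    os.environ.get("KUBIT_SF_SCHEMA", "ATOMIC"),
    }

# A connectivity check could use the official Python connector, e.g.:
#   import snowflake.connector
#   conn = snowflake.connector.connect(**kubit_destination_params())

print(sorted(kubit_destination_params()))
```

Keep the password out of source control; reading it from the environment, as above, is one common approach.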
We recommend the Spark transformer because it supports deduplication before the data is loaded into the data warehouse.
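Deduplication here means dropping repeated copies of the same event, which Snowplow pipelines can emit and which are typically detected via the event ID. A minimal sketch of the idea (not the Spark transformer's actual implementation):

```python
# Illustrative sketch of deduplication by event ID: keep the first
# occurrence of each event_id and drop later duplicates.
def dedupe_events(events: list[dict]) -> list[dict]:
    seen: set[str] = set()
    unique = []
    for event in events:
        if event["event_id"] not in seen:
            seen.add(event["event_id"])
            unique.append(event)
    return unique

events = [
    {"event_id": "a1", "event": "page_view"},
    {"event_id": "a1", "event": "page_view"},  # duplicate
    {"event_id": "b2", "event": "link_click"},
]
print(len(dedupe_events(events)))  # -> 2
```

Without this step, duplicate events would inflate counts in downstream analytics.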

Handling historical data

Snowplow itself does not store event data, so there is no built-in way to replay historical events: events start streaming from the moment the destination is configured. If you need some historical tail and already store the event stream in a data warehouse, we suggest sharing your data through a Secure Data Share instead of using a destination.