Fractribution
Package Configuration Variables

This package utilizes a set of variables that are configured to recommended values for optimal performance of the models. Depending on your use case, you might want to override these values by adding them to your dbt_project.yml file.

All variables in Snowplow packages start with snowplow__, but we have removed this prefix in the tables below for brevity.
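As a minimal sketch, an override in dbt_project.yml takes the following shape (the value shown simply repeats the package default; note that the full variable name, including the snowplow__ prefix, is required):

```yml
# dbt_project.yml
vars:
  snowplow_fractribution:
    snowplow__conversion_window_days: 30
```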
Warehouse and tracker
Variable Name | Description | Default |
---|---|---|
page_views_source | The source (schema and table) of the derived snowplow_web_page_views table | {{ source('derived', 'snowplow_web_page_views') }} |
web_user_mapping_table | The schema and table name of the snowplow web user mapping table, if different to default | derived.snowplow_web_user_mapping |
conversions_source | The source (schema and table) of your conversion events, likely your atomic events table | {{ source('atomic', 'events') }} |
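For example, pointing the package at your own sources might look like the following sketch (the values shown are just the defaults from the table above; replace the schema and table names with your own):

```yml
vars:
  snowplow_fractribution:
    snowplow__page_views_source: "{{ source('derived', 'snowplow_web_page_views') }}"
    snowplow__web_user_mapping_table: derived.snowplow_web_user_mapping
    snowplow__conversions_source: "{{ source('atomic', 'events') }}"
```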
Operation and logic
Variable Name | Description | Default |
---|---|---|
conversion_window_start_date | The start date in UTC for the window of conversions to include | current_date()-31 |
conversion_window_end_date | The end date in UTC for the window of conversions to include | |
conversion_window_days | The number of complete days (counted back from the last processed pageview within page_views_source) used to dynamically set the conversion window start and end dates. Only applies if both conversion_window_start_date and conversion_window_end_date are left as empty strings | 30 |
path_lookback_days | Restricts the model to marketing channels within this many days of the conversion (values of 30, 14 or 7 are recommended) | 30 |
path_lookback_steps | The limit for the number of marketing channels to look at before the conversion | 0 (unlimited) |
path_transforms | Dictionary of path transforms (and their argument, null if none) to perform on the full conversion path (see udfs.sql file) | {'exposure_path': null} |
use_snowplow_web_user_mapping_table | true if you are using the Snowplow web model for web user mappings (domain_userid => user_id) | false |
conversions_source_filter | A timestamp field that the conversions source is (ideally) partitioned on, used for optimized filtering; when left blank, derived_tstamp is used | blank |
conversions_source_filter_buffer_days | The number of days to extend the conversions source filter by | 1 |
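For instance, a fixed conversion window and a shorter lookback could be configured as in the sketch below (the dates are hypothetical examples; the path_transforms value repeats the default):

```yml
vars:
  snowplow_fractribution:
    snowplow__conversion_window_start_date: '2023-06-01'
    snowplow__conversion_window_end_date: '2023-06-30'
    snowplow__path_lookback_days: 14
    snowplow__path_transforms: {'exposure_path': null}
```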
Contexts, filters, and logs
Variable Name | Description | Default |
---|---|---|
channels_to_exclude | List of channels to exclude from analysis (empty to keep all channels). For example, users may want to exclude the 'Direct' channel from the analysis. | [] |
channels_to_include | List of channels to include in the analysis (empty to keep all channels). For example, users may want to include the 'Direct' channel only in the analysis. | [] |
conversion_hosts | The url_hosts to filter to when processing the data | [a.com] |
consider_intrasession_channels | If false, only the channel at the start of the session (i.e. the first page view) is considered. If true, multiple channels within the conversion session are considered, as well as historical ones. | false |
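As an example, to exclude the 'Direct' channel and restrict processing to a single host (the host name below is a placeholder for your own site):

```yml
vars:
  snowplow_fractribution:
    snowplow__channels_to_exclude: ['Direct']
    snowplow__conversion_hosts: ['mysite.com']
    snowplow__consider_intrasession_channels: false
```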
Warehouse Specific
Variable Name | Description | Default |
---|---|---|
run_python_script_in_snowpark | A flag for whether to run the Python scripts using Snowpark | false |
attribution_model_for_snowpark | The attribution model to use when running in Snowpark; one of shapley, first_touch, last_touch, position_based, linear. See the package docs for more information | shapley |
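A sketch of enabling the Snowpark path with a non-default attribution model, using the variables from the table above:

```yml
vars:
  snowplow_fractribution:
    snowplow__run_python_script_in_snowpark: true
    snowplow__attribution_model_for_snowpark: 'last_touch'
```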
Output Schemas

By default, all scratch/staging tables will be created in the <target.schema>_scratch schema, the derived tables will be created in <target.schema>_derived, and all manifest tables in <target.schema>_snowplow_manifest. Some of these schemas are only used by specific packages, so ensure you add the correct configurations for each package you are using. To change this, add the following to your dbt_project.yml file. If you want to use just your connection schema with no suffixes, set the +schema: values to null.

```yml
models:
  snowplow_fractribution:
    +schema: my_derived_schema
```
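For instance, to build everything in your connection schema with no suffix, the same block can be used with a null value:

```yml
models:
  snowplow_fractribution:
    +schema: null
```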
Config Generator

You can use the inputs below to generate the code that you need to place into your dbt_project.yml file to configure the package as you require. Any values not specified will use their default values from the package.

Project Variables:

```yml
vars:
  snowplow_fractribution: null
```