This guide walks you through the steps to send your custom backend data to Billy Grace. By syncing events that don't happen on your website with Billy Grace, we are able to model attribution for conversions that do not occur directly on the website but are linked to a web event.
Examples of linked web events:
Job application
Lead form submit
Booking of a test ride
Examples of backend events that happen outside the website but can be linked to web events:
An applicant is hired
A lead becomes a client
A car is purchased after the test ride
Important: You can only import backend events into Billy Grace that can be linked to a web event by a reference_id. If this is not the case, check out our other Custom Import possibilities.
Importing custom backend event data is not always included in your package. Please contact your Customer Success Manager for more information.
There are two main ways of sending custom data:
SFTP: you can share a daily .csv or Parquet file containing the events.
BigQuery: you can share access to a BigQuery table, from which we will extract the data.
It is essential that the data you share has the correct schema: if the schema is incorrect, we will not be able to ingest the data into your dashboard.
The schema we expect is listed below. It is also important that the data is available for ingestion on time: files should reach us before 02:00 AM and contain the data of the day before.
For example, a file containing backend event data for the 29th of July needs to reach us before 02:00 AM on the 30th of July. If it arrives later, there may be a lag of up to a week before you see correct attribution for this channel.
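As a sanity check, a daily export job can derive which date its file must cover from the cutoff rule above. A minimal sketch (the filename convention is illustrative, not prescribed):

```python
from datetime import datetime, timedelta

# Example run time: 01:30 AM on 30 July -- before the 02:00 AM cutoff.
now = datetime(2024, 7, 30, 1, 30)

# A file delivered before the cutoff must contain the previous day's events.
data_date = (now - timedelta(days=1)).date()

# Hypothetical filename convention; agree the actual one with your
# Customer Success Manager.
filename = f"backend_events_{data_date.isoformat()}.csv"
print(filename)  # backend_events_2024-07-29.csv
```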
More information on the specific schema we expect and how to connect the different methods is given below.
Schema
The schema of the data we expect is listed below:
| Field Name | Data Type | Description |
| --- | --- | --- |
| date | String | Date in format YYYY-MM-DD. |
| ev | String | The name of the new custom event you want to see in Billy Grace. |
| reference_ev | String | Name of the reference event(s). Can be multiple; in that case they need to be comma separated, e.g. 'eventtwo,eventone'. |
| reference_id | String | String containing the reference ID that needs to match the ID of the reference_ev event. |
| value | Float | The value of the conversion (if applicable). |
ev: Name of the event (e.g., event_deal_closed).
reference_ev: The name of the event (value for the ev field) that this new event will link to.
reference_id: The ID of the original event that the new event should connect to. This ID must match exactly: if a submission event had a transaction_id of 12234, the reference_id should align with this.
value: If applicable, a numeric representation of the event's worth.
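For reference, a minimal CSV file following this schema could look as shown below (event names, IDs, and values are illustrative). Note that when reference_ev contains multiple comma-separated events, the field must be quoted so its commas are not read as column separators:

```
date,ev,reference_ev,reference_id,value
2024-07-29,event_deal_closed,event_lead_form,12234,1500.00
2024-07-29,event_car_purchased,"event_test_ride,event_lead_form",98765,35000.00
```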
How to connect via BigQuery?
Follow these steps to share the specific table with the schema above.
Granting Permissions via IAM:
Go to the IAM & Admin section in your Google Cloud Console.
Click "Add" at the top of the IAM page.
Enter the service account email (data-retriever@billy-grace.iam.gserviceaccount.com).
Assign the roles the service account needs to access the required resources (e.g., Viewer, Storage Object Viewer, etc.).
Click "Save".
Granting Permissions on Specific Resources:
If the service account needs access to specific resources, please grant permissions at the resource level (e.g., on a particular Cloud Storage bucket or BigQuery dataset).
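For a BigQuery dataset, a sketch of such a resource-level grant using the google-cloud-bigquery Python client (the dataset ID is a placeholder):

```python
from google.cloud import bigquery

DATASET = "your-project.your_dataset"  # placeholder
SERVICE_ACCOUNT = "data-retriever@billy-grace.iam.gserviceaccount.com"

client = bigquery.Client()
dataset = client.get_dataset(DATASET)

# Append a read-only access entry for the Billy Grace service account.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="userByEmail",
        entity_id=SERVICE_ACCOUNT,
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```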
For example, to grant access to a Cloud Storage bucket:
Go to the Cloud Storage section in the Google Cloud Console.
Select the bucket.
Click on "Permissions".
Click "Add" and enter your service account email (data-retriever@billy-grace.iam.gserviceaccount.com).
Assign the necessary roles (e.g., Storage Object Viewer).
Click "Save".
After completing this step, please navigate to Settings -> Import Custom Data and complete the form to activate your export.
How to connect via SFTP?
Navigate to Settings -> Import Custom Data. Here, you'll find personalized credentials, including a host, username, and password.
You're required to set up the daily file dump yourself, ensuring that a CSV or Parquet file following the schema above reaches us every day before the designated time.
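As a sketch, the daily dump can be automated with a short script. The example below uses the paramiko library; the host, credentials, and remote path are placeholders for the values shown on the Import Custom Data page:

```python
from datetime import date, timedelta

import paramiko

# Placeholders -- use the credentials from Settings -> Import Custom Data.
HOST = "sftp.example.com"
USERNAME = "your-username"
PASSWORD = "your-password"

# The file uploaded before the 02:00 AM cutoff contains yesterday's events.
yesterday = (date.today() - timedelta(days=1)).isoformat()
local_file = f"backend_events_{yesterday}.csv"

# Upload the file; schedule this script to run before 02:00 AM.
transport = paramiko.Transport((HOST, 22))
transport.connect(username=USERNAME, password=PASSWORD)
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.put(local_file, f"/{local_file}")  # remote path is an assumption
sftp.close()
transport.close()
```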
Once set up, please create an export on the same page where you obtained your credentials by completing the form that appears when you click on 'Add Custom Import'.