Conviva Connect: Google Cloud Storage

 

Conviva Connect enables self-service configuration of the storage location, delivery frequency, and simple transformations of your content, ad, identity, and viewer data. Supported data storage destinations, such as databases, data lakes, and data warehouses, include Google BigQuery, Snowflake, Google Cloud Storage (GCS), and Amazon S3.

GCS Delivery

Question: What is your GCP project ID?
Example: platform-piccolo-dev-0

Question: What is your bucket name?
Example: daas_pre_delivery_test

Question: Do you need any folder structure within the bucket?

Delivered data uses the following structure:

For daily delivery:

${bucket name/path}/yyyy-MM-dd/_SUCCESS
${bucket name/path}/yyyy-MM-dd/_common_metadata
${bucket name/path}/yyyy-MM-dd/_metadata
${bucket name/path}/yyyy-MM-dd/part-r-00000.parquet

For hourly delivery:

${bucket name/path}/yyyy-MM-dd HH:mm:ss/_SUCCESS
${bucket name/path}/yyyy-MM-dd HH:mm:ss/_common_metadata
${bucket name/path}/yyyy-MM-dd HH:mm:ss/_metadata
${bucket name/path}/yyyy-MM-dd HH:mm:ss/part-r-00000.parquet

Please make sure there are no path conflicts.

Example: gs://daas_pre_delivery_test/test-account

Final delivery structure:

gs://daas_pre_delivery_test/test-account/2022-02-21/{data_files}
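The daily and hourly folder patterns above can be sketched in Python. This is a minimal illustration, not part of the Conviva product; the function name and the bucket/path values are hypothetical, and the format strings simply mirror the yyyy-MM-dd and yyyy-MM-dd HH:mm:ss folder names shown above.

```python
from datetime import datetime

def delivery_prefix(bucket: str, path: str, ts: datetime, hourly: bool = False) -> str:
    """Build the GCS prefix a delivery batch is written under.

    Daily batches land in a yyyy-MM-dd folder; hourly batches land in a
    yyyy-MM-dd HH:mm:ss folder, matching the structure documented above.
    """
    fmt = "%Y-%m-%d %H:%M:%S" if hourly else "%Y-%m-%d"
    return f"gs://{bucket}/{path}/{ts.strftime(fmt)}/"

# The daily example from this document:
print(delivery_prefix("daas_pre_delivery_test", "test-account",
                      datetime(2022, 2, 21)))
# gs://daas_pre_delivery_test/test-account/2022-02-21/
```

Data files such as part-r-00000.parquet and the _SUCCESS, _metadata, and _common_metadata markers are then written directly under this prefix, which is why daily and hourly deliveries for the same account must use different paths to avoid conflicts.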

Question: Do you want to provide your own service account, or use the Conviva service account (daas-cust-deliver-svc@platform-piccolo-prod-0-53a6.iam.gserviceaccount.com) to access your bucket?

In either case, the service account must be granted the following permissions:

  • storage.buckets.get
  • storage.objects.get
  • storage.objects.list
  • storage.objects.create
  • storage.objects.delete
  • storage.multipartUploads.*
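A quick way to sanity-check an IAM setup is to diff the permissions actually granted against the list above. The sketch below is illustrative only: the expansion of storage.multipartUploads.* into individual permission names, and the simulated "granted" set, are assumptions, not output from a real IAM query.

```python
# Required permissions from the list above; the multipartUploads wildcard
# is expanded here into the concrete GCS permission names (assumption).
REQUIRED = {
    "storage.buckets.get",
    "storage.objects.get",
    "storage.objects.list",
    "storage.objects.create",
    "storage.objects.delete",
    "storage.multipartUploads.create",
    "storage.multipartUploads.abort",
    "storage.multipartUploads.list",
    "storage.multipartUploads.listParts",
}

def missing_permissions(granted: set[str]) -> set[str]:
    """Return required permissions absent from the granted set."""
    return REQUIRED - granted

# Simulate a role that is missing delete rights:
granted = REQUIRED - {"storage.objects.delete"}
print(sorted(missing_permissions(granted)))
# ['storage.objects.delete']
```

If the returned set is non-empty, deliveries may fail partway through (for example, cleanup of a partially written batch needs storage.objects.delete).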

 
Question: What is the required file format? Choose between Avro, Parquet, and CSV.

Question: Do you have Session Tags? If yes, do you want them included in the Session Summary delivery?
Example: Yes, we have Session Tags. Please include them in the Session Summary delivery.
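The _SUCCESS file in each delivery folder is the Hadoop/Spark-style marker that a batch has been fully written, so a downstream consumer should only read a folder once that marker exists. A minimal sketch of that check (function name and listing are hypothetical):

```python
def batch_is_complete(object_names: list[str]) -> bool:
    """True once the batch folder listing contains the _SUCCESS marker,
    meaning all data files for that batch have been written."""
    return any(name.rsplit("/", 1)[-1] == "_SUCCESS" for name in object_names)

# Example listing of a daily delivery folder:
listing = [
    "test-account/2022-02-21/part-r-00000.parquet",
    "test-account/2022-02-21/_SUCCESS",
]
print(batch_is_complete(listing))
# True
```

Polling for _SUCCESS (rather than for data files directly) avoids consuming a batch while Parquet parts are still being uploaded.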