Transfer Data from Campaign Manager to Cloud Storage - google-cloud-platform

My question is about how I can set up my Cloud Storage bucket to retrieve data from my Campaign Manager account. I aim to process Campaign Manager report data in BigQuery, combining it with other data sources.
According to the documentation, it seems this is possible with the Data Transfer utility, but I first need to store the data files in a Cloud Storage bucket, and then I can use Data Transfer to load the data into BigQuery.
So how can I get Campaign Manager data into Google Cloud Storage?

Have you already tried following this documentation to set up the BigQuery Data Transfer Service for Campaign Manager? In the Before you begin section, you'll need to contact either your Campaign Manager reseller or Campaign Manager support to set up the Campaign Manager DTv2 files.
After completing this step, you will receive a Cloud Storage bucket name similar to the following: dcdt_-dcm_account123456
After that, you can complete the rest of the steps in the documentation.
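Once the bucket has been provisioned, here is a minimal sketch of creating the transfer configuration programmatically with the Python client. The project, location, dataset, and network ID are placeholders, and the data source ID and parameter names reflect my understanding of the Campaign Manager connector, so double-check them against the documentation:

```python
# Hedged sketch: create a Campaign Manager transfer config in the
# BigQuery Data Transfer Service. All identifiers are placeholders.
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()
parent = "projects/my-project/locations/us"  # hypothetical project/location

transfer_config = bigquery_datatransfer.TransferConfig(
    display_name="campaign_manager_transfer",
    data_source_id="dcm_dt",                    # Campaign Manager connector
    destination_dataset_id="campaign_manager",  # hypothetical dataset
    params={
        "bucket": "dcdt_-dcm_account123456",  # bucket from the setup step
        "network_id": "123456",               # your Campaign Manager network ID
    },
)

config = client.create_transfer_config(parent=parent, transfer_config=transfer_config)
print(f"Created transfer config: {config.name}")
```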

Related

Is there a way to automatically transfer data from an AWS report to Google Sheets

I get automated reports from an AWS server to my email via a link. The AWS link is similar to this (http://s3.ap-southeast-1.amazonaws.com/analytics/Company/Stock/MS_2020-08-03T02:43-9ebb780c.xlsx) - the data might be deleted by now.
Is there a way I can automatically load the Excel file from this report into Google Sheets?
Also, I don't have access to the S3 server; it's maintained by a different firm and we only get automated reports.
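One possible approach, sketched under the assumption that the link stays publicly reachable and that you have a Google service account the target sheet is shared with: pull the file with pandas and push the rows with gspread. The URL, credentials file, and sheet name are placeholders:

```python
# Hedged sketch: copy an .xlsx reachable over HTTP into a Google Sheet.
import pandas as pd
import gspread

REPORT_URL = "http://s3.ap-southeast-1.amazonaws.com/analytics/Company/Stock/report.xlsx"  # placeholder

# pandas can read an .xlsx directly from a URL (needs the openpyxl package).
df = pd.read_excel(REPORT_URL)

# Authenticate with a service account that the sheet has been shared with.
gc = gspread.service_account(filename="service_account.json")
worksheet = gc.open("MyStockReport").sheet1  # hypothetical sheet name

# Replace the sheet contents with the report's header row plus data rows.
worksheet.clear()
worksheet.append_rows([df.columns.tolist()] + df.astype(str).values.tolist())
```

Scheduling this script (e.g. with cron or Cloud Scheduler) would make the refresh automatic.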

In GCP, how do I get a list of all prices

I would like to get all the prices for the different services in GCP via a REST API call.
Do I have to use each service's API to get this, or is there a single call that returns the pricing info for all GCP services?
Have a look at the documentation Get Started with the Cloud Billing API:
For the Cloud Billing Catalog API:
Programmatic access to the entire public Google Cloud catalog consisting of:
Billable SKUs
Public pricing
Relevant metadata
You can access the Cloud Billing API in one of the following ways:
REST API.
RPC API.
More details can be found in the section Get Google Cloud pricing information:
This page shows you how to use the Cloud Billing Catalog API to:
Get a list of all public services including relevant metadata about each service.
Get a list of all public SKUs within a service including:
Human readable description of the SKU.
Public pricing of the SKU.
Regions where the SKU is available for purchase.
Categorization data about the SKU.
with examples of how to do it.
Keep in mind that calling the Cloud Billing Catalog API requires an API Key.
You can use the Cloud Billing Catalog API with your existing cost management tools, as well as to reconcile list pricing rates when you export billing data to Google BigQuery.
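As a minimal sketch of the REST path (assuming you've already created an API key; the Compute Engine service ID below is taken from the public catalog, but worth verifying):

```python
# Hedged sketch: query the Cloud Billing Catalog API v1 with an API key.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder
BASE = "https://cloudbilling.googleapis.com/v1"

# List all public services (paginated; nextPageToken handling omitted).
services = requests.get(f"{BASE}/services", params={"key": API_KEY}).json()
for service in services.get("services", []):
    print(service["serviceId"], service["displayName"])

# List the SKUs of one service, including their public pricing.
compute = "services/6F81-5844-456A"  # Compute Engine's catalog service ID
skus = requests.get(f"{BASE}/{compute}/skus", params={"key": API_KEY}).json()
for sku in skus.get("skus", []):
    print(sku["description"], sku["pricingInfo"][0]["pricingExpression"])
```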

What is the equivalent Google Cloud Platform product related to AWS Data Exchange

https://aws.amazon.com/data-exchange/
I've looked in the official Google Cloud Platform documentation that compares AWS and GCP products, and it doesn't mention the AWS Data Exchange product at all.
So, most likely there isn't an equivalent GCP product to AWS Data Exchange.
That said, BigQuery supports third party data transfer, so maybe this is closest to what you're looking for:
Third party transfers for BigQuery Data Transfer Service allow you to automatically schedule and manage recurring load jobs for external data sources such as Salesforce CRM, Adobe Analytics, and Facebook Ads.
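If you want to see exactly which connectors the BigQuery Data Transfer Service exposes in your project, here is a small sketch with the Python client (the project ID is a placeholder):

```python
# Hedged sketch: enumerate the available Data Transfer Service connectors.
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()
parent = "projects/my-project"  # hypothetical project ID

# Each entry describes one connector, e.g. Campaign Manager or a
# third-party source offered through the service.
for source in client.list_data_sources(parent=parent):
    print(source.data_source_id, "-", source.display_name)
```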

Importing Creatives from Campaign Manager into GCP using Pub/Sub

I am trying to import creatives from Campaign Manager into GCP using Cloud Pub/Sub. There are more details on this page https://cloud.google.com/solutions/creative-analysis-at-scale-with-google-cloud-and-machine-learning but it does not give a clear picture of how to import the creatives.
What's the step-by-step process for that?
That page suggests storing the creatives in Google Cloud Storage. You'll need to upload your creatives into a Cloud Storage bucket. There are a variety of ways to do so (via the Cloud Console, via the gsutil tool, or via Cloud Storage REST APIs), discussed here.
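For instance, a minimal upload sketch with the google-cloud-storage client (the bucket and file names are placeholders):

```python
# Hedged sketch: upload a local creative to a Cloud Storage bucket.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-creatives-bucket")   # hypothetical bucket
blob = bucket.blob("creatives/banner_01.png")   # destination object name
blob.upload_from_filename("local/banner_01.png")
print(f"Uploaded to gs://{bucket.name}/{blob.name}")
```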
You could set up Pub/Sub notifications on your Cloud Storage bucket, configured to automatically publish a Pub/Sub message each time a creative is uploaded. Downstream, you can extract the Cloud Storage URI from the Pub/Sub message, feed it into the Vision API, and analyze the annotations in BigQuery.
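Here is a hedged sketch of both halves, with placeholder topic and bucket names; the notification uses the OBJECT_FINALIZE event, which fires when an upload completes:

```python
# Hedged sketch: bucket notification setup plus a Vision API call.
from google.cloud import storage, vision

storage_client = storage.Client()
bucket = storage_client.bucket("my-creatives-bucket")  # hypothetical bucket

# Publish a Pub/Sub message to the topic whenever an object is uploaded.
notification = bucket.notification(
    topic_name="creative-uploads",   # hypothetical topic
    event_types=["OBJECT_FINALIZE"],
    payload_format="JSON_API_V1",
)
notification.create()

# A subscriber can rebuild the gs:// URI from the message attributes
# (bucketId/objectId) and send it to the Vision API.
def annotate_creative(bucket_id: str, object_id: str) -> None:
    gcs_uri = f"gs://{bucket_id}/{object_id}"
    client = vision.ImageAnnotatorClient()
    image = vision.Image(source=vision.ImageSource(image_uri=gcs_uri))
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(label.description, label.score)
```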

Accessing the Snowflake file system

I am new to Snowflake and I have two questions regarding Snowflake on AWS.
I registered for a free Snowflake account and it gave me a link to access its web UI. In the web UI I could then create a stage using my existing AWS S3 bucket. However, after loading the data, I am not sure where Snowflake stores it. Can I access its file system? Can I change its file system to my existing AWS S3?
While registering for Snowflake on AWS, I went to the AWS Marketplace, subscribed to a Snowflake account, and it gave me a Snowflake web UI. Do I need to do anything else to deploy Snowflake on AWS?
The data you imported from S3 into Snowflake now resides in a logical database table. The database stores its data in its own S3 bucket. The storage format is proprietary, and a given S3 bucket in the database's abstract storage layer may contain data from multiple customers. The data is encrypted, and in the end Snowflake probably doesn't even know, e.g., which disk the data is on; they are S3 users like everyone else.
You can do almost anything from the GUI, but the GUI doesn't provide a proper archive for code and object history, etc. Snowflake has recently acquired a company with a development tool, so maybe something more than the GUI is coming.
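To make the distinction concrete, here is a hedged sketch using the snowflake-connector-python package; the account, credentials, bucket, and table names are all placeholders. The external stage merely points at your S3 bucket, while COPY INTO pulls the rows into Snowflake's own managed storage:

```python
# Hedged sketch: stage on your own S3 bucket, then load into Snowflake.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-southeast-1",  # hypothetical account locator
    user="MY_USER",
    password="MY_PASSWORD",
    warehouse="COMPUTE_WH",
    database="MY_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# The stage references your existing bucket; the files stay in your S3.
cur.execute("""
    CREATE STAGE IF NOT EXISTS my_s3_stage
    URL = 's3://my-existing-bucket/reports/'
    CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
""")

# After this COPY, the rows live in Snowflake's internal storage layer;
# your S3 files are read, not moved.
cur.execute("COPY INTO my_table FROM @my_s3_stage FILE_FORMAT = (TYPE = CSV)")
conn.close()
```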