How to move data from BigQuery to Google Ads or DV360? - google-cloud-platform

Hi all,
I have been looking to move certain customer data to Google Ads/DV360.
I wanted to understand if there's any way to move data from BigQuery >> Google Ads.
Using Google Ads Scripts we can automate bulk uploads via spreadsheets etc., but that only controls campaigns. Is there any way to automate "remarketing lists" (under "Segments") via Scripts? Also, are there any limitations regarding the number of records, file size, etc.?
From my research I found methods to move data the other way, i.e. Google Ads >> BigQuery, via the BigQuery Data Transfer Service, Scripts, or manual export.
I am therefore exploring the APIs and wanted to understand whether an API can cover the above requirements. If yes, please guide me on how.
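To make the question concrete, this is roughly the flow I'm imagining, modelled on the Customer Match / Offline User Data Job examples from the google-ads Python client (a sketch only; it assumes the google-cloud-bigquery and google-ads packages and an existing user list, and the table name, customer ID, and user list resource name are placeholders):

    # Rough sketch: BigQuery >> Google Ads Customer Match via an Offline User Data Job.
    # Assumes an existing user list; all IDs, table names, and file paths are placeholders.
    import hashlib

    from google.cloud import bigquery
    from google.ads.googleads.client import GoogleAdsClient

    bq = bigquery.Client()
    rows = bq.query("SELECT email FROM `my_project.crm.customer_emails`").result()

    client = GoogleAdsClient.load_from_storage("google-ads.yaml")
    job_service = client.get_service("OfflineUserDataJobService")

    # Create a Customer Match job attached to an existing remarketing user list.
    job = client.get_type("OfflineUserDataJob")
    job.type_ = client.enums.OfflineUserDataJobTypeEnum.CUSTOMER_MATCH_USER_LIST
    job.customer_match_user_list_metadata.user_list = "customers/1234567890/userLists/987654321"
    job_name = job_service.create_offline_user_data_job(
        customer_id="1234567890", job=job
    ).resource_name

    # Turn each BigQuery row into an operation with a normalized, SHA-256 hashed email.
    operations = []
    for row in rows:
        op = client.get_type("OfflineUserDataJobOperation")
        identifier = client.get_type("UserIdentifier")
        identifier.hashed_email = hashlib.sha256(row.email.strip().lower().encode()).hexdigest()
        op.create.user_identifiers.append(identifier)
        operations.append(op)

    # Attach the operations to the job and run it.
    request = client.get_type("AddOfflineUserDataJobOperationsRequest")
    request.resource_name = job_name
    request.operations = operations
    job_service.add_offline_user_data_job_operations(request=request)
    job_service.run_offline_user_data_job(resource_name=job_name)

Is something along these lines the intended way to do it, and does it avoid the record-count and file-size limits of the spreadsheet uploads?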
Your help and response would be highly appreciated. Have a nice day ahead.

Related

How to share large data sets with third-party data consumers/services?

Let's assume I have a client with a plethora of data related to railways (signals, tracks, train timings, hazards, offers, etc.). There are various internal departments in the railway that want that data, much like weather websites get data from the weather department and display it on their sites. My requirement is similar: I want to share the data securely with other departments and services, and I'm looking for the best method to share it with them as quickly as possible once the data is available.
Possible solutions
API based: Create an API for each department and share the data with them via the API. This has its own pros and cons; it is what first came to mind, but we would have to create a lot of APIs, so I was looking at whether Azure or AWS has another service that can do the same.
Azure based solution: I am looking for help on whether Azure and the services it provides can help here. Service Bus, Event Grid, Event Hubs, etc.: could these be of any use?
AWS based solution: Is there any service in AWS that can help here? I don't have much exposure to AWS.
Any other solution?
I have a fair idea that this could be built using APIs, but I am looking at whether I can get this done using cloud platforms like Azure or AWS (see the sketch below). This would allow better integration of the product and can scale.
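For example, here is the kind of event-driven hand-off I had in mind for the AWS option, as a rough sketch (it assumes boto3, an existing SNS topic, and that each department subscribes an SQS queue, Lambda, or HTTPS endpoint to it; the topic ARN, bucket, and dataset names are placeholders):

    # Sketch: publish a "new data available" notification that downstream
    # departments/services can consume (fan-out via SNS subscriptions).
    # Assumes boto3 and an existing topic; the ARN and names are placeholders.
    import json
    import boto3

    sns = boto3.client("sns")

    def notify_new_dataset(bucket: str, key: str) -> None:
        message = {"bucket": bucket, "key": key, "dataset": "railway_signals"}
        sns.publish(
            TopicArn="arn:aws:sns:eu-west-1:123456789012:railway-data-updates",
            Message=json.dumps(message),
            Subject="New railway dataset available",
        )

    notify_new_dataset("railway-data-bucket", "signals/2023-01-01.parquet")

Would a notification/fan-out pattern like this (or its Azure equivalent with Event Grid) be a reasonable alternative to writing one API per department?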

Connecting data from BigQuery to Cloud Functions to perform NLP

I wish to perform sentiment analysis using the Google Natural Language API.
I found documentation that performs sentiment analysis directly on a file located in Cloud Storage: https://cloud.google.com/natural-language/docs/analyzing-sentiment#language-sentiment-string-python.
However, the data I am working on is located in BigQuery. I am wondering how I can call the data directly from a BigQuery table to do the sentiment analysis.
An example of the BigQuery table schema:
I wish to do NLP on the tweet column of the table.
I tried to search for documentation on this but couldn't find anything.
I would appreciate any help or references. Thank you.
You can take a look at BigQuery Remote Functions, which provide a direct integration with Cloud Functions and Cloud Run. The columns returned by a BigQuery SQL query can be passed to the remote function and custom code can be executed on them as required. Please note that Remote Functions are still in preview and might not be suitable for production systems.
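A rough sketch of what that could look like, assuming the functions-framework and google-cloud-language packages (the function, connection, dataset, and column names below are placeholders):

    # Sketch: a Cloud Function used as a BigQuery remote function.
    # BigQuery calls the endpoint with {"calls": [[tweet], ...]} and expects
    # {"replies": [...]} back, one reply per input row.
    import functions_framework
    from google.cloud import language_v1

    nlp = language_v1.LanguageServiceClient()

    @functions_framework.http
    def tweet_sentiment(request):
        calls = request.get_json()["calls"]
        replies = []
        for (tweet,) in calls:
            doc = language_v1.Document(
                content=tweet, type_=language_v1.Document.Type.PLAIN_TEXT
            )
            sentiment = nlp.analyze_sentiment(request={"document": doc}).document_sentiment
            replies.append(sentiment.score)
        return {"replies": replies}

    # On the BigQuery side the remote function is declared roughly like:
    #   CREATE FUNCTION my_dataset.tweet_sentiment(tweet STRING) RETURNS FLOAT64
    #   REMOTE WITH CONNECTION `my_project.us.my_connection`
    #   OPTIONS (endpoint = 'https://REGION-PROJECT.cloudfunctions.net/tweet_sentiment');
    # and then used as: SELECT tweet, my_dataset.tweet_sentiment(tweet) AS score FROM my_dataset.tweets;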
This should be fairly straightforward to do with Dataflow - you could write a pipeline that reads from BigQuery followed by a DoFn that uses Google's NLP Libraries, and then writes the results to BigQuery.
Some wrappers are already provided for you in https://github.com/apache/beam/blob/master/sdks/python/apache_beam/ml/gcp/naturallanguageml.py
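For instance, a minimal version of that pipeline might look roughly like this (a sketch assuming apache-beam[gcp] and google-cloud-language; the project, dataset, table, and column names are placeholders, and the wrapper module linked above could replace the hand-written DoFn):

    # Sketch: Dataflow pipeline that reads tweets from BigQuery, scores them
    # with the Natural Language API, and writes the results back to BigQuery.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from google.cloud import language_v1

    class AnalyzeSentiment(beam.DoFn):
        def setup(self):
            self.client = language_v1.LanguageServiceClient()

        def process(self, row):
            doc = language_v1.Document(
                content=row["tweet"], type_=language_v1.Document.Type.PLAIN_TEXT
            )
            sentiment = self.client.analyze_sentiment(request={"document": doc}).document_sentiment
            yield {"tweet": row["tweet"], "score": sentiment.score, "magnitude": sentiment.magnitude}

    # Pass --runner=DataflowRunner --project=... --temp_location=gs://... when launching on Dataflow.
    with beam.Pipeline(options=PipelineOptions()) as p:
        (
            p
            | beam.io.ReadFromBigQuery(
                query="SELECT tweet FROM `my_project.my_dataset.tweets`", use_standard_sql=True
            )
            | beam.ParDo(AnalyzeSentiment())
            | beam.io.WriteToBigQuery(
                "my_project:my_dataset.tweet_sentiment",
                schema="tweet:STRING,score:FLOAT,magnitude:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )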

Adding Google Analytics segments to AWS AppFlow

I am trying to add segments to my AWS AppFlow flow that is pulling Google Analytics data.
This is because I am running into sampling problems (Google Analytics summarizes a lot of the data, which makes analysis impossible).
I can add date-range filters, but even with that set to the minimum I still need to break the requests down further via segments. However, I cannot find any support articles or examples online of anyone doing something similar.
I have used the Google Analytics API by itself, without AppFlow, and been able to get all the data without sampling, but I need to do something similar using AppFlow.
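For reference, the segmented request that works when calling the Reporting API v4 directly looks roughly like this (a sketch assuming google-api-python-client and a service-account credential; the view ID is a placeholder, and gaid::-1 is the built-in "All Users" segment, while custom segments use their own gaid:: IDs):

    # Sketch: a segmented Google Analytics Reporting API v4 request (outside AppFlow).
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        "credentials.json", scopes=["https://www.googleapis.com/auth/analytics.readonly"]
    )
    analytics = build("analyticsreporting", "v4", credentials=creds)

    response = analytics.reports().batchGet(
        body={
            "reportRequests": [{
                "viewId": "123456789",
                "dateRanges": [{"startDate": "2022-01-01", "endDate": "2022-01-01"}],
                "metrics": [{"expression": "ga:sessions"}],
                # The ga:segment dimension is required whenever segments are requested.
                "dimensions": [{"name": "ga:pagePath"}, {"name": "ga:segment"}],
                "segments": [{"segmentId": "gaid::-1"}],
            }]
        }
    ).execute()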
What is the correct way to add segments to a Google Analytics AppFlow flow?
Thanks in advance for any help.

Google Merchant Center - retrieve the BestSellers_TopProducts_ report without the BigQuery Data Transfer service?

I'm trying to find a way to retrieve a specific Google Merchant Center report (BestSellers_TopProducts_) and upload it to BigQuery as part of an ETL process we're developing for a customer at my workplace.
So far, I know you can set up the BigQuery Data Transfer Service so it automates the process of downloading this report, but I was wondering whether I could accomplish the same with Python and some Google API libraries (like python-google-shopping), or whether I'm overdoing it and setting up the service is the way to go.
Is there a way to accomplish this rather than resorting to the aforementioned service?
On the other hand, assuming the BigQuery Data Transfer Service is the way to go, I see (in the examples) that you need to create and provide the dataset the report data will be extracted to, so I guess the extraction is limited to the GCP project you're working with.
I mean... you can't extract the report data for a third party, even if you had the proper service account credentials, right?
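For completeness, the transfer config itself can be created programmatically rather than through the console; a rough sketch with the google-cloud-bigquery-datatransfer client (the data_source_id and params keys below are assumptions to verify against the Merchant Center transfer documentation, and the project, dataset, and merchant IDs are placeholders):

    # Sketch: create a Merchant Center -> BigQuery transfer config in code.
    # The data_source_id and params keys are assumptions to check against the
    # Merchant Center transfer docs; IDs and names are placeholders.
    from google.cloud import bigquery_datatransfer

    client = bigquery_datatransfer.DataTransferServiceClient()

    transfer_config = bigquery_datatransfer.TransferConfig(
        destination_dataset_id="merchant_center_reports",  # existing dataset in your project
        display_name="Merchant Center best sellers",
        data_source_id="merchant_center",                   # assumed source ID for Merchant Center
        params={"merchant_id": "1234567"},                  # placeholder merchant ID
        schedule="every 24 hours",
    )

    created = client.create_transfer_config(
        parent=client.common_project_path("my-gcp-project"),
        transfer_config=transfer_config,
    )
    print(f"Created transfer config: {created.name}")

Even so, the destination dataset still has to live in the project that owns the transfer, which is what prompts the third-party question above.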

Google Tag Manager clickstream to Amazon

So the question has more to do with what services I should be using to get efficient performance.
Context and goal:
What I'm trying to do, exactly, is use a Tag Manager custom HTML tag so that after each Universal Analytics tag (event or pageview) fires, an HTTP request with a payload similar to what is sent to Google Analytics is also sent to my own EC2 server.
What I have thought, planned, and researched so far:
At the moment I have two big options:
Use AWS Kinesis, which seems like a great idea, but the problem is that it only drops the information into one Redshift table, and I would like to have at least 4 or 5 so I can differentiate pageviews from events, etc. My solution would be to split each request on the server side into a separate stream (see the sketch below).
The other option is to use Spark + Kafka. (Here is a detailed explanation.)
I know that at some point this means I'm building a parallel Google Analytics, with everything that implies. I still need to decide what information I should send (I'm referring to which parameters, for example source and medium), how to format it correctly, and how to process it correctly.
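As a concrete example of that server-side split, something like the following is what I have in mind (a sketch assuming boto3 and one Kinesis stream per hit type; the stream names are placeholders):

    # Sketch: route each incoming hit to a Kinesis stream based on its
    # Universal Analytics hit type ("t" parameter). Stream names are placeholders.
    import json
    import boto3

    kinesis = boto3.client("kinesis")

    STREAMS = {
        "pageview": "clickstream-pageviews",
        "event": "clickstream-events",
        "transaction": "clickstream-transactions",
    }

    def route_hit(payload: dict) -> None:
        stream = STREAMS.get(payload.get("t"), "clickstream-other")
        kinesis.put_record(
            StreamName=stream,
            Data=json.dumps(payload).encode("utf-8"),
            PartitionKey=payload.get("cid", "anonymous"),  # GA client ID as the partition key
        )

    route_hit({"t": "pageview", "cid": "555.123", "dp": "/home", "cs": "google", "cm": "organic"})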
Questions and debate points:
Which option is more efficient and easier to set up?
Should I send this information directly from the page/app's server, or from the client side, making the browser issue the requests as I explained before?
Has anyone done something like this in the past? Any personal recommendations?
You'd definitely benefit from the Google Analytics customTask feature instead of custom HTML. More on this from Simo Ahava. Also, Google BigQuery is quite a popular destination for streaming hit data, since it allows many on-the-fly computations such as sessionization, and there are many ready-to-use cases for BQ.
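If you go the BigQuery route, a collector can be as small as this (a sketch assuming flask and google-cloud-bigquery; the table name is a placeholder and the table must already exist with a matching schema):

    # Sketch: a tiny hit collector that streams GTM/GA-style payloads into BigQuery.
    from flask import Flask, request
    from google.cloud import bigquery

    app = Flask(__name__)
    bq = bigquery.Client()
    TABLE = "my-project.clickstream.hits"  # placeholder table, created beforehand

    @app.route("/collect", methods=["POST"])
    def collect():
        hit = request.get_json(force=True)
        row = {
            "hit_type": hit.get("t"),
            "client_id": hit.get("cid"),
            "page_path": hit.get("dp"),
            "source": hit.get("cs"),
            "medium": hit.get("cm"),
        }
        errors = bq.insert_rows_json(TABLE, [row])  # streaming insert
        return ("", 204) if not errors else (str(errors), 500)

Streaming inserts make the hits queryable almost immediately, which is what enables the on-the-fly computations mentioned above.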