Does Google store the requests that are sent via the Google DLP API? - google-cloud-platform

I am trying to understand whether Google stores text or data that is sent to the DLP API. For example, I have some data (text files) locally, and I am planning to use Google DLP to help identify sensitive information and possibly transform it afterwards.
Would Google store the text file data that I am using? In other words, would it retain a copy of the files that I am sending? I have tried reading through the security and compliance page, but I could not find anything that clearly explains this.
Could anyone please advise?
Here is what I was looking at: https://cloud.google.com/dlp/data-security

The Google DLP API only classifies and identifies the kind of (mostly sensitive) data we want to analyse; Google doesn't store the data we send.

We certainly don't store the data being scanned with the *Content api methods beyond what is needed to process it and return a response to you.
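For context, here is a minimal sketch of what such a *Content call looks like with the google-cloud-dlp Python client; the project name, sample text and info types are placeholders:

```python
# Sketch: classify a piece of text with the DLP API.
# "my-project" and the info types are placeholders.
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()

response = dlp.inspect_content(
    request={
        "parent": "projects/my-project",
        "inspect_config": {
            "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}],
            "include_quote": True,
        },
        # The text is sent only to be classified; the response contains the findings.
        "item": {"value": "Contact me at jane.doe@example.com"},
    }
)

for finding in response.result.findings:
    print(finding.info_type.name, finding.likelihood, finding.quote)
```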

Related

How to move data from BigQuery to Google Ads or DV360?

Hi all,
I have been looking to move certain customer data to Google Ads/DV360.
I wanted to understand if there's any way to move data from BigQuery >> Google Ads.
Using scripts we can automate bulk uploads via spreadsheets etc., but this only controls campaigns. Is there any method to automate a "remarketing list" under "Segments" via Scripts? Also, are there any limitations regarding the number of records, file size, etc.?
Upon research I discovered methods to move the data the other way, i.e. Google Ads >> BigQuery, via the "Transfer Service API", "Scripts", or a manual method.
I am therefore exploring the APIs and wanted to understand whether an API can help with the above requirements or not. If yes, please guide me on the same.
Your help and response would be highly appreciated. Have a nice day ahead.
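A rough sketch of the BigQuery half of such a pipeline, assuming the google-cloud-bigquery Python client; the project and table names are placeholders, and the actual upload to Google Ads would go through the Google Ads API (Customer Match via OfflineUserDataJobService), which is not shown here:

```python
# Sketch: pull customer emails from BigQuery and normalize/hash them the way
# Customer Match expects (trimmed, lowercased, SHA-256). The table name is a
# placeholder; the upload to Google Ads itself is not shown.
import hashlib
from google.cloud import bigquery

bq = bigquery.Client()

rows = bq.query(
    "SELECT email FROM `my-project.crm.remarketing_audience`"  # placeholder table
).result()

hashed_emails = [
    hashlib.sha256(row["email"].strip().lower().encode("utf-8")).hexdigest()
    for row in rows
]

# These hashed identifiers would then be uploaded to a Customer Match user list
# via the official Google Ads API client (OfflineUserDataJobService).
print(f"Prepared {len(hashed_emails)} hashed identifiers for upload")
```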

Google Merchant Center - retrieve the BestSellers_TopProducts_ report without the BigQuery Data Transfer service?

I'm trying to find a way to retrieve a specific Google Merchant Center report (BestSellers_TopProducts_) and upload it to BigQuery as part of a specific ETL process we're developing for a customer we have at my workplace.
So far, I know you can set up the BigQuery Data Transfer service so that it automates the process of downloading this report, but I was wondering if I could accomplish the same with Python and some of Google's API libraries (like python-google-shopping). Then again, I may be overdoing it and setting up the service may simply be the way to go.
Is there a way to accomplish this rather than resorting to the aforementioned service?
On the other hand, and assuming the BigQuery Data Transfer service is the way to go, I see (in the examples) that you need to create and provide the dataset you're going to extract the report data to, so I guess the extraction is limited to the GCP project you're working with.
I mean... you can't extract the report data for a third party even if you had the proper service account credentials, right?
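If the transfer service does turn out to be the way to go, the transfer itself can at least be created programmatically instead of through the console. A minimal sketch with the google-cloud-bigquery-datatransfer client follows; the project, dataset and merchant ID are placeholders, and the exact params keys for the Merchant Center data source are assumptions to verify against that data source's documentation:

```python
# Sketch: create a Merchant Center -> BigQuery transfer config in code.
# Project, dataset and merchant ID are placeholders; the params keys for the
# "merchant_center" data source are assumptions to double-check in the docs.
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="merchant_center_reports",
    display_name="Merchant Center best sellers",
    data_source_id="merchant_center",
    params={
        "merchant_id": "1234567",      # assumed key name
        "export_best_sellers": True,   # assumed key name
    },
)

created = client.create_transfer_config(
    parent=client.common_project_path("my-project"),
    transfer_config=transfer_config,
)
print("Created transfer:", created.name)
```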

Connecting on-prem MQ to Google Cloud Platform

This is more of a conceptual question, as there is no relevant documentation available. We have an on-prem IBM MQ from which we need to transfer data to our cloud storage bucket (GCP/AWS). What could be possible solutions in this case? Any help or direction would be appreciated. Thank you!
I'm assuming you can reach your goal once the MQ data has been converted to a format supported by BigQuery.
You can refer to the Google documentation for a full guide on loading data from local files. You can upload a file via the GCP Console or using a programming language that matches your on-prem setup. There is also a variety of load options to choose from depending on the data file, and the guide also covers the permissions required to use BigQuery. A minimal sketch of the local-file load is shown below.
If you require authentication, you can check the BigQuery Authentication Guide.
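Here is that sketch, assuming the MQ messages have already been exported as newline-delimited JSON and using placeholder project/dataset/table and file names:

```python
# Sketch: load a local newline-delimited JSON file (e.g. exported MQ messages)
# into BigQuery. Project, dataset, table and file names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.mq_data.messages"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # let BigQuery infer the schema
)

with open("mq_export.json", "rb") as source_file:
    load_job = client.load_table_from_file(source_file, table_id, job_config=job_config)

load_job.result()  # wait for the load job to finish
print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```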

Google Tag Manager clickstream to Amazon

So the question has more to do with which services I should be using to get efficient performance.
Context and goal:
What I am trying to do exactly is use a Tag Manager custom HTML tag so that, after each Universal Analytics tag (event or pageview) fires, it sends my own EC2 server an HTTP request with a payload similar to what is sent to Google Analytics.
What I have thought, planned and researched so far:
At this moment I have two big options:
Use AWS Kinesis, which seems like a great idea, but the problem is that it only drops the information into one Redshift table and I would like to have at least 4 or 5 so I can differentiate pageviews from events, etc. My solution to this would be to split each request into a separate stream on the server side.
The other option is to use Spark + Kafka. (Here is a detailed explanation)
I know at some point this means I'm building a parallel Google Analytics, with everything that implies. I still need to decide what information (I'm referring to which parameters, for example source and medium) I should send, how to format it correctly, and how to process it correctly.
Questions and debate points:
Which option is more efficient and easier to set up?
Should I send this information directly from the page/app server, or send it from the user side by making the browser do requests as I explained before?
Has anyone done something like this in the past? Any personal recommendations?
You'd definitely benefit from the Google Analytics customTask feature instead of custom HTML. More on this from Simo Ahava. Also, Google BigQuery is quite a popular destination for streaming hit data, since it allows many on-the-fly computations such as sessionization, and there are many ready-to-use cases for BQ.
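If you do end up streaming hits into BigQuery, the insert itself is small with the Python client. A minimal sketch with placeholder project, table and field names (in practice the row would mirror whatever GA hit parameters you decide to forward):

```python
# Sketch: stream a single hit into BigQuery. Table and field names are
# placeholders; the payload would mirror the GA hit parameters you forward.
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.analytics.hits"

rows = [
    {"hit_type": "pageview", "page": "/home", "client_id": "123.456"},
]

errors = client.insert_rows_json(table_id, rows)  # streaming insert
if errors:
    print("Insert failed:", errors)
```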

Google Apps - Data Transfer API - transfer only some resources

I'm trying to use the new Data Transfer API for a Google Apps domain, and I would like to transfer some specific Google Drive files from one user to another. It seems we can use this API only to transfer a "full service" (e.g. all files from Google Drive) and not just some specific files.
Is my understanding of this API correct, or is it possible to limit the transfer to specific resources?
Thank you.
You're correct. The API enables you to transfer ownership of application data (currently Drive documents and Google+ pages) in bulk. It essentially allows you to automate the manual ownership transfer task documented here. You might want to read this blog post, which has some useful background information.
The only way to achieve what you want is to use the Drive API (not to be confused with the Drive SDK).
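A minimal sketch of that per-file transfer with the Drive v3 API and the Python client, assuming a service account with domain-wide delegation; the key file, file ID and email addresses are placeholders:

```python
# Sketch: transfer ownership of a single Drive file via the Drive v3 API.
# The service-account key file, impersonated user, file ID and new owner's
# email address are all placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/drive"],
).with_subject("current.owner@example.com")  # impersonate the current owner

drive = build("drive", "v3", credentials=creds)

drive.permissions().create(
    fileId="FILE_ID",
    transferOwnership=True,  # required when granting the "owner" role
    body={
        "type": "user",
        "role": "owner",
        "emailAddress": "new.owner@example.com",
    },
).execute()
```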