I am trying to import creatives from Campaign Manager into GCP using Cloud Pub/Sub. There are more details on this page https://cloud.google.com/solutions/creative-analysis-at-scale-with-google-cloud-and-machine-learning, but it does not give a clear picture of how to import the creatives.
What's the step-by-step process for that?
That page suggests storing the creatives in Google Cloud Storage. You'll need to upload your creatives into a Cloud Storage bucket. There are a variety of ways to do so (via the Cloud Console, via the gsutil tool, or via Cloud Storage REST APIs), discussed here.
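For example, the upload step can be scripted with the google-cloud-storage Python client. This is only a minimal sketch; the bucket name and file paths are placeholders:

```python
from google.cloud import storage  # pip install google-cloud-storage

# Placeholder names -- replace with your own bucket and creative files.
BUCKET_NAME = "my-creatives-bucket"
LOCAL_FILES = ["creative_001.jpg", "creative_002.png"]

client = storage.Client()  # uses Application Default Credentials
bucket = client.bucket(BUCKET_NAME)

for path in LOCAL_FILES:
    # The object name in the bucket mirrors the local file name.
    blob = bucket.blob(path)
    blob.upload_from_filename(path)
    print(f"Uploaded gs://{BUCKET_NAME}/{path}")
```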
You could set up Pub/Sub notifications on your Cloud Storage bucket, configured to automatically publish a Pub/Sub message each time a creative is uploaded. Downstream (for example, in a Cloud Function or Dataflow pipeline), you can extract the Cloud Storage URI from the Pub/Sub message, feed it into the Vision API, and write the results to BigQuery.
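A rough sketch of both pieces, assuming a placeholder bucket (my-creatives-bucket), a placeholder topic (creative-uploads), and a Pub/Sub-triggered Cloud Function:

```python
import base64
import json

from google.cloud import storage, vision
from google.cloud.storage.notification import JSON_API_V1_PAYLOAD_FORMAT

# --- One-time setup: publish a Pub/Sub message for every new object. --------
def create_upload_notification():
    bucket = storage.Client().bucket("my-creatives-bucket")  # placeholder
    notification = bucket.notification(
        topic_name="creative-uploads",            # placeholder topic
        payload_format=JSON_API_V1_PAYLOAD_FORMAT,
        event_types=["OBJECT_FINALIZE"],          # only newly uploaded objects
    )
    notification.create()

# --- Pub/Sub-triggered Cloud Function: annotate each new creative. ----------
def annotate_creative(event, context):
    # With the JSON_API_V1 payload format, the message data carries the
    # uploaded object's metadata, including its bucket and name.
    metadata = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    gcs_uri = f"gs://{metadata['bucket']}/{metadata['name']}"

    client = vision.ImageAnnotatorClient()
    image = vision.Image(source=vision.ImageSource(image_uri=gcs_uri))
    response = client.label_detection(image=image)

    # From here the labels could be written to BigQuery.
    for label in response.label_annotations:
        print(gcs_uri, label.description, label.score)
```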
I have some objects in a Google Cloud Storage bucket that are publicly downloadable on URLs like https://storage.googleapis.com/blahblahblah. I want to set up a monitoring rule that lets me see how often one of these objects is being downloaded. I have turned on the Data Read audit log as mentioned here, but I don't see any logs when I download the object from the storage.googleapis.com link. I have another bucket where downloads are performed through the Node Google Cloud Storage client library, and I can see download logs from that bucket, so it seems like downloads from the public URL don't get logged.
I also don't see a way to specify the object in a particular bucket when setting up an alert in Google Cloud. Is creating a new bucket solely for this object the best way to try to set up monitoring for the number of downloads, or is there something I'm missing here?
Google Cloud Audit logs do not track objects that are public (allUsers or allAuthenticatedUsers).
Enable usage logs to track access to public objects.
See Should you use usage logs or Cloud Audit Logs? in the documentation for guidance on choosing between the two.
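As a sketch, usage logging can be enabled with the google-cloud-storage Python client (the bucket names below are placeholders). Per the documentation, the log bucket must also grant write access to the cloud-storage-analytics@google.com group:

```python
from google.cloud import storage

# Placeholder bucket names.
MONITORED_BUCKET = "my-public-bucket"
LOG_BUCKET = "my-usage-logs-bucket"

client = storage.Client()

# The log bucket must already exist and allow cloud-storage-analytics@google.com
# to create objects in it (see the usage logs documentation).
bucket = client.bucket(MONITORED_BUCKET)
bucket.enable_logging(LOG_BUCKET, object_prefix="public-object-logs")
bucket.patch()  # persist the logging configuration

print(bucket.get_logging())
```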
I am working on a multi-cloud (AWS and GCP) pet project that uses a serverless architecture.
Files generated by the business logic within GCP (using Cloud Functions and Pub/Sub) are stored in Cloud Storage. I want to ingest these files dynamically from Cloud Storage into an AWS S3 bucket.
One possible way is to use the gsutil tool (Exporting data from Google Cloud Storage to Amazon S3), but this would require a compute instance and running gsutil commands manually, which I want to avoid.
In answering this I'm reminded a bit of a Rube Goldberg type setup but I don't think this is too bad.
From the Google side you would create a Cloud Function that is notified when a new file is created. You would use the Object Finalize event. This function would get the information about the file and then call an AWS Lambda fronted by AWS API Gateway.
The GCP function would pass the bucket and file information to the AWS Lambda. On the AWS side, the Lambda would use your GCP credentials and the GCP client libraries to download the file and then upload it to S3.
Something like the following sketch of the two functions (deployed separately); the API Gateway URL, bucket names, and credentials handling are placeholders, and authentication from the Cloud Function to API Gateway is omitted:
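```python
import json
import os

import boto3
import requests
from google.cloud import storage

# --- GCP Cloud Function (Object Finalize trigger) ----------------------------
API_GATEWAY_URL = "https://example.execute-api.us-east-1.amazonaws.com/prod/ingest"  # placeholder

def notify_aws(event, context):
    """Called by Cloud Storage when a new object is finalized."""
    payload = {"bucket": event["bucket"], "name": event["name"]}
    # Authentication to API Gateway (API key, IAM, etc.) is omitted here.
    resp = requests.post(API_GATEWAY_URL, json=payload, timeout=30)
    resp.raise_for_status()

# --- AWS Lambda behind API Gateway --------------------------------------------
S3_BUCKET = os.environ.get("S3_BUCKET", "my-destination-bucket")  # placeholder

def lambda_handler(event, context):
    """Downloads the object from Cloud Storage and re-uploads it to S3."""
    body = json.loads(event["body"])
    gcs_bucket, object_name = body["bucket"], body["name"]

    # Assumes GCP service-account credentials are available to the Lambda
    # (e.g. via GOOGLE_APPLICATION_CREDENTIALS).
    blob = storage.Client().bucket(gcs_bucket).blob(object_name)
    local_path = f"/tmp/{os.path.basename(object_name)}"
    blob.download_to_filename(local_path)

    boto3.client("s3").upload_file(local_path, S3_BUCKET, object_name)
    return {"statusCode": 200, "body": json.dumps({"copied": object_name})}
```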
All serverless on both GCP and AWS. Testing isn't bad as you can keep them separate - make sure that GCP is sending what you want and make sure that AWS is parsing and doing the correct thing. There is likely some authentication that needs to happen from the GCP cloud function to API gateway. Additionally, the API gateway can be eliminated if you're ok pulling AWS client libraries into the GCP function. Since you've got to pull GCP libraries into the AWS Lambda this shouldn't be much of a problem.
I would like to get all the different prices for different services in GCP in a REST API call.
Do I have to use the API of each service to get this, or is there one call that could get all the info on the different GCP services?
Have a look at the documentation Get Started with the Cloud Billing API:
For the Cloud Billing Catalog API:
Programmatic access to the entire public Google Cloud catalog consisting of:
Billable SKUs
Public pricing
Relevant metadata
You can access the Cloud Billing API in one of the following ways:
REST API.
RPC API.
More details can be found in the section Get Google Cloud pricing information:
This page shows you how to use the Cloud Billing Catalog API to:
Get a list of all public services including relevant metadata about each service.
Get a list of all public SKUs within a service including:
Human readable description of the SKU.
Public pricing of the SKU.
Regions where the SKU is available for purchase.
Categorization data about the SKU.
with examples of how to do it.
Keep in mind that calling the Cloud Billing Catalog API requires an API Key.
You can use the Cloud Billing Catalog API with your existing cost management tools, as well as to reconcile list pricing rates when you export billing data to Google BigQuery.
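As a rough illustration of those two steps, both Catalog API endpoints can be called with plain REST requests; the API key below is a placeholder:

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder -- create an API key in the Cloud Console
BASE = "https://cloudbilling.googleapis.com/v1"

# 1. List all public services (paginated).
services = []
page_token = ""
while True:
    resp = requests.get(
        f"{BASE}/services",
        params={"key": API_KEY, "pageToken": page_token},
        timeout=30,
    )
    resp.raise_for_status()
    data = resp.json()
    services.extend(data.get("services", []))
    page_token = data.get("nextPageToken", "")
    if not page_token:
        break

print(f"{len(services)} services found")

# 2. List the SKUs (with pricing) for one service, e.g. the first one.
service_name = services[0]["name"]  # e.g. "services/XXXX-XXXX-XXXX"
resp = requests.get(f"{BASE}/{service_name}/skus", params={"key": API_KEY}, timeout=30)
resp.raise_for_status()
for sku in resp.json().get("skus", []):
    print(sku["description"], sku.get("pricingInfo", [{}])[0].get("pricingExpression"))
```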
My question is about how I can set up my Cloud Storage bucket to retrieve data from my Campaign Manager account. I aim to process Campaign Manager report data in BigQuery, combining it with other data sources.
From the documentation, it seems this is possible with the BigQuery Data Transfer Service, but I first need the data files to be stored in a Cloud Storage bucket; then Data Transfer can load the data into BigQuery.
So how can I get Campaign Manager data into Google Cloud Storage?
Have you already tried following this documentation to set up the BigQuery Data Transfer Service for Campaign Manager? In the Before you begin section, you'll need to contact either your Campaign Manager reseller or Campaign Manager support to set up the Campaign Manager DTv2 files.
After completing this step, you will receive a Cloud Storage bucket name similar to the following: dcdt_-dcm_account123456
After doing this, you may now complete the rest of the documentation.
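Once that bucket exists, the transfer itself can also be created programmatically with the BigQuery Data Transfer client. This is only a sketch; the data_source_id and parameter names below are assumptions that you should verify against the Campaign Manager transfer documentation:

```python
from google.cloud import bigquery_datatransfer  # pip install google-cloud-bigquery-datatransfer

PROJECT_ID = "my-project"          # placeholder
DATASET_ID = "campaign_manager"    # placeholder destination dataset

client = bigquery_datatransfer.DataTransferServiceClient()

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id=DATASET_ID,
    display_name="Campaign Manager transfer",
    # ASSUMPTION: the data source id and params for Campaign Manager -- confirm
    # the exact values in the transfer setup UI or documentation.
    data_source_id="dcm_dt",
    params={
        "bucket": "dcdt_-dcm_account123456",  # bucket name provided by CM support
        "network_id": "123456",               # placeholder Campaign Manager network id
    },
)

created = client.create_transfer_config(
    parent=client.common_project_path(PROJECT_ID),
    transfer_config=transfer_config,
)
print("Created transfer:", created.name)
```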
AWS Pinpoint Analytics appears to have replaced Amazon Mobile Analytics. In Mobile Analytics, you were able to create custom dashboards.
I'm struggling to find the feature in AWS Pinpoint. I'm assuming it's in there somewhere, but alas, I haven't found it yet.
@D.Patrick, you can create custom dashboards with Pinpoint data, but not directly within the Pinpoint console. That is, you would first need to export your Pinpoint event data to persistent storage (e.g., S3 or Redshift) using Amazon Kinesis. Once the data is in S3, you can use analytics tools to further analyze or visualize it. Such analytics tools offered by AWS include Amazon QuickSight and Amazon Athena; other (non-AWS) analytics tools include Splunk.
Check out the blog by AWS on this topic:
https://aws.amazon.com/blogs/messaging-and-targeting/creating-custom-pinpoint-dashboards-using-amazon-quicksight-part-1/
The three parts of this series describe in detail how to use Python 3 with AWS Lambda to create the custom dashboards.
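For instance, once the Kinesis-exported events are in S3 and exposed as a table (e.g. via a Glue crawler), they can be queried from code with boto3 and Athena. This is a minimal sketch; the database, table, and output-bucket names are placeholders:

```python
import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Placeholder query over a table defined on the Kinesis-exported Pinpoint events.
QUERY = """
    SELECT event_type, COUNT(*) AS events
    FROM pinpoint_events
    GROUP BY event_type
    ORDER BY events DESC
"""

execution = athena.start_query_execution(
    QueryString=QUERY,
    QueryExecutionContext={"Database": "pinpoint_analytics"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results-bucket/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes, then print the result rows.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    for row in athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```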