Google Cloud Pub/Sub aggregated sink - google-cloud-platform

I find the docs for gcloud detailed but hardly the easiest to follow. Is anyone out there able to tell me what I'm doing wrong that means my topic never receives the logs I believe it should? Many thanks:
Create a project called logproj, enable the Pub/Sub API, and create a topic called mytopic.
Create an organization-wide aggregated sink with the Pub/Sub topic I just created as the destination, with the sink containing admin activity logs only:
gcloud logging sinks create mysink pubsub.googleapis.com/projects/logproj-12345/topics/mytopic \
    --include-children --organization=123456789123 \
    --log-filter='"logName:activity" AND logName:"/logs/cloudaudit.googleapis.com%2Factivity"'
The above command completes successfully and gives me a service account called o78732a92983-3234626@gcp-sa-logging.iam.gserviceaccount.com, which I need to add to the topic and grant the Pub/Sub Publisher role, which I do via the web UI.
To test that the topic is receiving organization-wide activity logs, I create a test project, then use the Logs Viewer and select the Pub/Sub topic, but the only logs I can see are from me creating the topic in the logproj project.
What am I missing?
Thanks in advance.

Messages published to Pub/Sub cannot be found in the Stackdriver logs. If you want to view the messages published to your topic:
Create a subscription for your topic
Use the command line to pull messages: https://cloud.google.com/sdk/gcloud/reference/pubsub/subscriptions/pull
Alternatively, you could implement a simple subscriber. Code examples can be found here: https://cloud.google.com/pubsub/docs/pull
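For example, a minimal sketch with gcloud (the subscription name is an assumption; the project and topic match the ones above):

# Create a pull subscription on the topic the sink writes to
gcloud pubsub subscriptions create mysub \
    --project=logproj-12345 \
    --topic=mytopic

# Pull a few messages to confirm the audit logs are arriving
gcloud pubsub subscriptions pull mysub \
    --project=logproj-12345 \
    --limit=5 --auto-ack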

Related

GCP Deployment manager - Call HTTP endpoint when everything is deployed

I have a template that creates a few resources in GCP, and I want it to either call an HTTP endpoint or publish a message to a topic whenever the deployment completes. I've been checking different services all day, and couldn't find anything about it.
In AWS, it is quite easy to publish a message to an SNS topic, which is subscribed to an SQS queue, which in turn triggers a Lambda function. In Azure ARM templates, we can use the az CLI to invoke a web request and call an endpoint directly.
I couldn't find anything similar in GCP. Is there any way of calling an HTTP endpoint or a Cloud Function, or perhaps publishing a message to a topic, whenever a deployment finishes?
I really appreciate any help.
The best approach in GCP is to create a logging sink that uses a filter written in the Logging query language to send only the Deployment Manager logs to a Pub/Sub topic.
Here is an example of a sink that sends Deployment Manager logs to a previously created Pub/Sub topic:
gcloud logging sinks create $SINK_NAME pubsub.googleapis.com/projects/$YOUR_PROJECT/topics/$TOPIC \
--log-filter='resource.type="deployment"' --description="my sink"
Be careful to set the destination permissions, or you will not see the logs in the Pub/Sub topic.
Once the logs are arriving in the Pub/Sub topic, you can configure Cloud Pub/Sub Triggers to fire an HTTP call based on the content of the log.
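For instance, a minimal sketch of those two steps with gcloud (the writer-identity service account, function name, and entry point are assumptions; gcloud prints the real writer identity when the sink is created):

# Let the sink's writer identity publish to the topic
gcloud pubsub topics add-iam-policy-binding $TOPIC \
    --member="serviceAccount:o1234567890-123456@gcp-sa-logging.iam.gserviceaccount.com" \
    --role="roles/pubsub.publisher"

# Fire a function (which can in turn call your HTTP endpoint) on every log message
gcloud functions deploy notify-on-deploy \
    --runtime=python39 \
    --trigger-topic=$TOPIC \
    --entry-point=handle_log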

Get cloud asset details via GCP pubsub

I have created a Pub/Sub topic and subscriber and configured a public API for receiving the push notification. I have been able to receive a sample push notification in the API through the topic. I would like to know if there is any way I can receive all GCP asset creation/update details as push notifications via Pub/Sub. I need all the details like asset creation date, who created it, project, etc. Is this doable with GCP Pub/Sub?
You have 2 solutions to achieve that:
Use Cloud Asset inventory
Use Audit logs
Cloud Asset Inventory monitoring is a built-in solution that publishes a message on each asset change.
The tradeoff is that, for now, not all asset types are supported. The most common ones are, but it depends on your needs.
The other solution is to create a Cloud Logging sink to Pub/Sub with a filter that matches only the system audit logs.
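As an illustration, minimal sketches of both options with gcloud (project, topic, feed name, and asset types are assumptions):

# Option 1: an asset feed that publishes a message on every change to the listed asset types
gcloud asset feeds create my-asset-feed \
    --project=my-project \
    --pubsub-topic=projects/my-project/topics/asset-changes \
    --asset-types="compute.googleapis.com/Instance" \
    --content-type=resource

# Option 2: a logging sink that sends only the Admin Activity audit logs to the same topic
gcloud logging sinks create asset-audit-sink \
    pubsub.googleapis.com/projects/my-project/topics/asset-changes \
    --log-filter='logName:"cloudaudit.googleapis.com%2Factivity"'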

Give a Gmail-API outside our Google Cloud Pub/Sub publisher privileges

Let me try to explain to you what we are trying to do.
In short: we want to give Pub/Sub Publisher privileges (in our GoogleCloud) to a GMail-API that is outside of our GoogleCloud.
What we have:
Following instructions here: https://developers.google.com/gmail/api/quickstart/ruby
We've created a project for GMail-API and the credentials in GoogleCloud, let's call it Cloud-A. (We'll not own this side in a production environment; this project and cloud will be managed by our customers' IT department.)
Next, we followed this other guide https://developers.google.com/gmail/api/guides/push
We created a Pub/Sub topic in our GoogleCloud (Cloud-B). We own this portion, and it's the topic we want to subscribe to in order to listen for messages/notifications. (This topic is in a different account from the GMail-API that will be publishing messages, which is Cloud-A.)
So, following that last guide, it says we need to give permissions to gmail-api-push@system.gserviceaccount.com, and from my understanding that means I'm giving privileges to the GMail-API from Cloud-B to publish messages to Pub/Sub in Cloud-B.
What I can't figure out is a way to give the GMail-API from Cloud-A permission to publish messages to Pub/Sub in Cloud-B.
To wrap up, I want to listen to Pub/Sub in Cloud-B that will receive notifications from Gmail-API in Cloud-A.
We used this https://github.com/googleapis/google-api-ruby-client/blob/master/generated/google/apis/gmail_v1/service.rb#L144 and I get an error saying the topic doesn't exist (probably because it is in Cloud-B and I'm configuring the Cloud-A GMail-API).
I hope I was clear enough. We are not looking to give another project inside the same Google Cloud access to a Pub/Sub topic; it isn't even a service that we wrote, since it is the GMail-API, and the only thing we are allowed to do is send it the name of the topic we want it to publish to.
I'm not familiar with how the GMail-API publishes to Pub/Sub, but, if you have already figured out how to publish from GMail-API in project Cloud-A to a Pub/Sub topic in Cloud-A, you may try the following workarounds:
Alternative A:
Create the topic (topic-A) in project Cloud-A.
Create a pull subscription (subs-A) associated with topic-A, also in project Cloud-A.
Create a service account (account-B) in project Cloud-B and grant it the Pub/Sub subscriber role for subscription subs-A.
Make your consumers (e.g. AppEngine, GKE, GCE) use service account account-B to pull messages from subs-A.
Alternative B:
Create the topic (topic-A) in project Cloud-A.
Create a push subscription (subs-A) associated with topic-A, pointing to an endpoint of a service hosted in project Cloud-B (e.g. GCE, GKE, App Engine, Cloud Functions, etc.).
Alternative C:
Create the topic (topic-A) in project Cloud-A.
Create a pull or push subscription (subs-B) in project Cloud-B associated with topic-A in project Cloud-A. The user creating this subscription should have the Pub/Sub Editor role granted on topic-A.
Consume the messages from subs-B.
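As an illustration of Alternative C, a minimal sketch with gcloud (project and resource names are assumptions; run it as a user with the required role on topic-A):

# Create a subscription in Cloud-B attached to the topic in Cloud-A
gcloud pubsub subscriptions create subs-B \
    --project=cloud-b \
    --topic=topic-A \
    --topic-project=cloud-a

# Consume the messages from Cloud-B
gcloud pubsub subscriptions pull subs-B --project=cloud-b --limit=5 --auto-ack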

Publish to different pubsub topic in Google cloud via mosquitto broker?

I have a Pub/Sub topic named
projects/project_name/topics/Sandbox_PubSub
and I also have a subscription on that topic.
and then I have a Cloud Function named Sandbox_function that is triggered by a publish on that topic.
I have Python code where I publish data to this topic via a Mosquitto broker, using the MQTT topic
'/devices/device_id/events/Sandbox_PubSub'
It's working perfectly fine, and I am receiving data in the Cloud Function as well as in the subscription.
Now I created another topic named
projects/project_name/topics/Pairing
and then a subscription and another Cloud Function called Pairing_Function.
Now when I publish my data on
'/devices/device_id/events/Pairing'
instead of getting that data in Pairing_Function, I am getting the data in Sandbox_Function with a subFolder of Pairing, and in Sandbox_Subscription instead of the Pairing subscription.
What am I doing wrong here, and how can I get data into different topics?
Thanks a lot!
I found that we need to define the topics in the registry too
(see: IoT Core -> Registry Edit -> Add more telemetry topics).
We need to map each subFolder to its own topic.
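The same mapping can also be done with gcloud; a minimal sketch (registry name and region are assumptions):

# Route the Pairing subfolder to its own topic; the entry without a subfolder
# acts as the default topic for everything else
gcloud iot registries update my-registry \
    --region=us-central1 \
    --event-notification-config=subfolder=Pairing,topic=Pairing \
    --event-notification-config=topic=Sandbox_PubSub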

How can I forward a GCP Pubsub subscription's messages to another topic?

I would like to be able to forward a Google Cloud Platform (GCP) Pub/Sub subscription's messages to another GCP topic, which is possibly in another GCP project.
Is this possible via a push endpoint URL? (What would the URL of the destination topic be?)
Or some other easy configuration via the API?
(Workaround) Alternatively, how would you provide this functionality?
There is a Dataflow template called Cloud_PubSub_to_Cloud_PubSub that reads messages from a Pub/Sub subscription (inputSubscription) and publishes them to a Pub/Sub topic (outputTopic).
Be sure that the service account used to launch the Dataflow job has the right permissions (roles/pubsub.subscriber in the project where inputSubscription is defined or directly on inputSubscription, and roles/pubsub.publisher in the project containing outputTopic or directly on outputTopic).
You can take a look at the code on GitHub if you are familiar with Java.
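For reference, a minimal sketch of launching that template with gcloud (job name, region, and the fully qualified subscription/topic names are assumptions):

# Run the Google-provided Pub/Sub-to-Pub/Sub template
gcloud dataflow jobs run pubsub-forwarder \
    --gcs-location=gs://dataflow-templates/latest/Cloud_PubSub_to_Cloud_PubSub \
    --region=us-central1 \
    --parameters=inputSubscription=projects/project-a/subscriptions/source-sub,outputTopic=projects/project-b/topics/dest-topic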
The only other way to do this would be to write a subscriber that republishes the messages to the other topic as it receives them. There is no built-in way to forward messages from one topic to another.