How to get the response from the subscriber back to the producer - google-cloud-platform

I have implemented a model using Google Pub/Sub where the producer sends in a message and the subscriber processes the message and sends back a response. But how do I map the response to the publisher that sent the request?
Are there any filters that can be put on the subscription so that the response can be tracked? Or is there another way of implementing this?

There is no way in Cloud Pub/Sub for the publisher to know that the subscriber processed the message. One of the main goals of the pub/sub paradigm is to decouple the publisher from the subscriber, and this kind of dependency tends to break that separation. Once the publish succeeds, the publisher knows that interested subscribers will receive the message.
If the publisher needs to know that the subscriber completed the processing of the message, then one way to accomplish this is to use a second Pub/Sub topic on which those responses are sent. The subscriber on the original topic becomes the publisher on the response topic, and the original publisher becomes a subscriber to it.
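If you take the second-topic approach, a common pattern is to attach a correlation ID as a message attribute so the original publisher can match each response to the request it sent. Below is a minimal Java sketch of the request side using the google-cloud-pubsub client; the project and topic names (my-project, request-topic, response-topic) and the correlationId attribute are made up for illustration, and the subscriber on the request topic would be expected to copy that attribute onto the response it publishes.
import com.google.cloud.pubsub.v1.Publisher;
import com.google.protobuf.ByteString;
import com.google.pubsub.v1.PubsubMessage;
import com.google.pubsub.v1.TopicName;
import java.util.UUID;

public class RequestPublisher {
  public static void main(String[] args) throws Exception {
    // Hypothetical project and topic names -- replace with your own.
    TopicName requestTopic = TopicName.of("my-project", "request-topic");
    Publisher publisher = Publisher.newBuilder(requestTopic).build();

    // Attach a correlation ID so the eventual response can be matched
    // back to this request.
    String correlationId = UUID.randomUUID().toString();
    PubsubMessage request =
        PubsubMessage.newBuilder()
            .setData(ByteString.copyFromUtf8("do some work"))
            .putAttributes("correlationId", correlationId)
            .build();
    publisher.publish(request).get(); // wait until the publish succeeds
    publisher.shutdown();

    // A separate Subscriber on "response-topic" would then look for a
    // message whose "correlationId" attribute equals correlationId.
  }
}
On the response side, the original publisher would keep track of the correlation IDs it has outstanding and match incoming responses against them.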

Related

GCP PubSub understanding filters

In my understanding, Pub/Sub filters are supposed to reduce the number of messages sent to a specific subscription. We currently observe behaviour that we didn't expect.
Assuming there is a PubSub Topic "XYZ" and a subscription to that topic "XYZ-Sub" with a filter attributes.someHeader = "x"
There are 2 messages published to that topic:
The first one with attributes.someHeader = "a", the second one with attributes.someHeader = "x".
I expect that only message 2 will be delivered to the subscription, as message 1 does not match the filter.
If that is not the case and both messages still get delivered (which is what we currently observe):
The GCP console shows a rising number of unacked messages on the subscription when no client is connected. Pulling these messages in the GCP console removes them without showing any received messages, which makes me assume that the filters are applied when pulling messages.
Are filters evaluated on the Pub/Sub client side and not at the topic level?
What is the point in using filters with pub/sub?
Will the delivery of the unwanted message (the bytes of the message) be billed?
Filtering in Cloud Pub/Sub only delivers messages that match the filter to subscribers. The filters are applied in the Pub/Sub service itself, not in the client. They allow you to limit the set of messages delivered to subscribers when the subscriber only wants to process a subset of the messages.
In your example, only the message with attributes.someHeader = "x" should be delivered. However, note that, as the documentation states, the backlog metrics might include messages that don't match the filter. Such messages will not be delivered to subscribers, but may still show up in the backlog metrics for a time.
You do get charged the Pub/Sub message delivery price for messages that were not delivered. However, you do not pay any network fees for them, nor do you end up paying for any compute to process messages you do not receive.
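As an illustration of the filter being applied in the service, here is a hedged Java sketch that creates a subscription equivalent to XYZ-Sub with the filter from the question, using SubscriptionAdminClient from the google-cloud-pubsub library; the project name is a placeholder.
import com.google.cloud.pubsub.v1.SubscriptionAdminClient;
import com.google.pubsub.v1.ProjectSubscriptionName;
import com.google.pubsub.v1.Subscription;
import com.google.pubsub.v1.TopicName;

public class CreateFilteredSubscription {
  public static void main(String[] args) throws Exception {
    // Placeholder project and resource names.
    TopicName topic = TopicName.of("my-project", "XYZ");
    ProjectSubscriptionName subscription =
        ProjectSubscriptionName.of("my-project", "XYZ-Sub");

    try (SubscriptionAdminClient admin = SubscriptionAdminClient.create()) {
      // The filter is evaluated by the Pub/Sub service; only messages whose
      // someHeader attribute equals "x" are delivered to this subscription.
      admin.createSubscription(
          Subscription.newBuilder()
              .setName(subscription.toString())
              .setTopic(topic.toString())
              .setFilter("attributes.someHeader = \"x\"")
              .build());
    }
  }
}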

Google Cloud Pub/Sub retrieve message by ID

Problem: My use case is that I want to publish thousands of messages to Google Cloud Pub/Sub with a 5-minute retention period, but only retrieve specific messages by their ID. So a Cloud Function will retrieve one message by ID using the Node.js SDK, and all the unprocessed messages will be deleted by the retention policy. All the current examples I can find handle arbitrary messages from the subscriber.
Is it possible to just pull one message by ID (or any other metadata) and close the connection?
There is no way to retrieve individual messages by ID, no. It doesn't really fit into the expected use cases for Cloud Pub/Sub, where publishers and subscribers are meant to be decoupled, meaning the subscriber inherently doesn't know the message IDs prior to receiving the messages.
You may instead want to transmit the messages themselves via whatever mechanism you are using to make the subscribers aware of the message IDs. Or, if you know at publish time which messages will ultimately need to be retrieved, you could add an attribute to the message to indicate this and use filtering.
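A rough sketch of that second suggestion, assuming a hypothetical needsRetrieval attribute and placeholder resource names: tag the relevant messages at publish time, and have the retrieving subscription created with the filter attributes.needsRetrieval = "true" so it only ever sees those messages.
import com.google.cloud.pubsub.v1.Publisher;
import com.google.protobuf.ByteString;
import com.google.pubsub.v1.PubsubMessage;
import com.google.pubsub.v1.TopicName;

public class TaggedPublish {
  public static void main(String[] args) throws Exception {
    // Placeholder project and topic names.
    Publisher publisher =
        Publisher.newBuilder(TopicName.of("my-project", "my-topic")).build();

    // Mark at publish time that this message should reach the "retrieval"
    // subscription, which would be created with the filter
    // attributes.needsRetrieval = "true".
    PubsubMessage message =
        PubsubMessage.newBuilder()
            .setData(ByteString.copyFromUtf8("payload"))
            .putAttributes("needsRetrieval", "true")
            .build();
    publisher.publish(message).get();
    publisher.shutdown();
  }
}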

Is it possible to kick off two different Cloud Builds based on subscriptions to the same topic?

Currently I have a Cloud Build application which is being kicked off by a Pub/Sub trigger subscribing to, e.g., topic1.
I would like to know if I can kick off another Cloud Build application by subscribing to the same topic. Is there a way to configure the message (or the trigger) so that if message1 is published to topic1, then cloudbuild1 is kicked off, and if message2 is published to topic1, then cloudbuild2 is kicked off?
Kind regards
marco
When you create a subscription on a topic, all the messages published to the topic are replicated to each subscription.
Therefore, if you have TOPIC with Sub1 and Sub2 and you publish one message to TOPIC, you will have this message in both Sub1 and Sub2.
However, you can set up a filter on messages when you create a subscription. You can set this filter only at creation time and you can't update it later; you need to delete and recreate the subscription if you want to change the filter.
In addition, you can filter only on message attributes, not on the message body content.
Therefore, when using filters, think your filter through wisely from the beginning, and when you publish a message to TOPIC, add attributes that allow you to route the messages to the correct subscription.
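As a concrete sketch of that routing, the two subscriptions could be created with complementary filters on a hypothetical buildTarget attribute; the Cloud Build trigger for cloudbuild1 would then listen on sub-build1 and the one for cloudbuild2 on sub-build2. All names below are placeholders.
import com.google.cloud.pubsub.v1.SubscriptionAdminClient;
import com.google.pubsub.v1.Subscription;

public class CreateRoutingSubscriptions {
  public static void main(String[] args) throws Exception {
    // Placeholder project, topic and subscription names.
    String topic = "projects/my-project/topics/topic1";
    try (SubscriptionAdminClient admin = SubscriptionAdminClient.create()) {
      // Messages published with attribute buildTarget=build1 go to sub-build1...
      admin.createSubscription(
          Subscription.newBuilder()
              .setName("projects/my-project/subscriptions/sub-build1")
              .setTopic(topic)
              .setFilter("attributes.buildTarget = \"build1\"")
              .build());
      // ...and those with buildTarget=build2 go to sub-build2.
      admin.createSubscription(
          Subscription.newBuilder()
              .setName("projects/my-project/subscriptions/sub-build2")
              .setTopic(topic)
              .setFilter("attributes.buildTarget = \"build2\"")
              .build());
    }
  }
}
Messages would then be published to topic1 with the buildTarget attribute set to "build1" or "build2" accordingly.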

Google Pub/Sub sending to only one subscription every ~5 messages

I've made 3 clients connected to a subscription, and one publisher. Two of the subscriptions are shown in a terminal, and one is not visible as it is hosted on a DigitalOcean Droplet. It seems that every 5 messages it switches which subscriber actually receives the message, which should not happen. I've also varied the speed and it's always about 5 messages.
Here is the code used on all clients for subscriptions:
sub.on("message", (msg) => {
  console.log(`Message:1 ${msg.data.toString("utf-8")}`)
  msg.ack()
})
And here is the code for publishing:
console.log("send")
topic.publish(Buffer.from("hey"), {
  channelId: "641273551806267403"
})
In Cloud Pub/Sub, a subscription is a logical entity that wants all messages published to the topic with which the subscription is associated. A subscriber is a client that receives messages on behalf of a subscription. When there are multiple subscribers receiving messages for a single subscription, then each subscriber receives a subset of the messages. This is the load balancing case, where one uses multiple subscribers to process messages at scale; if more messages need to be supported, one just turns up more subscribers to receive messages from the same subscription.
When a topic has multiple subscriptions, then every message has to be sent to a subscriber receiving messages on behalf of each subscription. This is the fan out use case.
[Diagram: the left side shows load balancing, the right side shows fan out.]
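A minimal Java sketch of the receiving side, with placeholder names: for load balancing you run this same process several times against one shared subscription, so each instance gets a subset of the messages; for fan out you instead give each distinct consumer its own subscription on the topic, and each subscription receives every message.
import com.google.cloud.pubsub.v1.AckReplyConsumer;
import com.google.cloud.pubsub.v1.MessageReceiver;
import com.google.cloud.pubsub.v1.Subscriber;
import com.google.pubsub.v1.ProjectSubscriptionName;
import com.google.pubsub.v1.PubsubMessage;

public class WorkerProcess {
  public static void main(String[] args) {
    // Load balancing: several copies of this process all point at ONE
    // subscription and split the messages between them.
    // Fan out: each distinct consumer gets its OWN subscription on the
    // topic and therefore sees every message.
    ProjectSubscriptionName subscription =
        ProjectSubscriptionName.of("my-project", "shared-subscription");

    MessageReceiver receiver =
        (PubsubMessage message, AckReplyConsumer consumer) -> {
          System.out.println("Got: " + message.getData().toStringUtf8());
          consumer.ack();
        };

    Subscriber subscriber = Subscriber.newBuilder(subscription, receiver).build();
    subscriber.startAsync().awaitRunning();
    subscriber.awaitTerminated(); // block so the process keeps receiving
  }
}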

Google pubsub acknowledgement from a different client

Here is what I need to accomplish:
Get some messages in System 1 using a pull subscription.
Send each message along with its acknowledgement ID to System 2.
Send the acknowledgement to the subscription from System 2.
So, basically, I will create a new Pub/Sub client and send the acknowledgement. How can I make this request?
To complement your answer, I just realized that your client is a Pub/Sub Java class and the method execute() belongs to an instance of Acknowledge.
I found a full example that pulls messages from Pub/Sub. In fact, the pullMessages() method in that example contains the statement you mentioned. Digging into the Java framework, this link mentions how the method execute() is used.
I added the execute() method to the acknowledgment operation:
client.projects().subscriptions()
.acknowledge(getFullSubscriptionsName(config), acknowledgeRequest)
.execute();
The problem was that the execute() method was not being invoked, so the acknowledge request was never actually sent.
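For completeness, here is a hedged sketch of the full flow using the newer google-cloud-pubsub SubscriberStub rather than the legacy REST client shown above; the subscription name is a placeholder. System 1 pulls and forwards each ack ID together with the message, and System 2 builds its own client and acknowledges with the forwarded ID.
import com.google.cloud.pubsub.v1.stub.GrpcSubscriberStub;
import com.google.cloud.pubsub.v1.stub.SubscriberStub;
import com.google.cloud.pubsub.v1.stub.SubscriberStubSettings;
import com.google.pubsub.v1.AcknowledgeRequest;
import com.google.pubsub.v1.PullRequest;
import com.google.pubsub.v1.PullResponse;
import com.google.pubsub.v1.ReceivedMessage;

public class CrossSystemAck {
  // Placeholder subscription name.
  static final String SUBSCRIPTION =
      "projects/my-project/subscriptions/my-subscription";

  public static void main(String[] args) throws Exception {
    SubscriberStubSettings settings = SubscriberStubSettings.newBuilder().build();

    // System 1: pull messages and hand the payload plus ack ID to System 2.
    try (SubscriberStub stub = GrpcSubscriberStub.create(settings)) {
      PullResponse response = stub.pullCallable().call(
          PullRequest.newBuilder()
              .setSubscription(SUBSCRIPTION)
              .setMaxMessages(10)
              .build());
      for (ReceivedMessage m : response.getReceivedMessagesList()) {
        String ackId = m.getAckId();
        System.out.println("Forwarding ackId " + ackId + " with the payload");
      }
    }

    // System 2: with its own client, acknowledge using the forwarded ack ID.
    // This has to happen before the subscription's ack deadline expires,
    // otherwise the message will be redelivered to System 1.
    try (SubscriberStub stub = GrpcSubscriberStub.create(settings)) {
      stub.acknowledgeCallable().call(
          AcknowledgeRequest.newBuilder()
              .setSubscription(SUBSCRIPTION)
              .addAckIds("ack-id-received-from-system-1")
              .build());
    }
  }
}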