How do I peek scheduled messages in an Azure Service Bus topic?
According to this post:
https://github.com/MicrosoftDocs/azure-docs/issues/59641
Scheduled messages reside in the topic until the scheduled time, and users should peek into the topic if they want to see the scheduled messages.
So I can't use 'PeekMessagesAsync' on a 'ServiceBusReceiver'.
Using a 'ServiceBusAdministrationClient' I'm able to get the number of scheduled messages in the topic, but I'd really like to get hold of the messages themselves....
Any help would be greatly appreciated....
Short answer: you can't peek messages based on their status.
There's a now four-year-old GitHub issue asking for this feature.
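The closest you can get is the count the asker already found. A minimal sketch, assuming the Azure.Messaging.ServiceBus .NET SDK (the connection string and topic name are placeholders):

```csharp
using Azure.Messaging.ServiceBus.Administration;

// Placeholders: supply your own connection string and topic name.
var adminClient = new ServiceBusAdministrationClient("<connection-string>");

// Runtime properties expose only counts, not the scheduled message bodies.
TopicRuntimeProperties props = await adminClient.GetTopicRuntimePropertiesAsync("my-topic");
Console.WriteLine($"Scheduled messages: {props.ScheduledMessageCount}");
```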
After I send a message to my GCP subscription, it takes a minute or two to appear in my NiFi flow (it should be instant). At that point, I see a bunch of XML and my payload isn't there. Does anyone know what might be happening?
If your push messages are not acknowledged, delivery of the rest may slow down significantly.
Your use case looks like the endpoints don't acknowledge delivery instantly (or the acknowledgement is late for some other reason). If a message is not acknowledged immediately, the system will retry delivering it (with some delay) and will keep trying until it is acknowledged.
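For illustration, a push endpoint acknowledges a delivery simply by returning a 2xx response promptly; a slow handler delays that ack and triggers retries. A minimal ASP.NET Core sketch (the route and handling are assumptions, not part of the question):

```csharp
// Minimal ASP.NET Core push endpoint; the route is a placeholder.
var app = WebApplication.CreateBuilder(args).Build();

app.MapPost("/pubsub/push", async (HttpRequest request) =>
{
    using var reader = new StreamReader(request.Body);
    string envelope = await reader.ReadToEndAsync(); // JSON envelope; payload is base64 in message.data

    // Hand work off quickly: anything slow here delays the 2xx response,
    // and an unacknowledged delivery is retried with backoff.
    return Results.Ok(); // 2xx acknowledges; non-2xx triggers redelivery
});

app.Run();
```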
Also look at the Message Flow Control documentation, which may also point you to a solution.
A similar topic was also discussed here on Stack Overflow (which might help you).
I am new to Pub/Sub on GCP and have some difficulty understanding some of the concepts. If I want to get an email every time I have a new message in my mailbox, can I use Pub/Sub for that? How does the push notification work in that case? I understand the subscriber concepts, but I have some difficulty with the publisher concepts. Can anyone help?
Although I am not familiar with the Gmail API (I work mainly with GCP), a quick read through the documentation provides some really useful insights on this topic. Also, judging from your question, I think your doubts are more about Pub/Sub itself than about the Gmail API, so let me try to clarify some things for you.
I can see in the Gmail API documentation that you can configure Gmail to send push notifications using Cloud Pub/Sub topics, in such a way that Gmail sends publish requests to a Pub/Sub topic whenever a mailbox update matches the configuration you established. Although I cannot go into much detail about this part of the scenario, from the documentation I understand that the way to configure Gmail push notifications is to make a watch() request with the configuration you want, pointing to a Pub/Sub topic that you have previously created. Once this is set (and permissions are correctly configured), Gmail will keep publishing mailbox message updates for a period of 7 days (after a week, you have to re-call watch()).
In order to receive notifications, you can now forget about the Gmail API completely and focus on Pub/Sub. You should create a Pub/Sub subscription (using either a pull or push configuration, depending on your requirements) so that your client (wherever and whatever it is) receives the Pub/Sub messages that act as notifications. You will also have to acknowledge the messages so that they are not redelivered.
As a side note, given that you mentioned that the Pub/Sub subscriber concepts are more or less clear to you, and you would like to know more about publishing, let me share with you some links that may come in handy for a better understanding of the environment:
Pub/Sub main concepts.
Typical Pub/Sub flow.
General guide for Pub/Sub publishers.
In the scenario you are presenting (Gmail notifications using Pub/Sub), you would have to create a topic (with whatever name you want; let's call it gmail_topic), and the Gmail API would be your publisher. What the watch() method does, behind the scenes, is call the publish() method to send messages (containing information about mailbox updates) to your gmail_topic topic. Messages are passed to the Pub/Sub subscriptions (which you can create and bind to gmail_topic), and they are retained in each subscription for 7 days (the maximum retention period) until you consume and acknowledge them.
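To make the receiving side concrete, here is a minimal pull sketch. It assumes the Google.Cloud.PubSub.V1 .NET client; the project id and subscription name are placeholders I made up:

```csharp
using System;
using System.Linq;
using Google.Cloud.PubSub.V1;

// Placeholders: a project id, the gmail_topic from above, and a subscription name.
var topicName = new TopicName("my-project", "gmail_topic");
var subscriptionName = new SubscriptionName("my-project", "gmail-updates-sub");

// One-time setup: bind a pull subscription to the topic Gmail publishes to.
var subscriber = SubscriberServiceApiClient.Create();
subscriber.CreateSubscription(subscriptionName, topicName, pushConfig: null, ackDeadlineSeconds: 60);

// Pull mailbox-update notifications, then acknowledge them so they are not redelivered.
PullResponse response = subscriber.Pull(subscriptionName, maxMessages: 10);
foreach (ReceivedMessage received in response.ReceivedMessages)
    Console.WriteLine(received.Message.Data.ToStringUtf8());

if (response.ReceivedMessages.Count > 0)
    subscriber.Acknowledge(subscriptionName, response.ReceivedMessages.Select(m => m.AckId));
```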
For various reasons we have run into scenarios where we would like to "pause" push notifications from a Google Cloud Platform (GCP) Pub/Sub subscription and just let them queue up, and then eventually "unpause" and allow pushes to continue without losing any messages.
Is this a built in feature?
Can you suggest a workaround?
Good news. I stumbled upon the answer at https://cloud.google.com/pubsub/docs/subscriber#receive_push
Stopping/pausing and resuming push delivery
To pause receiving messages for a subscription, send a modifyPushConfig request to set the push endpoint to an empty string. The messages will accumulate, but will not be delivered. To resume receiving messages, send another modifyPushConfig request with a populated push endpoint.
To permanently stop delivery, you should delete the subscription.
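For reference, that pause/resume call could look like the following with the Google.Cloud.PubSub.V1 .NET client (the project and subscription names are placeholders):

```csharp
using Google.Cloud.PubSub.V1;

var subscriber = SubscriberServiceApiClient.Create();
var subscription = new SubscriptionName("my-project", "my-subscription");

// Pause: clear the push endpoint; messages accumulate but are not delivered.
subscriber.ModifyPushConfig(subscription, new PushConfig { PushEndpoint = "" });

// Resume: restore a populated push endpoint.
subscriber.ModifyPushConfig(subscription, new PushConfig { PushEndpoint = "https://example.com/push" });
```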
There is no "pause" feature with push subscriptions. If you can, you might consider switching to a pull subscription. Then you can control exactly when you request messages.
If you can't switch to a pull subscription, you could return an error response when you receive messages, or make your endpoint unavailable. Google Cloud Pub/Sub will back off redelivery of messages, waiting up to 10 seconds between attempts, and will keep trying to redeliver messages for 7 days. Depending on how long you need to pause your message consumption, this might be a viable option.
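As a rough sketch of that idea, assuming an ASP.NET Core push endpoint and a hypothetical paused flag:

```csharp
// Hypothetical in-memory switch; flip it to pause/resume consumption.
bool paused = true;

var app = WebApplication.CreateBuilder(args).Build();

app.MapPost("/pubsub/push", () =>
    paused
        ? Results.StatusCode(503)  // non-2xx: Pub/Sub backs off and redelivers later
        : Results.Ok());           // 2xx: delivery acknowledged

app.Run();
```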
If you're not going to need to switch between "paused" and "unpaused" frequently (less than once per minute), you can accomplish this behavior by switching your subscriber to a pull subscription (and not pulling) to get the pause behavior, and then switching back to a push subscription to start receiving messages again.
I don't think there is such a pause feature. Instead, you can use polling consumers and simply stop polling when you need to pause. That's all I can think of.
So I need a second pair of eyes to correct or confirm my understanding of Amazon SQS. From my understanding, you can add an unlimited number of messages to one queue. A message can be up to 256 KB in size, and if it needs to be larger than that, you can use Amazon S3 to store up to 2 GB. Reading around online, it appears there are many use cases for this queuing service; for example, SQS can act as a database buffer.
But here's what I'm looking to do: I want to make a real-time messaging system. My current functionality acts more like a message board, so the implementation just inserts into the database, then reads the data and packages it into JSON to be inserted into SQLite on the mobile phone. That works great, but I'm getting a lot of requests from people to make it real-time.
So what I'm wondering is: can I utilize Amazon SQS to write and read messages for a chat application? My theoretical use of SQS would have a message queue to write to, and the mobile client would pull from that queue every second to check for messages. But here's where I'm confused. Since you cannot "query" a particular message from the queue, would it make sense to have a queue per user plus a generic queue for the app server to read from? Or am I just talking crazy and should I spend my cognitive resources thinking about implementing an open connection on an EC2 instance?
Any help would be great,
Thanks!
Have you thought about using Amazon SNS to push the chat messages to your mobile devices? Each user publishes to a topic and the readers subscribe to that topic. You just have to be ok with missing messages if the app isn't running.
If you only have a few (maybe fewer than 100) users, you could consider having one SQS queue per user. If that is not the case, the solution won't be operationally feasible.
If you were to have one generic queue, SQS won't help, because it doesn't allow querying for a given field across all available messages.
I can think of the following options for your use case:
1. Set up a Redis cluster, possibly on Amazon ElastiCache, with one message list per user (sketched below).
2. Keep one Messages table in MySQL, possibly on AWS RDS. This provides an easy way to query messages for a given user.
You could also use DynamoDB instead of MySQL in #2.
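A minimal sketch of option #1, assuming StackExchange.Redis and a made-up key scheme:

```csharp
using System;
using StackExchange.Redis;

// Placeholder endpoint; in practice, an ElastiCache Redis node address.
var redis = ConnectionMultiplexer.Connect("localhost:6379");
IDatabase db = redis.GetDatabase();

// One list per user; the key scheme is invented for illustration.
string key = "chat:user:42";
db.ListRightPush(key, "{\"from\":\"7\",\"text\":\"hi\"}");

// The mobile client's per-second poll: pop everything that has queued up.
RedisValue message;
while ((message = db.ListLeftPop(key)).HasValue)
    Console.WriteLine(message);
```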
In IEventProcessor.ProcessEventsAsync I want to store events in a persisted store. It's possible this store is unavailable and messages cannot be persisted. How can I mark these messages for redelivery later?
The store may be down for only a few hours, but until it's up again, every message is affected and cannot be persisted.
I don't think you can mark a particular event for redelivery in Event Hubs, unlike a Service Bus queue. However, Event Hubs does provide a retention policy and an offset for each event, which makes it possible to reprocess an old event. You can read more in the "checkpointing" section of this document: https://azure.microsoft.com/en-us/documentation/articles/event-hubs-overview/
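Building on that, one common pattern is to checkpoint only after events are persisted successfully, so a failure causes the processor to re-read from the last checkpoint. A sketch against the EventProcessorHost-style IEventProcessor; the IMyStore interface is hypothetical:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Azure.EventHubs;
using Microsoft.Azure.EventHubs.Processor;

// Hypothetical persistence interface standing in for your store.
public interface IMyStore { Task SaveAsync(byte[] payload); }

public class PersistingProcessor : IEventProcessor
{
    private readonly IMyStore _store;
    public PersistingProcessor(IMyStore store) => _store = store;

    public async Task ProcessEventsAsync(PartitionContext context, IEnumerable<EventData> messages)
    {
        foreach (EventData evt in messages)
            await _store.SaveAsync(evt.Body.ToArray()); // throws while the store is down

        // Reached only if every event persisted. On failure there is no checkpoint,
        // so after a restart the processor re-reads from the last checkpoint.
        await context.CheckpointAsync();
    }

    public Task OpenAsync(PartitionContext context) => Task.CompletedTask;
    public Task CloseAsync(PartitionContext context, CloseReason reason) => Task.CompletedTask;
    public Task ProcessErrorAsync(PartitionContext context, Exception error) => Task.CompletedTask;
}
```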
Adding to Tyler's response, I suppose you could use some kind of "poison message"/dead-letter queue approach. Event Hubs does not have that functionality, but Service Bus queues do.
Anyway, I think it would have to be a programmatic approach, not something built into the backend.
There is a good article about a somewhat different scenario, but the approach is similar to what I mean:
https://www.dougv.com/2015/07/handling-poison-messages-in-an-azure-service-bus-queue/
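In that spirit, here is a sketch of parking events that fail to persist in a Service Bus queue used as a manual dead-letter store. It assumes the Azure.Messaging.ServiceBus package; the queue name and the PersistAsync helper are made up:

```csharp
using System;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

// Placeholders: connection string and a pre-created "poison" queue.
await using var client = new ServiceBusClient("<connection-string>");
ServiceBusSender poisonSender = client.CreateSender("eventhub-poison");

// Hypothetical persistence call that throws while the store is down.
Task PersistAsync(byte[] payload) => throw new NotImplementedException();

async Task HandleEventAsync(byte[] body)
{
    try
    {
        await PersistAsync(body);
    }
    catch (Exception)
    {
        // Park the event in a Service Bus queue for later inspection or replay,
        // instead of dropping it while the store is unavailable.
        await poisonSender.SendMessageAsync(new ServiceBusMessage(body));
    }
}
```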