I would like to know if I can execute a script based on an SQS message.
My requirement is to execute a script on an EC2 instance when I receive a message in an SQS queue.
I am using the AWS SDK to interact with SQS.
Thanks & Regards,
Srivignesh KN
An SQS message cannot automatically trigger a script to execute on your EC2 server. An SQS message does not automatically do anything. You have to create an application that will poll for SQS messages and then perform the necessary actions based on the messages it receives.
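For example, a minimal polling loop running on the EC2 instance could look like the sketch below (using boto3; the region, queue URL, and script path are placeholders you would replace with your own):
import subprocess
import boto3

# Placeholder queue URL and region; replace with your own.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/MySqsQueue"
sqs = boto3.client("sqs", region_name="us-east-1")

while True:
    # Long-poll for up to 20 seconds so the loop doesn't hammer the API.
    response = sqs.receive_message(QueueUrl=QUEUE_URL,
                                   MaxNumberOfMessages=1,
                                   WaitTimeSeconds=20)
    for message in response.get("Messages", []):
        body = message["Body"]
        # Run whatever script the message should trigger (hypothetical path).
        subprocess.run(["/home/ec2-user/my_script.sh", body], check=True)
        # Delete the message so it is not delivered again.
        sqs.delete_message(QueueUrl=QUEUE_URL,
                           ReceiptHandle=message["ReceiptHandle"])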
@Mark, thank you for the suggestions.
I was able to find a solution and implement this in the following manner using the boto library.
import boto.sqs

# -------------- Establishing the connection --------------
conn = boto.sqs.connect_to_region(
    "myregion",
    aws_access_key_id='myaccesskey',
    aws_secret_access_key='mysecretaccesskey')
print(conn.get_all_queues())

# -------------- Processing the queue --------------
my_queue = conn.get_queue('MySqsQueue')
print("My queue is", my_queue)
rs = my_queue.get_messages()
length = len(rs)
print("Number of messages in the queue is", length)
if length > 0:
    m = rs[0]
    MsgBody = m.get_body()
    print("Message body is", MsgBody)
    # --- My script execution/processing ---

# -------------- Clearing the queue --------------
conn.purge_queue(my_queue)
Thanks & Regards,
Srivignesh KN
I want to see some status messages from my drone using "ros2 topic echo /mavros/statustext/recv" in the terminal.
After subscribing I just wait for a message (during this time my drone flies its mission: takeoff, landing, and so on), but nothing is received on that topic.
How can I get messages on this topic?
ros2 param set /mavros/sys heartbeat_mav_type "GCS"
I have this simple Python function where I just take the input from a Pub/Sub topic and print it.
import base64, json

def hello_pubsub(event, context):
    """Triggered from a message on a Cloud Pub/Sub topic.
    Args:
        event (dict): Event payload.
        context (google.cloud.functions.Context): Metadata for the event.
    """
    pubsub_message = base64.b64decode(event['data'])
    data = json.loads(pubsub_message)
    for i in data:
        for k, v in i.items():
            print(k, v)
If I had used the pubsub_v1 library, I could have done the following:
subscriber = pubsub_v1.SubscriberClient()

def callback(message):
    message.ack()

subscriber.subscribe(subscription_path, callback=callback)
How do I ack the message in a Pub/Sub-triggered function?
Following your latest message, I understood the (common) mistake. With Pub/Sub you have:
a topic, in which publishers can publish messages;
(push or pull) subscriptions. All the messages published to the topic are duplicated into each subscription. The message queue belongs to each subscription.
Now, if you look closely at the subscriptions on your topic, you will have at least 2:
The pull subscription that you created.
A push subscription created automatically when you deployed your Cloud Function on the topic.
The messages in the push subscription are correctly processed and acknowledged. However, those in the pull subscription aren't, because the Cloud Function doesn't consume and acknowledge them; the subscriptions are independent.
So, your Cloud Function code is correct!
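To confirm this, you can list the subscriptions attached to the topic, for example with a small Python sketch like the one below (the project and topic names are placeholders):
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# Placeholder project and topic names; replace with your own.
topic_path = publisher.topic_path("my-project", "my-topic")

# You should see at least two entries: the pull subscription you created and
# the push subscription created for the Cloud Function.
for subscription in publisher.list_topic_subscriptions(topic=topic_path):
    print(subscription)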
The message should be ack'd automatically if the function terminates normally without an error.
How can I bulk move messages from one topic to another in GCP Pub/Sub?
I am aware of the Dataflow templates that provide this; however, unfortunately, restrictions do not allow me to use the Dataflow API.
Any suggestions on ad-hoc movement of messages between topics (besides copying and pasting them one by one)?
Specifically, the use case is for moving messages in a deadletter topic back into the original topic for reprocessing.
You can't use snapshots, because snapshots can only be applied to subscriptions of the same topic (to avoid message ID overlapping).
The easiest way is to write a function that pulls from your subscription. Here is how I would do it:
Create a topic (named, for example, "transfer-topic") with a push subscription. Set the timeout to 10 minutes.
Create a Cloud Function triggered over HTTP by the Pub/Sub push subscription (or a Cloud Run service). When you deploy it, set the timeout to 9 minutes for Cloud Functions and to 10 minutes for Cloud Run. The processing does the following:
Read a chunk of messages (for example 1000) from the dead-letter pull subscription
Publish the messages (in bulk mode) to the initial topic
Acknowledge the messages on the dead-letter subscription
Repeat this until the pull subscription is empty
Return code 200.
The global process:
Publish a message to the transfer-topic
The message triggers the function/Cloud Run with an HTTP push
The process pulls the messages and republishes them to the initial topic
If the timeout is reached, the function crashes and Pub/Sub retries the HTTP request (with an exponential backoff).
If all the messages are processed, the HTTP 200 response code is returned and the process stops (and the message in the transfer-topic subscription is acked).
This process allows you to handle a very large number of messages without worrying about the timeout.
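A rough sketch of that processing loop in Python (assuming the pre-2.0 google-cloud-pubsub client; the project, dead-letter subscription, and original topic names are placeholders):
from google.cloud import pubsub_v1

# Placeholder names; adjust to your project, dead-letter subscription and original topic.
PROJECT = "my-project"
DEADLETTER_SUBSCRIPTION = "deadletter-sub"
ORIGINAL_TOPIC = "projects/my-project/topics/original-topic"
CHUNK_SIZE = 1000

subscriber = pubsub_v1.SubscriberClient()
publisher = pubsub_v1.PublisherClient()
subscription_path = subscriber.subscription_path(PROJECT, DEADLETTER_SUBSCRIPTION)

def transfer(request):
    """HTTP entry point: drain the dead-letter subscription into the original topic."""
    while True:
        # Read a chunk of messages from the dead-letter pull subscription.
        # return_immediately=True may occasionally return an empty batch even if a few
        # messages remain; rerun the process in that case.
        response = subscriber.pull(subscription_path,
                                   max_messages=CHUNK_SIZE,
                                   return_immediately=True)
        if not response.received_messages:
            break  # the subscription is (practically) empty
        # Republish the payloads to the initial topic and wait for the publishes to finish.
        futures = [publisher.publish(ORIGINAL_TOPIC, msg.message.data)
                   for msg in response.received_messages]
        for future in futures:
            future.result()
        # Acknowledge only after the messages have been republished.
        ack_ids = [msg.ack_id for msg in response.received_messages]
        subscriber.acknowledge(subscription_path, ack_ids)
    return "done", 200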
I suggest that you use a Python script for that.
You can use the Pub/Sub client library to read the messages and publish them to another topic, like below:
from google.cloud import pubsub
from google.cloud.pubsub import types

# Defining parameters
PROJECT = "<your_project_id>"
SUBSCRIPTION = "<your_current_subscription_name>"
NEW_TOPIC = "projects/<your_project_id>/topics/<your_new_topic_name>"

# Creating clients for publishing and subscribing. Adjust the max_messages for your purpose
subscriber = pubsub.SubscriberClient()
publisher = pubsub.PublisherClient(
    batch_settings=types.BatchSettings(max_messages=500),
)

# Get your messages. Adjust the max_messages for your purpose
subscription_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)
response = subscriber.pull(subscription_path, max_messages=500)

# Publish your messages to the new topic
for msg in response.received_messages:
    publisher.publish(NEW_TOPIC, msg.message.data)

# Ack the old subscription if necessary
ack_ids = [msg.ack_id for msg in response.received_messages]
subscriber.acknowledge(subscription_path, ack_ids)
Before running this code you will need to install the Pub/Sub client library in your Python environment. You can do that by running pip install google-cloud-pubsub.
An approach to executing your code is to use Cloud Functions. If you decide to use it, pay attention to two points:
The maximum time that your function can take to run is 9 minutes. If this timeout gets exceeded, your function will terminate without finishing the job.
In Cloud Functions you can just put google-cloud-pubsub on a new line of your requirements file instead of running a pip command.
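For example, a minimal requirements.txt for such a function would contain just:
google-cloud-pubsub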
Problem: Fetch 2000 items from DynamoDB and process them batch by batch (batch size = 100), creating a POST request from each batch of 100 items.
Question: Is there any way I can achieve this through configuration in AWS?
PS: I've configured a cron schedule to run my Lambda function. I'm using Java. I've made a multi-threaded application that does this synchronously, but that eventually increases my computation time drastically.
I have the same problem and am thinking of solving it in the following way. Please let me know if you try it.
1. Schedule a job to fetch N items from DynamoDB using a Lambda function.
2. The Lambda function in step 1 will submit M messages to SQS, one to process each item, and trigger Lambda functions; in this case it should invoke the Lambda function M times.
3. Each Lambda function will process the request given in its message.
In order to achieve this you need to schedule an event via CloudWatch, set up SQS, and create a Lambda function triggered by SQS events.
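As a rough sketch of steps 1 and 2 in Python with boto3 (the question uses Java, but the idea is the same; here one SQS message is sent per batch of 100 items to match the batch size in the question, and the table name and queue URL are placeholders):
import json
import boto3

dynamodb = boto3.client("dynamodb")
sqs = boto3.client("sqs")

# Placeholder names; adjust for your table and queue.
TABLE_NAME = "my-table"
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"
BATCH_SIZE = 100

def dispatcher_handler(event, context):
    """Scheduled Lambda: scan the table and enqueue one SQS message per batch of items."""
    items, start_key = [], None
    while True:
        kwargs = {"TableName": TABLE_NAME}
        if start_key:
            kwargs["ExclusiveStartKey"] = start_key
        page = dynamodb.scan(**kwargs)
        items.extend(page["Items"])
        start_key = page.get("LastEvaluatedKey")
        if not start_key:
            break
    # One SQS message per chunk of items; the worker Lambda (triggered by SQS)
    # builds the POST request from each chunk.
    for i in range(0, len(items), BATCH_SIZE):
        chunk = items[i:i + BATCH_SIZE]
        sqs.send_message(QueueUrl=QUEUE_URL,
                         MessageBody=json.dumps(chunk, default=str))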
Honestly, I am not sure if this is cost effective, but it should work. Given that your fetch size is so low, it should be reasonable.
Also, you can try using SNS; in that case you don't need to worry about SQS message polling.
I have an AWS Elastic Beanstalk instance configured as a Worker Environment. It has a cron job that runs every x minutes. In the Worker Configuration I have the path to a PHP file that runs when the cron fires. If I go to the SQS dashboard I can manually send a message to this worker's queue and set an actual message body, for example "Hello".
My question is, how can I have the PHP file access the SQS message's message attribute?
The obvious answer is to use the AWS\SQSClient; however, the only way to read a message is to first get a message. The problem here is that the message has already been retrieved by the Elastic Beanstalk worker code. So how can I now read its attributes?
EDIT
Just to add more clarity to what I am describing, I'm going to give a detailed write-up of the steps that lead to this.
I log into Elastic Beanstalk and create a new environment in my application.
I select 'create new worker'
I configure a PHP instance
I upload a new source bundle for my environment
The source zip contains 2 files: cron.yaml and someJob.php
See the file contents below
I continue through setup until I get to the "Worker Details" section. Here I set the following:
Worker Queue - Autogenerated queue
HTTP Path - /someJob.php
MIME Type - default
HTTP Connections - 10
Visibility Timeout - 300
I let the environment build
During the build an autogenerated SQS queue and a dead-letter queue are automatically created
Once finished, the environment sits there until the first cron job time is hit
A message is somehow sent to the autogenerated SQS queue
someJob.php runs
The message apparently gets deleted
Cron:
version: 1
cron:
  - name: "Worker"
    url: "someJob.php"
    schedule: "0 * * * *"
PHP:
pseudo
<?php /* send me an email, update the db, whatever */ ?>
// NOTE: I don't even include the AWS SDK or perform ANY SQS actions for this to work
Now my question is: if I go to the autogenerated SQS queue, I can select it, go to Queue Actions, choose Send a Message, and send an actual message string such as "Hello".
Can I access the message value "Hello" even though my PHP wasn't responsible for pulling the message from SQS? Obviously, I would need to call the AWS library and the associated SQS commands, but the only command available is "receiveMessage", which I assume would pull a new message instead of the information from the currently received "Hello" message.
Note that sending the "Hello" message will also cause someJob.php to run.