AWS SQS Source Connector to Kafka

I am trying to connect to an AWS SQS queue where my data is stored. I have the AWS key and secret with me. I also have the SQS Source Connector, which transfers data from an SQS queue to my Kafka topic. I would like to know how to do that. Do I need to work in the AWS console? How do I use the Source Connector to transfer the data?

You need to deploy your own infrastructure running Kafka Connect (no, it doesn't require the console), then install the SQS connector plugin, and then HTTP POST the connector config to its REST API.
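As a rough sketch, assuming the Kafka Connect REST API is listening on localhost:8083: the connector class and property names below are illustrative and depend on which SQS source plugin you install, so check that plugin's documentation. The AWS key and secret are typically supplied through the Connect worker's environment (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY) or a credentials-provider setting rather than in this JSON.
# Post the connector config to Kafka Connect's REST API
# (connector class and property names are placeholders for whichever SQS plugin you installed)
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
        "name": "sqs-source",
        "config": {
          "connector.class": "io.confluent.connect.sqs.source.SqsSourceConnector",
          "sqs.url": "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue",
          "kafka.topic": "sqs-events",
          "tasks.max": "1"
        }
      }'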

Related

GCP Deployment Manager - Call HTTP endpoint when everything is deployed

I have a template that creates a few resources in GCP, and I want it to either call an HTTP endpoint or publish a message to a topic whenever the deployment completes. I've been checking different services all day, and couldn't find anything about it.
In AWS, it is quite easy to publish a message to an SNS topic, which is subscribed to an SQS queue, and that in turn triggers a Lambda function. In Azure ARM templates, we can use the az CLI to invoke a web request and call an endpoint directly.
I couldn't find anything similar in GCP. Is there any way of either calling an HTTP endpoint, a Cloud Function, or perhaps publishing a message to a topic whenever a deployment is finished?
I really appreciate any help.
The best approach in GCP is to create a logging sink, using a filter written in the Logging query language, that sends only the Deployment Manager logs to a Pub/Sub topic.
Here is an example of a sink that sends Deployment Manager logs to a previously created Pub/Sub topic:
gcloud logging sinks create $SINK_NAME pubsub.googleapis.com/projects/$YOUR_PROJECT/topics/$TOPIC \
--log-filter='resource.type="deployment"' --description="my sink"
Be careful to set the destination permissions, or you will not see the logs in the Pub/Sub topic.
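For example, a minimal sketch assuming the same $TOPIC as above: the sink's writer identity (a service account printed when the sink is created, and also shown by gcloud logging sinks describe) needs the Pub/Sub Publisher role on the topic:
# Grant the sink's writer identity permission to publish to the topic
gcloud pubsub topics add-iam-policy-binding $TOPIC \
  --member="serviceAccount:$WRITER_IDENTITY" \
  --role="roles/pubsub.publisher"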
Once you are getting the logs in the Pub/Sub topic, you can configure Cloud Pub/Sub triggers to fire an HTTP call based on the content of the log.
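For instance, a sketch of subscribing a Cloud Function to that topic (the function name, runtime, and entry point are placeholders; the function body would then make the HTTP call or do whatever follow-up work you need):
# Deploy a Pub/Sub-triggered Cloud Function that fires on each matching log entry
gcloud functions deploy notify-deployment-done \
  --runtime=python39 \
  --trigger-topic=$TOPIC \
  --entry-point=handle_log \
  --source=.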

AWS with an external RabbitMQ broker

We have a provider that wants to relay information through RabbitMQ.
Our system runs completely on AWS managed services, for example API Gateway, Lambda, and DynamoDB. We would like to have a listener (consumer) of the messages on the broker and write them into our DynamoDB.
Is this possible to do with some managed AWS service? I have googled a lot on this topic but only found material on installing a message broker within AWS.
It would be great if someone could point me in the right direction.

How to receive MQTT data on AWS

I have an external service that sends me data via MQTT. I need to receive this data on AWS, process it, and then write it to an RDS instance. What services are recommended for this purpose? An Amazon MQ and a Lambda? An SQS queue and a Lambda? Any other?
Thanks for any help.
AWS IoT 'things' have a Message Broker that supports MQTT.
Traditionally, IoT and MQTT involve huge numbers of messages, which are normally stored in DynamoDB. However, you could write an AWS Lambda function to store the received messages in an Amazon RDS database.
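One possible wiring, sketched with the AWS CLI (the topic filter, function name, and ARN are placeholders, not values from the question): an AWS IoT topic rule forwards each MQTT message to a Lambda function, and that function performs the insert into RDS.
# rule.json - hypothetical payload; replace the topic filter and Lambda ARN with your own:
# {
#   "sql": "SELECT * FROM 'devices/+/telemetry'",
#   "actions": [ { "lambda": { "functionArn": "arn:aws:lambda:eu-west-1:123456789012:function:store-in-rds" } } ]
# }
aws iot create-topic-rule --rule-name store_in_rds --topic-rule-payload file://rule.json
# Allow AWS IoT to invoke the Lambda function that does the RDS insert
aws lambda add-permission --function-name store-in-rds \
  --statement-id iot-invoke \
  --action lambda:InvokeFunction \
  --principal iot.amazonaws.com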

Pentaho DI jobs triggered by Lambda

Please help: how can this be achieved?
Requirement:
When new files are available in an AWS S3 bucket, a Lambda process will be triggered, and Pentaho job(s) to validate/process the files should be started.
The Pentaho job should be executed on the server and not in the Lambda JVM
(to make use of the resources of the Linux server where the Pentaho 7.1 Client community version is available).
Note: I followed the approach in https://dankeeley.wordpress.com/2017/04/25/serverless-aws-pdi/ and this executes the code in the Lambda JVM, but per our requirement we need the job to run on the Linux server.
Infra Details:
Pentaho code will be in a file repository on the server; mount location example: /mnt/data
Pentaho version: Pentaho 7.1 Client community version.
Server: Linux
Thanks in advance.
If you want your Pentaho job to be executed on the server and not in the Lambda JVM, you don't need AWS Lambda at all.
Instead you can:
1. Use AWS SNS, and
2. Provision an HTTP endpoint on your Linux server which then subscribes to the SNS topic.
Basically you will need to install an HTTP Server and provision an HTTP endpoint that can be invoked when new files are available in S3.
So when new files are available in an AWS S3 bucket, you can send the notification to AWS SNS instead of AWS Lambda, and then hook in the HTTP endpoint that you provisioned in step 2 above as a subscriber to this SNS topic.
So whenever a new file arrives, a notification will go to SNS, which in turn will push it to the HTTP endpoint, and then you can read the file and execute your Pentaho job.
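A sketch of those steps with the AWS CLI and Kitchen (the bucket name, topic ARN, endpoint URL, and job path are placeholders, not values from the question):
# 1. Send S3 "object created" events to an SNS topic instead of Lambda
aws s3api put-bucket-notification-configuration --bucket my-bucket \
  --notification-configuration '{
    "TopicConfigurations": [{
      "TopicArn": "arn:aws:sns:eu-west-1:123456789012:new-files",
      "Events": ["s3:ObjectCreated:*"]
    }]
  }'
# 2. Subscribe the HTTP(S) endpoint on the Linux server to the topic
#    (the endpoint must confirm the subscription when SNS sends the confirmation message)
aws sns subscribe --topic-arn arn:aws:sns:eu-west-1:123456789012:new-files \
  --protocol https --notification-endpoint https://your-server.example.com/pentaho-hook
# 3. On each notification, the endpoint handler launches the job with Kitchen,
#    e.g. assuming the job lives under the /mnt/data mount:
/opt/pentaho/data-integration/kitchen.sh \
  -file=/mnt/data/jobs/validate_files.kjb \
  -param:S3_KEY="$S3_OBJECT_KEY" \
  -level=Basic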

How to use (if possible) JMeter to load an AWS SQS queue with messages?

I need to load an SQS queue on Amazon Web Services in order to load-test an application which consumes the messages from the SQS queue.
Is it possible to load the SQS queue using JMeter?
Are there any better options available to load the SQS queue than using JMeter?
I haven't tried it, but you should be able to send the messages through HTTP requests.
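For example (a sketch, not something I have run): SQS accepts query-style HTTP POSTs to the queue URL, so a JMeter HTTP Request sampler can drive it as long as the requests are SigV4-signed; the equivalent curl call, with placeholder queue URL and credentials, would be:
# Send one message to the queue via the plain SQS query API (request must be SigV4-signed)
curl "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue" \
  --aws-sigv4 "aws:amz:us-east-1:sqs" \
  --user "$AWS_ACCESS_KEY_ID:$AWS_SECRET_ACCESS_KEY" \
  --data-urlencode "Action=SendMessage" \
  --data-urlencode "MessageBody=test message"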
I created a JMeter plugin that uses the AWS SDK to publish events to an SNS Standard or FIFO topic:
https://github.com/JoseLuisSR/awsmeter
You just need an AWS IAM user with programmatic access; then download JMeter and install the awsmeter plugin.
If you have questions or comments let me know.
Thanks.