Deploy Spring Batch to AWS

I am new to AWS. I have developed a batch processing application using Spring Boot, Spring Batch, and Quartz for scheduling. It pulls a file from a remote FTP server and loads it into a database. Can someone please help: how can I deploy this to AWS?
Should this be an Elastic Beanstalk (EC2) instance with a volume mounted to it for downloading the file from FTP and then processing it?
Thanks,

You should probably look at the official Spring project, Spring Cloud AWS. It is a somewhat involved process, but powerful. If your job is a small one, you may wish to consider an AWS Lambda function instead.
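If the job is small enough for Lambda, the core of it could look like the following sketch in Java. The FTP host, credentials, file path, and table name are placeholders, and it assumes Apache Commons Net for FTP plus plain JDBC:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

import org.apache.commons.net.ftp.FTPClient;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

// Hypothetical handler: pulls one file from an FTP server and loads it
// line by line into a staging table.
public class FtpToDbHandler implements RequestHandler<Object, String> {
    @Override
    public String handleRequest(Object event, Context context) {
        FTPClient ftp = new FTPClient();
        try {
            ftp.connect("ftp.example.com");    // placeholder host
            ftp.login("user", "password");     // placeholder credentials
            ftp.enterLocalPassiveMode();
            try (BufferedReader reader = new BufferedReader(new InputStreamReader(
                         ftp.retrieveFileStream("/outbox/data.csv")));      // placeholder path
                 Connection db = DriverManager.getConnection(System.getenv("JDBC_URL"));
                 PreparedStatement ps = db.prepareStatement(
                         "INSERT INTO staging_table (line) VALUES (?)")) {  // placeholder table
                String line;
                while ((line = reader.readLine()) != null) {
                    ps.setString(1, line);
                    ps.addBatch();
                }
                ps.executeBatch();
            }
            ftp.completePendingCommand(); // required after retrieveFileStream
            ftp.logout();
            return "ok";
        } catch (Exception e) {
            throw new RuntimeException("FTP-to-DB job failed", e);
        } finally {
            try { ftp.disconnect(); } catch (Exception ignored) { }
        }
    }
}
```

Keep in mind the Lambda runtime limits (execution time, ephemeral storage), so this only fits genuinely small files and jobs.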

Related

How to deploy a .war stored in Google Cloud Storage to a Compute Engine VM instance using a Cloud Function?

My current approach:
I'm currently using Cloud Build to build and store a .war artifact in a GCS bucket. To deploy it on my custom VM, I'm running a Java program on the GCE VM that detects changes to the bucket via Pub/Sub notifications, then downloads and deploys the fresh .war on the VM.
My objective:
Download a ~50 MB Spring Boot 2.x + Java 11 .war from GCS using a Cloud Function written in Java 11
Upload it to the VM (Ubuntu 18.04.x LTS) using the Cloud Function (generation not relevant)
Deploy it on the VM from the Cloud Function (the .war has an embedded Tomcat container, so I only have to java -jar it)
My issue:
Connecting to the VM "externally" is my main issue. The only solution I can think of is running a Spring web service endpoint on the VM that receives a .war via POST. I'll use the Cloud Function to POST the downloaded .war to this endpoint, which will then deploy it on the VM.
However, this approach seems like a Rube Goldberg machine from my perspective, so I'm wondering if there is a better idea than what I've come up with.
P.S. We're aware that pulling from the VM is the sounder approach, but this Cloud Function deployment is a client request, so sadly we must abide.
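For reference, the endpoint approach described above could be as small as the following sketch. The target path and the restart mechanism are placeholders; this only shows receiving and saving the .war on the VM:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

// Hypothetical deploy endpoint running on the VM: the Cloud Function POSTs the .war here.
@RestController
public class DeployController {

    @PostMapping("/deploy")
    public ResponseEntity<String> deploy(@RequestParam("war") MultipartFile war) throws IOException {
        Path target = Path.of("/opt/app/app.war"); // placeholder path
        try (var in = war.getInputStream()) {
            Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
        }
        // Restarting the new .war (e.g. via a systemd unit that runs `java -jar`)
        // is left to the host; this sketch only replaces the artifact.
        return ResponseEntity.ok("deployed " + war.getOriginalFilename());
    }
}
```

If you do go this route, put the endpoint behind authentication and a firewall rule that only admits the Cloud Function's traffic.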

Best solution to deploy a Spring Batch application in an AWS cloud environment

I am developing a Spring Batch application and want to deploy it to an AWS environment with minimal resource usage; as soon as the batch completes, the resources should be terminated.
Please suggest the best AWS service for this. My job takes 1-2 hours to run.
Edit: AWS Batch seems like the right option, but I am not sure how inter-node communication would work: Spring Batch uses messaging middleware for inter-node communication, whereas AWS Batch suggests IP-based MPI, Apache MXNet, etc. for that purpose.
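If the job can run on a single node (so no inter-node communication is needed), one common pattern is to package it as a container that AWS Batch runs to completion: a managed compute environment can scale back down once the container exits, so the application just needs to exit the JVM when the job finishes. A minimal sketch (class name hypothetical):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class BatchJobApplication {
    public static void main(String[] args) {
        // Run the job, close the context, and map the batch exit status to a
        // process exit code so the container (and its resources) can be reclaimed.
        System.exit(SpringApplication.exit(SpringApplication.run(BatchJobApplication.class, args)));
    }
}
```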

Suggest the right AWS service

My Spring Boot application has REST endpoints and scheduled tasks. Which AWS service should I use?
I was considering Beanstalk, but there I have to choose between a web and a worker environment. A similar selection is required in Heroku as well.
My application can withstand crashes, as long as it can spin up again automatically.
Can you suggest an appropriate AWS service?
Although I'm not sure what your scheduled tasks are, I'd have a look at AWS Lambda:
https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-run-lambda-schedule.html
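A minimal sketch of a Java handler that an EventBridge schedule could invoke (the class name and the work inside it are hypothetical):

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.ScheduledEvent;

// Hypothetical handler: EventBridge invokes this on a cron/rate schedule.
public class ScheduledTaskHandler implements RequestHandler<ScheduledEvent, String> {
    @Override
    public String handleRequest(ScheduledEvent event, Context context) {
        // Replace this with the work your @Scheduled task currently does.
        context.getLogger().log("Triggered at " + event.getTime());
        return "done";
    }
}
```

The REST endpoints would then be served separately (e.g. by Beanstalk or API Gateway + Lambda), since a schedule-triggered function only covers the background work.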
I think you should use AWS Elastic Beanstalk. Why?
All cloud providers follow the paradigms below.
IaaS: basically, providers give you VMs. You manage the operating system, load balancer, etc.
Examples: AWS EC2 (Elastic Compute Cloud), GCP Compute Engine, ...
PaaS: providers run the operations for you; you just deploy your code. (Generally, if the app goes down, it can spin up again automatically.)
Examples: AWS Elastic Beanstalk, GCP App Engine, ...
For a better understanding of the types of cloud computing, read this doc.

How to deploy code from Bitbucket to an Amazon EC2 server

I have created a Laravel application and have now pushed my code to a staging server.
In my Amazon AWS account I have created an EC2 server. Is there a simple way to do this? I find it confusing.
I want to run this project on the server.
Can someone please point me in the right direction?
You can use a startup script on your EC2 instance.
Your startup script should perform the following steps (see the sketch after the list):
1. Install the required software and services on the new machine.
2. Download or clone your latest app code from Git and build it if necessary.
3. Download other assets, software, or data from an S3 bucket (Java, Tomcat, or WARs).
4. Start and configure the services.
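As a sketch of the idea using the AWS SDK for Java v2, here is how an instance could be launched with such a startup script passed as user data. The AMI ID, repository, bucket, and service names below are placeholders, not real values:

```java
import java.util.Base64;

import software.amazon.awssdk.services.ec2.Ec2Client;
import software.amazon.awssdk.services.ec2.model.InstanceType;
import software.amazon.awssdk.services.ec2.model.RunInstancesRequest;

public class LaunchWithStartupScript {
    public static void main(String[] args) {
        // Placeholder startup script covering the four steps above.
        String script = String.join("\n",
                "#!/bin/bash",
                "apt-get update && apt-get install -y git openjdk-11-jre",          // 1. software
                "git clone https://bitbucket.org/your-team/your-app.git /opt/app",  // 2. code (hypothetical repo)
                "aws s3 cp s3://your-bucket/assets /opt/app/assets --recursive",    // 3. assets (hypothetical bucket)
                "systemctl start your-app");                                        // 4. services (hypothetical unit)

        try (Ec2Client ec2 = Ec2Client.create()) {
            // EC2 expects user data to be Base64-encoded.
            RunInstancesRequest request = RunInstancesRequest.builder()
                    .imageId("ami-xxxxxxxx")            // placeholder AMI
                    .instanceType(InstanceType.T3_MICRO)
                    .minCount(1)
                    .maxCount(1)
                    .userData(Base64.getEncoder().encodeToString(script.getBytes()))
                    .build();
            ec2.runInstances(request);
        }
    }
}
```

The same user-data script can also be pasted directly into the EC2 console or a launch template; the SDK call is just one way to automate it.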
You should strongly consider using the AWS CodeDeploy service to manage code deployments to your instance.
It is easy to set up, and the service itself is free; you pay only for instance usage.
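If you go the CodeDeploy route, a deployment can also be triggered programmatically. A minimal sketch with the AWS SDK for Java v2, assuming the application, deployment group, and S3 revision already exist (all names are placeholders):

```java
import software.amazon.awssdk.services.codedeploy.CodeDeployClient;
import software.amazon.awssdk.services.codedeploy.model.BundleType;
import software.amazon.awssdk.services.codedeploy.model.CreateDeploymentRequest;
import software.amazon.awssdk.services.codedeploy.model.RevisionLocation;
import software.amazon.awssdk.services.codedeploy.model.RevisionLocationType;
import software.amazon.awssdk.services.codedeploy.model.S3Location;

public class TriggerDeployment {
    public static void main(String[] args) {
        try (CodeDeployClient codeDeploy = CodeDeployClient.create()) {
            // Deploy a zipped revision from S3 to the "staging" deployment group.
            codeDeploy.createDeployment(CreateDeploymentRequest.builder()
                    .applicationName("my-laravel-app")       // placeholder
                    .deploymentGroupName("staging")          // placeholder
                    .revision(RevisionLocation.builder()
                            .revisionType(RevisionLocationType.S3)
                            .s3Location(S3Location.builder()
                                    .bucket("my-deploy-bucket")  // placeholder
                                    .key("releases/app.zip")     // placeholder
                                    .bundleType(BundleType.ZIP)
                                    .build())
                            .build())
                    .build());
        }
    }
}
```

In practice you would wire this to Bitbucket via a pipeline step rather than calling it by hand, and the instance needs the CodeDeploy agent installed.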

How to deploy and run an Amazon Kinesis application on the Amazon Kinesis service

I am trying to understand how to deploy an Amazon Kinesis client application that was built using the Kinesis Client Library (KCL).
I found this, but it only states:
You can follow your own best practices for deploying code to an Amazon EC2 instance when you deploy an Amazon Kinesis application. For example, you can add your Amazon Kinesis application to one of your Amazon EC2 AMIs.
which does not give me the broader picture.
These examples use an Ant script to run the Java program. Is this the best practice to follow?
Also, I understand that even before running the EC2 instances I need to make sure that:
the developed code (JAR/WAR or any other format) is on the EC2 instance;
the EC2 instance already has all the required environment, such as an Ant setup, in place to execute the program.
Could someone please add some more detail on this?
Amazon Kinesis is responsible for ingesting data, not for running your application. You can run your application anywhere, but it is a good idea to run it on EC2, as you are probably going to use other AWS services such as S3 or DynamoDB (the Kinesis Client Library uses DynamoDB to track shard leases and checkpoints, for example).
To understand Kinesis better, I'd recommend launching the Kinesis Data Visualization Sample. When you launch this app, use the provided CloudFormation template. It creates a stack with the Kinesis stream and an EC2 instance running the application, which uses the Kinesis Client Library and is a fully working example to start from.
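To give a flavor of what the application side looks like, here is a minimal sketch of a record processor using the KCL 1.x interfaces (checkpointing and error handling are elided):

```java
import java.util.List;

import com.amazonaws.services.kinesis.clientlibrary.interfaces.IRecordProcessor;
import com.amazonaws.services.kinesis.clientlibrary.interfaces.IRecordProcessorCheckpointer;
import com.amazonaws.services.kinesis.clientlibrary.lib.worker.ShutdownReason;
import com.amazonaws.services.kinesis.model.Record;

// Minimal KCL 1.x record processor: the library assigns shards and calls these methods.
public class SampleRecordProcessor implements IRecordProcessor {
    @Override
    public void initialize(String shardId) {
        System.out.println("Processing shard: " + shardId);
    }

    @Override
    public void processRecords(List<Record> records, IRecordProcessorCheckpointer checkpointer) {
        for (Record record : records) {
            // record.getData() is a ByteBuffer holding the raw payload.
            System.out.println("Got record with " + record.getData().remaining() + " bytes");
        }
        // A real processor would checkpoint periodically via the checkpointer.
    }

    @Override
    public void shutdown(IRecordProcessorCheckpointer checkpointer, ShutdownReason reason) {
        // A real processor would checkpoint here on TERMINATE.
    }
}
```

A KCL Worker wired to a factory that produces this processor is the long-running process you then host on EC2 (or wherever you deploy).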
The best way I have found to host a consumer program is using EMR, but not as a Hadoop cluster. Package your program as a JAR and place it in S3. Launch an EMR cluster and have it run your JAR. With Data Pipeline you can schedule this job flow to run at regular intervals. You can also scale the EMR cluster, or use an actual EMR job to process the stream if you want to get fancy.
You can also use Beanstalk. I believe this article is highly useful.