Updating cloud code in AWS

I am new to AWS Elastic Beanstalk. I deployed the Parse example server using the "Deploy to AWS" button in the Parse Server Example link. I want to update the cloud code in main.js, but I don't know how to deploy it the way I used to deploy with the Parse CLI in the terminal. I tried running eb init, eb create, and eb deploy from the parse-server folder; after eb deploy runs, a new application version is created, but the cloud code is not updated. Could anyone suggest another solution?
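A minimal sketch of the redeploy loop described above, assuming the EB CLI is already initialized (folder names are hypothetical). One thing worth knowing: by default eb deploy bundles the most recent git commit, so uncommitted changes to the cloud code will not ship:

```shell
# Hypothetical project layout: parse-demo/cloud/main.js holds the cloud code.
mkdir -p parse-demo/cloud
cd parse-demo
git init -q
git config user.email "ci@example.com"
git config user.name "ci"
echo "// updated cloud code" > cloud/main.js
# eb deploy packages the latest commit by default, so commit first:
git add cloud/main.js
git commit -qm "Update cloud code"
# eb deploy          # would now ship the committed main.js
git ls-files         # shows what would go into the deploy bundle
cd ..
```

If main.js was edited but never committed, eb deploy would create a new application version that still contains the old cloud code, which matches the symptom described.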

Related

Django, Docker, Postgres, S3, Lambda

AWS announced 4 months ago that "you can now package and deploy Lambda functions as container images"; see here for the AWS announcement and sample code. I am trying to deploy my Django app in production using this service and to set up CI/CD with GitHub. I've been able to figure out CI/CD for a simple Python app on Lambda (no S3 or RDS). However, I don't know how to get Django, S3, Postgres, and Lambda to work together. I am new to Docker and followed this tutorial, but it does not cover how to serve static files from S3 or how to wire Lambda, Postgres, and S3 together from the container, presumably because this is a fairly new service. Has anyone successfully deployed a Django app using these services, and can you share what the Dockerfile, docker-compose.yml, etc. should look like?
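For illustration only, a minimal Lambda container image for a Django project might look like the sketch below. The base image is AWS's public Python Lambda image; lambda_handler.handler is a hypothetical adapter module that translates API Gateway events into calls to Django's WSGI app. S3 and Postgres are external services the function reaches over the network, so they do not live inside the image:

```shell
# Write an illustrative Dockerfile (module names are assumptions, not from the question).
cat > Dockerfile <<'EOF'
FROM public.ecr.aws/lambda/python:3.9
COPY requirements.txt .
RUN pip install -r requirements.txt
# Copy the Django project into the Lambda task root
COPY . ${LAMBDA_TASK_ROOT}
# Hypothetical adapter module exposing a Lambda handler around Django's WSGI app
CMD ["lambda_handler.handler"]
EOF
grep -c '^FROM' Dockerfile
```

Static assets would typically be pushed to S3 at deploy time (e.g. collectstatic with an S3 storage backend) and the database settings would point at an RDS Postgres endpoint; neither runs inside the Lambda container itself.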

AWS Chalice - CI/CD - Deploy under the same gateway

When we use
chalice deploy
for a component which is to be available as REST endpoint, Chalice creates the Lambda and API on AWS infrastructure.
Every chalice project creates a new API with a unique id.
I want to be able to deploy multiple Chalice projects under the same API id, and to configure this API name/id for use in the CI/CD pipeline as well.
How do we achieve this?
The reason you get a new API id each time is that chalice deploy creates a file under .chalice/deployed for that stage. That file records the ID it will re-deploy to.
There are two solutions if you are using a CI/CD pipeline.
The first: issue the initial deploy from your local machine so the file is created in your project. Run chalice deploy --stage {YourStageHere}; it creates the proper file, which you can push to your repo. The pipeline will then read the API ID from that file.
The second is much more involved: it requires setting up a changeset in the pipeline. There is a very good starting tutorial in the official documentation:
https://chalice-workshop.readthedocs.io/en/latest/todo-app/part2/02-pipeline.html
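A sketch of the mechanism described above. The JSON below imitates the shape of .chalice/deployed/&lt;stage&gt;.json (the stage name and ids are made up); committing this file is what pins the API id for subsequent deploys, and a CI step can read the id back out of it:

```shell
# Fake stand-in for the file that `chalice deploy --stage prod` would create.
mkdir -p .chalice/deployed
cat > .chalice/deployed/prod.json <<'EOF'
{"schema_version": "2.0",
 "resources": [{"name": "rest_api",
                "resource_type": "rest_api",
                "rest_api_id": "abc123xyz"}]}
EOF
# A pipeline step can extract the pinned REST API id:
python3 -c "import json; print(json.load(open('.chalice/deployed/prod.json'))['resources'][0]['rest_api_id'])"
```

As long as this file is present in the repo, re-deploys target the recorded API id instead of creating a fresh API.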

Launching an EC2 Instance for an S3 Bucket

I'm new to web dev and created a site using an S3 bucket; it works and is live online. However, I realized that since I'm trying to run a PHP script I found online for a contact form, the site is not truly static, so the form will not work. The form is called like this in the HTML:
<form name="htmlform" method="post" action="html_form_send.php">
Currently, when I try to run this, I get a 405 Method Not Allowed error. Is there a way I can get this code to run? My research suggests I have to create an EC2 instance, but I'm overwhelmed by which AMI to choose and don't know how to connect the instance to the bucket. Can anyone help, or at least point me to a resource I could use to figure this out? Thanks!
The easiest way to deploy your PHP app in AWS is using Elastic Beanstalk.
Elastic Beanstalk for PHP makes it easy to deploy, manage, and scale your PHP web applications using Amazon Web Services. Elastic Beanstalk for PHP is available to anyone developing or hosting a web application using PHP.
Elastic Beanstalk provides configuration options that you can use to customize the software that runs on the EC2 instances in your Elastic Beanstalk environment. You can configure environment variables needed by your application, enable log rotation to Amazon S3, and set common PHP initialization settings.
Reference: https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/create_deploy_PHP_eb.html
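A minimal sketch of that route, with hypothetical app and environment names. The point is that the contact-form handler executes server-side on the Elastic Beanstalk instance, which is exactly what S3 static hosting cannot do:

```shell
# Hypothetical PHP app bundle: index.php plus the existing form handler.
mkdir -p my-php-app
cat > my-php-app/index.php <<'EOF'
<?php echo "Hello from Elastic Beanstalk"; ?>
EOF
cat > my-php-app/html_form_send.php <<'EOF'
<?php /* the contact-form handler runs here, server-side, not on S3 */ ?>
EOF
# With the EB CLI installed and AWS credentials configured, you would then run:
#   cd my-php-app
#   eb init --platform php my-php-app
#   eb create my-php-env
#   eb deploy
ls my-php-app
```

The static S3 site can stay as-is and point its form's action at the Elastic Beanstalk URL, or the whole site can move onto Elastic Beanstalk.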

How to deploy Webpack build to AWS from Bitbucket?

I have a React application that I am bundling using Webpack. The app relies on a MongoDB database and a Node/Express server that handles the backend (API requests, etc.).
I want to set up continuous integration/deployment (CI/CD) but am not sure where to start. My app's Git repo is on Bitbucket, and I have had experience with AWS in the past, so it would be good to enable CI/CD using these. How do I go about this?
You can use Jenkins to build your project from Bitbucket.
Make use of AWS CodePipeline and AWS CodeDeploy for continuous delivery on AWS.
Jenkins gives you the flexibility to work with any source control system, and has plugins for AWS CodePipeline.
From AWS CodePipeline, you can configure a stage to call a Jenkins build job.
I've been using this system in production for quite some time now, without any issues.
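On the CodeDeploy side, the build artifact (the Webpack output plus the Node server) must carry an appspec.yml at its root telling CodeDeploy where to put the files and which lifecycle hooks to run. A sketch, with hypothetical paths and script names:

```shell
# Build an illustrative deploy artifact layout.
mkdir -p artifact/scripts
cat > artifact/appspec.yml <<'EOF'
version: 0.0
os: linux
files:
  - source: /dist
    destination: /var/www/app
hooks:
  ApplicationStart:
    - location: scripts/start_server.sh
      timeout: 60
EOF
cat artifact/appspec.yml
```

The Jenkins job would produce this artifact, CodePipeline would pass it along, and CodeDeploy would apply it to the target EC2 instances.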

Deploying a web app generated using JHipster to Amazon Web Services (AWS)

I used JHipster to generate a web app and worked on top of it to redesign the app per my requirements.
Then I generated the war using the command below:
mvnw package -Pprod -DskipTests
Now I want to deploy that web app to Amazon Web Services. I have tried all the ways JHipster suggests:
1. Direct AWS
2. Boxfuse
I have also tried uploading the generated war directly to the S3 bucket, but the upload fails.
Using Boxfuse, I configured everything per the documentation and tried uploading.
It gives the following error in the command prompt while uploading the war to the AWS console:
Push failed (Connection reset by peer: socket write error)
Please suggest a way to upload the generated war to AWS and deploy it in Elastic Beanstalk.
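One alternative sketch, using the AWS CLI instead of a browser upload (which may avoid the connection resets seen with large war files). The bucket, application, and environment names below are hypothetical, and the AWS CLI with valid credentials is assumed:

```shell
# Hypothetical war path produced by `mvnw package -Pprod -DskipTests`.
WAR=target/myapp-0.0.1-SNAPSHOT.war
# aws s3 cp "$WAR" s3://my-deploy-bucket/myapp.war
# aws elasticbeanstalk create-application-version \
#   --application-name myapp --version-label v1 \
#   --source-bundle S3Bucket=my-deploy-bucket,S3Key=myapp.war
# aws elasticbeanstalk update-environment \
#   --environment-name myapp-env --version-label v1
echo "bundle: $WAR" | tee planned-upload.txt
```

This uploads the war to S3 first, registers it as an application version, and then points the Elastic Beanstalk environment at that version.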