Hope someone can help me with this.
I have set up an EC2 instance with EFS mounted to it.
The EFS basically contains the WordPress source code.
What is the best way to set up an automated process to update the source code inside the EFS from a Bitbucket repository?
I'm able to SSH to the server and run git pull (a bit slow), but I want to set up an auto scaler later, and I want a way to deploy code changes directly into the EFS without SSHing into the EC2 instance.
Thanks
Easiest way: have a cron job that runs git pull every day, or at whatever frequency you want.
If you want to be fancier, have it run git fetch and then, depending on the changes, run a pull.
If you want to be secure, create a read-only Bitbucket account so that if someone infiltrates your EC2 host, they can't mess up your repo.
If you want to be able to monitor it, you can make the script ping AWS CloudWatch and add alarms so that if your cron fails, you can set it back up.
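A minimal sketch of such a script, assuming the repo was cloned to /mnt/efs/wordpress with a read-only deploy key (the paths, branch, and metric name are all illustrative):

    #!/bin/bash
    # Hypothetical EFS deploy script; paths, branch, and metric name are assumptions.
    cd /mnt/efs/wordpress || exit 1

    # Fetch first and only pull when the remote actually changed.
    git fetch origin master
    if ! git diff --quiet HEAD origin/master; then
        git pull origin master
    fi

    # Heartbeat metric; a CloudWatch alarm on missing data tells you the cron died.
    aws cloudwatch put-metric-data --namespace "Deploy" \
        --metric-name "GitPullHeartbeat" --value 1

A crontab entry such as */15 * * * * /usr/local/bin/efs-git-pull.sh would then run it every 15 minutes, and because the code lives on EFS, every instance behind the auto scaler sees the update at once.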
I have a few Python scripts which need to be executed in sequence on AWS, so what are the best and simplest options? These scripts are proof-of-concept, so they're a little dirty, but they need to run overnight. Most of them finish within 10 minutes, but a couple can take up to an hour running on a single core.
We do not have any servers like Jenkins or Airflow; we are planning to use existing AWS services.
Please let me know. Thanks.
1) EC2 Instance (Manually controlled)
- Upload your scripts to an S3 bucket
- Use the default VPC
- Launch an EC2 instance
- Use an SSM remote session to log in
- Run the AWS CLI (aws s3 sync to download from S3)
- Run the scripts manually
- Stop the instance when done
To keep it clean, make a .sh file (or a master .py file) to do the work, as sketched below. If you want it to stop charging you money afterwards, add a command to stop the instance when complete.
Least amount of work
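A minimal sketch of such a master script (the bucket, paths, and script names are assumptions):

    #!/bin/bash
    # Hypothetical master script for option 1; bucket, paths, and file names are assumptions.
    set -e

    # Download the scripts from S3.
    aws s3 sync s3://my-poc-scripts /home/ssm-user/scripts
    cd /home/ssm-user/scripts

    # Run them in sequence; set -e stops the chain on the first failure.
    python3 step1.py
    python3 step2.py
    python3 step3.py

    # Stop this instance so it stops charging you.
    INSTANCE_ID=$(curl -s http://169.254.169.254/latest/meta-data/instance-id)
    aws ec2 stop-instances --instance-ids "$INSTANCE_ID"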
2) If you want to run scripts daily
- Script out the work above (and at the end, set the Auto Scaling group's desired capacity back to zero so the box goes away)
- Create an EC2 Auto Scaling group and scale it up on a cron schedule (a scheduled scaling action).
It will start up, do the work, and then shut down and stop charging you.
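A sketch of the scheduled scaling, assuming a group named poc-nightly (the name and schedule are illustrative):

    # Hypothetical scheduled action: scale up to one instance at 01:00 UTC nightly.
    aws autoscaling put-scheduled-update-group-action \
        --auto-scaling-group-name poc-nightly \
        --scheduled-action-name nightly-start \
        --recurrence "0 1 * * *" \
        --desired-capacity 1

    # Last line of the work script: scale back to zero so you stop paying.
    aws autoscaling set-desired-capacity \
        --auto-scaling-group-name poc-nightly --desired-capacity 0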
3) Lambda
Pretty much like option 2, but AWS will do most of the work for you.
Either put all your scripts into one Lambda, or put each script into its own Lambda and have a master Lambda that does a synchronous invoke of each one in the order you want.
Have a CloudWatch Events rule trigger it daily to do the work.
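The synchronous-invoke ordering looks like this (the function names are assumptions; the same RequestResponse calls can be made from a master Lambda through an AWS SDK):

    # Hypothetical sketch: invoke each Lambda synchronously, in order.
    # RequestResponse (the default) waits for each function to finish.
    aws lambda invoke --function-name poc-step1 \
        --invocation-type RequestResponse /tmp/out1.json
    aws lambda invoke --function-name poc-step2 \
        --invocation-type RequestResponse /tmp/out2.json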
I would say that if you are in POC mode, option 1 is the best decision. It is likely closest to how you currently execute the scripts. This is what #jarmod recommended already.
You didn't mention which AWS resources your Python scripts need to access, or at least the purpose of the scripts, so it is difficult to recommend a solution.
However, a good option is to use AWS Batch.
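Batch can express the run-in-sequence requirement directly through job dependencies; a sketch, assuming a queue and job definition you would create first (all names are illustrative):

    # Hypothetical sketch: chain Batch jobs so step2 starts only after step1 succeeds.
    JOB1=$(aws batch submit-job --job-name step1 \
        --job-queue poc-queue --job-definition poc-python \
        --query jobId --output text)
    aws batch submit-job --job-name step2 \
        --job-queue poc-queue --job-definition poc-python \
        --depends-on jobId=$JOB1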
I have a Node.js web application currently running on a single EC2 instance on AWS. I am thinking of using auto scaling with 2 or more EC2 instances since the load on the application is increasing.
I have been trying to understand something about AWS Auto Scaling for a couple of hours now, but I can't seem to find an answer anywhere.
Currently, I often SSH into my Ubuntu EC2 instance to modify some things or to run a deploy command (which grabs the latest code from GitHub). How does this work when you have, say, 4 instances running under auto scaling?
So if I SSH into a server and change the server.js file, what happens to the other 3 instances?
If that is not possible, what are my choices? I have seen many people saying that using S3 is the way to keep things in sync, but I don't fully get that. Do I have to keep all my source code in S3 and do my edits from there?
You won't be able to modify files directly on the server once they are in an auto-scaling group. Changing something on one server won't be reflected on the other servers, and even if you manually updated all the currently running servers, any servers added by auto-scaling actions will not have those changes.
There are many methods to solve this, for example using AWS CodeDeploy.
You could also configure something via an EC2 User-Data script in your auto-scaling configuration which will run on each server when they are created. That script could checkout the latest code from Git, or pull the latest build artifact from S3, and then start the app. When you have an update ready to deploy, you would simply flag the current instances as "unhealthy" and wait for the Auto-Scaling group to automatically replace them with new, updated instances.
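A minimal user-data sketch of that approach, assuming a Node.js app pulled from a Git repo (the URL, paths, and start command are illustrative; in production you would run the app under systemd or pm2):

    #!/bin/bash
    # Hypothetical user-data script; runs once when each new instance boots.
    apt-get update -y && apt-get install -y git nodejs npm
    git clone https://github.com/example/myapp.git /opt/myapp
    cd /opt/myapp && npm install --production
    nohup node server.js > /var/log/myapp.log 2>&1 &

To roll out an update you could then mark the old instances unhealthy, e.g. aws autoscaling set-instance-health --instance-id i-0abc123example --health-status Unhealthy, and let the group replace them.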
You could use AWS EFS to host your application code, so all web servers get their content from EFS instead of from each individual server. This way you don't have to worry about modifying content on each server.
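With amazon-efs-utils installed, each instance would just mount the shared filesystem at boot; a sketch (the filesystem ID and mount point are assumptions):

    # Hypothetical mount of the shared code volume on each web server.
    sudo mkdir -p /var/www/html
    sudo mount -t efs fs-0abc123example:/ /var/www/html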
One way you can do it is using GitHub: you update your code, push it to GitHub, then terminate your existing instances and let the Auto Scaling group spin up new instances with the updated code. Here is a YouTube tutorial with detailed steps on how to do it: https://www.youtube.com/watch?v=lB3Ip0Yn-Zs
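One way to do the terminate-and-replace step without touching instances by hand is an ASG instance refresh; a sketch, assuming a group named web-asg:

    # Hypothetical push-and-replace flow; the group name is an assumption.
    git push origin master
    # Rolling replacement: the ASG launches fresh instances that boot the new code.
    aws autoscaling start-instance-refresh --auto-scaling-group-name web-asg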
I am playing around with the idea of having an Auto Scaling group for my website, which receives a lot of traffic. I need each server to be running an identical web service, so I have come up with several ideas to make this happen.
Idea 1: Use CodeCommit + User Data
I will keep my webserver code in a git repo in CodeCommit. Then, when my EC2 instances spin up, they will install apache2 and then pull from the git repo.
Idea 2: Use Elastic File System
After a server spins up, it will mount one central EFS volume that has my webserver code on it. The instance will install apache2 and then use EFS to get the proper PHP files, etc.
Idea 3: Use AWS S3
As above with apache2, but then download the webserver code from S3.
Which option is advised? Why?
I suggest you have a reference machine which is used for creating images. Keep it updated with the latest version of your code, and when you are happy with it, create an image from it, update your launch configuration, and change the ASG configuration so that it uses it. You can then stop the reference machine and leave the job to the ASG instances.
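The flow in CLI terms, assuming the names and IDs below (all illustrative):

    # 1. Bake an AMI from the reference machine.
    aws ec2 create-image --instance-id i-0abc123example --name "web-golden-v2"
    # 2. Create a launch configuration that uses the new AMI.
    aws autoscaling create-launch-configuration \
        --launch-configuration-name web-golden-v2 \
        --image-id ami-0def456example --instance-type t3.micro
    # 3. Point the Auto Scaling group at it; new instances boot from the new image.
    aws autoscaling update-auto-scaling-group \
        --auto-scaling-group-name web-asg \
        --launch-configuration-name web-golden-v2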
I am trying out AWS cloud solutions and want to make a private Git repository, but there are multiple ways of doing this: on an EC2 instance, with CodeCommit, or some other way. Any suggestions are appreciated.
I think you mean CodeCommit, not CodeDeploy:
http://aws.amazon.com/codecommit/
AWS CodeCommit is a secure, highly scalable, managed source control service that hosts private Git repositories. AWS CodeCommit eliminates the need for you to operate your own source control system or worry about scaling its infrastructure. You can use AWS CodeCommit to store anything from code to binaries, and it works seamlessly with your existing Git tools.
If you want to be on AWS, and you are open to using CodeCommit, that is going to be your safest, easiest, cheapest, and most reliable option.
You can certainly run it on an EC2 instance, but then you need to have that instance running all the time, you need to back it up, and you possibly need a hot spare ready to go in case the first one goes down...
Or, for a few pennies a day you can let AWS handle all that for you.
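Getting started is two commands (the repository name is an assumption; cloning via the codecommit:: helper requires the git-remote-codecommit package):

    # Hypothetical sketch: create and clone a private CodeCommit repo.
    aws codecommit create-repository --repository-name my-private-repo
    git clone codecommit::us-east-1://my-private-repo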
I am running a couple of Ubuntu EC2 instances, and I want to run an automation script which will pull the code from GitHub whenever a new instance is booted from the AMI. The thing is, presently I SSH into the server and run git pull origin master, and it asks for the key's passphrase.
How do I automate this process? After booting a new instance from the AMI, this script should:
Run
Pull the code and also the submodules
Create a couple of files and configure them
Please help me achieve this.
Thanks
This will probably take some time and configuration, but this might set you on the right path.
First, set up your SSH keys so that you can automatically pull from a repo without a password. Outlined here: https://help.github.com/articles/generating-ssh-keys
Next, create a startup script to issue the pull command from GitHub. Here: https://help.ubuntu.com/community/UbuntuBootupHowto
Then save your AMI. When you launch a new instance from the AMI, the script should run, pulling in your GitHub changes.
Also note: make sure Git's remote URL uses SSH; if it is HTTPS, it will ALWAYS ask for a password.
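Both steps in a short sketch (the path and repo URL are assumptions):

    # Hypothetical startup snippet: make sure the remote uses SSH, then pull
    # the code plus submodules, as the question asks.
    cd /opt/myapp
    git remote set-url origin git@github.com:example/myapp.git
    git pull origin master
    git submodule update --init --recursive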
Your best bet would be to utilize the fact that Ubuntu includes cloud-init in its canonical image.
Using cloud-init, you can pass scripts (e.g., shell scripts) as EC2 user data to execute at various startup stages.
It would be very easy for you to make your Git command-line sequence execute from such a script. Here is a link to the documentation, which includes examples:
https://help.ubuntu.com/community/CloudInit
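A user-data sketch in that spirit, assuming an SSH deploy key is already baked into the AMI (the repo, paths, and file contents are illustrative):

    #!/bin/bash
    # Hypothetical user data run by cloud-init on first boot.
    # Pull the code and its submodules without a password prompt.
    git clone --recurse-submodules git@github.com:example/myapp.git /opt/myapp
    # Create and configure the couple of files the question mentions.
    cat > /opt/myapp/config.ini <<'EOF'
    env=production
    EOF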
Create user-password access to your Ubuntu instance. Replicate this particular instance if you need multiple copies. Now you are free of key-based access. If you need to automate a process on that instance, cron it, or send the script via SSH to the instance and let cron find and run it.