Best way to work with code on cloud? - amazon-web-services

I've recently started with Amazon Web Services and deployed a couple of Express applications on EC2, and I find it extremely tedious to edit code on the fly over SSH (the connection is a little unresponsive for coding purposes, and I'm not really comfortable with nano or vim for heavy editing).
I know I can edit the code on my machine and scp it to EC2. I was wondering whether there's any way to set up something like nodemon, but for the cloud, i.e. whenever I make a change on my local development machine, it gets deployed to the instance with scp. Kind of extending nodemon to the cloud.
Or is there any other way to work with this?

There are plug-ins and utilities that allow you to edit locally with Sublime Text (a good Australian editor -- please register if you use it a lot!) and have the files automatically updated on a remote server.
See:
Stackoverflow: How to use Sublime over SSH
Editing files remotely via SSH on SublimeText 3
There are probably many similar utilities if you go looking for them.
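If you would rather build the "nodemon for the cloud" workflow described in the question yourself, a small file watcher that copies each changed file up with scp also works. The following is only a rough sketch, assuming Python with the watchdog package installed, key-based SSH access to the instance, and placeholder paths and hostnames that you would replace with your own (it also assumes the remote directory layout already mirrors the local one):

    import os
    import subprocess
    import time

    from watchdog.events import FileSystemEventHandler
    from watchdog.observers import Observer

    # Placeholders -- replace with your own project path and instance details.
    LOCAL_DIR = "/home/me/myapp"
    REMOTE_HOST = "ec2-user@ec2-xx-xx-xx-xx.compute-1.amazonaws.com"
    REMOTE_DIR = "/home/ec2-user/myapp"

    class SyncOnChange(FileSystemEventHandler):
        def on_modified(self, event):
            if event.is_directory:
                return
            # Copy just the changed file to the matching path on the instance.
            rel = os.path.relpath(event.src_path, LOCAL_DIR)
            dest = "{}:{}/{}".format(REMOTE_HOST, REMOTE_DIR, rel)
            subprocess.run(["scp", event.src_path, dest], check=False)

    observer = Observer()
    observer.schedule(SyncOnChange(), LOCAL_DIR, recursive=True)
    observer.start()
    try:
        while True:
            time.sleep(1)
    finally:
        observer.stop()
        observer.join()

If you leave nodemon running on the instance watching the remote directory, the app restarts whenever a synced file lands, which gets you close to the local live-reload loop.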

Related

How can I automate script execution on AWS EC2 using the Go SDK?

I'm building an app that manages multiple EC2 instances using the Go SDK. I would like to run scripts on these instances in an automated way.
How can I achieve that? I don't think os.command => ssh => raw script stored as a string in the code is best practice. Is there any clean way to achieve this?
Thanks
Is there any clean way to achieve this?
To bootstrap your instance, you would create a UserData script. The script runs only once, just after your instance is launched.
For executing commands remotely after launch, you can use SSM Run Command to run commands on a single instance or on multiple instances at once.
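To make the SSM route concrete, here is a rough sketch of the Run Command flow. It is written with Python's boto3 only to keep the example short; the Go SDK's SSM client exposes the same SendCommand and GetCommandInvocation operations. The instance ID and commands are placeholders, and the target instance needs the SSM agent running plus an instance profile that permits SSM:

    import time
    import boto3

    ssm = boto3.client("ssm")

    # Run a shell script on one or more instances via the AWS-RunShellScript document.
    resp = ssm.send_command(
        InstanceIds=["i-0123456789abcdef0"],           # placeholder instance ID
        DocumentName="AWS-RunShellScript",
        Parameters={"commands": ["uptime", "df -h"]},  # the commands to run
    )
    command_id = resp["Command"]["CommandId"]

    time.sleep(2)  # give the agent a moment to pick the command up
    out = ssm.get_command_invocation(
        CommandId=command_id,
        InstanceId="i-0123456789abcdef0",
    )
    print(out["Status"], out["StandardOutputContent"])

The same pattern scales to many instances at once, and you never open an SSH connection yourself.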
The way you suggest is actually valid and can work. I agree with you, though; it wouldn't be my first choice either. I would either use the golang.org/x/crypto/ssh package (maintained by the Go project, though technically outside the standard library) or an external solution like github.com/appleboy/easyssh-proxy.
I would lean towards the former, but if you don't have a strong preference there, then the Scp function of the latter package might be of particular interest to you. You can find examples of it in the project's readme.
Secure copy protocol (SCP) is a means of securely transferring computer files between a local host and a remote host or between two remote hosts. It is based on the Secure Shell (SSH) protocol.
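As a language-neutral illustration of this plain-SSH approach (the question itself is in Go, where golang.org/x/crypto/ssh or easyssh-proxy plays the role that paramiko plays here), a rough sketch might look like the following; the host, key, and paths are placeholders:

    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect("ec2-xx-xx-xx-xx.compute-1.amazonaws.com",
                   username="ec2-user", key_filename="/path/to/key.pem")

    # Copy the script up instead of embedding it as a raw string in the code...
    sftp = client.open_sftp()
    sftp.put("scripts/setup.sh", "/tmp/setup.sh")
    sftp.close()

    # ...then execute it and collect the output.
    _, stdout, stderr = client.exec_command("bash /tmp/setup.sh")
    print(stdout.read().decode())
    client.close()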
EDIT: After seeing Marcin's answer, I think my answer is more the plain SSH answer, AWS independent. For the idiomatic answer for AWS please definitely look at his suggested solution!

Using Cloud Functions vs Cloud Run as a webhook for Dialogflow

I don't know much about web development and cloud computing. From what I've read, when using Cloud Functions as the webhook service for Dialogflow, you are limited to writing code in just one source file. I would like to create a really complex Dialogflow agent, so it would be handy to have an organized code structure to make development easier.
I've recently discovered Cloud Run, which seems like it can also handle webhook requests and makes it possible to develop a more complex code structure.
I don't want to use Cloud Run just because it is inconvenient to write everything in one file, but on the other hand it would be strange to have a Cloud Function with a single file containing thousands of lines of code.
Is it possible to have multiple files in a single Cloud Function?
Is Cloud Run suitable for my problem (creating a complex Dialogflow agent)?
Is it possible to have multiple files in a single Cloud Function?
Yes. When you deploy to Google Cloud Functions you create a bundle with all your source files or have it pull from a source repository.
But Dialogflow only allows index.js and package.json in the Built-In Editor
For simplicity, the built-in code editor only allows you to edit those two files. But the built-in editor is mostly just meant for basic testing. If you're doing serious coding, you probably already have an environment you prefer to use to code and deploy that code.
Is Cloud Run suitable?
Certainly. The biggest thing Cloud Run will get you is complete control over your runtime environment, since you're specifying the details of that environment in addition to the code.
The biggest downside, however, is that you also have to determine the details of that environment. Cloud Functions provide an HTTPS server without you having to worry about those details, as long as the rest of the environment is suitable.
What other options do I have?
Anywhere you want! Dialogflow only requires that your webhook:
Be at a public address (i.e. one that Google can resolve and reach)
Run an HTTPS server at that address with a non-self-signed certificate
During testing, it is common to run it on your own machine via a tunnel such as ngrok, but this isn't a good idea in production. If you're already familiar with running an HTTPS server in another environment, and you wish to continue using that environment, you should be fine.
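For a sense of how little the webhook itself needs, here is a minimal sketch assuming the Dialogflow ES v2 fulfillment format and Python's Flask (any language and framework that can serve the JSON over HTTPS will do); the route and reply text are placeholders:

    from flask import Flask, jsonify, request

    app = Flask(__name__)

    @app.route("/webhook", methods=["POST"])
    def webhook():
        req = request.get_json(silent=True) or {}
        # The matched intent's display name arrives in queryResult.intent.displayName.
        intent = req.get("queryResult", {}).get("intent", {}).get("displayName", "")
        # Reply with fulfillment text for Dialogflow to show or speak.
        return jsonify({"fulfillmentText": "You triggered intent: {}".format(intent)})

    if __name__ == "__main__":
        # In production this sits behind a real HTTPS front end; while testing,
        # an ngrok tunnel in front of it satisfies the certificate requirement.
        app.run(port=8080)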

Medium Hadoop / Spark Cluster Administration

Please let me know if this question is more appropriate for a different channel, but I was wondering what the recommended tools are for installing, configuring, and deploying Hadoop/Spark across a large number of remote servers. I'm already familiar with how to set up all of the software, but I'm trying to determine what I should start using so that I can easily deploy across many servers. I've started looking into configuration management tools (e.g. Chef, Puppet, Ansible), but I was wondering what the best and most user-friendly option is to start off with. I also do not want to use spark-ec2. Should I be creating homegrown scripts to loop through a hosts file containing IPs? Should I use pssh? pscp? etc. I just want to be able to SSH to as many servers as needed and install all of the software.
If you have some experience with a scripting language, then you can go with Chef. Recipes are already available for deploying and configuring a cluster, and it's very easy to get started.
And if you want to do it on your own, you can use the sshxcute Java API, which runs scripts on remote servers. You can build up the commands there and pass them to the sshxcute API to deploy the cluster.
Check out Apache Ambari. It's a great tool for central management of configs, adding new nodes, monitoring the cluster, etc. This would be your best bet.

Is it possible to remotely run scripts in a guest OS using VCLI?

Using VMware OVF Tool 4.0, I'm deploying/powering on some VMs and would like to execute some scripts inside them. However, I was unable to assign injection properties to a VM, i.e. DNS, gateway, etc. See the OVF Tool documentation, page 22, for more information: https://www.vmware.com/support/developer/ovf/ovf400/ovftool-400-userguide.pdf
The link below was helpful, but the associated properties were not assigned when I tried this:
http://www.virtuallyghetto.com/2014/06/an-alternate-way-to-inject-ovf-properties-when-deploying-virtual-appliances-directly-onto-esxi.html
As an alternative, I would like to remotely run a setup script that resides in the VM.
I'm seeing from articles online that the PowerCLI cmdlet Invoke-VMScript is a common choice. Link shown below:
https://www.vmware.com/support/developer/PowerCLI/PowerCLI51/html/Invoke-VMScript.html
Is there an alternative method to this cmdlet? Is there a similar command that VCLI has to offer? Any assistance would be great. Thanks in advance.
Regards, Gabriel
I researched the VIX API using Perl, and I'm now able to run remote scripts in a VM. Link: https://www.vmware.com/support/developer/vix-api/
Regards, Gabriel

Deploying first Django project on Amazon EC2 free tier

Finally the time has come and I'm ready to deploy my first Django project.
I'm a newbie at web development, and now the real fun begins.
This is a small-scale site for computer jobs.
I want to start with the free tier and grow from there as the need emerges.
I've read some guides regarding Django project deployment but could not find all the answers,
so I hope some folks here can help me out:
I've been thinking of getting an Amazon EC2 free tier VPS; is this a good option?
My local development machine runs Ubuntu. I've read that I could install a 10 GB Ubuntu image; do you recommend such an image?
Should I go with Apache or a lighter web server?
My project is hosted on Bitbucket; I just need to check out my project on my VPS, right?
What about data backups? I would like to back up my MySQL DB.
How do you recommend serving the static files?
I'm looking for a good tutorial on how to set up AWS with Django and MySQL.
Thanks, guys!
I've been thinking of getting an Amazon EC2 free tier VPS; is this a good option?
If it fulfills your technology requirements (CPU, RAM, storage), it is a good option.
My local development machine runs Ubuntu. I've read that I could install a 10 GB Ubuntu image; do you recommend such an image?
Might as well keep your environments the same if you can. If you can match up versions, that is another plus.
Should I go with Apache or a lighter web server?
Either. Apache would probably be easier to deploy at this point because you don't have to worry about running your app as a separate service (using a program like supervisor to manage it).
Whichever one you choose, there is an abundance of tutorials online describing how to set up Django.
My project is hosted on Bitbucket; I just need to check out my project on my VPS, right?
That is one way, but there are lots of ways to deploy. I like syncing the actual files using Fabric; that way your production server doesn't need to know about your Bitbucket account. Once again, there are many tutorials online describing how to deploy Django. Fabric is a great place to start.
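As a concrete example of the Fabric approach, a rough fabfile sketch (assuming Fabric 1.x; the host, key, paths, and reload step are placeholders to adapt to your own setup) might look like this:

    # fabfile.py
    from fabric.api import cd, env, run
    from fabric.contrib.project import rsync_project

    env.hosts = ["ubuntu@ec2-xx-xx-xx-xx.compute-1.amazonaws.com"]  # placeholder host
    env.key_filename = "~/.ssh/my-ec2-key.pem"                      # placeholder key

    def deploy():
        # Push the local working copy to the server -- no Bitbucket access needed there.
        rsync_project(remote_dir="/srv/mysite/", local_dir="./",
                      exclude=[".git", "*.pyc"])
        with cd("/srv/mysite"):
            run("pip install -r requirements.txt")
            # Placeholder reload step: touching the WSGI file makes mod_wsgi
            # (in daemon mode) reload the application.
            run("touch mysite/wsgi.py")

You would then deploy with "fab deploy" from the project root.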
What about data backups? I would like to back up my MySQL DB.
There are lots of tools for this: plenty of premade tools and shell scripts. I have used automysqlbackup and it works great: http://sourceforge.net/projects/automysqlbackup/
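If you prefer to script it yourself instead, a bare-bones sketch along these lines also works (the database name, paths, and bucket are placeholders; it assumes mysqldump reads credentials from ~/.my.cnf and that the optional S3 copy uses the boto3 package):

    import datetime
    import subprocess

    import boto3  # only needed for the optional S3 upload

    DB_NAME = "mysite_db"  # placeholder database name
    DUMP_PATH = "/tmp/{}-{}.sql".format(DB_NAME, datetime.date.today().isoformat())

    # Dump the database; credentials come from ~/.my.cnf so they stay off the command line.
    with open(DUMP_PATH, "w") as out:
        subprocess.check_call(["mysqldump", DB_NAME], stdout=out)

    # Copy the dump off the instance so a lost VPS doesn't take the backups with it.
    s3 = boto3.client("s3")
    s3.upload_file(DUMP_PATH, "my-backup-bucket", DUMP_PATH.lstrip("/"))

Run it from cron (daily, for example) and rotate old dumps as needed.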
How do you recommend serving the static files?
Make sure the web server serves them. If you deploy through Apache, you can set up an alias to serve static files very easily. You can come up with a collectstatic deployment scheme to put your static files on S3, but for a simple site Apache will be just fine.
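For reference, the Django side of that is just two settings plus the collectstatic command; the paths below are placeholders, and the Apache Alias shown in the comment is what lets the web server, rather than Django, serve the files:

    # settings.py
    STATIC_URL = "/static/"
    STATIC_ROOT = "/srv/mysite/static/"   # Apache's Alias should point here

    # Gather all app static assets into STATIC_ROOT with:
    #   python manage.py collectstatic
    # and add a matching line to the Apache virtual host, e.g.:
    #   Alias /static/ /srv/mysite/static/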
I'm looking for a good tutorial on how to set up AWS with Django and MySQL.
Perhaps you can find a single tutorial that covers all of this, but most likely you will just find separate tutorials for each piece:
How to set up AWS with Ubuntu
Installing Django / MySQL on Ubuntu