I have some files stored in an AWS S3 bucket which I would like to access from a PHP application running on a remote webserver. Unfortunately, I am having trouble installing the AWS CLI on the remote webserver: my hosting company has constrained me to a jailshell, so I can't run any install scripts. I have no SSH access and can only upload software to the remote webserver via FTP. I can run scripts on the remote webserver via cron jobs, so I can do some automated execution within the jailshell, but it's pretty limited.
I tried doing an installation on my local machine with the awscli bundled installer, confining the installation to a single directory, /aws, which I then uploaded to the remote webserver. But I ran into problems executing the aws commands from a script, because the symlinks broke and PATH is not set up correctly on the remote webserver.
http://docs.aws.amazon.com/cli/latest/userguide/awscli-install-bundle.html explains how to set up the AWS CLI, but it doesn't work well in my situation, where I can't run pip or even edit $PATH.
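The kind of cron script I am trying to run looks roughly like this (paths are placeholders for my actual account layout, and it assumes the uploaded bundle's files survived the FTP transfer intact):

#!/bin/sh
# Rough sketch: call the uploaded bundle's aws binary by its full path,
# since the login $PATH cannot be changed. All paths are placeholders.
AWS=/home/myuser/aws/bin/aws
# Point the CLI at config/credentials uploaded alongside it.
export AWS_CONFIG_FILE=/home/myuser/.aws/config
export AWS_SHARED_CREDENTIALS_FILE=/home/myuser/.aws/credentials
"$AWS" s3 sync s3://my-bucket /home/myuser/files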
I am working on deploying a Laravel application to AWS Elastic Beanstalk. I configured the EB CLI and can deploy the application to an Elastic Beanstalk environment from the command line. This is what I have done so far.
I created an Elastic Beanstalk application and an environment in it.
Then I initialised the application for deployment using "eb init" and deployed it using "eb deploy". But I would like to add some additional commands to be run during the deployment; for example, I might run "gulp build" or other commands. Where and how can I configure this? I know that there is an .ebextensions folder, but that does not seem to allow me to add custom commands to be run on deployment.
Not sure what you mean by saying you can't run commands in .ebextensions during deployment. The extensions are commonly used for running commands or scripts when you are deploying your app. There are special sections for that:
commands: You can use the commands key to execute commands on the EC2 instance. The commands run before the application and web server are set up and the application version file is extracted.
container_commands: You can use the container_commands key to execute commands that affect your application source code. Container commands run after the application and web server have been set up and the application version archive has been extracted, but before the application version is deployed.
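For example, a minimal .ebextensions config could look like this (the file name, the npm/gulp steps, and the assumption that Node.js is already present on the instance are all illustrative):

# .ebextensions/01_build.config  (illustrative name)
commands:
  01_install_gulp:
    command: npm install -g gulp-cli   # runs before the app bundle is extracted
container_commands:
  01_build_assets:
    command: gulp build                # runs in the extracted source, before deploy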
There are also platform hooks on Amazon Linux 2 to further fine-tune the deployment of your application.
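For instance, a sketch of a predeploy hook, placed at .platform/hooks/predeploy/01_build.sh in your source bundle (the name is arbitrary, and the file must be executable), might be:

#!/bin/bash
# Illustrative hook: build assets in the staged application directory
# before the new version goes live (Amazon Linux 2 platforms only).
cd /var/app/staging
gulp build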
Finally, if none of those are suitable, you could create a dedicated build step in CodePipeline for your application. The dedicated step could be used to create a fully built deployment version of your application for EB, with a minimal amount of work left to do on the EB instances.
I have multiple standalone servers from which I want to upload/sync directories to Object Storage using the AWS CLI.
Do I have to install the AWS CLI on each server? Or is there a common console/platform provided within AWS from which I can run the same command over something like SSH? How can I avoid installing the CLI on all the servers?
You have to install the AWS CLI on all the servers. Even if you write a script that SSHes out from a single server that has the AWS CLI installed, the command executes on the remote server and uses that server's binaries and configuration, not those of the server where the script is running. It's better to use a configuration management tool like Ansible to speed up the installation.
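For example (host names and paths are placeholders), even a simple loop from one control machine is really just triggering each server's own CLI:

# Sketch: each server must have its own aws binary and credentials;
# ssh merely triggers the sync where the files actually live.
for host in web1 web2 web3; do
  ssh "$host" 'aws s3 sync /var/www/data s3://my-bucket/data'
done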
I'm trying to set up a web development environment on Amazon WorkSpaces running the Amazon Linux AMI, but I haven't found a way to install Vagrant on the machine. I would like to have a virtual webdev machine for various practical reasons, but it seems that I can't run Vagrant because the WorkSpace is already virtualised.
Is that correct, or is there a way to install and run Vagrant/VirtualBox machines on an AWS WorkSpace?
AWS WorkSpaces only offers a limited number of packages in its repo manager, so you won't find Vagrant there. But you can install the RPM manually using the CentOS download from the HashiCorp website. For example, this worked for me inside my Linux AMI WorkSpace:
wget https://releases.hashicorp.com/vagrant/2.1.2/vagrant_2.1.2_x86_64.rpm
sudo yum install vagrant_2.1.2_x86_64.rpm
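You can then check that the binary was installed correctly:

vagrant --version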
Now, a WorkSpace is essentially a virtualised environment, so it's unlikely you will be able to run a VM inside it.
However, Vagrant offers a number of providers other than the default VirtualBox, including AWS, which will let you spin up a Vagrant box on an EC2 instance rather than locally. You can install the provider plugin as follows:
vagrant plugin install vagrant-aws
Then follow the configuration steps documented for the plugin.
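With the plugin installed and configured (the Vagrantfile needs an aws provider block with your credentials and AMI details), you select the provider when bringing the box up:

vagrant up --provider=aws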
I have configured my .ebextensions folder to download and install a Windows service on the leader EC2 instance.
The problem is that every time I want to deploy a new version of the web application (not the Windows service), those commands execute again and try to reinstall the service.
On the other side, every time I want to update only the Windows service, I have to do the work manually through SSH or RDP, or redeploy the whole application, which triggers the .ebextensions commands.
Is there a more elegant workflow for this that I am missing?
You are encountering Elastic Beanstalk's weakest link: you are hosting two different services on the same EB instance, which is not supported by EB (which is lame, I agree).
I resolved the "setup only once" need by adding a test to the setup extension config file. In my case it's a Linux box, but you can do something similar:
commands:
  10_setup_win_service:
    test: test ! -f /opt/elasticbeanstalk/.post-provisioning-complete
    command: <...>
Now to complete this hack I have a file called .ebextensions/99_finalize_setup.config:
commands:
  99_write_post_provisioning_complete_file:
    command: touch /opt/elasticbeanstalk/.post-provisioning-complete
This approach ensures the win service is installed only once: the test key makes the setup command run only when the marker file is absent, and the 99_* config file runs last (config files are executed in alphabetical order) to create the marker.
Now, for the maintenance issue of the win service, you cannot use the EB toolset for that. Your understanding of the options here is correct: either use SSH to automate the work, or do it manually by logging into the server.
So I've got Jenkins running on an AWS EC2 Windows instance. I've created some test jobs, and they run my Selenium automation scripts fine from Jenkins in my Firefox browser via localhost:8080.
I have Jenkins installed in C:\Jenkins. However, inside C:\Jenkins\jobs I see no files at all. Where are my Jenkins job files located?
Look under here:
C:\Users\<user that you use to run jenkins>\.jenkins
You can see your Jenkins home directory by clicking Jenkins->Manage Jenkins->Configure System. Your job files should be in the jobs subdirectory under that home directory.