AWS CLI command doesn't work from crontab - amazon-web-services

I have a CentOS 7.9 machine with the AWS CLI installed on it. I can successfully run this command on its own: aws s3 sync s3://bucketname /data/bucketname/, but when I try to run it via crontab, it doesn't work. All I see in /var/log/cron is that the command ran, but there is no data in the /data/bucketname directory.
This is what my crontab looks like:
*/5 * * * * sh /root/script.sh
And this is what my script looks like:
#!/bin/bash
PATH=/sbin:/bin:/usr/sbin:/usr/bin:/opt/aws/bin
SHELL=/bin/bash
/bin/aws s3 sync s3://bucketname /data/bucketname/
which aws returns /bin/aws so I tried to add that to my script.sh file but that did not make a difference. Any help would be much appreciated.
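Since the command works interactively but not from cron, a reasonable next step is to capture the cron run's output and set the environment explicitly inside the script. The following is only a sketch: the log path is arbitrary, and the HOME=/root line assumes the CLI credentials live in /root/.aws.
#!/bin/bash
# Log everything the cron run prints so the real error becomes visible.
exec >> /var/log/s3_sync.log 2>&1
# cron starts with a minimal environment; set PATH and HOME explicitly so
# the aws binary and root's ~/.aws credentials/config can be found.
export PATH=/sbin:/bin:/usr/sbin:/usr/bin:/opt/aws/bin
export HOME=/root
date
/bin/aws s3 sync s3://bucketname /data/bucketname/
echo "aws exited with $?"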

Related

Jenkins - bash: aws: command not found but runs fine from terminal

In the Build Step, I've added Send files or execute commands over SSH -> SSH Publishers -> Exec command. I'm trying to run an aws command to copy a file from EC2 to S3. The same command runs fine when I execute it from the terminal, but via Jenkins it simply returns:
bash: aws: command not found
The command is
cd ~/.local/bin/ && aws s3 cp /home/ec2-user/lambda_test/lambda_function.zip s3://temp-airflow-us/lambda_function.zip
Based on the comments.
The solution was to use the following command:
cd ~/.local/bin/ && ./aws s3 cp /home/ec2-user/lambda_test/lambda_function.zip s3://temp-airflow-us/lambda_function.zip
since aws is not available in the PATH environment variable.
command not found indicates that the aws utility is not on $PATH for the jenkins user.
To confirm, sudo su -l jenkins and then issue the command which aws - this will most likely return no results.
You have two options:
use the full path (likely /usr/local/bin/aws)
add /usr/local/bin to the jenkins user's $PATH
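For example, with the CLI living in ~/.local/bin as in the question above, either option might look roughly like this in the Exec command field:
# option 1: call the CLI by its full path
~/.local/bin/aws s3 cp /home/ec2-user/lambda_test/lambda_function.zip s3://temp-airflow-us/lambda_function.zip
# option 2: extend PATH first, then call it normally
export PATH="$PATH:$HOME/.local/bin"
aws s3 cp /home/ec2-user/lambda_test/lambda_function.zip s3://temp-airflow-us/lambda_function.zip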
I need my Makefile to work in both Linux and Windows so the accepted answer is not an option for me.
I diagnosed the problem by adding the following to the top of my build script:
whoami
which aws
env|grep PATH
This returned:
root
which: no aws in (/sbin:/bin:/usr/sbin:/usr/bin)
PATH=/sbin:/bin:/usr/sbin:/usr/bin
Bizarrely, the path does not include /usr/local/bin, even though the interactive shell on the Jenkins host includes it. The fix is simple enough: create a symlink on the Jenkins host:
ln -s /usr/local/bin/aws /bin/aws
Now the aws command can be found by scripts running in Jenkins (in /bin).
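If you would rather not touch /bin, an alternative sketch is to extend PATH at the top of the build script itself (assuming aws really is in /usr/local/bin, as the diagnostics above suggest):
# prepend the missing directory for the duration of this build script
export PATH="/usr/local/bin:$PATH"
which aws   # should now print /usr/local/bin/aws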

AWS crontab not working on Amazon Linux 2/3.0.1

I have a Laravel project set up on AWS Elastic Beanstalk (Amazon Linux 2/3.0.1) and I'm trying to set up a cronjob to run a command.
I ran nano crontab -e and wrote
* * * * * cd /var/app/current && php artisan command:initcrawl to run every minute as a test, but it doesn't work.
When I cd into /var/app/current and then run the command manually, php artisan command:initcrawl, it works without any problem.
My problem was the fact that I wrote 'nano' in front of the crontab command.
Instead of writing 'nano' in front of the command to use the nano editor, I should have run export EDITOR='/usr/bin/nano'; then, when I run crontab -e, everything works fine.
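In other words, the working sequence is roughly the following (putting the export in ~/.bashrc to make it permanent is optional and just an illustration):
export EDITOR='/usr/bin/nano'
crontab -e
# then add the entry inside the editor:
* * * * * cd /var/app/current && php artisan command:initcrawl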

How can I write a crontab script for a .sh file that copies a file from an AWS S3 folder to my local EC2 folder?

I want to write a crontab script to copy a file from an AWS S3 folder to my local AWS EC2 instance. I can run the command directly from my Linux command line:
aws s3 cp s3://sftp-test/test.txt /u02/app/oracle/test/
So the test.txt file gets copied from S3 to my local u02 folder. But when I put the command in a .sh file, it does not work. First I created a test.sh file:
#!/bin/sh
*/5 * * * * aws s3 cp s3://sftp-customs/test.txt /u02/app/oracle/test/
Second, I installed that crontab with
crontab test.sh
Then I can see my script when I run
crontab -l
But overall it does not work. I don't receive any file from the server. Thanks for your time, all. Please help with some advice.
The shell file should look like this:
#!/bin/sh
source ~/.bash_profile #or .bashrc
aws s3 cp s3://sftp-customs/test.txt /u02/app/oracle/test/
and in cron (using crontab -e) you should enter
*/5 * * * * /path/to/test.sh
Alternatively, create a new file (test.txt, for example) with this content:
*/5 * * * * /path/to/test.sh
and execute command:
crontab test.txt
NB! The last command will replace your entire crontab.
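If you would rather keep whatever is already in the crontab, a common alternative (just a sketch, assuming the script lives at /path/to/test.sh) is to append the entry instead of replacing everything:
chmod +x /path/to/test.sh
(crontab -l 2>/dev/null; echo "*/5 * * * * /path/to/test.sh") | crontab -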

EC2 user data is not executed

I am setting up a web app through CodePipeline. My CloudFormation script creates an EC2 instance. In that EC2 instance's user data, I have written logic to fetch the code from S3, copy it onto the instance, and start the server. The web app uses the Python Pyramid framework.
CodePipeline is connected to GitHub. It creates a zip file and uploads it to the S3 bucket. (That is all in a buildspec.yml file.)
When I change the user data script and run the pipeline, it works fine.
But when I change a web app file (my code base) and re-run the pipeline, that change is not reflected.
This is on an Ubuntu EC2 instance.
#cloud-boothook
#!/bin/bash -xe
echo "hello "
exec > /etc/setup_log.txt 2> /etc/setup_err.txt
sleep 5s
echo "User_Data starts"
rm -rf /home/ubuntu/c
mkdir /home/ubuntu/c
# find the key of the most recently uploaded artifact under the MyApp/ prefix
key=`aws s3 ls s3://bucket-name/pipeline-name/MyApp/ --recursive | sort | tail -n 1 | awk '{print $4}'`
aws s3 cp s3://bucket-name/$key /home/ubuntu/c/
cd /home/ubuntu/c
# the key looks like pipeline-name/MyApp/<zipname>; extract just the file name
zipname="$(cut -d'/' -f3 <<<"$key")"
echo $zipname
mv /home/ubuntu/c/$zipname /home/ubuntu/c/c.zip
unzip -o /home/ubuntu/c/c.zip -d /home/ubuntu/c/
echo $?
python3 -m venv venv
venv/bin/pip3 install -e .
rm -rf cc.zip
aws configure set default.region us-east-1
venv/bin/pserve development.ini http_port=5000 &
The expected result is that every time I run the pipeline, the user data script executes.
Please give me a suggestion or any other approach.
The user data script gets executed exactly once, upon instance creation. If you want to periodically synchronize your code changes to the instance, you should think about setting up a cron job in your user data script, or use a service like AWS CodeDeploy to deploy new versions (this is the preferred approach).
CodePipeline uses a different S3 object for each pipeline execution artifact, so you can't hardcode a reference to it. You could publish the artifact to a fixed location instead. You might also want to consider using CodeDeploy to deploy the latest version of your application.
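Combining the two suggestions, a rough sketch of the cron-based route in user data could look like the lines below. The s3://bucket-name/artifacts/latest/ prefix is purely an assumption: you would have to make the pipeline publish each build to that fixed location, and the aws path may differ on your instance.
# add a system cron entry that re-syncs the application code every 5 minutes
cat > /etc/cron.d/sync-app <<'EOF'
*/5 * * * * root /usr/local/bin/aws s3 sync s3://bucket-name/artifacts/latest/ /home/ubuntu/c/
EOF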

AWS commands not getting executed from crontab

Before I proceed, please let me say that I have tried all the methods mentioned on Stack Overflow and other forums, but nothing worked on my CentOS 6.8 server.
Here is what I have written in crontab
00 5 * * * /usr/bin/aws /var/www/html/james/crons/s3_downloader.sh
And s3_downloader.sh file full content is:
#!/bin/bash
aws s3 sync "s3://my_bucket/my_folder/" "/var/www/html/james/downloads/my_folder/";
But nothing works when crontab runs it. However, everything works fine when I run it from the command line on the server.
The AWS CLI is installed on my server at /usr/bin/aws (found via which aws as the root user).
Here are the methods I have tried (but nothing worked for me):
--> Changed the path for aws in the file contents:
#!/usr/bin/aws
aws s3 sync "s3://my_bucket/my_folder/" "/var/www/html/james/downloads/my_folder/";
--> Set export variables in the root console:
export AWS_CONFIG_FILE="/root/.aws/config"
export AWS_ACCESS_KEY_ID=XXXX
export AWS_SECRET_ACCESS_KEY=YYYY
Edit:
When I logged the output of the cron job to a file, I saw:
usage: aws [options] <command> <subcommand> [<subcommand> ...] [parameters]
To see help text, you can run:
aws help
aws <command> help
aws <command> <subcommand> help
aws: error: argument command: Invalid choice, valid choices are:
Here is the full response: http://pastebin.com/XAKQUVzT
Edit 2
After more debugging, I can see the error in the cron log is:
env: /var/www/html/james/crons/s3_downloader.sh: Permission denied
Your crontab entry is wrong. You have passed the name of your shell script (/var/www/html/james/crons/s3_downloader.sh) as a parameter to /usr/bin/aws.
You should either call aws s3 sync directly from within the crontab entry, or call your shell script (and make the shell script call aws s3 sync), but you're trying to do both.
So, change the crontab entry to execute the shell script (and make sure that the shell script is actually executable).
00 5 * * * /var/www/html/james/crons/s3_downloader.sh
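Given the Permission denied error in Edit 2, the script also needs to be executable; and the alternative of calling aws directly from the crontab entry would use the /usr/bin/aws path reported by which aws. A sketch of both:
chmod +x /var/www/html/james/crons/s3_downloader.sh
# or skip the script entirely and call the CLI straight from the crontab entry:
00 5 * * * /usr/bin/aws s3 sync "s3://my_bucket/my_folder/" "/var/www/html/james/downloads/my_folder/"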
#!/bin/sh
cd /path/to/scripts   # change to the folder that contains your scripts
/usr/local/bin/aws ec2 stop-instances --instance-ids idoftheinstance
Include the full path to aws (/usr/local/bin/aws) and it works.