AWS commands not getting executed on CRONTAB - amazon-web-services

Before I proceed, let me say that I have tried all the methods mentioned on Stack Overflow and other forums, but nothing worked on my CentOS 6.8 server.
Here is what I have written in crontab
00 5 * * * /usr/bin/aws /var/www/html/james/crons/s3_downloader.sh
And s3_downloader.sh file full content is:
#!/bin/bash
aws s3 sync "s3://my_bucket/my_folder/" "/var/www/html/james/downloads/my_folder/";
But nothing works when cron runs it. However, everything works fine when I run it from the command line on the server.
The AWS CLI is installed on my server at /usr/bin/aws (confirmed with which aws as the root user).
Here are the methods I have tried (none of them worked for me):
--> Changed the path for aws in the script contents:
#!/usr/bin/aws
aws s3 sync "s3://my_bucket/my_folder/" "/var/www/html/james/downloads/my_folder/";
--> Exported these settings in the root console:
export AWS_CONFIG_FILE="/root/.aws/config"
export AWS_ACCESS_KEY_ID=XXXX
export AWS_SECRET_ACCESS_KEY=YYYY
Edit:
When I logged the cron job's output to a file, I got:
usage: aws [options] <command> <subcommand> [<subcommand> ...] [parameters]
To see help text, you can run:
aws help
aws <command> help
aws <command> <subcommand> help
aws: error: argument command: Invalid choice, valid choices are:
Here is full response: http://pastebin.com/XAKQUVzT
Edit 2
After more debugging, I can see that the error in the cron log is:
env: /var/www/html/james/crons/s3_downloader.sh: Permission denied

Your crontab entry is wrong. You have passed the name of your shell script (/var/www/html/james/crons/s3_downloader.sh) as a parameter to /usr/bin/aws.
You should either call aws s3 sync directly from within the crontab entry, or call your shell script (and make the shell script call aws s3 sync), but you're trying to do both.
So, change the crontab entry to execute the shell script (and make sure that the shell script is actually executable).
00 5 * * * /var/www/html/james/crons/s3_downloader.sh
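To make the script executable and give cron a usable environment, a minimal sketch (the chmod is standard; the PATH line and the reuse of AWS_CONFIG_FILE from the question are assumptions about this particular server) might look like:
chmod +x /var/www/html/james/crons/s3_downloader.sh
And inside s3_downloader.sh:
#!/bin/bash
# Cron runs with a minimal environment, so set PATH explicitly (assumed locations).
export PATH=/usr/local/bin:/usr/bin:/bin
# Point the CLI at root's config file, since the job runs from root's crontab (assumption).
export AWS_CONFIG_FILE="/root/.aws/config"
aws s3 sync "s3://my_bucket/my_folder/" "/var/www/html/james/downloads/my_folder/"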

#!/bin/sh
cd /path/to/your/scripts   # change this to the folder that holds your scripts
/usr/local/bin/aws ec2 stop-instances --instance-ids <instance-id>
Use the full path to aws (/usr/local/bin/aws) and it works.
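A matching crontab entry (the script name, schedule, and log path are illustrative assumptions) could look like:
00 22 * * * /path/to/your/scripts/stop_instance.sh >> /var/log/stop_instance.log 2>&1
Redirecting stdout and stderr to a log file also makes it much easier to see why a job failed under cron.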

Related

AWS CLI command doesn't work from crontab

I have a CentOS 7.9 machine with the AWS CLI installed on it. I can successfully run this command on its own: aws s3 sync s3://bucketname /data/bucketname/ but when I try to run it via crontab, it doesn't work. All I see in /var/log/cron is that the command ran, but there is no data in the /data/bucketname directory.
This is what my crontab looks like:
*/5 * * * * sh /root/script.sh
And this is what my script looks like:
#!/bin/bash
PATH=/sbin:/bin:/usr/sbin:/usr/bin:/opt/aws/bin
SHELL=/bin/bash
/bin/aws s3 sync s3://bucketname /data/bucketname/
which aws returns /bin/aws, so I tried adding that to my script.sh file, but it did not make a difference. Any help would be much appreciated.
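Since /var/log/cron only records that the job started, a useful next step is to capture the script's own output and make sure the CLI can find credentials under cron. A minimal sketch, assuming the log path and the location of root's credential file:
*/5 * * * * /bin/bash /root/script.sh >> /var/log/s3sync.log 2>&1
And in /root/script.sh:
#!/bin/bash
PATH=/sbin:/bin:/usr/sbin:/usr/bin:/opt/aws/bin
# Be explicit about where the credentials live (assumption: they were created for root).
export AWS_SHARED_CREDENTIALS_FILE=/root/.aws/credentials
/bin/aws s3 sync s3://bucketname /data/bucketname/
Any error message from the CLI will then show up in /var/log/s3sync.log instead of being silently discarded.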

Jenkins - bash: aws: command not found but runs fine from terminal

In the Build Step, I've added Send files or execute commands over SSH -> SSH Publishers -> Exec command. I'm trying to run an aws command to copy a file from EC2 to S3. The same command runs fine when I execute it in the terminal, but via Jenkins it simply returns:
bash: aws: command not found
The command is
cd ~/.local/bin/ && aws s3 cp /home/ec2-user/lambda_test/lambda_function.zip s3://temp-airflow-us/lambda_function.zip
Based on the comments.
The solution was to use the following command:
cd ~/.local/bin/ && ./aws s3 cp /home/ec2-user/lambda_test/lambda_function.zip s3://temp-airflow-us/lambda_function.zip
since aws is not available in the PATH environment variable.
command not found indicates that the aws utility is not on $PATH for the jenkins user.
To confirm, sudo su -l jenkins and then issue the command which aws - this will most likely return no results.
You have two options (both sketched below):
use the full path (likely /usr/local/bin/aws)
add /usr/local/bin to the jenkins user's $PATH
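A sketch of both options, assuming the CLI really is at /usr/local/bin/aws on that host:
# Option 1: call the CLI by its full path in the Exec command
/usr/local/bin/aws s3 cp /home/ec2-user/lambda_test/lambda_function.zip s3://temp-airflow-us/lambda_function.zip
# Option 2: extend PATH at the top of the Exec command, then call it normally
export PATH=$PATH:/usr/local/bin
aws s3 cp /home/ec2-user/lambda_test/lambda_function.zip s3://temp-airflow-us/lambda_function.zip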
I need my Makefile to work on both Linux and Windows, so the accepted answer is not an option for me.
I diagnosed the problem by adding the following to the top of my build script:
whoami
which aws
env|grep PATH
This returned:
root
which: no aws in (/sbin:/bin:/usr/sbin:/usr/bin)
PATH=/sbin:/bin:/usr/sbin:/usr/bin
Bizarrely, the path does not include /usr/local/bin, even though the interactive shell on the Jenkins host includes it. The fix is simple enough: create a symlink on the Jenkins host:
ln -s /usr/local/bin/aws /bin/aws
Now the aws command can be found by scripts running in Jenkins (in /bin).

AWS CLI working in shell but not working in cron job

I tried to automate some jobs through cron on EC2. However, the aws commands failed with a '403 Forbidden' error. The commands worked fine when I ran them in the shell. I tried almost all the solutions on Stack Overflow, but none of them solved the issue.
The test case is simple enough (copying a file from S3 to my EC2 instance):
*/2 * * * * /usr/local/bin/aws s3 cp s3://fraud-training-test/FraudAWS_v1.R /home/ubuntu
Does anyone have a sense of what else I could do to solve this issue? Thanks!
So far, I tried the following solutions:
Installed and configured awscli for both root and my current user;
Put the global path and shell settings at the top of my script, e.g.:
SHELL=/bin/bash
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
Tried scheduling the cron job as my current user and as root but both failed.
Below is the exact error I got from the log:
Subject: /usr/local/bin/aws s3 cp s3://fraud-training-test/FraudAWS_v1.R /r$
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
X-Cron-Env: <PATH=/usr/bin:/usr/local/bin>
X-Cron-Env: <SHELL=/bin/sh>
X-Cron-Env: <HOME=/root>
X-Cron-Env: <LOGNAME=root>
fatal error: An error occurred (403) when calling the HeadObject operation: Forbidden
The problem is probably that the user whose crontab runs the job is not the same user that configured the AWS CLI.
You must match the user that runs crontab -e with the user that ran aws configure.
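If the job has to stay in root's crontab while the credentials were created for another user, a hedged workaround (the ubuntu home directory is an assumption) is to point the CLI at that user's credential and config files explicitly:
*/2 * * * * AWS_SHARED_CREDENTIALS_FILE=/home/ubuntu/.aws/credentials AWS_CONFIG_FILE=/home/ubuntu/.aws/config /usr/local/bin/aws s3 cp s3://fraud-training-test/FraudAWS_v1.R /home/ubuntu
The cleaner fix is still to run aws configure as the same user whose crontab runs the job.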
This worked fine on my Lightsail Bitnami instance.
Check the installed path of the AWS CLI with which aws and run ls -la on it to see the ownership and permissions.
Then schedule the job accordingly, either with crontab -e for the current user or sudo crontab -e for root.
It worked for me on Ubuntu 16.04.

AWS ebextensions crontab error: no such file (but it exists)

I am running an Amazon Web Services Elastic Beanstalk application (web server tier), and I would like to run a script every 3 hours. I've googled around quite a bit, and the answer seems to be to create a ".ebextensions" folder with a .config file that looks something like this:
commands:
  01_cronjobs:
    leader_only: true
    command: "cat .ebextensions/etss.txt > /etc/cron.d/crontab_etss && chmod 644 /etc/cron.d/crontab_etss"
Unfortunately, when I do this, the application fails to deploy with the error:
[Instance: xxx] Command failed on instance. Return code: 1 Output: cat: .ebextensions/etss.txt: No such file or directory. command 01_cronjobs in .ebextensions/config.config failed. For more detail, check /var/log/eb-activity.log using console or EB CLI.
I have an etss.txt file in my .ebextensions folder that looks like this:
0 */3 * * * /usr/bin/php download_etss_tar.php > /dev/null
#blank newline here
I have also tried putting it in the root and referencing it there. Why can't cron find a file that I know exists in the zip? What am I doing wrong here?
Furthermore, I also tried using the full path, to no avail:
Command failed on instance. Return code: 1 Output: cat: /var/www/html/.ebextensions/etss.txt: No such file or directory. command 01_cronjobs in .ebextensions/etss.config failed. For more detail, check /var/log/eb-activity.log using console or EB CLI.
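One thing worth checking is when the file is referenced: plain commands in an .ebextensions .config run before the application bundle has been unpacked into its final location, while container_commands run from the staging directory of the extracted bundle, where a relative path like .ebextensions/etss.txt does resolve. A hedged variant of the config under that assumption:
container_commands:
  01_cronjobs:
    leader_only: true
    command: "cat .ebextensions/etss.txt > /etc/cron.d/crontab_etss && chmod 644 /etc/cron.d/crontab_etss"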

aws-cli 1.2.10 cron script fails

I have a crontab that fires a PHP script that runs the AWS CLI command "aws ec2 create-snapshot".
When I run the script via the command line, the PHP script completes successfully, with the aws command returning a JSON string to PHP. But when I set up a crontab to run the PHP script, the aws command doesn't return anything.
The crontab runs as the same user I use when I run the PHP script on the command line myself, so I am a bit stumped.
I had the same problem running a Ruby script (ruby script.rb).
I replaced ruby with its full path (/sources/ruby-2.0.0-p195/ruby) and it worked.
In your case, replace "aws" with its full path. To find it:
find / -name "aws"
The reason it's necessary to specify the full path to the aws command is that cron runs by default with a very limited environment. I ran into this problem as well, and debugged it by adding this to the cron script:
set | sort > /tmp/environment.txt
I then ran the script via cron and via the command line (renaming the environment file between runs) and compared them. This showed me that I needed to set both the PATH and the AWS_DEFAULT_REGION environment variables. After doing this, the script worked just fine.
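A minimal sketch of what that looks like at the top of the cron script (the PATH value and region are illustrative assumptions; use whatever your own environment dump showed):
#!/bin/bash
# Cron's default environment is minimal, so recreate the variables that mattered.
export PATH=/usr/local/bin:/usr/bin:/bin
export AWS_DEFAULT_REGION=us-east-1
# Dump the environment for comparison with an interactive run (as described above).
set | sort > /tmp/environment.txt
# The volume id is a placeholder.
/usr/local/bin/aws ec2 create-snapshot --volume-id vol-xxxxxxxx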