I have a Laravel project set up on AWS Elastic Beanstalk (Amazon Linux 2/3.0.1) and I'm trying to set up a cron job to run a command.
I ran nano crontab -e and wrote
* * * * * cd /var/app/current && php artisan command:initcrawl
to run every minute as a test, but it doesn't work.
When I cd into /var/app/current and run php artisan command:initcrawl manually, it works without any problem.
My problem was that I wrote 'nano' in front of the crontab command, so I was editing an ordinary local file instead of the real crontab. Instead of prefixing the command with 'nano' to use the nano editor, I should have run export EDITOR='/usr/bin/nano'; after that, crontab -e opens the actual crontab in nano and everything works fine.
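For reference, the working setup looks roughly like this (paths taken from the question; the log redirection is an optional addition I'd suggest so failures are visible somewhere):

```shell
# Make crontab -e open in nano instead of the default editor
export EDITOR=/usr/bin/nano
crontab -e

# Then add this line in the editor (the >> redirection is optional):
# * * * * * cd /var/app/current && php artisan command:initcrawl >> /tmp/initcrawl.log 2>&1
```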
I have written a bash script that activates a virtualenv and runs my custom management command in Django. I want to run the bash script every day at midnight.
Bash Script :
cd ~
cd path_to_virtualenv/
source virtualenv_name/bin/activate
cd path_to_project/
python manage.py custom_command
deactivate
When I run this script using . or source, it runs perfectly. I have configured crontab to run this bash script (for testing, I set it to execute every minute), but I am not getting the desired output.
crontab -e
*/1 * * * * source /path_to_bash_script/bash_script_filename
Add a shebang to your shell script (e.g. #!/bin/bash)
Remove source from the crontab entry
Add an extension to your script file (e.g. .sh)
I changed the script to:
#!/bin/bash
source /path_to_virtualenv/bin/activate
python /path_to_project/manage.py custom_command
In crontab :
*/1 * * * * bash /path_to_bash_script/script.sh
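An equivalent setup that skips the wrapper script entirely is to call the virtualenv's interpreter directly from crontab. A sketch, reusing the question's placeholder paths:

```shell
# The venv's python already sees the right site-packages, so no activation
# step is needed; redirecting output makes cron failures visible in a log
*/1 * * * * /path_to_virtualenv/bin/python /path_to_project/manage.py custom_command >> /tmp/custom_command.log 2>&1
```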
I have a CentOS 7.9 machine with the AWS CLI installed. I can successfully run this command on its own: aws s3 sync s3://bucketname /data/bucketname/, but when I try to run it via crontab, it doesn't work. All I see in /var/log/cron is that the command ran, but there is no data in the /data/bucketname directory.
This is what my crontab looks like:
*/5 * * * * sh /root/script.sh
And this is what my script looks like:
#!/bin/bash
PATH=/sbin:/bin:/usr/sbin:/usr/bin:/opt/aws/bin
SHELL=/bin/bash
/bin/aws s3 sync s3://bucketname /data/bucketname/
which aws returns /bin/aws, so I used that full path in my script.sh file, but it did not make a difference. Any help would be much appreciated.
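One likely culprit is cron's minimal environment: aws reads its credentials from $HOME/.aws, and cron may not set HOME the way a login shell does. You can get a rough feel for what cron sees by clearing the environment with env -i (an approximation, not an exact reproduction of cron):

```shell
# env -i strips inherited variables, roughly simulating how cron starts jobs;
# compare PATH and HOME here with what a login shell shows
env -i /bin/sh -c 'echo "PATH is: [$PATH]"; echo "HOME is: [$HOME]"' > /tmp/cron_env_check.txt
cat /tmp/cron_env_check.txt
```

If HOME comes up empty, adding export HOME=/root (or wherever the configured ~/.aws actually lives) at the top of script.sh is worth trying.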
I have been trying for a long time to create a periodic task in Django, but there are a lot of version constraints and no clear explanation.
I recommend Celery. What is Celery?
Celery supports scheduling tasks. Check this doc
First of all, you want to create a management command following this guide.
https://docs.djangoproject.com/en/2.1/howto/custom-management-commands/
Say we want to run the closepoll command in the example every 5 minutes.
You'll then need to create a script to run this command.
Linux / macOS:
#!/bin/bash -e
cd path/to/your/django/project
source venv/bin/activate # if you use venv
python manage.py closepoll # optionally append '>> /path/to/log 2>&1' to log the results
Store the file as run_closepoll.sh, then run chmod +x run_closepoll.sh on the command line.
Now we can use crontab to run our command
run crontab -e in your command line
add this line:
*/5 * * * * /path/to/run_closepoll.sh
Now the command will run every 5 minutes.
If you're not familiar with crontab, you can use this website
https://crontab-generator.org/
Windows:
Same content as the example above, but remove the first line and save it as run_closepoll.bat.
In your start menu, search for Task Scheduler, follow the instructions on the GUI, it should be pretty simple from there.
For more info about the Task Scheduler, see here: https://learn.microsoft.com/en-us/windows/desktop/taskschd/using-the-task-scheduler
This blog explains clearly
https://medium.com/@yehandjoe/celery-4-periodic-task-in-django-9f6b5a8c21c7
I'm using django-cron and it works as expected. The only caveat is that you have to set up a cron job on the Linux system to run the command python manage.py runcrons.
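That system cron entry might look like the following (every five minutes as an example; the paths are assumptions, since they depend on where your project and virtualenv live):

```shell
# Let django-cron check which of its registered jobs are due
*/5 * * * * /path/to/venv/bin/python /path/to/project/manage.py runcrons >> /tmp/runcrons.log 2>&1
```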
I wrote a bash script to create a scheduled task and run a Django app. I am using Git Bash as my terminal, and I have been able to run the commands in the snippet below manually with success. However, when I run the bash script containing these same commands, the scheduled task and the Django app are never run. Why is there a discrepancy in behavior, and how can I correct it in my bash script?
#!/usr/bin/env bash
# Create scheduled task
echo schtasks //create //tn my-task //tr '"python app/manage.py loaddata /resources/output.json"' //sc daily //st 09:30 //ri 60 //et 16:00
# Run app
echo python app/manage.py runserver
echo "TERMINATED"
$SHELL
For a bash script to run from a CMD session (triggered by the Windows scheduler), you would need:
a script named git-xxx (replace xxx with a name of your choosing)
that script in your Windows PATH (as well as Git itself)
a scheduled task running git xxx (note the space)
That would run git-xxx in a Git Bash session.
The other option would be, still from a CMD (or a scheduled task) to run:
bash -c "/c/path/to/your/script"
In both instances, make sure bash is not the one from WSL, if you are on Windows 10 and have activated that feature.
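As a concrete sketch of the git-xxx approach (all names here are made up for illustration, and a real wrapper would go in a directory on the Windows PATH, not /tmp):

```shell
# Create a wrapper named git-mytask; Git runs any executable named
# git-<name> found on PATH when you type "git <name>"
mkdir -p /tmp/git-wrappers
cat > /tmp/git-wrappers/git-mytask <<'EOF'
#!/bin/bash
echo "running mytask"
EOF
chmod +x /tmp/git-wrappers/git-mytask

# Invoke the wrapper directly to check it works
PATH="/tmp/git-wrappers:$PATH" git-mytask
```

Once the wrapper directory is on PATH, a scheduled task can simply run git mytask.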
I want to run my development Django server at startup, so I defined the following cron job:
@reboot screen -d -m django-admin.py runserver 192.168.0.28:8000
But it didn't work.
What is really interesting, when I copy/paste directly to terminal and execute it works just fine.
I even tried something like this:
@reboot cd /home/ubuntu && /usr/bin/screen -d -m /usr/bin/python /usr/local/bin/django-admin.py runserver 192.168.0.28:8000 &> /home/ubuntu/cron.err
To be sure I'm not using some command that is undefined or in the wrong location, I examined the contents of the cron.err file, but it's empty.
And (of course) when I fire this directly from the console it works immediately.
Please help.
Does it work if you try to run it from cron at a specific time? E.g.:
50 12 2 8 * /usr/bin/screen -dmS set_from_cron
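If the @reboot entry itself is the issue, one common cause is cron firing before the network or other services are up. A hedged variant (paths from the question; the delay and the session name are assumptions) postpones the start and names the screen session so it's easy to find with screen -ls:

```shell
# Crontab entry: wait 30s after boot, then start the dev server in a
# named, detached screen session
@reboot sleep 30 && cd /home/ubuntu && /usr/bin/screen -dmS django /usr/bin/python /usr/local/bin/django-admin.py runserver 192.168.0.28:8000
```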