Can't send email in management command run by cron - django

I have a strange problem with a Django management command I am running via cron.
I have a production server set up to use Mailgun, and a management command that simply sends an email:
from django.core.mail import send_mail
from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = 'Send email'

    def handle(self, *args, **options):
        send_mail('Test email', 'Test content', 'noreply@example.com', ['me@example.com'], fail_silently=False)
This script works perfectly if I run it via the command line (I'm using virtualenvwrapper):
> workon myapp
> python manage.py do_command
or directly:
> /home/user/.venvs/project/bin/python /home/user/project/manage.py do_command
But when I set it up with cron (crontab -e):
*/1 * * * * /home/user/.venvs/project/bin/python /home/user/project/manage.py do_command
The script runs (without error), but the email isn't sent.
What could be going on?

OK, the issue was that the wrong DJANGO_SETTINGS_MODULE environment variable was set, and there were a few things throwing me off the scent:
My manage.py script defaults to the "development" version of my settings, settings.local, which uses the console email backend. Cron suppresses all output, so I never saw the email being printed instead of sent.
Secondly, I was testing in a shell that already had DJANGO_SETTINGS_MODULE set to settings.production, so the script appeared to run correctly when I ran it on the command line.
The fix is easy: add DJANGO_SETTINGS_MODULE to the crontab:
DJANGO_SETTINGS_MODULE=config.settings.production
*/1 * * * * ...
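For reference, the split that caused this looks roughly like the following (a minimal sketch; the module paths follow the config.settings layout above, everything else is assumed):
# config/settings/local.py - development: mail is only printed to stdout,
# and cron discards that output
EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'

# config/settings/production.py - production: actually sends mail (e.g. SMTP to Mailgun)
EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'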

Related

django fabric define multiple host with password

Let's say I have a list of hosts to provide:
env.hosts = ['host1', 'host2', 'host3']
env.password = ['password1', 'password2', 'password3']
It's not working for me. I don't want to be prompted for a password for each host; I want to set the password for every host up front, so it can deploy my site without asking for passwords.
How can I do that?
Your best option is to do this:
Note that the password keys need to be user@location:port, otherwise it won't work.
fabfile.py
from fabric.api import env, task, run

@task
def environments():
    env.hosts = ['user1@10.99.0.2', 'user2@10.99.0.2', 'user3@10.99.0.2']
    env.passwords = {'user1@10.99.0.2:22': 'pass1', 'user2@10.99.0.2:22': 'pass2', 'user3@10.99.0.2:22': 'pass3'}

@task
def echo():
    run('whoami')
and then to test:
$ fab environments echo
[user1@10.99.0.2] Executing task 'echo'
[user1@10.99.0.2] run: whoami
[user1@10.99.0.2] out: user1
[user1@10.99.0.2] out:
[user2@10.99.0.2] Executing task 'echo'
[user2@10.99.0.2] run: whoami
[user2@10.99.0.2] out: user2
[user2@10.99.0.2] out:
[user3@10.99.0.2] Executing task 'echo'
[user3@10.99.0.2] run: whoami
[user3@10.99.0.2] out: user3
[user3@10.99.0.2] out:
Done.
Disconnecting from user2@10.99.0.2... done.
Disconnecting from user1@10.99.0.2... done.
Disconnecting from user3@10.99.0.2... done.

cron job 'django-cron' not running in ubuntu cron tab

I am using django_cron to schedule a job. When I run python manage.py runcrons it works fine, but after adding the job to the Ubuntu crontab it does not execute.
My settings.py is:
CRON_CLASSES = [
    "home.cron.HomeCronJob",
]

FAILED_RUNS_CRONJOB_EMAIL_PREFIX = []

INSTALLED_APPS = (
    'django.contrib.auth',
    '..................'
    'django_cron',
)
My cron.py file is:
from django_cron import CronJobBase, Schedule
from home.management.commands.auto_renueva import republishAds

class HomeCronJob(CronJobBase):
    RUN_EVERY_MINS = 2
    MIN_NUM_FAILURES = 2
    schedule = Schedule(run_every_mins=RUN_EVERY_MINS)
    code = 'home.home_cron_job'

    def do(self):
        republishAds()
Then I created a shell script to run this job, cron.sh:
#! /bin/bash
source /home/cis/ENV/muna/bin/activate
python /home/cis/DjangoLive/Newmunda/mund2anuncios/manage.py runcrons
deactivate
And the line I have added to the Ubuntu cron file is:
*/1 * * * * /home/cis/DjangoLive/Newmunda/mund2anuncios/crons.sh >> /home/cis/Desktop/crons.log 2>> /home/cis/Desktop/cron_errors.log
Please suggest what I am doing wrong here.
Thanks in advance.
As a guess,
python /home/cis/DjangoLive/Newmunda/mund2anuncios/manage.py runcrons
will fail because PATH is not set in the cron environment. You should use the full path to the Python interpreter.
Another common error in cron jobs is a missing execute permission on the script. Normally cron errors are emailed to root, so you should find more information about the errors in root's mailbox.
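Putting that together, a sketch of what the wrapper script could be reduced to (the interpreter path is an assumption based on the virtualenv activated in cron.sh):
#!/bin/bash
# cron runs with a minimal PATH, so use the virtualenv's interpreter explicitly
/home/cis/ENV/muna/bin/python /home/cis/DjangoLive/Newmunda/mund2anuncios/manage.py runcrons
Also make sure the script is executable (chmod +x).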

How to call task properly?

I configured django-celery in my application. This is my task:
from celery.decorators import task
import simplejson as json
import requests
import logging

logger = logging.getLogger(__name__)  # logger was not defined in the original; standard logging assumed

@task
def call_api(sid):
    try:
        results = requests.put(
            'http://localhost:8000/api/v1/sids/' + str(sid) + "/",
            data={'active': '1'}
        )
        json_response = json.loads(results.text)
    except Exception, e:
        print e
    logger.info('Finished call_api')
When I add in my view:
call_api.apply_async(
    (instance.service.id,),
    eta=instance.date
)
celeryd shows me:
Got task from broker: my_app.tasks.call_api[755d50fd-0f0f-4861-9a18-7f4e4563290a]
Task my_app.tasks.call_api[755d50fd-0f0f-4861-9a18-7f4e4563290a] succeeded in 0.00513911247253s: None
so it should be good, but nothing happens... There is no call to, for example:
http://localhost:8000/api/v1/sids/1/
What am I doing wrong?
Are you running celery as a separate process?
For example in Ubuntu run using the command
sudo python manage.py celeryd
Until you run celery (or django-celery) as a separate process, the jobs will just be stored in the queue (or whatever persistence mechanism you have configured, generally in settings.py).
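You can observe that queuing behaviour through the AsyncResult object that apply_async returns; a minimal sketch, with the task imported from the my_app.tasks module named in the question's log output:
from my_app.tasks import call_api

result = call_api.apply_async((1,))  # queued on the broker, not yet executed
print result.state  # 'PENDING' until a worker picks it up
# ...start celeryd in another shell...
print result.state  # 'SUCCESS' after a worker has run it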

Running periodic tasks with django and celery

I'm trying to create a simple background periodic task using the Django-Celery-RabbitMQ combination. I installed Django 1.3.1, downloaded and set up djcelery. Here is how my settings.py file looks:
BROKER_HOST = "127.0.0.1"
BROKER_PORT = 5672
BROKER_VHOST = "/"
BROKER_USER = "guest"
BROKER_PASSWORD = "guest"
....
import djcelery
djcelery.setup_loader()
...
INSTALLED_APPS = (
    'djcelery',
)
And I put a 'tasks.py' file in my application folder with the following contents:
from celery.task import PeriodicTask
from celery.registry import tasks
from datetime import timedelta
from datetime import datetime

class MyTask(PeriodicTask):
    run_every = timedelta(minutes=1)

    def run(self, **kwargs):
        # str() is needed here: concatenating str and datetime raises TypeError
        self.get_logger().info("Time now: " + str(datetime.now()))
        print("Time now: " + str(datetime.now()))

tasks.register(MyTask)
And then I start up my django server (local development instance):
python manage.py runserver
Then I start up the celerybeat process:
python manage.py celerybeat --logfile=<path_to_log_file> -l DEBUG
I can see entries like this in the log:
[2012-04-29 07:50:54,671: DEBUG/MainProcess] tasks.MyTask sent. id->72a5963c-6e15-4fc5-a078-dd26da663323
And I can also see the corresponding entries getting created in the database, but I can't find where it is logging the text I specified in the run function of the MyTask class.
I tried fiddling with the logging settings and tried using the Django logger instead of the celery logger, but to no avail. I'm not even sure my task is getting executed. If I print any debug information in the task, where does it go?
Also, this is the first time I'm working with any type of message queuing system. It looks like the task will get executed as part of the celerybeat process, outside the Django web framework. Will I still be able to access all the Django models I created?
Thanks,
Venkat.
Celerybeat only pushes tasks onto the queue when they are due; it does not execute them. Your task instances are stored on the RabbitMQ server. You need to run the celeryd daemon to actually execute your tasks:
python manage.py celeryd --logfile=<path_to_log_file> -l DEBUG
Also, if you are using RabbitMQ, I recommend installing the RabbitMQ management plugin:
rabbitmq-plugins list
rabbitmq-plugins enable rabbitmq_management
service rabbitmq-server restart
It will be available at http://<host>:55672/ (login: guest, password: guest). There you can check how many tasks are in your RabbitMQ instance.
You should check the RabbitMQ logs, since celery sends the tasks to RabbitMQ and it should execute them. So all the prints of the tasks should be in RabbitMQ logs.

Django timer thread

I would like to compute some information in my Django application on a regular basis.
I need to select and insert data each second, and I want to use the Django ORM.
How can I do this?
In a shell script, set the DJANGO_SETTINGS_MODULE variable and call a Python script:
export DJANGO_SETTINGS_MODULE=yourapp.settings
python compute_some_info.py
In compute_some_info.py, set up Django and import your modules (look at how the manage.py script sets itself up to run Django):
#!/usr/bin/env python
import sys

try:
    import settings  # Assumed to be in the same directory.
except ImportError:
    sys.stderr.write("Error: Can't find the file 'settings.py'")
    sys.exit(1)

sys.path = sys.path + ['/yourapphome']

from yourapp.models import YourModel
YourModel.compute_some_info()
Then call your shell script in a cron job.
Alternatively, since your task runs every second, you can just keep the process running and sleep between iterations; you would still want to be outside of the webserver, in your own process that is set up this way.
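A minimal sketch of that approach, reusing the (assumed) names from the script above:
import time
from yourapp.models import YourModel

# standalone process: loop forever, doing the per-second work via the ORM
while True:
    YourModel.compute_some_info()
    time.sleep(1)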
One way to do it would be to create a custom command, and invoke python manage.py your_custom_command from cron or windows scheduler.
http://docs.djangoproject.com/en/dev/howto/custom-management-commands/
For example, create myapp/management/commands/myapp_task.py which reads:
from django.core.management.base import NoArgsCommand

class Command(NoArgsCommand):
    def handle_noargs(self, **options):
        print 'Doing task...'
        # invoke the functions you need to run on your project here
        print 'Done'
Then you can run it from cron like this:
export DJANGO_SETTINGS_MODULE=myproject.settings; export PYTHONPATH=/path/to/project_parent; python manage.py myapp_task
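As an actual crontab entry that could look like this (the every-minute schedule and the paths are assumptions):
* * * * * export DJANGO_SETTINGS_MODULE=myproject.settings; export PYTHONPATH=/path/to/project_parent; python /path/to/project_parent/myproject/manage.py myapp_task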