I have a Django project which requires Celery tasks to run for two of its apps. I was wondering if there was a way to have a single celery.py file which runs both of them.
File tree:
src
|-- api
|   |-- __init__.py
|   |-- otherFiles
|   |-- celery
|   |-- celery.py
|
|-- xyz
|   |-- __init__.py
|   |-- otherFiles
|   |-- celery
|   |-- celery.py
|
|-- otherApps
If I were to run the two Celery apps manually, I would run:
source venv/bin/activate
cd src
celery worker -A api
And in another terminal:
source venv/bin/activate
cd src
celery worker -A xyz
The two celery workers require access to the same databases.
This is what my celery.py file looks like:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "api.settings")
app = Celery("api")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
The other replaces app = Celery('api') with app = Celery('xyz').
Is this the correct way to handle this? Can I create a single script and run both tasks from it?
If not, what's the best way to daemonize this?
The production server is running Ubuntu 18.04 and Python 3.6.9, so I'm planning on using systemd, although other suggestions are welcome too.
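For context, this is roughly what I imagine a single shared Celery module could look like, assuming both apps are listed in INSTALLED_APPS of the same api.settings (a sketch I haven't tested; the file name celery_app.py is made up):
# src/celery_app.py -- one Celery app for the whole project (sketch)
from __future__ import absolute_import, unicode_literals
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "api.settings")

app = Celery("src")
app.config_from_object("django.conf:settings", namespace="CELERY")

# autodiscover_tasks() walks INSTALLED_APPS, so a tasks.py in both the
# api and xyz apps would be picked up by this single worker.
app.autodiscover_tasks()
A single worker could then be started with celery worker -A celery_app from src.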
Related
I'm deploying my first Django app on Heroku. After deploying, Heroku doesn't launch my app. After checking heroku logs --tail
I got this error:
ModuleNotFoundError: No module named 'tabele.wsgi'
My procfile looks like:
web: gunicorn tabele.wsgi:application --log-file -
I also tried:
web: gunicorn tabele:app
web: gunicorn tabele.wsgi --log-file -
I'm a beginner and I don't understand whether the name in the Procfile (in my case 'tabele') should be the folder containing manage.py or a different one.
In my project, the folder containing manage.py and the one containing settings.py and wsgi.py have the same name, 'tabele'.
Could you explain what each part of the Procfile means, for better understanding?
Any idea what I'm doing wrong?
djangoherokuapp
|-- tabele/
| |--- __init__.py
| |--- settings.py
| |--- urls.py
| |--- wsgi.py
|----- manage.py
|------Procfile ⬅⬅⬅
|------requirements.txt
|----- app/
| |--- admin.py
| |--- apps.py
| |--- __init__.py
| |--- models.py
| |--- tests.py
| |--- views.py
Add a Procfile in the project root directory to define process types
and explicitly declare what command should be executed to start your
app.
Open the Procfile and add the line below:
web: gunicorn tabele.wsgi --log-file -
--log-file - means "log to stdout". The --log-file flag lets you
set a path to a log file, and - means "stdout" (in this context).
Or try with:
web: gunicorn tabele.wsgi
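For reference, tabele/wsgi.py should already look roughly like this (the default file django-admin startproject generates); gunicorn has to be able to import it as tabele.wsgi from the directory containing manage.py, and without a ":application" suffix it looks for a module-level callable named application:
# tabele/wsgi.py
import os

from django.core.wsgi import get_wsgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "tabele.settings")

# "application" is the callable gunicorn loads by default when the
# Procfile only names the module (tabele.wsgi).
application = get_wsgi_application()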
I read quite a few posts about this but still no solution...
I have a docker-compose project with, among other, a django service that I build.
On my prod environment, it is using gunicorn + nginx. All fine, working as expected.
However, on my dev environment I am using only manage.py runserver, and here the troubles begin. Somehow, manage.py uses an old version of my settings.py that has since been deleted. In my specific case, runserver is looking for a local MySQL db, which doesn't exist because it is in another container.
So, it's the same settings.py for both gunicorn and manage.py; why does it work in one and not the other?
My project structure:
mysite
|_ django_mysite/
| |_ __init__.py
| |_ settings.py
| |_ urls.py
| |_ wsgi.py
|_ myapp/
| |...
|_ static/
| |...
|_ manage.py
|_ uwsgi_params
My manage.py:
#!/usr/bin/env python
import os
import sys
if __name__ == "__main__":
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "django_mysite.settings")
    try:
        from django.core.management import execute_from_command_line
    except ImportError:
        # The above import may fail for some other reason. Ensure that the
        # issue is really that Django is missing to avoid masking other
        # exceptions on Python 2.
        try:
            import django
        except ImportError:
            raise ImportError(
                "Couldn't import Django. Are you sure it's installed and "
                "available on your PYTHONPATH environment variable? Did you "
                "forget to activate a virtual environment?"
            )
        raise
    execute_from_command_line(sys.argv)
My wsgi.py:
"""
WSGI config for django_mysite project.
It exposes the WSGI callable as a module-level variable named ``application``.
For more information on this file, see
https://docs.djangoproject.com/en/1.11/howto/deployment/wsgi/
"""
import os
import sys
from django.core.wsgi import get_wsgi_application
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
sys.path.append(BASE_DIR)
os.environ['DJANGO_SETTINGS_MODULE'] = 'django_mysite.settings'
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "django_mysite.settings")
application = get_wsgi_application()
My Dockerfile (in case it is useful):
FROM alpine:3.7
MAINTAINER XXX XXX
# Dependencies
RUN rm -rf /var/cache/apk/* \
&& rm -rf /tmp/* \
&& apk update \
&& apk --no-cache add python py-pip build-base gettext libxslt-dev jpeg-dev \
&& mkdir -p /data/web
WORKDIR /data/web
# Django mysite requirements
COPY settings /data/web/
RUN apk --no-cache add python-dev mysql-client mysql-dev \
&& pip install --no-cache-dir -r requirements.txt \
&& apk del -r python-dev mysql
# Pull mysite code
# Change settings.py and requirements.txt
RUN apk --no-cache add git \
&& git clone -b xx_xxxx https://github.com/XXX/xxx \
&& apk del -r git \
&& rm /data/web/mysite/requirements.txt \
&& rm /data/web/mysite/django_mysite/settings.py \
&& mv requirements.txt mysite/requirements.txt \
&& mv settings.py mysite/django_mysite/settings.tmp.py \
&& mv settings.build.py mysite/django_mysite/settings.py
# Collect static django files and replace right settings.py file
WORKDIR /data/web/mysite
RUN python /data/web/mysite/manage.py collectstatic --no-input \
&& rm django_mysite/settings.py \
&& mv django_mysite/settings.tmp.py django_mysite/settings.py
If I go into the django container and run "manage.py diffsettings", I see only the old settings I used for the collectstatic in my build.
However if I directly check the settings.py file from within the container I see the right settings.py.
On the dev environment, my compose file launches runserver via the command:
/usr/bin/python manage.py runserver 0.0.0.0:8000
and I get the following issue:
'Can\'t connect to local MySQL server through socket \'/run/mysqld/mysqld.sock\' (2 "No such file or directory")'
(makes sense: the Django container doesn't have MySQL; it's the MySQL container that should be referenced as the host)
On my prod:
/usr/bin/gunicorn django_mysite.wsgi:application -w 2 -b :8000
Everything works perfectly. Gunicorn in the Django container talks to the MySQL container and the nginx container.
Any idea? Could it be related to Docker layers?
Thanks!
Can't tell from the files you shared, but this seems to be a recurring issue with applications in docker. Here are some options you can try:
1) Do you have different settings files for the prod and dev databases? Check whether the dev database has a "HOST" option. If it doesn't, it's 95% sure it will point to localhost (and therefore give you trouble on the dev machine); see the settings sketch after this list.
2) Do you have a database container for the dev machine? Try to connect with the mysql client from the application container to the database container. You can use docker exec -it container_id /bin/bash (you'll have to adapt since it's Alpine) to attach to the app container.
3) Is the database container running, but you are not able to connect to it? Check whether the container's port is open so the database can be reached.
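For example, here is a dev settings sketch where the MySQL host is the docker-compose service name instead of localhost (the service name "db" and the credentials are placeholders):
# settings.py (dev) -- sketch; "db" is assumed to be the compose service name
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "mysite",
        "USER": "mysite",
        "PASSWORD": "change-me",
        "HOST": "db",    # the compose service name, not localhost or a socket
        "PORT": "3306",
    }
}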
I have two packages (directories) in my Python project
src
|-- textmining/
|   |-- mining.py      # contains: def mining(): ...
|   |-- __init__.py    # contains: __all__ = ["mining"]
|-- crawler/
|   |-- crawler.py
|-- main.py
|-- __init__.py

In crawler.py I use the mining function:
mining = mining()

My main.py is as follows:
import subprocess

# crawling_level is assumed to be defined elsewhere in main.py
scrapy_command = 'scrapy runspider {spider_name} -a crawling_level="{param_1}"'.format(
    spider_name='crawler/crawler.py', param_1=crawling_level)
process = subprocess.Popen(scrapy_command, shell=True)
When I run the crawler, it prompts:
runspider: error: Unable to load 'Crawler.py': cannot import name mining
You need an __init__.py in every folder that's part of the same package.
src
__init__.py
/textmining
__init__.py
mining.py
/crawler
__init__.py
crawler.py
For simplicity, you should add a main.py in the src folder and call the function you want to start your program with from there, as it's fairly difficult to import modules from sibling directories if you start your script in a non-root directory.
main.py
from crawler import crawler
crawler.start_function()
crawler.py
from src.textmining.mining import mining
miner = mining()
Without turning everything into a Python package, you'd have to add the folder to the path in the current script or __init__.py before importing:
# In crawler.py
import sys
import os
sys.path.append(os.path.abspath('../textmining'))
import mining
However, messing around with the path requires you to keep track of what you've done and may not be what you want.
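As a variant of the path approach (a sketch, assuming main.py is run with src as the working directory; the crawling level "1" is a placeholder), you can hand the scrapy subprocess a PYTHONPATH instead of editing sys.path inside the spider:
# main.py -- sketch
import os
import subprocess

env = os.environ.copy()
# Put the src directory on the import path of the scrapy subprocess so
# that "from textmining.mining import mining" resolves inside crawler.py.
env["PYTHONPATH"] = os.getcwd() + os.pathsep + env.get("PYTHONPATH", "")

scrapy_command = 'scrapy runspider crawler/crawler.py -a crawling_level="1"'
process = subprocess.Popen(scrapy_command, shell=True, env=env)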
I use Django 1.10 with uWSGI and nginx on Ubuntu 16.04 and deploy my app with Ansible. My project doesn't have the default structure, but a fairly common one (thanks to Two Scoops for this :).
I use split dev and production settings and a config folder instead of the 'project name' folder. It looks like this:
|-- config
| |-- __init__.py
| |-- settings
| | |-- __init__.py
| | |-- base.py
| | `-- dev.py
| |-- urls.py
| |-- wsgi_dev.py
| `-- wsgi_production.py
|-- manage.py
`-- requirements.txt
My production.py is generated by Ansible with encrypted secrets and lives in config/settings.
With this config I get "no python application found, check your startup logs". uWSGI doesn't see my application.
({{ }} is Jinja2 syntax for Ansible)
/etc/uwsgi/sites/{{ project_name }}
[uwsgi]
chdir = {{ django_root }}
home = /home/{{ project_user }}/venvs/{{ project_name }}
module = config.wsgi_production:application
master = true
processes = 5
socket = /run/uwsgi/{{ project_name }}.sock
chown-socket = {{ project_user }}:www-data
chmod-socket = 660
vacuum = true
After several weeks I found the problem in my wsgi.py. A common solution is to use os.environ['ENV'] to pick the DJANGO_SETTINGS_MODULE, but with different users and permissions it doesn't work.
If you use something like this in your wsgi.py file:
os.environ["DJANGO_SETTINGS_MODULE"] = "config.settings." + os.environ["ENV"]
and have the "no python application found" problem, split your wsgi file. I found that os.environ["ENV"] returns an empty string. I added it for all my users, used source, etc., but uWSGI in emperor mode doesn't see it.
You should use wsgi_dev.py and wsgi_production.py, where you write something like os.environ["DJANGO_SETTINGS_MODULE"] = "config.settings.production". It's not so elegant, but it solves this problem fine.
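A minimal wsgi_production.py along those lines might look like this (a sketch following the layout above):
# config/wsgi_production.py
import os

from django.core.wsgi import get_wsgi_application

# Hard-code the production settings module; no ENV lookup needed.
os.environ["DJANGO_SETTINGS_MODULE"] = "config.settings.production"

application = get_wsgi_application()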
If you still want a single wsgi.py that switches between the two, you can write something like this:
import os

from django.core.wsgi import get_wsgi_application

# Environment variables are strings (e.g. DEV=true), so compare against
# a string value rather than the boolean True.
if os.environ.get('DEV') == 'true':
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.dev")
else:
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.production")

application = get_wsgi_application()
Since Django 1.4 (I think), Django creates a folder for my project when I start a project. Django adds a folder for any application I create (with python manage.py startapp) at the same level as my project folder.
Project_name
|---project_name_dir/
|---application_dir/
`---manage.py
I really like the following folder structure:
Project_name
|---project_name_dir/
| |---application_dir/
| | |-- __init__.py
| | |-- models.py
| | |-- tests.py
| | `-- views.py
| |-- __init__.py
| |-- settings.py
| |-- urls.py
| |-- wsgi.py
| |---templates/
| | `---application_dir/
| `---static/
| |---css/
| |---font/
| |---img/
| `---js/
|---deployment/
|---documentation/
|---config/
`---manage.py
This way I have a folder with all my Django files (project_name_dir/) and other directories for non-Django files.
So why does Django put applications at the same level as my project folder?
In Django, the position of the application directory doesn't matter; Django only uses the name of the application.
Thus, where the application lives is basically a matter of the programmer's convenience.
This is also the reason why two apps should not have the same name: even if they are listed in INSTALLED_APPS as
('app.app1', 'app1')
Django only looks at the last part after the dot, i.e. app1.
So, in the end, you can use whatever directory structure you want, as long as the apps' names don't collide and you point to the apps in INSTALLED_APPS. Because of this, if there isn't any special reason to do otherwise, you should put them in the project's root, like Django does.
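For example (a sketch; the names mirror the nested layout above), an app that lives inside the project package is simply listed by its dotted path:
# settings.py -- "project_name_dir.application_dir" is the nested app from the tree above
INSTALLED_APPS = [
    "django.contrib.admin",
    "django.contrib.auth",
    "django.contrib.contenttypes",
    "django.contrib.sessions",
    "django.contrib.messages",
    "django.contrib.staticfiles",
    # the nested app, referenced by its import path rather than its location on disk
    "project_name_dir.application_dir",
]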