I had a small proof of concept set up with the development server on a local machine. I'm now trying to move it over to a production server, which I'm using WebFaction for. However, now that I've switched over to Apache from the built-in Django server, I get the following:
ViewDoesNotExist: Could not import orgDisplay.views. Error was: No module named orgDisplay.views
But when I check my orgDisplay app's folder there is a views.py in it. What am I doing wrong? I've tried adding the following to my settings.py at the suggestion of the Django IRC room:
import sys
sys.path.append(r"/home/user/webapps/django_project/myproject/orgDisplay")
which is the path to my app.
Any ideas on how to even begin to troubleshoot this?
Thanks in advance.
I assume you're using mod_wsgi (which is recommended by the Django authors), not mod_python. This is the way you should set up your sys.path:
django.wsgi:
import os, sys
sys.path.append(r"/home/user/webapps/django_project/myproject/")
os.environ["DJANGO_SETTINGS_MODULE"] = "settings"
sys.stdout = sys.stderr # Prevent crashes upon print
import django.core.handlers.wsgi
application = django.core.handlers.wsgi.WSGIHandler()
urls.py:
from django.conf.urls.defaults import *
urlpatterns = (
    ("", include("orgDisplay.urls")),
    # ...
)
orgDisplay/urls.py:
import views
urlpatterns = (
    (r'^some_view/$', views.some_view),  # it is actually orgDisplay.views.some_view
    # many more records ...
)
It is a bad idea to add the app directory itself to the path, since you'll get name conflicts between modules from multiple apps.
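To illustrate the conflict (hypothetical paths, with a second app invented for the example): once app directories are on sys.path directly, their modules become importable as bare top-level names, so two apps that both contain a views.py collide:
import sys

# Hypothetical example: two app directories added to sys.path directly.
sys.path.append("/home/user/webapps/django_project/myproject/orgDisplay")  # contains views.py
sys.path.append("/home/user/webapps/django_project/myproject/otherApp")    # also contains views.py

import views  # resolves to whichever directory comes first on sys.path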
I think you're appending the wrong directory to sys.path: Python ends up looking in the .../myproject/orgDisplay folder for the orgDisplay package. Try removing orgDisplay from your string, like this:
import sys
sys.path.append(r"/home/user/webapps/django_project/myproject")
The other option would be to simply add myproject (or whatever your project is actually called) in the import statement.
# instead of "from orgDisplay import views"
from myproject.orgDisplay import views
Also, make sure to restart Apache after every edit.
Looking at manage.py (Pinax's, in this case), it does it like so:
import sys
from os.path import abspath, dirname, join
from django.core.management import setup_environ
# setup the environment before we start accessing things in the settings.
setup_environ(settings_mod)
sys.path.insert(0, join(PINAX_ROOT, "apps"))
sys.path.insert(0, join(PROJECT_ROOT, "apps"))
Provided your WSGI file is in your project directory, a slightly more flexible way is this:
import os, sys
sys.path.append(os.path.dirname(__file__))
This will enable you to change your project location later without having to modify your WSGI file.
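Putting the two ideas together, a minimal django.wsgi along these lines avoids hard-coding the path at all. This is only a sketch: it assumes the file sits in the project directory next to settings.py and uses the old-style WSGIHandler shown above.
# Hypothetical django.wsgi living in the project directory (next to settings.py).
import os, sys

PROJECT_DIR = os.path.dirname(__file__)
sys.path.append(PROJECT_DIR)                   # so "settings" and the apps import directly
sys.path.append(os.path.dirname(PROJECT_DIR))  # so "myproject.something" imports also work

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "settings")
sys.stdout = sys.stderr  # prevent crashes upon print, as above

import django.core.handlers.wsgi
application = django.core.handlers.wsgi.WSGIHandler()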
In my project, I have a main static folder and a subfolder named static. When I make changes in the subfolder named static (which I specified in STATICFILES_DIRS within the settings file), I save the file and run the collectstatic command.
This successfully applies the changes, but it is really inefficient, as I am constantly changing the CSS and JavaScript files inside my project, which I store as static files.
I browsed the web and came across a solution named WhiteNoise, which is a pip package, but it only worked for a short period of time; after a few times of closing and opening my project folder, it completely stopped working.
Does anybody have another solution to deal with this problem? Thank you.
You can use python-watchdog and write your own Django command:
import time

from django.conf import settings
from django.core.management import call_command
from django.core.management.base import BaseCommand
from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer


class Command(BaseCommand):
    help = "Automatically calls collectstatic when the staticfiles get modified."

    def handle(self, *args, **options):
        event_handler = CollectstaticEventHandler()
        observer = Observer()
        for path in settings.STATICFILES_DIRS:
            observer.schedule(event_handler, path, recursive=True)
        observer.start()
        try:
            while True:
                time.sleep(1)
        finally:
            observer.stop()
            observer.join()


class CollectstaticEventHandler(FileSystemEventHandler):
    def on_moved(self, event):
        super().on_moved(event)
        self._collectstatic()

    def on_created(self, event):
        super().on_created(event)
        self._collectstatic()

    def on_deleted(self, event):
        super().on_deleted(event)
        self._collectstatic()

    def on_modified(self, event):
        super().on_modified(event)
        self._collectstatic()

    def _collectstatic(self):
        call_command("collectstatic", interactive=False)
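Assuming the command lives at something like yourapp/management/commands/watch_static.py (the app and command names here are placeholders), you can then run python manage.py watch_static in a separate terminal while the development server is running, and collectstatic will be re-run automatically on each change.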
You can also use a third-party solution that doesn't belong to Django to monitor your files and run commands when they change.
Take a look at the fswatch utility: Bash Script - fswatch trigger bash function
How to manage static files the "Django way"
Please check https://docs.djangoproject.com/en/3.0/howto/static-files/ for details on the correct Django way to do it :)
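For reference, the usual settings from that page look roughly like this (illustrative values; BASE_DIR is defined at the top of the default settings file):
# settings.py (illustrative values)
import os

STATIC_URL = "/static/"

# Extra locations the staticfiles app searches, in addition to each app's static/ folder
STATICFILES_DIRS = [os.path.join(BASE_DIR, "static")]

# Where collectstatic copies everything for production serving
STATIC_ROOT = os.path.join(BASE_DIR, "staticfiles")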
First of all, set DEBUG = True while working in development.
Then add these lines to your project's urls.py:
from django.conf import settings
from django.views.decorators.cache import cache_control
from django.contrib.staticfiles.views import serve
from django.conf.urls.static import static
if settings.DEBUG:
    urlpatterns += static(settings.STATIC_URL,
                          view=cache_control(no_cache=True, must_revalidate=True)(serve))
Here's an example using Python watchfiles and a Django management command.
# Lives in /my_app/management/commands/watch_files.py
import time

from django.conf import settings
from django.core.management import call_command
from django.core.management.base import BaseCommand
from watchfiles import watch


class Command(BaseCommand):
    help = "Automatically calls collectstatic when the staticfiles get modified."

    def handle(self, *args, **options):
        print('WATCH_STATIC: Static file watchdog started.')
        # for changes in watch([str(x) for x in settings.STATICFILES_DIRS]):
        for changes in watch(*settings.STATICFILES_DIRS):
            print(f'WATCH_STATIC: {changes}', end='')
            call_command("collectstatic", interactive=False)
You can then run the Django management command in the background in whatever script you use to start Django.
python manage.py watch_static &
python manage.py runserver 0.0.0.0:8000
Main folder
|-project1
|-project2
I have the above structure for my Django projects.
When I am in project1, in a script I used os.chdir() to change to project2.
I want to access project2's settings.py and fetch some attributes. Is it possible?
# You need to change into the other project's directory (e.g. project2)
import os

os.chdir("../project2")  # adjust this path so it points at project2
cwd = os.getcwd()
# print the current directory
print("Current directory:", cwd)

# To access the Django settings attributes
import django
from django.conf import settings
You can use all of the above code from one of your Python files, e.g. something.py.
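Note that importing django.conf.settings on its own still gives you whatever DJANGO_SETTINGS_MODULE points at, not project2's values. One way to actually fetch attributes from project2's settings is to import its settings module directly; this is only a sketch, and the paths and module names are assumptions about your layout:
# Hypothetical sketch: read attributes straight from project2's settings module.
# Assumes the "Main folder" from the question is on sys.path and that
# project2/settings.py is importable as "project2.settings".
import importlib
import sys

sys.path.append("/path/to/Main folder")  # adjust to your actual layout

project2_settings = importlib.import_module("project2.settings")
print(project2_settings.INSTALLED_APPS)  # fetch whatever attributes you need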
I've got a Django app that's trying to call a Celery task that will eventually be executed on some remote hosts. The task codebase is completely separate from the Django project, so I'm using celery.execute.send_task and calling it from a post_delete model signal. The code looks a bit like this:
from django.db import models
from django.db.models import signals
import celery.execute


class MyModel(models.Model):
    @staticmethod
    def do_async_thing(sender, instance, **kwargs):
        celery.execute.send_task("tasks.do_my_thing", args=[instance.name])


signals.post_delete.connect(MyModel.do_async_thing, sender=MyModel)
I'm using the latest Django (1.6.1) and Celery 3.1.7, so I understand that I don't need any extra module or app in my Django project for it to be able to talk to Celery. I've set BROKER_URL inside my settings.py to the right URL, amqp://user:password@host/vhost.
When this method fires, I get a Connection Refused error. There's no indication on the celery broker that any connection was attempted - I guess it's not seeing the BROKER_URL configuration and is trying to connect to localhost.
How do I make this work? What extra configuration does send_task need to know where the broker is?
So I discovered the answer, and it was to do with not reading the tutorial (http://docs.celeryproject.org/en/latest/django/first-steps-with-django.html) closely enough.
Specifically, I had the correct celery.py in place which I would have thought should have loaded the settings, but I'd missed the necessary changes to __init__.py in the django project, which wasn't hooking everything together.
My celery.py should be:
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
app = Celery('myproject')
# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
and the __init__.py should be simply:
from __future__ import absolute_import
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
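With that in place, the signal handler from the question can also send the task through the configured app instance, which is guaranteed to pick up BROKER_URL from settings. This is just a sketch; the module and task names follow the question's naming:
# Sketch: send the task via the Celery app configured above.
from myproject.celery import app as celery_app

def do_async_thing(sender, instance, **kwargs):
    celery_app.send_task("tasks.do_my_thing", args=[instance.name])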
I am trying to import the test client in Django, but when I do, I get this weird error:
ImproperlyConfigured: Requested setting DATABASES, but settings are not configured. You must either define the environment variable DJANGO_SETTINGS_MODULE or call settings.configure() before accessing settings.
from django.utils import unittest
from django.utils import simplejson as json
from django.test.client import Client
This is how I imported the client so that I could use it for testing. Can someone explain this to me, please?
Try this:
import os
import sys
sys.path.append('/home/username/www/site_folder')
os.environ['DJANGO_SETTINGS_MODULE'] = 'project.settings'
from django.utils import unittest
from django.utils import simplejson as json
from django.test.client import Client
But replace project with the name of the folder where your settings.py is.
The Client is looking for your settings.py. You could simply load the client from a Django shell, started by typing this in your project folder:
python manage.py shell
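For example, inside that shell you can exercise the client directly (a quick illustration, not from the original answer):
# Inside `python manage.py shell`
from django.test.client import Client

c = Client()
response = c.get("/")        # request any URL of your project
print(response.status_code)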
In PyCharm, which I use, my problem was solved after following Running Django tests in PyCharm.
It's under File > Settings > Django Support; then select the right settings.
I have a Django model with some static methods. I'd like to call the methods from outside the application (cronjob).
The model I have:
from django.db import models


class Job(models.Model):
    # irrelevant fields omitted

    @staticmethod
    def methodIwantToCall():
        # statements
        pass
I have the following Python file that I'm using for the cron job:
#!/usr/bin/python
# -*- coding: UTF-8 -*-
import os
os.environ['DJANGO_SETTINGS_MODULE'] = 'settings'
from myapp.models import Job
Job.methodIwantToCall()
At first, I was getting an error about DJANGO_SETTINGS_MODULE not being set, and I fixed that; however, now I have the following error: No module named myapp.utils
I feel like I'm doing something that I'm not supposed to do. So how do I call that static method the way I want it to be called?
EDIT: It looks like the paths are getting messed up when I'm importing from outside Django. For example, I have an import in my models file; when I run the cron file, it fails with ImportError: No module named myapp.utils, even though the same import works from within Django.
The proper solution is to create a custom manage.py command.
Assuming your cron job code resides in the same directory as your settings file, use the following setup code at the beginning:
from django.core.management import setup_environ
import settings
setup_environ(settings)
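For the custom-command route mentioned above, a minimal sketch could look like this (the file path and command name are placeholders):
# myapp/management/commands/run_job_method.py  (hypothetical path and name)
from django.core.management.base import BaseCommand

from myapp.models import Job


class Command(BaseCommand):
    help = "Calls Job.methodIwantToCall(), e.g. from a cron job."

    def handle(self, *args, **options):
        Job.methodIwantToCall()
The cron entry then just runs python manage.py run_job_method, and manage.py takes care of sys.path and DJANGO_SETTINGS_MODULE for you.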