Sphinx autodoc not importing modules - django

I am writing some documentation for a Django project using Sphinx.
My project (Django 1.4) looks like this:
/funproject
    /documentation        # this is where the Sphinx files live
        /source
            conf.py
            ...
    /funproject
        __init__.py
        settings.py
        admin.py
        models.py
        /app1
        /app2
    fabfile.py
    manage.py
I have Sphinx set up in conf.py:
import sys, os
sys.path.append(os.path.join(os.path.dirname(__file__), '../..'))
os.environ['DJANGO_SETTINGS_MODULE'] = 'funproject.settings'
from funproject import settings
from django.core.management import setup_environ
setup_environ(settings)
So in my Sphinx source files I can do this:
.. automodule:: app1.models
   :members:
which works fine. However, these two examples do not work:
1.

.. automodule:: funproject.models
   :members:

2.

.. automodule:: fabfile
   :members:

The second works if I move fabfile.py into app1 and use app1.fabfile.
I'm guessing I have some issue with my conf.py; I have tried many variations, but I cannot import my fabfile.py unless it is under an app directory (though under /funproject it doesn't work either).

You might try using Fabric's fabfile directory style; perhaps automodule doesn't pick up single-file modules. Docs on this other style are here.
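If it's a path problem instead, a minimal conf.py sketch that puts both the project root (for fabfile and the funproject package) and the inner package directory (for app1 and app2) on sys.path might look like this; the relative paths are assumptions based on the layout above:

import os
import sys

# conf.py lives in /funproject/documentation/source, so two levels up is the project root
ROOT = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..'))
sys.path.insert(0, ROOT)                              # makes fabfile and funproject importable
sys.path.insert(0, os.path.join(ROOT, 'funproject'))  # makes app1 and app2 importable

os.environ['DJANGO_SETTINGS_MODULE'] = 'funproject.settings'

from django.core.management import setup_environ  # Django 1.4-era API
from funproject import settings
setup_environ(settings)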

Wagtail settings only use .dev

I do not understand how to switch between production and dev settings. The Wagtail docs do not cover it, and the only Wagtail tutorial I can find mentions it and then skips over it. There is a settings package:
--| settings
----| __init__.py
----| base.py
----| dev.py
----| production.py
----| .env
My __init__.py:
import os
from os.path import join, dirname
from dotenv import load_dotenv

dotenv_path = join(dirname(__file__), '.env')
load_dotenv(dotenv_path)

ENV = os.environ.get('AMSS_ENV')

if ENV == 'dev':
    from .dev import *
elif ENV == 'prod':
    from .production import *
AMSS_ENV is set to 'prod'. I also have the DJANGO_SETTINGS_MODULE variable set to production in the .env from an earlier attempt. Does the __init__.py not run first? Is my logic broken? I get no errors and everything works, but it loads dev every time. I've tried many other things and it still behaves this way. Can someone tell me what I am supposed to do, or where I can look?
It is always useful to check wsgi.py and manage.py to see which settings module they point to. It is very easy to accidentally serve or run commands with the wrong settings file by forgetting about these two files.
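For reference, these are the lines to check; the project package name below is a placeholder taken from the default Wagtail project template, so adjust it to yours:

# manage.py (and, similarly, wsgi.py) -- illustrative sketch, the package name is an assumption
import os
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings.dev")
# If either file points straight at settings.dev like this, Django reads its configuration
# from dev.py regardless of what settings/__init__.py imports. Pointing it at
# "myproject.settings" instead lets the AMSS_ENV logic in __init__.py pick dev or production.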

Django: how to set the path for the environ variable "DJANGO_SETTINGS_MODULE"

In Django, I used to write populating scripts and put them in the project root directory. For example,
mysite/
    mysite/
    manage.py
    populateA.py
The first few lines of populateA.py:
import os
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')
import django
django.setup()
...
As there are more and more populating scripts, I would like to move them to another package populate:
mysite/
    mysite/
    manage.py
    populate/
        __init__.py
        populateA.py
        populateB.py
        populateC.py
        ...
However, when I run a populating script (python populateA.py), I get the error ImportError: No module named 'mysite'. How do I properly set the path for DJANGO_SETTINGS_MODULE?
Since populate is already a package, run its scripts as modules from the project root:
python -m populate.populateA
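If you would rather keep invoking the scripts directly with python populateA.py, an alternative (my own suggestion, not part of the original setup) is to put the project root on sys.path at the top of each script before the Django imports; a minimal sketch assuming the layout above:

# populate/populateA.py
import os
import sys

# add the outer mysite/ directory (the one containing manage.py) to sys.path
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')

import django
django.setup()

# ... populate the database here using your app's models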

Scrapy project can't find django.core.management

I'm trying to follow the method here to scrape data from the web with Scrapy and simultaneously save it directly to my Django database using Scrapy's item pipeline.
However, when I try to run scrapy crawl spidername, I'm getting the error:
ImportError: No module named django.core.management
At first I thought it was because my Scrapy project was outside of my Django project folder, but even after I moved the whole project into my Django project folder I kept getting the same error. If I open a python shell inside the Scrapy project folder in its new location (inside my Django project folder), import django.core.management works fine. So what's going on?
EDIT: Some additional info: I'm doing this on a Webfaction server, and the path to my Django project is /home/gchorn/webapps/django_app/django_project. I'm using Django version 1.4.1, Scrapy 0.16 and Python2.7. The layout of the Django+Scrapy project is as follows:
django_project/
    __init__.py
    manage.py
    settings.py
    urls.py
    myproject        # folder containing wsgi.py
    app1
    app2
    app3
    templates
    ScrapyProject/
        scrapy.cfg
        ScrapyProject/
            __init__.py
            items.py
            pipelines.py
            settings.py
            spiders/
                __init__.py
                my_spider.py
Try setting this at the top of your Scrapy project's settings.py:
import os
import sys
sys.path.append('/home/gchorn/webapps/django_app')
os.environ['DJANGO_SETTINGS_MODULE'] = 'django_project.settings'
Then you can import your model classes like:
from django_project.app1.models import some_model
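To actually save scraped data, the item pipeline then only needs to map item fields onto a model. A rough sketch, where DjangoSavePipeline, SomeModel and the field names are all made-up placeholders:

# ScrapyProject/pipelines.py -- illustrative only
from django_project.app1.models import SomeModel  # hypothetical model

class DjangoSavePipeline(object):
    def process_item(self, item, spider):
        # map scraped fields to model fields; the names here are assumptions
        SomeModel.objects.create(title=item['title'], url=item['url'])
        return item

Remember to list the pipeline class in ITEM_PIPELINES in the Scrapy settings so it actually runs.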

Django ImportError for models.py

(Updating my question with more information.)
My django app is running fine on my dev server.
I have a view that pulls from the database using the below line that works fine:
from myapp.models import MyTable
However, if I add the same from/import line to another module I'm writing (problem_module.py in the structure below), where I want to pull from the sqlite3 database, I get this error:
raise ImportError("Could not import settings '%s' (Is it on sys.path?): %s" % (self.SETTINGS_MODULE, e))
ImportError: Could not import settings 'myfolder.settings' (Is it on sys.path?): No module named myfolder.settings
I've read and tried the various solutions recommended when people get this error, but I must be missing something because I'm unable to resolve it.
I'm using Django 1.4 and have the recommended layout:
mysite/
    manage.py
    mysite/
        __init__.py
        settings.py
        urls.py
        wsgi.py
    myapp/
        __init__.py
        models.py
        admin.py
        views.py
    indevelopment/
        __init__.py
        problem_module.py
I figured out what was happening and why after going through the traceback carefully and looking at the django source code. Here is what happens.
When you run:
python manage.py runserver
the environment variable gets set properly, assuming you already edited this one small line or simply never had to, because Django 1.4 configures it for you:
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myapp.settings")
However, because this os.environ call sits under an
if __name__ == "__main__":
block, it only runs when you call that file directly, as you do with:
python manage.py runserver
Otherwise, if you are running a file that needs that environment variable (say, testing a module in Eclipse), os.environ needs to be set somewhere else (the shell, etc.).
All the answers that I got generally pointed to this, but I needed the context.
As a little adjustment (yes, not a good idea, since it couples your project to Django's source) you can also hardcode it manually in django/conf/__init__.py.
Specifically, to see where it happens, the change below works:
# in module: django/conf/__init__.py
class LazySettings(LazyObject):
    def _setup(self):
        try:
            # Comment out the read of os.environ and hardcode your settings module.
            # WARNING: bad practice to do this.
            # settings_module = os.environ[ENVIRONMENT_VARIABLE]
            settings_module = "myapp.settings"
Have you changed/set DJANGO_SETTINGS_MODULE?
Try export DJANGO_SETTINGS_MODULE=mysite.settings and start your dev server.
modify your manage.py:
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")
PyCharm sometimes overrides DJANGO_SETTINGS_MODULE with an empty string. Try debugging your manage.py and see if the variable really changes after the setdefault() call.
If it's not set, either change the PyCharm settings, use os.environ['DJANGO_SETTINGS_MODULE'] = 'my_settings', or hack the files under .idea/; in this case .idea/workspace.xml contains
env name="DJANGO_SETTINGS_MODULE" value=""

djangorecipe test command

I've got a question about testing my Django applications in a Django project built with buildout.
First, I've got the same project set up without buildout and everything works fine. That project follows the standard Django project architecture, apart from putting my tests in their own directory:
django_project/
    manage.py
    settings.py
    urls.py
    app1/
        models.py
        views.py
        urls.py
        tests/
    app2/
    ...
If I run the tests in this setup, everything works great!
This is all a little bit different when it comes to the buildout environment. There I've tried to check my project and applications out of svn and make the paths to all of them available in my bin directory (and I hope that also means making them available for whatever magic djangorecipe is doing). Anyway, this is what my buildout looks like:
[buildout]
parts =
    django_project
    app1
    app2
    django
extra-paths =
    ${buildout:directory}
    ${buildout:directory}/parts
    ${buildout:directory}/parts/django_project
    ${buildout:directory}/parts/app1
    ${buildout:directory}/parts/app2

[django_project]
recipe = infrae.subversion
urls =
    https://svn/django_projects/trunk/ .

[app1]
recipe = infrae.subversion
urls =
    https://svn/path/app1/trunk/ .

[app2]
recipe = infrae.subversion
urls =
    https://svn/path/app2/trunk/ .

[django]
recipe = djangorecipe
version = 1.2
project = django_project
projectegg = django_project
wsgi = true
settings = settings
extra-paths = ${buildout:extra-paths}
test =
    app1
    app2
When I run the buildout I get the following directory tree.
django_buildout/
    ...
    bin/
        django
        django.wsgi
        test
    parts/
        django/
        django_project/
            __init__.py
            settings.py
            ...
        app1/
            setup.py
            app1/
                __init__.py
                tests/
        app2/
            setup.py
            app2/
                __init__.py
                tests/
I can get to the Django shell, so that works. But if I run ./bin/test I get 'ImportError: No module named django.project.urls'. (The dot notation is not a mistake; my django_project.settings.URL_CONF = django_project.urls.) This is interesting because if I start the shell I can import django_project.urls. In addition, if I run the tests through ./bin/django test app1, all the tests run. app1 is interesting because it has no tests on views.
In the view tests I'm using urlresolvers.reverse, and that also shows up in the traceback along with the django.test Client() class.
Is there something in my architecture that is messed up, or is urlresolvers.reverse doing something I'm not aware of?
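For reference, the failing view tests look roughly like this (the app, URL name and assertion are illustrative placeholders, not my real code):

# app1/tests/test_views.py -- illustrative sketch only
from django.core import urlresolvers
from django.test import TestCase

class HomeViewTest(TestCase):
    def test_home_page_renders(self):
        # reverse() has to import the project URL conf (django_project.urls here)
        url = urlresolvers.reverse('home')
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)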
Many thanks,
Todd