I have a working Django project. If I run it, everything works fine.
Here is the structure of the project:
titest_project
    tibrains_app
        tools
            __init__.py
            load_django.py
            countries_add.py
        apps.py
        models.py
load_django.py
import sys
import os
import django
sys.path.append('/mnt/HDD/tibrains/03/titest_project')
os.environ['DJANGO_SETTINGS_MODULE'] = 'titest_project.settings'
django.setup()
countries_add.py
import load_django
from tibrains_app.models import ShopCountry
When I run countries_add.py, it fails with the error "No module named 'decouple'", though when I run the project itself it works without errors.
What could be the problem?
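A quick diagnostic sketch (an assumption on my part: the standalone script may be running under a different interpreter or virtualenv than the one the project uses, which is a common cause of this symptom):
import sys
print(sys.executable)  # which interpreter is actually running countries_add.py?
try:
    import decouple
    print(decouple.__file__)  # where python-decouple was found, if at all
except ImportError as exc:
    print("decouple is not importable here:", exc)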
This is my folder structure:
root:
|--Django_project
|--|-- db_app
|--|-- Django_project
|--|-- manage.py
|--Scrapy_project
|--|--Scrapy_project
|--|--|--spiders
|--|--|--settings.py
|--|--|--pipelines.py
|--|--|--items.py
|--|--|--middlewares.py
|--|--scrapy.cfg
In settings.py I have this:
import sys
import os
sys.path.append(os.path.dirname(os.path.abspath('.')))
os.environ['DJANGO_SETTINGS_MODULE'] = 'Django_project.settings'
import django
django.setup()
I've tried every possible path, including an absolute path to the project root, to the Django project, and to the Django app; nothing works and I get this:
ModuleNotFoundError: No module named 'django'
Thanks for the help!
EDIT:
I should probably clarify that I'm running in a virtual environment.
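For what it's worth, here is a sketch of the same setup that resolves the path from the settings.py file itself rather than from the current working directory (folder names are taken from the structure above). Note that if django is not installed in the environment Scrapy actually runs under (for example Scrapy installed globally but Django only in the virtualenv, or vice versa), no sys.path change will fix "No module named 'django'".
import os
import sys

# Resolve paths from this file's location instead of the current working directory.
# Assumed layout, taken from the question:
#   root/Scrapy_project/Scrapy_project/settings.py  (this file)
#   root/Django_project/Django_project/settings.py  (the Django settings)
SCRAPY_PACKAGE_DIR = os.path.dirname(os.path.abspath(__file__))
REPO_ROOT = os.path.dirname(os.path.dirname(SCRAPY_PACKAGE_DIR))
sys.path.append(os.path.join(REPO_ROOT, 'Django_project'))

os.environ['DJANGO_SETTINGS_MODULE'] = 'Django_project.settings'

import django
django.setup()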
I am trying to run a standalone Django script:
import os, sys, django
proj_path = "/path/to/django-project"
import ipdb; ipdb.set_trace()
# This is so Django knows where to find stuff.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "boiler.settings")
sys.path.append(proj_path)
django.setup()
When I run it, it says:
ImportError: cannot import name 'Celery' from 'celery' (/path/to/django-poject/boiler/celery.py)
My folder structure:
django-poject
-- boiler
-- __init__.py
-- settings.py
-- celery.py
-- manage.py
__init__.py
from .celery import app as celery_app
__all__ = ['celery_app']
celery.py
import os
from celery import Celery
import django
import sys
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'boiler.settings')
# This defines the celery app instance
redis = 'redis://:pass@localhost:6379/0'
app = Celery(dirname,
broker=redis,
backend=redis
)
I am able to run celery using
activate virtualenv
cd to django-poject
celery -A boiler worker --loglevel=debug
without any problems
But when run as a standalone script it creates problems.
You have to rename your celery.py to something else, like django_celery.py, otherwise it won't work. Running the worker through the celery command is fine with the original name, but when you run the module standalone to integrate with Django, the local celery.py shadows the installed celery package, so from celery import Celery imports your own file; as Santhosh said, the absolute import of itself is what's giving you issues.
In your project's __init__.py you'll need something like:
from __future__ import absolute_import, unicode_literals
from your_path_to.django_celery import app as celery_app
__all__ = ('celery_app',)
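For completeness, the renamed module could look roughly like this. It is a sketch built from the code in the question: the 'boiler' app name and Redis URL come from there, while the config_from_object and autodiscover_tasks lines are the usual Django integration steps and an addition of mine.
# django_celery.py (formerly celery.py; renamed so it no longer shadows the
# installed celery package when the standalone script runs from the project dir)
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'boiler.settings')

redis_url = 'redis://:pass@localhost:6379/0'  # placeholder credentials

app = Celery('boiler', broker=redis_url, backend=redis_url)
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
The celery -A boiler worker --loglevel=debug invocation keeps working, because __init__.py still exposes celery_app.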
I have a working Django app that I was able to get functioning on Heroku. The structure is project named 'untitled' and an app named 'web' such that the structure is:
project_root
static
templates
untitled
--->__init__.py
--->settings.py
--->urls.py
--->wsgi.py
web
--->__init__.py
--->admin.py
--->apps.py
--->models.py
--->tests.py
--->urls.py
--->views.py
This is a fairly basic app that I can get working outside of GAE (locally and on Heroku); however, I'm getting stuck on the app.yaml and main.py requirements for GAE.
My app.yaml is:
application: seismic-interpretation-institute-py27
version: 1
runtime: python27
api_version: 1
threadsafe: true
handlers:
- url: /.*
script: main.app
libraries:
- name: django
version: "latest"
and my main.py (generated from PyCharm) is
import os,sys
import django.core.handlers.wsgi
import django.core.signals
import django.db
import django.dispatch.dispatcher
# Google App Engine imports.
from google.appengine.ext.webapp import util
# Force Django to reload its settings.
from django.conf import settings
settings._target = None
os.environ['DJANGO_SETTINGS_MODULE'] = 'untitled.settings'
# Unregister the rollback event handler.
django.dispatch.dispatcher.disconnect(
django.db._rollback_on_exception,
django.core.signals.got_request_exception)
def main():
# Create a Django application for WSGI.
application = django.core.handlers.wsgi.WSGIHandler()
# Run the WSGI CGI handler with that application.
util.run_wsgi_app(application)
if __name__ == '__main__':
main()
Finally, when running locally it seems that the error
ImportError: Settings cannot be imported, because environment variable DJANGO_SETTINGS_MODULE is undefined.
is causing my problems. I am not exactly sure how to fix it.
Try replacing:
from django.conf import settings
settings._target = None
os.environ['DJANGO_SETTINGS_MODULE'] = 'untitled.settings'
with:
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "untitled.settings")
from django.conf import settings
settings._target = None
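Applied to the main.py above, the top of the file would then start like this (same code as in the question, only reordered so the environment variable is set before django.conf is imported):
import os, sys
import django.core.handlers.wsgi
import django.core.signals
import django.db
import django.dispatch.dispatcher

# Google App Engine imports.
from google.appengine.ext.webapp import util

# Tell Django where the settings live *before* django.conf.settings is touched.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "untitled.settings")

# Force Django to reload its settings.
from django.conf import settings
settings._target = None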
I'm in the beginning stages of a Flask application. The problems I'm having is that whenever I attempt to run the application I get:
app/application/__init__.py, line 11, in <module>
from user_auth.views import auth
ImportError: No module named user_auth.views
I have no idea what the problem is. The import of home.views did this as well, then it stopped and worked fine on the local server. I've been trying to figure this out for ages; there aren't that many moving parts in the application yet, so I'm not sure why this is happening. File structure and code below:
|app
|-application
|--__init__.py
|--home
|--user_auth
|----forms.py
|----views.py
|----templates
|----static
My application/__init__.py file:
from flask import Flask
app = Flask(__name__)
app.config.from_object('_config')
from home.views import home
from user_auth.views import auth
app.register_blueprint(home)
app.register_blueprint(auth)
My application/user_auth/views.py
from flask import Blueprint
auth = Blueprint('auth', __name__,
url_prefix='/user_auth',
template_folder='templates',
static_folder='static')
You're missing an __init__.py file under ./user_auth/ to make user_auth a package.
There's more information about modules in the docs.
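In other words, each blueprint folder needs its own __init__.py (an empty file is enough), so the layout from the question would become:
|app
|-application
|--__init__.py
|--home
|----__init__.py   (empty; makes "home" importable)
|----views.py
|--user_auth
|----__init__.py   (empty; makes "user_auth" importable)
|----forms.py
|----views.py
|----templates
|----static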
I'm trying to follow the method here to 'Scrapy' data from the web and simultaneously save that data directly to my Django database using Scrapy's item pipeline.
However, when I try to run scrapy crawl spidername, I'm getting the error:
ImportError: No module named django.core.management
At first I thought it was because my Scrapy project was outside of my Django project folder, but even after I moved the whole project into my Django project folder I kept getting the same error. If I open a python shell inside the Scrapy project folder in its new location (inside my Django project folder), import django.core.management works fine. So what's going on?
EDIT: Some additional info: I'm doing this on a Webfaction server, and the path to my Django project is /home/gchorn/webapps/django_app/django_project. I'm using Django version 1.4.1, Scrapy 0.16 and Python2.7. The layout of the Django+Scrapy project is as follows:
django_project/
__init__.py
manage.py
settings.py
urls.py
myproject #folder containing wsgi.py
app1
app2
app3
templates
ScrapyProject/
scrapy.cfg
ScrapyProject/
__init__.py
items.py
pipelines.py
settings.py
spiders/
__init__.py
my_spider.py
Try setting this in your Scrapy project's settings.py:
import os
import sys
sys.path.append('/home/gchorn/webapps/django_app')
os.environ['DJANGO_SETTINGS_MODULE'] = 'django_project.settings'
Then you can import your model classes like:
from django_project.app1.models import some_model
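Once that import works, a pipeline that writes scraped items into Django might look roughly like the sketch below. The pipeline class name and the assumption that the item's fields match the model's fields are mine, not from the question; on Django 1.7+ you would also call django.setup() after setting DJANGO_SETTINGS_MODULE, but the Django 1.4.1 used here predates it.
# ScrapyProject/pipelines.py -- sketch; relies on the sys.path and
# DJANGO_SETTINGS_MODULE lines above having run via the Scrapy settings.py.
from django_project.app1.models import some_model  # placeholder model from above


class DjangoWriterPipeline(object):
    """Persist each scraped item through the Django ORM."""

    def process_item(self, item, spider):
        # Assumes the model's field names match the item's keys.
        some_model.objects.create(**dict(item))
        return item
Remember to register the pipeline in ITEM_PIPELINES in the Scrapy settings so it actually runs.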