It seems that Scrapy can't see the database, because item.save() returns an error: django.db.utils.DatabaseError: no such table: myapp_myitem. It's not a problem with the Django model, because I can add/edit it from the Django admin without errors.
items.py
from scrapy.contrib.djangoitem import DjangoItem
from myapp.models import Myitem

class MyitemItem(DjangoItem):
    django_model = Myitem
pipelines.py
from myapp.models import Myitem
from items.models import License, Category, Special

class MyitemPipeline(object):
    def process_item(self, item, spider):
        item.save()
        return item
settings.py
ITEM_PIPELINES = {
    'scrapyproject.pipelines.MyitemPipeline': 1000,
}

import sys
sys.path.append('path_to_project')

import os
os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings.local'
my_django_project/settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'db/local_db',
        'USER': 'me',
    }
}
I'll answer my own question now that I've fixed the problem, since it's a trivial issue that cost me a whole day. Maybe it will be useful to someone.
After I changed NAME to the absolute path of the sqlite3 database, Scrapy stored data successfully. So Scrapy needs the absolute path to the sqlite3 database, and switching NAME to an absolute path does not affect Django's functionality.
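For reference, a minimal sketch of what the fixed setting might look like, assuming the usual BASE_DIR pattern from a Django settings file (the db/local_db path is just this project's example, not a general default):

import os

# Build an absolute path so Django and Scrapy resolve the same database file
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db', 'local_db'),
        'USER': 'me',
    }
}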
I have deployed my project to PythonAnywhere.
It is working locally.
But on PythonAnywhere I am getting a 'no such table' exception.
I have configured SQLite as in this link.
It just mentioned generating the SQLite file by running migrations.
I have also changed settings.py to use os.path.join in the DATABASES section, but I still get the same issue.
Exception Type: ProgrammingError
Exception Value:
(1146, "Table 'todo_todo' doesn't exist")
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': BASE_DIR / 'db.sqlite3',
    }
}
I tried with os.path.join as well, but I get the same error.
My models.py:
from django.db import models
from django.contrib.auth.models import User

# Create your models here.
class Todo(models.Model):
    title = models.CharField(max_length=100)
    memo = models.TextField(blank=True)
    created = models.DateTimeField(auto_now_add=True)
    datecompleted = models.DateTimeField(null=True, blank=True)
    important = models.BooleanField(default=False)
    user = models.ForeignKey(User, on_delete=models.CASCADE)

    def __str__(self):
        return self.title
I also made migrations for individual apps:
python manage.py makemigrations appname
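For what it's worth, a point this often hinges on (stated here as an assumption, since the post doesn't show it): makemigrations only writes migration files, and the tables are only created once migrate is run against the database the deployed site actually uses. A minimal sketch of the two-step sequence:

python manage.py makemigrations appname   # writes the migration files
python manage.py migrate                  # applies them and creates the tables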
I'm new to Django and I'm trying to connect to my Heroku database from my locally hosted Django app.
I've followed the instructions on the Heroku website, but it seems Django won't understand that I'm trying to access an external database.
I get this error:
relation "myAppName_user" does not exist
LINE 1: INSERT INTO "myAppName_user" ("title", "content") VALUES ('Beat...
However, I never created, nor intended to create, a table named myAppName_user.
I'm just trying to access the table user in my Heroku Postgres DB, but I don't know why it uses myAppName_user.
I only explicitly added myAppName to INSTALLED_APPS because Django throws an error otherwise.
My DATABASES config:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    }
}

DATABASES['default'] = dj_database_url.config(default='postgres://xxxxx')
I want to test my database from my localhost. I can do this easily with Node.js, but I can't find the solution in Django.
Any suggestions?
EDIT
from __future__ import unicode_literals
from django.db import models

class Fruits(models.Model):
    title = models.CharField(max_length=100)
    content = models.TextField()

    def __str__(self):
        return self.title
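As background (general Django behavior, not something stated in this post): Django names tables <app_label>_<model name>, which is where myAppName_user comes from. If the Heroku database already contains a table literally named user, a model can be pointed at it with Meta options. A sketch, with hypothetical field names:

from django.db import models

class User(models.Model):
    title = models.CharField(max_length=100)
    content = models.TextField()

    class Meta:
        db_table = 'user'    # map to the existing table instead of myAppName_user
        managed = False      # keep migrations from trying to create or alter it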
I'm trying to set up a models file in Django 1.9 using the new JSONField. I have found examples using Postgres but none with MySQL. In the examples with Postgres they do:
from django.contrib.postgres.fields import JSONField
How do I go about importing it for MySQL?
Thanks
UPDATE: Django 3.1 now supports JSONField natively for multiple databases: https://docs.djangoproject.com/en/dev/releases/3.1/#jsonfield-for-all-supported-database-backends
As stated in other answers, Django's native JSONField (as of 1.9 and 1.10) is for PostgreSQL.
Luckily, MySQL 5.7.8+ comes with a native JSON data type. You can add it to your Django project with the django-mysql package and Django 1.8+:
pip install django-mysql
from django.db import models
from django_mysql.models import JSONField

class MyModel(models.Model):
    my_json_field = JSONField()
Read more about the django_mysql JSONField here.
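A quick usage sketch for illustration (using the MyModel example above; the data values are made up):

# JSON values round-trip as ordinary Python dicts/lists
obj = MyModel.objects.create(my_json_field={'name': 'apple', 'tags': ['fruit', 'red']})
obj.refresh_from_db()
print(obj.my_json_field['name'])  # -> 'apple'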
Django's built-in JSONField is Postgres-only:
https://docs.djangoproject.com/en/3.0/ref/contrib/postgres/fields/#django.contrib.postgres.fields.JSONField
UPDATE:
There is support for MySQL via the third-party library django-mysql.
# Install the jsonfield package
pip install jsonfield

# Define my model
from django.db import models
import jsonfield

class MyModel(models.Model):
    the_json = jsonfield.JSONField()

More detail: https://pypi.python.org/pypi/django-jsonfield
I know this question is about Django 1.9, but JSONField can now be used with all supported database backends with the release of Django 3.1.
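To illustrate, a minimal sketch of the native field on Django 3.1+ (the model name is made up):

from django.db import models

class MyModel(models.Model):
    # models.JSONField works on all database backends supported by Django 3.1+
    data = models.JSONField(default=dict)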
I'm trying to save data from this model to a Postgres DB on my local machine:
models.py:
from django.db import models
from django import forms
from inputData.models import Input
from django.contrib.postgres.fields import JSONField

class Results(models.Model):
    generator = models.OneToOneField(Input, on_delete=models.CASCADE, primary_key=True)
    pvalues = JSONField()
views.py
def result(req, res_id):
    try:
        inp = Input.objects.get(pk=res_id)
        path = os.path.join(BASE_DIR, 'uploads\\' + str(res_id) + '\\t.txt')
        p_values = parse_res.main(path)
        res = Results(generator=inp, pvalues=p_values)
        res.save(using='results')
    except Results.DoesNotExist:
        raise Http404
    return render(req, 'result.html', {'res': res})
settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    },
    'results': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'results',
        'PASSWORD': 'password',
        'USER': 'user',
        'HOST': '127.0.0.1',
        'PORT': '8000'
    }
}
The Results model (see models.py) uses a JSONField that holds about 200 bytes of data.
But at the res.save(...) line in views.py, the browser waits too long for a response.
What's wrong with the JSON?
What problems could there be on the server besides caching?
For now I'd recommend using jsonfield2, or waiting for native JSON support for all database backends in Django 3.
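If going the jsonfield2 route, usage would presumably look something like this (as far as I know the package installs under the jsonfield import name; the model shown is just illustrative):

pip install jsonfield2

from django.db import models
from jsonfield import JSONField

class Results(models.Model):
    # Stored as serialized JSON text, so it works on any backend
    pvalues = JSONField()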
I want to import an app's model in settings.py for PUSH_NOTIFICATIONS_SETTINGS in django-push-notifications. This is my code:
INSTALLED_APPS = (
    ....
    'my_app',
    'push_notifications',
    ....
)

from my_app.models import User

PUSH_NOTIFICATIONS_SETTINGS = {
    'GCM_API_KEY': 'xxxxx',
    'APNS_CERTIFICATE': 'xxxxx.pem',
    'USER_MODEL': User,  # I want to change the default from auth_user to my_app's User
}
But it raises an error on this line:
from my_app.models import User
The error is:
django.core.exceptions.AppRegistryNotReady: Apps aren't loaded yet.
How can I load the my_app model in settings.py?
You cannot load models from inside your settings file like that - models can only be loaded once all apps are loaded (which can only happen after the settings have been loaded).
Looking at the code in django-push-notifications, you should be able to provide the model as a string with a dotted path:
'USER_MODEL': 'my_app.User'
You can't load the User model in settings, but you can reference it by name instead:
PUSH_NOTIFICATIONS_SETTINGS = {
    'GCM_API_KEY': 'xxxxx',
    'APNS_CERTIFICATE': 'xxxxx.pem',
    'USER_MODEL': 'my_app.User',
}
And resolve it later like this:
from django.apps import apps
from django.conf import settings
User = apps.get_model(settings.PUSH_NOTIFICATIONS_SETTINGS['USER_MODEL'])
Then you can do whatever you want with this User model.
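For instance, a trivial illustrative use (nothing specific to django-push-notifications):

# 'User' here is the class returned by apps.get_model() above
first_user = User.objects.first()  # ordinary ORM access on the resolved model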
I'm trying to make a minimal working example for Django. This should be a single file which allows models to be defined and instantiated in an in-memory database, and which can then be used for Stack Overflow questions. If possible, I'd like it to use the Django unittest framework, because this would make it simple to demonstrate problems with model behaviour.
There are many questions on Stack Overflow which would benefit from having something like this.
What I've managed so far:
# Django Minimal Working Example
# (intended) usage: save as `mwe.py` and run `python mwe.py`

# Settings
from django.conf import settings
settings.configure(DATABASES = {
    'default': {'ENGINE': 'django.db.backends.sqlite3', 'NAME': ':memory:'}
})

# Models
# with help from http://softwaremaniacs.org/blog/2011/01/07/django-micro-framework/en/
from django.db import models
import sys
sys.modules['mwe.mwe'] = sys.modules[__name__]
sys.modules[__name__].__name__ = 'mwe.mwe'
__package__ = 'mwe.mwe'

class Book(models.Model):
    isbn = models.IntegerField()

# Make a database and do things to it
from django.utils import unittest
from django.test import TestCase

class TestCreateObjects(TestCase):
    def setUp(self):
        book = Book.objects.create(isbn='9780470467244')

    def test_sanity(self):
        self.assertEqual(Book.objects.count(), 1)

unittest.main()
This script gets as far as running the unittest, but raises the error django.db.utils.DatabaseError: no such table: mwe_book.
Edit: I've also tried replacing the lines from `from django.utils import unittest` onwards with:
from django.core.management import call_command
call_command('syncdb', interactive=False)
book = Book.objects.create(isbn='9780470467244')
This gives the following feedback, before raising the same DatabaseError as above:
Creating tables ...
Installing custom SQL ...
Installing indexes ...
Installed 0 object(s) from 0 fixture(s)
Summary: thanks to help from Pavel this is now working. For reference, here is a compact version of the script minus the extraneous unittest scaffolding:
# Django Minimal Working Example
# usage: save as `mwe.py` and run `python mwe.py`

# Setup django with an in-memory sqlite3 database
# Thanks to Pavel Anossov http://stackoverflow.com/a/15824108/188595
import sys
from django.conf import settings
from django.core.management import call_command

settings.configure(
    DATABASES = {
        'default': {'ENGINE': 'django.db.backends.sqlite3',
                    'NAME': ':memory:'}
    },
    INSTALLED_APPS = ('mwe',),
)

from django.db import models

sys.modules['mwe.models'] = sys.modules['mwe.mwe'] = sys.modules['mwe'] = sys.modules[__name__]
sys.modules[__name__].__name__ = __package__ = 'mwe.mwe'

# YOUR MODEL DEFINITIONS HERE
class Book(models.Model):
    isbn = models.IntegerField()

# The call_command line has to appear after all model definitions
call_command('syncdb', interactive=False)

# YOUR CODE HERE
Book.objects.create(isbn='9780470467244')
assert Book.objects.count() == 1
What you were missing is actually installing an app:
settings.configure(
    DATABASES = {
        'default': {'ENGINE': 'django.db.backends.sqlite3',
                    'NAME': ':memory:'}
    },
    INSTALLED_APPS = ('mwe',),
)
This means this module will also be imported as mwe.models by syncdb and as mwe by translation:
sys.modules['mwe.models'] = sys.modules[__name__]
sys.modules['mwe'] = sys.modules[__name__]
My final version:
from django.conf import settings
import sys

settings.configure(
    DATABASES = {
        'default': {'ENGINE': 'django.db.backends.sqlite3',
                    'NAME': ':memory:'}
    },
    INSTALLED_APPS = ('mwe',),
)

from django.core.management import call_command
from django.utils import unittest
from django.db import models
from django.test import TestCase

sys.modules['mwe.models'] = sys.modules[__name__]
sys.modules['mwe.mwe'] = sys.modules[__name__]
sys.modules['mwe'] = sys.modules[__name__]
sys.modules[__name__].__name__ = 'mwe.mwe'
__package__ = 'mwe.mwe'

class Book(models.Model):
    isbn = models.IntegerField()

class TestCreateObjects(TestCase):
    def setUp(self):
        book = Book.objects.create(isbn='9780470467244')

    def test_sanity(self):
        self.assertEqual(Book.objects.count(), 1)

call_command('syncdb', interactive=False)
unittest.main()
$ python mwe.py
Creating tables ...
Creating table mwe_book
Installing custom SQL ...
Installing indexes ...
Installed 0 object(s) from 0 fixture(s)
.
----------------------------------------------------------------------
Ran 1 test in 0.002s
OK
I'm sure this is broken in some horrible ways and is dependent on django version, but we'll have to try and see.
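One caveat for anyone running this on newer Django versions (an adaptation I haven't verified against this exact script): syncdb was removed in later releases, and the app registry now has to be initialised explicitly, so the equivalent calls would look roughly like this.

import django
django.setup()  # required on Django 1.7+ after settings.configure(), before defining models

# ... model definitions as above ...

# replaces call_command('syncdb', interactive=False)
call_command('migrate', run_syncdb=True, interactive=False)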