Django 1.9 - JSONField in Models - django

I'm trying to set up a models file in Django 1.9 using the new JSONField. I have found examples using Postgres but none with MySQL. In the Postgres examples they do a
from django.contrib.postgres.fields import JSONField
How do I go about importing it for MySQL?
Thanks

UPDATE: Django 3.1 now supports JSONField natively for multiple databases: https://docs.djangoproject.com/en/dev/releases/3.1/#jsonfield-for-all-supported-database-backends
As stated in other answers, Django's native JSONField (as of 1.9 and 1.10) is for PostgreSQL.
Luckily, MySQL 5.7.8+ comes with a native JSON datatype. You can add it to your Django project with the django-mysql package on Django 1.8+:
pip install django-mysql
from django.db import models
from django_mysql.models import JSONField
class MyModel(models.Model):
    my_json_field = JSONField()
Read more about the django_mysql JSONField here.
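Conceptually, the field serializes Python structures to JSON text before handing them to MySQL's JSON column. A minimal stdlib sketch of that round-trip (an illustration only, not django-mysql's actual code path):

```python
import json

# Sketch of the JSON round-trip a JSON column performs;
# django_mysql's JSONField does this serialization for you on save.
value = {"tags": ["django", "mysql"], "count": 3}
stored = json.dumps(value)     # text handed to the JSON column
restored = json.loads(stored)  # what you get back on load
print(restored == value)       # True: round-trips losslessly
```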

Django's JSONField is Postgres-only.
https://docs.djangoproject.com/en/3.0/ref/contrib/postgres/fields/#django.contrib.postgres.fields.JSONField
UPDATE:
There is support for MySQL via the 3rd-party library django-mysql.

# Install jsonfield package
pip install jsonfield
# Define my model
from django.db import models
import jsonfield
class MyModel(models.Model):
    the_json = jsonfield.JSONField()
More detail: https://pypi.python.org/pypi/django-jsonfield
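One caveat worth knowing with text-backed JSON fields like this: whatever you assign must be JSON-serializable. A small stdlib sketch (not the package's code) of a common failure:

```python
import json
from datetime import datetime

# Plain dicts, lists, strings and numbers serialize fine...
json.dumps({"count": 3})

# ...but a datetime does not, without a custom encoder.
try:
    json.dumps({"when": datetime(2020, 1, 1)})
except TypeError:
    print("datetime is not JSON-serializable")
```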

I know this question is about Django 1.9, but JSONField can now be used with all supported database backends with the release of Django 3.1.

I'm trying to save data of this model to a Postgres DB on my local machine:
models.py:
from django.db import models
from django import forms
from inputData.models import Input
from django.contrib.postgres.fields import JSONField
class Results(models.Model):
    generator = models.OneToOneField(Input, on_delete = models.CASCADE, primary_key = True)
    pvalues = JSONField()
views.py
def result(req, res_id):
    try:
        inp = Input.objects.get(pk = res_id)
        path = os.path.join(BASE_DIR, 'uploads\\' + str(res_id) + '\\t.txt')
        p_values = parse_res.main(path)
        res = Results(generator = inp, pvalues = p_values)
        res.save(using = 'results')
    except Results.DoesNotExist:
        raise Http404
    return render(req, 'result.html', {'res': res})
settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    },
    'results': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'results',
        'PASSWORD': 'password',
        'USER': 'user',
        'HOST': '127.0.0.1',
        'PORT': '8000'
    }
}
The Results model (see models.py) uses a JSONField holding about 200 bytes of data.
But at the res.save(...) line in views.py, the browser waits too long for a response.
What's wrong with the JSON?
What problems could there be on the server besides the cache?

As of today I'd recommend using jsonfield2, or waiting for native JSON support for all database backends in Django 3.

Related

How to configure sqlite for a Django project on PythonAnywhere

I have deployed my project to PythonAnywhere.
It works locally.
But on PythonAnywhere I am getting a no-such-table exception.
I have configured sqlite as in this link, which just mentions generating the sqlite file by running migrations.
I have also changed settings.py to use os.path.join in the DATABASES section, but I still get the same issue.
Exception Type: ProgrammingError
Exception Value:
(1146, "Table 'todo_todo' doesn't exist")
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': BASE_DIR / 'db.sqlite3',
    }
}
I tried with os.path.join also, but I get the same error.
my models.py
from django.db import models
from django.contrib.auth.models import User
# Create your models here.
class Todo(models.Model):
    title = models.CharField(max_length=100)
    memo = models.TextField(blank=True)
    created = models.DateTimeField(auto_now_add=True)
    datecompleted = models.DateTimeField(null=True, blank=True)
    important = models.BooleanField(default=False)
    user = models.ForeignKey(User, on_delete=models.CASCADE)

    def __str__(self):
        return self.title
I migrated individual apps also.
python manage.py makemigrations appname

How to connect to my heroku database from localhost?

I'm new to Django and trying to connect to my Heroku database from my locally hosted Django app.
I've followed the instructions on the Heroku website, but it seems Django won't understand that I'm trying to access an external database.
I get this error:
relation "myAppName_user" does not exist
LINE 1: INSERT INTO "myAppName_user" ("title", "content") VALUES
('Beat...
However, I never created or intended to create a table named myAppName_user.
I'm just trying to access the user table in my Heroku Postgres DB, but I don't know why it tries myAppName_user.
I explicitly added myAppName to the INSTALLED_APPS config, since Django throws an error otherwise.
My DATABASE config :
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    }
}

DATABASES['default'] = dj_database_url.config(default='postgres://xxxxx')
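For reference, dj_database_url essentially parses a URL of that shape into the usual DATABASES keys. A rough stdlib-only sketch of the idea (the real package handles more engines and options; the URL here is a made-up example):

```python
from urllib.parse import urlparse

# Hypothetical DATABASE_URL of the shape Heroku provides
url = urlparse("postgres://user:secret@host.example.com:5432/mydb")
config = {
    "ENGINE": "django.db.backends.postgresql",
    "NAME": url.path.lstrip("/"),   # "mydb"
    "USER": url.username,           # "user"
    "PASSWORD": url.password,       # "secret"
    "HOST": url.hostname,           # "host.example.com"
    "PORT": url.port,               # 5432
}
print(config["NAME"], config["HOST"], config["PORT"])
```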
I want to test my database from my localhost. I do this easily with Node.js, but I can't find the solution in Django.
Any suggestions?
EDIT
from __future__ import unicode_literals
from django.db import models
class Fruits(models.Model):
    title = models.CharField(max_length=100)
    content = models.TextField()

    def __str__(self):
        return self.name

Django OperationalError: missing table; migration does not recognize missing table

I'm having trouble in Django 1.7: I am trying to save a user to a table, but I'm getting an error that the table does not exist.
Here is the code I'm executing:
from django.conf import settings
from django.contrib.auth import BACKEND_SESSION_KEY, SESSION_KEY, get_user_model
User = get_user_model()
from django.contrib.sessions.backends.db import SessionStore
from django.core.management.base import BaseCommand
class Command(BaseCommand):
    def handle(self, email, *_, **__):
        session_key = create_pre_authenticated_session(email)
        self.stdout.write(session_key)

def create_pre_authenticated_session(email):
    user = User.objects.create(email=email)
    session = SessionStore()
    session[SESSION_KEY] = user.pk
    session[BACKEND_SESSION_KEY] = settings.AUTHENTICATION_BACKENDS[0]
    session.save()
    return session.session_key
However, at
user = User.objects.create(email=email)
I get an error message:
django.db.utils.OperationalError: no such table: accounts_user
Here is the user model at accounts/models.py that I'm trying to use to build the table:
from django.db import models
from django.utils import timezone
class User(models.Model):
    email = models.EmailField(primary_key=True)
    last_login = models.DateTimeField(default=timezone.now)
    REQUIRED_FIELDS = ()
    USERNAME_FIELD = 'email'

    def is_authenticated(self):
        return True
I've run sqlmigrate against this migration with 'manage.py sqlmigrate accounts 0001_initial' and I have gotten the correct CREATE TABLE SQL back, but running 'manage.py migrate' gives me the following:
Operations to perform:
  Apply all migrations: sessions, admin, lists, contenttypes, accounts, auth
Running migrations:
  No migrations to apply.
The migration is just the result of running 'makemigrations' from the shell, no custom code. I do see accounts listed in the included applications, but the migration isn't being run. So my site is in an odd spot: Django says the table is missing when I try to use it, but says it exists when I try to run the migration to create it. Why does Django erroneously think that the table already exists when I can look at the database and see that it doesn't?
@user856358 Your comment about the other sqlite file seems like the root cause. I encountered the same error, and it was resolved by removing that file and running another migration. In my case, the file was located as specified in settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, '../database/db.sqlite3'),
    }
}
By removing the .sqlite3 file there, I was able to successfully run the migration and resolve the no-such-table error...
django.db.utils.OperationalError: no such table: accounts_user
$ rm ../database/db.sqlite3
$ python3 manage.py migrate
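When Django and the migration state disagree like this, it can also help to inspect a suspect sqlite file directly and list the tables it actually contains. A small stdlib sketch (the path and table name here are made up for illustration):

```python
import os
import sqlite3
import tempfile

# Throwaway sqlite file standing in for a project's db.sqlite3
path = os.path.join(tempfile.mkdtemp(), "db.sqlite3")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE accounts_user (email TEXT PRIMARY KEY)")

# sqlite_master lists every table that really exists in this file
rows = conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'"
).fetchall()
conn.close()
print([name for (name,) in rows])  # ['accounts_user']
```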

Scrapy pipeline django sqlite3 database error on save

It seems that Scrapy can't see the database, because item.save() raises django.db.utils.DatabaseError: no such table: myapp_myitem. It's not a problem with the Django model, because I can add/edit it from the Django admin without errors.
items.py
from scrapy.contrib.djangoitem import DjangoItem
from myapp.models import Myitem
class MyitemItem(DjangoItem):
    django_model = Myitem
pipelines.py
from myapp.models import Myitem
from items.models import License, Category, Special
class MyitemPipeline(object):
    def process_item(self, item, spider):
        item.save()
        return item
settings.py
ITEM_PIPELINES = {
    'scrapyproject.pipelines.MyitemPipeline': 1000,
}
import sys
sys.path.append('path_to_project')
import os
os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings.local'
my_django_project/settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'db/local_db',
        'USER': 'me',
    }
}
I'll answer my own question, since this trivial problem took me a whole day to fix. Maybe it will be useful to someone.
After I changed the NAME of the database to the absolute path of the sqlite3 database, Scrapy stored the data successfully. So Scrapy needs an absolute path to the sqlite3 database, and changing NAME to an absolute path won't affect Django's functionality.
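The fix can be sketched with the stdlib: resolve the relative NAME to an absolute path before handing it to sqlite (the paths below are throwaway examples, not the project's real ones):

```python
import os
import sqlite3
import tempfile

# Stand-in for the Django project directory
BASE_DIR = tempfile.mkdtemp()

# The relative NAME from settings.py, resolved to an absolute path
relative_name = "db/local_db"
absolute_name = os.path.join(BASE_DIR, relative_name)
os.makedirs(os.path.dirname(absolute_name), exist_ok=True)

# sqlite creates/opens the file at the absolute path, regardless of
# which working directory the crawler process happens to run from
sqlite3.connect(absolute_name).close()
print(os.path.isabs(absolute_name), os.path.exists(absolute_name))
```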

Test boilerplate for Django models: seeking a portable, one-file solution

I'm trying to make a minimal working example for Django. This should be a single file which allows models to be defined and instantiated in an in-memory database, and which can then be used for Stack Overflow questions. If possible, I'd like it to use the django/unittest framework, because this would make it simple to demonstrate problems with model behaviour.
There are many questions on Stack Overflow which would benefit from having something like this.
What I've managed so far:
# Django Minimal Working Example
# (intended) usage: save as `mwe.py` and run `python mwe.py`
# Settings
from django.conf import settings
settings.configure(DATABASES = {
    'default': {'ENGINE': 'django.db.backends.sqlite3', 'NAME': ':memory:'}
})
# Models
# with help from http://softwaremaniacs.org/blog/2011/01/07/django-micro-framework/en/
from django.db import models
import sys
sys.modules['mwe.mwe'] = sys.modules[__name__]
sys.modules[__name__].__name__ = 'mwe.mwe'
__package__ = 'mwe.mwe'
class Book(models.Model):
    isbn = models.IntegerField()
# Make a database and do things to it
from django.utils import unittest
from django.test import TestCase
class TestCreateObjects(TestCase):
    def setUp(self):
        book = Book.objects.create(isbn='9780470467244')

    def test_sanity(self):
        self.assertEqual(Book.objects.count(), 1)
unittest.main()
This script gets as far as running the unittest, but raises the error django.db.utils.DatabaseError: no such table: mwe_book.
Edit: I've also tried replacing the lines starting at from django.utils import unittest with:
from django.core.management import call_command
call_command('syncdb', interactive=False)
book = Book.objects.create(isbn='9780470467244')
This gives the following feedback, before raising the same DatabaseError as above:
Creating tables ...
Installing custom SQL ...
Installing indexes ...
Installed 0 object(s) from 0 fixture(s)
Summary Thanks to help from Pavel this is working. For reference, here is a compact version of the script minus the extraneous unittest stuff:
# Django Minimal Working Example
# usage: save as `mwe.py` and run `python mwe.py`
# Setup django with an in-memory sqlite3 database
# Thanks to Pavel Anossov http://stackoverflow.com/a/15824108/188595
import sys
from django.conf import settings
from django.core.management import call_command
settings.configure(
    DATABASES = {
        'default': {'ENGINE': 'django.db.backends.sqlite3',
                    'NAME': ':memory:'}
    },
    INSTALLED_APPS = ('mwe',),
)
from django.db import models
sys.modules['mwe.models'] = sys.modules['mwe.mwe'] = sys.modules['mwe'] = sys.modules[__name__]
sys.modules[__name__].__name__ = __package__ = 'mwe.mwe'
# YOUR MODEL DEFINITIONS HERE
class Book(models.Model):
    isbn = models.IntegerField()
# The call_command line has to appear after all model definitions
call_command('syncdb', interactive=False)
# YOUR CODE HERE
Book.objects.create(isbn='9780470467244')
assert Book.objects.count() == 1
What you were missing is actually installing an app:
settings.configure(
    DATABASES = {
        'default': {'ENGINE': 'django.db.backends.sqlite3',
                    'NAME': ':memory:'}
    },
    INSTALLED_APPS = ('mwe',),
)
This means this module will also be imported as mwe.models by syncdb and as mwe by translation:
sys.modules['mwe.models'] = sys.modules[__name__]
sys.modules['mwe'] = sys.modules[__name__]
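The sys.modules trick generalizes: registering an object under a name makes later imports of that name resolve to it, without any file on disk. A self-contained sketch (the module name here is invented for illustration):

```python
import sys
import types

# Build a module object by hand and register it under an alias
mod = types.ModuleType("fake_models")
mod.ANSWER = 42
sys.modules["fake_models"] = mod

# A normal import statement now finds the registered object
import fake_models
print(fake_models.ANSWER)  # 42
```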
My final version:
from django.conf import settings
import sys
settings.configure(
    DATABASES = {
        'default': {'ENGINE': 'django.db.backends.sqlite3',
                    'NAME': ':memory:'}
    },
    INSTALLED_APPS = ('mwe',),
)
from django.core.management import call_command
from django.utils import unittest
from django.db import models
from django.test import TestCase
sys.modules['mwe.models'] = sys.modules[__name__]
sys.modules['mwe.mwe'] = sys.modules[__name__]
sys.modules['mwe'] = sys.modules[__name__]
sys.modules[__name__].__name__ = 'mwe.mwe'
__package__ = 'mwe.mwe'
class Book(models.Model):
    isbn = models.IntegerField()

class TestCreateObjects(TestCase):
    def setUp(self):
        book = Book.objects.create(isbn='9780470467244')

    def test_sanity(self):
        self.assertEqual(Book.objects.count(), 1)
call_command('syncdb', interactive=False)
unittest.main()
$ python mwe.py
Creating tables ...
Creating table mwe_book
Installing custom SQL ...
Installing indexes ...
Installed 0 object(s) from 0 fixture(s)
.
----------------------------------------------------------------------
Ran 1 test in 0.002s
OK
I'm sure this is broken in some horrible ways and is dependent on the Django version, but we'll have to try and see.