Django unable to save a text value in DB

I'm reading an e-mail content through IMAP in my Django app.
When I try to assign some of the parsed content to the object and call .save(), it raises:
ProgrammingError: You must not use 8-bit bytestrings unless you use a text_factory that can interpret 8-bit bytestrings (like text_factory = str). It is highly recommended that you instead just switch your application to Unicode strings.
When I print the variable's type it shows a plain bytestring (str). The field in the DB is defined as a CharField; I tried TextField as well, but the result is the same.
How can I solve that?

If your mail text is in mail_text, do this:
mail_text = unicode(mail_text)
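Note that a bare unicode(mail_text) assumes the bytes are ASCII and will itself raise UnicodeDecodeError on anything else. A fuller sketch (Python 2 era, matching the question; raw_bytes, obj, and the utf-8 fallback are assumptions, not part of the original answer):

import email

msg = email.message_from_string(raw_bytes)  # raw_bytes as fetched over IMAP
part = msg.get_payload(0) if msg.is_multipart() else msg
charset = part.get_content_charset() or 'utf-8'  # prefer the declared charset
mail_text = part.get_payload(decode=True).decode(charset)

obj.body = mail_text  # now a unicode string, safe to save
obj.save()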

Related

Django JSONField - get source text

When using a JSONField, the contents are automatically decoded from JSON into python objects when reading the value. I have a use-case where I'm encoding the string back into JSON to embed in a template. Is there any way to just get the raw JSON string from the object?
Django uses psycopg2.extras.Json under the hood, so you will need to cast the field as text to get the original out as plain text [1]. Use Django's Cast function [2] to annotate your queryset:
from django.db.models.functions import Cast
from django.db.models import TextField
models_with_json_text = Model.objects.annotate(
    json_as_text=Cast("json_field_name", TextField())
)
[1] http://initd.org/psycopg/docs/extras.html#json-adaptation
[2] https://docs.djangoproject.com/en/2.2/ref/models/database-functions/#cast
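A hypothetical follow-up, iterating the annotated queryset; json_as_text holds the raw JSON string, ready to embed in a template:

for obj in models_with_json_text:
    print(obj.json_as_text)  # e.g. '{"key": "value"}', as stored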

Invalid literal for int() with base 10: 'csv'

I'm trying to export the user details from the auth_user table to a CSV file in the Django admin panel. I have written the export to CSV function correctly and it has worked fine with other tables of mine. I have also provided the function location in urls.py correctly as:
(r'^auth/user/csv/', 'catalyst_db.catalyst.utils.admin_user_export'),
But Django has provided me the error saying that:
ValueError at /auth/user/csv/ invalid literal for int() with base 10: 'csv'
Any ideas to overcome this problem?
You probably have an earlier URL pattern in your urlconf which matches the request and captures the 'csv' value before it gets to that pattern.
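A hedged sketch of the fix: register the specific CSV route before the admin include so it wins the match (old-style urlconf to match the question's era; only the view path is taken from the question):

from django.conf.urls.defaults import patterns, include
from django.contrib import admin

admin.autodiscover()

urlpatterns = patterns('',
    # The specific route must precede the admin include; otherwise the
    # admin's catch-all pattern can capture 'csv' as an object id,
    # which leads to the int('csv') failure seen in the error.
    (r'^auth/user/csv/$', 'catalyst_db.catalyst.utils.admin_user_export'),
    (r'^', include(admin.site.urls)),
)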
I believe you have a date (the User model has a join-date field) inside the CSV file that you are trying to upload. The date field may be corrupt; most probably it contains an apostrophe before the date entry (you can see the apostrophe in the formula bar).
This can be fixed with find-and-replace in your spreadsheet application.
Replace ^. with & in regular expression mode.

django unicode encode/decode errors

My site needs to be able to serve data in different languages. I set it to use utf-8, and the DB settings are set to that as well. I've been getting various unicode errors all over the admin.
For example:
1. In the admin list, when a field in the list contains a non-ASCII char (I get UnicodeDecodeError).
2. When adding a new entry, a UnicodeEncodeError if the unicode method for the model returns a utf-8 decode (which fixes #1).
3. When using a filter_horizontal in the admin, if data from the model in question contains non-ASCII chars, the filter disappears from the form.
If I set the unicode method for the model to return for example:
return u'%s' % unicode(self.tag)
That seems to fix #1 and #2, but then that's when I get #3.
I have been looking a lot for a solution, but can't find something that fixes all of these errors. What's the best way to deal with them?
from django.utils.encoding import smart_unicode
...
def __unicode__(self):
    return smart_unicode(self.tag)
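For context, a minimal model sketch using that method (the Tag model and the max_length are assumptions based on the question's self.tag):

from django.db import models
from django.utils.encoding import smart_unicode

class Tag(models.Model):
    tag = models.CharField(max_length=100)

    def __unicode__(self):
        # smart_unicode decodes bytestrings (utf-8 by default) and leaves
        # unicode objects untouched, covering both #1 and #2.
        return smart_unicode(self.tag)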
It is noteworthy that you can bypass unicode by simply encoding your data in hexadecimal before storing it in your database.
Something like this is sufficient
MyModel(name=name.encode('hex'), password=password).save()
You can then execute name.decode('hex') to return the data back to its former representation.

Django fixture fails, stating "DatabaseError: value too long for type character varying(50)"

I have a fixture (json) which loads in development environment but fails to do so in server environment. The error says: "DatabaseError: value too long for type character varying(50)"
My development environment is Windows & Postgres 8.4. The server runs Debian and Postgres 8.3. Database encoding is UTF8 in both systems.
It is as if unicode markers in the fixture count as characters on the server and cause some strings to exceed their field's max length. However, that does not happen in the dev environment.
Update: the 50 char limit is now 255 in Django 1.8
--
Original answer:
I just encountered this this afternoon, too, and I have a fix (of sorts).
This post here implied it's a Django bug to do with the length of the value allowed for auth_permission. Further digging backs up that idea, as does this Django ticket (even though it's initially MySQL-related).
It's basically that a permission name is created from the verbose_name of a model plus a descriptive permission string, and that can overflow the 50 chars allowed in auth.models.Permission.name.
To quote a comment on the Django ticket:
The longest prefixes for the string value in the column auth_permission.name are "Can change " and "Can delete ", both with 11 characters. The column maximum length is 50 so the maximum length of Meta.verbose_name is 39.
One solution would be to hack that column to support more than 50 characters (ideally via a South migration, I say, so that it's easily repeatable; a sketch follows below), but the quickest, most reliable fix I could think of was simply to make my extra-long verbose_name definition a lot shorter (from 47 characters down to around 20). All works fine now.
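For reference, a hedged sketch of that South migration route (the 100-character width is an arbitrary assumption, not from the original answer):

from south.db import db
from south.v2 import SchemaMigration
from django.db import models

class Migration(SchemaMigration):
    def forwards(self, orm):
        # Widen auth_permission.name beyond the stock 50 characters.
        db.alter_column('auth_permission', 'name',
                        models.CharField(max_length=100))

    def backwards(self, orm):
        db.alter_column('auth_permission', 'name',
                        models.CharField(max_length=50))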
Well, what makes the difference is the encoding of the template databases. On the production server it was ASCII, while on the dev box it is UTF-8.
By default postgres creates a database using template1. My understanding is that if its encoding is not UTF-8, then any database you create from it will have this issue, even if you create it with UTF-8 encoding.
Therefore I dropped it and recreated it with its encoding set to UTF8. The snippet below does it (taken from here):
psql -U postgres
UPDATE pg_database SET datallowconn = TRUE where datname = 'template0';
\c template0
UPDATE pg_database SET datistemplate = FALSE where datname = 'template1';
drop database template1;
create database template1 with template = template0 encoding = 'UNICODE';
UPDATE pg_database SET datistemplate = TRUE where datname = 'template1';
\c template1
UPDATE pg_database SET datallowconn = FALSE where datname = 'template0';
Now the fixture loads smoothly.
Get the real SQL query on both systems and see what is different.
Just for information: I also had this error
DatabaseError: value too long for type character varying(10)
It turned out I was writing data over the 10-character limit of a field. I fixed it by increasing the size of the CharField from 10 to 20.
I hope it helps.
As @stevejalim says, it's quite possible that the column auth_permission.name is the problem with its length of 50; you can verify this with \d+ auth_permission in postgres's shell. In my case this was the problem: when I loaded fixtures for Django models, I got "DatabaseError: value too long for type character varying(50)". Changing django.contrib.auth's Permission model is complicated, so the simple solution was to alter the Permission table directly by running ALTER TABLE auth_permission ALTER COLUMN name TYPE VARCHAR(100); in postgres's shell. This worked for me.
Credit goes to this comment.
You can make Django use longer fields for this model by monkey-patching the model prior to using it to create the database tables. In "manage.py", change:
if __name__ == "__main__":
    execute_manager(settings)
to:
from django.contrib.auth.models import Permission

if __name__ == "__main__":
    # Patch the field width to allow for our long model names
    Permission._meta.get_field('name').max_length = 200
    Permission._meta.get_field('codename').max_length = 200
    execute_manager(settings)
This modifies the options on the field before (say) manage.py syncdb is run, so the database table gets nice wide varchar() fields. You don't need to do this when invoking your app normally, as you never attempt to modify the Permissions table while it is running.

Auto-truncating fields at max_length in Django CharFields

I have a field with max_length set. When I save a model instance and the field's value is longer than max_length, Django enforces max_length at the database level. (See the Django docs on models: http://docs.djangoproject.com/en/dev/ref/models/fields/#django.db.models.CharField.max_length)
However, since I am using Postgres, I receive a DatabaseError exception like this:
DatabaseError: value too long for type character varying(1000)
I would prefer to instead auto-truncate the value (so I don't have an exception). Now, I can do this manually, but what I would really want is to have all of my models auto-truncate the value. (Not necessarily intelligently. Just cutting it off at the 999th character is fine.)
Should I just write a custom class that inherits from models.Model and overrides the save() method, looping through each _meta field, checking for a max_length, and then truncating? That seems inelegant and there must be a better way.
You could create a custom field that auto-truncates the field (I think this code should work, but double-check it):
class TruncatingCharField(models.CharField):
    def get_prep_value(self, value):
        value = super(TruncatingCharField, self).get_prep_value(value)
        if value:
            return value[:self.max_length]
        return value
Then, instead of using models.CharField in your models.py file, you'd just use TruncatingCharField instead.
get_prep_value prepares the value for a field for insertion in the database, so it's the ideal place to truncate.
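A hypothetical drop-in usage in models.py (the Comment model and the 1000-character limit are illustrative assumptions):

from django.db import models

class Comment(models.Model):
    # TruncatingCharField is the custom field defined above; values longer
    # than max_length are silently cut to fit when the row is written.
    body = TruncatingCharField(max_length=1000)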
Why don't you use a TextField? From the manual: "For large amounts of text, use TextField."
Why don't you use a ModelForm? A ModelForm enforces validation, setting its form field's max_length from the model field's max_length property and raising a proper validation error when form.is_valid() is called. That way you don't save the form until it validates.
Or, if silently passing validation and truncating suits you better, write a plain Django form with a clean method that truncates the input string to max_length and returns the stripped data (see the sketch below). Take the data from form.cleaned_data after the form is validated and save the object.
After all, forms are designed to validate data before it goes to the DB.
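A minimal sketch of that form-level truncation; ArticleForm, the title field, and the 1000-character limit are illustrative assumptions, not from the original answer:

from django import forms

MAX_LEN = 1000  # assumed to mirror the model field's max_length

class ArticleForm(forms.Form):
    # No max_length on the form field, so over-long input reaches
    # clean_title instead of failing field validation first.
    title = forms.CharField(required=True)

    def clean_title(self):
        # Silently truncate instead of raising a validation error.
        return self.cleaned_data['title'][:MAX_LEN].strip()

After form.is_valid() returns True, form.cleaned_data['title'] is at most 1000 characters and can be saved safely.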
That seems inelegant and there must be a better way.
The only reason the truncate behavior ever happens in the first place is because of MySQL's failure to adhere to the SQL Standard. The throwing of an exception is the correct response when attempting to INSERT a string into a VARCHAR field that is not wide enough to hold it. MySQL truncates and inserts instead.
If you want to silently corrupt your data, then you'll have to do it manually somehow--PostgreSQL will never do it for you.