This is a follow-up question to Django on Google App Engine: cannot upload images
I got part of the upload of images to GAE Blobstore working. Here's what I did:
In models.py I created a model PhotoFeature:
class PhotoFeature(models.Model):
    property = models.ForeignKey(
        Property,
        related_name="photo_features"
    )
    caption = models.CharField(
        max_length=100
    )
    blob_key = models.CharField(
        max_length=100
    )
In admin.py I created an admin entry with an override for the rendering of the change_form to allow for insert of the correct action to the Blobstore upload url:
class PhotoFeatureAdmin(admin.ModelAdmin):
    list_display = ("property", "caption")
    form = PhotoFeatureForm

    def render_change_form(self, request, context, *args, **kwargs):
        from google.appengine.ext import blobstore
        if kwargs.has_key("add"):
            context['blobstore_url'] = blobstore.create_upload_url('/admin/add-photo-feature')
        else:
            context['blobstore_url'] = blobstore.create_upload_url('/admin/update-photo-feature')
        return super(PhotoFeatureAdmin, self).render_change_form(request, context, *args, **kwargs)
As I use standard Django, I want to use Django views to process the result once GAE has updated the Blobstore, instead of a BlobstoreUploadHandler. I created the following views (as per the render_change_form method) and updated urls.py:
def add_photo_feature(request):
def update_photo_feature(request):
This all works nicely but once I get into the view method I'm a bit lost. How do I get the Blob key from the request object so I can store it with PhotoFeature? I use standard Django, not Django non-rel. I found this related question but it appears not to contain a solution. I also inspected the request object which gets passed into the view but could not find anything relating to the blob key.
EDIT:
The Django request object contains a FILES dictionary which will give me an instance of InMemoryUploadedFile. I presume that somehow I should be able to retrieve the blob key from that...
EDIT 2:
Just to be clear: the uploaded photo appears in the Blobstore; that part works. It's just getting the key back from the Blobstore that's missing here.
EDIT 3:
As per Daniel's suggestion I added storage.py from the djangoappengine project which contains the suggested upload handler and added it to my SETTINGS.PY. This results in the following exception when trying to upload:
'BlobstoreFileUploadHandler' object has no attribute 'content_type_extra'
This is really tricky to fix. The best solution I have found is to use the file upload handler from the djangoappengine project (which is associated with django-nonrel, but does not depend on it). That should handle the required logic to put the blob key into request.FILES, as you'd expect in Django.
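For reference, wiring the handler in goes through Django's standard FILE_UPLOAD_HANDLERS setting; a minimal sketch, assuming the copied storage.py is importable as djangoappengine.storage (adjust the dotted path to wherever the module actually lives in your project):
# settings.py
FILE_UPLOAD_HANDLERS = (
    # Blobstore-aware handler from djangoappengine's storage.py
    'djangoappengine.storage.BlobstoreFileUploadHandler',
    # keep a regular handler as a fallback for non-blob uploads
    'django.core.files.uploadhandler.MemoryFileUploadHandler',
)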
Edit
I'd forgotten that django-nonrel uses a patched version of Django, and one of those patches adds the content_type_extra field. You can replicate the functionality by subclassing the upload handler as follows:
import cgi

from djangoappengine import storage

class BlobstoreFileUploadHandler(storage.BlobstoreFileUploadHandler):
    """Handler that adds blob key info to the file object."""

    def new_file(self, field_name, *args, **kwargs):
        # We need to re-process the POST data to get the blobkey info.
        meta = self.request.META
        meta['wsgi.input'].seek(0)
        fields = cgi.FieldStorage(meta['wsgi.input'], environ=meta)
        if field_name in fields:
            current_field = fields[field_name]
            self.content_type_extra = current_field.type_options
        super(BlobstoreFileUploadHandler, self).new_file(field_name,
                                                         *args, **kwargs)
and reference this subclass in your settings.py rather than the original.
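To tie this back to the original question, here is a rough sketch of what add_photo_feature might do once the handler is in place. The form field name ('photo'), the module paths, and especially the blobstore_info attribute used to reach the blob key are assumptions; inspect the uploaded file object djangoappengine gives you to confirm the exact attribute on your version:
# settings.py -- reference the subclass instead of the original handler;
# 'myapp.upload_handlers' is a placeholder for wherever you defined it.
FILE_UPLOAD_HANDLERS = (
    'myapp.upload_handlers.BlobstoreFileUploadHandler',
)

# views.py -- sketch only
from django.http import HttpResponseRedirect

from myapp.models import PhotoFeature  # placeholder import path

def add_photo_feature(request):
    uploaded = request.FILES['photo']              # form field name is an assumption
    blob_key = str(uploaded.blobstore_info.key())  # attribute name may vary by version
    PhotoFeature.objects.create(
        property_id=request.POST['property'],      # assumes the form posts a property id
        caption=request.POST.get('caption', ''),
        blob_key=blob_key,
    )
    return HttpResponseRedirect('../')             # redirect target is an assumption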
Related
I am trying to test how to display API information within a view in my Django project. I know you may have to add some installed apps to the INSTALLED_APPS setting.
This API is a simple geo one.
I am new to Django and new to using APIs within it. I have managed to get my app the way I need it using YouTube videos, but now I am on my own. I have many different view classes to display different parts of my app.
The view below is the one I'd like to place the data on.
Is this how I would potentially do it? Then call {{ base }} within the HTML to display it?
class PostDetailView(DetailView):
    model = Post
    template_name = 'clients/post_detail.html'


def api_test(request):
    # This is where the APIs are going to go.
    requests.get('https://api.coindesk.com/v1/bpi/currentprice.json')
    data = response.json()
    return render(request, 'clients/post_detail.html', {
        'base': data['disclaimer']
    })
I am currently getting no errors within my app, but the country element isn't displaying.
I have tested the following in just a simple Python file:
import requests
import json
response = requests.get('https://api.coindesk.com/v1/bpi/currentprice.json')
data = response.json()
print(data['disclaimer'])
which gets the desired result. So I guess my issue now is: how do I get this into the HTML, so I can display the results from the API?
You can write it like this:
import requests

class PostDetailView(DetailView):
    model = Post
    template_name = 'clients/post_detail.html'

    def call_geo_api(self):
        # This is where the APIs are going to go.
        response = requests.get('https://api.coindesk.com/v1/bpi/currentprice.json')
        data = response.json()
        return data['disclaimer']

    def get_context_data(self, *args, **kwargs):
        context = super(PostDetailView, self).get_context_data(*args, **kwargs)
        context['base'] = self.call_geo_api()
        return context
Here, I have overridden the get_context_data() method, which is responsible for sending context data from the view to the template.
I have also changed your API method so that it returns data['disclaimer'] from the API, and inside get_context_data I have injected it into the context. That should do the trick: you will be able to see the data in the template with {{ base }}.
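One optional refinement, a minimal sketch assuming you want the page to keep rendering even when the API is slow or unreachable: give the request a timeout and fall back to an empty string on failure.
import requests

# Drop-in replacement for call_geo_api on the view class above.
def call_geo_api(self):
    try:
        response = requests.get(
            'https://api.coindesk.com/v1/bpi/currentprice.json',
            timeout=5,  # seconds; avoids hanging the page on a slow API
        )
        response.raise_for_status()
    except requests.RequestException:
        return ''  # page still renders, just without the API data
    return response.json().get('disclaimer', '')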
I am trying to add the content of Django-CMS placeholders to the search index (using Algolia, but I guess this could apply for any indexing service, like Elasticsearch or similar) as soon as they are updated.
Using Django 1.10, django-cms 3.42, I have this model (simplified for this question):
from django.db import models
from django.utils.translation import ugettext_lazy as _
from cms.models.fields import PlaceholderField
from cms.models import CMSPlugin

class NewsItem(models.Model):
    title = models.CharField(_('title'), max_length=200)
    content = PlaceholderField('news_content')
I need to do some extra processing as soon as the model field 'content' is saved, and apparently the best way to check for that is to monitor the CMSPlugin model. So I listen for saves using django.db.models.signals.post_save like this:
@receiver(post_save, sender=CMSPlugin)
def test(sender, **kwargs):
    logger.info("Plugin saved.")
Now, the problem is that post_save is not triggered as I thought it would be. With normal CMS Pages, I noticed that post_save is only triggered when a Page is published, but there is no apparent way to publish a placeholder when it is used outside the CMS.
The closest similar case I've found is Updating indexes on placeholderfields in real time with django/haystack/solr, but the suggested solution doesn't work.
How could I go about resolving this?
Thank you!
We also had the same search indexing problem when we were implementing the djangocms-algolia package, since a placeholder update doesn't trigger an update of the index.
For CMS pages we utilized the post_publish and post_unpublish signals from the cms.signals module.
And for CMS apps that use placeholders (e.g. djangocms-blog) we attached the listeners to post_placeholder_operation, but beware that to make it work your ModelAdmin needs to inherit from PlaceholderAdminMixin:
from typing import Optional

# Import paths below match django-cms 3.4+ and djangocms-blog; adjust if needed.
from cms import signals
from cms.admin.pageadmin import PageAdmin
from cms.models import Placeholder
from cms.operations import (
    ADD_PLUGIN, ADD_PLUGINS_FROM_PLACEHOLDER, CHANGE_PLUGIN, CLEAR_PLACEHOLDER,
    CUT_PLUGIN, DELETE_PLUGIN, MOVE_PLUGIN, PASTE_PLACEHOLDER, PASTE_PLUGIN,
)
from djangocms_blog.admin import PostAdmin
from djangocms_blog.models import Post


def update_news_index(sender, operation: str, language: str, **kwargs) -> None:
    placeholder: Optional[Placeholder] = None
    if operation in (ADD_PLUGIN, DELETE_PLUGIN, CHANGE_PLUGIN, CLEAR_PLACEHOLDER):
        placeholder = kwargs.get('placeholder')
    elif operation in (ADD_PLUGINS_FROM_PLACEHOLDER, PASTE_PLUGIN, PASTE_PLACEHOLDER):
        placeholder = kwargs.get('target_placeholder')
    elif operation in (MOVE_PLUGIN, CUT_PLUGIN):
        placeholder = kwargs.get('source_placeholder')
    else:
        pass
    if placeholder:
        post: Optional[Post] = Post.objects.language(language_code=language).filter(content=placeholder).first()
        if post:
            post.save()


signals.post_placeholder_operation.connect(update_news_index, PostAdmin)
signals.post_placeholder_operation.connect(update_news_index, PageAdmin)
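One practical detail: the connect() calls above have to run once at startup. A common place is an AppConfig.ready() hook; a small sketch where 'myapp' and 'search_signals' are placeholder names:
# myapp/apps.py
from django.apps import AppConfig

class MyAppConfig(AppConfig):
    name = 'myapp'

    def ready(self):
        # Importing the module executes the post_placeholder_operation
        # .connect(...) calls exactly once, after the app registry is ready.
        from . import search_signals  # noqa: F401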
I have a model like this:
from django.db import models
from wagtail.wagtailcore.models import Page

class Blog(Page):
    created = models.DateTimeField(auto_now_add=True)
    ...
Right now, by default, if my slug is hi-there the blog post is accessible at site_url/hi-there/, but I want it accessible via site_url/2014/02/05/hi-there/. The Page class has various methods like url and url_path; which one should I override, and what's the best practice to achieve something like this in Wagtail?
The RoutablePageMixin is the current (v2.0+) way to accomplish this.
Add the module to your installed apps:
INSTALLED_APPS = [
    ...
    "wagtail.contrib.routable_page",
]
Inherit from both wagtail.contrib.routable_page.models.RoutablePageMixin and wagtail.core.models.Page, then define some view methods and decorate them with the wagtail.contrib.routable_page.models.route decorator:
from wagtail.core.models import Page
from wagtail.contrib.routable_page.models import RoutablePageMixin, route

class EventPage(RoutablePageMixin, Page):
    ...

    @route(r'^$')  # will override the default Page serving mechanism
    def current_events(self, request):
        """
        View function for the current events page
        """
        ...

    @route(r'^past/$')
    def past_events(self, request):
        """
        View function for the past events page
        """
        ...

    # Multiple routes!
    @route(r'^year/(\d+)/$')
    @route(r'^year/current/$')
    def events_for_year(self, request, year=None):
        """
        View function for the events for year page
        """
        ...
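As a usage note, RoutablePageMixin also provides reverse_subpage(), so you can build URLs to these routes from Python code (the route name defaults to the decorated method's name); a short sketch based on the EventPage above:
# e.g. in a view, template tag or shell session
page = EventPage.objects.live().first()
if page is not None:
    # page.url is the page's own URL; reverse_subpage appends the route's part
    past_url = page.url + page.reverse_subpage('past_events')  # ends in past/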
New in Wagtail v0.5 is a mechanism that directly addresses this sort of thing:
Embedding URL configuration in Pages (called RoutablePage as of v1.3.1)
(the docs even have a Blog example!)
I was looking into migrating my blog to a Wagtail version and wanted to support my previous URL scheme, so I had to solve this exact problem. Luckily I found a solution and want to share it; hopefully this will be helpful for someone else in the future.
The solution is a two-step process.
First, change the URL of a blog page to contain the date as well:
class Blog(Page):
    created = models.DateTimeField(auto_now_add=True)

    def get_url_parts(self, request=None):
        super_response = super().get_url_parts(request)
        # handle the cases of the original implementation
        if super_response is None:
            return None
        (site_id, root_url, page_path) = super_response
        if page_path is None:
            return super_response
        # In the happy case, add the date fields.
        # Split prefix and slug to support blogs that are nested.
        prefix, slug, _ = page_path.rsplit("/", 2)
        return (
            site_id,
            root_url,
            f"{prefix}/{self.created.year}/{self.created.month}/{self.created.day}/{slug}/",
        )
Second, we need to make those posts routable as well:
class BlogIndexPage(RoutablePageMixin, Page):
    ...

    def route(self, request, path_components):
        if len(path_components) >= 4:
            year, month, day, slug, *rest = path_components
            try:
                subpage = self.get_children().get(slug=slug)
                return subpage.specific.route(request, rest)
            except Page.DoesNotExist:
                ...
        return super().route(request, path_components)
This solution ignores the date and just uses the slug to locate a blog post, like the original routing does. It should also work when you don't use the RoutablePageMixin.
Hope this is still helpful for someone.
I'm using Grappelli together with Filebrowser and I found a bug when uploading images with a file extension in uppercase (image.PNG). If they end in uppercase, a thumbnail will be created every time the filebrowser page is refreshed.
I found this method in the filebrowser package:
def handle_file_upload(path, file, site):
    """
    Handle File Upload.
    """
    uploadedfile = None
    try:
        file_path = os.path.join(path, file.name)
        uploadedfile = site.storage.save(file_path, file)
    except Exception, inst:
        raise inst
    return uploadedfile
To solve the bug I want it to look like this:
def handle_file_upload(path, file, site):
    """
    Handle File Upload.
    """
    uploadedfile = None
    try:
        file_path = os.path.join(path, file.name.lower())
        uploadedfile = site.storage.save(file_path, file)
    except Exception, inst:
        raise inst
    return uploadedfile
How do I do this without changing the package file? I don't want my fix to disappear when I update Filebrowser.
Can I override just that method, or should I use signals or something?
My original answer explained how to override a class method, but that's wrong... it is not a class method that you want to change.
I think your best option is to make a branch of the project on GitHub and then make a pull request, explaining why you made the change. If they share your opinion, they will accept the pull request and you can carry on without worrying about overriding anything.
https://github.com/sehmaschine/django-filebrowser
I can easily fill the field of a FileField or ImageField in a Django fixture with a file name, but that file doesn't exist and when I try to test my application it fails because that file doesn't exist.
How do I correctly populate a FileField or ImageField in a Django fixture so that the file itself is available too?
I'm afraid the short answer is that you can't do this using the FileField or ImageField classes; they just store a file path and have no real concept of the file's actual data. The long answer, however, is that anything is possible if you leverage the Django API for writing your own custom model fields.
At a minimum, you'll want to implement the value_to_string method to convert the data for serialization (there's an example in the Django docs at the link above). Note that the examples at that link also mention subclassing FileField and ImageField, which is helpful for your situation!
You'll also have to decide if the data should therefore be stored in the database, or on the file system. If the former, you will have to implement your custom class as a Blob field, including customization for every DB you wish to support; you'll also have to provide some support for how the data should be returned to the user out of the database when the HTML requests a .gif/.jpg/.png/.whatever url. If the latter, which is the smarter way to go IMHO, you'll have to implement methods for serializing, de-serializing binary data to the filesystem. Either way, if you implement these as subclasses of FileField and ImageField, you should still be able to use the Admin tools and other modules that expect such django features.
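To make the filesystem/subclassing route a bit more concrete, here is a minimal sketch of value_to_string on a FileField subclass that embeds the file's bytes in the fixture. It assumes small files, uses an ad-hoc name:base64 format, and deliberately leaves out the loading half (decoding the string and writing the file back to storage during deserialization):
import base64
from django.db import models

class InlineFileField(models.FileField):
    """Sketch: serialize the file's contents, not just its path."""

    def value_to_string(self, obj):
        fieldfile = self.value_from_object(obj)
        if not fieldfile:
            return ''
        fieldfile.open('rb')
        try:
            data = fieldfile.read()
        finally:
            fieldfile.close()
        # "<name>:<base64 data>" is an ad-hoc format chosen for this sketch
        return '%s:%s' % (fieldfile.name, base64.b64encode(data).decode('ascii'))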
If and only if you elect to use the more involved blob approach, here's a snippet of code from an old project of mine (back when I was learning Django) that handles blobs for MySQL and PostgreSQL; you'll probably be able to find a number of improvements, as I haven't touched it since :-) It does not handle serialization, though, so you'll have to add that using the method above.
import psycopg2

from django.db import models
from django.conf import settings

class BlobValueWrapper(object):
    """Wrap the blob value so that we can override the unicode method.

    After the query succeeds, Django attempts to record the last query
    executed, and at that point it attempts to force the query string
    to unicode. This does not work for binary data and generates an
    uncaught exception.
    """

    def __init__(self, val):
        self.val = val

    def __str__(self):
        return 'blobdata'

    def __unicode__(self):
        return u'blobdata'


class BlobField(models.Field):
    """A field for persisting binary data in databases that we support."""
    __metaclass__ = models.SubfieldBase

    def db_type(self):
        if settings.DATABASE_ENGINE == 'mysql':
            return 'LONGBLOB'
        elif settings.DATABASE_ENGINE == 'postgresql_psycopg2':
            return 'bytea'
        else:
            raise NotImplementedError

    def to_python(self, value):
        if settings.DATABASE_ENGINE == 'postgresql_psycopg2':
            if value is None:
                return value
            return str(value)
        else:
            return value

    def get_db_prep_save(self, value):
        if value is None:
            return None
        if settings.DATABASE_ENGINE == 'postgresql_psycopg2':
            return psycopg2.Binary(value)
        else:
            return BlobValueWrapper(value)
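For the "returning the data to the browser" part mentioned above, the usual pattern is a tiny view that hands the bytes back in an HttpResponse with the stored content type. A rough sketch; the model and field names are invented for illustration:
from django.db import models
from django.http import HttpResponse
from django.shortcuts import get_object_or_404

class StoredImage(models.Model):
    content_type = models.CharField(max_length=100)
    data = BlobField()  # the custom field defined above

def stored_image_view(request, image_id):
    image = get_object_or_404(StoredImage, pk=image_id)
    # Serve the raw bytes with the content type recorded at upload time.
    return HttpResponse(image.data, content_type=image.content_type)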
There's no way to "include" the files in the serialized fixture. If creating a test fixture, you just need to do it yourself; make sure that some test files actually exist in locations referenced by the FileField/ImageField values. The values of those fields are paths relative to MEDIA_ROOT: if you need to, you can set MEDIA_ROOT in a custom test settings module (or in your test setUp() method) to ensure that your test files are found wherever you put them.
EDIT: If you want to do it in your setUp() method, you can also monkeypatch default_storage directly:
from django.core.files.storage import default_storage
from django.test import TestCase

class MyTest(TestCase):
    def setUp(self):
        self._old_default_storage_location = default_storage.location
        default_storage.location = '/some/other/place'

    def tearDown(self):
        default_storage.location = self._old_default_storage_location
That seems to work. default_storage is a documented public API, so this should be reliable.
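A related option, offered as a sketch rather than a guarantee: on reasonably recent Django versions you can override MEDIA_ROOT for a test with override_settings, and FileSystemStorage will pick the new location up; verify this on the Django version you are running, since older releases cached the storage location.
import tempfile
from django.test import TestCase, override_settings

@override_settings(MEDIA_ROOT=tempfile.mkdtemp())
class MyTest(TestCase):
    def test_file_is_found(self):
        # FileField values in fixtures are resolved relative to the
        # temporary MEDIA_ROOT created above.
        ...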