I have a Django model that includes an ImageField, and an app that consumes the model instance including all images. The problem I'm having is that while my data and the local image links reside in my AWS RDS database, the media folder containing the images for the model does not. Thus every time I deploy a new version of the app (I'm using Elastic Beanstalk for deployment), my media folder is wiped clean and all my image links die as a result.
Below is my model:
from django.db import models
from django.urls import reverse


class Item(models.Model):
    # Fields
    name = models.CharField(max_length=30, help_text="Enter item name")
    description = models.TextField(help_text='Enter a short description')
    image = models.ImageField(help_text='Upload item image', upload_to='image/item/',
                              default='image/demo.jpg')

    # Methods
    def __str__(self):
        return str(self.name)

    def get_absolute_url(self):
        return reverse('model-detail-view', args=[str(self.id)])
And I use serve to retrieve the image from Django.
url(r'^resource/(?P<path>.*)$', serve, {'document_root': settings.MEDIA_ROOT}),
Now what would be the best way to resolve this problem? I would not want to upload the image to another server and then link back to Django; it would be best if I could save it to my database the moment I upload the file from my model form.
All feedback is appreciated.
See the EB documentation on design considerations:
Elastic Beanstalk applications run on Amazon EC2 instances that have no persistent local storage. When the Amazon EC2 instances terminate, the local file system is not saved, and new Amazon EC2 instances start with a default file system.
So, you cannot do what you are asking. You must upload the files to a persistent storage system; S3 is the obvious choice.
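As a sketch of that route: django-storages can point Django's default file storage at a bucket, so ImageField uploads bypass the instance's ephemeral filesystem entirely. The backend path below is the boto3-based one, and the credential values are placeholders, not real settings.

```python
# settings.py -- route ImageField/FileField uploads to S3 via django-storages.
# All values below are placeholders.
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
AWS_ACCESS_KEY_ID = 'your-access-key-id'
AWS_SECRET_ACCESS_KEY = 'your-secret-access-key'
AWS_STORAGE_BUCKET_NAME = 'your-bucket-name'
```

With this in place, redeploys no longer wipe the uploaded files, because they never lived on the instance in the first place.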
Related
I do not really understand how the database works when used in production.
My stack:
Django
Heroku
AWS S3
PostgreSQL on Heroku
Users can generate images in my app. The images are saved to AWS S3, and in one feature I want to retrieve the last generated image.
Below is the model the images are saved in.
models.py:
class imgUploadModel(models.Model):
    # Note: AutoField should not take default=True; the primary key
    # auto-increments on its own.
    auto_increment_id = models.AutoField(primary_key=True)
    image = models.ImageField(null=True, blank=True, upload_to="images/")
And here is the view where the image is retrieved again and used in some features.
view.py:
imgname = imgUploadModel.objects.all().last().image
As you can see, I use .last() to get the latest image that was generated.
Now to my questions:
In production, could it be that one user sees another user's images? Or how do the dynos (from Heroku) separate the sessions?
Since the AWS S3 bucket is just storage without any per-user separation, I assume that one user can see other users' images, especially when user A creates an image and user B clicks on 'latest image'.
If so, how can I set up dynos or buckets or anything else to prevent this behaviour?
I just do not really understand it from a logical point of view.
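To make the logic concrete: neither the dynos nor the bucket separate anything per user; the separation has to come from the data model and the query. A minimal sketch, where the owner field is a hypothetical addition to the model above:

```python
from django.conf import settings
from django.db import models


class imgUploadModel(models.Model):
    auto_increment_id = models.AutoField(primary_key=True)
    # Hypothetical addition: record which user generated each image.
    owner = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    image = models.ImageField(null=True, blank=True, upload_to="images/")


# In the view: 'latest image' now means the current user's latest image,
# so user B is never handed user A's upload.
def latest_image_for(request):
    return imgUploadModel.objects.filter(owner=request.user).last()
```

S3 just stores whatever keys you write; the per-user filtering above is what keeps sessions from seeing each other's images.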
Hello everyone. I decided to use Cloudinary for storing images.
This is my model field with image:
avatar = models.ImageField(upload_to='avatar/', default='avatar/default.png', blank=True)
All works fine for me, but I have one little issue.
When I upload an image from the admin panel, Cloudinary uploads it to my Cloudinary folder 'avatar' with a modified name, for example 'july2022_kda4th' or 'july2022_aidkdk'.
But when I upload the same image from the admin panel of another database (production), Cloudinary stores it under yet another name. So I end up with two copies of the same image in Cloudinary, which is not convenient.
How can I fix it?
By default, if you don't supply a public_id in the upload API call, a random string is assigned to the asset. You can read more here: https://cloudinary.com/documentation/upload_images#public_id.
It sounds like you want to use the asset filename as the public_id, so what you can do is:
In forms set use_filename=true and unique_filename=false as described in this link
OR, if the above is not working, you can:
Create an upload preset https://cloudinary.com/documentation/upload_presets
Enable Use filename or externally defined Public ID: option
Disable Unique filename:
Set this preset as the default upload API/UI (https://cloudinary.com/documentation/upload_presets#default_upload_presets) or include this upload_preset in your upload call
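For reference, with the Python SDK the same options can be passed per upload call; the file path here is a placeholder:

```python
import cloudinary.uploader

# Keep the original filename as the public_id (use_filename) and skip the
# random suffix (unique_filename=False), so the same file uploaded from two
# environments maps to the same asset instead of two copies.
result = cloudinary.uploader.upload(
    "avatar/default.png",  # placeholder local path
    folder="avatar",
    use_filename=True,
    unique_filename=False,
)
```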
I am in the process of building a website with Django. The web app is for a local store in my city. The store sells gardening courses and workshops (the store is my mom's btw).
I want the admin of the page (the owner of the store) to be able to add new courses whenever she wants.
So I made this model for the courses:
class Curso(models.Model):
    title = models.TextField()
    photo = models.ImageField(upload_to='cursos/')
    description = models.TextField()
    price = models.IntegerField(null=False)
    content = models.JSONField(blank=True)
    clases = models.IntegerField(default=1)
    duration = models.IntegerField()
    isworkshop = models.BooleanField(default=False)
I included an ImageField because on the frontend there should be an image related to the course.
I added the path 'cursos/' but I actually have no idea where the image is going. I saw people had a "media" folder.
But I also read that the "media" folder was only for images uploaded by the user?
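For what it's worth, with Django's default storage backend an ImageField saves to the local filesystem under MEDIA_ROOT joined with upload_to, which is separate from static. A typical setup (paths are examples):

```python
# settings.py -- where uploaded files land with the default storage backend.
MEDIA_ROOT = BASE_DIR / 'media'  # photo ends up in media/cursos/<filename>
MEDIA_URL = '/media/'
```

static/ is for your own assets (CSS, JS, design images); files uploaded through the admin or by users belong under MEDIA_ROOT.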
This is what my static folder looks like:
static
|_app
|_js
|_css
|_sass
|_icons
|_images
|_homeslider
|_plants
|_pictures
Should the images uploaded from the admin app go in that same folder?
Anyway, I don't know where to store the images. And I don't have hosting yet, which leads me to my next question:
What should I do with the database?
I saw that the majority of people asking these types of questions had already bought hosting, so:
Should I buy hosting even if the app is far from ready, just to start testing things out there and putting the course objects there?
Are files stored differently in production?
Can I have my database locally and then upload it to the server?
What I don't want is to put a bunch of images and data into an SQLite database and then have to change all of it because that won't work for production.
I'm new to web development and I am really lost when it comes to hosting, production and databases.
I would be very thankful for any help. I don't know what to do and I need someone more experienced to point me in the right direction.
Thank you in advance!
I am trying to understand how Django ImageKit works with respect to creating thumbnail files (for example). I am using the example code:
from django.db import models
from imagekit.models import ImageSpecField
from imagekit.processors import ResizeToFill
class Profile(models.Model):
    avatar = models.ImageField(upload_to='avatars')
    avatar_thumbnail = ImageSpecField(source='avatar',
                                      processors=[ResizeToFill(100, 50)],
                                      format='JPEG',
                                      options={'quality': 60})
I am uploading the avatar image from an app. This works fine with an entry made in the Profile table and the file created in AWS S3. What I am struggling to understand is when/where/how the avatar_thumbnail is created. Do I have to do something explicit to get it stored in AWS S3 along with the avatar image? Or is the avatar_thumbnail only ever created on the fly? I need it stored somewhere for later use.
I don't get it 100%, but from what I understood, the thumbnail is generated lazily: it is only created when first requested, and is then cached.
My personal experience with it suggests so as well. I created a dummy instance of the model (same code as above) through the admin interface. I then created an HTML page that displays the thumbnails with template tagging (<img src="{{ instance.avatar_thumbnail.url }}">). Checking my folders, no images had been generated so far. Then I launched a server and navigated to that page. It took an unusually long time to load on the first try (an indication that the thumbnails were being created), but then it sped up. And the files were there.
By default, ImageKit generates ImageSpecField images when they are needed, not when the model object is created. To change this behavior, you can use the cache file strategies. The default value of IMAGEKIT_DEFAULT_CACHEFILE_STRATEGY is JustInTime; it can be changed to Optimistic, which creates images on model object creation, or to a custom strategy.
Moreover, you can set different strategies for individual ImageSpecFields by providing the cachefile_strategy parameter.
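A sketch of both options; the dotted strategy path follows django-imagekit's convention, and whether the per-field parameter takes a dotted path or a strategy class may depend on the version:

```python
# settings.py -- generate all spec files eagerly, when the source is saved:
IMAGEKIT_DEFAULT_CACHEFILE_STRATEGY = 'imagekit.cachefiles.strategies.Optimistic'
```

Or per field, overriding the project-wide default for one spec only:

```python
avatar_thumbnail = ImageSpecField(
    source='avatar',
    processors=[ResizeToFill(100, 50)],
    format='JPEG',
    options={'quality': 60},
    # Override the project-wide default for this one spec:
    cachefile_strategy='imagekit.cachefiles.strategies.Optimistic',
)
```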
I'm writing a project in Django that has user uploaded images related to products, so my product model is something like:
from django.contrib.auth.models import User
from django.db import models


class Product(models.Model):
    name = models.CharField(max_length=128)
    description = models.CharField(max_length=255)
    owner = models.ForeignKey(User, on_delete=models.CASCADE)
    image = models.ImageField()
I would like to store the images on Amazon S3, since it's pretty cheap and fast. Like others, I would prefer to avoid the overhead of uploading the file to my server and then to S3. There is some sample code explaining how to upload a file directly to S3 from the client browser.
The issue I see (and I have not found a solution yet) is that I don't have only an image to upload. My object (Product) also has other fields, such as name, description and so on. All the examples I saw use one form for the image only. I would like to have one HTML form (with image, name and so on) for the user, and once the user clicks on "submit", I would store the image in Amazon S3 and the other info (name and so on) in my local database.
According to the image at http://docs.amazonwebservices.com/AmazonS3/latest/dev/UsingHTTPPOST.html, using a POST method it's possible to send data from the client to Amazon S3 and to my webserver. It's not clear if both can be combined in the same page (same HTML form).
According to the Amazon docs, it's possible to upload a file and set the redirect URL for when the upload succeeds.
1) Is it possible to upload the image to the URL and then redirect to my webserver to store the rest of the information? Does the redirect keep the POST data (name, description etc)? How can I access the name of the file stored in S3?
2) Is there any other way to achieve my goal (besides my option #1)?
Thanks in advance
Check out django-storages (http://code.welldev.org/django-storages/), it does exactly what you need.
And you don't have to care about multiple forms/redirects etc.; it replaces the default file storage backend and just pushes files to an Amazon S3 bucket.
It supports several kinds of storage, S3 being one among them; I've used it for several projects so far and it's really easy to plug in.
Since the docs seem pretty dead, here are my configurations:
settings.py
DEFAULT_FILE_STORAGE = "storages.backends.s3boto.S3BotoStorage"
AWS_ACCESS_KEY_ID = ''
AWS_SECRET_ACCESS_KEY = ''
AWS_STORAGE_BUCKET_NAME = ''
and of course you need to install django-storages
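Installation, for completeness (boto is the dependency the s3boto backend shown above relies on; newer django-storages backends use boto3 instead):

```shell
pip install django-storages boto
```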