I want to store images along with their thumbnails. I am storing the images on the file system using Django. At first the user will see the thumbnails, and clicking one will show the original image. I am using a Postgres database, and I have already installed the Pillow library. The thumbnail size will be approximately 200×200.
Now my questions are:
How should I store the thumbnails? (in the database or on the file system)
How do I convert the images to thumbnails? (a Python library or something else)
If there is a better approach for this feature, please let me know.
P.S.: High performance and low page load time are required.
There are third-party apps that do the heavy lifting, like sorl-thumbnail or easy-thumbnails.
For the first question: storing the image on the file system (or a CDN) and the path in the database is the best approach, and that's what Django does by default.
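Since Pillow is already installed, generating the thumbnails themselves is a few lines. A minimal sketch (the function name and paths are illustrative, not from any library): `Image.thumbnail` resizes in place, preserves the aspect ratio, and never enlarges, so a 200×200 target gives you at most 200 px on the longer side.

```python
from PIL import Image

def make_thumbnail(src_path, dst_path, size=(200, 200)):
    """Create a thumbnail no larger than `size`, preserving aspect ratio."""
    with Image.open(src_path) as im:
        im.thumbnail(size)  # resizes in place; never enlarges the image
        im.save(dst_path)
```

Store only the file paths in Postgres (e.g. a second ImageField for the thumbnail) and serve both files from the file system or a CDN; keeping binary data out of the database is what helps page load time.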
The project I am working on relies on many static files. I am looking for guidance on how to deal with the situation. First I will explain the situation, then I will ask my questions.
The Situation
The files that need management:
Roughly 1.5 million .bmp images, about 100 GB
Roughly 100 .h5 files, 250 MB each, about 25 GB
bmp files
The images are part of an image library; the user can filter through them based on multiple kinds of metadata. The metadata is spread out over multiple models, such as Printer, PaperType, and Source.
In development the images sit in the static folder of the Django project, this works fine for now.
h5 files
Each app has its own set of .h5 files. They are used to inspect user-generated images. Results of this inspection are stored in the database; the image itself is stored on disk.
Moving to production
Now that you know a bit about the problem it is time to ask my questions.
Please note that I have never pushed a Django project to production before. I am also new to Docker.
Docker
The project needs to be deployed on multiple machines; to make this easier I decided to use Docker. I managed to build the image and run the container without the .bmp and .h5 files. So far so good!
How do I deal with the .h5 files? It does not seem like a good idea to build an image that is 25 GB in size. Is there a way to download the .h5 files at a later point in time, i.e. build a Docker image that contains only the code and downloads the .h5 files later?
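Yes, that is a common pattern: keep the image slim and fetch the model files at container start-up from an object store or a plain HTTP server, skipping any file that is already present on a mounted volume. A minimal sketch, assuming the files are reachable over HTTP and that you mount a volume for them (the URL layout and directory are assumptions):

```python
import os
import urllib.request

def ensure_file(url, dest):
    """Download `url` to `dest` unless it already exists; return True if fetched."""
    if os.path.exists(dest):
        return False  # already cached on the volume, nothing to do
    os.makedirs(os.path.dirname(dest) or ".", exist_ok=True)
    tmp = dest + ".part"
    urllib.request.urlretrieve(url, tmp)  # download under a temporary name
    os.replace(tmp, dest)  # atomic rename, so a crash never leaves a half file
    return True
```

Calling `ensure_file` for each .h5 file in the container's entrypoint means the 25 GB is downloaded once per machine (onto the volume), not baked into the image and not re-fetched for every container.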
Image files
I'm pretty sure that Django's collectstatic command is not meant for moving the amount of images the project uses. I'm thinking along the lines of directly uploading the images to some kind of image server.
If there are specialized image servers I would love to hear your suggestions.
I am currently developing a Flask application that dynamically generates images. I save the images to the static/img folder.
But an image is never changed after it is first created.
Does anybody know the issue behind this?
Many thanks.
It could be a caching issue (especially since you're saving to a static folder). Try appending a dummy query parameter, e.g. <your_url>/?123. If you see the new file, then it's a caching issue. One quick and dirty fix would be to generate unique values and append them to the URL, or you can look up cache-busting techniques for GAE.
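One way to automate the cache busting is to derive the query parameter from the file's modification time, so the URL changes exactly when the image does. A sketch (the helper name and static layout are assumptions; in Flask you would expose this as a template filter or by overriding `url_for` for the static endpoint):

```python
import os

def busted_url(filename, static_root="static"):
    """Return a /static URL whose ?v= parameter changes whenever the file does."""
    mtime = int(os.path.getmtime(os.path.join(static_root, filename)))
    return f"/static/{filename}?v={mtime}"
```

Unchanged files keep a stable URL (so browsers can still cache them), while a regenerated image gets a new `?v=` value and is refetched.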
I'm running APEX 19.2 on Oracle 18c, and I would like to get some image URLs to show in the application. The images are stored in the database as BLOBs (not static images).
For the moment, what I did is create an ORDS RESTful service that connects to the database and loads the images. The images are then accessible via a URL that I insert into my pages:
<img src="URL to my RESTful service module with the image identifier">
This works well, but I find it quite complex and, most importantly, it's very slow and the image is not cached. Whenever I load the page I have to wait for the image to load (even though it's very small: 50 KB).
Does anyone have a solution for this, please? Is there an out-of-the-box APEX solution, as there is for static images?
Thanks,
Cheers
There is no direct method to expose BLOBs to the end user, as it would be complicated to secure these files. I can suggest the following two methods:
1. Use the code just like you did, but consider putting it in an application process. This way, you can use all your session variables directly. You will then be able to generate a link that does exactly what you want, or call the process from a button or branch. There is a nice tutorial here:
https://oracle-base.com/articles/misc/apex-tips-file-download-from-a-button-or-link
2. Use APEX_UTIL.GET_BLOB_FILE_SRC.
This function only works from within an APEX session and requires you to set up an application page with an item that holds a primary key to your table. I doubt that this is what you want.
Note that APEX_MAIL.GET_IMAGES_URL does not work for your use case - this only works for files in your shared components application files or workspace files.
I actually like your approach, because it may be more lightweight than 1). That the image gets reloaded every time probably does not depend on the method you are using; it is more likely due to the headers you are sending out. Take a look at the Cache-Control headers on this page:
https://developer.mozilla.org/de/docs/Web/HTTP/Headers/Cache-Control
Maybe check out APEX_MAIL.GET_IMAGES_URL
It is supposed to do essentially what you need so perhaps you can use it.
I am using the S3 storage backend across a Django site I am developing, both to reduce load on the EC2 server(s) and to allow multiple webservers (redundancy, load balancing) to access the same set of uploaded media.
Sorl.thumbnail (v11) template tags are being used in our templates to allow flexible image resizing/cropping.
Performance on media-rich pages is not very good, and when a page containing thumbnails that need to be generated for the first time is accessed, the requests even time out.
I understand that this is due to sorl thumbnail checking/downloading the original image from S3 (which could be quite large and high resolution), and rendering/checking/uploading the thumbnail.
What would you suggest is the best solution to this setup?
I have seen suggestions of storing a local copy of files in addition to the S3 copy (not so great when a couple of servers are used for load balancing). I've also seen it suggested to store 0-byte files to fool sorl.thumbnail.
Are there any other suggestions or better ways of approaching this?
sorl-thumbnail is now built with slow remote storages in mind. The first creation of a thumbnail does query the storage (for example, when it is first accessed from a template), but after that the references are cached in a key-value store. You still need that first query and creation, though. One solution is to call the low-level API sorl.thumbnail.get_thumbnail with the same options when the image is uploaded, and hand that thumbnail-creation job to a queue like Celery.
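A sketch of that idea, assuming sorl-thumbnail and Celery are installed; the task name and geometry string are illustrative and must match whatever options your templates use, or the cache keys will not line up:

```python
# Pre-generate a thumbnail off-request in a Celery worker.
from celery import shared_task
from sorl.thumbnail import get_thumbnail

@shared_task
def warm_thumbnail(image_name):
    # Use the SAME geometry/options as the {% thumbnail %} tag in your
    # templates, so the reference cached here is the one templates hit.
    get_thumbnail(image_name, '100x100', crop='center', quality=95)
```

You would call `warm_thumbnail.delay(instance.image.name)` from a post_save handler, so the expensive S3 round-trip happens once, in the worker, instead of during a page request.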
You can use Sorlery. It combines sorl and celery to create thumbnails via workers. It's very careful not to do any filesystem access outside of the worker thread.
The thumbnail returned immediately (before the worker has had a chance to run) can be controlled by setting THUMBNAIL_DUMMY_SOURCE to an appropriate placeholder.
The job is created the first time the thumbnail is requested; subsequent requests are served the dummy image until the worker thread completes.
Almost the same as #Aidan's solution, but I have made some tweaks to sorl-thumbnail. I also pre-generate thumbnails with Celery. My code is here: sorl_thumbnail-async
But then I learned that easy-thumbnails does exactly what I was trying to do, so I am using it in my current project. You might find it useful; a short post on the topic is here
The easiest solution I've found so far is actually this third party service: http://cloudinary.com/
I have MySQL replication set up, and it replicates the database data nicely. However, I also use FileField and ImageField and have the files stored on the file system. I will probably just use rsync to replicate them manually, but is there a better way?
I know of key-value storage, but for this project I am looking to minimize the number of technologies involved and stick with simple options. I've successfully used rsync for this before, but I was wondering whether others who have done this have any new cool tools (or even rsync wrappers) that work better.
Your experiences are appreciated.
I haven't searched to see if anyone has already done this, but you can write your own code in Django to remotely copy the file to your target server (e.g. over SFTP).
Option 1: create your own form field that extends the ImageField and FileField and does this uploading.
Option 2: in your form/view, call some additional function that does the uploading.
Option 3: override something in Django's code to handle this automatically for ImageField and FileField (probably not recommended, unless there is some slick way I'm not thinking of).
Here's info on using SFTP in Python: SFTP in Python? (platform independent)
If you're using something like Amazon's CloudFront or S3 buckets, then you can use Boto to handle the uploading (I believe): http://aws.amazon.com/code/827?_encoding=UTF8&jiveRedirect=1 (if not, there are probably other Python libraries to help).
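If you do stay with rsync, a thin wrapper triggered from a post_save handler (or a cron job) keeps the technology count low. A sketch under the assumption that rsync is installed and SSH keys are already set up between the servers; the host name and paths are hypothetical:

```python
import subprocess

def rsync_command(src, dest):
    """Build the rsync invocation: -a preserves metadata, -z compresses,
    --delete removes files on the mirror that were removed locally."""
    return ["rsync", "-az", "--delete", src, dest]

def mirror_media(src="/srv/media/", dest="web2:/srv/media/"):
    """Push local media to a peer server (hypothetical host 'web2')."""
    subprocess.run(rsync_command(src, dest), check=True)
```

Running this from cron every few minutes is usually enough; triggering it per upload gives lower latency but more SSH connections.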