I know how to limit the size of each image file in django-summernote. However, I can't find a way to limit the total number of images uploaded. Does anyone know how?
I'm making a bulletin board with Django.
I'm using the Summernote editor and I'm going to use AWS S3 as the image repository.
If users attach too many images to each post, I think the S3 charges will add up.
Is there a way to limit the total size per post, or the total size of uploaded Summernote files?
Or is there a way to limit the number of image uploads per post?
I can already limit the size of each individual image using the Summernote settings.
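One possible approach (a sketch, not a built-in django-summernote feature): since Summernote submits the post body as HTML, you can count the embedded `<img>` tags in a form's `clean` method and reject posts over a limit. The names below (`MAX_IMAGES_PER_POST`, `count_embedded_images`, the `content` field) are my own assumptions, not part of django-summernote:

```python
import re

MAX_IMAGES_PER_POST = 10  # arbitrary limit; pick what fits your S3 budget

def count_embedded_images(html):
    """Count <img> tags in the HTML that Summernote submits as the post body."""
    return len(re.findall(r"<img\b", html, flags=re.IGNORECASE))

# In a Django form this could be used roughly like:
#
# def clean_content(self):
#     content = self.cleaned_data["content"]
#     if count_embedded_images(content) > MAX_IMAGES_PER_POST:
#         raise forms.ValidationError("Too many images in this post.")
#     return content
```

Note this only limits images per saved post; files already uploaded to S3 by the editor would still need separate cleanup.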
I am creating a geospatial web app with a very large database: it currently holds 7.5 million rows and will reach almost 60 million once all the data is uploaded. I want to display all the points from a certain date, and at the moment I simply use Model.objects.filter(datetime__date=requested_date). Each date request takes ~8 s to load. How can I filter more efficiently? If there are better ways to store geospatial data points, I would greatly appreciate any advice!
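A common first fix (a sketch under assumptions; the model and field names are hypothetical): `datetime__date=...` applies a `date()` cast to the column, which many databases cannot serve from a plain index. Adding `db_index=True` to the datetime field and filtering with a half-open range usually lets the index do the work:

```python
from datetime import datetime, timedelta

def day_range(d):
    """Half-open [start, end) datetime range covering one calendar day."""
    start = datetime(d.year, d.month, d.day)
    return start, start + timedelta(days=1)

# With an indexed field, e.g. models.DateTimeField(db_index=True),
# the query becomes an index range scan instead of a per-row date() cast:
#
# start, end = day_range(requested_date)
# Model.objects.filter(datetime__gte=start, datetime__lt=end)
```

For the geospatial side, GeoDjango with PostGIS (and a spatial index on the point column) is the usual route at this scale, but the date filter above is the cheapest change to try first.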
I am storing 6 images in S3 and fetching all of them on a single page load, with one call to S3 per image.
The page load time has already increased by 1-2 seconds, and there are more images to come. After retrieving each image I convert it from a buffer to base64, and I'm not sure that is the right approach.
Any suggestions?
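One suggestion (a sketch assuming boto3; the bucket and key names are placeholders): instead of fetching each object server-side and inlining it as base64, generate presigned URLs and let the browser download the images directly from S3 in parallel. `generate_presigned_url` signs locally, so it adds no extra S3 round trips on your server:

```python
import boto3

s3 = boto3.client("s3")  # assumes AWS credentials are configured as usual

def presigned_image_urls(bucket, keys, expires=3600):
    """Short-lived GET URLs; the browser fetches images straight from S3."""
    return [
        s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": bucket, "Key": key},
            ExpiresIn=expires,
        )
        for key in keys
    ]

# Render these into <img src="..."> tags instead of base64 data URIs;
# the downloads then happen in the client, in parallel, not on your server.
```

Base64 also inflates each payload by roughly a third, so serving URLs tends to be both faster and smaller.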
I am working on a project where a photographer uploads a set of high-resolution pictures (20 to 40 pictures). I am going to store each picture twice: one original and one with a watermark. On the platform, only the watermarked pictures are displayed. The user can buy pictures, and the selected ones (the originals) are sent by email.
bucket-name: photoshoot-fr
main-folder(s): YYYY-MM-DD_MODEL-NAME, for example: 2020-01-03_Melania-Doul
I am not sure whether I should have two different folders inside each main folder, original and protected. Both folders would contain the exact same pictures with the same ids, but one stores the originals and the other the protected (watermarked) pictures. Is there a better bucket design?
N.B.: it's a personal project, but there are multiple photographers and each photographer will upload 2-3 sets of photos every day. Each main folder will be deleted after 2 months.
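A sketch of one way to lay out the keys (the helper below just encodes the original/protected scheme described above; it is not an established convention). Keeping both variants under the same shoot prefix also lets a single S3 lifecycle rule expire everything about 2 months after upload, instead of deleting each main folder by hand:

```python
def object_keys(shoot_folder, picture_id, ext="jpg"):
    """Keys for the original and the watermarked copy of one picture.

    shoot_folder is e.g. "2020-01-03_Melania-Doul".
    """
    return {
        "original": f"{shoot_folder}/original/{picture_id}.{ext}",
        "protected": f"{shoot_folder}/protected/{picture_id}.{ext}",
    }

# A bucket-wide lifecycle rule (set once in the S3 console or via the API)
# can expire every object ~60 days after upload:
#
# {"Rules": [{"ID": "expire-shoots", "Status": "Enabled",
#             "Filter": {"Prefix": ""}, "Expiration": {"Days": 60}}]}
```

With this layout you can also keep the `original/` prefix private and grant public (or signed-URL) access only under `protected/`.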
I am having issues with Prestashop 1.6 and I hope I could find some help in here.
Basically, I uploaded a CSV with all the products and everything worked fine. After that I uploaded another CSV with the combinations, which includes the images for the product colors, and again no problems: it simply added these images to the ones previously uploaded with the products.
The issue occurs when I re-import the combinations CSV: it duplicates the images from the previous import. I tried adding a "Delete existing images" column set to 1 (= yes), but the problem is that it also removes the product images!
Does anyone have any idea?
Thanks in advance,
Pat
PrestaShop treats the images from the first CSV as product images.
Even if you set "Delete existing images" to 1, PrestaShop won't find them, because that option deletes the combination (attribute) images, not the product images.
My advice is not to upload images for products with attributes in the first CSV. The general product will then use the default combination's image as its main image.