Upload image to Firebase Storage from django models in FileField - django

I need to upload an image to Firebase Storage, and I'm thinking of doing it with a post_save signal or in the save method. But the Firebase documentation only covers the JavaScript Web SDK, so how can I do it from models.py? Here is the reference for uploading with Firebase Web:
https://firebase.google.com/docs/storage/web/upload-files

You're going to want to use google-cloud-storage for this:
# Import
from google.cloud import storage
# Initialize
client = storage.Client()
bucket = client.get_bucket('bucket-id-here')
# Download
blob = bucket.get_blob('remote/path/to/file.txt')
print(blob.download_as_string())
# Upload
blob2 = bucket.blob('remote/path/storage.txt')
blob2.upload_from_filename(filename='/local/path.txt')
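
To tie this back to the post_save idea from the question, here is a minimal sketch; the Photo model, its image field, and the bucket name are assumptions, not anything from the question (Firebase Storage buckets are ordinary GCS buckets, typically named '<project-id>.appspot.com'):
from django.db.models.signals import post_save
from django.dispatch import receiver
from google.cloud import storage

@receiver(post_save, sender=Photo)  # Photo is a hypothetical model with a FileField named 'image'
def upload_to_firebase(sender, instance, created, **kwargs):
    if not instance.image:
        return
    client = storage.Client()
    bucket = client.get_bucket('your-project-id.appspot.com')
    blob = bucket.blob(instance.image.name)
    # Assumes the default storage keeps a local copy, so instance.image.path exists
    blob.upload_from_filename(instance.image.path)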

Related

How to Update file name in FileField after file is uploaded on s3 via presigned post URL in Django?

I have integrated django-storages into a Django project. For large files, I use a pre-signed URL so uploads go to S3 directly without putting load on my server.
Files upload successfully to the AWS S3 bucket via the pre-signed URL; after the upload I need to update the file's name in the FileField.
You probably need something like the following, using boto3 to download the file from S3 under the new name and Django's File wrapper to attach it to the model.
import boto3
from django.core.files import File

s3 = boto3.client('s3')
new_filename = 'new_file_name.txt'
# Download the object from S3 into a local file with the new name
s3.download_file('your-bucket-name', 'object-key', new_filename)
...
with open(new_filename, 'rb') as f:
    file_obj = File(f, name=new_filename)
    my_model.file_field_name = file_obj
    my_model.save()
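
If the goal is just to rename the object inside S3 without downloading it, note that S3 has no rename operation, but a server-side copy plus delete achieves the same thing. A minimal sketch, with placeholder bucket and key names:
import boto3

s3 = boto3.client('s3')
bucket = 'your-bucket-name'
old_key, new_key = 'object-key', 'renamed-object-key'
# Server-side copy to the new key, then remove the original object
s3.copy_object(Bucket=bucket, CopySource={'Bucket': bucket, 'Key': old_key}, Key=new_key)
s3.delete_object(Bucket=bucket, Key=old_key)
# A FileField stores only the key, so updating .name avoids re-uploading
my_model.file_field_name.name = new_key
my_model.save(update_fields=['file_field_name'])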

Copy a folder from server to Azure blob using django-storage

How can I upload a folder (media_root/folder), which contains subfolders and files, to an Azure blob container? I can upload a single file to the container using this:
from django.core.files.storage import default_storage
f = open('media_root/folder/file.csv', 'rb')
default_storage.save(path, f)
I have set the AzureStorage class in my settings.py:
DEFAULT_FILE_STORAGE = 'storages.backends.azure_storage.AzureStorage'
Any help would be much appreciated.
As the django-storages docs show, there is no method to upload a whole folder to an Azure Storage blob container; it can only upload individual files.
If you want to upload a folder (make sure it is not empty) to Azure blob storage, see the following Python code.
from azure.storage.blob import BlockBlobService  # legacy azure-storage-blob v2 SDK
import os

def run_sample():
    block_blob_service = BlockBlobService("your_account_name", "your_account_key")
    container_name = 'test1'
    path_remove = "F:\\"
    local_path = "F:\\folderA"
    # Walk the local folder and upload every file, preserving its relative path
    for r, d, f in os.walk(local_path):
        for file in f:
            file_path_on_azure = os.path.join(r, file).replace(path_remove, "")
            file_path_on_local = os.path.join(r, file)
            block_blob_service.create_blob_from_path(container_name, file_path_on_azure, file_path_on_local)

# Main method.
if __name__ == '__main__':
    run_sample()
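
Alternatively, since the question already uses default_storage with the Azure backend, a sketch that walks the folder and saves each file through django-storages itself; upload_folder is a hypothetical helper, not part of any library:
import os
from django.core.files.storage import default_storage

def upload_folder(local_root, remote_prefix=''):
    # Save every file under local_root through the configured storage backend
    for dirpath, _, filenames in os.walk(local_root):
        for name in filenames:
            local_path = os.path.join(dirpath, name)
            # Keep the blob path relative to the folder being uploaded
            remote_path = os.path.join(remote_prefix, os.path.relpath(local_path, local_root))
            with open(local_path, 'rb') as f:
                default_storage.save(remote_path, f)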

Django AES Encryption : how encrypt user-uploaded files before they are saved?

I want to encrypt user-uploaded files in Django before saving them.
When a user sends files through a POST request, I get an object of type InMemoryUploadedFile.
How can I encrypt the files before saving them? I currently use pyAesCrypt, but I can't manage to pass the InMemoryUploadedFile objects into it. So far I can only encrypt files after they have been saved, with:
import pyAesCrypt
with open("*FileName*", "rb") as InputFile:
    with open("*OutputFileName*", "wb+") as OutputFile:
        pyAesCrypt.encryptStream(InputFile, OutputFile, Password, BufferSize)
I recently asked this question, and a user told me to use a package with better community support: pyca/cryptography. I was stuck on the same thing and found a solution. Note that I use Django REST Framework.
from cryptography.fernet import Fernet
# Generate a key and store safely
key = Fernet.generate_key()
f = Fernet(key)
I'll take an Excel file as an example, but you could actually use any file.
import pandas as pd
import io
from django.core.files.uploadedfile import SimpleUploadedFile

# Load the uploaded file from the request into a dataframe
df = pd.read_excel(request.FILES['file_name'])
# Create a BytesIO buffer
output = io.BytesIO()
# Write the dataframe to the buffer
df.to_excel(output, index=False)
# Encrypt the buffer's contents
encrypted_out = f.encrypt(output.getvalue())
# Wrap the encrypted bytes as an uploaded file
output_file = SimpleUploadedFile('<some_file_name.extension>', encrypted_out)
# Typically you would pass this through a serializer.
# Typically you would pass this through a serializer.
To decrypt the file before serving it, read the encrypted file, decrypt it with the same key, write the result to a BytesIO, and serve that to the user.
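A minimal sketch of that decryption side, assuming the same f = Fernet(key) object from above; my_model.file_field is a hypothetical FileField holding the encrypted bytes:
import io

# Read the encrypted bytes back from storage (hypothetical model and field names)
encrypted_bytes = my_model.file_field.read()
decrypted = f.decrypt(encrypted_bytes)
buffer = io.BytesIO(decrypted)
buffer.seek(0)
# buffer can now be served, e.g. with django.http.FileResponse(buffer, filename='...')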

Using Django's ImageField, how to store files in STATIC_ROOT?

Using Django's ImageField(), how would I store the image file in STATIC_ROOT rather than MEDIA_ROOT?
I realise this is a bad idea for images which are uploaded through the admin, but these images are stored with the rest of the code in git, and attached to the objects when the site is set up using a script which imports all the objects' data. The images are not added/edited/deleted after this.
You can import FileSystemStorage, specify your custom upload path, and pass the FileSystemStorage object as the storage argument to ImageField():
from django.core.files.storage import FileSystemStorage
from django.conf import settings

upload_storage = FileSystemStorage(location=settings.STATIC_ROOT, base_url='/uploads')

class ABCModel(models.Model):
    ...
    # upload_to must be a relative path; files land under STATIC_ROOT/<upload_to>/
    image = models.ImageField(upload_to='your_image_dir', storage=upload_storage)
References:
FileField - ImageField inherits from FileField
FileSystemStorage
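
A quick usage sketch, with hypothetical file names, showing where files end up under this configuration:
from django.core.files.base import ContentFile

obj = ABCModel()
# Written to STATIC_ROOT/your_image_dir/logo.png on disk
obj.image.save('logo.png', ContentFile(b'...'), save=True)
print(obj.image.url)  # '/uploads/your_image_dir/logo.png', built from base_url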

How to save image located at URL to S3 with Django on Heroku?

I'm using django-storages for a project hosted on Heroku. I have django-storages configured and am now serving static files from S3 just fine.
I want to programmatically take an image located at a URL, upload it to S3, and associate it with a model.
I know how to download files from a URL with either urllib2 or requests.
What I want to know is how best to get that image onto S3 and have it available to me to use with my model and associated templates. File size is indeterminate, but will probably always be less than 10MB in size.
My questions boil down to:
How do I get image located at an url onto S3?
Once image is on S3, how do I associate that image with a model?
Because of any inherent limitations on Heroku is there a way to avoid saving to Heroku storage and "stream" the file right to S3...or is this a non-issue?
I had the same problem; I use the following method to save the image:
import boto
from boto.s3.key import Key
import urllib2
import StringIO
from django.conf import settings

def save_image_s3(self, img_url, img_name):
    """Saves the image at img_url into S3 under the name img_name"""
    conn = boto.connect_s3(settings.AWS_ACCESS_KEY_ID, settings.AWS_SECRET_ACCESS_KEY)
    bucket = conn.get_bucket(settings.AWS_STORAGE_BUCKET_NAME)
    k = Key(bucket)
    k.key = img_name
    # Read the remote image into memory and stream it to S3 without a temp file
    fp = StringIO.StringIO(urllib2.urlopen(img_url).read())
    k.set_contents_from_file(fp)
    return img_name
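
To associate the image with a model (the second question), a sketch assuming django-storages' S3 backend is the field's storage, so saving the field streams the bytes straight to S3 from memory without touching Heroku's filesystem; MyModel and its image field are hypothetical:
import requests
from django.core.files.base import ContentFile

resp = requests.get(img_url)
instance = MyModel.objects.get(pk=some_pk)  # hypothetical model
# .save() on the field pushes the bytes through the configured S3 storage backend
instance.image.save('remote_pic.jpg', ContentFile(resp.content), save=True)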