In workflow file (.WSP) deployment to the same site, the uploaded file overrides the existing workflow file, but I need to create a duplicate file - sharepoint-2013

In the workflow file (.WSP) deployment process to the same site, the uploaded file overrides the existing workflow file. I need it to create a duplicate of the file instead of overriding the existing one.

Related

Django File object and S3

So I have added S3 support to one of my Django projects (using django-storages and boto3).
I have a model with a file field that holds a zip archive of images.
At some point I need to access this zip archive and parse it to create instances of another model from the images in the archive. It looks something like this:
Access the archive data with zipfile
Get an image from it
Put the image into a Django File object
Attach that File object to the model field
Save the model
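In code, the flow is roughly this (ImageModel, the archive/picture field names and import_images are placeholders I am assuming, not taken from the question):

import zipfile
from django.core.files import File

def import_images(archive):
    # archive.file is the FileField holding the uploaded zip archive
    with zipfile.ZipFile(archive.file) as zf:
        for name in zf.namelist():
            member = zf.open(name)                              # file-like object for one image
            image = ImageModel(archive=archive)                 # instance of the other model
            image.picture.save(name, File(member), save=False)  # wrap in a Django File and attach
            image.save()                                        # save the model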
It works perfectly fine without S3, but with it I get an UnsupportedOperation: seek error.
My guess is that boto3/storages does not support uploading files to S3 from in-memory file objects. Is that the case? If so, how do I fix or avoid this in this kind of situation?
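One common workaround, assuming the seek error comes from the non-seekable object returned by ZipFile.open(), is to read each member fully into memory and hand Django a ContentFile, which is backed by a seekable BytesIO:

import zipfile
from django.core.files.base import ContentFile

def import_images(archive):
    with zipfile.ZipFile(archive.file) as zf:
        for name in zf.namelist():
            data = zf.read(name)                                       # read the whole member into memory
            image = ImageModel(archive=archive)                        # placeholder model, as above
            image.picture.save(name, ContentFile(data), save=False)    # ContentFile is seekable
            image.save()

This buffers each image fully in memory, which is usually fine for typical image sizes but worth keeping in mind for very large archives.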

How to allow Google Cloud Storage to save duplicate files and not overwrite

I am using a Google Cloud Storage bucket to save file uploads from my Django web application. However, if a file with the same name is uploaded, it overwrites the existing file, and I do not want this to happen; I want to allow duplicate files to be saved at the same location. Before moving to Google Cloud Storage, when I used my computer's hard disk to save files, Django used to smartly update the filename in the database as well as on the hard disk.
I upload files with the name given by users and concatenate a timestamp including seconds and milliseconds, but clients see the name of the file as they added it, since I remove that part of the string when it is displayed in the view.
example
image1-16-03-2022-12-20-32-user-u123.pdf
image1-27-01-2022-8-22-32-usuario-anotheruser.pdf
both users would see the name image1
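As a sketch, that naming scheme can live in an upload_to callable on the FileField (the owner_code field, Document model and uploads/ prefix are illustrative, not from the question):

import os
from datetime import datetime
from django.db import models

def stamped_upload_to(instance, filename):
    base, ext = os.path.splitext(filename)             # "image1.pdf" -> ("image1", ".pdf")
    stamp = datetime.now().strftime("%d-%m-%Y-%H-%M-%S-%f")
    # e.g. "uploads/image1-16-03-2022-12-20-32-123456-user-u123.pdf"
    return f"uploads/{base}-{stamp}-user-{instance.owner_code}{ext}"

class Document(models.Model):
    owner_code = models.CharField(max_length=32)
    file = models.FileField(upload_to=stamped_upload_to)

Separately, if the bucket is wired up through django-storages, my understanding is that setting GS_FILE_OVERWRITE = False restores Django's usual behaviour of appending a suffix instead of overwriting.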

Google Cloud Storage bucket has stopped overwriting files by default when uploading with the Python library

I have an App Engine cron job that runs every week, uploading a file called logs.json to a Google Cloud Storage bucket.
For the past few months, this file has been overwritten each time the new version was uploaded.
In the last few weeks, rather than overwriting the file, the existing copy has been retained and the new one uploaded under a different name, e.g. logs_XHjYmP3.json.
This is a simplified snippet from the Django storage class where the upload is performed. I have verified that the filename is correct at the point of upload:
# Prints 'logs.json'
print(file.name)
blob.upload_from_file(file, content_type=content_type)
blob.make_public()
Reading the documentation, it says:
The effect of uploading to an existing blob depends on the “versioning” and “lifecycle” policies defined on the blob’s bucket. In the absence of those policies, upload will overwrite any existing contents.
The versioning for the bucket is set to suspended, and I'm not aware of any other settings or any changes I have made that would affect this.
How can I make the file upload overwrite any existing file with the same name?
After further testing, although print(file.name) looked correct, the incorrect filename was actually coming from Django's get_available_name() storage class method, which tries to generate a unique filename if the file already exists. I have overridden that method in my custom storage class so that, if the file meets the criteria, it simply returns the existing name and allows the overwrite. I'm still not sure why it started doing this, however.
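For reference, a minimal version of that override, assuming the django-storages GCS backend (the "criteria" check is omitted here, so every upload overwrites):

from storages.backends.gcloud import GoogleCloudStorage

class OverwritingGoogleCloudStorage(GoogleCloudStorage):
    def get_available_name(self, name, max_length=None):
        # Return the requested name unchanged so an existing blob with
        # the same name is overwritten rather than renamed (e.g. logs_XHjYmP3.json).
        return name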

Django how to upload a file directly to a 3rd-party storage server, like Cloudinary, S3

Now, I have realized the uploading process works like this:
1. Django builds the HTTP request object and sets request.FILES by using an upload handler.
2. In views.py, the FieldFile instance, which mirrors the FileField, calls storage.save() to upload the file.
So, as you can see, Django always passes the data through memory or disk, and if your file is too large this costs too much time.
The design I have in mind to solve this problem is a custom upload handler that calls storage.save() on the raw input data (see the sketch below). The only question is: how can I modify the behaviour of FileField?
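A rough skeleton of such a custom upload handler; the remote-storage calls (start_remote_upload and its session methods) are hypothetical placeholders, since the real calls depend on the backend (S3 multipart upload, Cloudinary's upload API, etc.):

from django.core.files.uploadhandler import FileUploadHandler, StopFutureHandlers

class DirectToStorageUploadHandler(FileUploadHandler):
    def new_file(self, *args, **kwargs):
        super().new_file(*args, **kwargs)
        # Placeholder: open a streaming upload session on the remote storage,
        # e.g. start an S3 multipart upload keyed by self.file_name.
        self.session = start_remote_upload(self.file_name)
        raise StopFutureHandlers()  # keep the default memory/disk handlers out of the way

    def receive_data_chunk(self, raw_data, start):
        # Forward each chunk to the remote storage instead of buffering it locally.
        self.session.send_chunk(raw_data)
        return None  # returning None stops the chunk from propagating to other handlers

    def file_complete(self, file_size):
        # Finish the remote upload and return a file-like object;
        # whatever is returned here ends up in request.FILES.
        return self.session.finish(file_size)

This only covers getting the bytes off the request without buffering them; the packages suggested in the answers below take the other common route of letting the browser upload directly to the storage provider.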
Thanks for any help.
You can use this package:
Add direct uploads to AWS S3 functionality with a progress bar to file input fields.
https://github.com/bradleyg/django-s3direct
You can use one of the following packages
https://github.com/cloudinary/pycloudinary
http://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html
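For the django-storages route, the settings.py wiring is roughly this (bucket name, keys and region are placeholders); note that with this backend the file still passes through your server rather than going directly from the browser to S3:

# settings.py
DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
AWS_STORAGE_BUCKET_NAME = "my-bucket"      # placeholder bucket name
AWS_ACCESS_KEY_ID = "..."                  # usually read from the environment
AWS_SECRET_ACCESS_KEY = "..."
AWS_S3_REGION_NAME = "us-east-1"           # placeholder region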

Upload txt file to server

I want to create a form to upload files (txt, xls) to the server, not to the database.
Does anyone know of any example showing how I can do this?
In order to get the file onto the database server's file system, you would first have to upload the file to the database, which it sounds like you are already familiar with. From there, you can use the UTL_FILE package to write the BLOB out to the database server's file system.
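If this is a Django project like the rest of the questions on this page, a minimal form and view that writes the upload straight to a directory on the server (the /srv/uploads/ path and template name are illustrative) could look like:

from django import forms
from django.shortcuts import render

class UploadForm(forms.Form):
    document = forms.FileField()

def upload_view(request):
    form = UploadForm(request.POST or None, request.FILES or None)
    if request.method == "POST" and form.is_valid():
        uploaded = form.cleaned_data["document"]
        # Write the file chunk by chunk so large uploads are not held in memory.
        with open(f"/srv/uploads/{uploaded.name}", "wb") as destination:
            for chunk in uploaded.chunks():
                destination.write(chunk)
    return render(request, "upload.html", {"form": form})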