I am trying to implement file upload using Suave and I'm failing to get it working. What I am doing is a multipart form post, and I can find the file in ctx.request.files.[0]. However, when trying to read the tempFilePath with Read permission, I get a sharing violation. When done, the file has size 0. Am I supposed to access it through a stream somewhere else?
I have a 121 MB MP3 file I am trying to upload to my AWS S3 bucket so I can process it via Amazon Transcribe.
The MP3 file comes from an MP4 file I stripped the audio from using FFmpeg.
When I try to upload the MP3, using the S3 object upload UI in the AWS console, I receive the below error:
InvalidPart
One or more of the specified parts could not be found. The part may not have been uploaded, or the specified entity tag may not match the part's entity tag.
The error makes reference to the MP3 being a multipart file and to the "next" part being missing, but it's not a multipart file.
I have re-run the MP4 file through FFmpeg three times in case the first file was corrupt, but that has not fixed anything.
I have searched a lot on Stack Overflow and have not found a similar case where anyone uploading a single 5 MB+ file received the error I'm getting.
I've also ruled out FFmpeg as the cause by exporting the audio from VLC as an MP3 file, but I receive the exact same error.
What is the issue?
Here's the console in case it helps:
121 MB is below the 160 GB single-object upload limit of the S3 console, the 5 GB single-object upload limit of the REST API / AWS SDKs, as well as the 5 TB limit on multipart uploads, so I really can't see the issue.
Assuming the file exists and you have a stable internet connection (so no corrupted uploads), you may somehow have incomplete multipart upload parts in your bucket that are conflicting with the upload, so either follow this guide to remove them and try again, or try creating a new folder/bucket and re-uploading.
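If you want to check for leftover parts programmatically rather than through the console, here is a minimal sketch using boto3's low-level client; the bucket name is a placeholder.

    import boto3

    s3 = boto3.client("s3")
    bucket = "my-bucket"  # placeholder: use your bucket name

    # List any multipart uploads that were started but never completed or
    # aborted, then abort them so their parts no longer linger in the bucket.
    resp = s3.list_multipart_uploads(Bucket=bucket)
    for upload in resp.get("Uploads", []):
        print("aborting", upload["Key"], upload["UploadId"])
        s3.abort_multipart_upload(
            Bucket=bucket, Key=upload["Key"], UploadId=upload["UploadId"]
        )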
You may also have a browser caching issue or an extension conflict, so try incognito (with extensions disabled) or another browser if re-uploading to another bucket/folder doesn't work.
Alternatively, try the AWS CLI s3 cp command or a quick "S3 file upload" application in a supported SDK language to make sure that it's not a console UI issue.
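For the SDK route, a minimal boto3 sketch (roughly equivalent to aws s3 cp) could look like the following; the bucket name, object key, and local file path are placeholders.

    import boto3

    s3 = boto3.client("s3")

    # upload_file transparently switches to multipart for large files and
    # retries failed parts, so a 121 MB MP3 is handled automatically.
    s3.upload_file(
        Filename="audio.mp3",                  # placeholder: local MP3 path
        Bucket="my-bucket",                    # placeholder: target bucket
        Key="transcribe-input/audio.mp3",      # placeholder: object key
    )
    print("upload finished")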
I'm trying to performance-test the APIs of a CorDapp (built with open source Corda) using JMeter.
One of the APIs requires a file upload. When I try to upload a file multiple times (using more than one thread), I get a 'net.corda.nodeapi.exceptions.DuplicateAttachmentException' response along with status code 500.
My guess is that the generated hash takes both the file and a timestamp as inputs, and when I do multiple uploads at the same instant the timestamp stays the same, which produces the same hash for every HTTP request and hence the error above. How can I solve this? Without successfully hitting this API I cannot test the subsequent APIs.
Corda prohibits uploading the same attachment file twice; attachments are addressed by the hash of their content, so two uploads of identical bytes will always collide, regardless of timing. This is why you are seeing this error.
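One way to load-test the endpoint anyway is to make every request upload different bytes. A minimal sketch (not Corda-specific; the file names and count are hypothetical) that pre-generates unique zip attachments for the JMeter threads to pick from:

    import uuid
    import zipfile
    from pathlib import Path

    def make_unique_attachments(out_dir, count):
        """Create `count` zip files whose contents (and thus hashes) all differ."""
        out = Path(out_dir)
        out.mkdir(parents=True, exist_ok=True)
        paths = []
        for i in range(count):
            path = out / "attachment_{}.zip".format(i)
            with zipfile.ZipFile(path, "w") as zf:
                # A random UUID inside the single entry guarantees a unique
                # content hash for every generated attachment.
                zf.writestr("payload.txt", "load-test-{}".format(uuid.uuid4()))
            paths.append(path)
        return paths

    if __name__ == "__main__":
        for p in make_unique_attachments("attachments", 10):
            print(p)

Each JMeter thread can then reference a different file (e.g. via a CSV Data Set Config), so no two requests carry the same attachment.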
As I read the different posts here and the libtorrent documentation, I know (as documented) that I have to download the torrent file in order to get the metadata. But this is how the uTorrent app works: when I just start downloading, I get the metadata within a second, and after getting the metadata I can pause the download. So it doesn't restrict me to downloading the complete file in order to return the metadata.
So, is there a way to get the metadata without downloading the complete file?
libtorrent's metadata_received_alert is what you want. This will be sent once the metadata finishes downloading. Do make sure that you're receiving status notifications, though.
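A minimal sketch of that flow with libtorrent's Python bindings (the magnet URI is a placeholder): add the magnet link, wait for metadata_received_alert, read the metadata, then pause the torrent so the payload itself is never downloaded.

    import time
    import libtorrent as lt

    # Enable status notifications so metadata_received_alert is actually posted.
    ses = lt.session({"alert_mask": lt.alert.category_t.status_notification})

    params = lt.parse_magnet_uri("magnet:?xt=urn:btih:...")  # placeholder magnet URI
    params.save_path = "."
    handle = ses.add_torrent(params)

    done = False
    while not done:
        for alert in ses.pop_alerts():
            if isinstance(alert, lt.metadata_received_alert):
                info = handle.torrent_file()  # metadata just fetched from peers
                print("name:", info.name(), "size:", info.total_size())
                fs = info.files()
                for i in range(fs.num_files()):
                    print(fs.file_path(i), fs.file_size(i))
                handle.pause()  # stop before any payload pieces are downloaded
                done = True
        time.sleep(0.5)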
I am able to create a test to send a file to my API as expected. This works.
What I need to do now is change the file sent to my API based on data in the JSON data file used for collection runs. How would you accomplish this?
I am doing the following steps:
I have an API Gateway (PUT method) which is integrated with an AWS Lambda.
It is a direct mapping in multipart/form-data (so big logic is happening here).
Now I upload the file through Postman.
The file does get uploaded.
When I download this ZIP file, it says "End-of-central-directory signature not found. Either this file is not a Zip file, or it constitutes one disk of a multi-part Zip file."
Then I opened the ZIP in Notepad++ (yes, I did that), and I can see only a few lines of binary data, whereas my original file has a lot of it.
Please help, and let me know if more information is needed.
I had the same issue, and it turned out I was sending the request without all the needed headers. It worked fine from Postman. Please check whether your request contains these Postman headers.
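For completeness, a minimal sketch of what the Lambda side can look like, under a couple of assumptions (proxy integration, multipart/form-data registered as a binary media type so the body arrives base64-encoded, and the Content-Type header with its boundary actually reaching the function); if any of those is missing, the ZIP bytes end up mangled exactly as described above. The names here are illustrative, not the asker's actual code.

    import base64
    from email.parser import BytesParser
    from email.policy import default

    def lambda_handler(event, context):
        # Header names are case-insensitive; normalise before looking them up.
        headers = {k.lower(): v for k, v in (event.get("headers") or {}).items()}
        content_type = headers.get("content-type", "")

        body = event.get("body") or ""
        raw = base64.b64decode(body) if event.get("isBase64Encoded") else body.encode()

        # Re-attach the Content-Type (with its boundary) so the stdlib parser
        # can split the multipart body into its parts.
        message = BytesParser(policy=default).parsebytes(
            b"Content-Type: " + content_type.encode() + b"\r\n\r\n" + raw
        )
        for part in message.iter_parts():
            filename = part.get_filename()
            if filename:
                payload = part.get_payload(decode=True)
                print(filename, len(payload), "bytes")  # should match the original size

        return {"statusCode": 200, "body": "ok"}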