Attempts at uploading a video to Vimeo - vimeo-api

I have to upload a video to Vimeo. I am using resumable PUT uploads: I upload the video and get a 200 response code. Thereafter I verify the upload, but the range the API returns does not match my file size.
So how many attempts can we make to achieve a complete 100% upload of our video file, so that we can then make the delete call?

There is no limit to the number of attempts you can make, but each request should resume where the previous one left off until the entire file is uploaded.
For example:
1. You upload a 100 MB video.
2. You verify the bytes uploaded, but Vimeo claims you have only uploaded 50 MB.
3. You upload MB 51 through 100.
4. You verify the bytes uploaded, but Vimeo claims you have only uploaded a total of 90 MB, including step 1.
5. You upload MB 91 through 100.
6. You verify the bytes uploaded, and Vimeo claims it has received 100 MB.
7. You perform the "delete" call to wrap up the upload flow.
The verification/resume loop can happen any number of times.
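A minimal sketch of that loop in Python with requests, assuming the older "streaming" (resumable PUT) upload flow; the upload_link placeholder, the file name, and the exact header handling are assumptions, so check them against the upload ticket your app actually receives:

import os
import requests

upload_link = '<upload URL from your upload ticket>'   # hypothetical placeholder
file_path = 'video.mp4'                                # hypothetical file name
total = os.path.getsize(file_path)

while True:
    # Ask Vimeo how many bytes it has actually received.
    verify = requests.put(upload_link, headers={'Content-Range': 'bytes */*'})
    # The Range header typically looks like "bytes=0-52428799".
    received = int(verify.headers['Range'].split('-')[-1]) + 1
    if received >= total:
        break   # everything arrived; now make the DELETE call to finish the flow

    # Resume from the first byte Vimeo does not have yet.
    with open(file_path, 'rb') as f:
        f.seek(received)
        requests.put(
            upload_link,
            data=f,
            headers={'Content-Range': f'bytes {received}-{total - 1}/{total}'},
        )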

Related

AWS s3 object boto3, how to stream upload while streaming download

I have a function that gets an object from one bucket and uploads it to another bucket. My file sizes are unpredictable, so what I do is allocate more memory than I need most of the time.
Ideally what I want to do is stream the download/upload so I do not have to give it more memory than what it needs.
1. Stream the download from bucketA (a chunk at a time)
2. Stream the upload to bucketB
3. Remove the uploaded chunk from the buffer
4. Repeat step 1 until all chunks have been transferred
This way, I'm only buffering the chunk size during the whole process.
So far I know that streaming the download is possible:

import boto3

s3 = boto3.client('s3')
response = s3.get_object(Bucket='bucket-name', Key=file)
for i, line in enumerate(response['Body'].iter_lines()):
    # upload line by line
How do I upload per "line" with put_object while also validating integrity with an MD5 hash?
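One way to get both the chunk-at-a-time buffering and a per-chunk MD5 check is a multipart upload instead of put_object (put_object needs the whole body at once). A rough sketch with boto3, where the bucket names, key, and 8 MB chunk size are assumptions; note S3 requires every part except the last to be at least 5 MB:

import base64
import hashlib
import boto3

s3 = boto3.client('s3')
SRC_BUCKET, DST_BUCKET, KEY = 'bucketA', 'bucketB', 'video.mp4'   # hypothetical names
CHUNK = 8 * 1024 * 1024   # parts must be at least 5 MB, except the last one

body = s3.get_object(Bucket=SRC_BUCKET, Key=KEY)['Body']
mpu = s3.create_multipart_upload(Bucket=DST_BUCKET, Key=KEY)

parts, part_number = [], 1
while True:
    chunk = body.read(CHUNK)          # only one chunk is buffered at a time
    if not chunk:
        break
    md5 = base64.b64encode(hashlib.md5(chunk).digest()).decode()
    resp = s3.upload_part(
        Bucket=DST_BUCKET, Key=KEY, UploadId=mpu['UploadId'],
        PartNumber=part_number, Body=chunk,
        ContentMD5=md5,               # S3 rejects the part if the digest does not match
    )
    parts.append({'PartNumber': part_number, 'ETag': resp['ETag']})
    part_number += 1

s3.complete_multipart_upload(
    Bucket=DST_BUCKET, Key=KEY, UploadId=mpu['UploadId'],
    MultipartUpload={'Parts': parts},
)

Reading fixed-size chunks rather than lines keeps each part above the 5 MB minimum and avoids splitting binary files such as video on newline bytes.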

Amazon S3 multipart upload: Can I upload a partial file to a part?

I am recording a time-sensitive video, so I want to reduce upload latency as much as possible. What I want to do is upload the recorded chunks while I'm still recording.
So let's say I am recording a 10-minute video. Instead of uploading the entire video at the end, I want to record, say, a minute, upload that as a part of my multipart upload, and meanwhile record the next minute. Once the recording is complete, I only need to upload the last minute and then call S3 multipart to complete the upload. This way, the latency between the end of the recording and when it's available in S3 is closer to the upload time of 1 minute of video instead of the upload time for 10 minutes of video.
Is this possible with S3 multipart uploads?
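That pattern is what multipart uploads allow: parts can be uploaded independently while recording continues, as long as every part except the last is at least 5 MB (the same limit quoted in the multipart-size answer further down). A rough sketch with boto3, where the bucket, key, and per-minute segment files are hypothetical:

import boto3

s3 = boto3.client('s3')
BUCKET, KEY = 'my-bucket', 'recordings/session.mp4'   # hypothetical names

mpu = s3.create_multipart_upload(Bucket=BUCKET, Key=KEY)
parts = []

def upload_segment(path, part_number):
    # Called as soon as a one-minute segment finishes recording;
    # every part except the last must be at least 5 MB.
    with open(path, 'rb') as f:
        resp = s3.upload_part(Bucket=BUCKET, Key=KEY, UploadId=mpu['UploadId'],
                              PartNumber=part_number, Body=f)
    parts.append({'PartNumber': part_number, 'ETag': resp['ETag']})

# ... keep recording, calling upload_segment('minute-1.mp4', 1), ('minute-2.mp4', 2), ...

s3.complete_multipart_upload(Bucket=BUCKET, Key=KEY, UploadId=mpu['UploadId'],
                             MultipartUpload={'Parts': parts})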

Doubts using Amazon S3 monthly calculator

I'm using Amazon S3 to store videos and some audio files (average size of 25 MB each), and users of my web and Android apps (so far) can access them with no problem, but I want to know how much I'll pay once I exceed the S3 free tier, so I checked the S3 monthly calculator.
I saw that there are 5 fields:
Storage: I put 3 GB because right now there are 130 files (videos and audio)
PUT/COPY/POST/LIST Requests: I put 15 because I'll manually upload around 10-15 files each month
GET/SELECT and Other Requests: I put 10,000 because a projection tells me the users will watch/listen to those files around 10,000 times monthly
Data Returned by S3 Select: I put 250 GB (10,000 x 25 MB)
Data Scanned by S3 Select: I don't know what to put because I don't need Amazon to scan or analyze those files.
Am I using the calculator properly?
What do I need to put in "Data Scanned by S3 Select"?
Can I put only zero?
For audio and video, you can definitely specify 0 for S3 Select -- both data scanned and data returned.
S3 Select is an optional feature that only works with certain types of text files -- like CSV and JSON -- where you make specific requests for S3 to scan through the files and return matching values, rather than you downloading the entire file and filtering it yourself.
This would not be used with audio or video files.
Also, don't overlook "Data transfer out." In addition to the "get" requests, you're billed for bandwidth when files are downloaded, so this needs to show the total size of all the downloads. This line item is data downloaded from S3 via the Internet.
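For context, this is roughly what an S3 Select request looks like with boto3, assuming a hypothetical CSV object; the "data scanned/returned by S3 Select" line items only apply to calls like this, not to ordinary GETs of audio or video:

import boto3

s3 = boto3.client('s3')

# Hypothetical CSV object, used only to show what S3 Select is for.
resp = s3.select_object_content(
    Bucket='my-bucket',
    Key='reports/plays.csv',
    ExpressionType='SQL',
    Expression="SELECT s.file_id, s.plays FROM s3object s WHERE CAST(s.plays AS INT) > 100",
    InputSerialization={'CSV': {'FileHeaderInfo': 'USE'}},
    OutputSerialization={'CSV': {}},
)

# Only the matching rows come back; S3 bills the bytes scanned and returned.
for event in resp['Payload']:
    if 'Records' in event:
        print(event['Records']['Payload'].decode())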

What is the Maximum file size for using multipart upload in s3?

Does anyone have an idea of the maximum file size for a multipart upload to S3?
When I tried to upload a 10 GB file, it got stuck, and there is no error message in the log.
Thanks in advance.
The maximum size of an object you can store in an S3 bucket is 5 TB, so the maximum size of a file uploaded via multipart upload is also 5 TB.
Using the multipart upload API, you can upload large objects, up to 5 TB.
The multipart upload API is designed to improve the upload experience for larger objects. You can upload objects in parts. These object parts can be uploaded independently, in any order, and in parallel. You can use a multipart upload for objects from 5 MB to 5 TB in size.
Official documentation: http://docs.aws.amazon.com/AmazonS3/latest/dev/UploadingObjects.html
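If a plain single PUT of the 10 GB file is what stalls, boto3's managed transfer switches to multipart automatically above a configurable threshold; a minimal sketch, where the file name, bucket, and part size are assumptions:

import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client('s3')

# Multipart settings are assumptions; tune the chunk size and concurrency as needed.
config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,   # switch to multipart above 64 MB
    multipart_chunksize=64 * 1024 * 1024,   # 64 MB parts (an object can have at most 10,000 parts)
    max_concurrency=4,
    use_threads=True,
)

# upload_file handles splitting the file, uploading parts in parallel, and retries.
s3.upload_file('backup-10gb.bin', 'my-bucket', 'backups/backup-10gb.bin', Config=config)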

Can I recover lost information about an S3 multipart upload?

In this multipart upload example, one needs to save the upload ID and a set of etags corresponding to each uploaded part until the upload is "closed." If I lose my upload ID, I guess I can recover it by looking through open multipart uploads with ListMultipartUploads, but what if I lose an etag? Can those be recovered somehow, or must I abort the whole transfer and start over?
Once you have retrieved the upload ID from ListMultipartUploads, you can then use ListParts to get the list of parts (and their etags) that have been completed for this upload. You can use this information to then restart your upload from the last completed part.
Multipart Upload API and Permissions
Example of resuming multipart uploads using AWS SDK for iOS
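A minimal sketch of that recovery with boto3, assuming hypothetical bucket and key names (ListMultipartUploads and ListParts are paginated, so large uploads may need more than one call):

import boto3

s3 = boto3.client('s3')
BUCKET, KEY = 'my-bucket', 'videos/raw.mp4'   # hypothetical names

# Recover the upload ID of the in-progress multipart upload.
uploads = s3.list_multipart_uploads(Bucket=BUCKET).get('Uploads', [])
upload_id = next(u['UploadId'] for u in uploads if u['Key'] == KEY)

# Recover the part numbers and ETags S3 has already received.
parts = s3.list_parts(Bucket=BUCKET, Key=KEY, UploadId=upload_id).get('Parts', [])
etags = [{'PartNumber': p['PartNumber'], 'ETag': p['ETag']} for p in parts]

# Resume uploading from the next part number, then complete with these ETags.
next_part = max((p['PartNumber'] for p in parts), default=0) + 1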