I am working on an application where some videos are uploaded to Amazon S3.
Now a user can choose to download the video or stream it.
The public URL on Amazon S3 is, say, "amazons3.com/test/file.mp4".
As of now, every time I visit the URL, it streams the video.
How do I append parameters to the URL so that the video can be either streamed or downloaded?
Basically, original url should stream the video, and modified_url should download it.
Thank you!
I found a very easy way.
In the HTML file, instead of
<a href="https://amazons3.com/test/file.mp4">Download</a>
I used
<a href="https://amazons3.com/test/file.mp4" download>Download</a>
Reference: href image link download on click
I am having problems with indexing a video. If I try to load a video with a URL from Azure Media Services, Video Indexer throws an error and abruptly fails indexing with the message "Video Unavailable". This happens whether I load it through the API or directly in videoindexer.ai. Does anyone have any idea how I can fix this problem?
Video Indexer (VI) expects a direct URL to the video itself, so when you put the URL in your browser the video should start to download or play.
Can you please elaborate on how you generate the URL in Azure Media Services (AMS)? For example, are you using a ClearStreamingOnly or a DownloadAndClearStreaming policy for your AMS Streaming Locators?
You should be using DownloadAndClearStreaming and the generated video download URL.
Another option: since you already have your video in AMS, it is saved in the AMS-linked storage account, so you can generate a SAS URL for the video file and give that to VI to index.
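For the SAS option, here is a minimal sketch in C# with the Azure.Storage.Blobs SDK; the storage connection string, container, and blob names are placeholders for wherever AMS stored your asset:

using System;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

// Point at the video blob inside the AMS-linked storage account (names are hypothetical).
var blobClient = new BlobClient(
    "<storage-connection-string>",
    "asset-container",
    "video.mp4");

// Generate a read-only SAS URL valid for 24 hours and hand it to Video Indexer.
Uri sasUrl = blobClient.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.AddHours(24));
Console.WriteLine(sasUrl);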
I have a website, similar to video hosting, where I need to display uploaded videos and images: the images should always be visible, and the videos should be visible only if they have been purchased. Their locations are saved in the database (MongoDB) and are rendered on the web page, so they show up in the network tab of the developer console.
This means that if you click on a link such as "https://s3.Region.amazonaws.com/bucket-name/key-name/folder/file-name.mp4", it auto-downloads. This only happens in Chrome, though; Firefox just displays the object with no download option. I have tried changing the bucket policy and adding encryption, but that either makes the images I want to display invisible (because they are no longer publicly accessible) or has no effect and still allows the video to be downloaded. Is there any way for me to keep the images and videos in the same bucket, have them both visible under the right circumstances, but block access to the bucket and prevent them from being downloaded by anyone but the bucket owner?
You cannot stop the downloads because the ability to show videos and images in a browser also means that the files are accessible via URL (that's how the browser fetches them).
One option is to use an Amazon S3 pre-signed URL, which is a time-limited URL that provides temporary access to a private object. The way it would work is:
Users authenticate to your back-end service
When a user requests access to one of the videos or images, your back-end checks that they are authorized to access the file
If so, your back-end generates an Amazon S3 pre-signed URL and includes it in the web page (e.g. <img src='...'>)
When the user's browser accesses that URL, Amazon S3 will verify that the URL is correct and the time-limit has not expired. If it's OK, then the file is provided.
Once the time limit expires, the URL will not work
This will not prevent a file being downloaded, but it will limit the time during which it can be done.
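A minimal sketch of the pre-signed URL generation step in C# with the AWS SDK for .NET (bucket name, key, and expiry are placeholders):

using System;
using Amazon.S3;
using Amazon.S3.Model;

// Credentials and region are resolved from the environment/profile.
var s3 = new AmazonS3Client();

var request = new GetPreSignedUrlRequest
{
    BucketName = "my-media-bucket",           // hypothetical bucket
    Key = "folder/file-name.mp4",             // hypothetical object key
    Expires = DateTime.UtcNow.AddMinutes(15)  // time-limited access
};

// Embed this URL in the page, e.g. <video src='...'> or <img src='...'>.
string presignedUrl = s3.GetPreSignedURL(request);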
Alternate methods would involve serving content via streaming instead of via a file, but that is a much more complex topic. (For example, think about how Netflix streams content to users rather than waiting for them to download files.)
Is there a way to download the current site content, namely, the uploaded user images, from a web application on AWS? Everything I have found only gives access to previous code deployments, which do not include the user uploaded files.
I have tried the instructions here but it only seems to give access to the code as it was at the time of deployment.
Thank you for any help.
User-uploaded images are usually stored in Amazon's S3 service, so go to your AWS dashboard and navigate to the S3 section; you should find the files in a bucket there.
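If you prefer to pull everything down programmatically instead of through the console, a minimal sketch with the AWS SDK for .NET (bucket name and paths are placeholders):

using Amazon.S3;
using Amazon.S3.Transfer;

var s3 = new AmazonS3Client();
var transfer = new TransferUtility(s3);

// Copies every object under the "uploads/" prefix into a local folder.
transfer.DownloadDirectory("my-site-bucket", "uploads", @"C:\backup\uploads");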
Are you trying to download your own website? Then you need to get not just the code or user images but also the database containing the data. Check the code to see where the images are saved (local EBS, EFS, or S3) and copy them from there accordingly.
If you are trying to download someone else's website, then you will not have access to the database, code, or other users' images, but you can still download the full website as seen by the public using tools like WinHTTrack.
I have a couple of mp4 videos I have stored in Amazon S3.
When I try to access the files' links from Firefox, they are played automatically, but when I try to open the same links in Chrome or IE the files are downloaded and not played.
I also tried setting the content type of the files to video/mp4 with no luck.
Does anyone know of a solution to this issue?
Try storing your videos to the S3 bucket using file_get_contents($videos), so that the stored video keeps the original file content and can then play in any browser.
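A rough C# equivalent of that idea, assuming the AWS SDK for .NET (bucket, key, and file path are placeholders): re-upload the original file bytes and set the content type explicitly so browsers treat the object as playable video.

using Amazon.S3;
using Amazon.S3.Model;

var s3 = new AmazonS3Client();

var put = new PutObjectRequest
{
    BucketName = "my-video-bucket",
    Key = "videos/movie.mp4",
    FilePath = @"C:\videos\movie.mp4",  // the original file content
    ContentType = "video/mp4"           // explicit MIME type
};

s3.PutObject(put);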
I was trying to find this on Facebook's site in their documentation, but so far no luck. I'm sure others must have run into this before.
I use Amazon S3 for storing images. I didn't know ahead of time that if I named my bucket after my domain name (with subdomain) I could link that way, so until I move all of the pictures I have to link to the mybucket.s3.amazonaws.com domain. When I include a picture from there with a post to the wall, the picture doesn't show up. If I change the picture to one on the server itself, the picture does show up. It seems that the domain name of the picture must match my app? I looked at Bugzilla and didn't see this mentioned. Facebook's forum says to post questions here.
I'm using the C# Facebook SDK from CodePlex.
My code looks like this (with error handling and authentication check removed):
var client = new FacebookClient(FACEBOOK_APP_ID, FACEBOOK_SECRET);
client.AccessToken = facebook.AccessToken;
var parameters = new Dictionary<string, object>();
parameters.Add("name", name);
parameters.Add("caption", title);
parameters.Add("message", message);
parameters.Add("link", link);
parameters.Add("source", link);
parameters.Add("picture", imageUrl);
client.Post("me/feed", parameters);
I verified that imageUrl does indeed point to a valid picture; the domain name just doesn't match. The picture on Amazon S3 has public read access. I can view it from my browser, so I don't think it's a permission problem. I've tried a few different pictures with the same problem. The only time it's worked so far is when the picture was on the server itself.
So, my question is, is it a problem with me, or does facebook block images that don't match the domain name specified on the app?
You can upload the picture from that URL, then add its object ID in the post.
Refer to: http://developers.facebook.com/blog/post/526/?ref=nf
Uploading Photos to the Graph API via a URL
Earlier this year, we released support for uploading photos directly
via the Graph API. This requires sending the photo as a MIME-encoded
form field. We are now enhancing our photo upload capability by
introducing the ability to upload photos simply by providing a URL to
the image. This simplifies photo management for a number of use cases:
App developers who host their images on Amazon S3 or a similar
service can pass the S3 URL directly to Facebook without having to
download the file to their application servers only to upload it
again to Facebook. This improves performance and reduces costs for
developers.
Apps written on platforms that don't have good support for
multipart file uploads can create new photos more easily.
To upload a photo via a URL, simply issue an HTTP POST to
ALBUM_ID/photos with the url field set to the URL of the photo you
wish to upload. You need the publish_stream permission to perform this
operation. You can also include an optional message parameter to set a
caption for the photo.
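Putting that together with the question's FacebookClient setup, a hedged sketch (the "url" field comes from the announcement above; the endpoint and result handling are my assumptions, not tested code):

var client = new FacebookClient(FACEBOOK_APP_ID, FACEBOOK_SECRET);
client.AccessToken = facebook.AccessToken;

var photoParameters = new Dictionary<string, object>();
photoParameters.Add("url", imageUrl);   // the S3-hosted image URL
photoParameters.Add("message", title);  // optional caption

// Returns the new photo's object id, which you can then reference in the wall post.
dynamic photo = client.Post("me/photos", photoParameters);
string photoId = photo.id;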
I'm facing the same issue as well. Based on my observations, it seems that Facebook does not like it when the picture URL has more than one sub-domain.
I tried the two URL variations below for the same image:
mybucket.s3.amazonaws.com - throws an error
s3.amazonaws.com/mybucket - works fine
:picture => 'http://mybucket.s3.amazonaws.com/footprints/15/coverimgs/medium.jpg'
OAuthException: (#100) picture URL is not properly formatted
:picture => 'http://s3.amazonaws.com/mybucket/footprints/15/coverimgs/medium.jpg'
{"id"=>"587472956_10150280873767957"}
Now I have to figure out how to change the URL structure for the image while passing it to the FB Graph API.
I would log it as a bug. If this is really the case, which I kinda doubt, you could create a 301 redirect on your own domain for each image that redirects to the Amazon URL.
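For example, a hedged sketch of that redirect in ASP.NET MVC (the controller and route are hypothetical), mapping an image path on your own domain to the S3 URL with a 301:

using System.Web.Mvc;

public class ImagesController : Controller
{
    // e.g. a request to /images/footprints/15/coverimgs/medium.jpg
    public ActionResult Show(string path)
    {
        // Permanent (301) redirect to the real object on S3.
        return RedirectPermanent("http://s3.amazonaws.com/mybucket/" + path);
    }
}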