We currently host online tutorials on our website, embedding the videos using YouTube.
However, I have been asked to secure the video links so that users need to authenticate in order to view the videos and, once authenticated, cannot copy the video link and share it with others, as these will be paid tutorials.
We use AWS to store our other assets (website images, documents, etc.) and want to use AWS to store our videos as well.
Does anyone know the best way to secure these links so they can only be used from within our website and cannot be shared?
First of all, think about how much effort you want to put into solving a problem that the world has failed to solve in the last 40 years. We had VHS and everyone could copy everything. We had CDs and DVDs with copy protection. Blu-rays can be and are ripped too. If you consider how easily a book can be copied, it is a problem we have failed to solve for 2000+ years.
Have you played with youtube-dl? Have you seen how easy it is to download things from YouTube once you get access to it? And I could always use a screen recorder tool to capture the screen if all else fails.
Given how easy it is to bypass copy protection, how much time do you want to spend on solving the impossible? Do you want to make the code more complex and the architecture crappier (and the usability worse) along the way?
If history has shown anything, it is that legal measures are the only way to protect against piracy. So you have two options here: pretend to do something to protect the content, knowing you will fail, or talk to the managers and convince them that there are better ways of spending the money.
By default, all objects in the bucket are private.
A pre-signed URL may solve your current problem.
Have a look at the links below:
https://docs.aws.amazon.com/AmazonS3/latest/dev/ShareObjectPreSignedURL.html
https://www.msp360.com/resources/blog/s3-pre-signed-url-guide/
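For illustration, a minimal sketch of generating such a URL with boto3 (the bucket and object key below are placeholders, and AWS credentials are assumed to be configured via the environment or an IAM role):

```python
import boto3

# Assumes AWS credentials are already configured (environment variables,
# ~/.aws/credentials, or an IAM role attached to the server).
s3_client = boto3.client("s3")

# Grant read access to a private object for 10 minutes.
# "my-video-bucket" and "tutorials/lesson-01.mp4" are placeholder names.
url = s3_client.generate_presigned_url(
    ClientMethod="get_object",
    Params={"Bucket": "my-video-bucket", "Key": "tutorials/lesson-01.mp4"},
    ExpiresIn=600,  # seconds
)
print(url)
```

Your back-end would only hand out such a URL after the user has authenticated, and a copied link stops working once it expires.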
Related
I want to develop an app for a friend's small business that will store/serve media files. However, I'm afraid of a piece of media going viral, or of getting DDoS'd. The bill could go up quite easily with a service like S3, and I really want to avoid surprise expenses like that. Ideally I'd like some kind of max-bandwidth limit.
Now, a solution for this with S3 has been posted here.
But it does require quite a few steps. So I'm wondering if there is a cloud storage solution that makes this simpler, i.e. where I don't need to create a custom microservice. I've talked to DigitalOcean support and they don't support this either.
So in the interest of saving time, and perhaps for anyone else who finds themselves in a similar dilemma, I want to ask this question here. I hope that's okay.
Thanks!
Not an out-of-the-box solution, but you could:
Keep the content private
When rendering a web page that contains the file or links to the file, have your back-end generate an Amazon S3 pre-signed URL to grant time-limited access to the object
The back-end could keep track of the "popularity" of the file and, if it exceeds a certain rate (e.g. 1,000 requests over 15 minutes), it could instead point to a small file with a message of "please try later"
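A rough sketch of that approach, assuming boto3 and a simple in-process counter (a real deployment would more likely track request counts in Redis or a database; all names here are hypothetical):

```python
import time
import boto3

s3_client = boto3.client("s3")

# Hypothetical in-memory popularity tracker: object key -> recent request timestamps.
_request_log: dict[str, list[float]] = {}
RATE_LIMIT = 1000          # max requests ...
WINDOW_SECONDS = 15 * 60   # ... per 15-minute window

def get_media_url(bucket: str, key: str) -> str:
    """Return a pre-signed URL for the object, or for a small "please try later"
    placeholder file once the object exceeds the popularity threshold."""
    now = time.time()
    recent = [t for t in _request_log.get(key, []) if now - t < WINDOW_SECONDS]
    recent.append(now)
    _request_log[key] = recent

    # Too popular: point to a small static file instead of the real object.
    target_key = key if len(recent) <= RATE_LIMIT else "static/please-try-later.txt"

    return s3_client.generate_presigned_url(
        ClientMethod="get_object",
        Params={"Bucket": bucket, "Key": target_key},
        ExpiresIn=300,  # link is valid for 5 minutes
    )
```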
I recently started creating a website with Angular and Django. It is meant to be an online bookstore or e-library, something like Amazon Kindle. My problem is that I found out it's not advisable to store ebooks in a database, but I need a way for users to get these ebooks and for admins to be able to upload them to some sort of file system, since the database is not an option. Is there any way I can accomplish this on my site?
I have searched the internet but haven't found anything helpful; maybe I am searching for the wrong thing, but I would really appreciate any advice.
I would also like to know if there is any API that can help me add books to my website, at least to fill in some space until actual ebooks are uploaded.
Any advice will really help...
First, you will never want to store binary data of any sort in the database. Use a storage service instead, and have the database refer to the files in that storage. I think you need to work out how to achieve that first and then proceed with the rest.
Check Amazon S3 and https://pypi.org/project/django-storages/
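A minimal sketch of how that could look with django-storages and S3; the bucket name, region, and model fields are placeholders, and the full list of settings is in the django-storages documentation:

```python
# settings.py -- send uploaded files to S3 via django-storages (placeholder values)
INSTALLED_APPS = [
    # ... your other apps ...
    "storages",
]
DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
AWS_STORAGE_BUCKET_NAME = "my-ebook-bucket"  # placeholder
AWS_S3_REGION_NAME = "us-east-1"             # placeholder
AWS_DEFAULT_ACL = None                       # keep uploaded objects private


# models.py -- the database stores only a reference to the file, not the bytes
from django.db import models

class Ebook(models.Model):
    title = models.CharField(max_length=255)
    # The file itself lives in S3; the row keeps only its key/path.
    file = models.FileField(upload_to="ebooks/")
```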
The Facebook Graph API allows uploading ad videos from an external link, e.g. https://example.com/videos/my_video.mp4, by POSTing to https://graph.facebook.com/v6.0/$ACCOUNT_ID/advideos with the argument file_url=https://example.com/videos/my_video.mp4.
I haven't found anything similar for playable ads, and my application, which receives such a URL, currently needs to download the file and upload it locally.
I don't think there's any documented way; at least I was unable to find anything in the official documentation, and the fact that playables are missing from the main reference page doesn't help. However, maybe someone has found one, or worked it out with Facebook support or the like.
In any case, care to share?
EDIT: this would be very useful mostly for doing bulk requests; in the current scenario it's either impossible (it's likely to exceed the allowed request size) or harder than desirable (I need to download to temporary files instead of memory, keep track of the files, etc.). I assume (I may be wrong) that in principle bulk requests are accepted for this API, too.
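For reference, the documented video flow described above could be done roughly like this with the requests library (the access token and account ID are placeholders); the open question is whether an equivalent file_url-style parameter exists for playables:

```python
import requests

ACCESS_TOKEN = "EAAB..."       # placeholder access token
ACCOUNT_ID = "act_1234567890"  # placeholder ad account ID

# Upload an ad video from an external link, as described above.
resp = requests.post(
    f"https://graph.facebook.com/v6.0/{ACCOUNT_ID}/advideos",
    data={
        "file_url": "https://example.com/videos/my_video.mp4",
        "access_token": ACCESS_TOKEN,
    },
)
print(resp.json())
```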
My OpenCart website is too slow; how can I solve this issue? When I open the admin side it works well and fast, but when I open the front-end site it takes a long time to load. I removed my image folder to check whether that was the problem, but the problem remained. If anyone has a solution, please tell me how to resolve it.
The simplest option is to use a cache module that implements speed optimization features like image compression, caching, file minification, and a CDN.
Here are two top extensions:
Carbon Cache - OpenCart speed Optimization for Google page speed
Nitro pack
Check for errors
You need to check whether you have any issues in your log file (system/storage/logs/error.log) and, if you do, correct them. Errors can cause a drop in speed.
Flatten the category structure
A heavy category structure with lots of subcategories can affect your speed as well, so try to keep your categories flat (3 subcategory levels deep at most).
Too many modifications are bad
Modifications like OCMOD can also create speed issues, since they can inject bad code directly into the core, so watch out for those.
Find better hosting
And last but not least: your hosting provider. Check whether your hosting is actually providing you with a fast server. We often suggest opting for DigitalOcean, since they have done a good job providing high-quality servers for a reasonable price. If you want $10 when signing up with DO and feel comfortable with us also getting $10, use our DigitalOcean link, or just visit their website at digitalocean.com.
There are many questions regarding this topic, but I didn't find an exact answer to my problem. I am building a Django app where users can store photos, and the photos are private. When showing users the thumbnails, I have to generate a signed URL from S3 for each one, but that takes quite a long time. I wanted to know whether there is a better option that I am missing.
Sorl-Thumbnail has the same problem; I have looked at it and didn't find a better option.
Is there any other option where the source of the image isn't visible? That might also work.
Please help me as soon as possible.
It's not entirely clear what you are asking, but it sounds like you have some slowness because you are generating thumbnails on the fly. If that's the case, you could improve performance by pre-generating the thumbnails: generate them when you store the image instead of when you request them.
If your concern is the performance of signed requests because your images are private in S3, I don't see much room for improvement without changing your application design. In some cases, it's acceptable to use hard-to-guess URLs for private photos. For example, I believe Flickr does that.
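A minimal sketch of the pre-generation idea, assuming Pillow and a Django model with separate image and thumbnail fields (all names here are hypothetical):

```python
# models.py -- build the thumbnail once, at upload time, instead of per request
import os
from io import BytesIO

from django.core.files.base import ContentFile
from django.db import models
from PIL import Image

class Photo(models.Model):
    image = models.ImageField(upload_to="photos/")
    thumbnail = models.ImageField(upload_to="thumbnails/", blank=True)

    def save(self, *args, **kwargs):
        # Generate the thumbnail before the first save, so no resizing
        # ever happens in the request path.
        if self.image and not self.thumbnail:
            img = Image.open(self.image).convert("RGB")
            img.thumbnail((200, 200))
            buffer = BytesIO()
            img.save(buffer, format="JPEG")
            base = os.path.splitext(os.path.basename(self.image.name))[0]
            self.thumbnail.save(
                f"{base}_thumb.jpg",
                ContentFile(buffer.getvalue()),
                save=False,  # avoid a recursive save()
            )
        super().save(*args, **kwargs)
```

Serving a gallery page then only costs one pre-signed URL per thumbnail, with no image processing at request time.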