Prevent public content from being embedded on other websites - amazon-web-services

I am building a SaaS to help people build VOD websites, and I plan to add a single CDN to serve content for all of them. The content on those websites should be publicly available, but I want to prevent other websites (not built with my service) from embedding videos hosted on my CDN. Another thing I care about is reducing network cost. What is the best way to achieve those things? What video format should I use, and what is the best protection method in my case?
P.S. I am on AWS
Thank you in advance!
I thought about CORS policies and signed URLs. I don't know about other methods, or which of the above is better in my case (more secure, easier to implement, cheaper, etc.).

Related

Cloud CDN + Custom URL's — how to?

I'm wondering what's the best way to architect a setup where we basically want to rewrite URLs on one domain to another domain, but have it hosted on Cloud CDN (for performance).
So, for example, I want example1.com/path/image.png to be served at example2.com/path2.png. Note, I own the domain for example1.com, but not the server for it.
To clarify, I want this to be quick to happen too — the URL rewrite tutorial describes "several minutes to propagate," which is too slow.
Any advice is highly appreciated!

Opencart CDN's suggestions

I have been using OpenCart for our site, and I am really happy with it; it is an awesome platform that comes with many features and a big community around it. Can someone advise which CDN I should use? What are the important parts of the website for which I should use CDN services? I haven't managed to find any module for this. Any guidance would be appreciated.
Thanks,
Your question doesn't give us full information on what you want to accomplish. Also, there actually are some modules that handle CDN integration.
Also, most CDN services don't require a special plugin, but can do their magic on the fly (CloudFlare, for example), and their setup is rather basic.
Here is a list of free modules:
https://www.opencart.com/index.php?route=marketplace/extension&filter_license=0&filter_search=cdn
For example, if you would like an image CDN, KeyCDN offers this service and the module utilizes it. Just an example.
Cloudflare has a good free account and an even better Pro (second-tier) plan that is good enough for any starter shop. The paid plan has these extras:
Enhanced security with Web Application Firewall (WAF)
Enhanced performance with image and mobile optimization - lossless compression for images and WebP format
HTTP/2 prioritization

updating an S3 website with zero downtime

What would be the best approach to update a static website hosted in an S3 bucket such that there is no downtime? The updates will be done by a company's marketing team with zero knowledge of CLI commands or how to move around in the console. Are there ways to achieve this without having to learn to move around in the console?
Edit
The website is a collection of static HTML pages and will be updated using an HTML editor. Once a page is edited, the marketing team will upload each individual updated file to the S3 bucket. There are no more than 10 such files, including HTML and images. The site is currently hosted on a shared server, and we now want to move it to an S3 bucket capable of hosting simple web pages. The preference is not to provision console access for these users, as they are comfortable only using a WYSIWYG HTML editor and uploading with an FTP client. The editors don't know HTML, and the site doesn't use JavaScript. I am thinking of writing a batch script to manage the uploads, keeping all the CLI complexity away so they only work on the HTML in the editor. Looking for the simplest approach to achieve this.
You can set up a CI/CD pipeline as mentioned here
Update static website in an automated way
The above pipeline generally has a code commit as its trigger. It depends on what your marketing team is doing with the content and how they are updating it. If they are updating content that is hosted on AWS, you can change the trigger to S3 updates. The solution depends on the individual use case, and may require some development on your side to make it simpler for your marketing team.
I'm unsure what you are asking here, because to "update" a static website, surely you must have some technical knowledge of the very basics of web development.
It's important here to define what exactly you mean by update, because again, updating a website and updating a bucket are two completely different things.
Also, S3 has eventual consistency for overwrite PUTs (updates), so there may be a brief window where stale content is served rather than true downtime.
The easiest way to update an S3 bucket is via the console, and not the CLI. The console is pretty user friendly, and shouldn't take long to get used to.

When to use different server for a static file in django

I am in a scenario where I have to resize existing images before displaying them on a webpage. For that purpose, is it wise to use a different server for storing my images, since serving images from the Django server itself would consume its bandwidth?
Yes, it is wise to use a separate server for static files. It is also recommended in the official Django documentation; see serving static files from a dedicated server.
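For reference, pointing Django at a separate static host is just a settings change; the `cdn.example.com` domain below is a placeholder:

```python
# settings.py: serve static and media files from a dedicated host/CDN
# instead of the Django application server (domain is a placeholder).
STATIC_URL = "https://cdn.example.com/static/"
MEDIA_URL = "https://cdn.example.com/media/"
```

The `{% static %}` template tag then emits URLs on the dedicated host automatically, so templates need no changes.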
Absolutely. Especially for images, it is very worthwhile to use infrastructure that was built for this purpose.
It is not without reason that Cloudinary is a dedicated CDN service for images that can be manipulated on the fly.
You might even consider Flickr as a great CDN solution for photos or slideshows.
By all means, outsourcing photo delivery will boost performance along with the perceived quality of service that your website provides.

Can I use restful as website data service?

90% of the information on the website is static and updated by a daily batch job. I am wondering whether I could use RESTful services for multiple websites. If only 10% of the information is static, should I still use it? Has anybody used REST as a data service for a public website? My main website's page views are around 10k/hour.
Thanks
Can you provide a little bit more information? I'm finding it hard to understand what you're asking here.
If you have any kind of structured data that you want to provide to external websites, REST is certainly a quick and easy way to do this.
If you know that the information changes only once a day, you could cache the results of the REST GET requests on the external websites.
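A tiny time-to-live cache on the consuming site captures that idea; `fetch_fn` below stands in for whatever performs the actual REST GET, and the 24-hour default TTL matches the daily batch update:

```python
import time

class TTLCache:
    """Cache REST GET results until they are older than ttl_seconds."""

    def __init__(self, fetch_fn, ttl_seconds=24 * 3600):
        self._fetch = fetch_fn   # e.g. a function doing the HTTP GET
        self._ttl = ttl_seconds
        self._store = {}         # url -> (fetched_at, body)

    def get(self, url):
        entry = self._store.get(url)
        if entry and time.time() - entry[0] < self._ttl:
            return entry[1]      # still fresh: no network call
        body = self._fetch(url)  # stale or missing: re-fetch
        self._store[url] = (time.time(), body)
        return body
```

Alternatively, have the REST service itself send `Cache-Control: max-age` headers, so any HTTP cache or CDN between the service and the websites does the same thing without client code.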