Cloud CDN + Custom URLs — how to? - google-cloud-platform

I'm wondering what's the best way to architect a setup where we basically want to rewrite URLs on one domain to another domain, but have it served by Cloud CDN (for performance).
So, for example, I want to have example1.com/path/image.png available at example2.com/path2.png. Note that I own the domain example1.com, but not the server behind it.
To clarify, I also want changes to take effect quickly; the URL rewrite tutorial describes "several minutes to propagate," which is too slow.
Any advice is highly appreciated!
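In case it helps frame the question: URL rewriting lives in the URL map of the external HTTPS load balancer that fronts Cloud CDN. A minimal sketch of what such a rule might look like, expressed here as a Python dict purely for illustration (the project, backend service, host, and paths are all placeholders; serving content from a server you don't control would typically mean pointing the backend at an internet NEG):

```python
import json

# Illustrative pathMatcher fragment for the load balancer's URL map.
# All names, hosts, and paths are placeholders, not a working config.
path_matcher = {
    "name": "rewrite-matcher",
    "defaultService": "projects/my-project/global/backendServices/cdn-backend",
    "routeRules": [
        {
            "priority": 1,
            # Requests for /path2.png ...
            "matchRules": [{"prefixMatch": "/path2.png"}],
            "routeAction": {
                "urlRewrite": {
                    # ... are rewritten to a different host and path before
                    # being sent to the origin behind the backend service.
                    "hostRewrite": "example1.com",
                    "pathPrefixRewrite": "/path/image.png",
                }
            },
        }
    ],
}

print(json.dumps(path_matcher, indent=2))
```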

Related

Prevent public content from being embedded on other websites

I am building a SaaS to help people build VOD websites, and I am planning to add a single CDN to serve content for all of the websites. The content on those websites should be publicly available, but I want to prevent other websites (not built with my service) from embedding videos hosted on my CDN. Another thing I care about is reducing network cost. What is the best way to achieve those things? What should the video format be, and what is the best protection method in my case?
P.S. I am on AWS
Thank you in advance!
I thought about CORS policies and signed URLs. I don't know about other methods, or which of the above is better in my case (more secure, easier to implement, cheaper, etc.).
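For what it's worth, if you go with CloudFront signed URLs, botocore ships a CloudFrontSigner helper; a minimal sketch, assuming the third-party rsa package and a CloudFront key pair (the key-pair ID, key file path, and CDN domain below are placeholders):

```python
from datetime import datetime, timedelta

import rsa  # third-party package, used to sign with the CloudFront private key
from botocore.signers import CloudFrontSigner

KEY_PAIR_ID = "APKAEXAMPLE"           # placeholder CloudFront key pair ID
PRIVATE_KEY_FILE = "private_key.pem"  # placeholder path to the matching private key


def rsa_signer(message):
    # CloudFront signed URLs use an RSA-SHA1 signature over the policy.
    with open(PRIVATE_KEY_FILE, "rb") as f:
        private_key = rsa.PrivateKey.load_pkcs1(f.read())
    return rsa.sign(message, private_key, "SHA-1")


signer = CloudFrontSigner(KEY_PAIR_ID, rsa_signer)

# The URL is valid for one hour; requests without a fresh signature get a 403,
# which stops other sites from hotlinking the videos.
url = signer.generate_presigned_url(
    "https://cdn.example.com/videos/intro/master.m3u8",
    date_less_than=datetime.utcnow() + timedelta(hours=1),
)
print(url)
```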

Google Cloud load balancer redirect if coming from a certain region

I have two static pages (English, Danish) which are currently in a Google Cloud Storage bucket. This bucket contains en and dk folders for those static pages. I also have a load balancer with the Cloud CDN feature enabled.
I want visitors connecting to the website from Denmark to get the Danish static page, while people outside Denmark receive the English site.
How can I achieve this at the load balancer level?
P.S. I think it can be done with forwardingRules, but I couldn't figure out what the rules should look like.
I did some research and I think your needs could be met with the load balancer's header-based routing, using the Accept-Language header in the routeRules of the URL map.
This guide can help you achieve the configuration, but you will need to adapt it so the routing is governed by the Accept-Language header, which should be more useful for your case.
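As a rough illustration of that idea (not a drop-in config), the URL map's routeRules can match on a request header and send the request to a different backend; here the project and backend bucket names are placeholders, and Accept-Language values starting with "da" are routed to the Danish pages while everything else falls through to the English default:

```python
import json

# Illustrative pathMatcher for the HTTPS load balancer's URL map.
# Project ID and backend bucket names are placeholders.
path_matcher = {
    "name": "language-matcher",
    "defaultService": "projects/my-project/global/backendBuckets/en-pages",
    "routeRules": [
        {
            "priority": 1,
            "matchRules": [
                {
                    "prefixMatch": "/",
                    # Match browsers whose preferred language starts with "da".
                    "headerMatches": [
                        {"headerName": "Accept-Language", "prefixMatch": "da"}
                    ],
                }
            ],
            "service": "projects/my-project/global/backendBuckets/dk-pages",
        }
    ],
}

print(json.dumps(path_matcher, indent=2))
```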

Steps to take a website from a local server to a hosted server (going live)

I am very new to web development, and have a question regarding taking a website "live."
I coded my site in Python, using the Django framework. I have all my code stored on a local server, and want to move ahead with taking my site into production.
I've been looking online for resources that provide clear step-by-step instructions for going public with a site. However, they are all either unclear, extremely complicated, or based on WordPress. I'm not using WordPress.
I understand the basic rules:
1. get a domain
2. get a web host
3. get FTP access
4. ??
My confusion is somewhere between 3 and 4. What happens after you get FTP access? Am I using GitHub at any point in this process? Are there special rules I have to follow (for example, what do you do with the secret key in settings.py)?
If you know of a great resource for beginner web developers who are trying to take their website live (and who are NOT using WordPress), I would truly appreciate your guidance.
thanks much!
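On the settings.py question specifically, a common pattern is to keep the secret key out of the repository and read it from an environment variable on the server; a minimal sketch (the variable name and domain are just examples):

```python
# settings.py (excerpt): read secrets from the environment rather than
# hard-coding them. DJANGO_SECRET_KEY and example.com are example names.
import os

SECRET_KEY = os.environ["DJANGO_SECRET_KEY"]

DEBUG = False                    # never run production with DEBUG = True
ALLOWED_HOSTS = ["example.com"]  # the domain you registered
```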
There is no need for FTP or anything else. I used godaddy.com: I just bought a domain name and then hosting space, and purchased nothing else. Then I uploaded my files to the website in order, and that was it.

Setting up LAMP Web Server on AWS EC2 t1 Micro

I'm sorry for being dumb, but I have been really stuck for a few days. This is my first time using AWS. I have successfully installed a LAMP web server under t1.micro on my customer's AWS account http://54.72.132.215/ following this tutorial. But I don't know what to do next after the installation. My goal is:
Set up a domain
Run PrestaShop.
I hope you can guide me to the right path, I am totally lost. Thanks.
You need to register a domain with someone; this is outside of Amazon. Just google domain name registrars:
https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=domain%20name%20registrar
Then you'll need to point your domain to your Amazon EC2 instance. I would suggest using Route 53 to do this, another Amazon AWS service that makes it easier to setup and control your domains:
http://aws.amazon.com/route53/
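If you do use Route 53, pointing the domain at the instance is just an A record in the hosted zone; a sketch with boto3, where the hosted zone ID and domain are placeholders and the IP is the instance address from the question:

```python
import boto3

route53 = boto3.client("route53")

# Placeholder hosted zone ID and domain; the value is the EC2 instance's public IP.
response = route53.change_resource_record_sets(
    HostedZoneId="Z1EXAMPLE",
    ChangeBatch={
        "Comment": "Point the apex record at the EC2 instance",
        "Changes": [
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "example.com.",
                    "Type": "A",
                    "TTL": 300,
                    "ResourceRecords": [{"Value": "54.72.132.215"}],
                },
            }
        ],
    },
)
print(response["ChangeInfo"]["Status"])
```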
Once you have that set up, visiting your domain name should show the default Apache "It works!" page, if you've correctly set up your LAMP server. It'll look something like these:
https://www.google.co.uk/search?q=default+apache+web+page&espv=2&source=lnms&tbm=isch&sa=X&ei=yRfWU_v8OeHe7Abp1ICICw&ved=0CAYQ_AUoAQ&biw=1457&bih=881#imgdii=_
You'll want to add a new vhost for your new PrestaShop site; this will allow you to set up a specific set of files to serve for your new URL, and means you can add other sites to the server later on. A quick google shows multiple tutorials on doing this; here's one of them:
http://calebogden.com/multiple-websites-amazon-ec2-linux-virtual-hosts/
Then follow the tutorial in the PrestaShop documentation about installing PrestaShop via the command line:
http://doc.prestashop.com/display/PS16/Installing+PrestaShop+using+the+command-line+script
Now I'm guessing that all those steps in one go are a little overwhelming, so I would suggest you break this task down into chunks, work on them one at a time, and post a few different questions on StackOverflow and probably ServerFault: https://serverfault.com/, as that site is better suited to setting up servers.
To summarise you need to:
register a domain name and point it to your EC2 server; you should then see the default Apache page
create a new vhost to serve web pages for your new domain
follow the guide on PrestaShop about installing the software
Treat each of those as a separate task. This question covers lots of topics in one very general idea, and the full answer to your problem wouldn't really fit in a single post.
ServerFault will probably have a lot of your answers already, regarding setting up domains and vhosts at least.

Proxying external images for SSL compliance

I've got a little Django site in which users can link to images on other sites in their comments. It's by no means a core feature.
I've just moved the entire site to SSL. That has worked fine for the most part, but remote images are obviously not always going to be available over SSL. Only a very small number of domains have valid certificates.
What's the best way to funnel images through then?
Download them when the user posts and alter the URL to a local one?
Make a proxy that just proxies another URL?
The second seems like less work (I feel like it would be possible just with NGINX rules), but it would also open the site up to people using my proxy for their own nefarious gain... which I'd like to avoid.
What's the best compromise here?
GitHub ran into this same issue when they moved to HTTPS everywhere and detailed it on their blog: https://github.com/blog/743-sidejack-prevention-phase-3-ssl-proxied-assets
Their solution was to create a proxy server, which they open-sourced as https://github.com/atmos/camo. To address the same concerns about abuse of the proxy, it is deployed with a secret shared with the application server. Integrating this with a Django project would be straightforward, as you would just need to generate the digest from the shared key for the given image URL.
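For a rough idea of what that looks like on the Django side: the proxied URL is the camo host, an HMAC-SHA1 digest of the image URL keyed with the shared secret, and the hex-encoded image URL. A sketch with placeholder host and secret:

```python
import hashlib
import hmac

CAMO_HOST = "https://camo.example.com"  # placeholder: wherever camo is deployed
CAMO_KEY = b"shared-secret"             # placeholder: the secret camo is started with


def camo_url(image_url):
    """Build the proxied URL camo expects: /<hmac-sha1 digest>/<hex-encoded url>."""
    encoded = image_url.encode("utf-8")
    digest = hmac.new(CAMO_KEY, encoded, hashlib.sha1).hexdigest()
    return "%s/%s/%s" % (CAMO_HOST, digest, encoded.hex())


print(camo_url("http://example.org/cat.png"))
```

Because camo only serves URLs whose digest matches its secret, third parties can't use the proxy for arbitrary images, which addresses the abuse concern.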