Proxying external images for SSL compliance - django

I've got a little Django site in which users can link to images on other sites in their comments. It's by no means a core feature.
I've just moved the entire site to SSL. That has worked fine for the most part, but remote images are obviously not always going to be available over SSL. Only a tiny number of domains have valid certificates.
What's the best way to funnel images through then?
Download them when the user posts and alter the URL to a local one?
Make a proxy that just proxies another URL?
The second seems like less work (I feel like it would be possible just with NGINX rules), but it would also open the site up to people using my proxy for their own nefarious ends... which I'd like to avoid.
What's the best compromise here?

GitHub ran into this same issue when they moved to HTTPS everywhere and detailed it on their blog: https://github.com/blog/743-sidejack-prevention-phase-3-ssl-proxied-assets
Their solution was to create a proxy server, which they open-sourced as https://github.com/atmos/camo To address the same concerns about abuse of the proxy, it is deployed with a shared secret known to the application server. Integrating this with a Django project would be straightforward, as you would just need to generate the digest from the shared key for the given image URL.
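A minimal sketch of building such a URL in Python, following the digest/hex URL format in Camo's README (the key and proxy host below are placeholders):

    import hashlib
    import hmac

    CAMO_KEY = b"shared-secret"              # must match the key Camo was deployed with
    CAMO_HOST = "https://camo.example.com"   # wherever the proxy runs

    def camo_url(image_url):
        # Camo expects /<hmac-sha1 digest of the URL>/<hex-encoded URL>.
        digest = hmac.new(CAMO_KEY, image_url.encode("utf-8"), hashlib.sha1).hexdigest()
        encoded = image_url.encode("utf-8").hex()
        return "%s/%s/%s" % (CAMO_HOST, digest, encoded)

You would then rewrite each remote image src in user comments to camo_url(original_src) when rendering.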

Related

Chrome hitting my Django backend but I only made an iOS app

So I have a Django backend deployed on Google App Engine. This backend supports an iOS app. In my server logs I can see all the requests coming in and where they were made from. It used to be that I would only get requests from Joon/7.** (which is the iOS app name + version). However, recently I've been getting requests from Chrome 72, which doesn't make sense because the app can't be used in Chrome. Furthermore, these requests are creating a lot of errors in my backend because they are not sending an authentication token. Does anyone know what is going on here? Are my servers being hacked?
Looks like someone discovered the URL to your App Engine app. You can use Ingress controls to only allow access via Cloud Load Balancing, and then put Google Cloud Armor in front to protect it with rules that look like:
    has(request.headers['user-agent']) && request.headers['user-agent'].contains('Godzilla')
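An expression like that can be attached to a Cloud Armor security policy with gcloud; the policy name and rule priority below are placeholders:

    gcloud compute security-policies rules create 1000 \
        --security-policy=my-policy \
        --action=deny-403 \
        --expression="has(request.headers['user-agent']) && request.headers['user-agent'].contains('Godzilla')"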
It is quite common to see all sorts of hits (from what I call spam bots) to an App Engine app. Technically, GCP expects you to use its firewall rules to block these. The challenge, though, is that these bots usually change their IP addresses frequently or use multiple ones. I don't have a 'perfect' solution.
a) You can try the method by #jeff-williams (I've never tried that)
b) You can also try GCP's firewall rules (I use this but I try to block a range of IPs instead of blocking them one by one)
c) Sometimes I also put my service behind a specific, non-intuitive path. This way the spam bots only hit the default/base URL, and I have a separate service which returns 404 for all calls to that base URL (a sketch of this follows below).
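A minimal sketch of option (c) as a Django URLconf; the prefix and the api.urls module are made-up names:

    from django.http import HttpResponseNotFound
    from django.urls import include, path

    def catch_all(request, unused_path=""):
        # Bots probing the base URL (or anything else) get a bare 404.
        return HttpResponseNotFound()

    urlpatterns = [
        # Real endpoints live behind a hard-to-guess prefix.
        path("x9f3k2/api/", include("api.urls")),
        path("", catch_all),
        path("<path:unused_path>", catch_all),
    ]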

steps to take website from local server to hosted server (going live)

I am very new to web development, and have a question in regards to taking your website "live."
I coded my site in python, using the Django framework. I have all my code stored in a local server, and want to move ahead with taking my web into production.
I've been looking online for resources that provide clear step-by-step instructions for going public with your site. However, they are all either unclear, extremely complicated, or based on WordPress. I'm not using WordPress.
I understand the basic rules:
1. get a domain
2. get a web host
3. get FTP access
4. ??
My confusion is somewhere between 3 and 4. What happens after you get FTP access? Am I using GitHub at any point in this process? Are there special rules I have to follow (for instance, what do you do with the secret key in settings.py)?
If you know of a great resource for beginner web developers who are trying to take their website live (and who are NOT using WordPress), I would truly appreciate your guidance.
thanks much!
There is no need for FTP or anything else.
I used godaddy.com.
I just bought a domain name and then hosting space; I purchased nothing else.
Then I uploaded my files to the website in sequence.
That's it.
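As for the secret-key question above: a common pattern is to keep SECRET_KEY out of any code you upload or push to GitHub and read it from the server environment instead. A minimal sketch for settings.py, assuming a DJANGO_SECRET_KEY variable is set on the host:

    import os

    # Fails loudly if the variable is missing, which is usually what you want.
    SECRET_KEY = os.environ["DJANGO_SECRET_KEY"]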

How to make Django pass cookies when communicating with Node.js server using socket.io?

I am currently developing an instant-messaging feature for my apps (ideally a cross-platform mobile app/web app), and I am out of ideas to fix my issue.
So far, I have been able to make everything work locally, using a Node.js server with socket.io, Django, and Redis, following what most tutorials online suggest.
The step I am now at consists of putting all that in the cloud using Amazon AWS. My Django server is up and running, I have created a separate Node.js server, and I am using ElastiCache to handle the Redis part. I launch the different parts, and no error shows up.
However, whenever I try using my messaging feature on the web, I keep getting an error 500:
handshake error
I then used the console to check the request headers, and I observed that the cookies are not there, contrary to when I am on localhost. I know the cookies are necessary to authorize the handshake, so I guess that's where my error is coming from...
Furthermore, I have also checked that the cookies do exist; they are just not set in the request header.
My question is then: how can I make sure Django or the socket client (not sure which is responsible here...) puts the cookies in the header?
One of my ideas was that maybe I am supposed to put everything on the same server, using different ports, instead of two separate servers? Documentation on that specific architecture problem is surprisingly scarce, compared to the number of tutorials describing how to make it work locally.
I hope I described the problem accurately enough! :)
Important note: I am using socket.io v0.9.1-1, the only version compatible with a Titanium mobile app.
Thank you for any help!
All right, so I've made some progress.
The cookie problem came from the fact that I was making cross-domain requests. Adding a few lines enabled CORS, which didn't solve the cookie issue but allowed me to communicate between servers (basically, I set the headers of the response using Express). I then passed the necessary data in the query string; even if that's not the most secure way to do it, I'm just building an MVP, and it's enough for now.
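The headers were set on the Express side here, but if the Django responses also need to allow cross-origin requests carrying cookies, a rough Django middleware sketch (the origin is a placeholder; credentials require naming an explicit origin rather than "*"):

    class CorsMiddleware:
        def __init__(self, get_response):
            self.get_response = get_response

        def __call__(self, request):
            response = self.get_response(request)
            # Browsers only send cookies cross-origin to an explicitly allowed origin.
            response["Access-Control-Allow-Origin"] = "https://chat.example.com"
            response["Access-Control-Allow-Credentials"] = "true"
            return response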
I haven't been able to make the chat work from my Titanium mobile app, but since I can use a webview to handle it, I will be fine.
Hopefully that will help someone.. If anyone needs me to post some code snippets I will gladly do so upon request!
Cheers

Setting up LAMP Web Server on AWS EC2 t1 Micro

I'm sorry for being dumb, but I have been really stuck for a few days. This is my first time using AWS. I have successfully installed a LAMP web server on a t1.micro instance on my customer's AWS account (http://54.72.132.215/), following this tutorial. But I don't know what to do next after the installation. My goal is to:
Set up a domain
Run PrestaShop.
I hope you can guide me to the right path, I am totally lost. Thanks.
You need to register a domain with someone; this is outside of Amazon. Just google for domain name registrars:
https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=domain%20name%20registrar
Then you'll need to point your domain at your Amazon EC2 instance. I would suggest using Route 53 to do this, another AWS service that makes it easier to set up and control your domains:
http://aws.amazon.com/route53/
Once you have that set up, visiting your domain name should show the default Apache "It works!" page, if you've set up your LAMP server correctly. It'll look something like these:
https://www.google.co.uk/search?q=default+apache+web+page&espv=2&source=lnms&tbm=isch&sa=X&ei=yRfWU_v8OeHe7Abp1ICICw&ved=0CAYQ_AUoAQ&biw=1457&bih=881#imgdii=_
You'll want to add a new vhost for your new PrestaShop site. This will let you serve a specific set of files for your new URL, and means you can add other sites to the server later on. A quick google shows multiple tutorials on doing this; here's one of them (a bare-bones example follows the link):
http://calebogden.com/multiple-websites-amazon-ec2-linux-virtual-hosts/
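Something like this Apache 2.4 virtual host is the usual shape; the domain and paths are placeholders:

    <VirtualHost *:80>
        ServerName www.example.com
        DocumentRoot /var/www/prestashop

        <Directory /var/www/prestashop>
            # PrestaShop ships .htaccess rewrite rules, so allow overrides.
            AllowOverride All
            Require all granted
        </Directory>
    </VirtualHost>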
Then follow the tutorial in the PrestaShop documentation about installing PrestaShop via the command line:
http://doc.prestashop.com/display/PS16/Installing+PrestaShop+using+the+command-line+script
Now, I'm guessing that all those steps in one go are a little overwhelming, so I would suggest you break this task down into chunks and work on them one at a time, posting a few different questions on Stack Overflow and probably Server Fault (https://serverfault.com/), as that site is better suited to setting up servers.
To summarise you need to:
register a domain name and point it to your EC2 server; you should then see the default Apache page
create a new vhost to serve web pages for your new domain
follow the PrestaShop guide on installing the software
Treat each of those as a separate task. This question covers lots of topics in one very general idea; the full answer to your problem wouldn't really fit in a single post.
ServerFault will probably have a lot of your answers already, regarding setting up domains and vhosts at least.

Authentication with apache2 php pages and tomcat REST calls

Hello smart people on stackoverflow,
I would be very happy if someone could point me to the right libraries/frameworks to do what I want.
We have the following web architecture set up.
1. We have a tomcat server that offers REST services.
2. We have an apache2 server that serves up php pages to users.
a. Some of these php pages make REST calls to tomcat for data.
b. Other php pages contain javascript that makes REST calls which are routed through apache2 via mod_proxy to tomcat, e.g. all requests to http://myapache.com/PASSTOTOMCAT/rest/getSecureData go to tomcat.
Now I'm asked to add authentication to everything, both the user pages and the REST calls. It would obviously be ideal for the user to sign in once for access to both.
What library can I use for this? I don't think I can use any PHP-based solution (i.e. one that involves adding an include to every page), because the pass-through URLs won't have a chance to run this code and check for authentication. I think I need to use something built into apache2 itself.
One minor requirement is that I would like the user credentials stored in a mysql database as opposed to a file.
Am I over-thinking this?
Thanks in advance
Well it's been 5 days, so I guess I'll answer my own question...
I ended up using the new mod_auth_form for authentication, because it lets you use a nicely styled web page to log users in.
I also used mod_dbd to access the user credentials in MySQL.
I couldn't find a nice tutorial on this, so I struggled through the installation and setup a bit; but if anyone cares, I created a set of instructions on my blog in case anyone else tries to do the same thing:
Installation
Setup
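For a rough idea of the shape of that configuration in Apache 2.4 (module loading aside; the database parameters, query, table, and login page below are assumptions, not the blog's exact setup):

    # mod_dbd: connection to the MySQL credential store
    DBDriver mysql
    DBDParams "host=localhost dbname=auth user=apache pass=secret"

    <Location "/secure">
        AuthType form
        AuthName "Protected Area"
        AuthFormProvider dbd
        # mod_authn_dbd looks the stored password up with this query.
        AuthDBDUserPWQuery "SELECT password FROM users WHERE username = %s"
        # mod_auth_form keeps the login state in a session cookie.
        Session On
        SessionCookieName session path=/
        # Unauthenticated users are served the styled login page.
        ErrorDocument 401 /login.html
        Require valid-user
    </Location>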