Explicitly set document.domain on every page - XSS

Is it a good idea to set document.domain on every page of your site? For example, your site is 'www.example.com', but you also work with 'en.example.com', 'fr.example.com' and 'www.example.com:8444'. All these hosts are awkward to work with when the same-origin policy blocks scripting between them. So is it a terrible idea to explicitly set document.domain = 'example.com' on all pages?

It depends. It means any other subdomain can do the same, so if one with a lower trust level ever comes along (like ads.example.com or survey.example.com), it may come back and bite you.
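If you do go ahead, note that every page that wants to participate has to opt in itself; a minimal sketch, using the hosts from the question:

<script>
// Every cooperating page sets this explicitly. Browsers also null the
// port once document.domain is assigned, so www.example.com:8444 can
// script against the others too.
document.domain = 'example.com';
</script>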

Related

Analytics _setDomainName not working anymore

This suddenly stopped working. We have Google Analytics on our pages, and a couple of months ago we tweaked the code so that GA's cookies would only be set for www.igre123.com and not its subdomains (we redirect from igre123.com to www.igre123.com, so users are always on www.).
We did this because we have two subdomains, s.igre123.com and static.igre123.com, that serve static content (CSS, JS, thumbnails, etc.).
To prevent cookies being set for the subdomains (so they're set only for www.), we modified our GA code to look something like this:
...
_gaq.push(['_setDomainName','www.igre123.com']);
_gaq.push(['_trackPageview']);
...
This did the trick, but now it's not working anymore. Does anybody have any idea why this stopped working? Unfortunately I don't know exactly when it stopped :/
edit: tracking otherwise works without a problem.
OK, after much head banging I managed to solve this mystery. It turns out that the order of the _gaq.push([]) calls matters. If, besides the usual _setAccount and _trackPageview, you have any other GA calls that set cookies (like _setCustomVar), you have to tell GA which domain to set cookies for first. Seems obvious in retrospect, but to be fair, it's not really clear from the documentation.
Anyhow, be sure to put the call to _setDomainName first, and then any other GA calls you might have.
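For illustration, a sketch of the working order - the _setCustomVar line is a hypothetical example of an extra cookie-setting call, and UA-XXXXX-X is a placeholder account ID:

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']);
// The cookie domain must be set before any call that writes cookies:
_gaq.push(['_setDomainName', 'www.igre123.com']);
// Hypothetical cookie-setting call - now scoped correctly:
_gaq.push(['_setCustomVar', 1, 'section', 'games', 3]);
_gaq.push(['_trackPageview']);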

How to scope GA *and* GWO cookies to a single domain

I use both Google Analytics and Google Website Optimizer on www.britely.com, and want to limit their cookies to that domain only. (To avoid cookie overhead in requests for static assets loaded from other subdomains of britely.com that should get some CDN love.)
An example page that uses both (and that currently sets the __utma, __utmb, __utmc and __utmz cookies on .britely.com instead of on the wanted www.britely.com) is http://www.britely.com/ninjamom/s-t-dogs-think
As far as Google's docs go, it seems that a _gaq.push(['_setDomainName', 'none']); call (or the same with 'www.britely.com' instead of 'none') at the top of the page should be enough to achieve this goal.
Somehow, it isn't. I think we used to have even more cookies set on .britely.com before I read through the source of GWO's siteopt.js, which doesn't seem to know about _setDomainName. It does respond to a page-global variable _udn declaring the cookie domain it should use, though - so leading the page with this, at least GWO's __utmx and __utmxx cookies are handled correctly:
<script>
var _gaq = _gaq || [], _udn = 'www.britely.com';
_gaq.push(['_setDomainName', 'none']);
</script>
I know the common way of fixing the cookie overhead issue is to serve static content from some domain entirely different from the one using GA and GWO. That is not the solution I seek.
Besides the above tweaks, the Google Website Optimizer control script also needs its own _gaq.push(['gwo._setDomainName', 'none']); call - similar to the GA one; the setting looks system-global, but isn't.
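Putting both tweaks together, the top of the page ends up looking something like this (a sketch of my setup, not official Google boilerplate):

<script>
// _udn: the page-global cookie domain that GWO's siteopt.js reads.
var _gaq = _gaq || [], _udn = 'www.britely.com';
// Scope GA's cookies to the exact host:
_gaq.push(['_setDomainName', 'none']);
// The GWO tracker does not inherit that setting, so repeat it:
_gaq.push(['gwo._setDomainName', 'none']);
</script>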
With the above setup, all cookies get scoped to www.britely.com except for the __utmx and __utmxx ones, which end up scoped to .www.britely.com for some reason. Good enough for me.

Externally linked images - How to prevent cross site scripting

On my site, I want to allow users to add references to images hosted anywhere on the internet. These images can then be seen by all users of my site. As far as I understand, this could open up the risk of cross-site scripting, as in the following scenario:
User A adds a link to a GIF which he hosts on his own web server. This web server is configured in such a way that it returns JavaScript instead of the image.
User B opens the page containing the image. Instead of seeing the image, the JavaScript is executed.
My current security measures are such that all content is encoded both on save and on display.
I am using ASP.NET (C#) on the server and a lot of jQuery on the client to build UI elements, including the generation of image tags.
Is this fear of mine justified? Am I missing any other important security loopholes here? And most important of all, how do I prevent this attack? The only secure way I can think of right now is to request the image URL from the server and check whether it returns anything other than binary image data...
Checking that the file is indeed an image won't help. An attacker could return one thing when your server requests the URL and another when a potential victim makes the same request.
Having said that, as long as you restrict the URL to only ever be printed inside the src attribute of an img tag, then you have a CSRF flaw, but not an XSS one.
Someone could for instance create an "image" URL along the lines of:
http://yoursite.com/admin/?action=create_user&un=bob&pw=alice
Or, more realistically but more annoyingly: http://yoursite.com/logout/
If all sensitive actions (logging out, editing profiles, creating posts, changing language/theme) require tokens, then an attack vector like this won't gain the attacker anything.
But going back to your question: unless there's some current browser bug I can't think of, you won't have XSS. Oh, and remember to ensure the image URL doesn't include odd characters - an image URL of "><script>alert(1)</script><!-- may obviously have bad effects. I presume you know to escape that.
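Since you're generating the img tags with jQuery anyway, one way to sidestep the escaping pitfall is to assign the URL through .attr() instead of concatenating markup - .attr() sets a DOM attribute and never parses the value as HTML. A sketch (the #gallery container is made up):

// Risky: the URL is spliced into markup, so '"><script>...' can break out.
// $('#gallery').append('<img src="' + url + '">');

// Safer: the value is set as a DOM attribute, never parsed as HTML.
$('<img>').attr('src', url).appendTo('#gallery');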
Your approach to security is incorrect.
Don't approach the topic as "I have user input, so how can I prevent XSS?" Rather, approach it as "I have user input - it should be as restrictive as possible, allowing nothing through by default." Then, from that baseline, allow only what's absolutely essential: a plain-text string thoroughly sanitized so that nothing but a URL gets through, with only the characters a URL actually needs. Once it is sanitized, allow only images. Testing for that is hard because it can be easily tricked, but it should still be tested for. Finally, because you're accepting an input field, make sure everything from JavaScript and escape characters to HTML, XML and SQL injection is converted to plain text and rendered harmless. Treat your users as being both idiots and hackers: assume they'll input everything incorrectly and try to hack something through your input field.
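As a concrete starting point, a sketch of that whitelist idea in JavaScript - it assumes a modern environment with the URL constructor, the authoritative copy of the check must run on the server, and the extension test is the part that, as noted, can be tricked:

function isAllowedImageUrl(input) {
  var url;
  try {
    url = new URL(input); // rejects anything that isn't a parseable absolute URL
  } catch (e) {
    return false;
  }
  // Whitelist schemes: no javascript:, data:, file:, etc.
  if (url.protocol !== 'http:' && url.protocol !== 'https:') {
    return false;
  }
  // Only expected image extensions - weak on its own, but worth keeping.
  return /\.(gif|jpe?g|png)$/i.test(url.pathname);
}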
Aside from that, you may run into some legal issues with regard to copyright. Copyrighted images generally may not be used on other people's sites without the copyright owner's consent, usually obtained in writing (or by email). Allowing users to simply lift images from another site risks them reposting copyrighted material on your site without permission, which is illegal. Some sites are okay with citing the source, others require a fee to be paid, and others will sue you and bring your whole domain down for copyright infringement.

Cookies tend to be top-down. Is there a way to ban them from certain directories?

On a website where cookies are used for top-level pages (such as example.com/test.php or example.com/whatever.php), is it possible to ban cookies from certain directories such as /images/, or am I just going to have to use a second domain (static.example.com/images/photo.jpg)?
Does anyone know of a workaround? It's for a CMS, where I may not always be able to create a second domain.
Change your top-level pages to be one level down: example.com/test.php -> example.com/app/test.php
You can then set your cookies on example.com/app and they will not be sent to example.com/images
(Also, your workaround won't quite work as stated: static.example.com/images/photo.jpg will still get any cookies set for .example.com. The static host needs a different parent domain, e.g. example2.com/images.)
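For reference, a path-scoped cookie set from script looks like this - the name and value are made up, and the same path attribute works in a Set-Cookie response header:

// Sent with requests under example.com/app/..., but not with
// requests for example.com/images/photo.jpg:
document.cookie = 'session=abc123; path=/app';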
No, sorry - you can't exclude a directory. Cookies are scoped to a domain and a path prefix, so anything set for the root path is sent to every directory below it.

URL Rewrite in DotNetNuke remove chunk of address (and read cookie?)

I am working on a DotNetNuke application using the iFinity URL Master module. (that may be irrelevant, as a solution may be platform independent)
What I have is a site with addresses based on language, so
www.thesite.com/en/products/towels/redtowel
is the English version and
www.thesite.com/de/products/towels/redtowel
is the German version.
What I need to do is allow a user (who has already visited the site and has a cookie set with their language) to go to www.thesite.com/products/towels/redtowel and be taken to www.thesite.com/en/products/towels/redtowel if their cookie is set to English, or to /de/products/towels/redtowel if it is set to German.
How would I do this?
If it were me and I didn't want to spend a lot of time programming, I would look at something like this:
http://www.snowcovered.com/snowcovered2/Default.aspx?tabid=242&PackageID=10059
It could then do a redirect based on the cookie. Otherwise, with iFinity I think you can do something like that, but not exactly (I may be wrong on that - I'm not a fan of the iFinity URL rewriter).
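If you'd rather roll it yourself, the redirect logic itself is small. A client-side sketch, assuming (hypothetically) a cookie named 'lang' holding 'en' or 'de'; a server-side rewrite would avoid the extra round trip:

<script>
// Read the language cookie, defaulting to English if it's missing:
var match = document.cookie.match(/(?:^|;\s*)lang=(en|de)(?:;|$)/);
var lang = match ? match[1] : 'en';

// Only rewrite paths that don't already carry a language prefix:
if (!/^\/(en|de)\//.test(location.pathname)) {
  location.replace('/' + lang + location.pathname);
}
</script>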