iFrames + Google Analytics + Cookies + P3P

I am working on a website that generates traffic for partner sites. When a partner site's logo is clicked on our site, we open the partner site in a page that contains our basic header and the partner site within an iframe. Earlier we were simply opening the partner site in a new window. All cool so far.
Most partner sites use Google Analytics to track the traffic that we send them, and soon after we started opening sites within an iframe, our partners reported that Google Analytics no longer tracks the data (or tracks just a fraction of it).
I have done my fair share of homework/research on the googleverse and found the known issue with Google Analytics, and cookies in general, across domains and iframes.
I am trying to resolve this issue and the only solution that has been referenced is the use of P3P headers.
First, where do the P3P headers go: in my site's pages or the partner sites' pages? Since we have many partner sites (big and small), it won't be practical if the solution is to put tags in each of these sites. I can easily have them added to the page that contains the iframe.
Among the various P3P header generators, is there a reliable one that you recommend?
Is there any way around this issue? I really need to open the sites in iframes and obviously the partner sites really need to track the traffic.
Thank you for the help.

Unfortunately, both you and the partner site need to set the headers.
Alternatives:
If you do not want the partner site to set headers, one option is to lower the security level (in IE) or allow third-party cookies (in Firefox) in the browser settings. Every client has to do this, so this may not be an attractive solution.
Use localStorage (an HTML5 feature). The iframed page can read and write its own origin's localStorage, and most browsers that support it allow this even when the iframe's third-party cookies are blocked, so state that would otherwise live in a cookie can be kept there instead (see the sketch below). This may not be feasible in the short term, as it requires both you and your partner sites to implement saving/reading information to/from localStorage, and not every browser supports it (older IE browsers especially).
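For illustration, here is a minimal sketch of that fallback as it might run inside the iframed partner page (TypeScript; the key name and id scheme are made up for the example):
// Keep a visitor id in the iframe's own localStorage; most browsers of
// this era let the iframed page use its own origin's localStorage even
// when its third-party cookies are blocked.
function getVisitorId(): string {
  let id = localStorage.getItem("visitorId");
  if (id === null) {
    id = Math.random().toString(36).slice(2); // crude id, fine for a sketch
    localStorage.setItem("visitorId", id);
  }
  return id;
}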
To add a basic policy header (ideally you should generate your own policy, which is straightforward; check item #2 below):
in PHP, add this line:
<?php header('P3P: CP="CAO PSA OUR"'); ?>
in ASP.NET:
HttpContext.Current.Response.AddHeader("p3p", "CP=\"CAO PSA OUR\"");
in HTML pages:
<meta http-equiv="P3P" content='CP="CAO PSA OUR"'>
Regarding your other concerns:
1) P3P headers refer to the HTTP header that delivers something called a compact policy to the browser. Without such a policy in place, IE (most notably) and other browsers will block access to third-party cookies (which is how an iframe's cookies are classified) in order to protect users' privacy.
As far as Google Analytics goes, both your site and the partner site still need to configure cross-domain tracking as outlined in Google's documentation (see the sketch after this list).
2) You can use this basic policy header (which is enough to fix the iframe's cookies):
P3P: CP="CAO PSA OUR"
or generate your own. If you're not sure what those compact-policy tokens mean, they are defined in the W3C P3P specification.
To generate such a policy, you can use an online editor such as p3pedit.com or IBM's tool, which presents a set of questions and lets you pick answers, making it quick and easy to build a policy. You can generate the full policy XML, the compact policy, and more.
3) You can try the two alternatives mentioned above.
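For reference, a rough sketch of that cross-domain setup with the classic ga.js async API (TypeScript; the property ID and partner URL are placeholders - Google's cross-domain tracking documentation is the authoritative source):
// On both your page and the partner page: enable the linker so the
// visitor id can travel in the URL instead of relying on cookies.
const _gaq: any[] = ((window as any)._gaq = (window as any)._gaq || []);
_gaq.push(["_setAccount", "UA-XXXXXXX-1"]); // placeholder property ID
_gaq.push(["_setDomainName", "none"]);      // the two domains differ entirely
_gaq.push(["_setAllowLinker", true]);       // accept linked-in visitor ids
_gaq.push(["_trackPageview"]);
// On the framing page only: decorate the iframe URL with the linker
// values so the partner page can read them from the query string.
_gaq.push(function () {
  const tracker = (window as any)._gat._getTrackerByName();
  const frame = document.querySelector("iframe");
  if (frame) frame.src = tracker._getLinkerUrl("https://partner.example.com/");
});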
Steps to add the policy to your entire site
Generate a compact policy (using one of the tools mentioned earlier) or use the basic policy
In IIS, right-click the desired page, directory, or site, and then click Properties.
On the HTTP Headers tab, click Add.
In the Custom Header Name field, type P3P.
In the Custom Header Value field, enter your Compact P3P Policy (or the basic one from above) and then click OK.
In Apache, a mod_headers line like this will do:
Header append P3P "CP=\"CAO PSA OUR\""
Hope this helps.

Related

Cookie wall and content cloaking

To comply with the European cookie law, we need to implement a cookie wall, but search engines should still be able to see and index the actual page content, not the cookie wall.
Searching online, I found that many people recommend checking the user agent and serving the actual content to bots and crawlers while showing the cookie wall to real users. Popular WordPress cookie-wall plugins are implemented the same way, branching on bots/crawlers versus real users.
My question is: does Google count this as content cloaking and penalize the SEO ranking or not? Or is there another way to implement a cookie wall without affecting the SEO ranking?
Cloaking is a search engine optimization (SEO) technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page.
Cloaking takes a user to sites other than those he or she expects by disguising the sites' true content. During cloaking, the search engine spider and the browser are presented with different content for the same web page. HTTP header information or IP addresses assist in sending the wrong web pages. Searchers will then access websites that contain information they simply were not seeking, including pornographic sites. Website directories also offer up their share of cloaking techniques.
Many of the larger search engine companies oppose cloaking because it frustrates their users and does not comply with their standards. In the search engine optimization (SEO) industry, cloaking is considered to be a black hat technique that, while used, is frowned on by most legitimate SEO firms and Web publishers. Getting caught cloaking can result in huge penalties from the search engines, including being removed from the index altogether.
So, yeah, this counts as cloaking.
Put the cookie disclaimer in an <aside> element, and make sure you initialise the element with a bit of JavaScript for old Internet Explorer, since <aside> is HTML5-only (see the sketch below). Google will generally ignore such elements based on their content, their position, and their relevance to the rest of the page.
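A minimal sketch of that initialisation (this is essentially what libraries like html5shiv do; run it before the stylesheet kicks in):
// IE 6-8 refuse to style unknown elements such as <aside> until one has
// been created via script, so do that once, up front.
document.createElement("aside");
// The notice itself then goes in markup like:
//   <aside class="cookie-notice"> ... </aside>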

Parallel website running alongside my original website

We have been working on a gaming website. Recently, while making note of the major traffic sources, I noticed a website that is a carbon copy of our website. It uses our logo and everything else is the same as ours, but under a different domain name. It cannot simply be that their domain name points to ours, because in several places the links look like ccwebsite/our-links. That website even links to some images as ccwebsite/our-images.
What has happened? How could they have done that? What can I do to stop this?
There are a number of things they might have done to copy your site, including but not limited to:
Using a tool to scrape a complete copy of your site and place it on their server
Pointing their DNS name at your site
Manually re-creating your site as their own
Responding to requests to their site by scraping yours in real time and returning that as the response
etc.
What can I do to stop this?
Not a whole lot. You can try to prevent direct linking to your content by requiring referrer headers for your images and other resources, so that requests need to come from pages you serve, but 1) those headers can be faked, and 2) not all browsers send them, so you'd break a small percentage of legitimate users. This also won't stop anybody from copying content, just from "deep linking" to it.
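For illustration, here's what that gating could look like as Express middleware in TypeScript (host and folder names are made up; adapt to whatever stack you actually run):
import express from "express";

const app = express();

// Only serve images when the Referer points at our own pages. As noted
// above, the header can be absent or faked, so treat it as a speed bump.
app.use("/images", (req, res, next) => {
  const referer = req.headers.referer ?? "";
  if (!referer.startsWith("https://www.example.com/")) {
    res.status(403).send("Direct linking is not allowed");
    return;
  }
  next();
});

app.use("/images", express.static("images"));
app.listen(3000);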
Ultimately, by having a website you are exposing that information to the internet. On a technical level anybody can get that information. If some information should be private you can secure that information behind a login or other authorization measures. But if the information is publicly available then anybody can copy it.
"Stopping this" is more of a legal/jurisdictional/interpersonal concern than a technical one I'm afraid. And Stack Overflow isn't in a position to offer that sort of advice.
You could run your site with some lightweight authentication. Just issue a cookie passively when a visitor pulls a page, and require the cookie to get access to resources. If a user visits your site and then the parallel site, they'll still be able to get in, but a user who only knows about the parallel site and has never visited the real site will just see a crap ton of broken links and images. This could be enough to discourage your doppelganger from keeping his site up.
Another (similar but more complex) option is to implement a CSRF mitigation. Even though this isn't a CSRF situation, the same mitigation will work. Essentially, you'd issue a cookie as described above, but in addition insert the cookie value into the URLs for everything and require the two to match (see the sketch below). This requires a bit more work (you'll need a filter or module inserted into the pipeline) but will keep out everybody except your own users.
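A sketch of that pairing as Express middleware in TypeScript (cookie and parameter names are made up):
import crypto from "crypto";
import express from "express";
import cookieParser from "cookie-parser";

const app = express();
app.use(cookieParser());

// Passively issue a token cookie on every page view.
app.use((req, res, next) => {
  if (!req.cookies.site_token) {
    const token = crypto.randomBytes(16).toString("hex");
    res.cookie("site_token", token, { httpOnly: true });
    req.cookies.site_token = token; // make it visible to this request too
  }
  res.locals.siteToken = req.cookies.site_token; // templates append ?t=<token> to URLs
  next();
});

// Resources refuse to load unless the URL token matches the cookie; a
// mirror's captured HTML carries a token its visitors' cookies won't have.
app.get("/images/:name", (req, res) => {
  if (req.query.t !== req.cookies.site_token) {
    res.status(403).end();
    return;
  }
  res.sendFile(req.params.name, { root: "images" });
});

app.listen(3000);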

Is there something a site can do to incorporate third party cookies

I work for an e-commerce site. Part of what we do is offer customized items to some clients. Recently some non-technical management promised that we could incorporate our check-out process into one such client's website. The only way we've figured out how to do this is by using an iframe (I know, I don't like it either). The issue is that most customers of this site are unable to check out, because we use cookies to determine which custom items to display. Browsers are recognizing our cookies as third-party, and almost everybody has third-party cookies turned off, as they should. I'm going to be shocked if the answer is yes, but is there any workaround for this? I.e. can the site hosting our iframe somehow supply the necessary cookie?
Try an invisible, interstitial page.
Essentially, the hosting site would issue a redirect to a page within your domain, which is then free to set cookies (because at this point it is actually the first party). Then your site immediately redirects back to the hosting site. At this point your newly created cookies will be invisible to the hosting site but visible to your iframed page henceforth.
Unfortunately the hosting site will have to do this every time a cookie is to be updated but the double-redirect can happen so quickly they'll hardly notice. Hopefully your system only needs the cookies to be set once.
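For illustration, a sketch of the interstitial endpoint on the checkout (iframed) domain as an Express handler in TypeScript; the route, cookie name, and partner host are all made up:
import express from "express";

const app = express();

// The hosting site redirects the browser here; for this one request we
// are the first party, so the cookie is accepted, then we bounce back.
app.get("/first-party-cookie", (req, res) => {
  const returnTo = String(req.query.returnTo ?? "");
  // White-list the return target so this doesn't become an open redirect.
  if (!returnTo.startsWith("https://partner.example.com/")) {
    res.status(400).send("unexpected returnTo");
    return;
  }
  res.cookie("custom_items", "client-specific-catalog-id", { httpOnly: true });
  res.redirect(returnTo);
});

app.listen(3000);
The hosting site would then send new visitors through https://checkout.example.com/first-party-cookie?returnTo=... once before showing the iframe.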
Instead of using a cookie, pass the information in each URL request as name/value pairs.
It is a bit of a pain to add the name/value pairs to every URL... I know... oh well... it will work.
I'm going to be shocked if the answer is yes, but is there any workaround for this? ie can the site hosting our iframe somehow supply the necessary cookie?
Your iframed page itself, which is the third party in this scenario, could send a P3P compact policy header – some browsers then accept third-party cookies by default, whereas others (mainly Safari) will not accept them at all unless the user changes the default settings themselves.
What you could also do is pass the session id not (only) by cookie but as a GET or POST parameter as well – e.g. under PHP this can be done quite easily by configuring the session options (see below). You should consider whether that's worth the slightly increased risk of session stealing.
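For reference, the relevant session options as php.ini directives (this reflects PHP 5-era trans-sid behaviour; double-check against your PHP version's documentation):
; Rewrite URLs so they carry the session id automatically.
session.use_trans_sid = 1
; Allow the session id to arrive via GET/POST, not only via a cookie.
session.use_only_cookies = 0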
The interstitial page solution should work but it might be a lot of trouble for your hosting site, so here's another solution that will allow you to work cookieless.
Write an HttpModule that responds to the BeginRequest event, reads the query string, and inserts corresponding cookie headers into the Context.HttpRequest object (note: you can't use AddCookie, you have to use AddHeader, because cookies added by a module directly are disposed of before they hit your application proper). That way the hosting site can simply issue a request (within the iframe) that contains the necessary value in the query string; the module will convert it into a cookie (one that only exists in memory, not on the wire), and your application will be deceived into thinking there's a cookie there. No code changes are required; you just need to add the module in web.config.
This only works if you are using IIS 7.0+ in integrated pipeline mode. If you're on an earlier version of IIS or if you have to run in classic mode, you'll need an ISAPI filter instead.
Ryan, John:
For the Chrome v80 update with the SameSite flags, we want to set SameSite=None; Secure for the site hosting our iframe and somehow supply the necessary SameSite=None; Secure cookie. We have an Apache 2.2 and Tomcat 6 setup, so I would appreciate a solution and advice on how to make it work. Currently, with the flag enabled, the iframe is not punching out successfully.
Thanks

Github pages & jquery cookie; state saved on page level only

So I just put an old project on GitHub. It was in PHP and had some cookie magic so that when you "invert" the colors, the setting would persist across the site. I saved a rendered HTML version of the site for GitHub, and I also switched all the cookie magic to jquery.cookie so that it would work without a server-side component.
There's a link to the site below.
http://reggi.github.io/csi-bp
So it seems that GitHub Pages sets cookies at page level, and they don't seem to transfer between pages. Is this the case? I'm curious as to what I'm missing.
I know they just switched the domain from github.com to github.io for cookie/security purposes, but I didn't know that it was going to result in this.
It could be a side effect of the newest security measure from GitHub (April 18th):
"Content Security Policy"
Content Security Policy is a new HTTP header that provides a solid safety net against XSS attacks.
It does this by blocking inline scripts and limiting the domains that other scripts can be loaded from.

Cross Domain User Tracking

We have several websites on different domains and I'd like to be able to track users' movements on these sites.
Obviously cookies are not feasible, because they don't cross domain borders.
I could look at a combination of IP address and User Agent, but there are some cases where that does not work.
I don't want to use flash or other plugins.
Any ideas? Or am I doomed to rely on the IP/User_Agent combination?
You can designate one domain or subdomain to tracking and have it serve a 1x1 pixel image which you include in all pages you would like to track. Serve a cookie with the image, look at the tracking domain's server logs, voilà.
This solution requires no JavaScript. Note, however, that the pixel's cookie counts as a third-party cookie everywhere except on the tracking domain itself, so for visitors who block third-party cookies you are left with whatever the server logs alone (IP, user agent, referrer) can tell you.
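For illustration, a sketch of such a pixel endpoint on the dedicated tracking domain, written with Express in TypeScript (names are made up):
import crypto from "crypto";
import express from "express";

const app = express();

// Base64 of a 1x1 transparent GIF.
const PIXEL = Buffer.from(
  "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7",
  "base64"
);

app.get("/t.gif", (req, res) => {
  // Re-use the visitor id if the cookie came back, otherwise mint one;
  // the request (id, referrer, user agent) lands in this domain's logs.
  const id = req.headers.cookie?.match(/vid=([^;]+)/)?.[1]
    ?? crypto.randomBytes(8).toString("hex");
  res.setHeader("Set-Cookie", `vid=${id}; Path=/; Max-Age=31536000`);
  res.type("image/gif").send(PIXEL);
});

app.listen(3000);
Each page you want to track then includes something like <img src="https://track.example.com/t.gif" width="1" height="1" alt=""> and the logs on that single domain tie the visits together.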
First, let's make sure the user agent is sending cookies:
If getCookie("c") == null then setCookie("c", "anyValue")
Then let the request finish (aka wait for next request)
Let's call our tracker cookie uaid.
If GET http://child.com/any-page and getCookie("c") is not null and getCookie("uaid") is null...
Redirect to http://parent.com/give-me-a-uaid?returnTo=http://child.com/any-page
On http://parent.com/give-me-a-uaid, check for cookie uaid
If not exists, create it and add it to response. If it exists, get its value.
Redirect to http://child.com/any-page?uaid=valueOfParentsUAIDCookie
Child.com sets cookie uaid with valueOfParentsUAIDCookie
Redirect to http://child.com/any-page
And of course, you are validating input, and white-listing your redirect URLs :)
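A sketch of the parent.com endpoint in that flow, using Express with cookie-parser in TypeScript (the white-list matches the flow above):
import crypto from "crypto";
import express from "express";
import cookieParser from "cookie-parser";

const app = express();
app.use(cookieParser());

const ALLOWED_HOSTS = new Set(["child.com"]); // white-listed redirect targets

app.get("/give-me-a-uaid", (req, res) => {
  let returnTo: URL | null = null;
  try {
    returnTo = new URL(String(req.query.returnTo));
  } catch {}
  if (returnTo === null || !ALLOWED_HOSTS.has(returnTo.hostname)) {
    res.status(400).send("bad or unknown returnTo");
    return;
  }
  // Re-use the parent-domain cookie if it exists, otherwise create it.
  const uaid = req.cookies.uaid ?? crypto.randomBytes(8).toString("hex");
  res.cookie("uaid", uaid, { maxAge: 365 * 24 * 3600 * 1000 });
  returnTo.searchParams.set("uaid", uaid);
  res.redirect(returnTo.toString());
});

app.listen(3000);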
This question is closely related to the Question Accessing Domain Cookies within an iFrame on Internet Explorer.
For Internet Explorer, I need to take P3P policies into account and set an additional P3P HTTP header to allow images to set cookies across domain borders. Then I can use simon's suggestion.
You can follow the same concept used in Google Analytics: injecting JavaScript into the pages you want to track.
You do not give any context for your situation (just the basic problem), so it is difficult to give an answer that clearly fits. However, here are some techniques/mechanisms for passing information from one page to another, regardless of what domain is involved:
include a hyperlink to a 1x1 pixel transparent GIF image (sometimes called a "beacon")
rely on referrer information in HTTP request headers to identify the page a hyperlink is on
include extra parameters in hyperlinks to the other site - assuming you run both sites
buy the services of a company like Akamai to do user tracking for you
possibly use a cross-domain cookie mechanism in the future, if the standard is ever approved
Which technique fits really comes down to whether you can place software on all of the sites (servers) the user will visit where you have an interest - or whether you cannot place your software on all of them.