One of my clients uses Sellerdeck as their shopping cart solution. I am currently implementing a service for them that relies heavily on cookies.
The cookie is set on a product page which has a URI that is something like http://www.mydomain.co.uk/retail/acatalog/A11-Insect-Net.html. When I browse around the site, I can see the cookie set on all pages, like it is supposed to.
Then when I go into the checkout process, Sellerdeck apparently starts using Perl, because the URI changes to something like http://www.mydomain.co.uk/cgi-bin/retail/ca001000.pl. The weird thing is that, although we're still on the same domain, I can't see the cookie. When I go back to the product pages it is there again.
Does anyone know why this may be?
Turns out the cookie was tied to a specific path and not to /. Fixed now.
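For anyone hitting the same problem: the fix is to set the cookie's path attribute to the site root instead of letting it default to the directory of the page that set it. A minimal client-side sketch in TypeScript (the cookie name and value are placeholders, not Sellerdeck's):

```typescript
// Hypothetical cookie name and value; the important part is "path=/",
// which makes the cookie visible under /retail/acatalog/ and /cgi-bin/ alike.
function setSiteWideCookie(name: string, value: string, days = 30): void {
  const expires = new Date(Date.now() + days * 24 * 60 * 60 * 1000).toUTCString();
  document.cookie = `${encodeURIComponent(name)}=${encodeURIComponent(value)}; expires=${expires}; path=/`;
}

setSiteWideCookie("myServiceCookie", "some-value");
```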
I've found that a cookie is a random string which the web server sends to each client to remember the user's information. But I don't understand, from a programmer's point of view, what a cookie is used for.
For example, I've used the EditThisCookie extension in Chrome to read wikipedia.org's cookies, as in the picture linked below. The value of this cookie (sessionId) seems useless to me as a programmer (EDIT: I mean I can't extract any information from this cookie; I know cookies are very important for web developers, so sorry about my poor wording). If I get this cookie, what kind of information can I learn about the user?
Looking for some help! Thank you very much!
The cookie example:
http://i102.photobucket.com/albums/m86/dienkun1/cookie_example_zps455f0dad.png
EDIT: Sorry, I expressed my problem in the wrong way.
Actually, I am going to write an extension for collecting users' preferences via their cookies, but I can't figure out what information can be extracted from a cookie. I've read about cookies in many documents, like Wikipedia, and all of them just show how to get a cookie, the definition of a cookie, how cookies are classified... and nothing about what information we can get from a cookie.
Thank you very much!
Why do you say that the sessionId is useless for programmers? It actually can be extremely useful. Somewhere on Wikipedia's servers, they're probably storing quite a bit of information about your session. This could range from things like whether you've already hidden one of their fundraising banners (so that it won't keep showing it to you again and again) to things that are required for basic functionality, such as which user you are currently logged in as.
However, Wikipedia is storing this same information for millions of sessions. It needs a way to tie the information back to each individual browser. That sessionId is how it does so. It set the sessionId in a cookie when you first accessed the page, and that cookie gets sent back to the server with every request you make to it now. Then they have code on the back end that reads that sessionId from the cookie and uses it to look up all of the information specific to your session, and do whatever needs to be done with it.
You could of course store the session information itself in the cookies, but there are a couple of problems with that. First, there are limits on the size of each cookie, and on the overall size of all cookies for a single domain. Some of the data you want to store might not even fit. But the bigger problem is that cookies can be very easily manipulated by the end user. If you stored which user is logged in directly in a cookie, the user could just change that value to something else and suddenly be logged in as someone else! Of course, it's also possible that the user could change their sessionId to be some other user's session and suddenly be logged in as them. That's why session IDs need to be as random as possible, and should be long enough that guessing someone else's is basically impossible.
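To make that concrete, here is a minimal sketch in TypeScript with Express and cookie-parser (my assumed stack, not Wikipedia's; the cookie name and session fields are illustrative) of a server that hands out a long random session ID in a cookie and keeps the actual session data on its own side:

```typescript
import express from "express";
import cookieParser from "cookie-parser";
import { randomBytes } from "crypto";

interface SessionData {
  userId?: string;        // e.g. which user is logged in
  bannerHidden?: boolean; // e.g. whether the fundraising banner was dismissed
}

// The real session data lives here, on the server; the browser only sees the ID.
const sessions = new Map<string, SessionData>();

const app = express();
app.use(cookieParser());

app.use((req, res, next) => {
  let id: string | undefined = req.cookies["sessionId"];
  if (!id || !sessions.has(id)) {
    // Long and random, so guessing someone else's ID is practically impossible.
    id = randomBytes(32).toString("hex");
    sessions.set(id, {});
    res.cookie("sessionId", id, { httpOnly: true, path: "/" });
  }
  (req as any).sessionData = sessions.get(id);
  next();
});

app.get("/", (req, res) => {
  const session = (req as any).sessionData as SessionData;
  res.send(session.userId ? `Hello, ${session.userId}` : "Hello, anonymous visitor");
});

app.listen(3000);
```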
Well, why would someone bother writing a sessionId to a cookie if it's useless?
Cookies are extremely useful when it comes to (e.g.) identifying users on your site so you can keep them logged in, counting their visits, and tracking them on your site and even beyond... to name just a few use cases.
To cite a somewhat popular site (wikipedia.org):
Cookies were designed to be a reliable mechanism for websites to remember stateful information (such as items in a shopping cart) or to record the user's browsing activity (including clicking particular buttons, logging in, or recording which pages were visited by the user as far back as months or years ago).
The most important word here is "stateful".
I have searched for a way to clear cookies from a browser for a specific site just by appending parameters to the end of a URL. I assume that if this even existed it would be browser dependent. Since I found nothing within a 30-minute search, I assume that this doesn't exist for any browser. I hope that I'm wrong.
It is an odd situation, so here's why I need this. I use BigCommerce, and I need to clear out everything from my cart. I spent yesterday afternoon trying to find a BigCommerce argument that allows for this. After talking with support, they don't have a clear-cart function like other shopping cart platforms have. They only have a clear-item function/parameter that you can append to the end of a URL.
If I could clear all the cookies created by my site by a user clicking on a link, it would clear the contents of the cart. I have tried to hack the clear item parameters, but never could find a way to clear all items.
The clear item URL is like this: mysite.com/cart.php?action=remove&item=52fa8fd1e398b
Speaking generally first, no - there is no part of the specification or of browser implementations that supports arbitrarily clearing all cookies for a particular domain through HTTP GET parameters. It would be a horrible security hole if it existed in either specification or practice. If it were possible, I could maliciously redirect your browser to some known site and destroy information that you did not want destroyed.
Now to your particular situation ... I gather that what you want to do is not so much clear the cookies from your own browser in this particular site, but rather, you are trying to implement a "clear cart" button on your own BigCommerce site. Is that right?
If so, cookies are probably not the path forward that you are seeking. I don't think BigCommerce exposes shopping cart functions through their API. But I would be interested to hear from the BigCommerce folks on that.
Your best bet for adding a "clear cart" function is to add JavaScript to the page that iterates over all cart items and makes the HTTP call you mentioned for each one. If you do this, make sure to have some graceful fallback for the case where they change the URL. :)
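For example, a rough browser-side sketch in TypeScript - the .cart-item selector and data-item-id attribute are assumptions about your cart markup, not documented BigCommerce behaviour - that walks the cart and calls the remove URL for each item:

```typescript
// Assumed markup: each cart row carries its item hash, e.g.
// <tr class="cart-item" data-item-id="52fa8fd1e398b">...</tr>
async function clearCart(): Promise<void> {
  const rows = document.querySelectorAll<HTMLElement>(".cart-item");
  for (const row of rows) {
    const itemId = row.dataset.itemId;
    if (!itemId) continue;
    // Same remove-item endpoint as the URL in the question.
    await fetch(`/cart.php?action=remove&item=${encodeURIComponent(itemId)}`, {
      credentials: "same-origin",
    });
  }
  window.location.reload(); // show the (hopefully empty) cart
}
```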
I work for an e-commerce site. Part of what we do is to offer customized items to some clients. Recently some non-technical management promised that we could incorporate our check-out process into one such client's website. The only way we've figured out how to do this is by using an iframe (I know, I don't like it either). The issue is that most customers of this site are unable to check out because we use cookies to determine which custom items to display. Browsers are recognizing our cookies as third-party, and almost everybody has third-party cookies turned off, as they should. I'm going to be shocked if the answer is yes, but is there any workaround for this? i.e. can the site hosting our iframe somehow supply the necessary cookie?
Try an invisible, interstitial page.
Essentially the hosting site would issue a redirect to a site within your domain, which is then free to set cookies (because at this point it is actually the first party). Then your site immediately redirects back to the hosting site. At this point your newly-created cookies will be invisible to the hosting site but visible to your iFramed page henceforth.
Unfortunately the hosting site will have to do this every time a cookie is to be updated but the double-redirect can happen so quickly they'll hardly notice. Hopefully your system only needs the cookies to be set once.
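Sketched in TypeScript with Express (an assumed stack; the route, cookie, and domain names are placeholders), the interstitial endpoint on your own domain might look roughly like this:

```typescript
import express from "express";

const app = express();

// 1. The hosting site redirects the browser to
//    https://your-domain.example/set-cookie?value=...&returnTo=https://hosting-site.example/page
// 2. We are the first party here, so the cookie is accepted.
// 3. We immediately bounce the browser back to the hosting site.
app.get("/set-cookie", (req, res) => {
  const value = String(req.query.value ?? "");
  const returnTo = String(req.query.returnTo ?? "");

  // Whitelist return URLs so this can't be abused as an open redirect.
  const allowed = ["https://hosting-site.example/"];
  if (!allowed.some((prefix) => returnTo.startsWith(prefix))) {
    return res.status(400).send("Invalid returnTo");
  }

  res.cookie("customItems", value, { path: "/", httpOnly: true });
  return res.redirect(302, returnTo);
});

app.listen(3000);
```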
Instead of using a cookie, pass the information in each URL request as name/value pairs.
It is a bit of a pain to add the name/value pairs to every URL... I know... oh well... it will work.
I'm going to be shocked if the answer is yes, but is there any workaround for this? i.e. can the site hosting our iframe somehow supply the necessary cookie?
Your iframed page itself, which is the third party in this scenario, could send a P3P Cookie Policy header – some browsers then accept third-party cookies by default, whereas others (mainly Safari) will not be convinced to do so at all unless the user changes the default settings themselves.
What you could also do is pass the session id not (only) by cookie, but as a GET or POST parameter as well – e.g. under PHP this can be done quite easily by configuring the session options. You should consider whether that’s worth the slightly increased risk of session stealing.
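A minimal sketch of the P3P suggestion in TypeScript with Express (an assumed stack; the compact-policy tokens below are only an example and would need a matching full policy to be meaningful):

```typescript
import express from "express";

const app = express();

// Send a P3P compact policy with every response from the iframed site.
// Older IE versions then accept its cookies even as a third party.
app.use((req, res, next) => {
  res.setHeader("P3P", 'CP="CAO PSA OUR"'); // example tokens, not a real policy
  next();
});

app.get("/iframe-content", (req, res) => {
  res.cookie("sessionId", "abc123", { path: "/" }); // placeholder value
  res.send("iframe content");
});

app.listen(3000);
```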
The interstitial page solution should work but it might be a lot of trouble for your hosting site, so here's another solution that will allow you to work cookieless.
Write an HttpModule that responds to the BeginRequest event, reads the querystring, and inserts corresponding cookie headers into the Context.HttpRequest object (Note: you can't use AddCookie, you have to use AddHeader, because cookies added by a module directly are disposed of before they hit your application proper). That way the hosting site can simply issue a request (within the iFrame) that contains the necessary value in the querystring, the module will convert it into a cookie (that only exists in memory, not on the wire), and your application will be deceived into thinking that there's a cookie there. No code changes required, you just need to add the module in web.config.
This only works if you are using IIS 7.0+ in integrated pipeline mode. If you're on an earlier version of IIS or if you have to run in classic mode, you'll need an ISAPI filter instead.
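The answer above is specific to ASP.NET/IIS; purely as a conceptual analogue (not the author's .NET code), here is a hedged TypeScript/Express sketch of the same idea: copy a value from the querystring into the request's Cookie header before the application reads it. The "ticket" parameter name is hypothetical.

```typescript
import express from "express";

const app = express();

// If the hosting page requests .../page?ticket=XYZ inside the iframe,
// fold that value into the Cookie header so downstream code that only
// looks at cookies still finds it.
app.use((req, res, next) => {
  const ticket = req.query.ticket;
  if (typeof ticket === "string" && ticket.length > 0) {
    const existing = req.headers.cookie ? req.headers.cookie + "; " : "";
    req.headers.cookie = `${existing}ticket=${encodeURIComponent(ticket)}`;
  }
  next();
});

app.get("/page", (req, res) => {
  // Downstream handlers can now parse req.headers.cookie as usual.
  res.send(`Cookie header seen by the app: ${req.headers.cookie ?? "(none)"}`);
});

app.listen(3000);
```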
Ryan, John
For the Chrome v80 update with SameSite flags, we want to set samesite=none;secure for the site hosting our iframe and somehow supply the necessary samesite=none;secure cookie. We have an Apache 2.2 and Tomcat 6 setup, so we would appreciate a solution and advice on how to make it work. Currently, with the flag enabled, the iframe is not punching out successfully.
Thanks
I would like Akamai not to cache certain URLs if a specified cookie exists (i.e. if a user is logged in on specific pages). Is there any way we can do this with Akamai?
The good news is that I have done exactly this in the past for the Top Gear site (www.topgear.com/uk). The logic goes that if a cookie is present (in this case "TGCACHEKEY") then the Akamai cache is to be bypassed for certain URL paths. This basically turns off Akamai caching of HTML pages when logged in.
The bad news is that you require an Akamai consultant to make this change for you.
If this isn't an option for you, then Peter's suggestions are all good ones. I considered all of these before implementing the cookie based approach for Top Gear, but in the end none were feasible.
Remember also that Akamai strips cookies for cached resources by default. That may or may not affect you in your situation.
The Edge Server doesn't check for a cookie before it makes the request to your origin server, and I have never seen anything like that in any of their menus, configuration screens or documentation.
However, there are a few ways I can think of that you can get the effect that I think you're looking for.
You can specify in the configuration settings for the respective digital property what path(s) or URL(s) you don't want it to cache. If you're talking about a logged-on user, you might have a path that only they would get to, or you could set up such a thing server-side. E.g. for an online course you would have www.course.com/php.html that anybody could get to, whereas you might use www.course.com/student/php-lesson-1.html for the actual logged-on lesson content. Specifying that /student/* should not be cached would solve that.
If you are serving the same pages to both logged-on and not-logged-on users and can't do it that way, you could check server-side if they're logged on and, if so, add a cache-breaker to the links so that when they follow a link a cache-breaker is automatically added. You could also do this client side if you want, but it would be more secure and faster to do it server-side. As a note, the cache-breaker could be something like userid-random# (see the sketch after these options). That would keep it unique enough, when combined with the page, that nobody else would request it and get the earlier 'cache-broken' page.
If neither of the above is workable, there is one other way I can think of, which is a bit unconventional to say the least, but it would work. Create a symbolically linked directory in your document root with another name so that you can apply the first option and exempt it from caching. Then you check if the user is logged on and, if so, prepend the extra directory to the links. From Akamai's point of view www.mysite.com/logged-on/page.html can be exempt from caching while www.mysite.com/content/page.html is cached. On your server, if /logged-on/ symbolically links to /content/ then you're all set.
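Here is the sketch referred to in the second option, in TypeScript (an assumed stack; the cb parameter name and the helper are mine, not Akamai's): a server-side helper that appends a userid-random# cache-breaker to internal links for logged-in users, so Akamai sees a URL it has never cached and goes back to the origin.

```typescript
import { randomBytes } from "crypto";

// When rendering a page for a logged-in user, pass internal links through
// this helper; the resulting URLs are unique enough never to hit the cache.
function addCacheBreaker(url: string, userId: string): string {
  const breaker = `${userId}-${randomBytes(4).toString("hex")}`; // e.g. "1234-a1b2c3d4"
  const separator = url.includes("?") ? "&" : "?";
  return `${url}${separator}cb=${encodeURIComponent(breaker)}`;
}

// addCacheBreaker("/student/php-lesson-1.html", "1234")
//   -> "/student/php-lesson-1.html?cb=1234-a1b2c3d4"
```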
When they log in you could send them to a subdomain which is set up as a ServerAlias, so on your side it's the same site, but Akamai has different cache-handling rules for it.
Following the same approach as #llevera's answer, you can use cookies on CloudFlare without the intervention of engineers to make the change for you.
Having that sort of cookie to bypass cached content is a technique that is becoming more popular over time, and even big companies like Magento are using it for the Magento 2 platform.
But the solutions above are still valid. Maybe Akamai supports that already now; we are in 2017!
We have several websites on different domains and I'd like to be able to track users' movements on these sites.
Obviously cookies are not feasible, because they don't cross domain borders.
I could look at a combination of IP address and User Agent, but there are some cases where that does not work.
I don't want to use flash or other plugins.
Any ideas? Or am I doomed to rely on the IP/User_Agent combination?
You can designate one domain or subdomain for tracking and have it serve a 1x1-pixel image which you include in all pages you would like to track. Serve a cookie with the image, look at the tracking domain's server logs, voilà.
This solution requires no JavaScript, and works even if the user disables third-party cookies.
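A minimal sketch of that tracking endpoint in TypeScript with Express (an assumed stack; the cookie name and logging are placeholders):

```typescript
import express from "express";
import { randomUUID } from "crypto";

const app = express();

// A 1x1 transparent GIF (the classic 43-byte image), decoded once at startup.
const PIXEL = Buffer.from(
  "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7",
  "base64"
);

app.get("/pixel.gif", (req, res) => {
  // Parse the tracking cookie by hand to keep the sketch dependency-free.
  const match = /(?:^|;\s*)trackerId=([^;]+)/.exec(req.headers.cookie ?? "");
  const trackerId = match ? match[1] : randomUUID();
  if (!match) {
    res.cookie("trackerId", trackerId, { path: "/", maxAge: 365 * 24 * 3600 * 1000 });
  }
  // The referring page plus the ID is all you need in the server logs.
  console.log(trackerId, req.headers.referer ?? "(no referrer)");
  res.type("image/gif").send(PIXEL);
});

app.listen(3000);
```

Each tracked page then embeds an ordinary img tag pointing at /pixel.gif on whatever host you dedicate to tracking.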
First, let's make sure the user agent is sending cookies:
If getCookie("c") == null then setCookie("c", "anyValue")
Then let the request finish (i.e. wait for the next request)
Let's call our tracker cookie uaid.
If GET http://child.com/any-page and getCookie("c") is not null and getCookie("uaid") is null...
Redirect to http://parent.com/give-me-a-uaid?returnTo=http://child.com/any-page
On http://parent.com/give-me-a-uaid, check for cookie uaid
If not exists, create it and add it to response. If it exists, get its value.
Redirect to http://child.com/any-page?uaid=valueOfParentsUAIDCookie
Child.com sets cookie uaid with valueOfParentsUAIDCookie
Redirect to http://child.com/any-page
And of course, you are validating input, and white-listing your redirect URLs :)
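A condensed sketch of both sides of that flow in TypeScript with Express and cookie-parser (an assumed stack; the domains and cookie names follow the steps above):

```typescript
import express from "express";
import cookieParser from "cookie-parser";
import { randomUUID } from "crypto";

// --- parent.com ---------------------------------------------------------
const parent = express();
parent.use(cookieParser());

parent.get("/give-me-a-uaid", (req, res) => {
  const returnTo = String(req.query.returnTo ?? "");
  // White-list redirect targets, as noted above.
  if (!returnTo.startsWith("http://child.com/")) {
    return res.status(400).send("bad returnTo");
  }
  let uaid: string | undefined = req.cookies["uaid"];
  if (!uaid) {
    uaid = randomUUID();
    res.cookie("uaid", uaid, { path: "/" });
  }
  res.redirect(302, `${returnTo}?uaid=${encodeURIComponent(uaid)}`);
});

// --- child.com ----------------------------------------------------------
const child = express();
child.use(cookieParser());

child.get("/any-page", (req, res) => {
  // Step 1: make sure the user agent sends cookies at all.
  if (!req.cookies["c"]) {
    res.cookie("c", "anyValue", { path: "/" });
    return res.send("Cookie test set; a uaid will be assigned on the next request.");
  }
  const fromParent = req.query.uaid;
  if (typeof fromParent === "string") {
    // parent.com redirected back with its uaid: store it and clean the URL.
    res.cookie("uaid", fromParent, { path: "/" });
    return res.redirect(302, "/any-page");
  }
  if (!req.cookies["uaid"]) {
    // Cookies work but no uaid yet: bounce via parent.com to fetch one.
    return res.redirect(
      302,
      "http://parent.com/give-me-a-uaid?returnTo=" +
        encodeURIComponent("http://child.com/any-page")
    );
  }
  res.send(`uaid: ${req.cookies["uaid"]}`);
});

parent.listen(3001); // stands in for parent.com
child.listen(3002);  // stands in for child.com
```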
This question is closely related to the Question Accessing Domain Cookies within an iFrame on Internet Explorer.
For Internet Explorer I need to take P3P Policies into account and set an additional P3P HTTP-Header to allow images to set cookies across domain borders. Then I can use simon's suggestion.
You can follow the same concept used by Google Analytics: inject JavaScript into the pages you want to track.
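For instance, a small browser-side TypeScript snippet (the tracker URL is hypothetical) that each of your sites would include to load a shared tracking script from one domain:

```typescript
// Included on every page of every site you control. It injects a script
// from the one tracking domain, passing the current page and referrer along.
(function injectTracker(): void {
  const script = document.createElement("script");
  const params = new URLSearchParams({
    page: location.href,
    ref: document.referrer,
  });
  script.async = true;
  script.src = `https://tracking.example/collect.js?${params.toString()}`;
  document.head.appendChild(script);
})();
```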
You do not give any context to your situation - just the basic problem. So it is difficult to give an answer that clearly fits. However, here are some techniques/mechanisms for passing information from one page to another, regardless of the domains involved.
include a hyperlink to a 1x1-pixel transparent GIF image (sometimes called a "beacon"); see the sketch after this list
rely on referrer information in HTTP request headers to identify the page the hyperlink is on
include extra parameters in hyperlinks to other site - assuming you run both sites
buy services of a company like Akamai to do user tracking for you
possibly use cross domain cookie mechanism in the future if standard is ever approved
Which technique to use really comes down to whether you can place software on all of the sites (servers) that the user will visit and that you have an interest in - or whether you cannot place your software on all of them.
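As a concrete illustration of the first and third items above (the beacon host, parameter names, and helper are hypothetical), each cooperating site could request the beacon with its own identifying parameters like this:

```typescript
// Requests a 1x1 transparent GIF from the shared tracking host, tagging the
// request with the page and a site identifier as extra parameters.
function sendBeacon(siteId: string): void {
  const params = new URLSearchParams({ site: siteId, page: location.pathname });
  const beacon = new Image(1, 1); // the 1x1 transparent GIF served by the tracker
  beacon.src = `https://beacon.example/b.gif?${params.toString()}`;
}

sendBeacon("site-a"); // call once per page on each cooperating site
```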