Efficient way to inspect cookie creation in real time

I want to inspect what happens to cookies when I browse a certain website, in order to figure out how this website tracks some data.
I guess I can use the Chrome developer tools, open the Network tab, look at each HTTP request one by one (there are about 100 of them), check the Cookies tab, and look for "response cookies", right?
Is there an easier way to do this? Some tool that simply shows all cookies as they are created or changed?
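If the site is served over HTTPS and you're in Chrome, one lightweight option is the Cookie Store API: paste a sketch like this into the DevTools console and it logs cookie changes as they happen. Note it won't show HttpOnly cookies set by responses; for those you still need the Network tab.

    // Logs every cookie the page's scripts create, change, or delete
    // (Cookie Store API; Chrome, secure contexts only).
    cookieStore.addEventListener('change', (event) => {
      for (const cookie of event.changed) {
        console.log('set/changed:', cookie.name, '=', cookie.value);
      }
      for (const cookie of event.deleted) {
        console.log('deleted:', cookie.name);
      }
    });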

Related

Need to track what websites a user visits after leaving my site

I would like to track what websites my site's visitors go to after they leave.
Would it be possible to place a cookie on their browser when they visit my site, and then later, if they go to Facebook.com or stackoverflow.com, have my cookie retrieve the browser's URL data and send it back to my server?
I could then look at this data and know that my visitors had gone to Facebook.com and stackoverflow.com after they left my site.
Is this possible using cookies?
Thanks for the help.
No. Cookies are not executed or anything. They are just dumb bits of data.
You would need to be able to execute code on the page they are visiting afterwards.
What I presume you are really asking is how to track your outbound links.
This is mainly done with JavaScript: you need to intercept click events on outbound anchor links and send an event notification, or use the hitCallback method, before completing the redirection to the external website. For Google Analytics, see the documentation. Or you could do it via a custom JS implementation that sends the info back to your server instead.
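For the custom implementation, a minimal sketch might look like this (plain JavaScript; /log-click is a hypothetical endpoint on your own server):

    // Intercept clicks on outbound links and report them before navigation.
    document.addEventListener('click', (event) => {
      const link = event.target.closest('a');
      if (!link || link.host === location.host) return; // only outbound links
      // sendBeacon queues the request so it survives the page unload
      navigator.sendBeacon('/log-click', JSON.stringify({ url: link.href }));
    });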
Alternatively, you could replace all outbound links server-side in your HTML source, point every link at your server first, and redirect from there to the external site. But using redirects for this purpose is not really recommended, unless you are an ad network or a search engine company that requires such a method.
Lastly, there is an alternative method using the HTML5 ping attribute. However, as of this writing the feature is not fully implemented across all browsers, and some have removed or disabled it.
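For reference, it is just an attribute on the anchor; when the link is followed, the browser POSTs to the ping URL (the URLs here are hypothetical, and support varies by browser and user settings):

    <a href="https://example.com/" ping="https://yoursite.com/track-click">External link</a>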
But you can't track where your visitors go beyond the first-level outbound links from your site.

Evercookie browser security

I've just discovered the Evercookie project on GitHub.
Evercookie is a JavaScript API that produces extremely persistent cookies in a browser. Its goal is to identify a client even after they've removed standard cookies, Flash cookies (Local Shared Objects or LSOs), and others.
This is accomplished by storing the cookie data in as many browser storage mechanisms as possible. If cookie data is removed from any of the storage mechanisms, Evercookie aggressively re-creates it in each mechanism as long as at least one copy is still intact.
If the LSO mechanism is available, Evercookie may even propagate cookies between different browsers on the same client machine!
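To make the mechanism concrete, the redundancy idea is roughly this (a simplified sketch, not Evercookie's actual code; the real library uses many more storage mechanisms):

    // Store the same id in several places, and restore any copy that was deleted.
    function persistId(id) {
      document.cookie = 'uid=' + id + '; max-age=31536000; path=/';
      localStorage.setItem('uid', id);
      sessionStorage.setItem('uid', id);
      // Evercookie also uses Flash LSOs, window.name, ETags, the HTTP cache, etc.
    }

    function recoverId() {
      const fromCookie = (document.cookie.match(/(?:^|; )uid=([^;]*)/) || [])[1];
      const id = fromCookie || localStorage.getItem('uid') || sessionStorage.getItem('uid');
      if (id) persistId(id); // aggressively re-create the deleted copies
      return id;
    }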
I tested it online, on this example page. I clicked the "Create evercookie" button, deleted all browsing data, and refreshed the page. The cookies I had just deleted were all back.
Where is the browser security in this? Is this secure?
If you want to disable Flash based cookies, use Adobe's "Global Storage Settings" panel here:
http://www.macromedia.com/support/documentation/en/flashplayer/help/settings_manager03.html
Perform all of these steps:
Uncheck "Allow 3rd Party Flash Content to store data on your computer"
Check "Never ask again" (a non-obvious, but important step)
Click the 2nd to last tab: "Website Storage Settings"
Delete all existing data
Chrome bundles its own Flash plugin on Windows and Mac OS X. The settings and disk storage are separate from the plugin packaged directly by Adobe, so you may need to perform the above steps twice if you use Chrome. On the plus side, the separate storage location prevents Flash from being used to synchronize cookies to or from Chrome and other browsers.
I recommend testing with my personal site:
http://noc.to
The "Zombie Cookie" section can show you exactly how cookies are being restored and help you determine if the above steps (or any tools you use) are working.
In order to create an Evercookie, all you need is:
The ability to run JavaScript (or other active content, like Flash and perhaps Java); and
The ability to access the various client-side locations where copies of the cookie data are stored.
Totally disabling access to all storage mechanisms would render most of them useless; for most of them, their whole reason for being is to allow a script to use them. So the only even remotely feasible option is restricting access by domain. I'm not sure what browsers (if any) allow that kind of granularity, though. Most can allow or block JS as a whole from certain domains, but as for what features a given domain's scripts can use...? I'm not seeing that ability in Chrome 26 or IE 10, at least.
Well, it doesn't seem to work that well.
Created the evercookie
Closed the window
Emptied all elements of the Firefox cache (just by going to "delete recent history" and checking everything except site preferences)
Closed the window
Came back to the page
Finally realized it wasn't stored
What is strange is that I didn't explicitly remove Flash cookies in the Flash Website Storage Settings panel. Maybe it's integrated into Firefox. Or I may have disabled them.
I think there are several other ways to store cookies and trace you. Facebook is already tracking you all over the web, even when you're logged out. Google too (do you use Chrome?). Moreover, your IPv4 address can certainly be used to find you again (why not right after you've emptied your cache?). You can also be identified again when you log back in to any site, linking you to your previous sessions.
I suggest:
Use Firefox: even if it's slower than Chrome, it's still more respectful of privacy
Remove the whole Internet cache on window close (sorry, you'll have to log in again on your preferred sites)
Check third-party cookie options
Use browser addons with care
Check Flash & Silverlight cookie options
Avoid website reputation checking (provided that you can recognize a phishing attempt yourself)
Use private browsing mode when you don't want to share your digital life

Bypass specific URL from Akamai if a certain cookie exists

I would like Akamai not to cache certain URLs if a specified cookie exists (i.e. if the user is logged in on specific pages). Is there any way to do this with Akamai?
The good news is that I have done exactly this in the past for the Top Gear site (www.topgear.com/uk). The logic goes that if a cookie is present (in this case "TGCACHEKEY"), then the Akamai cache is bypassed for certain URL paths. This basically turns off Akamai caching of HTML pages when logged in.
The bad news is that you require an Akamai consultant to make this change for you.
If this isn't an option for you, then Peter's suggestions are all good ones. I considered all of these before implementing the cookie based approach for Top Gear, but in the end none were feasible.
Remember also that Akamai strips cookies for cached resources by default. That may or may not affect your situation.
The Edge Server doesn't check for a cookie before it makes the request to your origin server, and I have never seen anything like that in any of their menus, configuration screens, or documentation.
However, there are a few ways I can think of that you can get the effect that I think you're looking for.
You can specify in the configuration settings for the respective digital property which path(s) or URL(s) you don't want it to cache. If you're talking about a logged-on user, you might have a path that only they would get to, or you could set up such a thing server-side. E.g. for an online course you would have www.course.com/php.html that anybody could get to, whereas you might use www.course.com/student/php-lesson-1.html for the actual logged-on lesson content. Specifying that /student/* should not be cached would solve that.
If you are serving the same pages to both logged-on and not-logged-on users and can't do it that way, you could check server-side whether they're logged on and, if so, add a cache-breaker to the links, so that when they follow a link a cache-breaker is automatically appended. You could also do this client-side if you want, but it would be more secure and faster to do it server-side. As a note, the cache-breaker could be userid-random#. That would keep it unique enough, when combined with the page, that nobody else would request it and get the earlier 'cache-broken' page.
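A rough client-side sketch of that idea (server-side is preferable, as noted; userId here is a hypothetical variable your page sets for logged-on users):

    // Append a per-user cache-breaker to every link for logged-on users.
    if (window.userId) {
      for (const link of document.querySelectorAll('a[href]')) {
        const url = new URL(link.href, location.href);
        url.searchParams.set('cb', window.userId + '-' + Math.random().toString(36).slice(2));
        link.href = url.toString();
      }
    }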
If neither of the above is workable, there is one other way I can think of, which is a bit unconventional to say the least, but it would work. Create a symbolically linked directory in your document root with another name so that you can apply the first option and exempt it from caching. Then you check if the user is logged on and, if so, prepend the extra directory to the links. From Akamai's point of view, www.mysite.com/logged-on/page.html can be exempt from cache while www.mysite.com/content/page.html is cached. On your server, if /logged-on/ symbolically links to /content/, then you're all set.
When they log in you could send them to a subdomain which is set up as a ServerAlias, so on your side it's the same, but Akamai has different cache handling rules for it.
Following the same approach as #llevera's answer, you can use cookies on CloudFlare without needing engineers to make the change for you.
Having this sort of cookie to bypass caching is a technique that is becoming more popular with time, and even big companies like Magento are using it for the Magento 2 platform.
But the solutions above are still valid. Maybe Akamai supports this already by now; we are in 2017!

How to disable writing to a cookie?

I'm trying to disable writing data to a specific cookie on a website, but at the same time I want the existing data to be sent.
So it means: I send cookie data and don't want to receive any.
Is it possible?
A cookie is just a mechanism to store information on a per-client basis, in a client layer above the session layer. In general people hate cookies because they can do creepy stuff and because a website is using resources on their PC.
When you say you want to store a cookie that you never want to read, it's really shady. No browser should allow this sort of cookie. You might want to re-examine your architecture.
But maybe I don't know what exactly you mean.
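For what it's worth, the effect described (send existing cookies, never store new ones) can't be done from page script, but it can be approximated outside the page, e.g. with a browser extension that strips Set-Cookie response headers. A minimal sketch, assuming Chrome's (now-deprecated) Manifest V2 webRequest API and a hypothetical target site:

    // background.js: drop Set-Cookie headers so the browser never stores new
    // cookies, while cookies already stored are still sent with requests.
    chrome.webRequest.onHeadersReceived.addListener(
      (details) => {
        const headers = details.responseHeaders.filter(
          (h) => h.name.toLowerCase() !== 'set-cookie'
        );
        return { responseHeaders: headers };
      },
      { urls: ['https://example.com/*'] }, // hypothetical target site
      // 'extraHeaders' is required in Chrome 72+ to even see Set-Cookie
      ['blocking', 'responseHeaders', 'extraHeaders']
    );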

Prevent anyone from executing your web service?

I've got a web service which is executed through JavaScript (jQuery) to retrieve data from the database. I would like to make sure that only my web pages can execute those web methods (i.e. I don't want people to execute those web methods directly; they could find out the URL by looking at the source code of the JavaScript, for example).
What I'm planning to do is add a 'Key' parameter to all the web methods. The key will be stored in the web pages in a hidden field, and the value will be set dynamically by the web server when the web page is requested. The key value will only be valid for, say, 5 minutes. This way, when a web method needs to be executed, the JavaScript will pass the key to the web method, and the web method will check that the key is valid before doing whatever it needs to do.
If someone wants to execute the web methods directly, they won't have the key, which will make them unable to execute them.
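As a sketch of how such a time-limited key could be generated and validated without server-side state (Node.js; SECRET and the 5-minute window are assumptions, not a vetted design):

    const crypto = require('crypto');
    const SECRET = 'server-side-secret'; // assumption: kept secret on the server

    // Key = timestamp plus an HMAC over it, so it can't be forged or back-dated.
    function makeKey() {
      const ts = Date.now().toString();
      const sig = crypto.createHmac('sha256', SECRET).update(ts).digest('hex');
      return ts + '.' + sig;
    }

    function isKeyValid(key, maxAgeMs = 5 * 60 * 1000) {
      const [ts, sig] = key.split('.');
      const expected = crypto.createHmac('sha256', SECRET).update(ts).digest('hex');
      // a real version should use crypto.timingSafeEqual for the comparison
      return sig === expected && Date.now() - Number(ts) < maxAgeMs;
    }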
What are your views on this? Is there a better solution? Do you foresee any problems with my solution?
MORE INFO: for what I'm doing, the visitors are not logged in, so I can't use a session. I understand that if someone really wants to break this, they can parse the HTML code and get the value of the hidden field, but they would have to do this regularly as the key changes every x minutes... which is of course possible, but hopefully it will be a pain for them.
EDIT: what I'm doing is a web application (as opposed to a web site). The data is retrieved through web methods (+jquery). I would like to prevent anyone from building their own web application using my data (which they could if they can execute the web methods). Obviously it would be a risk for them as I could change the web methods at any time.
I will probably just go for the referrer option. It's not perfect but it's easy to implement. I don't want to spend too much time on this as some of you said if someone really wants to break it, they'll find a solution anyway.
Thanks.
Well, there's nothing technically wrong with it, but your assumption that "they won't have the key, which will make them unable to execute them" is incorrect, and thus the security of the whole thing is flawed.
It's very trivial to retrieve the value of a hidden field and use it to execute the method.
I'll save you a lot of time and frustration: If the user's browser can execute the method, a determined user can. You're not going to be able to stop that.
With that said, any more information on why you're attempting to do this? What's the context? Perhaps there's something else that would accomplish your goal here that we could suggest if we knew more :)
EDIT: Not a whole lot more info there, but I'll run with it. Your solution isn't really going to increase the security at all and is going to create a headache for you in maintenance and bugs. It will also create a headache for your users in that they would then have an 'invisible' time limit in which to perform actions on pages. With what you've told us so far, I'd say you're better off just doing nothing.
What kind of methods are you trying to protect here? Why are you trying to protect them?
MORE INFO: for what I'm doing, the visitors are not logged in so I can't use a session.
If you are sending a client a key that they will send back every time they want to use a service, you are in effect creating a session. The key you are passing back and forth is functionally no different from a cookie (except that it will be passed back only on certain requests). Might as well save yourself the trouble and set a temporary cookie that will expire in 5 minutes. Add a little server-side check for expired cookies and you'll have probably the best you can get.
You may already have such a key, if you're using a language or framework that sets a session id. Send that with the Ajax call. (Note that such a session lasts a bit longer than five minutes, but note also it's what you're using to keep state for the user's regular HTTP GETs and POSTs.)
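A minimal sketch of the temporary-cookie variant (plain Node.js; svcKey and both endpoints are hypothetical, and a real version should put a signed value in the cookie rather than trusting its mere presence):

    const http = require('http');

    http.createServer((req, res) => {
      if (req.url === '/page') {
        // Issue a short-lived key when the page itself is served.
        res.setHeader('Set-Cookie', 'svcKey=some-signed-value; Max-Age=300; HttpOnly; Path=/');
        res.end('<html>...</html>');
      } else if (req.url === '/service') {
        // The browser stops sending the cookie after 5 minutes.
        const ok = /(?:^|; )svcKey=/.test(req.headers.cookie || '');
        res.statusCode = ok ? 200 : 403;
        res.end(ok ? 'data' : 'expired or missing key');
      }
    }).listen(8080);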
What's to stop someone requesting a webpage, parsing the results to pull out the key and then calling the webservice with that?
You could check the Referer header to verify the call is coming from one of your pages, but that is also easy to spoof.
The only way I can see to solve this is to require authentication. If the web pages that call the web service require the user to be logged in, then you can check that they're logged in when they call the web service. This doesn't stop other pages from using your web service, but it does let you track usage better, and with some rate limiting you should be able to prevent abuse of your service.
If you really don't want to risk your webservice being abused then don't make it public. That's the only failsafe solution.
Let's say that you generate a key valid from 12:00 to 12:05. At 12:04 I open the page, read it at my leisure, and at 12:06 I trigger an action which uses your web service. I'll be blocked from doing so even though I'm a legitimate visitor.
I would suggest restricting access to the web services by HTTP referrer (allow only those from your domain, plus null referrers) and/or requiring user authentication for calling the methods.
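A sketch of the referrer restriction (Node.js, Express-style middleware; yoursite.com is a placeholder, and as noted above the Referer header is easy to spoof, so treat this as a speed bump, not real security):

    // Allow requests with no referrer or a referrer from your own domain.
    function allowOnlyOwnReferrer(req, res, next) {
      const referrer = req.headers.referer; // the header name is famously misspelled
      if (!referrer || new URL(referrer).hostname === 'yoursite.com') {
        return next();
      }
      res.status(403).send('Forbidden');
    }
    // usage with a hypothetical Express app: app.use('/api', allowOnlyOwnReferrer);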