Importing Firefox and Chrome cookies to libcurl - c++

I'm using Code::Blocks with MinGW, under Windows 7.
I'm writing a multithreaded web crawler with libcurl, using a CURLSH object with CURL_LOCK_DATA_COOKIE enabled to share cookies among different threads.
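For reference, my share object is set up roughly like this (a simplified sketch of what I'm doing; the lock callbacks here just use a single std::mutex for brevity):

#include <curl/curl.h>
#include <mutex>

// One mutex guarding all shared data; per-curl_lock_data mutexes would be
// more fine-grained, but a single one is enough for this sketch.
static std::mutex g_share_mutex;

static void share_lock(CURL*, curl_lock_data, curl_lock_access, void*)
{
    g_share_mutex.lock();
}

static void share_unlock(CURL*, curl_lock_data, void*)
{
    g_share_mutex.unlock();
}

CURLSH* make_cookie_share()
{
    CURLSH* share = curl_share_init();
    curl_share_setopt(share, CURLSHOPT_SHARE, CURL_LOCK_DATA_COOKIE);
    curl_share_setopt(share, CURLSHOPT_LOCKFUNC, share_lock);
    curl_share_setopt(share, CURLSHOPT_UNLOCKFUNC, share_unlock);
    return share;
}

// Every easy handle in every thread then attaches to it:
//   curl_easy_setopt(handle, CURLOPT_SHARE, share);
//   curl_easy_setopt(handle, CURLOPT_COOKIEFILE, "");  // enable the cookie engine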
Once a handle receives a cookie, it is successfully shared among every other handle. However, I need to copy the initial set of cookies from Firefox or Chrome. I found that they store cookies using sqlite, and I've been able to read cookies from both of them from within my program. The problem is, how do I give these cookies to libcurl? Ideally, there should be some way to feed these cookies to my CURLSH object, so that they get distributed to every handle. I have found no such thing.
Following this document, I can try to save the cookies I read from my browser to a cookies.txt file, which reduces to finding a correspondence between the fields in the database used by Firefox/Chrome and the Netscape format.
Netscape uses the following format:
domain flag path secure expiration name value
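For example, a single cookies.txt line would look something like this (tab-separated; the values are made up, and the second field is the flag in question):

.example.com	TRUE	/	FALSE	1735689600	sessionid	abc123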
The problem comes with the flag field. I don't know what to write there. Firefox uses the following fields (file cookies.sqlite, table *moz_cookies*), which correspond with the Netscape format as follows (is this correct?):
host ??? path isSecure expiry name value
Chrome uses the following fields (file Cookies, table cookies):
host_key ??? path secure expires_utc name value
So, to create this cookies.txt file, I'm only missing that flag field. The document linked above says:
flag - A TRUE/FALSE value indicating if all machines within a given domain
can access the variable. This value is set automatically by the
browser, depending on the value you set for domain.
Which doesn't really tell me what to write there.
However, writing a file and then reading it back seems like unnecessary work, given that I'll first load the cookies from Firefox/Chrome into RAM, and I should be able to give them to libcurl directly without going through the hard drive. I've found the CURLOPT_COOKIE option, but it is missing some fields (namely, the domain). Also, that option doesn't seem to save the cookies for later use. It looks like I would need to set it for every transfer with only the cookies of the corresponding domain (and what if these cookies get changed? I would not want to check for changes manually, given that libcurl can do that).
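As far as I can tell, CURLOPT_COOKIE only takes the name=value pairs that go into the Cookie: request header, with no place for the domain or expiration (values made up):

// Sets the Cookie: header for subsequent requests on this handle only;
// these cookies are not stored by the cookie engine nor shared via CURLSH.
curl_easy_setopt(handle, CURLOPT_COOKIE, "SID=abc123; lang=en");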
So, given that I have all my cookies from Firefox/Chrome in memory, how do I give them to libcurl? If the only option is to use a cookies.txt file, what should I write in the flag field?

I've found the answer, with CURLOPT_COOKIELIST (I was confusing it with CURLINFO_COOKIELIST, which can only be used to read cookies). Using CURLOPT_COOKIELIST, I can enter my cookies as HTTP-style Set-Cookie headers, which do not need that flag field. I only need to format the expiration date. It looks like specifying the cookies on any one handle is enough to set them in the CURLSH object, because I can set them on one handle and read them from any other handle.
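In case it helps someone else, this is roughly how I feed each cookie read from the browser's database to libcurl (a sketch; the function and variable names are mine, and the expiry is assumed to already be formatted as an HTTP date):

#include <curl/curl.h>
#include <string>

// 'expires' must be an HTTP date string, e.g. "Tue, 31 Dec 2030 23:59:59 GMT".
void add_browser_cookie(CURL* handle,
                        const std::string& name, const std::string& value,
                        const std::string& host, const std::string& path,
                        bool isSecure, const std::string& expires)
{
    std::string cookie = "Set-Cookie: " + name + "=" + value +
                         "; domain=" + host +
                         "; path=" + path +
                         "; expires=" + expires;
    if (isSecure)
        cookie += "; secure";

    // Adding the cookie to one handle is enough: with CURL_LOCK_DATA_COOKIE
    // shared through the CURLSH object, every sharing handle sees it.
    curl_easy_setopt(handle, CURLOPT_COOKIELIST, cookie.c_str());
}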

Related

Cookie “PHPSESSID” will be soon treated as cross-site cookie against <file> because the scheme does not match

I've just noticed my console is littered with this warning, appearing for every single linked resource. This includes all referenced CSS files, javascript files, SVG images, and even URLs from ajax calls (which respond in JSON). But not images.
The warning, for example in case of a style.css file, will say:
Cookie “PHPSESSID” will be soon treated as cross-site cookie against “http://localhost/style.css” because the scheme does not match.
But the scheme doesn't match what? The document? Because it does match that.
The URL of my site is http://localhost/.
The site and its resources are all on http (no https on localhost)
The domain name is definitely not different because everything is referenced relative to the domain name (meaning the filepaths start with a slash href="/style.css")
The Network inspector just reports a green 200 OK response, showing everything as normal.
It's only Mozilla Firefox that is complaining about this. Chromium seems to not be concerned by anything. I don't have any browser add-ons. The warnings seem to originate from the browser, and each warning links to view the corresponding file source in Debugger.
Why is this appearing?
The exact same thing was happening to me. The issue was that Firefox kept showing me cookies of different websites hosted on the same URL ("localhost:port number") stored in the browser's memory.
In my case, I have two projects configured to run at http://localhost:62601. When I run the first project, it saves its cookie in the browser's memory. When I then run the second project at the same URL, that cookie is available in the second project's console as well.
What you can do is delete all of the cookies from the browser.
Paramjot Singh's answer is correct and got me most of the way to where I needed to be. I also wasted a lot of time staring at those warnings.
But to clarify a little, you don't have to delete ALL of your cookies to resolve this. In Firefox, you can delete individual site cookies, which will keep your settings on other sites.
To do so, click the hamburger menu in the top right, then Options->Privacy & Security or Settings->Privacy & Security.
From here, scroll down about half-way and find Cookies and Site Data. Don't click Clear Data. Instead, click Manage Data. Then, search for the site you are having the notices on, highlight it, and Remove Selected
Simple, I know, but I made the mistake of clearing everything the first time - maybe this will prevent someone from doing the same.
The warning is given because, according to MDN web docs:
Standards related to the Cookie SameSite attribute recently changed such that:
The cookie-sending behaviour if SameSite is not specified is SameSite=Lax. Previously the default was that cookies were sent for all requests.
Cookies with SameSite=None must now also specify the Secure attribute (they require a secure context/HTTPS).
Which indicates that a secure context/HTTPS is required in order to allow cross-site cookies by setting SameSite=None; Secure for the cookie.
According to Mozilla, you should explicitly communicate the intended SameSite policy for your cookie (rather than relying on browsers to apply SameSite=Lax automatically), otherwise you might get a warning like this:
Cookie “myCookie” has “SameSite” policy set to “Lax” because it is missing a “SameSite” attribute, and “SameSite=Lax” is the default value for this attribute.
The suggestion to simply delete localhost cookies does not actually solve the problem. The solution is to properly set the SameSite attribute of cookies being set by the server, and to use HTTPS if needed.
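For example, a session cookie could be sent with an explicit policy like this (illustrative values; SameSite=Lax is enough for same-site requests over plain HTTP, while cross-site usage requires SameSite=None together with Secure, and therefore HTTPS):

Set-Cookie: PHPSESSID=abc123; Path=/; HttpOnly; SameSite=Lax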
Firefox is not the only browser making these changes. Apparently the version of Chrome I am using (84.0.4147.125) has already implemented the changes as I got this message in the console:
The previously mentioned MDN article and this article by Mike Conca have great information about changes to SameSite cookie behavior.
I guess you are using WAMP or LAMP etc. The first thing you need to do is enable SSL on WAMP, as you will find many references saying you need to adjust the cookie settings to SameSite=None; Secure. That entails your local connection being secure. There are instructions on this link https://articlebin.michaelmilette.com/how-to-add-ssl-https-to-wampserver/ as well as some YouTube videos.
The important thing to note is that when creating the SSL certificate you should use sha256 hashing, as sha1 is now deprecated and will throw another warning.
There is a good explanation of SameSite cookies on https://web.dev/samesite-cookies-explained/
I was struggling with the same issue and solved it by making sure the Apache 2.4 headers module was enabled and then adding one line of code:
Header always edit Set-Cookie ^(.*)$ $1;HttpOnly;Secure
I wasted lots of time staring at the same sets of warnings in the Inspector until it dawned on me that the cookies were persisting and needed purging.
Apparently Chrome was going to introduce the new rules by now, but Covid-19 meant a lot of websites might have been broken while people worked from home. The major browsers are working together on the SameSite attribute, so it will be in force soon.

Cookies: Sent in request even after all have been deleted

I am confused about how cookies are set. It seems that cookies can be sent in the request header, even after I have deleted them all.
What I do:
In IE: delete all cookies (wrench-thing->safety->delete browsing history-> check all, except preserve favorites-> Delete)
Go to a random site (google.com) and open the Network tab (F12/Network) - because it won't open from a blank tab.
Make sure browsing history persists (tools-> clear entries on navigate-> uncheck both)
Click "Start capturing"
Go to site: http://www.klm.com/travel/dk_da/index.htm
Look at Network data. For the first url (http://www.klm.com/travel/dk_da/index.htm ), click "Go to detailed view". Click "cookies"
I look at the cookie that is being sent (in Cookies tab or under 'Request headers') and it's already sending 7 values, for example, EBT_JSESSIONID. But, where do these values come from? I haven't received anything at this point. I realize that cookies can be set via javascript, but I haven't loaded any js at this point either.
I am trying to figure this out as part of web scraping. I really want to be able to do it without Selenium or the like, and I need to generate/use the various IDs that are being passed around the various calls.
Using Chrome on Mac we had this issue, and restarting the browser solved it. The scenario was weird because the value was being sent only for one specific HTML page.

Cookie handling in subsequent requests in same page

I'm creating my own (multi threaded) ISAPI based website in C++ and I'm trying to implement correct session management.
The problem is that when a new session should be created, the session is created twice when using subsequent requests in the generated web page.
Here's how it works:
- Client requests http://localhost and sends either no cookie or a cookie with an old session ID in it.
- Server looks at the session cookie and sees that it needs to create a new one because it no longer exists: it prepares a header with a cookie in it with a new session ID and sends the complete header to the client (I tracked this with the Live HTTP Headers plugin in Firefox and it is correct). It also prepares some data like the page and stuff like that (not yet body data, as it is still processing data from the database) and sends what it has back to the client.
- Client should have this new session cookie now, and sees the stylesheet link and immediately sends the stylesheet request http://localhost/css to my server. But... he still does this with the old session ID for some reason, not with the newly received one!
- Server sees this request (again with a no-longer-existing session ID), generates another new session and sends the new session ID with a cookie along with the stylesheet data.
So the client has received two session id's now and will from now on keep using the second one as the first one is overwritten, but nevertheless the first page has used the wrong session (or actually, the second page has).
You could say that this is not a problem, but when I start using personalized stylesheets, I will have the wrong stylesheet on the first page and as the page will use AJAX to refresh the content (if available), it is possible that the stylesheet is never reloaded unless the client refreshes.
So, is this a problem that is always there when doing this kind of thing? Will the browser always send an old cookie although it has already received a new one but is still processing the page? Is this a problem that, for example PHP, also has?
Note: before all the discussions start about "use php instead" or something: I am rewriting a website that I had first written in PHP, it became popular, had thousands of (real) visitors every hour and started killing my server (the website doesn't make that kind of money that I can throw lots of servers at it). By writing it in C++, requests take 2ms instead of 200ms in PHP... I can optimize everything. By taking my time to develop this ISAPI correctly, it is safely multi-threaded and can be multi-processed, multi-servered. And most of all, I like the challenge.
Added note: It seems that the problem is only there when an old session exists in the cookies, because when I completely clear all cookies from my browser, and a new one is created and sent back to the client, the subsequent stylesheet request immediately uses the given session id. This seems to be some kind of proof that I'm doing something wrong when an old session id is sent... Should an existing cookie be deleted first? How?
Added note: The cookie is written with an expire-date one year ahead.
I have found out what the problem was: I was under the assumption that setting a cookie without specifying a path would make the session work on all paths of that domain.
By using http://foo.bar/home as main page and http://foo.bar/home/css as stylesheet, internally translating that url to ?s1=home and ?s1=home&css=y, I was actually using two different paths according to the browser which did not pass the cookie to the css-request.
For some reason, they actually got together afterwards, I don't fully understand why.
Isn't this silly? Will people not often have a http://foo.bar/index.php and a http://foo.bar/css/style.css.php, just because they use subdirectories to keep their structure clean?
If anyone knows of a way to fix it, to make subpaths also work with the same cookies, let me know, but as I understand it from the definition of cookies, they are stuck within a specific path (although, it seems that if you specifically add a path other than /, it will work on subdirectories as well?)
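If I understand the definition correctly, explicitly setting the cookie with a root path should make it valid for every path on the host, something like (made-up values):

Set-Cookie: SESSIONID=abc123; Path=/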

Coldfusion load session from Id

Is there any way to load a specific user session by providing the correct value of CFTOKEN and/or CFID?
Something like php's session_id($id) function.
Or some way to change the data of a specific session.
I need a web service that will add or change some information in a specific user session. I know the CFID and CFTOKEN values because I share a subdomain cookie. However, the applications are on different servers.
Not that it's the best way of doing things, as it uses undocumented code that can change without notice between versions. But you can use certain methods to access sessions.
<cfscript>
    // sessionCFID and sessionCFTOKEN hold the CFID / CFTOKEN values of the
    // session you want to reach.
    appName = 'zadsApp';
    jSessTracker = CreateObject('java', 'coldfusion.runtime.SessionTracker');
    appSessions = jSessTracker.getSessionCollection(JavaCast('string', appName));
    targetSession = appSessions[appName & '_' & sessionCFID & '_' & sessionCFTOKEN];

    // Dumping, reading, writing WILL update the last accessed time.
    // There are ways around this if needed...
    WriteDump(targetSession);
    targetSession.something = 'A new value';
</cfscript>
Now, you mention that this is on a different server. Can I just double-check that you want a separate server to change a session on another server, without putting any kind of code like the above on that server (the one with the session)? Although it'd be a heck of a lot easier if the code was on the same server, you might be able to perform code like this remotely using JMX... but I'm sure there must be an easier way to do all of this.
You can do it by passing the correct values on the URL; more information is available here:
http://ruthsarian.wordpress.com/2005/10/03/cf-session-hijacking/
ColdFusion uses two unique values to keep track of user session
information. These values are CFID and CFTOKEN. They are stored as
cookies but can also be passed along the URL and inside POST data.
Session variables are a place to store information specific to the
user and to the current session (such as whether or not a user is
logged in).
It is possible to hijack a user’s session by supplying the correct
CFID and CFTOKEN values to the server, either on the URL, or wherever
else you want.
What it sounds like you are trying to do is effectively described by this post http://old.nabble.com/ColdFusion-9-Session-Replication-td32621620.html which advises against session replication on the grounds of the high network usage it causes.
Where you are trying to maintain a specific sessions scope across multiple physical servers. The way I've worked around this in the past is to maintain a database storing the information which needs to be passed between physical servers tied to a UUID. For this purpose you could just use the CFID/CFTOKEN values as your database PK's, or you could make another PK altogether. This would then allow you to pass the CFID etc on the URL string, and then if it hits a server which it hasn't so far hit (i.e. no session / session wasn't loaded using those CFID/CFTOKEN) then you can load the variables you need from a database.
Edit: an alternative non-database method
Firstly, set up a script on one server, i.e. getSessionData.cfm, which returns the data in the session scope in a transportable format, i.e. using ObjectSave() (if on CF9), or maybe SerializeJSON(), something like that:
<!--- on source server, getSessionData.cfm --->
<cfscript>
WriteOutput(ToBase64(ObjectSave(session)));
</cfscript>
Then set up a handler that will request data from the other server using a cfhttp request, populating CFID/CFTOKEN to access the session, and then pull that data into the session on the new server.
<!--- on target server --->
<cfhttp url="http://sourceserver/getSessionData.cfm">
<!--- Params to pass through CFID/CFTOKEN or any other cookie/url/post params etc --->
</cfhttp>
<cfscript>
    structToImportToSession = ObjectLoad(ToBinary(cfhttp.FileContent));
    for (thisStructKey in structToImportToSession) {
        session[thisStructKey] = structToImportToSession[thisStructKey];
    }
</cfscript>
The problem with this is that I would feel uneasy with this kind of script being on my server in a production environment. It also means that you will need to know explicitly which physical server the user came from so that you can request the getSessionData.cfm script from the correct server.
This post from Ben Nadel seems to employ a similar principle to maintain session data, and the same could be applied for updating it, I expect: http://www.bennadel.com/blog/725-Maintaining-Sessions-Across-Multiple-ColdFusion-CFHttp-Requests.htm
Personally I'd still advise the database-driven method as it gives you clearer mechanics by which to purge old sessions etc.; however, this second option should be viable and give you access to what you need.
As long as the session isn't expired, you can access it by passing the CFID and CFTOKEN in the URL.
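For example (made-up values):

http://example.com/page.cfm?CFID=1234&CFTOKEN=5678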

Cookie write fails to work on hosted site

I have created a basic but extensive JavaScript-HTML page that depends on cookies to keep user information. It runs perfectly on my computer (Mac - Firefox), but when loaded onto my hosted web site (the page is in my domain) the cookies are not being written when the page is opened.
I was hoping that by keeping all the programming in javascript I could get some basic interactivity. Is this assumption wrong? Must the cookies be written using PHP?
My cookie writes are very vanilla.
document.cookie = cookieArray[ja]+expires+"; path=/"; // writes cookie data into browser.
update
Well, cookies are now being written since I added "path=/; domain=.my.org". But now there is one other problem.
It seems that Safari and Firefox write the cookies in reverse order to each other. I create the cookies by altering an array and then simply stepping through the array to write them. I was hoping that I could simply read the cookies one by one and keep the order. Ah well.
Did you add the ";" between cookieArray[ja] and expires?
document.cookie = 'cookie-name=cookie-value; expires=Thu, 01-Jan-70 00:00:01 GMT;';
Also, cookieArray[ja] has to contain the cookie name.
Do you really need the path? This parameter is also optional.
Cookies are, by default, available to all other files in the same directory the cookie was created in.
http://www.comptechdoc.org/independent/web/cgi/javamanual/javacookie.html