I'd like to do a little data mining, to see if any of my customers are giving out their username/password.
Q: What info can I get about the PC that the person is on?
Their IP address (presuming it's fairly static)
Their browser (presuming they're not spoofing the CGI scope)
Their screen resolution
Their cookie scope (presuming they have cookies turned on)
What else?
I want to see: "Hey! This person is logging in from three different machines!"
See here (plus comments):
http://www.coldfusionjedi.com/index.cfm/2010/11/5/Ask-a-Jedi-Preventing-logins-from-other-machines
Most of that is in the CGI scope:
<cfdump var="#CGI#">
Exactly what will be in there is determined by which Web server you're using (e.g. IIS, Apache, etc), and will vary from server to server.
Example: CGI.REMOTE_ADDR is one of the variables that might contain a usable IP address of the client.
CGI.HTTP_COOKIE may contain all of the cookies from the browser, URL-encoded. You can also dump the Cookie scope:
<cfdump var="#Cookie#">
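If you want to flag the "three different machines" case, one rough sketch is to log a fingerprint of each login and count the distinct values per user. (The datasource, table, and column names below are made up; adjust them to your schema.)

<cfquery datasource="myDSN">
    INSERT INTO login_log (username, ip_address, user_agent, fingerprint, logged_in_at)
    VALUES (
        <cfqueryparam value="#form.username#" cfsqltype="cf_sql_varchar">,
        <cfqueryparam value="#CGI.REMOTE_ADDR#" cfsqltype="cf_sql_varchar">,
        <cfqueryparam value="#CGI.HTTP_USER_AGENT#" cfsqltype="cf_sql_varchar">,
        <!--- One value per IP/browser combination --->
        <cfqueryparam value="#Hash(CGI.REMOTE_ADDR & CGI.HTTP_USER_AGENT)#" cfsqltype="cf_sql_varchar">,
        <cfqueryparam value="#Now()#" cfsqltype="cf_sql_timestamp">
    )
</cfquery>

<cfquery name="machines" datasource="myDSN">
    SELECT COUNT(DISTINCT fingerprint) AS total
    FROM login_log
    WHERE username = <cfqueryparam value="#form.username#" cfsqltype="cf_sql_varchar">
</cfquery>

<cfif machines.total GTE 3>
    <!--- "Hey! This person is logging in from three different machines!" --->
</cfif>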
To get the screen resolution I think you'd need a JavaScript or Flash solution - maybe even CSS; I recall something about serving different CSS files based on screen size for mobile devices (CSS media queries). Maybe Modernizr can help? http://www.modernizr.com/
I've just noticed my console is littered with this warning, appearing for every single linked resource: all referenced CSS files, JavaScript files, SVG images, and even URLs from AJAX calls (which respond with JSON) - but not images.
The warning, for example in case of a style.css file, will say:
Cookie “PHPSESSID” will be soon treated as cross-site cookie against “http://localhost/style.css” because the scheme does not match.
But the scheme doesn't match what? The document's scheme? Because that does match.
The URL of my site is http://localhost/.
The site and its resources are all on http (no https on localhost)
The domain name is definitely not different because everything is referenced relative to the domain name (meaning the filepaths start with a slash href="/style.css")
The Network inspector just reports a green 200 OK response, showing everything as normal.
It's only Mozilla Firefox that is complaining about this; Chromium doesn't seem concerned at all. I don't have any browser add-ons. The warnings seem to originate from the browser itself, and each warning links to view the corresponding file source in the Debugger.
Why is this appearing?
The exact same thing was happening to me. The issue was that Firefox was showing me cookies from different websites hosted at the same URL ("localhost:<port>") that were stored in the browser's memory.
In my case, I have two projects configured to run at http://localhost:62601. When I run the first project, it saves its cookie in the browser's memory; when I run the second project at the same URL, the first project's cookie is visible in that project's console as well.
What you can do is delete all of the cookies for that site from the browser.
Paramjot Singh's answer is correct and got me most of the way to where I needed to be. I, too, wasted a lot of time staring at those warnings.
But to clarify a little, you don't have to delete ALL of your cookies to resolve this. In Firefox, you can delete individual site cookies, which will keep your settings on other sites.
To do so, click the hamburger menu in the top right, then Options -> Privacy & Security (or Settings -> Privacy & Security).
From there, scroll down about half-way to find Cookies and Site Data. Don't click Clear Data; instead, click Manage Data. Then search for the site you are seeing the notices on, highlight it, and click Remove Selected.
Simple, I know, but I made the mistake of clearing everything the first time - maybe this will prevent someone else from doing the same.
The warning is given because, according to MDN web docs:
Standards related to the Cookie SameSite attribute recently changed such that:
The cookie-sending behaviour if SameSite is not specified is SameSite=Lax. Previously the default was that cookies were sent for all requests.
Cookies with SameSite=None must now also specify the Secure attribute (they require a secure context/HTTPS).
This indicates that a secure context/HTTPS is required in order to allow cross-site cookies, by setting SameSite=None; Secure on the cookie.
According to Mozilla, you should explicitly communicate the intended SameSite policy for your cookie (rather than relying on browsers to apply SameSite=Lax automatically), otherwise you might get a warning like this:
Cookie “myCookie” has “SameSite” policy set to “Lax” because it is missing a “SameSite” attribute, and “SameSite=Lax” is the default value for this attribute.
The suggestion to simply delete localhost cookies does not actually solve the problem. The solution is to properly set the SameSite attribute on the cookies being set by the server, and to use HTTPS if needed.
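For example, since the cookie in the warning is PHPSESSID, the session cookie's attributes can be declared server-side in php.ini (these directives exist as of PHP 7.3; treat this as a sketch and adjust to your setup):

; php.ini - state the SameSite policy explicitly
session.cookie_samesite = "Lax"
; For genuinely cross-site use you would instead need HTTPS plus:
; session.cookie_samesite = "None"
; session.cookie_secure = 1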
Firefox is not the only browser making these changes. Apparently the version of Chrome I am using (84.0.4147.125) has already implemented them, as I got a similar message in its console.
The previously mentioned MDN article and this article by Mike Conca have great information about changes to SameSite cookie behavior.
I guess you are using WAMP or LAMP, etc. The first thing you need to do is enable SSL on WAMP, since many references say you need to adjust the cookie settings to SameSite=None; Secure - and that entails your local connection being secure. There are instructions at https://articlebin.michaelmilette.com/how-to-add-ssl-https-to-wampserver/ as well as in some YouTube videos.
The important thing to note is that when creating the SSL certificate you should use SHA-256, as SHA-1 is now deprecated and will throw yet another warning.
There is a good explanation of SameSite cookies on https://web.dev/samesite-cookies-explained/
I was struggling with the same issue and solved it by making sure the Apache 2.4 headers module was enabled and then adding one line of configuration:
Header always edit Set-Cookie ^(.*)$ $1;HttpOnly;Secure
I wasted lots of time staring at the same sets of warnings in the Inspector until it dawned on me that the cookies were persisting and needed purging.
Apparently Chrome was going to enforce the new rules by now, but COVID-19 meant a lot of websites might have broken while so many people were working from home, so enforcement was postponed. The major browsers are working together on the SameSite attribute, so it will be in force soon.
Here is the situation: we have a site that is hosted and updated by a third-party vendor. I am providing links to additional resources that are hosted on our servers. A client will access the vendor site and click on a link to gain access to our additional resources. To validate that the request came from our third-party vendor, I need to get the IP address of the vendor's server.
My question is: is there a way to get the IP address of the vendor's server using ColdFusion? I can't use the client's IP address; I need the address of the vendor server the client is using.
You will have to work with the third party to accomplish this goal, that's for sure.
I can see at least two more or less working approaches here.
(1) Append some kind of protection token to the links. Your vendor generates an encrypted string or hash that includes some information only you two know, so you can decrypt it (or generate the same hash) and validate it.
Example with hashing:
moment = DateConvert("local2utc", Now());
token = Hash("SecretSaultYouBothKnow" & DateFormat(moment, "yyyy-mm-dd") & TimeFormat(moment, "-HH-mm"));
This token is passed with link and expires quickly to prevent sharing/leaking.
You can generate and validate it on your side.
It's a rough idea, and there could be problems with validation, plus you need to avoid serving clients links that expire mid-use (maybe skip the "mm" mask as well).
An encrypted/decrypted string would work similarly; you both just need to know the secret key.
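Here is a minimal validation sketch for the receiving side, in CF9+ script syntax (same salt and masks as above; accepting the previous minute as well is my own assumption, so a link doesn't expire mid-click):

<cfscript>
// Recompute the hash for the current minute using the shared salt
moment = DateConvert("local2utc", Now());
expected = Hash("SecretSaultYouBothKnow" & DateFormat(moment, "yyyy-mm-dd") & TimeFormat(moment, "-HH-mm"));

// Also accept the previous minute so a link generated at :59 still works
prev = DateAdd("n", -1, moment);
expectedPrev = Hash("SecretSaultYouBothKnow" & DateFormat(prev, "yyyy-mm-dd") & TimeFormat(prev, "-HH-mm"));

if (url.token NEQ expected && url.token NEQ expectedPrev) {
    // Invalid or expired token: block the request
    abort;
}
</cfscript>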
By the way, your vendor could encrypt their server IP address or other identifier for you to check it against your database and maybe apply some other actions.
(2) Your vendor could set up simple web-service for you to validate the incoming links (it could respond with 0/1 or something else simple).
Exact implementation may be different. Again, it could be some token in URL which you send back for validation.
This is similar to the solution which Jason suggested: the vendor could send a server-to-server request to your server on link click and then redirect to the resource. But this may be complicated, because you have to be sure the first request has already been handled by the time the client arrives.
Hope these ideas make sense.
No, there isn't. Not if the request comes directly from the client. If the vendor sends some sort of a message first you can use that to validate. Or if the vendor's server is the one making the request on behalf of the client then you could use CGI.REMOTE_ADDR. But if the vendor is just providing a link to your site, then no, you cannot be assured of the IP of the vendor's server.
The closest you could come is to check the HTTP_REFERER, as Jeremy said above, but that can be spoofed (very easily), so it wouldn't be very secure.
To access the CGI variables available to ColdFusion, you can do something like this:
<cfset ThisIP = CGI.SERVER_NAME>
There are many useful CGI variables available here:
http://www.perlfect.com/articles/cgi_env.shtml
Try placing a page on your server that uses the cfhttp tag to fetch:
http://www.dslreports.com/whois
That will give you the IP address of the web server.
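A minimal version of that call (the result variable name is arbitrary):

<cfhttp url="http://www.dslreports.com/whois" method="get" result="whois">
<!--- The fetched page should report the address the request came from --->
<cfoutput>#whois.fileContent#</cfoutput>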
I'm trying to accomplish the following behaviour:
When the user accesses the site via:
http://example.com/
I want him to be redirected to:
https://example.com/
Via middleware, if the user is not logged in, the login template is rendered when accessing /; if the user is logged in, / is the main view. Once the user has logged in, I want the site to work over HTTP.
To do so, I am running the same server on ports 80 and 443 (is this really necessary? I have the impression that I'm running two separate servers with the same application, while I want one server listening on two ports).
When the user navigates away from the login page, the data in request.session is not present due to the redirection to the HTTP server (although it is present over HTTPS), so it looks as if no user is logged in. So, assuming the Apache setup is correct (running the same application on two different ports), I guess I have to pass the cookie from the server running on HTTPS over to HTTP.
Can anybody shed some light on this? Thank you
First off, make sure that the setting SESSION_COOKIE_SECURE is set to False. As long as the domains are the same, the cookies should be present in the browser, and so the session information should still be there.
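In settings.py that looks like this (SESSION_COOKIE_SECURE and SESSION_COOKIE_NAME are real Django settings; the commented-out domain line is only relevant if subdomains must share the cookie):

# settings.py
SESSION_COOKIE_SECURE = False    # allow the session cookie over plain HTTP
SESSION_COOKIE_NAME = "sessionid"    # Django's default name
# SESSION_COOKIE_DOMAIN = ".example.com"    # only for cross-subdomain setups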
Take a look at your cookies using a plugin. Search for the session cookie you have set. By default these cookies are named "sessionid" by Django. Make sure the domains and paths are in fact correct for both the secure session and regular session.
I want to warn against this, however. Recently, things like Firesheep have exploited an issue that people have known about but ignored for a long time: these cookies are not secure in any way. It would be easy for someone to "sniff" the cookie over the HTTP connection and gain access to the site as your logged-in user. This essentially eliminates the entire reason you set up a secure connection to log in in the first place.
Is there a reason you don't have a secure connection across the entire site? Traditional arguments about it being more intensive on the server really don't apply with modern CPUs any longer and the exploits that I refer to above are becoming so prevalent that the marginal (really marginal) cost of encrypting all of your traffic is well worth it.
Apache needs to run essentially two different servers because a) it is listening on two different ports and b) one of them is adding encryption logic. That said, this is a normal thing for Apache; I run servers with dozens of "servers" running on different ports and doing different logic. In the grand scheme of things, this shouldn't really weigh your server down.
That said, once you pass the request on to *WSGI or mod_python, you will need logic to make sure that no one tries to log in over your non-encrypted connection, because the only difference visible to Django will be the return value of request.is_secure(). All the URLs and views in your URLconf will be accessible on both ports.
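Here is a sketch of that guard as middleware (new-style middleware, Django 1.10+; the /login/ path is an assumption - adapt it to your URLconf):

# middleware.py - force the login view onto HTTPS, leave the rest on HTTP
from django.http import HttpResponsePermanentRedirect

class LoginRequiresHttpsMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        # Assumed login path; everything else is allowed over plain HTTP
        if request.path == "/login/" and not request.is_secure():
            secure_url = "https://" + request.get_host() + request.get_full_path()
            return HttpResponsePermanentRedirect(secure_url)
        return self.get_response(request)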
Whew, that is a lot. I hope that helps.
I have been googling, but haven't found an answer.
I understand pretty well what the cookie scope does in ColdFusion. But I'm not 100% sure about the purpose of the client scope or the differences between it and the cookie scope.
It gets a bit muddy because one of the storage methods for the client scope can be set to cookie.
Can someone supply an example, or use-case, that illustrates what the differences are and when I would use one versus the other?
The Cookie scope persists data as cookies in the client's browser. Keep it small, as it is sent along with every freaking HTTP request. :)
The Client scope can persist data in a database (or in the registry on Windows - BAD BAD BAD, but it is the default). It is often used in a clustered environment with non-sticky sessions, where a request might be routed to any server, including ones where the Session data is not available.
I don't have the link, but you can read more about them in the CF Developer's Guide.
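For reference, a minimal Application.cfc sketch that moves client variables out of the registry and into a database (the datasource name is made up):

component {
    this.name = "myApp";
    this.clientManagement = true;
    // Any datasource registered in the CF Administrator can hold client variables
    this.clientStorage = "myClientVarsDSN";
}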
We have installed an ASP.NET web site on a client's server. This site has a web service with a couple of web methods that are called by a Flash object in order to display a news feed. If you browse to their site (ex: www.domain.com), everything's working fine except the flash.
The issue is that when we browse to the .asmx, the header shows that the Host is a subdomain internal to their network (internal.domain.com). Obviously this doesn't resolve to any public IP when browsing from outside of their network. This causes the Flash to fail since the flash object is embedded on a page and is therefore running client side.
I checked the computer name on the server in question, and it doesn't even match "internal.domain.com" - it is something completely different. Where is it getting this information from? It is not coming from IIS, since we have no host headers set up, and the IP for the site is set to (All Unassigned).
We either need to force the web service to run against a specific host, or we need to change something on the server so that it resolves to a valid public-facing host name. Any and all help is greatly appreciated!
The solution is to add a host header for www.domain.com
More details here
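On IIS 7+ that binding can be added from the command line with appcmd, something like this (assuming the site is named "Default Web Site"):

appcmd set site /site.name:"Default Web Site" /+bindings.[protocol='http',bindingInformation='*:80:www.domain.com']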
While you probably did this already, it's always a good first step:
Do a global Find in the source code of both the Flash object and the web service for the string in question.
It sounds like someone may have configured/coded the internal.domain.com string into the Flash object's request. (Host: is an HTTP request header, not a response header, IIRC.)
Does the Flash object get the web service URL from the C# code? If so, it might be getting the default web service URL that you chose when adding a Web Reference to your project in VS. It might therefore be pointing to a URL local to the developer's machine/server, which is not recognized on the live server.
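If that is the case, the generated ASMX proxy's Url property (inherited from SoapHttpClientProtocol) can be overridden at runtime; the class name and path below are hypothetical:

// Assumed name of the proxy class generated by "Add Web Reference"
NewsFeedService service = new NewsFeedService();
// Point the proxy at the public host instead of the design-time default
service.Url = "http://www.domain.com/services/NewsFeed.asmx";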