I am trying to create a C++ web server with services and such. It is live here, this is how to compile it in 3 lines as a regular user on Linux, and here is its SVN.
To redirect users I use this function:
void http_utils::send_found_302( std::string redirect_lication, boost::shared_ptr<boost::asio::ip::tcp::socket> socket, boost::shared_ptr<http_response> response )
{
    response->status = 302;
    response->description = "Found";
    response->headers.insert(std::pair<std::string, std::string>("Location", redirect_lication));
    response->send(*socket);
}
In Chrome, Safari and IE I can register and log in to my server. But Firefox... Firefox lets me register the user in the DB (meaning it sends a correct request), but when the server tries to redirect it to the page I want, it shows me a sad page =(
Here is an example with pictures: we enter our credentials:
We hit submit... and it gets stuck connecting forever:
When we just open the URL it was trying to go to, we see that it got part of the "Found" response but has not redirected itself... :( If we tried to log in with Chrome everything would work correctly; also, if we just followed that URL in Chrome we would get redirected where needed and see this image:
I created a simple user, demo_user#gmail.com with password 123456, so you can try it out...
So what is wrong with the way I redirect? What should be added to the response to make it work in Firefox?
At the end of the day I made this:
void http_utils::send_found_302( const std::string & redirect_lication, boost::shared_ptr<boost::asio::ip::tcp::socket> socket, boost::shared_ptr<http_response> response )
{
    /* If there were no Firefox (and probably other dull browsers) we would simply do this (Chrome, IE, Safari tested):
     *
     * \code
     * response->status = 302;
     * response->description = "Found";
     * response->headers.insert(std::pair<std::string, std::string>("Location", redirect_lication));
     * response->send(*socket);
     * \endcode
     *
     * but indeed there are such browsers.
     *
     * We could also return a simple HTML redirection page - that would work anywhere:
     * \code
     * std::ostringstream data_stream;
     * data_stream << "<!DOCTYPE HTML PUBLIC \"-//W3C//DTD HTML 4.0 Transitional//EN\"><html><head><script type=\"text/javascript\">location.replace(\""
     *             << redirect_lication << "\");</script><noscript><meta http-equiv=\"refresh\" content=\"0; url= "
     *             << redirect_lication << "\"></noscript></head><body><p>Please turn on JavaScript!</p><a href=\""
     *             << redirect_lication << "\"><h1>Content awaits you here.</h1></a></body></html>";
     * response->headers.insert(std::pair<std::string, std::string>("Cache-Control", "max-age=0"));
     * response->headers.insert(std::pair<std::string, std::string>("Pragma", "no-cache"));
     * http_utils::send(data_stream.str(), socket, response);
     * \endcode
     *
     * So we decided to mix the two: an HTML page plus HTTP redirection headers.
     */
    response->description = "Found";
    response->headers.insert(std::pair<std::string, std::string>("Location", redirect_lication));
    response->headers.insert(std::pair<std::string, std::string>("Content-Location", redirect_lication));
    response->headers.insert(std::pair<std::string, std::string>("Refresh", std::string("0; url=" + redirect_lication)));
    response->headers.insert(std::pair<std::string, std::string>("Cache-Control", "max-age=0"));
    response->headers.insert(std::pair<std::string, std::string>("Pragma", "no-cache"));
    std::ostringstream data_stream;
    data_stream << "<!DOCTYPE HTML PUBLIC \"-//W3C//DTD HTML 4.0 Transitional//EN\"><html><head><script type=\"text/javascript\">location.replace(\""
                << redirect_lication << "\");</script><noscript><meta http-equiv=\"refresh\" content=\"0; url= "
                << redirect_lication << "\"></noscript></head><body><p>Please turn on JavaScript!</p><a href=\""
                << redirect_lication << "\"><h1>Content awaits you here.</h1></a></body></html>";
    http_utils::send(302, data_stream.str(), socket, response);
}
Let me explain: current Firefox did not like my 302 and 303 redirections... so the simple solution was to move the battle to the other side - the HTML side. I wrote simple code that returns an HTML page which auto-redirects the user, or at least presents him with the correct link. I tested it and everything worked as desired (with short local links). Then I added the combined solution, which also worked: I send not only the HTML page but also the relocation headers. This should work anywhere, and if the browser is smart it will not load the contents of the redirection page at all... )
So now everything works. =) And thanks to KayEss's answer I have made all links absolute now... why not? )
For best portability the Location header needs to be an absolute URL. It looks like you're trying to send a relative one.
I.e.:
Location: http://example.com/path/image.png
Instead of:
Location: image.png
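A minimal sketch of how a relative path could be turned into an absolute Location value before calling send_found_302; the make_absolute helper and the way the host is obtained are assumptions for illustration, not part of the original server:
#include <string>

// Hypothetical helper: prefix a relative path with scheme and host so the
// resulting Location header is an absolute URL.
std::string make_absolute(const std::string & host, const std::string & path)
{
    // Already absolute? Return it unchanged.
    if (path.compare(0, 7, "http://") == 0 || path.compare(0, 8, "https://") == 0)
        return path;
    // Otherwise build scheme + host + "/" + path.
    return "http://" + host + (path.empty() || path[0] != '/' ? "/" + path : path);
}

// Usage (names are illustrative):
//   http_utils::send_found_302(make_absolute(request_host, "cabinet"), socket, response);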
How do I protect against this?
Related: https://breakdev.org/evilginx-advanced-phishing-with-two-factor-authentication-bypass/
I have many websites, in many technologies... I need a way to protect them.
I'm wondering if the answer is really just something like checking for suspicious IP activity after the fact? Just that? Really?
Can I check my SSL certificate? Use HSTS? Somehow prevent nginx from serving my site?
Include something like this on your login page (make sure to set the X-FRAME-OPTIONS header to DENY), changing "Your expected origin" to... well, I'm sure you can figure it out:
var inP = true, t = self, l = "loc" + "ation", o = "o" + "rigin", ex = "Your " + "expected" + " origin", db = document, b = "bod" + "y", h = "in" + "ner" + "HTML";
try {
    inP = t[l][o] != ex;
} catch (e) {
    inP = true;
}
if (inP) {
    db[b][h] = "<p>For security reasons, this site cannot be viewed through a proxy. Please access the site directly at <a href='" + ex + "' target='_top'>" + ex + "</a>.</p>";
    throw new Error("Prevent any other code in this block from running.");
}
It's obfuscated to try and prevent the proxy from noticing what you're doing, but just to be sure, mix it in with some JavaScript vital for the page to run (like one that adds a CSRF token to the login form). That way they can't just block the file. (But randomize the obfuscation to frustrate attempts to filter or parse the file in the proxy).
Add a <noscript> tag explaining that you have to have JavaScript enabled on this page for security reasons.
It's not bulletproof (someone really determined will figure out how to bypass your obfuscation), but it should stop script kiddies who just installed Evilginx from a tutorial.
Further improvements: implement WebAuthn and recommend all your clients use it. Use the Feature Policy header and/or use JavaScript to set the WebUSB API to undefined, because you almost certainly aren't using it and there are attacks on WebAuthn based on WebUSB.
Basically, I post my username and password to a site, say http://example.org/signup.asp. Then I get cookies back from it, which I want to save in qnam_, a QNetworkAccessManager object.
Problem 1
The first issue is that after saving the cookies for reply_'s corresponding URL, say http://example.org/signup.asp, I just cannot retrieve them back via http://example.org/ or http://example.org/something_else.
auto cookies = qvariant_cast<QList<QNetworkCookie>>(
    reply_->header(QNetworkRequest::SetCookieHeader));
auto cookieJar = new QNetworkCookieJar(&qnam_);
// qDebug() outputs "http://example.org/sign.asp"
qDebug() << reply_->request().url();
// assert won't fire, which means "one or more cookies are set for url"
assert(cookieJar->setCookiesFromUrl(cookies, reply_->request().url()));
qnam_.setCookieJar(cookieJar);
// qDebug() outputs nothing but "()", why???
qDebug() << qnam_.cookieJar()->cookiesForUrl(QUrl("http://example.org"));
Problem 2
The second one is that even if I set the cookies for the "root hostname", say http://example.org/, I still cannot retrieve them via the same URL.
assert(cookieJar->setCookiesFromUrl(cookies, QUrl("http://example.org")));
qnam_.setCookieJar(cookieJar);
// Still get nothing from it.
qDebug() << qnam_.cookieJar()->cookiesForUrl(QUrl("http://example.org"));
Note that I've checked QT HTTP Post issue when server requires cookies and How do I save cookies with Qt?, but I don't think they solve this.
Any ideas? Thanks!
I got this working using the following solution:
In the callback function for QNetworkReply::finished I add a cookie:
QNetworkCookie cookie("mycookie", mycookiedata.toUtf8());
QList<QNetworkCookie> cookies;
cookies.append(cookie);
mCookieJar.setCookiesFromUrl(cookies, reply->url());
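For context, here is a minimal sketch of how such a callback could be wired up; the function name storeReplyCookies and the surrounding setup are assumptions for illustration, not part of the original answer:
#include <QList>
#include <QNetworkAccessManager>
#include <QNetworkCookie>
#include <QNetworkCookieJar>
#include <QNetworkReply>
#include <QNetworkRequest>
#include <QVariant>

// Hypothetical handler for QNetworkReply::finished: copy the cookies the
// server sent into the jar the manager already uses, keyed by the reply URL,
// so that later requests to the same host send them back automatically.
void storeReplyCookies(QNetworkAccessManager &qnam, QNetworkReply *reply)
{
    const QList<QNetworkCookie> cookies =
        qvariant_cast<QList<QNetworkCookie>>(reply->header(QNetworkRequest::SetCookieHeader));
    qnam.cookieJar()->setCookiesFromUrl(cookies, reply->url());
}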
I am working on a Facebook mobile web app. It has the following function:
function getUserFriends() {
    FB.api('/me/friends&fields=name,picture', function(response) {
        console.log('Got friends: ', response);
        if (!response.error) {
            var markup = '';
            var friends = response.data;
            for (var i = 0; i < friends.length && i < 25; i++) {
                var friend = friends[i];
                markup += '<img src="' + friend.picture + '"> ' + friend.name + '<br>' + friend.id + '<br>';
            }
            document.getElementById('user-friends').innerHTML = markup;
        }
    });
}
When it returns, the pictures are missing.
The console log returns:
[06:02:12.503] GET http://m.mochirestaurant.com/fb/%5Bobject%20Object%5D [HTTP/1.1 404 Not Found 77ms]
While it should return something like:
http://profile.ak.fbcdn.net/hprofile-ak-ash2/276211_285403872_5043326_q.jpg
I think I misconfigured something in my Facebook app, but I don't know what it is.
[06:02:12.503] GET http://m.mochirestaurant.com/fb/%5Bobject%20Object%5D [HTTP/1.1 404 Not Found 77ms]
Instead of a real value you can see that it says [object Object] in there (with the brackets URL-encoded) – which is what browsers produce when you force an object into a string context. (And because that is not a full URL beginning with http://…, your browser treats it as a relative address and tries to request it from your own domain.)
So obviously friend.picture at this point in your code is not a string value, but an object.
(So much for the debugging and error-spotting part.)
This stems from the October 3, 2012 Breaking Change regarding the user/picture connection:
/picture connection will return a dictionary when a callback is specified
We will start returning a dictionary containing the fields url, height, width, and is_silhouette when accessing the /picture connection for an object and specifying a callback property. Currently we just return the picture URL as a string.
So you have to use friend.picture.url in your code to get the actual string property containing the user picture’s URL.
In PHP it is very simple to check whether a variable has been transmitted via GET or POST. With the cgicc library they all look the same. Is there a way to read only GET or only POST variables?
My Code:
cgicc::Cgicc cgiobj;
std::cout << "Both, post or get: " << cgiobj("variablename") << std::endl;
I had the same question so I looked for a solution in the cgicc documentation.
The CgiEnvironment class provides getRequestMethod(), which returns "GET" or "POST" according to your request.
e.g.
cgicc::Cgicc cgi;
cgicc::CgiEnvironment env = cgi.getEnvironment();
std::string requestMethod = env.getRequestMethod();
I have not tested it, though.
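For illustration, a minimal sketch of how that check might be combined with the lookup from the question (the variable name is just an example, and this only distinguishes by request method):
#include <iostream>
#include <string>
#include <cgicc/Cgicc.h>

int main()
{
    cgicc::Cgicc cgi;
    const std::string method = cgi.getEnvironment().getRequestMethod();

    std::cout << "Content-Type: text/plain\r\n\r\n";

    if (method == "POST") {
        // Treat the variable as a POST variable only for POST requests.
        std::cout << "POST variablename: " << cgi("variablename") << std::endl;
    } else {
        // Otherwise assume it came in via the query string (GET).
        std::cout << "GET variablename: " << cgi("variablename") << std::endl;
    }
    return 0;
}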
I am hoping someone can help point me in the right direction...
I am using PowerBuilder 12 Classic and trying to consume an Oracle CRM OnDemand web service.
Using Msxml2.XMLHTTP.4.0 commands, I have been able to connect over HTTPS and retrieve the session id, which I need to send back when I invoke the method.
When I run the code below, I get the error SBL-ODU-01007 The HTTP request did not contain a valid SOAPAction header. I am not sure what I am missing.
OleObject loo_xmlhttp

ls_get_url = "https://secure-ausomxxxx.crmondemand.com/Services/Integration?command=login"

try
    loo_xmlhttp = CREATE oleobject
    loo_xmlhttp.ConnectToNewObject("Msxml2.XMLHTTP.4.0")

    // log in and extract the session id from the Set-Cookie header
    loo_xmlhttp.open("GET", ls_get_url, false)
    loo_xmlhttp.setRequestHeader("UserName", "xxxxxxx")
    loo_xmlhttp.setRequestHeader("Password", "xxxxxxx")
    loo_xmlhttp.send()

    cookie = loo_xmlhttp.getResponseHeader("Set-Cookie")
    sesId = mid(cookie, pos(cookie, "=", 1) + 1, pos(cookie, ";", 1) - (pos(cookie, "=", 1) + 1))

    // invoke the web service method, passing the session id back
    ls_post_url = "https://secure-ausomxxxx.crmondemand.com/Services/Integration/Activity;"
    ls_response_text = "jsessionid=" + sesId + ";"
    ls_post_url = ls_post_url + ls_response_text

    loo_xmlhttp.open("POST", ls_post_url, false)
    loo_xmlhttp.setRequestHeader("COOKIE", left(cookie, pos(cookie, ";", 1) - 1))
    ls_post_url2 = "document/urn:crmondemand/ws/activity/10/2004:Activity_QueryPage"
    loo_xmlhttp.setRequestHeader("SOAPAction", ls_post_url2)
    loo_xmlhttp.send()

    // log off
    ls_get_url = "https://secure-ausomxxxx.crmondemand.com/Services/Integration?command=logoff"
    loo_xmlhttp.open("POST", ls_get_url, false)
    loo_xmlhttp.send()
catch (RuntimeError rte)
    MessageBox("Error", "RuntimeError - " + rte.getMessage())
end try
I believe you are using an incorrect URL for Login and Logoff;
Here is the sample:
https://secure-ausomxxxx.crmondemand.com/Services/Integration?command=login
https://secure-ausomxxxx.crmondemand.com/Services/Integration?command=logoff
Rest of the code looks OK to me.
I have run into similar issues in PB with MSXML through OLE. Adding this may help:
loo_xmlhttp.setRequestHeader("Content-Type", "text/xml")
You need to make sure that your value for ls_post_url2 is one of the values found in the WSDL file. Just search for "soap:operation soapAction" in the WSDL file to see the valid values for SOAPAction.