Sending a STOMP frame through WebSocket - C++

Since there's no webstomp (STOMP over WebSocket) C++ implementation anywhere, I'm developing my own. I have a webstomp server set up already, and I've confirmed that it works using the JavaScript implementation of webstomp.
Now I'm relying on Qt's WebSocket implementation. I tested it, and it works with a regular WebSocket. So now comes the implementation of STOMP. Looking at the STOMP frames, the first frame I have to send could be something like this:
CONNECT
login: <username>
passcode: <passcode>
^#
With ^# being the null character. The problem I'm having is that no matter what I do, I can't seem to get any type of response from the server. I've tried different encodings, different messages, different CONNECT frames, etc. I was wondering if I was doing something wrong or if I was missing something. My attempt at sending the above frame looks like this:
void WebSTOMP::onConnected()
{
    if (m_debug)
        qDebug() << "WebSocket connected";
    connect(&m_webSocket, &QWebSocket::textMessageReceived,
            this, &WebSTOMP::onTextMessageReceived);
    std::string myMessage = "CONNECT \nlogin: test\npasscode : test\n\n\0";
    m_webSocket.sendTextMessage(QString::fromUtf8(myMessage.c_str()));
}
And then I never get a response back.
Thanks in advance =)

Add one to the length of the message due to the string's null character terminator

Solved it. For the future: I had to specify the message length manually, because the frame ends with a null character and constructing a std::string from a C string stops at the first '\0', silently dropping the terminator that STOMP requires.

Related

European characters switch to strange characters in response when posting to server using C++

I am struggling to get the response from the server in the correct format under Windows. I have tried two C++ libraries, Beast (based on Boost.Asio) and Cpr (based on libcurl), and I get the exact same issue with both.
The strange thing is that I also tried this in C# (HttpClient) and everything works just fine. Also, in Postman and other REST tools it looks good.
When I post to the server and should get back the name René, I get Ren� instead. Other European characters like æ, ø, å, ö give the same strange output. To me it looks like an issue with UTF-8 / ISO-8859-1, but I cannot figure it out. The server (based on node.js) is configured to send the response as UTF-8. We have tried just redirecting the response so it does not hit a database or anything like that. So the problem seems to be on the C++ side. Any suggestions as to what I can try would be greatly appreciated.
Example code:
nlohmann::json test_json = nlohmann::json
{
    { "text", "Hi, my name is René" },
    { "language", "en" }
};
auto r = cpr::Post(cpr::Url{ "http://www.exampleserver.com" },
                   cpr::Body{ test_json.dump() },
                   cpr::Header{ { "content-type", "application/json; charset=utf-8" } });
std::cout << r.text << std::endl;
It looks like you've got some ISO-8859-1 content being sent through but it's labelled as UTF-8. This causes a whole rash of conversion errors which can mangle non-ASCII characters beyond recognition.
The way to fix this is to either identify the non-UTF-8 data and properly convert it, or identify the payload with the correct MIME type and encoding.
Your issue is with the encoded string. The string is most likely coming back UTF-8 encoded, but you are not converting it properly.
There are various libraries that can help you convert. It depends on the version of C++ you're using; it's hard to say what to use without more details.

Can I decode € (euro sign) as a char and not as a wstring/wchar?

Let me try to explain my problem. I have to receive a message from a server (programmed in Delphi) and do some things with that message on the client side (which is the side I program, in C++).
Let's say the message is "Hello €". That means I have to work with std::wstring, as € (the euro sign) needs 2 bytes instead of 1. Knowing that, I have done all my work with wstrings, and if I set the message myself it works fine. Now I have to receive the real one from the server, and here comes the problem.
The person on the server side is sending that message as a string. He uses an EncodeString() function in Delphi and says he is not gonna change it. So my question is: if I decode that string into a string in C++ and then convert it into a wstring, will it work? Or will I have problems and end up with a different message in my string variable instead of "Hello €"?
If so, i.e. if I can receive that string with no problem, then I have another problem. The function I have to use to decode the string is void DecodeString(char *buffer, int length);
so normally, if you receive a text, you do something like:
char Text[255];
DecodeString(Text, length); // length is a number decoded before
So... can I decode it with no problem and end up with the "Hello €" message in Text? With that, I'll just need to convert it and get the wstring.
Thank you
EDIT:
I'll add another example. If I know that the server is always going to send me a text of length 30 max, on the server they do something like:
EncodeByte(lengthText);
EncodeString(text);
and in the client you do:
int length;
char myText[30];
DecodeByte(length);
DecodeString(myText,length);
and then you can work with myText as a string later.
Hope that helps a little more. I'm sorry for not having more information, but I'm new to this work and I don't know much more about the server.
EDIT 2
Trying to summarize... The thing is that I have to receive a message and do something with it, and with the tool I mentioned I have to decode it. Since DecodeString() takes a char buffer and I need a wstring, I just need a way to take the data received from the server, decode it with DecodeString(), and get it into a wstring. But I don't really know if that's possible, and if it is, I'm not sure how to do it or what types of variables to use.
EDIT 3
Finally! I now know what code pages are being used. It seems that the client uses the ANSI ones and the server doesn't, so I'll have to ask the person who does that part to change it to the ANSI ones. Thanks everybody for helping me with my big big ignorance about the existence of code pages.
Since you're using wstring, I guess that you are on Windows (wstring isn't popular on *nix).
If so, you need the Delphi app to send you UTF-16, which you can use in the wstring constructor. Example:
const char* input = "\xac\x20\x00"; // "€" (U+20AC) as little-endian UTF-16;
                                    // the literal's implicit '\0' completes the
                                    // two-byte wchar_t terminator
const wchar_t* input2 = reinterpret_cast<const wchar_t*>(input);
wstring ws(input2);
If you're Linux/Mac, etc, you need to receive UTF-32.
This method is far from perfect though. There can be pitfalls and edge cases for code points beyond 0xFFFF, which UTF-16 encodes as surrogate pairs. Supporting that probably requires a PhD.

Cannot Send Image File (image/jpg) Using Winsock WSABUF

I'm stuck and I need help.
I'm trying to write the correct code for sending back an image file so the web browser can render it. It can send back text/html just fine, but image/* is not working.
You can see the code and the URL is shown below.
https://github.com/MagnusTiberius/iocphttpd/blob/master/iocphttpl/SocketCompletionPortServer.cpp
What the browser is receiving is just a few bytes of image data.
I tried vector, std::string and const char* to set the values of WSABUF, but still the same few bytes are sent over.
Please let me know what the missing piece is to make this work.
Thanks in advance.
Here's your problem:
PerIoData->LPBuffer = _strdup(str.c_str());
The _strdup function only copies up until the first null, so it cannot be used to copy binary data. Consider using malloc and memcpy if you don't want to use the C++ library.
The alternate implementation (in the false branch) is also incorrect, because it saves the data in an object (vc) that goes out of scope before the I/O is completed. You could instead allocate it on the heap, something like
vector<char> * vc = new vector<char>(str.begin(), str.end());

Socket Read \n at the End From Java Client

Using the code below, I am reading data from a socket. On the other side, a Java client is sending string data. But while reading the data, an additional \n appears at the end of the string. Can anyone explain why this happens?
Code:
unsigned char buf[100];
rd=read(newsockfd,buf,100);
char cmd[30];
sprintf(cmd,"%s",buf);
Result:
buf->"DATA\n"
cmd->"DATA\n"
From the client, if I send "DATA", then I get "DATA\n" on the server side. Can anyone explain the reason for this, and how can I extract the exact data I sent?
My guess here would be that the newline comes from the Java client itself.
Probably the client is using a method that appends a newline to the string before sending it on the network, such as PrintWriter's println(String). I don't know Java well, but this seems very likely.
In Java you can (as other people have pointed out) send a line with println("Data"), which appends a "\n" at the end.
One thing I've noticed, though: in the code you wrote, there is a possible error. If the sender sends you more than 100 chars, you would get a buffer overflow.
unsigned char buf[100];
rd=read(newsockfd,buf,1024);
Here you say you want to read up to 1024 chars/bytes, but the buffer is declared as [100]. Be careful!

Why am I getting a bps_remove_fd failure when trying to store a QSslSocket inside a QScopedPointer?

I am developing a networking-based app for the BlackBerry PlayBook using Qt 4.8.3, part of which involves storing a QAbstractSocket in a QScopedPointer as follows:
QScopedPointer<QAbstractSocket> nntp;
In my implementation, I store either a QSslSocket or a QTcpSocket (both of which inherit from QAbstractSocket) depending on whether the connection is to be encrypted, i.e.,
if (ssl) {
    nntp.reset(new QSslSocket(this));
    (dynamic_cast<QSslSocket*>(nntp.data()))->connectToHostEncrypted(server, port);
} else {
    nntp.reset(new QTcpSocket(this));
    nntp->connectToHost(server, port);
}
When going down the ssl route (non-ssl works fine!), I end up with the following run time error:
virtual void QEventDispatcherBlackberry::unregisterSocketNotifier(QSocketNotifier*) bps_remove_fd() failed 19
The error is probably BlackBerry-related, given the error description and the fact that the code works as expected on other platforms (tested on Mac and Linux). (Note: the number 19 refers to the file descriptor.)
Any ideas why I am seeing this error and how I can fix it?
Thanks,
Ben.
EDIT: I've just realised that instead of using the pointer, I can just have a single QSslSocket and treat it as a regular QTcpSocket when in non-SSL mode. Far easier. I would still like to know the reason for the above error, however.
We can have a look at the source code in order to see what is happening. The source code of unregisterSocketNotifier is:
void QEventDispatcherBlackberry::unregisterSocketNotifier(QSocketNotifier *notifier)
{
    // Unregister the fd with bps
    int sockfd = notifier->socket();
    int result = bps_remove_fd(sockfd);
    if (result != BPS_SUCCESS)
        qWarning() << Q_FUNC_INFO << "bps_remove_fd() failed";

    // Allow the base Unix implementation to unregister the fd too
    QEventDispatcherUNIX::unregisterSocketNotifier(notifier);
}
And do a correlation with bps_remove_fd documentation which says:
If the file descriptor is present it is removed from the channel. The io_handler callback and associated user data are also removed.
[returns] BPS_SUCCESS if the fd (file descriptor) was successfully removed from the channel, BPS_FAILURE with errno value set otherwise.
The only clue about what could make bps_remove_fd fail is the possibility that the fd is not present, which would mean that your socket does not have a valid file descriptor. The other possibility is that, for some unspecified reason, the file descriptor exists but is not removed.
The variable errno should be set, so you may get a more complete error description if you look at it (I did not try this myself; I don't have the hardware to test on).
I bet bps_remove_fd works on the same principle as POSIX's close(int fd), so I had a look at close's documentation to see what could cause a failure. It states that it shall/may fail in the following cases:
The argument is not a valid file descriptor (shall fail).
close may be interrupted by a signal (shall fail).
An I/O error occurred while reading from or writing to the file system (may fail).
I would have made this answer a comment since it does not really answer the question in your particular case, but I hope it may at least help you understand what's going on a little bit more :)