Socket Read Returns an Extra \n at the End From Java Client - C++

Using the code below, I am reading data from a socket. On the other side, a Java client is sending string data. But when reading the data, an additional \n appears at the end of the string. Can anyone explain why this happens?
Code:
unsigned char buf[100];
rd=read(newsockfd,buf,1024);
char cmd[30];
sprintf(cmd,"%s",buf);
Result:
buf->"DATA\n"
cmd->"DATA\n"
If I send "DATA" from the client, I get "DATA\n" on the server side. Can anyone explain the reason for this, and how can I extract the exact data I sent?

My guess here would be that the newline comes from the Java client itself.
The client is probably using a method like sendLine(String), or something similar that appends a newline to the string before sending it over the network. I don't know Java, but this seems very likely.

In Java you can write (as others have pointed out) something like socket.writeLine("Data"), which appends a "\n" at the end.
One thing I've noticed in the code you posted, though, is a possible bug: if the sender sends you more than 100 chars, you get a buffer overflow.
unsigned char buf[100];
rd=read(newsockfd,buf,1024);
Here you say you want to read up to 1024 chars/bytes, but the buffer is declared as [100], so be careful!
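To extract exactly what was sent, cap the read at the buffer size, NUL-terminate the result (read() does not do that for you), and trim the trailing newline. A minimal sketch, assuming a POSIX socket like the one in the question (the function name is illustrative):
#include <unistd.h>

// Read at most bufSize-1 bytes, NUL-terminate, and strip a trailing '\n' or
// '\r' (a Java-side println() appends a line terminator before sending).
ssize_t readTrimmed(int fd, char* buf, size_t bufSize)
{
    ssize_t rd = read(fd, buf, bufSize - 1);
    if (rd <= 0)
        return rd;                  // error or connection closed
    buf[rd] = '\0';                 // read() does not NUL-terminate
    while (rd > 0 && (buf[rd - 1] == '\n' || buf[rd - 1] == '\r'))
        buf[--rd] = '\0';           // trim the trailing newline
    return rd;
}
Calling readTrimmed(newsockfd, cmd, sizeof(cmd)) would leave "DATA" in cmd with no trailing newline.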

Related

How to 'read' from a (binary==true) boost::beast::websocket::stream<tcp::socket> into a buffer (boost::beast::flat_buffer?) so it is not escaped?

I am using boost::beast to read data from a websocket into a std::string. I am closely following the example websocket_sync_client.cpp in boost 1.71.0, with one change: the I/O is binary, and there is no text handler at the server end, only a binary stream. Hence, I added one line of code to the example:
// Make the stream binary?? https://github.com/boostorg/beast/issues/1045
ws.binary(true);
Everything works as expected: I 'send' a message, then 'read' the response to my sent message into a std::string using boost::beast::buffers_to_string:
// =============================================================
// This buffer will hold the incoming message
beast::flat_buffer wbuffer;
// Read a message into our buffer
ws.read(wbuffer);
// =============================================================
// ==flat_buffer to std::string=================================
string rcvdS = beast::buffers_to_string(wbuffer.data());
std::cout << "<string_rcvdS>" << rcvdS << "</string_rcvdS>" << std::endl;
// ==flat_buffer to std::string=================================
This just about works as I expected, except there is some kind of escaping happening on the data of the (binary) stream.
There is, no doubt, some layer of boost logic (perhaps character traits?) that has caused all non-printable characters to be escaped as human-readable '\u????' text.
The binary data that is read contains many (intentional) non-printable ASCII control characters that delimit/organize chunks of data in the message.
I would rather the stream did not escape these non-printable characters, since I will have to undo that effort anyway if I cannot coerce the 'read' buffer into leaving the data as-is, raw. If I have to find another boost API to undo the escaping, that is just wasted processing, which is no doubt detrimental to performance.
My question must have a simple solution. How can I make the flat_buffer that ws.read fills into 'rcvdS' contain truly raw, unescaped bytes of data? Is that possible, or do I need to choose a different buffer template/class so that the escaping does not happen?
Here is a visual aid (an image in the original post) showing expected vs. actual data.
Beast does not alter the contents of the message in any way. The only thing that binary() and text() do is set a flag in the message which the other end receives. Text messages are validated against the legal character set, while binary messages are not. Message data is never changed by Beast. buffers_to_string just transfers the bytes in the buffer to a std::string; it does not escape anything. So if the buffer contains a null or, let's say, a Ctrl+A, you will get a 0x00 and a 0x01 in the std::string, respectively.
If your message is being encoded or translated, it isn't Beast that is doing it. Perhaps it is a consequence of writing the raw bytes to std::cout? Or it could be whatever you are using to display those messages in the image you posted. I note that the code you provided does not match the image.
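A quick way to check where the escaping happens is to dump the received bytes in hex instead of streaming them to std::cout. A minimal sketch using the rcvdS string from the question:
#include <iomanip>
#include <iostream>
#include <string>

// Print each received byte as two hex digits. If the buffer is raw (as Beast
// guarantees), control characters appear as 01, 02, ... rather than as
// "\u0001"-style escaped text added by some display layer.
void dumpHex(const std::string& rcvdS)
{
    for (unsigned char c : rcvdS)
        std::cout << std::hex << std::setw(2) << std::setfill('0')
                  << static_cast<int>(c) << ' ';
    std::cout << std::dec << '\n';
}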
If anyone else lands here, rest assured, it is your server end, not the client end that is escaping your data.

Sending a STOMP frame through websocket

Since there's no webstomp (STOMP over WebSocket) C++ implementation anywhere, I'm developing my own. I have a webstomp server set up already, and I've confirmed that it works using the JavaScript implementation of webstomp.
Now I'm relying on Qt's implementation of WebSocket. I tested it, and it works with a regular websocket. So now comes the STOMP implementation. Looking at the STOMP frames, the first frame I have to send could be something like this:
CONNECT
login: <username>
passcode: <passcode>
^#
With ^# being the null character. The problem I'm having is that no matter what I do, I can't seem to get any kind of response from the server. I tried different encodings, different messages, different connect frames, etc. I was wondering if I was doing something wrong or missing something. An example of the above frame looks like this:
void WebSTOMP::onConnected()
{
    if (m_debug)
        qDebug() << "WebSocket connected";
    connect(&m_webSocket, &QWebSocket::textMessageReceived,
            this, &EchoClient::onTextMessageReceived);
    std::string myMessage = "CONNECT \nlogin: test\npasscode : test\n\n\0";
    m_webSocket.sendTextMessage(QString::fromUtf8(myMessage.c_str()));
}
And then I never get a response back.
Thanks in advance =)
Add one to the length of the message to account for the string's null terminator.
Solved it. For the future: I had to specify the length manually, because the frame includes a null character terminator.
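A minimal sketch of the fix inside onConnected(), assuming the same Qt setup as above: build the frame with an explicit length so the terminating NUL survives (a std::string constructed from a plain C-string literal stops at the first '\0', which is why the original frame lost its terminator). Note also that some STOMP servers treat spaces around ':' as part of the header name or value, so "login:test" is safer than "login: test".
// Headers plus the blank line; the '\0' terminator is appended explicitly so
// it cannot be dropped by C-string handling.
const char frame[] = "CONNECT\nlogin:test\npasscode:test\n\n";
std::string myMessage(frame, sizeof(frame) - 1);
myMessage.push_back('\0');                       // the STOMP frame terminator
m_webSocket.sendTextMessage(
    QString::fromUtf8(myMessage.data(), static_cast<int>(myMessage.size())));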

Can I decode € (euro sign) as a char and not as a wstring/wchar?

Let me try to explain my problem. I have to receive a message from a server (programmed in Delphi) and do some things with that message on the client side (which is the side I program, in C++).
Let's say the message is: "Hello €". That means I have to work with std::wstring, since € (the euro sign) needs 2 bytes instead of 1, so knowing that, I have done all my work with wstrings, and if I set the message myself it works fine. Now I have to receive the real one from the server, and here comes the problem.
The person on the server side is sending that message as a string. He uses an EncodeString() function in Delphi, and he says he is not going to change it. So my question is: if I decode that string into a string in C++ and then convert it into a wstring, will it work? Or will I have problems and get a different message in my string variable instead of "Hello €"?
If yes, i.e. if I can receive that string with no problem, then I have another problem: the function that I have to use to decode the string is void DecodeString(char *buffer, int length);
so normally, if you receive a text, you do something like:
char Text[255];
DecodeString(Text, length); // length is a number decoded before
So... can I decode it with no problem and end up with the "Hello €" message in Text? If so, I'll just need to convert it to get the wstring.
Thank you
EDIT:
I'll add another example. If I know that the server is always going to send me text of at most 30 characters, on the server they do something like:
EncodeByte(lengthText);
EncodeString(text)
and on the client side you do:
int length;
char myText[30];
DecodeByte(length);
DecodeString(myText,length);
and then you can work with myText as a string later.
I hope that helps a little more. I'm sorry for not having more information, but I'm new to this work and don't know much more about the server.
EDIT 2
Trying to summarize... The thing is that I have to receive a message and do something with it, decoding it with the tool I mentioned. Since DecodeString() needs a char buffer and I need a wstring, I need a way to take the data received from the server, decode it with DecodeString(), and get it into a wstring. But I don't really know if that's possible, and if it is, I'm not sure how to do it or what variable types to use.
EDIT 3
Finally! I found out which code pages are being used. It seems that the client uses ANSI and the server doesn't, so I'll have to ask the person who does that part to change it to ANSI. Thanks everybody for helping me with my big, big ignorance about the existence of code pages.
Since you're using wstring, I guess that you are on Windows (wstring isn't popular on *nix).
If so, you need the Delphi app to send you UTF-16, which you can pass to the wstring constructor. Example:
// UTF-16LE bytes for the euro sign (U+20AC), plus a two-byte terminator
const char input[] = "\xAC\x20\x00\x00";
const wchar_t* input2 = reinterpret_cast<const wchar_t*>(input);
std::wstring ws(input2);
If you're on Linux/Mac, etc., you need to receive UTF-32 instead.
This method is far from perfect, though. There are pitfalls and edge cases for code points beyond 0xFFFF (Chinese, etc.). Supporting those properly probably requires a PhD.
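Given the code-page discovery in EDIT 3, here is a minimal Windows-only sketch of the ANSI-to-wide conversion, assuming Windows-1252 on the Delphi side (the function name and parameter names are illustrative):
#include <windows.h>
#include <string>

// Widen an ANSI (code page 1252) buffer, such as one filled by DecodeString(),
// into a std::wstring. Pass CP_ACP instead of 1252 to use the machine default.
std::wstring AnsiToWide(const char* buffer, int length)
{
    // First call computes how many wide characters are required.
    int wideLen = MultiByteToWideChar(1252, 0, buffer, length, nullptr, 0);
    if (wideLen <= 0)
        return std::wstring();
    std::wstring result(wideLen, L'\0');
    MultiByteToWideChar(1252, 0, buffer, length, &result[0], wideLen);
    return result;
}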

C++ Binary Characters Are Being Converted to Plain Text

I'm sending a file using TCPSocket, and it is being sent correctly, but
the problem is in my split function: it converts the binary bytes to plain characters.
If I write the original received buffer to the file, it gives me the real bytes, 100% identical to the original file.
I need the split function to separate the sent messages, because sometimes, if I send 2 messages at the same time, they are received as 1 message.
What do you think I should do?
Thanks
Solution:
I replaced sprintf with memcpy,
and it works perfectly now :) Thanks everyone.
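For anyone landing here, a minimal sketch of why the swap works (the buffer and length names are illustrative):
#include <cstring>

// sprintf(dest, "%s", src) stops at the first '\0' byte, which truncates or
// mangles binary data; memcpy copies exactly the number of bytes received.
void storeChunk(char* dest, const char* recvBuffer, std::size_t bytesReceived)
{
    std::memcpy(dest, recvBuffer, bytesReceived);
}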

IRC Client Sending Messages Improperly, C++

The problem occurs when sending messages: they get segmented into individual messages wherever there are spaces. The messages are composed using sprintf(message, "PRIVMSG %s :%s\n", irc_chan, buffer); The error appears as follows (individual messages are contained in ""s): I will enter a message "Hi there", and it will output "Hi" "there". buffer is a char[1024]. If you have any ideas, please let me know.
The following is the part of the code that sends the message; the class I've used for the socket is of no concern to you, since I can receive messages and connect FINE.
scanf("%s", buffer);
sprintf(message, "PRIVMSG %s :%s", irc_chan, buffer);
send(IRCSocket.iSocket, message, strlen(message), 0);
EDIT: I resolved this with help from Computer Guru. I was using scanf(); I should have been using cin.getline(). Thanks for the help, MUCH appreciated.
%s does not include spaces, so each word will be captured individually.
Here's the relevant entry from the scanf(3) manual page:
s      Matches a sequence of non-white-space characters; the next pointer must be a pointer to char, and the array must be large enough to accept all the sequence and the terminating NUL character. The input string stops at white space or at the maximum field width, whichever occurs first.
It's also far too easy to overrun the end of the buffer that way. Use fgets(3) instead. In C++ (since that's your tag), you can use std::string and getline().
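A minimal sketch of the line-based alternative, with illustrative names for the socket and channel (note that the IRC protocol terminates messages with "\r\n" per RFC 1459):
#include <iostream>
#include <string>
#include <sys/socket.h>

// Read one whole line (spaces included) and send it as a single PRIVMSG.
void sendChatLine(int sock, const std::string& channel)
{
    std::string line;
    std::getline(std::cin, line); // unlike scanf("%s"), keeps embedded spaces
    std::string message = "PRIVMSG " + channel + " :" + line + "\r\n";
    send(sock, message.c_str(), message.size(), 0);
}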