I have a client-server application, with the server part written in C++ (Winsock) and the client part in Java.
When sending data from the client, I first send its length followed by the actual data. For sending the length, this is the code:
clientSender.print(text.length());
where clientSender is of type PrintWriter.
On the server side, the code that reads this is
int iDataLength;
if(recv(client, (char *)&iDataLength, sizeof(iDataLength), 0) != SOCKET_ERROR)
//do something
I tried printing the value of iDataLength within the if and it always turns out to be some random large integer. If I change iDataLength's type to char, I get the correct value. However, the actual value could well exceed a char's capacity.
What is the correct way to read an integer passed over a socket in C++?
I think the problem is that PrintWriter is writing text and you are trying to read a binary number.
Here is what PrintWriter does with the integer it sends:
http://docs.oracle.com/javase/7/docs/api/java/io/PrintWriter.html#print%28int%29
Prints an integer. The string produced by String.valueOf(int) is translated into bytes according to the platform's default character encoding, and these bytes are written in exactly the manner of the write(int) method.
Try something like this:
#include <sys/socket.h>
#include <cerrno>   // for errno
#include <cstring>  // for std::strerror()
#include <iostream>
#include <string>

// ... stuff

char buf[1024]; // buffer to receive text
int len;
if((len = recv(client, buf, sizeof(buf), 0)) == -1)
{
    std::cerr << "ERROR: " << std::strerror(errno) << std::endl;
    return 1;
}
std::string s(buf, len);
int iDataLength = std::stoi(s); // convert text back to integer
// use iDataLength here (after sanity checks)
Are you sure endianness is not the issue? (Maybe Java encodes it as big-endian and you read it as little-endian.)
Besides, you might need to implement a receive-all function (similar to sendall, as here) to make sure you receive exactly the number of bytes specified, because recv may return fewer bytes than it was told to.
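For reference, such a receive-all helper might look like this (a minimal sketch, assuming a blocking Winsock socket; the name recvAll is mine):

// Keep calling recv() until exactly `len` bytes have arrived,
// since a single recv() may return fewer bytes than requested.
bool recvAll(SOCKET s, char* buf, int len)
{
    int total = 0;
    while (total < len)
    {
        int n = recv(s, buf + total, len - total, 0);
        if (n <= 0)          // SOCKET_ERROR or connection closed by peer
            return false;
        total += n;
    }
    return true;
}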
You have a confusion between numeric values and their ASCII representation.
When in Java you write clientSender.print(text.length()); you are actually writing an ASCII string: if the length is 15, you send the character '1' (ASCII code 0x31) followed by '5' (ASCII code 0x35).
So you must either:
send a binary length in a portable way (in C or C++ you have htonl and ntohl, but unsure in Java)
add a separator (a newline) after the textual length on the Java side and decode that in C++:
char buffer[1024]; // a size big enough to read the packet
int iDataLength, l;
l = recv(client, buffer, sizeof(buffer) - 1, 0);
if (l != SOCKET_ERROR) {
    buffer[l] = 0;
    sscanf(buffer, "%d", &iDataLength);
    char *ptr = strchr(buffer, '\n');
    if (ptr == NULL) {
        // should never happen : peer does not respect protocol
        ...
    }
    ptr += 1; // ptr now points after the length
    //do something
}
The Java part should be: clientSender.println(text.length());
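For the first option, a sketch of what the pair could look like: Java's DataOutputStream.writeInt() writes a 4-byte big-endian (network order) integer, which the C++ side can convert with ntohl(). Error handling is omitted, and since recv() may return fewer than 4 bytes, a receive-all loop is still needed:

// Java side would be something like:
//   new DataOutputStream(socket.getOutputStream()).writeInt(text.length());
uint32_t netLen;
if (recv(client, (char *)&netLen, sizeof(netLen), 0) == sizeof(netLen))
{
    int iDataLength = ntohl(netLen); // network (big-endian) to host byte order
    // read iDataLength bytes of payload next
}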
EDIT:
From Remy Lebeau's comment: there is no 1-to-1 relationship between sends and reads in TCP. recv() can and does return arbitrary amounts of data, so you cannot assume that a single recv() will read the entire line of text.
The code above should not do a single recv but should be prepared to concatenate multiple reads to find the separator (left as an exercise for the reader :-))
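For the impatient reader, a minimal sketch of such a loop (assuming a blocking socket and reading one byte at a time for simplicity):

// Accumulate bytes until the '\n' separator arrives.
std::string line;
char c;
while (true)
{
    int n = recv(client, &c, 1, 0);
    if (n <= 0)
        break;              // error or connection closed
    if (c == '\n')
        break;              // separator found, line is complete
    line += c;
}
int iDataLength = std::atoi(line.c_str());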
Related
I have a relatively simple web server I have written in C++. It works fine for serving text/html pages, but the way it is written it seems unable to send binary data and I really need to be able to send images.
I have been searching and searching but can't find an answer specific to this question that is written in real C++ (fstream as opposed to using file pointers etc.), and while this kind of thing is necessarily low-level and may well require handling bytes in a C-style array, I would like the code to be as C++ as possible.
I have tried a few methods, this is what I currently have:
int sendFile(const Server* serv, const ssocks::Response& response, int fd)
{
    // some other stuff to do with headers etc. ........ then:

    // open file
    std::ifstream fileHandle;
    fileHandle.open(serv->mBase + WWW_D + resource.c_str(), std::ios::binary);
    if(!fileHandle.is_open())
    {
        // error handling code
        return -1;
    }

    // send file
    ssize_t buffer_size = 2048;
    char buffer[buffer_size];
    while(!fileHandle.eof())
    {
        fileHandle.read(buffer, buffer_size);
        status = serv->mSock.doSend(buffer, fd);
        if (status == -1)
        {
            std::cerr << "Error: socket error, sending file\n";
            return -1;
        }
    }
    return 0;
}
And then elsewhere:
int TcpSocket::doSend(const char* message, int fd) const
{
    if (fd == 0)
    {
        fd = mFiledes;
    }
    ssize_t bytesSent = send(fd, message, strlen(message), 0);
    if (bytesSent < 1)
    {
        return -1;
    }
    return 0;
}
As I say, the problem is that when the client requests an image it won't work; all I get on std::cerr is "Error: socket error, sending file".
EDIT: I got it working using the advice in the answer I accepted. For completeness, and to help those finding this post, I am also posting the final working code.
For sending I decided to use a std::vector rather than a char array, primarily because I feel it is a more C++ approach and it makes clear that the data is not a string. This is probably not necessary and just a matter of taste. I then counted the bytes read from the stream and passed that count to the send function like this:
// send file
std::vector<char> buffer(SEND_BUFFER);
while(!fileHandle.eof())
{
    fileHandle.read(&buffer[0], SEND_BUFFER);
    status = serv->mSock.doSend(&buffer[0], fd, fileHandle.gcount());
    if (status == -1)
    {
        std::cerr << "Error: socket error, sending file\n";
        return -1;
    }
}
Then the actual send function was adapted like this:
int TcpSocket::doSend(const char* message, int fd, size_t size) const
{
    if (fd == 0)
    {
        fd = mFiledes;
    }
    ssize_t bytesSent = send(fd, message, size, 0);
    if (bytesSent < 1)
    {
        return -1;
    }
    return 0;
}
The first thing you should change is the while (!fileHandle.eof()) loop, because it will not work as you expect: the eof flag isn't set until after you try to read beyond the end of the file, so the loop iterates once too many times. Instead do e.g. while (fileHandle.read(...)).
The second thing you should do is check how many bytes were actually read from the file, and only send that many bytes.
Lastly, you read binary data, not text, so you can't use strlen on the data you read from the file.
A little explanation of the binary-file problem: as you should hopefully know, C-style strings (the ones you use strlen to get the length of) are terminated by a zero character '\0' (in short, a zero byte). Random binary data can contain lots of zero bytes anywhere inside it; there, a zero byte is valid data with no special meaning.
When you use strlen to get the length of binary data there are two possible problems:
There's a zero byte in the middle of the data. This will cause strlen to terminate early and return the wrong length.
There's no zero byte in the data. That will cause strlen to go beyond the end of the buffer to look for the zero byte, leading to undefined behavior.
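Putting the three fixes together, the send loop might look like this (a sketch, assuming the doSend overload taking a byte count from the asker's edit above):

char buffer[2048];
// read() in the loop condition stops at end of file; gcount() is the number
// of bytes actually read, including the short final chunk
while (fileHandle.read(buffer, sizeof(buffer)) || fileHandle.gcount() > 0)
{
    if (serv->mSock.doSend(buffer, fd, fileHandle.gcount()) == -1)
    {
        std::cerr << "Error: socket error, sending file\n";
        return -1;
    }
}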
I'm trying to receive a number from an Arduino as an integer in C++. The full code is below:
#define STRICT
#include <tchar.h>
#include <windows.h>
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include "Serial.h"
#include <boost/lexical_cast.hpp>

enum { EOF_Char = 27 };

int __cdecl _tmain(int /*argc*/, char** /*argv*/)
{
    CSerial serial;
    LONG lLastError = ERROR_SUCCESS;

    // Attempt to open the serial port (COM4)
    lLastError = serial.Open(_T("COM4"), 0, 0, false);

    // Setup the serial port (9600,8N1, which is the default setting)
    lLastError = serial.Setup(CSerial::EBaud9600, CSerial::EData8, CSerial::EParNone, CSerial::EStop1);

    // Register only for the receive event
    lLastError = serial.SetMask(CSerial::EEventBreak |
                                CSerial::EEventCTS |
                                CSerial::EEventDSR |
                                CSerial::EEventError |
                                CSerial::EEventRing |
                                CSerial::EEventRLSD |
                                CSerial::EEventRecv);

    // Use 'non-blocking' reads, because we don't know how many bytes
    // will be received. This is normally the most convenient mode
    // (and also the default mode for reading data).
    lLastError = serial.SetupReadTimeouts(CSerial::EReadTimeoutNonblocking);

    // Keep reading data until the EOF character (27, CTRL+'[') has been received
    bool fContinue = true;
    do
    {
        // Wait for an event
        lLastError = serial.WaitEvent();

        // Save event
        const CSerial::EEvent eEvent = serial.GetEventType();

        // Handle data receive event
        if (eEvent & CSerial::EEventRecv)
        {
            // Read data, until there is nothing left
            DWORD dwBytesRead = 0;
            char szBuffer[101];
            do
            {
                // Read data from the COM-port
                lLastError = serial.Read(szBuffer, sizeof(szBuffer) - 1, &dwBytesRead);
                if (dwBytesRead > 0)
                {
                    // Finalize the data, so it is a valid string
                    szBuffer[dwBytesRead] = '\0';

                    // Display the data
                    printf("%s", szBuffer);

                    // Check if EOF (CTRL+'[') has been specified
                    if (strchr(szBuffer, EOF_Char))
                        fContinue = false;
                }
            } while (dwBytesRead == sizeof(szBuffer) - 1);
        }
    } while (fContinue);

    // Close the port again
    serial.Close();
    return 0;
}
I have my Arduino constantly sending out the number 51. This code works fine and consistently displays "51". However, I want an int to manipulate in C++.
First I added
std::stringstream str(szBuffer);
int tester;
str >> tester;
printf("My number is: %d\n", tester+1);
right after
printf("%s", szBuffer);
A typical result looks like:
51My number is: 52
51My number is: 52
51My number is: 52
51My number is: 52
51My number is: 52
5My number is: 6
1My number is: 2
After doing it perfectly 5 or 6 times, the output splits the incoming digits across two reads once or twice in a row (I haven't been able to find a specific pattern yet, but it's always 5-6 correct then 1-2 split).
My other attempt was to use the boost library:
int tester = boost::lexical_cast<int>(szBuffer);
printf("My number is: %d\n", tester);
right after
printf("%s", szBuffer);
and I get the same result (1-2 errors after 5-6 correct ones). I don't think the Arduino is sending bad data, since just a
printf("%s", szBuffer);
will never deviate from the number it's supposed to be. Could the conversion be messing up the receiving of data? Thanks.
EDIT: The Arduino code is:
void setup() {
    Serial.begin(9600); // same as in your c++ script
}

void loop() {
    Serial.print(51);
    delay(1000);
}
With serial ports, there is no mechanism where a transmitter can inform a receiver how many bytes were transmitted as a block. I.e. there's no "hidden" marker where Serial.print(51); tells the receiver that it sent two characters as one number. You have to add some kind of indication (spaces, commas, line ends, initial byte counts, whatever) to your serial protocol.
Because of this, the number of characters you get from serial.Read depends on the number of characters you asked it to read (the second parameter) and how many characters are in the serial port's receive buffer, whichever is smaller. Most of the time, it seems the Arduino sends both digits before you call serial.Read, but sometimes it only gets one out in time... and the second is read the next time through the loop.
So let's assume you decided to use line ends to separate your numbers. All you have to do on the Arduino end is change to Serial.println(51);. The receive end is a little more complex.
I don't know what your serial library has in it. Most have some kind of "read line" function, and you would just replace the serial.Read call with something like:
serial.Readline(szBuffer, sizeof(szBuffer) - 1);
and it will take care of null-terminating the output. If it doesn't take care of null-termination, you'll need to find the line end and change it to a \0 yourself. From this point on, your code will work fine, because the serial.Readline function will block until it gets the whole line.
If you don't have a "read line" or at least a "read until this character" function, it's a bit harder. You have to repeatedly call serial.Read, moving through your buffer, until you see the line-end character. Further, you run the risk of reading part or all of the next line, so you can't just discard all the data you read when you're done reading the number; you have to move the data in the buffer so the next line's data (and anything after it) is at the start of the buffer.
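A hypothetical line reader built on the CSerial::Read call shown in the question (the name readLine and the return-value checks are assumptions, not part of the library):

// Read one character at a time until '\n', null-terminating the result.
bool readLine(CSerial& serial, char* out, size_t maxLen)
{
    size_t pos = 0;
    while (pos < maxLen - 1)
    {
        char c;
        DWORD dwRead = 0;
        if (serial.Read(&c, 1, &dwRead) != ERROR_SUCCESS)
            return false;    // serial port error
        if (dwRead == 0)
            continue;        // nothing available yet (non-blocking mode)
        if (c == '\n')
            break;           // line complete
        out[pos++] = c;
    }
    out[pos] = '\0';
    return true;
}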
If you're using Boost (are you? it has no CSerial that I see), it looks like it has a read_until function. This takes three parameters: the stream you're reading from (e.g. a boost::asio::serial_port), a boost::asio::streambuf to store the data in, and something to stop on. You can then wrap the streambuf in a std::istream to extract the number:

boost::asio::streambuf buffer;
size_t chars = boost::asio::read_until(serial, buffer, '\n');
if (chars == 0) return;

std::istream is(&buffer);
int tester;
is >> tester;
printf("My number is: %d\n", tester + 1);
I have an Arduino board that is connected to a sensor. From the Arduino IDE serial monitor, I see the readings are mostly 160, 150, etc. The Arduino has a 10-bit ADC, so I assume the readings range from 0 to 1023.
I want to fetch those readings to my computer so that I can do further processing. It must be done this way up to this point. Now, I wrote a C++ program to read the serial port buffer with Windows APIs (DCB). The transfer speed of the serial ports is set to 115200 on both the Arduino IDE and the C++ program.
I will describe my problem first: Since I want to send the readings to my computer, I expect the data looks like the following:
124
154
342
232
...
But now it looks like
321
43
5
2
123
...
As shown, the readings get split and run together. I know this because I tried displaying them wrapped in [], and the data are truly messed up.
The section of the code that does the serial-port reading on the computer is this:
// Read
int n = 10;
char szBuff[10 + 1] = {0};
DWORD dwBytesRead = 0;
int i;
for (i = 0; i < 200; i++)
{
    if(!ReadFile(hSerial, szBuff, n, &dwBytesRead, NULL)){
        //error occurred. Report to user.
        printf("Cannot read.\n");
    }
    else{
        printf("%s\n", szBuff);
    }
}
The Arduino code that's doing the serial port sending is:
char buffer [10] = { 0 };
int analogIn = 0;
int A0_val = 0;

void setup() {
    Serial.begin(115200);
}

void loop() {
    A0_val = analogRead(analogIn);
    sprintf(buffer, "%d", A0_val);
    Serial.println(buffer);
}
I suspect that the messing up of the data is caused by the different sizes of the buffers used to transmit and receive data on the serial port. What is a good size for the buffer, and better yet, is there a method that guarantees the transmission of valid data?
Thanks very much!
Your receiver code cannot assume a single read from the serial port will yield a complete line (i.e. the 2 or 3 digits followed by a '\n' that the Arduino continuously sends).
It is up to the receiver to synthesize complete lines of text on reception, and only then try to use them as meaningful numbers.
Since the serial interface is extremely slow compared with your average PC's computing power, there is little point in reading more than one character at a time: literally millions of CPU cycles will be spent waiting for the next character, so you really don't need to react fast to the Arduino input.
Since in this particular case it will not hinder performance in the slightest, I find it more convenient to read one character at a time. That saves you the hassle of moving bits of strings around, and at least it makes writing an educational example easier.
// return the next value received from the arduino as an integer
int read_arduino (HANDLE hSerial)
{
    char buffer[4];       // any value longer than 3 digits must come
                          // from a faulty transmission
                          // the 4th character is used for a terminating '\0'
    size_t buf_index = 0; // storage position of received characters

    for (;;)
    {
        char c;               // read one byte at a time
        DWORD read_count = 0; // number of bytes actually read
        if (!ReadFile(
            hSerial,
            &c,           // 1 byte buffer
            1,            // of length 1
            &read_count,  // mandatory unless overlapped I/O is used
            NULL))
        {
            /*
             * This error means something is wrong with serial port config,
             * and I assume your port configuration is hard-coded,
             * so the code won't work unless you modify and recompile it.
             * No point in keeping the program running, then.
             */
            fprintf (stderr, "Dang! Messed up the serial port config AGAIN!");
            exit(-1);
        }
        else // our read succeeded. That's a start.
        {
            if (read_count == 0)
                continue; // nothing arrived yet, try again

            if (c == '\n') // we're done receiving a complete value
            {
                int result; // the decoded value we might return

                // check for buffer overflow
                if (buf_index == sizeof (buffer))
                {
                    // warn the user and discard the input
                    fprintf (stderr,
                             "Too many characters received, input flushed\n");
                }
                else // valid number of characters received
                {
                    // add a string terminator to the buffer
                    buffer[buf_index] = '\0';

                    // convert to integer
                    result = atoi (buffer);
                    if (result == 0)
                    {
                        /*
                         * assuming 0 is not a legit value returned by the arduino, this means the
                         * string contained something else than digits. It could happen in case
                         * of electrical problems on the line, typically if you plug/unplug the cable
                         * while the arduino is sending (or Mr Fluffy is busy gnawing at it).
                         */
                        fprintf (stderr, "Wrong value received: '%s'\n", buffer);
                    }
                    else // valid value decoded
                    {
                        // at last, return the coveted value
                        return result; // <-- this is the only exit point
                    }
                }

                // reset buffer index to prepare receiving the next line
                buf_index = 0;
            }
            else // character other than '\n' received
            {
                // store it as long as our buffer does not overflow
                if (buf_index < sizeof (buffer))
                {
                    buffer[buf_index++] = c;
                    /*
                     * if, for some reason, we receive more than the expected max number of
                     * characters, the input will be discarded until the next '\n' allow us
                     * to re-synchronize.
                     */
                }
            }
        }
    }
}
CAVEAT: this is just code off the top of my head. I might have left a few typos here and there, so don't expect it to run or even compile out of the box.
A couple of basic problems here. First, it is unlikely that the PC can reliably keep up with 115,200 baud data if you only read 10 bytes at a time with ReadFile. Try a slower baud rate and/or change the buffer size and number of bytes per read to something that will get around 20 milliseconds of data, or more.
Second, after you read some data, put a nul terminator at the end of it:
szBuff[dwBytesRead] = 0;
before you pass it to printf or any other C string code.
I'm writing a file-transfer client/server application, where the client runs on Windows 7 and is written in VB.NET, and the server runs on Linux Mint and is written in C++ (I'm using VMware).
My problem is that when I try to upload files to the server (such as images), the received data is missing many bytes, which happen to match control characters (such as EOT, ETB, ...); I guess they're being read as TCP control characters and ignored by the receiving OS.
I have already tested the application with simple text files (sizes up to 4 MB) without any problem.
Is there a way to prevent the system from ignoring those bytes?
This is the C++ function that receives the file:
string readSockBytes(int port, int num, int size)
{
    int dcmbuffSize = 1460;
    int n;
    stringstream temp;
    string strBuffer, Sbuffer;
    char Rbuffer[dcmbuffSize];
    struct socketVar sockets;

    sockets = setSocket(port);
    sockets = sockListen(sockets);
    cout << "user connected\n";

    strBuffer = readsock(sockets);
    cout << strBuffer.substr(0, strBuffer.find("$")) << endl;

    if(num == atoi(strBuffer.substr(0, strBuffer.find("$")).c_str()))
        Sbuffer = "ready$";
    else
    {
        Sbuffer = "exit$";
        close(sockets.newsockfd);
        close(sockets.sockfd);
    }

    n = writesock(sockets, Sbuffer, 100);
    if (n < 0) error("ERROR writing to socket");

    while(strBuffer.length() < fileSize)
    {
        n = read(sockets.newsockfd, Rbuffer, dcmbuffSize - 1);
        if (n < 0) error("ERROR reading from socket");
        temp.str(Rbuffer);
        strBuffer = strBuffer + temp.str();
    }

    strBuffer = strBuffer.substr(0, size);
    return strBuffer;
}
The issue is most likely that you sent binary data. And binary data can contain zeros. And zeroes are the normal string terminator.
This means that when you do temp.str(Rbuffer) (assuming temp is a std::stringstream) then it only gets data from Rbuffer until the first zero.
Instead of using e.g. std::stringstream use std::string:
while(strBuffer.length() < fileSize)
{
    char buffer[2048];
    ssize_t n = read(sockets.newsockfd, buffer, sizeof(buffer));
    if (n <= 0)
    {
        // An error, or connection closed
        if (n < 0)
            error("ERROR reading from socket");
        break;
    }

    // Create a string of `n` bytes, including possible string terminators,
    // and add it to our current buffer
    strBuffer += std::string(buffer, n);
}
The important thing to remember here is that you can't use the received data as a string! If it's binary data it will almost certainly contain the string terminator, so you have to treat it as binary data and not as a string (even though you can store it in a std::string).
You also need to be aware that you can't print the data, as many binary values are either unprintable or will print as "garbage".
And lastly, if you read and write binary files, you need to open them in binary mode, or you will get errors with the bytes 0x0d and 0x0a (i.e. carriage return and newline).
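For example, on the C++ side (the file names here are just placeholders):

// Binary mode passes 0x0d/0x0a through untranslated.
std::ifstream in("upload.bin", std::ios::binary);
std::ofstream out("received.bin", std::ios::binary);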
My goal is to create a client-server app, written in C++.
When the server reads input from the client, it should process the string and give an output.
Basically, I have a simple echo server that sends back the same message.
But if the user types a special string (like "quit"), the program has to do something else.
My problem is that this doesn't happen, because the comparison between the strings is not working... I don't know why!
Here is some simple code:
while(1) {
    int num = recv(client, buffer, BUFSIZE, 0);
    if (num < 1) break;

    send(client, ">> ", 3, 0);
    send(client, buffer, num, 0);

    char hello[6] = "hello";
    if(strcmp(hello, buffer) == 0) {
        send(client, "hello dude! ", 12, 0);
    }

    buffer[num] = '\0';
    if (buffer[num-1] == '\n')
        buffer[num-1] = '\0';

    std::cout << buffer;
    strcpy(buffer, "");
}
Why is the comparison not working?
I have tried many solutions... but all failed :(
Your data in buffer may not be null-terminated, because buffer contains random data if not initialized; you only know the content of the first num bytes. Therefore you also have to check how much data you've received before comparing the strings:
const char hello[6] ="hello";
size_t hello_sz = sizeof hello - 1;
if(num == hello_sz && memcmp(hello, buffer, hello_sz) == 0) { ...
As a side note, this protocol will be fragile unless you delimit your messages, so that in the event of fragmented reads (receiving "hel" on the first read and "lo" on the second) you can tell where one message starts and another one ends.
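A newline-delimited variant might look like this (a sketch; handling of any bytes received after the '\n' is omitted):

// Accumulate received bytes until a complete, '\n'-terminated message is seen.
std::string msg;
char chunk[BUFSIZE];
while (msg.find('\n') == std::string::npos)
{
    int num = recv(client, chunk, sizeof(chunk), 0);
    if (num < 1)
        break;                   // error or connection closed
    msg.append(chunk, num);      // append exactly the bytes received
}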
strcmp requires null terminated strings. The buffer you read to might have non-null characters after the received message.
Either right before the read do:
ZeroMemory(buffer, BUFSIZE); //or your compiler defined equivalent
Or right after the read
buffer[num] = '\0';
This will ensure that there is a terminating null at the end of the received message and the comparison should work.
A string is defined to be an array of chars up to and including the terminating '\0' byte. Initially your buffer contains arbitrary bytes and is not even guaranteed to contain a string, so you have to set buffer[num] = '\0' to make it a string.
That of course means that recv should not read sizeof buffer bytes, but one byte less.
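In this code, that means something like:

int num = recv(client, buffer, BUFSIZE - 1, 0); // leave room for the terminator
if (num < 1) break;
buffer[num] = '\0';                             // buffer is now a valid C string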