Reading bytes from serial port C++ Windows

Hey, I am trying to interface with the XBees I have connected to my Windows machine. I am able to write to the end device through the coordinator in AT mode, and can see the data streamed to my XCTU console. However, I am having trouble understanding how to read that incoming data.
The code I am currently using is below. Essentially the only part that is crucial is the last 5 lines or so (specifically the ReadFile and WriteFile lines), but I am going to post all of it just to be thorough. How do I read the data I sent to the XBee over the COM port? The data I sent was simply 0x00-0x0F.
I think I am misunderstanding how the read file functions work. I am assuming that the bytes I send to the XBee are stored in a buffer which can then be read one at a time. Is that correct? Or do I need to write the entire byte sequence and then read the data available? I'm sorry if my train of thought is confusing; I am fairly new to serial communication. Any help is appreciated.
#include <cstdlib>
#include <windows.h>
#include <iostream>
using namespace std;

int main(int argc, char** argv) {
    int n = 8; // amount of bytes to read
    HANDLE hSerial;
    HANDLE hSerial2;
    hSerial  = CreateFile("COM3", GENERIC_WRITE, 0, 0, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
    hSerial2 = CreateFile("COM4", GENERIC_READ,  0, 0, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
    if (hSerial == INVALID_HANDLE_VALUE || hSerial2 == INVALID_HANDLE_VALUE) {
        if (GetLastError() == ERROR_FILE_NOT_FOUND) {
            // serial port does not exist; inform user
            cout << "Serial port error, does not exist" << endl;
        }
        // some other error occurred; inform user
        cout << "Serial port probably in use" << endl;
    }
    DCB dcbSerialParams = {0};
    dcbSerialParams.DCBlength = sizeof(dcbSerialParams);
    if (!GetCommState(hSerial, &dcbSerialParams)) {
        cout << "error getting state" << endl;
    }
    dcbSerialParams.BaudRate = CBR_9600;
    dcbSerialParams.ByteSize = 8;
    dcbSerialParams.StopBits = ONESTOPBIT;
    dcbSerialParams.Parity   = NOPARITY;
    if (!SetCommState(hSerial, &dcbSerialParams)) {
        cout << "error setting serial port state" << endl;
    }
    COMMTIMEOUTS timeouts = {0};
    timeouts.ReadIntervalTimeout         = 50;
    timeouts.ReadTotalTimeoutConstant    = 50;
    timeouts.ReadTotalTimeoutMultiplier  = 10;
    timeouts.WriteTotalTimeoutConstant   = 50;
    timeouts.WriteTotalTimeoutMultiplier = 10;
    if (!SetCommTimeouts(hSerial, &timeouts)) {
        cout << "Error occurred" << endl;
    }
    DWORD dwBytesWritten = 0;
    DWORD dwBytesRead = 0;
    unsigned char oneChar;
    for (int i = 0; i < 16; i++)
    {
        oneChar = 0x00 + i;
        WriteFile(hSerial, (LPCVOID)&oneChar, 1, &dwBytesWritten, NULL);
        ReadFile(hSerial2, &oneChar, 1, &dwBytesRead, NULL); // what I tried to do, just outputs white space
    }
    CloseHandle(hSerial);
    return 0;
}

In your statement:
ReadFile (hSerial2, &oneChar, 1, &dwBytesRead, NULL);
You need to check the value of dwBytesRead to see if you actually read any bytes. Maybe on one side of the connection you want a simple program to send a byte every second. On the other end, you want to check for available bytes and dump them as they come in.
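For instance, here is a minimal sketch of a receive loop that only consumes bytes that were actually read (assuming hSerial2 is the read handle from the question and that the COMMTIMEOUTS shown above have also been applied to it, which the original code does not do):
unsigned char inByte;
DWORD dwRead = 0;
for (;;) {
    // Ask for one byte; with timeouts applied, this returns even if nothing arrived.
    if (!ReadFile(hSerial2, &inByte, 1, &dwRead, NULL)) {
        cout << "read error: " << GetLastError() << endl;
        break;
    }
    if (dwRead == 1) {
        // A byte was actually read; only now is inByte meaningful.
        cout << "got byte: 0x" << hex << (int)inByte << dec << endl;
    }
    // dwRead == 0 means the read timed out with nothing available; keep polling.
}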
What's likely happening in your program is that you're filling an outbound serial buffer in a short amount of time, not waiting long enough to read any data back, and then exiting the loop and closing the serial port, likely before it finishes sending the queued data. For example, right before your CloseHandle() call, you could add:
COMSTAT stat;
if (ClearCommError(hSerial, NULL, &stat))
{
    printf("%u bytes in outbound queue\n", (unsigned int) stat.cbOutQue);
}
and see whether you're closing the handle before it's done sending.

Related

Windows named pipe dropping messages

I am trying to write a named pipe reader to consume events from another application that provides named pipe streaming. I've run into a problem where I do not receive all events from the named pipe server when it sends batches of events very quickly together.
To simplify the issue and isolate it from the other activity that is being done by the server application, I decided to cut it out and write a simple c++ console app that writes to a named pipe and have my code listen to that.
I created the test application to send batches of 10 messages with a three-second wait between batches. The server indicates success for every write operation, but in the end my named pipe reader consistently receives only two messages.
I tried adding in delays between read operations using Sleep(), and that had no effect. However, when I tried adding delays between write operations using Sleep() on the server, it allows me to receive the messages in the client.
With batches of 10, a 1ms Sleep() will generally allow me to get all messages. If I send larger batches, I have to increase that Sleep() a bit.
I tried reading with a simple C# app as well, and that did not improve my ability to read all events without delays between write operations.
This is extremely confusing to me, because as I understand it the pipes are supposed to function as FIFO, so I have no idea where these messages are going.
Does anyone have some insight into why these writes are successful, but I am unable to get all of them in the reader? Or why adding a small delay between the writes allows me to read them all just fine?
I have included the writer and reader code below. I removed error handling/validation code to simplify the readability of it. For the server side piece, I got the pipe definition from the production app I'm trying to consume so I made sure the creation was the same.
Writer
#include <iostream>
#include <windows.h>
#include <stdio.h>
#include <conio.h>
#include <tchar.h>
using namespace std;
#define BUFFER_SIZE 65535
int main()
{
    HANDLE namedPipeHandle;
    BOOL fSuccess = FALSE;
    DWORD bytesRead;
    wstring pipeName = L"\\\\.\\pipe\\TestPipe";
    namedPipeHandle = CreateNamedPipe(pipeName.c_str(),
        PIPE_ACCESS_DUPLEX | FILE_FLAG_OVERLAPPED,
        PIPE_TYPE_MESSAGE | PIPE_READMODE_MESSAGE | PIPE_WAIT,
        PIPE_UNLIMITED_INSTANCES,
        BUFFER_SIZE,
        BUFFER_SIZE,
        NMPWAIT_USE_DEFAULT_WAIT,
        NULL);
    bool isConnected = ConnectNamedPipe(namedPipeHandle, NULL);
    DWORD bytesWritten = 0;
    string message = "TEST";
    int count = 0;
    while (true)
    {
        for (int i = 0; i < 10; i++)
        {
            fSuccess = WriteFile(namedPipeHandle, message.c_str(), message.size(), &bytesWritten, NULL);
            if (fSuccess)
            {
                count++;
            }
            //Sleep(1); // Adding the Sleep(1) allows me to receive each batch of 10 in the client.
        }
        cout << "Sent " << count << " total messages" << endl;
        Sleep(3000);
    }
}
Reader
#include <iostream>
#include <windows.h>
#include <stdio.h>
#include <conio.h>
#include <tchar.h>
using namespace std;
#define BUFFER_SIZE 65535
int main()
{
    HANDLE hPipe;
    char buffer[BUFFER_SIZE];
    DWORD bytesRead;
    wstring pipeName = L"\\\\.\\pipe\\TestPipe";
    hPipe = CreateFile(pipeName.c_str(), GENERIC_READ, 0, NULL, OPEN_EXISTING, 0, NULL);
    int receivedCount = 0;
    int failedCount = 0;
    do
    {
        bool fSuccess;
        fSuccess = ReadFile(hPipe, buffer, BUFFER_SIZE, &bytesRead, NULL);
        if (fSuccess)
        {
            receivedCount++;
        }
        else
        {
            failedCount++;
        }
        cout << "Total Events Received: " << receivedCount << endl;
        cout << "Total Events Failed: " << failedCount << endl;
    } while (true);
    CloseHandle(hPipe);
    return 0;
}
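One detail that stands out in the reader (an observation beyond what the thread states): the server creates a message-type pipe, but a client handle obtained with CreateFile starts out in byte read mode, so a single ReadFile can return several coalesced "TEST" messages at once, and counting successful reads then undercounts messages. A minimal sketch of switching the client handle into message read mode, placed right after the CreateFile call:
DWORD mode = PIPE_READMODE_MESSAGE;
if (!SetNamedPipeHandleState(hPipe, &mode, NULL, NULL))
{
    cout << "SetNamedPipeHandleState failed: " << GetLastError() << endl;
}
// From here on, each successful ReadFile returns exactly one message,
// and bytesRead is the size of that single message.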

WIN32 Garbage from Reading COM Port

I am attempting to read a message that was sent on one COM port and received on another. The two ports are connected via two USB to Serial converters. When I attempt to read the transmitted message I get this:
Tx Baud rate: 9600
Rx Baud rate: 9600
Attempting to read...
Hello, is ╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠
╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠
╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠
╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠
╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠
╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠
╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠
╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠
╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠
╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠
╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠
╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠
╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠╠á☼
Done...
Press any key to continue . . .
The message should read "Hello, is there anybody out there!?"
Here is the code I have written:
#include <stdio.h>
#include <stdlib.h>
#include <iostream>
#include <string>
#include <Windows.h>
typedef struct COMDevice {
    HANDLE deviceHandle;
    DWORD actualBytesReadOrWritten;
    int deviceStatus;
    std::string message;
    DCB controlSettings;
} COMDevice;

int main(int argc, char *argv[]) {
    // create new devices
    COMDevice *comWriter = new COMDevice;
    COMDevice *comReader = new COMDevice;
    // setup
    comWriter->deviceHandle = NULL;
    comWriter->actualBytesReadOrWritten = 0;
    comWriter->deviceStatus = 0;
    comWriter->message = "Hello, is there anybody out there!?";
    comReader->deviceHandle = NULL;
    comReader->actualBytesReadOrWritten = 0;
    comReader->deviceStatus = 0;
    comReader->message = "";
    // open COM5 for writing
    comWriter->deviceHandle = CreateFile(TEXT("COM5"), GENERIC_WRITE, 0, 0, OPEN_ALWAYS, 0, 0);
    if (comWriter->deviceHandle == INVALID_HANDLE_VALUE) {
        std::cout << "Error occurred opening port for writing...\n";
        return (int)GetLastError();
    }
    // open COM4 for reading
    comReader->deviceHandle = CreateFile(TEXT("COM4"), GENERIC_READ, 0, 0, OPEN_ALWAYS, 0, 0);
    if (comReader->deviceHandle == INVALID_HANDLE_VALUE) {
        std::cout << "Error occurred opening port for reading...\n";
        return (int)GetLastError();
    }
    // check baud rates
    if (GetCommState(comWriter->deviceHandle, &comWriter->controlSettings) == 0 ||
        GetCommState(comReader->deviceHandle, &comReader->controlSettings) == 0) {
        std::cout << "Error occurred getting the comm state...\n";
        return (int)GetLastError();
    }
    else {
        std::cout << "Tx Baud rate: " << comWriter->controlSettings.BaudRate << std::endl;
        std::cout << "Rx Baud rate: " << comReader->controlSettings.BaudRate << std::endl;
    }
    // write message to serial port
    comWriter->deviceStatus = WriteFile(comWriter->deviceHandle, comWriter->message.c_str(),
        comWriter->message.length(), &comWriter->actualBytesReadOrWritten, NULL);
    if (comWriter->deviceStatus == FALSE) {
        std::cout << "Error occurred writing to port...\n";
        return (int)GetLastError();
    }
    // wait a few
    int i = 0, count = 4000;
    while (i < count) { i++; }
    // read
    std::cout << "Attempting to read...\n";
    char buffer[1024];
    comReader->deviceStatus = ReadFile(comReader->deviceHandle, buffer, 1023,
        &comReader->actualBytesReadOrWritten, NULL);
    if (comReader->deviceStatus == FALSE) {
        return (int)GetLastError();
    }
    std::cout << buffer << std::endl;
    // close handles
    (void)FlushFileBuffers(comReader->deviceHandle);
    (void)CloseHandle(comWriter->deviceHandle);
    (void)CloseHandle(comReader->deviceHandle);
    // clean up...
    delete comWriter;
    delete comReader;
    std::cout << "Done...\n";
    return 0;
}
I also use the DCB structure to check the baud rates at both ends...they match. Is there something else I may be missing?
When you read from the serial port with
ReadFile(comReader->deviceHandle, buffer, 1023,
&comReader->actualBytesReadOrWritten, NULL);
the actual number of bytes read is stored in comReader->actualBytesReadOrWritten (the 4th parameter). But you are not using it for printing. The end result is that you read a few bytes and then try to print them, but since they are not NUL-terminated, you print the text plus a lot of garbage, until a NUL character happens to be found (or the program crashes).
The easy solution is to put a NUL character just after the ReadFile:
buffer[comReader->actualBytesReadOrWritten] = '\0';
But then there is still the problem that you have not received all the bytes yet. There are a few ways to ensure that all the data has been read: retry, wait for a while, retry again... A sketch of that approach is below.
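For example, a minimal retry loop (assuming the handle and buffer from the question, and that the reader knows the expected length, which is true in this loopback test):
// Keep reading until the whole message has arrived (or a read fails).
DWORD total = 0, got = 0;
const DWORD expected = comWriter->message.length(); // known only in this test setup
while (total < expected) {
    if (!ReadFile(comReader->deviceHandle, buffer + total,
                  expected - total, &got, NULL)) {
        break; // read error; inspect GetLastError()
    }
    total += got; // got may be 0 on a timeout; loop and retry
}
buffer[total] = '\0'; // NUL-terminate what was actually received
std::cout << buffer << std::endl;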
Hint
The character '╠' is byte 0xCC in the old OEM codepage (it renders as 'Ì' in the ANSI Western codepage), and 0xCC is the value VC++ uses to fill uninitialized stack space in debug builds. A long run of these characters therefore strongly suggests an uninitialized local buffer.

C++ program freezes during COM port read

I am trying to write a simple program to send single characters to a program via a COM port and read the answers I get back. I think I have a working program where I can at least send commands via the COM port, but when the ReadFile function begins, it freezes. I have the comm timeout set to 100 ms, so I don't think it is locking the port, but I may be wrong. I am not getting any errors, and no warnings when I compile. I am very new to C++ (I normally work with Python), so please be as clear as possible with your answers.
// comtest.cpp : Defines the entry point for the console application.
//
#include "stdafx.h"
#include <iostream>
#include <string>
#include <sstream>
#include <dos.h>
#include <stdio.h>
#include <conio.h>
#include <string.h>
#include <windows.h>
int main(int argc, char **argv)
{
    std::cout << "TOP! \n";
    char buffer[1];
    HANDLE file;
    COMMTIMEOUTS timeouts;
    DWORD read, written;
    DCB port;
    char init[] = ""; // e.g., "ATZ" to completely reset a modem.
    // open the comm port.
    file = CreateFile(L"COM1",
        GENERIC_READ | GENERIC_WRITE,
        0,
        NULL,
        OPEN_EXISTING,
        0,
        NULL);
    std::cout << "file made \n";
    // get the current DCB, and adjust a few bits to our liking.
    memset(&port, 0, sizeof(port));
    port.DCBlength = sizeof(port);
    // set short timeouts on the comm port.
    timeouts.ReadIntervalTimeout = 100;
    timeouts.ReadTotalTimeoutMultiplier = 1;
    timeouts.ReadTotalTimeoutConstant = 100;
    timeouts.WriteTotalTimeoutMultiplier = 1;
    timeouts.WriteTotalTimeoutConstant = 100;
    int N = 10;
    while (N > 1)
    {
        std::cout << "i'm in the loop!" << N << " loops left \n";
        char command [1];
        char * commandbuff;
        std::cin >> command;
        commandbuff = &command[1];
        WriteFile(file, commandbuff, sizeof(commandbuff), &written, NULL);
        Sleep(1000);
        std::cout << "I just slept \n";
        ReadFile(file, buffer, sizeof(buffer), &read, NULL);
        N--;
    }
    // close up and go home.
    CloseHandle(file);
    return 0;
}
Your code doesn't appear to actually call SetCommTimeouts, so the timeouts you have defined would have no way to be applied.
When receiving data from a COM port, don't start reading unless you have first sent a command or something else that gets a response. It is generally preferable to read just one byte at a time, but if you are sending Modbus/AT commands like I am and you know you are expecting 8 bytes back, then it's OK to use ReadFile to read 8 bytes. Most of the C++ COM port examples call SetCommState, SetCommTimeouts, SetCommMask and WaitCommEvent before reading that single byte; a sketch of that sequence follows.
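Here is a minimal, self-contained sketch of that setup order (the port name, baud rate and one-byte read are placeholders, and error handling is reduced to early returns):
#include <windows.h>
#include <stdio.h>
int main()
{
    // Open the port (COM1 is a placeholder).
    HANDLE h = CreateFile(L"COM1", GENERIC_READ | GENERIC_WRITE,
                          0, NULL, OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE) return 1;
    // Configure line settings via the DCB.
    DCB dcb = {0};
    dcb.DCBlength = sizeof(dcb);
    if (!GetCommState(h, &dcb)) return 1;
    dcb.BaudRate = CBR_9600;
    dcb.ByteSize = 8;
    dcb.StopBits = ONESTOPBIT;
    dcb.Parity = NOPARITY;
    if (!SetCommState(h, &dcb)) return 1;
    // Apply the timeouts: defining a COMMTIMEOUTS struct does nothing
    // until SetCommTimeouts is actually called with it.
    COMMTIMEOUTS to = {0};
    to.ReadIntervalTimeout = 100;
    to.ReadTotalTimeoutConstant = 100;
    to.ReadTotalTimeoutMultiplier = 1;
    if (!SetCommTimeouts(h, &to)) return 1;
    // Wait until at least one byte has arrived, then read it.
    if (!SetCommMask(h, EV_RXCHAR)) return 1;
    DWORD ev = 0, got = 0;
    unsigned char b = 0;
    if (WaitCommEvent(h, &ev, NULL) && (ev & EV_RXCHAR)) {
        if (ReadFile(h, &b, 1, &got, NULL) && got == 1)
            printf("received: 0x%02X\n", b);
    }
    CloseHandle(h);
    return 0;
}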
Mine had an "&" on the second parameter of ReadFile (MS Visual C++, though):
Status = ReadFile(fileusb, &ReadData, sizeof(ReadData), &NoBytesRead, NULL);

"Connection was broken" error with UDT (UDP-based data transfer protocol)

I am programming a real-time game in which I need reliable UDP, so I've chosen to work with UDT (UDP-based data transfer protocol - http://sourceforge.net/projects/udt/).
The clients (on browsers) send real-time messages to my server via CGI scripts. The problem is that there are some messages that are being lost, and I don't know why because the server says that it sent all the messages successfully to the corresponding clients, but sometimes the client doesn't receive the message.
In my debug file, I've found that when a message is not received by the client, its script says:
error in recv();
recv: Connection was broken.
I would like to get some help on how the server shall know if the client got its message; should I send a NACK or something from the client side? I thought that UDT should do that for me. Can someone clarify this situation?
The relevant sections of the communication parts of my code are below, with some comments:
server's relevant code:
//...
void send_msg_in(player cur, char* xml){
    /* this function stores the current message, xml, in a queue if xml!=NULL,
       and sends the 1st message of the queue to the client */
    /* it is called when the player connects, with xml=NULL, to get the 1st message of the queue,
       or with xml!=NULL when a new message arrives: in this case the message is stored in the queue,
       and will then be sent at the appropriate time, i.e. the messages are ordered. */
    char* msg_ptr=NULL;
    if (xml!=NULL){ // add the message to a queue (FIFO), the cur.xml_msgs
        msg_ptr=(char*) calloc(strlen(xml)+1, sizeof(char));
        strcpy(msg_ptr, xml);
        (*(cur.xml_msgs)).push(msg_ptr);
    }
    // get the 1st message of the queue
    if (!(*(cur.xml_msgs)).empty()){
        xml=(*(cur.xml_msgs)).front();
    }
    if (cur.get_udt_socket_in()!=NULL){
        UDTSOCKET cur_udt = *(cur.get_udt_socket_in());
        // cout << "send_msg_in(), cur_udt: " << cur_udt << endl;
        // send the "xml", i.e. the 1st message of the queue...
        if (UDT::ERROR == UDT::send(cur_udt, xml, strlen(xml)+1, 0)){
            UDT::close(cur_udt);
            cur.set_udt_socket_in(NULL);
        }
        else{ // if no error, this else is reached
            cout << "TO client:\n" << xml << "\n"; /* on success, the server prints the message that was sent */
            /*  / \
               /_!_\  the problem is that the messages that are lost don't appear
                      on the client side, but they appear here on the server! */
            if (((string) xml).find("<ack.>")==string::npos){
                UDT::close(cur_udt);
                cur.set_udt_socket_in(NULL); // close the socket
            }
            (*(cur.xml_msgs)).pop();
        }
    }
}
//...
client's relevant code:
//...
#define MSGBUFSIZE 1024
char msgbuf[MSGBUFSIZE];
UDTSOCKET client;
ofstream myfile;
//...
main(int argc, char *argv[]){
    //...
    // connect to the server, implicit bind
    if (UDT::ERROR == UDT::connect(client, (sockaddr*)&serv_addr, sizeof(serv_addr))){
        cout << "error in connect();" << endl;
        return 0;
    }
    myfile.open("./log.txt", ios::app);
    send(xml);
    char* cur_xml;
    do{
        cur_xml = receive(); // wait for an ACK or a new message...
        myfile << cur_xml << endl << endl; /*  / \
                                              /_!_\  the lost messages don't appear on the website,
                                                     nor in this log file. */
    } while (((string) cur_xml).find("<ack.>")!=string::npos);
    cout << cur_xml << endl;
    myfile.close();
    UDT::close(client);
    return 0;
}
char* receive(){
    if (UDT::ERROR == UDT::recv(client, msgbuf, MSGBUFSIZE, 0)){
        /*  / \
           /_!_\  when a message is not well received,
                  this code is usually reached and an error is printed. */
        cout << "error in recv();" << endl;
        myfile << "error in recv();" << endl;
        myfile << "recv: " << UDT::getlasterror().getErrorMessage() << endl << endl;
        return 0;
    }
    return msgbuf;
}
void* send(string xml){
    if (UDT::ERROR == UDT::send(client, xml.c_str(), strlen(xml.c_str())+1, 0)){
        cout << "error in send();" << endl;
        myfile << "error in send();" << endl;
        myfile << "send: " << UDT::getlasterror().getErrorMessage() << endl << endl;
        return 0;
    }
}
Thank you for any help!
PS. I tried to increase the linger time on close(), after finding the link http://udt.sourceforge.net/udt4/doc/opt.htm, adding the following to the server's code:
struct linger l;
l.l_onoff = 1;
l.l_linger = ...; //a huge value in seconds...
UDT::setsockopt(*udt_socket_ptr, 0, UDT_LINGER, &l, sizeof(l));
but the problem is still the same...
PPS. The other parts of the communication on the server side are as follows (note: it seems to me that they are not so relevant):
main(int argc, char *argv[]){
    char msgbuf[MSGBUFSIZE];
    UDTSOCKET serv = UDT::socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in my_addr;
    my_addr.sin_family = AF_INET;
    my_addr.sin_port = htons(PORT);
    my_addr.sin_addr.s_addr = INADDR_ANY;
    memset(&(my_addr.sin_zero), '\0', sizeof(my_addr.sin_zero));
    if (UDT::ERROR == UDT::bind(serv, (sockaddr*)&my_addr, sizeof(my_addr))){
        cout << "error in bind();";
        return 0;
    }
    UDT::listen(serv, 1);
    int namelen;
    sockaddr_in their_addr;
    while (true){
        UDTSOCKET recver = UDT::accept(serv, (sockaddr*)&their_addr, &namelen);
        if (UDT::ERROR == UDT::recv(recver, msgbuf, MSGBUFSIZE, 0)){
            // this recv() is called only once for each accept(): because the clients call CGI scripts
            // via a browser, they need to call a new CGI script with a new UDT socket for each request
            // (this is in agreement with the clients' code presented before).
            cout << "error in recv();" << endl;
        }
        char* player_xml = (char*) &msgbuf;
        cur_result = process_request((char*) &msgbuf, &recver, verbose); // ACK
    }
}
struct result process_request(char* xml, UDTSOCKET* udt_socket_ptr, bool verbose){
    // parse the XML...
    //...
    player* cur_ptr = get_player(me); // searches a vector of player, using the string "me" from the XML parsing
    UDTSOCKET* udt_ptr = (UDTSOCKET*) calloc(1, sizeof(UDTSOCKET));
    memcpy(udt_ptr, udt_socket_ptr, sizeof(UDTSOCKET));
    if (cur_ptr==NULL){
        // register the player:
        player* this_player = (player*) calloc(1, sizeof(player));
        //...
        }
    }
    else if (strcmp(request_type.c_str(), "info_waitformsg")==0){
        if (udt_ptr!=NULL){
            cur_ptr->set_udt_socket_in(udt_ptr);
            if (!(*(cur_ptr->xml_msgs)).empty()){
                send_msg_in(*cur_ptr, NULL, true);
            }
        }
    }
    else{ // messages that get an instant response from the server
        if (udt_ptr!=NULL){
            cur_ptr->set_udt_socket_out(udt_ptr);
        }
        if (strcmp(request_type.c_str(), "info_chat")==0){
            info_chat cur_info;
            to_object(&cur_info, me, request_type, msg_ptr); // convert the XML string values to a struct
            process_chat_msg(cur_info, xml);
        }
        /* else if (...){ // other types of messages...
        }*/
    }
}
void process_chat_msg(info_chat cur_info, char* xml_in){
    player* player_ptr=get_player(cur_info.me);
    if (player_ptr){
        int i=search_in_matches(matches, cur_info.match_ID);
        if (i>=0){
            match* cur_match=matches[i];
            vector<player*> players_in = cur_match->followers;
            int n=players_in.size();
            for (int i=0; i<n; i++){
                if (players_in[i]!=msg_owner){
                    send_msg_in(*(players_in[i]), xml, flag);
                }
            }
        }
    }
}
Looking at the UDT source code at http://sourceforge.net/p/udt/git/ci/master/tree/udt4/src/core.cpp, the error message "Connection was broken" is produced when either of the Boolean flags m_bBroken or m_bClosing is true and there is no data in the receive buffer.
Those flags are set in just a few cases:
In sections of code marked "should not happen; attack or bug" (unlikely)
In deliberate close or shutdown actions (don't see this happening in your code)
In expiration of a timer that checks for peer activity (the likely culprit)
In that source file at line 2593 it says:
// Connection is broken.
// UDT does not signal any information about this instead of to stop quietly.
// Application will detect this when it calls any UDT methods next time.
//
m_bClosing = true;
m_bBroken = true;
// ...[code omitted]...
// app can call any UDT API to learn the connection_broken error
Looking at the send() call, I don't see anywhere that it waits for an ACK or NAK from the peer before returning, so I don't think a successful return from send() on the server side is indicative of successful receipt of the message by the client.
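If the application needs per-message delivery confirmation on top of UDT, one option is an application-level acknowledgement (a sketch only, borrowing the <ack.> tag from the question's own protocol; send_with_ack is a hypothetical helper, and error handling is minimal):
// Server-side send that waits for the client to echo an <ack.> reply.
// Assumes cur_udt is a connected UDTSOCKET and msg is a NUL-terminated string.
bool send_with_ack(UDTSOCKET cur_udt, const char* msg)
{
    if (UDT::ERROR == UDT::send(cur_udt, msg, strlen(msg) + 1, 0))
        return false; // send failed outright
    char reply[64] = {0};
    if (UDT::ERROR == UDT::recv(cur_udt, reply, sizeof(reply), 0))
        return false; // connection broke before any acknowledgement arrived
    // Treat the message as delivered only once the client echoes <ack.>.
    return string(reply).find("<ack.>") != string::npos;
}
The server would then keep the message at the front of its queue and pop it only after send_with_ack() returns true.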
You didn't show the code on the server side that binds to the socket and listens for responses from the client; if the problem is there then the server might be happily sending messages and never listening to the client that is trying to respond.
UDP is not a guaranteed-transmission protocol. A host will send a message, but if the recipient does not receive it, or if it is not received properly, no error will be raised. Therefore, it is commonly used in applications that require speed over perfect delivery, such as games. TCP does guarantee delivery, because it requires that a connection be set up first, and each message is acknowledged by the client.
I would encourage you to think about whether you actually need guaranteed receipt of that data, and, if you do, consider using TCP.

TCP/Ip network communication in c++

I am trying to write a threaded function that sends system information via TCP/IP over the local network to another computer. I have been using sockets to achieve this, and it has worked out quite all right thus far. But I am now at a point where it usually works, yet around 30% of the time I get error messages telling me that the socket cannot be opened. I use the activeSocket library for the sockets.
#include "tbb/tick_count.h"
#include "ActiveSocket.h"
using namespace std;
CActiveSocket socket;
extern int hardwareStatus;
int establishTCP() {
    char time[11];
    int communicationFailed = 0;
    memset(&time, 0, 11);
    socket.Initialize();
    socket.SetConnectTimeout(0, 20);
    socket.SetSendTimeout(0, 20);
    return communicationFailed;
}
int monitor() {
    cout << "Monitor: init continuous monitoring" << endl;
    int communicationFailed;
    tbb::tick_count monitorCounter = tbb::tick_count::now();
    while (!closeProgram) {
        tbb::tick_count currentTick = tbb::tick_count::now();
        tbb::tick_count::interval_t interval;
        interval = currentTick - monitorCounter;
        if (interval.seconds() > 2) {
            monitorCounter = tbb::tick_count::now();
            communicationFailed = 1;
            char buffer[256];
            sprintf(buffer, "%d;", hardwareStatus);
            establishTCP();
            char *charip = new char[monitoringIP.size() + 1];
            charip[monitoringIP.size()] = 0;
            memcpy(charip, monitoringIP.c_str(), monitoringIP.size());
            const uint8* realip = (const uint8 *) charip;
            int monitorCount = 0;
            cout << "Monitor: " << buffer << endl;
            while (communicationFailed == 1 && monitorCount < 2) {
                monitorCount++;
                if (socket.Open(realip, 2417)) {
                    if (socket.Send((const uint8 *) buffer, strlen(buffer))) {
                        cout << "Monitor: Succeeded sending data" << endl;
                        communicationFailed = 0;
                        socket.Close();
                    } else {
                        socket.Close();
                        communicationFailed = 1;
                        cout << "Monitor: FAILED TO SEND DATA" << endl;
                    }
                } else {
                    socket.Close();
                    communicationFailed = 1;
                    cout << "Monitor: FAILED TO OPEN SOCKET FOR DATA" << endl;
                }
            }
            if (monitorCount == 2) cout << "Monitor: UNABLE TO SEND DATA" << endl;
        }
    }
    return communicationFailed;
}
I think I am doing something wrong with these functions and that the problem is not on the other side of the line where this data is received. Can anyone see any obvious mistakes in this code that could cause the failure? I keep getting my own cout message "Monitor: FAILED TO OPEN SOCKET FOR DATA"
EDIT: With telnet everything works fine, 100% of the time
You can use netstat to check that the server is listening on the port and that connections are being established. Snoop is another good tool in your armoury for finding out what is going wrong. Another possibility is to use telnet to see if the client can connect to that IP address and port. As for the code, I will take a look at it later to see if something has gone awry.
socket is a global variable. It might be re-used concurrently between two threads, or sequentially inside one thread. In fact, the while (!closeProgram) loop indicates that you intend to use it sequentially.
Some documentation for CActiveSocket::Open reads: "Connection-based protocol sockets (CSocket::SocketTypeTcp) may successfully call Open() only once..."
Perhaps your program fails when you call Open() twice on the same object; a sketch of a workaround is below.
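A minimal sketch of that workaround (an assumption based on the documentation quoted above, reusing only the calls already shown in the question): construct a fresh CActiveSocket for every attempt instead of reusing the global one.
// One CActiveSocket per attempt, so Open() is never called twice on one object.
// realip, buffer and port 2417 are taken from the question's code.
bool trySend(const uint8* realip, const char* buffer)
{
    CActiveSocket s; // fresh socket object for this attempt
    s.Initialize();
    s.SetConnectTimeout(0, 20);
    s.SetSendTimeout(0, 20);
    bool ok = false;
    if (s.Open(realip, 2417)) {
        ok = s.Send((const uint8*) buffer, strlen(buffer)) != 0;
    }
    s.Close(); // safe whether or not Open() succeeded
    return ok;
}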
I eventually found out the problem with my code. As the connection was unstable and working only about 70% of the time, it seemed to be a timeout issue. I removed the two timeout settings:
socket.SetConnectTimeout(0, 20);
socket.SetSendTimeout(0, 20);
Now it works perfectly fine, thanks for the troubleshooting tips though!