Boost asio serial port initially sending bad data - c++

I am attempting to use Boost Asio for serial communication. I am currently working in Windows, but will eventually be moving the code into Linux. Whenever I restart my computer, data sent from the program is not what it should be (for example, I send a null followed by a carriage return and get "00111111 10000011" in binary), and it is not consistent (multiple nulls yield different binary).
However, as soon as I use any other program to send any data to the serial port and then run my program again, it works perfectly. I think I must be missing something in the initialization of the port, but my research has not turned anything up.
Here is how I am opening the port:
// An io_service for the serial port to run on
boost::asio::io_service *io;
// The serial port itself
boost::shared_ptr<boost::asio::serial_port> port;

// Constructor functions
void Defaults() {
    io = new boost::asio::io_service();
    // Set default commands
    command.prefix = 170;
    command.address = 3;
    command.xDot[0] = 128;
    command.xDot[1] = 128;
    command.xDot[2] = 128;
    command.throtle = 0;
    command.button8 = 0;
    command.button16 = 0;
    command.checkSum = 131;
}

void Defaults(char *portName, int baud) {
    Defaults();
    // Set up the serial port
    port.reset(new boost::asio::serial_port(*io, portName));
    port->set_option(boost::asio::serial_port_base::baud_rate(baud));
    // This is for testing
    printf("portTest: %i\n", (int)port->is_open());
    port->write_some(boost::asio::buffer((void*)"\0", 1));
    boost::asio::write(*port, boost::asio::buffer((void*)"\0", 1));
    boost::asio::write(*port, boost::asio::buffer((void*)"\r", 1));
    boost::asio::write(*port, boost::asio::buffer((void*)"\r", 1));
    Sleep(2000);
}
Edit: In an attempt to remove unrelated code I accidentally deleted the line where I set the baud rate; I have added it back. Also, I am checking the output with a null modem and Docklight. Aside from the baud rate, I am using all of the default serial settings specified for a Boost serial port (I have also tried setting them explicitly, with no effect).

You haven't said how you're checking what's being sent, but it's probably a baud rate mismatch between the two ends.
It looks like you're missing this:
port->set_option( boost::asio::serial_port_base::baud_rate( baud ) );
Number of data bits, parity, and start and stop bits will also need configuring if they're different to the default values.
If you still can't sort it out, put an oscilloscope on the output and compare the waveforms at the sender and the receiver.

This is the top search result when looking for this problem, so I thought I'd give what I believe is the correct answer:
Boost Asio is bugged as of this writing and does not set default values for ports on the Windows platform. Instead, it grabs the current values on the port and then bakes them right back in. After a fresh reboot the port hasn't been used yet, so its Windows-default values are likely not useful for communication (the byte size typically defaults to 9, for example). Once you use the port with a program or library that sets the values correctly, the values Asio picks up afterward are correct.
To solve this until Asio incorporates a fix, just set everything on the port explicitly, i.e.:
using boost::asio::serial_port;
using boost::asio::serial_port_base;

void setup_port(serial_port &serial, unsigned int baud)
{
    serial.set_option(serial_port_base::baud_rate(baud));
    serial.set_option(serial_port_base::character_size(8));
    serial.set_option(serial_port_base::flow_control(serial_port_base::flow_control::none));
    serial.set_option(serial_port_base::parity(serial_port_base::parity::none));
    serial.set_option(serial_port_base::stop_bits(serial_port_base::stop_bits::one));
}
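A minimal usage sketch, assuming a hypothetical device name ("COM3" here) and 9600 baud:
boost::asio::io_service io;
boost::asio::serial_port serial(io, "COM3");    // placeholder device name
setup_port(serial, 9600);                       // force 9600-8-N-1, no flow control
boost::asio::write(serial, boost::asio::buffer("\r", 1));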

Related

Communicating with a serial port in a bit bang state

My question is largely a clarification question. I am trying to communicate through a serial port in a bit-bang sort of sense, in that I want the serial port to output exactly what I write to it.
If I write 0-0-0-1, I would want the output on the wire to be 0-0-0-1.
The reason I ask is that it seems like, when using the DCB structure to configure the port, I have to set all sorts of settings like stop bits and parity.
Is there a way to configure this port as 'raw' as possible? I don't want anything being sent that I don't send, if that makes sense.
// Open the serial port
hComm = CreateFile("\\\\.\\COM1",    // Name of the port to be opened
                   GENERIC_WRITE,    // Write access
                   0,                // No sharing, ports can't be shared
                   NULL,             // No security
                   OPEN_EXISTING,    // Open existing port only
                   0,                // Non-overlapped I/O
                   NULL);            // NULL for comm devices

DCB dcbSerialParams = { 0 };         // Initializing DCB structure
dcbSerialParams.DCBlength = sizeof(dcbSerialParams);

// Retrieves the current settings
Status = GetCommState(hComm, &dcbSerialParams);
if (!Status) {
    // Error in GetCommState!
}

dcbSerialParams.BaudRate = 10000;       // Setting BaudRate = 10000
dcbSerialParams.ByteSize = 8;           // Setting ByteSize = 8
dcbSerialParams.StopBits = ONESTOPBIT;  // Setting StopBits = 1
dcbSerialParams.Parity   = ODDPARITY;
I wasn't quite sure from the question what you are having problems with.
Are you worried about the bit order (MSB/LSB first) or about the endianness of 4-byte integers? Or are you worried that data will be added, deleted, or changed by the port acting as a terminal?
Please add the exact details.
Bit order is fixed by the hardware standard, so you don't have to worry about it.
The endianness concern seems to be a misunderstanding.
Serial port communication is basically byte-by-byte communication.
You put the data into a byte array and send it, and the received data is a byte array.
Giving those bytes meaning, such as 4-byte integers or single/double precision floating point numbers, is the application program's job.
Looking at your source code, it seems that you are calling the Win32 API directly.
Communications Functions
The Win32 API and device drivers only have a raw mode.
Unlike Unix tty devices, there are no features such as character completion/conversion, line buffering that waits for a newline, or control-code equivalents (unless you explicitly enable XON/XOFF flow control).
On Windows, those features are not part of the API; they are provided by libraries/frameworks that run on top of it.
Basically, each setting, such as the DCB and the timeouts, must be made according to the specifications of the device you are communicating with. Even if you set "all the settings", the number of items fits on one page.
DCB structure (winbase.h)
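As a concrete illustration of the above, a minimal untested sketch of a "raw" Win32 configuration might look like this; the port name, baud rate, timeout policy, and DTR/RTS choices are placeholder assumptions to adapt to your device's specification:
#include <windows.h>
#include <stdio.h>

int main(void)
{
    // Open the port for reading and writing, non-overlapped.
    HANDLE hComm = CreateFileA("\\\\.\\COM1", GENERIC_READ | GENERIC_WRITE,
                               0, NULL, OPEN_EXISTING, 0, NULL);
    if (hComm == INVALID_HANDLE_VALUE) {
        printf("CreateFile failed\n");
        return 1;
    }

    // Start from the current state, then set everything explicitly.
    DCB dcb = { 0 };
    dcb.DCBlength = sizeof(dcb);
    GetCommState(hComm, &dcb);
    dcb.BaudRate = CBR_9600;            // placeholder rate
    dcb.ByteSize = 8;
    dcb.Parity   = NOPARITY;
    dcb.StopBits = ONESTOPBIT;
    dcb.fOutX = FALSE;                  // no software (XON/XOFF) flow control
    dcb.fInX  = FALSE;
    dcb.fOutxCtsFlow = FALSE;           // no hardware flow control
    dcb.fRtsControl  = RTS_CONTROL_ENABLE;
    dcb.fDtrControl  = DTR_CONTROL_ENABLE;
    if (!SetCommState(hComm, &dcb)) {
        printf("SetCommState failed\n");
    }

    // Make reads return immediately with whatever is in the buffer.
    COMMTIMEOUTS timeouts = { 0 };
    timeouts.ReadIntervalTimeout = MAXDWORD;
    SetCommTimeouts(hComm, &timeouts);

    // The bytes go out on the wire exactly as written (framed as 8-N-1 by the UART).
    const unsigned char data[] = { 0x00, 0x01 };
    DWORD written = 0;
    WriteFile(hComm, data, sizeof(data), &written, NULL);

    CloseHandle(hComm);
    return 0;
}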

input delay with PortAudio callback and ASIO sdk

I'm trying to get the input from my guitar to be played through my computer using the PortAudio library and the ASIO SDK.
I have been following some of the tutorials on the official website to get the basics set up. Currently I have it working so that PortAudio is listening to the right input and output device, and I have the callback set up to just copy the input to the output and do nothing else with it, like this:
static int paTestCallback(const void *inputBuffer, void *outputBuffer,
                          unsigned long framesPerBuffer,
                          const PaStreamCallbackTimeInfo *timeInfo,
                          PaStreamCallbackFlags statusFlags, void *userData)
{
    float *out = (float*)outputBuffer;
    float *in  = (float*)inputBuffer;
    for (int i = 0; i < framesPerBuffer; i++)
    {
        *out++ = *in++; /* left */
        *out++ = *in++; /* right */
    }
    return 0;
}
This callback is setup by calling this:
PaError error = Pa_OpenDefaultStream(&stream, 2, 2, paFloat32, 44100, paFramesPerBufferUnspecified, paTestCallback, &data);
Pa_StartStream(stream);
Now, this does work, but there is a lot of delay (about 0.5 s) between when I strike a string on my guitar and when I hear it through the monitors.
Is there a way to solve this delay? Do I need to rewrite the callback method?
EDIT:
So, I got the delay to be a lot better by using this code instead of the basic Pa_OpenDefaultStream():
int defaultIn  = Pa_GetDefaultInputDevice();
int defaultOut = Pa_GetDefaultOutputDevice();

PaStreamParameters *inParam = new PaStreamParameters();
inParam->channelCount = 2;
inParam->device = defaultIn;
inParam->sampleFormat = paFloat32;
inParam->suggestedLatency = 0.05;

PaStreamParameters *outParam = new PaStreamParameters();
outParam->channelCount = 2;
outParam->device = defaultOut;
outParam->sampleFormat = paFloat32;
outParam->suggestedLatency = 0;

error = Pa_OpenStream(&stream, inParam, outParam, 44100,
                      paFramesPerBufferUnspecified, paNoFlag, paTestCallback, &data);
if (error != paNoError) {
    Logger::log("[PortAudioManager] Could not open default stream. Exiting function...");
    return;
}
Pa_StartStream(stream);
There is still a little bit of delay though, mostly noticeable when playing more than just a single note.
EDIT:
I figured out, with the help of Ross Bencina, that the Windows default input and output devices have nothing to do with the host API indices in PortAudio. I had been using MME all this time. I did the following to get the right index for the ASIO device:
int hostNr = Pa_GetHostApiCount();
std::vector<const PaHostApiInfo*> infoVertex;
for (int t = 0; t < hostNr; ++t) {
    infoVertex.push_back(Pa_GetHostApiInfo(t));
}
Then I just checked which one is the ASIO host API and set the suggestedLatency in both PaStreamParameters to 0; the delay is now gone and the sound is good (although it's mono for now).
You are on the right track using paFramesPerBufferUnspecified.
The ASIO latency behavior depends on the driver. There are two possibilities:
The ASIO driver lets the code (i.e. PortAudio) request a latency (possibly with some constraints). PortAudio finds the best match between a supported driver buffer size and the latency that you request.
The other possibility is that your audio interface does not provide programmatic control over latency settings. Instead, latency is only selectable from the driver's ASIO control panel UI (and the driver will force a fixed buffer size on PortAudio). In this case, you should investigate the driver control panel UI to set the lowest workable latency.
In either case, your approach with Pa_OpenStream is close to optimal, but you should request zero latency for both input and output (in your edit you're requesting 50ms input latency, zero output latency). The end result will be that PortAudio selects the lowest available ASIO buffer size. If this turns out to be unstable (audio glitches) then you'll need to increase the requested latency.
include/pa_asio.h exposes a host-API-specific interface for querying the ASIO buffer sizes allowed by the driver (be aware that this can change if you change settings in the control panel). It also provides a function to display the driver's control panel UI.
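As a sketch of what that looks like, PaAsio_GetAvailableBufferSizes and PaAsio_ShowControlPanel are the helpers declared in include/pa_asio.h; the device index passed in is assumed to be a valid ASIO device:
#include <stdio.h>
#include "portaudio.h"
#include "pa_asio.h"

// Query the buffer sizes the ASIO driver allows, then show its control panel.
void inspectAsioDevice(PaDeviceIndex device)
{
    long minFrames, maxFrames, preferredFrames, granularity;
    if (PaAsio_GetAvailableBufferSizes(device, &minFrames, &maxFrames,
                                       &preferredFrames, &granularity) == paNoError) {
        printf("ASIO buffer sizes: min=%ld max=%ld preferred=%ld granularity=%ld\n",
               minFrames, maxFrames, preferredFrames, granularity);
    }
    // Opens the driver's control panel UI; pass a native window handle instead of NULL if available.
    PaAsio_ShowControlPanel(device, NULL);
}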
EDIT: Note that Pa_GetDefaultInputDevice() and Pa_GetDefaultOutputDevice() will only return ASIO devices if you built PortAudio for ASIO only. If you included any other more common APIs in the build (e.g. WMME or DirectSound) they will be given priority as the (lowest common denominator) default device. You could add a check that you are actually accessing the ASIO device:
assert(Pa_GetHostApiInfo(Pa_GetDeviceInfo(Pa_GetDefaultOutputDevice())->hostApi)->type == paASIO);
If PortAudio is compiled with support for multiple host APIs, to get the default ASIO device enumerate the host APIs using Pa_GetHostApiCount and Pa_GetHostApiInfo to find the ASIO host API, then pull the default ASIO device indices from the returned PaHostApiInfo struct, as in the sketch below.
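A minimal sketch of that enumeration (assuming Pa_Initialize() has already been called):
#include "portaudio.h"

// Walk the host API list and return the ASIO host API's default output device.
PaDeviceIndex findDefaultAsioOutput(void)
{
    PaHostApiIndex apiCount = Pa_GetHostApiCount();
    for (PaHostApiIndex i = 0; i < apiCount; ++i) {
        const PaHostApiInfo *info = Pa_GetHostApiInfo(i);
        if (info != NULL && info->type == paASIO) {
            return info->defaultOutputDevice;   // paNoDevice if ASIO has no devices
        }
    }
    return paNoDevice;   // ASIO host API not present in this build
}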

Can't read from serial device

I'm trying to write a library to read data from a serial device, a Mipex-02 gas sensor. Unfortunately, my code doesn't seem to open the serial connection properly, and I can't figure out why.
The full source code is available on github, specifically, here's the configuration of the serial connection:
MipexSensor::MipexSensor(string devpath) {
    if (!check_dev_path(devpath))
        throw "Invalid devpath";
    this->path = devpath;
    this->debugout_ = false;
    this->sensor.SetBaudRate(SerialStreamBuf::BAUD_9600);
    this->sensor.SetCharSize(SerialStreamBuf::CHAR_SIZE_8);
    this->sensor.SetNumOfStopBits(1);
    this->sensor.SetParity(SerialStreamBuf::PARITY_NONE);
    this->sensor.SetFlowControl(SerialStreamBuf::FLOW_CONTROL_NONE);
    this->last_concentration = this->last_um = this->last_ur = this->last_status = 0;
    cout << "Connecting to " << devpath << endl;
    this->sensor.Open(devpath);
}
I think the meanings of the enums here are obvious enough. The values are from the instruction manual:
UART characteristics:
exchange rate – 9600 baud,
8-bit message,
1 stop bit,
without check for parity
At first I was using interceptty to test it, and it worked perfectly fine. But when I try to connect to the device directly, I can't read anything. The RX LED flashes on the device, so clearly the program manages to send something, but, unlike with interceptty, the TX LED never flashes.
So I don't know whether it's sending data incorrectly or not sending all of it, and I can't even sniff the connection, since the problem only happens when interceptty isn't in the middle. Interceptty's command line is interceptty -s 'ispeed 9600 ospeed 9600 -parenb -cstopb -ixon cs8' -l /dev/ttyUSB0 (the -s options are passed to stty), which is in theory the same set of options as in the code.
Thanks in advance.

get SOCK_RAW frames with different rate [duplicate]

I'm writing code to send raw Ethernet frames between two Linux boxes. To test this I just want to get a simple client-send and server-receive.
I have the client correctly making packets (I can see them using a packet sniffer).
On the server side I initialize the socket like so:
fd = socket(PF_PACKET, SOCK_RAW, htons(MY_ETH_PROTOCOL));
where MY_ETH_PROTOCOL is a 2 byte constant I use as an ethertype so I don't hear extraneous network traffic.
When I bind this socket to my interface I must pass a protocol again in the sockaddr_ll struct:
socket_address.sll_protocol = htons(MY_ETH_PROTOCOL);
If I compile and run the code like this then it fails. My server does not see the packet. However if I change the code like so:
socket_address.sll_protocol = htons(ETH_P_ALL);
The server can then see the packet sent from the client (as well as many other packets), so I have to check each packet to see whether it matches MY_ETH_PROTOCOL.
But I don't want my server to hear traffic that isn't being sent on the specified protocol so this isn't a solution. How do I do this?
I have resolved the issue.
According to http://linuxreviews.org/dictionary/Ethernet/ referring to the 2 byte field following the MAC addresses:
"values of that field between 64 and 1522 indicated the use of the new 802.3 Ethernet format with a length field, while values of 1536 decimal (0600 hexadecimal) and greater indicated the use of the original DIX or Ethernet II frame format with an EtherType sub-protocol identifier."
so I have to make sure my ethertype is >= 0x0600.
According to http://standards.ieee.org/regauth/ethertype/eth.txt use of 0x88b5 and 0x88b6 is "available for public use for prototype and vendor-specific protocol development." So this is what I am going to use as an ethertype. I shouldn't need any further filtering as the kernel should make sure to only pick up ethernet frames with the right destination MAC address and using that protocol.
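Putting that together, a minimal untested sketch of the receive side might look like this (0x88B5 is the public-use EtherType mentioned above; the interface name and error handling are placeholders):
#include <arpa/inet.h>
#include <linux/if_packet.h>
#include <net/if.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

#define MY_ETH_PROTOCOL 0x88B5   /* >= 0x0600, available for prototype/vendor use */

int open_receiver(const char *ifname)
{
    /* Only frames carrying this EtherType are delivered to the socket. */
    int fd = socket(PF_PACKET, SOCK_RAW, htons(MY_ETH_PROTOCOL));
    if (fd < 0)
        return -1;

    struct sockaddr_ll sll;
    memset(&sll, 0, sizeof(sll));
    sll.sll_family   = AF_PACKET;
    sll.sll_protocol = htons(MY_ETH_PROTOCOL);
    sll.sll_ifindex  = if_nametoindex(ifname);   /* e.g. "eth0" */

    if (bind(fd, (struct sockaddr *)&sll, sizeof(sll)) < 0) {
        close(fd);
        return -1;
    }
    return fd;   /* recvfrom() now returns whole frames of that EtherType */
}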
I've worked around this problem in the past by using a packet filter.
Hand Waving (untested pseudocode)
struct sock_filter my_filter[] = {
    ...   /* BPF instructions go here */
};

int s = socket(PF_PACKET, SOCK_DGRAM, htons(protocol));

struct sock_fprog pf;
pf.filter = my_filter;
pf.len = sizeof(my_filter) / sizeof(my_filter[0]);
setsockopt(s, SOL_SOCKET, SO_ATTACH_FILTER, &pf, sizeof(pf));

struct sockaddr_ll sll;
memset(&sll, 0, sizeof(sll));
sll.sll_family = PF_PACKET;
sll.sll_protocol = htons(protocol);
sll.sll_ifindex = if_nametoindex("eth0");
bind(s, (struct sockaddr *)&sll, sizeof(sll));
Error checking and getting the packet filter right is left as an exercise for the reader...
Depending on your application, an alternative that may be easier to get working is libpcap.

Communication Arduino-C++ does not read from Arduino

I have the following code:
QSerialPort arduPort("COM5");
arduPort.setBaudRate(QSerialPort::Baud9600);
arduPort.setDataBits(QSerialPort::Data8);
arduPort.setParity(QSerialPort::NoParity);
arduPort.setStopBits(QSerialPort::OneStop);
arduPort.setFlowControl(QSerialPort::NoFlowControl);
arduPort.open(QSerialPort::ReadWrite);
cout<<arduPort.isReadable()<<endl;
cout<<arduPort.isWritable()<<endl;
arduPort.write("a");
QByteArray s=arduPort.readAll();
cout<<QString(s).toStdString()<<endl;
And the following code on the Arduino:
int inByte = 0;

void setup()
{
    Serial.begin(9600);
    while (!Serial) {;}
    int i = 0;
}

void loop()
{
    if (Serial.read() == 'a')
        Serial.write('b');
}
First I send an 'a' to the Arduino, and the Arduino should respond with 'b'. But when I read the port of the Arduino, I receive '' only.
Does anyone know why I receive '' instead of 'b'? Thanks for your time.
Update: See the bottom of this answer for the answer. TL;DR: You have to set the baud rate (and presumably all the other settings) after you open the port.
I believe this is a bug in the Windows implementation of QSerialPort. I haven't been able to narrow down the cause yet but I have the following symptoms:
Load the Arduino (Uno in my case; Leonardo may behave very differently) with the ASCII demo. Unplug and replug the Arduino. Note that the TX light doesn't come on.
Connect to it with Putty or the Arduino serial port monitor. This resets the Arduino and then prints the ASCII table. The TX light is on continuously as expected.
Unplug/replug the Arduino and this time connect to it with a QSerialPort program. This time, despite the port being opened OK, the TX light never comes on and readyRead() is never triggered. Also note that the Arduino is not reset, because by default QSerialPort does not change DTR. If you call QSerialPort::setDataTerminalReady(false);, pause for 10 ms, then set it true, it will reset the Arduino as expected, but it still doesn't transmit.
Note that if you have an Arduino program that transmits data continuously (the ASCII example stops), and you open the port with Putty so that it starts transmitting and then open it with QSerialPort without unplugging the cable, it will work! However, as soon as you unplug/replug the cable it stops working again.
This makes me suspect that Putty is setting some serial port option that is required by the Arduino and is reset when you replug the cable. QSerialPort obviously doesn't change this value.
Here are the settings used by Putty as far as I can tell:
dcb.fBinary = TRUE;
dcb.fDtrControl = DTR_CONTROL_ENABLE;
dcb.fDsrSensitivity = FALSE;
dcb.fTXContinueOnXoff = FALSE;
dcb.fOutX = FALSE;
dcb.fInX = FALSE;
dcb.fErrorChar = FALSE;
dcb.fNull = FALSE;
dcb.fRtsControl = RTS_CONTROL_ENABLE;
dcb.fAbortOnError = FALSE;
dcb.fOutxCtsFlow = FALSE;
dcb.fOutxDsrFlow = FALSE;
dcb.Parity = NOPARITY;
dcb.StopBits = ONESTOPBIT;
dcb.BaudRate = ...;
dcb.ByteSize = ...;
And by QSerialPort:
dcb.fBinary = TRUE;
dcb.fDtrControl = unchanged!
dcb.fDsrSensitivity = unchanged!
dcb.fTXContinueOnXoff = unchanged!
dcb.fOutX = FALSE;
dcb.fInX = FALSE;
dcb.fErrorChar = FALSE;
dcb.fNull = FALSE;
dcb.fRtsControl = RTS_CONTROL_DISABLE;
dcb.fAbortOnError = FALSE;
dcb.fOutxCtsFlow = FALSE;
dcb.fOutxDsrFlow = unchanged!
dcb.Parity = NOPARITY;
dcb.StopBits = ONESTOPBIT;
dcb.BaudRate = ...;
dcb.ByteSize = ...;
So I think it must be one of those unchanged values which makes the Arduino think that it isn't connected. From the DCB documentation I suspect fTxContinueOnXoff.
Ok I am going to write a little program to read these settings and see what changes.
Update 1
Ok I wrote my program and made the following discovery. The differences after running putty and just my Qt program were:
BaudRate: THIS WASN'T SET BY QT! It turns out you can only set the baud rate after you open the port. Otherwise it is left at the previous value, which is 0 when you first plug in the cable.
fDtrControl: Set to 1 by Putty, left at 0 by Qt.
fOutX and fInX: Both also set to 1 by Putty and left at 0 by Qt.
After moving all my set...() calls to after the open() it worked perfectly. I didn't have to fiddle with DtrControl or Out/InX (although I have also set DTR high manually). The reordered code is sketched below.
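For clarity, the fix is just reordering the snippet from the question so that open() comes first; a minimal sketch (with the DTR line added, as mentioned above):
QSerialPort arduPort("COM5");
arduPort.open(QSerialPort::ReadWrite);          // open first...
arduPort.setBaudRate(QSerialPort::Baud9600);    // ...then apply the settings
arduPort.setDataBits(QSerialPort::Data8);
arduPort.setParity(QSerialPort::NoParity);
arduPort.setStopBits(QSerialPort::OneStop);
arduPort.setFlowControl(QSerialPort::NoFlowControl);
arduPort.setDataTerminalReady(true);            // keep DTR asserted for the Arduino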
Update 2
While setting all the parameters I thought it would be a good idea to set the error policy to 'skip'. DON'T DO THIS! LEAVE IT ON IGNORE! Otherwise it messes everything up and adds weird delays to all your communications.
Setting the port values before opening is not quite possible before Qt 5.2. The reason is that the original design was a bit too low-level rather than properly object oriented. I had considered for a long time whether to change it, and in the end I decided to do so.
I have just submitted a change, now under code review, that will make your original concept work, too.
Here you can find the details:
Make it possible to set the port values before opening
The summary of the change can be read here:
Make it possible to set the port values before opening
This patch also changes the behavior of the open method. We do not use port
detection anymore for good. It has been a broken concept, and it is very
unlikely that anyone has ever relied on it. If anyone had done that, they would
be in trouble anyway, getting noisy warnings needlessly and all that.
The other option considered was to keep this behavior optionally as the
default, but that would complicate the API too much without much gain.
The default port settings are now also the sane 9600,8,N,1, and no flow control.
Also please note that the serial port is closed automatically in the open method
if any of the settings fails.
Please update your Qt version (to at least Qt 5.2) or backport the change yourself. Then it is possible to write this code, and actually it is even recommended:
QSerialPort arduPort("COM5");
arduPort.setBaudRate(QSerialPort::Baud9600);
arduPort.setDataBits(QSerialPort::Data8);
arduPort.setParity(QSerialPort::NoParity);
arduPort.setStopBits(QSerialPort::OneStop);
arduPort.setFlowControl(QSerialPort::NoFlowControl);
arduPort.open(QSerialPort::ReadWrite);
BaudRate: THIS WASN'T SET BY QT! It turns out you can only set the baud rate after you open the port. Otherwise it is left at the previous value, which is 0 when you first plug in the cable.
Yes, it is true. But it has been fixed and will be available in Qt 5.3
fDtrControl: Set to 1 by Putty, left at 0 by Qt.
No, Qt does not touch the DTR signal on opening. The signal is cleared only if it was set to DTR_CONTROL_HANDSHAKE, because QtSerialPort does not support DTR/DSR flow control. In any case, you can control DTR by means of QSerialPort::setDataTerminalReady(bool).
PS: I mean the current Qt 5.3 release.
fOutX and fInX: Both also set to 1 by Putty and left at 0 by Qt.
These flags are used only when you use QSerialPort::Software flow control (XON/XOFF). But you are using QSerialPort::NoFlowControl (as I can see from your code snippet), so that is fine. Please check that you use Putty with the "None" flow control setting too.
While setting all the parameters I thought it would be a good idea to
set the error policy to 'skip'.
Please use only the QSerialPort::Ignore policy (the default), because all the other policies are deprecated and will be removed in the future.
UPD:
dcb.fRtsControl = RTS_CONTROL_ENABLE;
Ah, it seems that your Arduino expects the RTS signal to be enabled by default. In that case you should use QSerialPort::setRequestToSend(bool), but that is possible only in QSerialPort::NoFlowControl mode.
That is, RTS will always be RTS_CONTROL_DISABLE or RTS_CONTROL_HANDSHAKE after opening the port (depending on your flow control setting, QSerialPort::setFlowControl()). A sketch of driving DTR/RTS manually follows.
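For illustration, here is a minimal untested sketch of driving DTR and RTS manually with QSerialPort under NoFlowControl; the 10 ms DTR pulse mirrors the reset trick mentioned in the first answer, and the function name is made up for the example:
#include <QSerialPort>
#include <QThread>

// Open the port, then drive DTR/RTS explicitly (only meaningful with NoFlowControl).
void openWithSignals(QSerialPort &port)
{
    port.open(QSerialPort::ReadWrite);
    port.setFlowControl(QSerialPort::NoFlowControl);

    port.setRequestToSend(true);        // assert RTS, like RTS_CONTROL_ENABLE in the DCB

    port.setDataTerminalReady(false);   // pulse DTR low...
    QThread::msleep(10);                // ...for about 10 ms...
    port.setDataTerminalReady(true);    // ...which resets a typical Arduino board
}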