Problem showing my packets in the console with ACE - C++

For debugging purposes I want to show my outgoing packets in the console.
The packets arrive at the server correctly, by the way.
But when I try to print them to the console before sending, nothing shows up:
ACE_Message_Block *m_Header;
...
size_t send_len = m_Header->length(); // Size of the Message Block
char* output = m_Header->rd_ptr();
printf("Output: %s", output); // Trying to show it in Console
// Send it
server.send(m_Header->rd_ptr(), send_len);
Does anyone have an idea?

Most likely the data you send contains 0 bytes, so printf("%s", ...) stops at the first one - and you'll need to append a newline as well, since stdout is usually line-buffered. Print the non-printable bytes as hex escapes instead:
for (size_t i = 0; i < send_len; ++i) {
    if (output[i] < 32) {
        printf("\\x%02hhx", (unsigned char) output[i]);
    } else {
        printf("%c", output[i]);
    }
}
printf("\n");

Related

getting iaxclient to send audio to/get audio from buffer instead of audio-device

I'm trying to write a C++ program (although Python would have been fine as well, in case someone knows a better IAX/SIP alternative) which connects to an Asterisk server.
After connecting, it should listen for audio and process it, and it should also send audio back. I'm using https://sourceforge.net/projects/iaxclient/ for that (note that there are several versions (betas, regular releases, svn version) which all behave differently).
Now, if I understood the library's code correctly, it can call a callback function with an event. One of those events is IAXC_EVENT_AUDIO, and its structure contains a direction: incoming or outgoing. And that's where I'm lost: with some versions of iaxclient I only receive the IAXC_SOURCE_REMOTE messages, with some I receive both. If I switch to test mode (which should only disable the audio device) I often receive nothing at all. When I do receive both IAXC_SOURCE_LOCAL and IAXC_SOURCE_REMOTE, I tried setting the buffers of those events to random data, but that doesn't reach the other end at all (I set it to RAW mode).
Does anyone have suggestions on how to resolve this?
My test-code is:
#include <iaxclient.h>
#include <stdio.h>
#include <unistd.h>

int iaxc_event_callback(iaxc_event e)
{
    if (e.type == IAXC_EVENT_TEXT) {
        printf("text\n");
    }
    else if (e.type == IAXC_EVENT_LEVELS) {
        printf("level\n");
    }
    else if (e.type == IAXC_EVENT_STATE) {
        struct iaxc_ev_call_state *st = iaxc_get_event_state(&e);
        printf("\tcallno %d state %d format %d remote %s(%s)\n", st->callNo, st->state, st->format, st->remote, st->remote_name);
        iaxc_key_radio(st->callNo);
    }
    else if (e.type == IAXC_EVENT_NETSTAT) {
        printf("\tcallno %d rtt %d\n", e.ev.netstats.callNo, e.ev.netstats.rtt);
    }
    else if (e.type == IAXC_EVENT_AUDIO) {
        printf("\t AUDIO!!!! %d %u %d\n", e.ev.audio.source, e.ev.audio.ts, e.ev.audio.size);
        for (int i = 0; i < e.ev.audio.size; i++)
            printf("%02x ", e.ev.audio.data[i]);
        printf("\n");
    }
    else {
        printf("type: %d\n", e.type);
    }
    return 1;
}

int main(int argc, char *argv[])
{
    iaxc_set_test_mode(1);
    printf("init %d\n", iaxc_initialize(1));
    iaxc_set_formats(IAXC_FORMAT_SPEEX, IAXC_FORMAT_SPEEX);
    iaxc_set_event_callback(iaxc_event_callback);
    printf("get audio pref %d\n", iaxc_get_audio_prefs());
    //printf("set audio pref %d\n", iaxc_set_audio_prefs(IAXC_AUDIO_PREF_RECV_REMOTE_ENCODED));
    printf("set audio pref %d\n", iaxc_set_audio_prefs(IAXC_AUDIO_PREF_RECV_REMOTE_RAW | IAXC_AUDIO_PREF_RECV_LOCAL_RAW));
    printf("get audio pref %d\n", iaxc_get_audio_prefs());
    printf("start thread %d\n", iaxc_start_processing_thread());
    int id = -1;
    printf("register %d\n", id = iaxc_register("6003", "1923", "192.168.64.1"));
    int callNo = -1;
    printf("call %d\n", callNo = iaxc_call("6003:1923#192.168.64.1/6001"));
    printf("unquelch: %d\n", iaxc_unquelch(callNo));
    pause();
    printf("finish\n");
    printf("%d\n", iaxc_unregister(id));
    printf("%d\n", iaxc_stop_processing_thread());
    iaxc_shutdown();
    return 0;
}
Please have a look in iaxclient_lib.c to see how the logic works. To hook or replace input/output you can change the function iaxci_do_audio_callback at memcpy(e.ev.audio.data, data, size); where the buffer is set. Also have a look at service_audio to understand how you can replace the buffer/stream sent to the remote location (e.g. want_send_audio and want_local_audio). You can also create virtual input/output devices in PortAudio, which iaxclient uses to process audio, so that it works with buffers instead of a real audio device.
For a more concrete example, please look at the main method in the simplecall source to get a good start. However, the source code is too long to copy and paste, sorry about that.
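As a side note on the direction flag: inside the callback you can at least tell the two streams apart via e.ev.audio.source. A minimal sketch using only the fields already shown in the question (the my_* handlers are hypothetical placeholders, and, as explained above, modifying the buffer at this point does not change what is actually sent to the remote side):

#include <iaxclient.h>

/* Hypothetical placeholders for whatever processing you need. */
static void my_copy_outgoing(const void *data, int size) { (void)data; (void)size; }
static void my_process_incoming(const void *data, int size) { (void)data; (void)size; }

static int audio_direction_callback(iaxc_event e)
{
    if (e.type == IAXC_EVENT_AUDIO) {
        if (e.ev.audio.source == IAXC_SOURCE_LOCAL) {
            /* Audio captured locally, about to be sent to the remote side. */
            my_copy_outgoing(e.ev.audio.data, e.ev.audio.size);
        } else if (e.ev.audio.source == IAXC_SOURCE_REMOTE) {
            /* Audio received from the remote side (raw or encoded,
               depending on the iaxc_set_audio_prefs() flags). */
            my_process_incoming(e.ev.audio.data, e.ev.audio.size);
        }
    }
    return 1;
}

Register it with iaxc_set_event_callback(audio_direction_callback); just like the callback in the question.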

libusb_get_string_descriptor_ascii() timeout error?

I'm trying to get the serial number of a USB device using libusb-1.0.
The problem I have is that sometimes the libusb_get_string_descriptor_ascii() function returns -7 (LIBUSB_ERROR_TIMEOUT) in my code, but other times the serial number is correctly written in my array and I can't figure out what is happening. Am I using libusb incorrectly? Thank you.
#include <cstdint>
#include <iostream>
#include <libusb-1.0/libusb.h> // or <libusb.h>, depending on how libusb is installed

void EnumerateUsbDevices(uint16_t uVendorId, uint16_t uProductId) {
    libusb_context *pContext;
    libusb_device **ppDeviceList;
    libusb_device_descriptor oDeviceDescriptor;
    libusb_device_handle *hHandle;

    int iReturnValue = libusb_init(&pContext);
    if (iReturnValue != LIBUSB_SUCCESS) {
        return;
    }
    libusb_set_debug(pContext, 3);

    ssize_t nbUsbDevices = libusb_get_device_list(pContext, &ppDeviceList);
    for (ssize_t i = 0; i < nbUsbDevices; ++i) {
        libusb_device *pDevice = ppDeviceList[i];
        iReturnValue = libusb_get_device_descriptor(pDevice, &oDeviceDescriptor);
        if (iReturnValue != LIBUSB_SUCCESS) {
            continue;
        }
        if (oDeviceDescriptor.idVendor == uVendorId && oDeviceDescriptor.idProduct == uProductId) {
            iReturnValue = libusb_open(pDevice, &hHandle);
            if (iReturnValue != LIBUSB_SUCCESS) {
                continue;
            }
            unsigned char uSerialNumber[255] = {};
            int iSerialNumberSize = libusb_get_string_descriptor_ascii(hHandle, oDeviceDescriptor.iSerialNumber, uSerialNumber, sizeof(uSerialNumber));
            std::cout << iSerialNumberSize << std::endl; // Print size of serial number <--
            libusb_close(hHandle);
        }
    }
    libusb_free_device_list(ppDeviceList, 1);
    libusb_exit(pContext);
}
I see nothing wrong with your code. I would not worry too much about timeouts in the context of USB. It is a bus, after all, and can be occupied with other traffic.
As you may know, depending on the USB version, a portion of the bandwidth is reserved for control transfers. libusb_get_string_descriptor_ascii() simply sends all the required control transfers to get the string; if any of those times out, it aborts. You can try to send these control transfers yourself and use bigger timeout values, but I guess the possibility of a timeout will always be there to wait for you (pun intended).
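For example, you can issue those control transfers yourself with a longer timeout. A rough sketch (untested), reusing hHandle and oDeviceDescriptor from the code above; the 5000 ms timeout is an arbitrary choice, and the descriptor comes back as UTF-16LE, so the ASCII conversion that libusb_get_string_descriptor_ascii() normally performs is omitted here:

unsigned char buf[255];
// String descriptor 0 lists the supported language IDs (little-endian).
int r = libusb_control_transfer(hHandle,
                                LIBUSB_ENDPOINT_IN, LIBUSB_REQUEST_GET_DESCRIPTOR,
                                (LIBUSB_DT_STRING << 8) | 0, 0,
                                buf, sizeof(buf), 5000);
if (r >= 4) {
    uint16_t langid = buf[2] | (buf[3] << 8);
    // Fetch the serial-number string descriptor with the same, larger timeout.
    r = libusb_control_transfer(hHandle,
                                LIBUSB_ENDPOINT_IN, LIBUSB_REQUEST_GET_DESCRIPTOR,
                                (LIBUSB_DT_STRING << 8) | oDeviceDescriptor.iSerialNumber, langid,
                                buf, sizeof(buf), 5000);
}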
So it turns out my device was getting into weird states, possibly not being closed properly or the like. Anyway, calling libusb_reset_device(hHandle); just after the libusb_open() call seems to fix my sporadic timeout issue.
libusb_reset_device()
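For reference, the workaround drops in right after the open call, using the names from the code above (note that libusb_reset_device() can return LIBUSB_ERROR_NOT_FOUND if the device re-enumerates, in which case it has to be re-opened):

iReturnValue = libusb_open(pDevice, &hHandle);
if (iReturnValue != LIBUSB_SUCCESS) {
    continue;
}
// Work around a device left in an odd state by a previous, unclean session.
libusb_reset_device(hHandle);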

OpenFrameworks/C++/Arduino: UDP SendAll fails at 1473 chars

I am sending a char * from OF to a Teensy board.
Here is my OF code:
void ofApp::draw() {
    string message = "";
    int total = 1472;
    for (int i = 0; i < total; i++) {
        message += (char)ofRandom(0, 255);
    }
    udpConnection.SendAll(message.c_str(), message.length());
}
Here is my teensy code:
void loop() {
    int packetSize = Udp.parsePacket();
    if (packetSize) {
        Serial.println("Got it!");
        Udp.read((char*)packetBuffer, 648*3);
        for (int i = 0; i < 3*NUM_LEDS; i += 3) {
            leds[i/3].setRGB(packetBuffer[i], packetBuffer[i+1], packetBuffer[i+2]);
        }
        FastLED.show();
    }
}
The Teensy code responds when it receives a packet of any size, which works fine up to 1472 chars. In the OF code, as soon as the length of the char * is increased to 1473, the Teensy stops receiving anything, and I am not getting any run-time errors on the OF side. Does anyone know why this would happen and what the fix is? I need to scale this up to 1944 chars eventually.
Thanks,
Collin
What's the MTU size? 1473 is just past the default: with the usual 1500-byte Ethernet MTU, 20 bytes of IP header and 8 bytes of UDP header leave exactly 1472 bytes of UDP payload per packet, so anything larger gets fragmented or dropped.
You could experiment with increasing the MTU on both sides.
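If raising the MTU isn't practical, another option is to split the message into datagrams that each fit within the default MTU. A rough sketch on the OF side (untested; sendChunked is a hypothetical helper using the same udpConnection as above, and the Teensy side would then need to reassemble the chunks into one frame):

void ofApp::sendChunked(const string &message) {
    const size_t kMaxPayload = 1472; // 1500-byte MTU - 20 (IP header) - 8 (UDP header)
    for (size_t offset = 0; offset < message.length(); offset += kMaxPayload) {
        size_t len = message.length() - offset;
        if (len > kMaxPayload) {
            len = kMaxPayload;
        }
        udpConnection.SendAll(message.c_str() + offset, static_cast<int>(len));
    }
}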

Reading COM port in c++, getting errors

First-time poster, long-time reader.
I've been playing around with reading in data from a Bluetooth GPS unit.
I can connect to it using hyperterm and see the data.
The following log is from hyperterm:
$GPRMC,195307.109,A,5208.2241,N,00027.7689,W,000.0,345.8,310712,,,A*7E
$GPVTG,345.8,T,,M,000.0,N,000.0,K,A*07
$GPGGA,195308.109,5208.2242,N,00027.7688,W,1,04,2.1,58.9,M,47.3,M,,0000*7E
$GPGSA,A,3,19,03,11,22,,,,,,,,,5.5,2.1,5.0*3F
$GPRMC,195308.109,A,5208.2242,N,00027.7688,W,000.0,345.8,310712,,,A*73
$GPVTG,345.8,T,,M,000.0,N,000.0,K,A*07
$GPGGA,195309.109,5208.2243,N,00027.7688,W,1,04,2.1,58.9,M,47.3,M,,0000*7E
END LOG
The following log is from my C++ program
$GPGSV,3,3,12,14,20,105,16,28,18,323,,08,07,288,,16,01,178,*7A
$GPRMC,195,3,2ÿþÿÿÿL.š945.109,A,5208.2386,N,00027.7592,W,000.0,169.5,8,323,,08,07,288,,16,01,178,*7A
$GPRMC,195,3,2ÿþÿÿÿL.š310712,,,A*70
$GPVTG,169.5,T,,M,000.0,N,000.0,K,A*06
8,07,288,,16,01,178,*7A
$GPRMC,195,3,2ÿþÿÿÿL.š310712,,,A*70
$GPVTG,169.5,T,,M,000.0,N,000.0,K,A*06
8,07,288,,16,01,178,*7A
$GPRMC,195,3,2ÿþÿÿÿL.š$GPGGA,195946.109,5208.2386,N,00027.7592,W,1.0,K,A*06
8,07,288,,16,01,178,*7A
END LOG
THE PROBLEM
I've left the line feeds as they come; the C++ output has extra line feeds and I'm not sure why.
The C++ log also has some funky characters...?
The Code
for (int n = 0; n < 100; n++) {
    char INBUFFER[100];
    cv::waitKey(1000);
    bStatus = ReadFile(comport,      // Handle
                       &INBUFFER,    // Incoming data
                       100,          // Number of bytes to read
                       &bytes_read,  // Number of bytes read
                       NULL);
    cout << "bStatus " << bStatus << endl;
    if (bStatus != 0)
    {
        // error processing code goes here
    }
    LogFile << INBUFFER;
}
I'm using settings...
comSettings.BaudRate = 2400;
comSettings.StopBits = ONESTOPBIT;
comSettings.ByteSize = 8;
comSettings.Parity = NOPARITY;
comSettings.fParity = FALSE;
...which as far as I can tell are the same as the settings used by hyperterm.
Any hints on what I'm doing wrong?
cheers!
UPDATE
So after updating to use bytes_read and account for the extra LF at the end of NMEA data I now have...
if (bytes_read != 0) {
    for (int i = 0; i < bytes_read; i++) {
        LogFile << INBUFFER[i];
    }
}
Which appears to have fixed things!
$GPGGA,215057.026,5208.2189,N,00027.7349,W,1,04,6.8,244.6,M,47.3,M,,0000*41
$GPGSA,A,3,32,11,01,19,,,,,,,,,9.7,6.8,7.0*3D
$GPRMC,215057.026,A,5208.2189,N,00027.7349,W,002.0,208.7,310712,,,A*74
$GPVTG,208.7,T,,M,002.0,N,003.8,K,A*09
$GPGGA,215058.026,5208.2166,N,00027.7333,W,1,04,6.8,243.1,M,47.3,M,,0000*42
Thanks folks, your help was much appreciated.
You have a bytes_read variable, but you don't do anything with it. It seems to me that you're dumping the entire INBUFFER to the file, no matter how many or few bytes are actually loaded into it.
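Equivalently, using the variables from the code above (and assuming LogFile is a std::ofstream), the whole per-character loop in the update can be a single write() that only outputs the bytes ReadFile() actually reported:

if (bStatus && bytes_read > 0) {
    // Write exactly the bytes ReadFile() reported, not the whole buffer.
    LogFile.write(INBUFFER, bytes_read);
}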

Losing characters in TCP Telnet transmission

I'm using Winsock to send commands through Telnet; but for some reason, when I try to send a string, a few characters occasionally get dropped. I use send:
int SendData(const string & text)
{
    send(hSocket, text.c_str(), static_cast<int>(text.size()), 0);
    Sleep(100);
    send(hSocket, "\r", 1, 0);
    Sleep(100);
    return 0;
}
Any suggestions?
Update:
I checked and the error still occurs even if all the characters are sent. So I decided to change the Send function so that it sends individual characters and checks if they have been sent:
void SafeSend(const string &text)
{
    char char_text[1];
    for (size_t i = 0; i < text.size(); ++i)
    {
        char_text[0] = text[i];
        while (send(hSocket, char_text, 1, 0) != 1);
    }
}
Also, it drops characters in a peculiar way; i.e. in the middle of the sentence. E.g.
set variable [fp]exit_flag = true
is sent as
ariable [fp]exit_flag = true
Or
set variable [fp]app_flag = true
is sent as
setrable [fp]app_flag = true
As mentioned in the comments, you absolutely need to check the return value of send, as it can return after sending only part of your buffer.
You nearly always want to call send in a loop similar to the following (not tested as I don't have a Windows development environment available at the moment):
bool SendString(const std::string& text) {
    int remaining = static_cast<int>(text.length());
    const char* buf = text.data();
    while (remaining > 0) {
        int sent = send(hSocket, buf, remaining, 0);
        if (sent == SOCKET_ERROR) {
            /* Error occurred; check WSAGetLastError() */
            return false;
        }
        remaining -= sent;
        buf += sent;
    }
    return true;
}
Update:
This is not relevant for the OP, but calls to recv should also be structured in the same way as above.
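For completeness, a rough sketch of the same idea on the receiving side (assuming the same hSocket and an arbitrary 512-byte buffer): the return value of recv tells you how many bytes actually arrived, which may be fewer than you asked for.

std::string RecvSome()
{
    char buf[512];
    int received = recv(hSocket, buf, sizeof(buf), 0);
    if (received == SOCKET_ERROR || received == 0) {
        /* Error, or the peer closed the connection; check WSAGetLastError(). */
        return std::string();
    }
    return std::string(buf, received);
}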
To debug the problem further, Wireshark (or equivalent software) is excellent for tracking down the source of the problem.
Filter the packets you want to look at (it has lots of options) and check if they include what you think they include.
Also note that telnet is a protocol with numerous RFCs. Most of the time you can get away with just sending raw text, but it's not really guaranteed to work.
You mention that the Windows telnet client sends different bytes from yours; capture a minimal sequence from both clients and compare them. Use the RFCs to figure out what the other client does differently and why. You can use "View -> Packet Bytes" to bring up the data of the packet and easily inspect and copy/paste the hex dump.