RESOLVED: The problem was primarily with the Simulink blockset that was reading in the UDP packet, rather than with the data transmission.
I am trying to send a 20-byte numerical array out of a C++ program using Winsock. The issue I am running into is data packaging: in the end I want each number in the array to go out as its own byte, so that the Simulink model receiving these values does not need an additional processing script.
My array contains 14 boolean values (0|1) and then 6 values that range from -100 to 100. These report the status of a controller input. For example, the array would look like:
int msgint[20] = {1,0,1,0,0,1,1,0,0,0,0,0,0,0,50,80,-90,40,90,-20};
I have tried typecasting and sending multiple strings, but these just seem to rearrange the gibberish I am receiving or cause a socket error. Currently my sendto call looks like:
sendto(sd,message,80,0,(struct sockaddr *) &server,server_length);
I know this line works, as the packet makes it through; it just does not appear as I would like it to. In the sendto, message is the formatted string I am trying to create to properly send all of the contents of the array. Currently it is arbitrary and has little significance; I have it in essentially for debugging purposes.
You are starting at the wrong point. Network communications should start with the design of the wire protocol.
How will you represent things on the wire: binary or text? Most 'modern' protocols use text (JSON or XML). A few years ago binary was hot (ASN.1/BER/DER). I suggest JSON; your controller status could be sent as something like {"buttons":[1,0,1,0],"axes":[50,80,-90,40,90,-20]}.
Then, how will you wrap up the payload? Do you need to say 'here is a set of xxxs, now here is a set of yyyys'? I don't know what you are doing, so it's hard to say what you need.
If you want to send 20 bytes, and you know that every integer value in your array will be in the [-100, +100] range, you should not use the int type, which usually contains either 32-bit or 64-bit values on modern platforms.
You might instead want to use the char type, which usually represents an 8-bit value.
For even more certainty, if you can use C++11 features, you should use the <cstdint> header, which defines the int8_t type, guaranteed to be a signed 8-bit type. See http://en.cppreference.com/w/cpp/types/integer.
Your array should look like:
#include <cstdint>
std::int8_t msgint[20] = {1,0,1,0,0,1,1,0,0,0,0,0,0,0,50,80,-90,40,90,-20};
or
char msgint[20] = {1,0,1,0,0,1,1,0,0,0,0,0,0,0,50,80,-90,40,90,-20};
and your sendto will be:
sendto(sd,(const char *)msgint,20,0,(struct sockaddr *) &server,server_length);
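For reference, a complete sender might look like the sketch below. This is only a sketch: it assumes Winsock has already been initialised with WSAStartup and that sd and server are set up as in your existing code.

#include <cstdint>
#include <winsock2.h>

// Sketch: assumes WSAStartup succeeded and `server` is a filled-in
// sockaddr_in for the receiving Simulink host, as in the question.
void send_status(SOCKET sd, const sockaddr_in &server, int server_length)
{
    std::int8_t msgint[20] = {1,0,1,0,0,1,1,0,0,0,0,0,0,0,50,80,-90,40,90,-20};
    // Winsock's sendto() takes a const char *, so cast the int8_t buffer.
    sendto(sd, reinterpret_cast<const char *>(msgint), sizeof msgint, 0,
           reinterpret_cast<const sockaddr *>(&server), server_length);
}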
"in the end I want each number in the array to go out as its own byte"
Then change your array to
char msgint[20] = ...
I am using a TCP connection to send data to a C++ server. I have read about the ei library, which, given a binary, provides a set of functions to get the decoded value out of it, e.g. ei_decode_string, ei_decode_long and others.
I am trying to do these simple things:
1. create a socket and connect to it.
{ok, Socket} = gen_tcp:connect({127,0,0,1}, 8986, []).
2. Use gen_tcp:send/2 to send data.
gen_tcp:send(Socket, term_to_binary("Stackoverflow")).
Therefore, I am sending a binary format of a string to the server.
My server (C++ code) gets the data, and I am trying to read whatever the client sends me over the socket using ei_decode_string, like this:
Ideally, when decoded, I should get back the string "Stackoverflow", since I told it to decode the binary as a string. I made sure I had enough space in the resulting buffer:
char *p = (char*)malloc(sizeof(char) * 100);
int index = 0;
int decoded = ei_decode_string(buff, &index, p);
cout<<"The decoded value is "<<p<<endl;
I am not able to decode the string which I sent! Am I missing something? If this is not the right approach, how should I send data and decode it on the server side?
Thank you for the help!
1> term_to_binary("Stackoverflow").
<<131,107,0,13,83,116,97,99,107,111,118,101,114,102,108,
111,119>>
Here the first item in the binary is not the string as small integers, but a version number (131), as described in the External Term Format documentation. After that come the terms, each prefixed by a tag and then the corresponding data. In this case the tag is 107 (STRING_EXT), followed by two bytes for the length (13) and then 13 8-bit integers; the first one, 83, corresponds to the letter 'S' (83 in ASCII).
To handle this binary correctly, the call to ei_decode_version has to come before ei_decode_string:
int ei_decode_version(const char *buf, int *index, int *version)
This function decodes the version magic number for the erlang binary
term format. It must be the first token in a binary term.
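Putting that together, a minimal sketch of the server-side decode might look like this (assuming buff holds the bytes received on the socket, and that the buffer passed to ei_decode_string is large enough):

#include <iostream>
#include <ei.h>

// Sketch: decode the version byte first, then the string payload.
void decode_message(const char *buff)
{
    int index = 0;
    int version = 0;
    char text[100]; // assumed big enough for the decoded string plus NUL

    if (ei_decode_version(buff, &index, &version) < 0) {
        std::cerr << "not a valid external term" << std::endl;
        return;
    }
    if (ei_decode_string(buff, &index, text) == 0) {
        std::cout << "The decoded value is " << text << std::endl;
    }
}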
I am trying to write a string command to the serial port of my Raspberry Pi 2 B without success. I followed this http://www.raspberry-projects.com/pi/programming-in-c/uart-serial-port/using-the-uart, but I need to send and receive QStrings (or arrays of bytes). Are there specific C++ functions that send and receive strings through the RPi serial port?
Could someone share some sample code?
Many thanks in advance!
Andrea
If you look at the QString docs you will see a number of methods that convert QString to other types. QString is just a wrapper around a character string.
To get at the underlying char buffer you can do something like this:
std::string stdStr = qString.toStdString();
const char* buffer = stdStr.c_str(); // c_str() returns a const char*
Make sure that stdStr stays in scope as long as you wish to use buffer otherwise you will end up using a pointer that points to deallocated memory.
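A minimal sketch of sending a QString over the UART, assuming the port was opened as in the tutorial you linked (the uart0_fd descriptor name here is hypothetical):

#include <unistd.h> // write()
#include <string>
#include <QString>

// Convert the QString and write its raw bytes to the already-open port.
bool sendQString(int uart0_fd, const QString &qString)
{
    std::string stdStr = qString.toStdString();
    // stdStr owns the buffer and stays in scope for the whole call.
    ssize_t written = write(uart0_fd, stdStr.c_str(), stdStr.size());
    return written == static_cast<ssize_t>(stdStr.size());
}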
Let me try to explain my problem. I have to receive a message from a server (programmed in Delphi) and do some things with that message on the client side (which is the side I program, in C++).
Let's say the message is: "Hello €". That means I have to work with std::wstring, as € (the euro sign) needs 2 bytes instead of 1, so knowing that, I have done all my work with wstrings, and if I set the message manually it works fine. Now I have to receive the real one from the server, and here comes the problem.
The person on the server side is sending that message as a string. He uses an EncodeString() function in Delphi and says he is not going to change it. So my question is: if I decode that string into a string in C++, and then convert it into a wstring, will it work? Or will I have problems and get some other message in my string variable instead of "Hello €"?
If I can receive that string with no problem, then I have another problem. The function that I have to use to decode the string is void DecodeString(char *buffer, int length);
so normally, if you receive a text, you do something like:
char Text[255];
DecodeString(Text, length); // length is a number decoded before
So... can I decode it with no problem and end up with the "Hello €" message in Text? If so, I'll just need to convert it to get the wstring.
Thank you
EDIT:
I'll add another example. If I know that the server is always going to send me a text of length 30 at most, on the server they do something like:
EncodeByte(lengthText);
EncodeString(text)
and in the client you do:
int length;
char myText[30];
DecodeByte(length);
DecodeString(myText,length);
and then you can work with myText as a string later.
Hope that helps a little more. I'm sorry for not having more information, but I'm new to this work and I don't know much more about the server.
EDIT 2
Trying to summarize... The thing is that I have to receive a message and do something with it, and to decode it I have to use the tool I mentioned. Since DecodeString() needs a char buffer and I need a wstring, I just need a way to take the data received from the server, decode it with DecodeString(), and get it into a wstring. I don't really know if that is possible, and if it is, I'm not sure how to do it or what variable types to use.
EDIT 3
Finally! I found out which code pages are being used. It seems that the client uses the ANSI ones and the server doesn't, so I'll have to ask the person doing that part to change it to the ANSI ones. Thanks everybody for helping me with my big, big ignorance about the existence of code pages.
Since you're using wstring, I guess that you are on Windows (wstring isn't popular on *nix).
If so, you need the Delphi app to send you UTF-16, which you can use in the wstring constructor. Example:
const char input[] = "\xac\x20\x00"; // UTF-16LE bytes for the euro sign (U+20AC); the literal's implicit NUL completes the 2-byte terminator
const wchar_t* input2 = reinterpret_cast<const wchar_t*>(input);
wstring ws(input2);
If you're Linux/Mac, etc, you need to receive UTF-32.
This method is far from perfect, though. There can be pitfalls and edge cases for code points beyond 0xFFFF (Chinese, etc.). Supporting those properly probably requires a PhD.
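Alternatively, given that EDIT 3 says the text actually arrives in an ANSI code page, a sketch of widening the decoded buffer on Windows could use the Win32 MultiByteToWideChar call (DecodeString is the question's function; this wrapper is hypothetical):

#include <string>
#include <windows.h>

void DecodeString(char *buffer, int length); // the question's decode routine

// Decode the incoming ANSI text, then widen it using the process code page.
std::wstring decodeToWstring(char *buffer, int length)
{
    DecodeString(buffer, length); // per the question, fills buffer with text
    int wideLen = MultiByteToWideChar(CP_ACP, 0, buffer, length, nullptr, 0);
    if (wideLen <= 0)
        return std::wstring();
    std::wstring ws(wideLen, L'\0');
    MultiByteToWideChar(CP_ACP, 0, buffer, length, &ws[0], wideLen);
    return ws;
}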
I'd like to use libusb to retrieve information about my devices. I can read every descriptor and print every number contained in these descriptors.
But I am having trouble with the strings. How can I handle the string descriptors properly in C++?
I'd like to implement a simple function like this:
std::string get_string(std::uint8_t index);
which internally retrieves the string associated with an index. The device handle comes from the attributes of the class (the function is a class member), and the buffer that libusb_get_string_descriptor fills is allocated statically; since the descriptor length lives in an 8-bit field, the string can be at most 255 bytes long, can't it?
How can I manage Unicode with these things? Any ideas? Is std::string the right choice?
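For what it's worth, a sketch of such a member function using libusb's ASCII convenience wrapper might look like this (handle_ stands in for the class's libusb_device_handle attribute; for full Unicode you would instead fetch the raw UTF-16LE descriptor with libusb_get_string_descriptor and convert it yourself):

#include <string>
#include <cstdint>
#include <libusb-1.0/libusb.h>

// Sketch: retrieve string descriptor `index` as ASCII.
std::string get_string(libusb_device_handle *handle_, std::uint8_t index)
{
    unsigned char buf[256]; // bLength is an 8-bit field, so 255 bytes is the maximum
    int n = libusb_get_string_descriptor_ascii(handle_, index, buf, sizeof buf);
    if (n < 0)
        return std::string(); // or throw, depending on the class's error policy
    return std::string(reinterpret_cast<char *>(buf), n);
}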
I am currently evaluating Protocol Buffers for use in a project (no code written as of yet). One of the things I'm unclear on is how you would read part of an encoded message, for example say I have a common header:
message Header {
    required uint32 msg_type = 1; // note: protobuf has no uint16 scalar; uint32 is the closest
    required uint32 length = 2;
}
And say I deliver multiple different messages to a queue. How would the consumer work out how much data to read per message, and which message type it should be constructed as?
There should be no need for a Header message here; the most common approach is to follow the "streaming" advice from here. Within that, you could either treat it as a sequence of identical union-type messages, or (my preference) when writing, instead of just writing a length-prefix before each message, include a varint that indicates the message type, then the length (as a varint). The number that indicates the message type is some arbitrary map you invent (so 1 = Foo, 2 = Bar, 3 = Blap, etc.). If you left-shift the message type by 3 bits and then "or" 2, it will also be a well-formed protobuf stream itself, 100% identical to a repeated YourUnionType.
Basically, this is exactly the same as this answer, but instead of being field 1 each time, the number varies per message type. Most implementations have a reader/writer API that makes it possible to read and write raw varints and to length-restrict the reader API. Some implementations have helper mechanisms to support streams of heterogeneous messages directly (basically, doing all of the above for you).
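As an illustration of that framing, here is a minimal sketch of the writer side (the varint encoding is the standard protobuf one; frame_message and the msg_type map are made up for the example):

#include <cstdint>
#include <vector>

// Encode an unsigned value as a protobuf varint: 7 bits per byte, MSB set on
// every byte except the last.
void write_varint(std::vector<uint8_t> &out, uint64_t value)
{
    while (value >= 0x80) {
        out.push_back(static_cast<uint8_t>(value) | 0x80);
        value >>= 7;
    }
    out.push_back(static_cast<uint8_t>(value));
}

// Frame one serialized message as (tag varint)(length varint)(payload), where
// the "field number" is your invented message-type id and the wire type is
// 2 (length-delimited), i.e. (msg_type << 3) | 2.
void frame_message(std::vector<uint8_t> &out, uint32_t msg_type,
                   const std::vector<uint8_t> &payload)
{
    write_varint(out, (static_cast<uint64_t>(msg_type) << 3) | 2);
    write_varint(out, payload.size());
    out.insert(out.end(), payload.begin(), payload.end());
}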
In a recent project, I used Protocol Buffers like this:
We had one 'container' message that included all the actual messages as optional members:
message ContainerMessage {
    optional Message1 message_1 = 1;
    optional Message2 message_2 = 2;
    // ...
    optional MessageN message_N = N;
}
Inside an application, you could just use ContainerMessage as a discriminated union of the real Messages.
Between applications, we serialized/deserialized the ContainerMessage and sent the serialized content, prefixed with a simple header containing the length of the serialized content.
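On the receiving side, dispatch then becomes a simple series of has_ checks (accessor names follow protoc's generated C++ conventions for the .proto above; the header name and handleMessage1/handleMessage2 are hypothetical):

#include "container_message.pb.h" // generated header; name is hypothetical

// Sketch: dispatch on whichever optional member is present.
void dispatch(const ContainerMessage &container)
{
    if (container.has_message_1()) {
        handleMessage1(container.message_1());
    } else if (container.has_message_2()) {
        handleMessage2(container.message_2());
    }
    // ...
}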
That will depend on the protocol you are using.
Note that, for example, a lot of protocols go via serial interfaces, where you might have extra signal lines telling you when a message starts and stops.
Often, messages will have their length at a fixed offset after the message start.
In other cases, you might need to parse the message element by element to find out how much of it is left. For instance, a string embedded in the message may be of fixed length, or carry its length at the beginning, or use \0 as an end marker.
Usually, when you store messages in a queue for further processing, you will want to add some extra information to make your life easier; for example, when you only have a signal telling you when the message stops, you might store the message internally together with its length.
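To make the "length at a fixed offset" case concrete, here is a small sketch of a parser for a hypothetical frame layout of [1-byte type][2-byte little-endian length][payload]; real protocols place these fields wherever their spec says:

#include <cstdint>
#include <cstddef>
#include <vector>

struct Frame {
    uint8_t type;
    std::vector<uint8_t> payload;
};

// Returns true and fills `frame` if `buf` holds at least one complete frame;
// `consumed` reports how many bytes the frame used.
bool parse_frame(const uint8_t *buf, size_t available, Frame &frame, size_t &consumed)
{
    if (available < 3) return false;                  // header incomplete
    uint16_t len = buf[1] | (uint16_t(buf[2]) << 8);  // length at fixed offset 1
    if (available < 3u + len) return false;           // payload incomplete
    frame.type = buf[0];
    frame.payload.assign(buf + 3, buf + 3 + len);
    consumed = 3u + len;
    return true;
}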