C++ array with command line variable

My array looks something like this:
unsigned char send_bytes[] = { 0x0B, 0x11, 0xA6, 0x05, 0x00, 0x00, 0x70 };
One of the values is a variable that can change all the time, so I tried something like this:
const char* input = "0x05";
unsigned char send_bytes[] = { 0x0B, 0x11, 0xA6, input, 0x00, 0x00, 0x70 };
When I compile I get a warning:
warning: initialization makes integer from pointer without a cast
I am a little confused about the conversion I need to do, since the array has hex strings in it and the input string is a char.

In the first line you are declaring a pointer to const char and initializing it to the beginning of the string "0x05". That's fine, but it is not what you are trying to do.
In the second line, you try to initialize the fourth array element (an unsigned char) with the value of the pointer you assigned to input in the first line. The compiler complains because you are trying to stuff a pointer value (the address of the "0x05" string) into a char-sized variable, and that is also not what you intend.
Also, take into account that if you are working with binary data (which the hex initializers suggest), you are better off using unsigned char, since signed char only covers -128 to +127 and you can run into surprising behaviour. A declaration like typedef unsigned char byte; can make things easier:
typedef unsigned char byte;
byte send_bytes[] = { 0x0b, 0x11, 0xa6, 0x00, 0x00, 0x00, 0x70 };
byte &input = send_bytes[3]; /* input is an alias of send_bytes[3] */

Maybe explaining exactly what const char* input = "0x05"; does will clear things up for you.
First the compiler computes the string data and creates it as a static object:
const char data[5] = { 0x30, 0x78, 0x30, 0x35, 0x0 };
Then your variable is initialized:
const char *input = &data[0];
Note that input is a pointer with a value that depends entirely upon the location the compiler chooses to store the string data at, and has nothing to do with the contents of the string. So if you say char c = input; then c basically gets assigned a random number.
So you should be asking yourself "Where is the value 0x05 that I want to store in the send_bytes array?" In your code it's encoded as text, rather than as a number that your program can use directly. You need to figure out how to convert from a string of symbols following the hexadecimal scheme of representing numbers into C++'s native representation of numbers.
Here are a couple hints. Part of the operation involves associating values with each digit symbol. The symbol '0' is associated with the value zero, '1' with the value one, and so on, according to the usual hexadecimal system. Second, once you can get the associated value of a symbol, then you can use those values in some basic arithmetic operations to figure out the value of the number represented by the whole string of symbols.
For example, if you have the symbols '1' '2' and 'a', in that order from left to right then the arithmetic to compute what number is represented is 1 * 16 * 16 + 2 * 16 + 10.
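Here is a minimal sketch of that arithmetic, assuming a well-formed input such as "0x05" (hexDigitValue and parseHex are illustrative names, not part of the original code):
#include <cstdio>

// Associate a value with each hex digit symbol ('0' -> 0, ..., 'f'/'F' -> 15).
int hexDigitValue(char c) {
    if (c >= '0' && c <= '9') return c - '0';
    if (c >= 'a' && c <= 'f') return c - 'a' + 10;
    if (c >= 'A' && c <= 'F') return c - 'A' + 10;
    return -1;   // not a hex digit
}

// Left-to-right arithmetic: value = value * 16 + digit, as described above.
unsigned parseHex(const char* s) {
    if (s[0] == '0' && (s[1] == 'x' || s[1] == 'X')) s += 2;   // skip the "0x" prefix
    unsigned value = 0;
    for (; *s != '\0'; ++s)
        value = value * 16 + hexDigitValue(*s);
    return value;
}

int main() {
    const char* input = "0x05";
    unsigned char send_bytes[] = { 0x0B, 0x11, 0xA6, 0x00, 0x00, 0x00, 0x70 };
    send_bytes[3] = static_cast<unsigned char>(parseHex(input));   // slot now holds 0x05
    std::printf("0x%02X\n", (unsigned)send_bytes[3]);
}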

The error string is pretty much telling you exactly what's wrong.
input is of type const char* (a pointer to a const char), whereas your array send_bytes is of type unsigned char[] (an array of unsigned chars).
First, signed and unsigned values are still different types, though your error message isn't referring to that specifically.
In reality, your input value isn't a string (as there is no built-in string type in C++), but a pointer to a character. This means that the input string doesn't hold the byte 0x05, but rather the bytes { 0x30, 0x78, 0x30, 0x35, 0x00 }.
The compiler is saying: Hey, I've no idea what you're trying to do, so I'm just converting the address of that string I don't understand (input) to an unsigned char and putting it in the array.
That means if the string "0x05" starts at location 0xAB, your array will ultimately contain { 0x0B, 0x11, 0xA6, 0xAB, 0x00, 0x00, 0x70 }.
You're going to either have to convert from a string to an integer using a radix of 16, or just not use a string at all.
I'd also recommend reading up on pointers.
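For completeness, here is a minimal sketch of the radix-16 conversion mentioned above, using the standard strtoul (the variable names are illustrative, not from the original question):
#include <cstdio>
#include <cstdlib>

int main() {
    const char* input = "0x05";   // text, not a number
    // strtoul with base 16 converts the hexadecimal text to an actual value;
    // the leading "0x" is accepted when the base is 16.
    unsigned char value = static_cast<unsigned char>(std::strtoul(input, nullptr, 16));

    unsigned char send_bytes[] = { 0x0B, 0x11, 0xA6, value, 0x00, 0x00, 0x70 };
    std::printf("0x%02X\n", (unsigned)send_bytes[3]);   // prints 0x05
}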

The array doesn't have "hex strings" in it - if it did, they would be enclosed in quotation marks, like all strings.
The literals are integers written in hexadecimal notation, and the line is equivalent to
unsigned char send_bytes[] = { 11, 17, 166, input, 0, 0, 112 };
Since it's an array of unsigned char you should put an unsigned char there:
unsigned char input = 0x05;
unsigned char send_bytes[] = { 0x0B, 0x11, 0xA6, input, 0x00, 0x00, 0x70 };

You would do better to put this in your code:
unsigned char send_bytes[] = { 0x0b, 0x11, 0xa6, 0x00, 0x00, 0x00, 0x70 };
unsigned char &input = send_bytes[3]; /* input is an alias of send_bytes[3] */
this way you can do things like:
input = 0x26;
send_packet(send_bytes);

Related

Why is my pointer array assignment not working?

I have this method prototype
bool setMacParam(const char* paramName, const uint8_t* paramValue, uint16_t size)
{
debugPrint("[setMacParam] "); debugPrint(paramName); debugPrint("= [array]");
this->loraStream->print(STR_CMD_SET);
this->loraStream->print(paramName);
for (uint16_t i = 0; i < size; ++i) {
this->loraStream->print(static_cast<char>(NIBBLE_TO_HEX_CHAR(HIGH_NIBBLE(paramValue[i]))));
this->loraStream->print(static_cast<char>(NIBBLE_TO_HEX_CHAR(LOW_NIBBLE(paramValue[i]))));
}
this->loraStream->print(CRLF);
return expectOK();}
I would like to assign my variable devEUI to paramValue, so I am making this call:
uint8_t DevEUI2[8] = { 0x00, 0x00, 0x00, 0x00, 0x41, 0x47, 0x30, 0x39 };
setMacParam(STR_DEV_EUI,DevEUI2,8);
However my terminal shows that paramValue is empty
[setMacParam] deveui = [array]
What do I do wrong?
debugPrint is interpreting your array as a C string in which each byte is a char; because the first value is 0x00, which happens to be the same value as the '\0' character that marks the end of a string, nothing gets printed.
The other values would also be rendered as their ASCII characters, which is not the same as showing the byte values.
The print() of Serial accepts a second parameter that tells the function to print the ASCII representation of the value in hex, decimal, octal, or binary; maybe your SerialUSB supports it too.
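For example, a minimal sketch of a hex-dump helper (debugPrintHex is a hypothetical name, and an Arduino-style SerialUSB object with the usual print(value, HEX) overload is assumed):
// Hypothetical helper: print each byte of a buffer as two hex digits.
void debugPrintHex(const uint8_t* buf, uint16_t size) {
    for (uint16_t i = 0; i < size; ++i) {
        if (buf[i] < 0x10) SerialUSB.print('0');   // pad to two digits per byte
        SerialUSB.print(buf[i], HEX);
        SerialUSB.print(' ');
    }
    SerialUSB.println();
}

// Usage: debugPrintHex(DevEUI2, 8);   // -> 00 00 00 00 41 47 30 39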

Passing the Number of Elements in an Array to Function?

I am writing a DLL that passes a char array to a function. I define that char array with 22 elements here:
unsigned char data[22] = { 0x00, 0x0A, 0x00, 0x09, 0x70, 0x00, 0x72, 0x00,
0x6F, 0x00, 0x74, 0x00, 0x68, 0x00, 0x65, 0x00, 0x67, 0x00, 0x75, 0x00,
0x79, 0x00 };
Now, I try to pass this array to my function declared as:
bool sendData(unsigned char* sData, unsigned long sSize);
With these arguments:
sendData(data, 22);
This code compiles, but crashes the program when this function is called. Taking a closer look while debugging, I can see that there's an access violation in my function sendData. Looking even further, I see the values of data and sData at run-time:
data points to the 22 byte char array with correct values (obviously)
sData points to a char array that is null-terminated by the first byte, only containing one value (0)
It is clear to me that the compiler does not know to allocate 22 bytes for sData, simply because I do not specify any length for it. So my question is:
How do I specify the length of the sData so that the argument
passed won't terminate early?
If I'm wrong about the issue, please correct me and explain it further. Thanks for any help in advance!
EDIT:
I understand that \0 (the first byte and many more in data) is a null-terminator and will prematurely end the array. What I am asking is how to avoid this. My understanding is that sData is never given a specific length and therefore stops on \0, but I may be wrong.
I was asked to supply my sendData function:
bool sendData(unsigned char* sData, unsigned long sSize)
{
    try
    {
        Send(sData, sSize);
        return true;
    }
    catch (...)
    {
        return false;
    }
}
Send is calling a function from another module, but isn't relevant to the issue, as the error occurs beforehand when the sData argument is passed to sendData.
No allocation for sData is going to happen; it just points to your array. It displays as empty in the debugger because the debugger shows a char* as a string, and strings end at the first '\0', which is your first byte. This does not mean sData does not hold the correct data. Watch sData[0], sData[1], etc. in your debugger to see the correct values.
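To convince yourself, you can dump the bytes numerically instead of relying on the debugger's string view; a minimal stand-alone sketch (the loop replaces the real Send call, purely for illustration):
#include <cstdio>

bool sendData(unsigned char* sData, unsigned long sSize)
{
    // Iterate by the explicit size, so a 0x00 byte does not stop the loop
    // the way it would terminate a C string.
    for (unsigned long i = 0; i < sSize; ++i)
        std::printf("%02X ", (unsigned)sData[i]);
    std::printf("\n");
    return true;
}

int main()
{
    unsigned char data[22] = { 0x00, 0x0A, 0x00, 0x09, 0x70, 0x00, 0x72, 0x00,
                               0x6F, 0x00, 0x74, 0x00, 0x68, 0x00, 0x65, 0x00,
                               0x67, 0x00, 0x75, 0x00, 0x79, 0x00 };
    sendData(data, 22);   // prints all 22 bytes, including the 0x00s
}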

char type and re-encoding ASCII text into UTF-16

I am using libiconv to convert my char array into a UTF-16 string, and I have some questions.
This is the signature of the iconv function:
size_t iconv(iconv_t cd,
const char* * inbuf, size_t *inbytesleft,
char* * outbuf, size_t *outbytesleft);
That means char is used to hold whatever type of character is being converted (char vs. wide char).
My C teacher at school taught me that for odd or unprintable characters we should use wchar_t, so I'm quite confused now.
I tested this on the input "KOTEX", encoded as ASCII, expecting an output string of double the length encoded as UTF-16. It fails immediately. But if I change the destination code page to UTF-8, it works, though the returned data is lost. Why is that?
The buffer arguments to iconv are, in effect, char*, but that is not intended to imply that they actually represent C strings. (It might have been less confusing had the interface used uint8_t* instead, but that would be anachronistic; iconv predates stdint.h.)
The Posix standard (and the Linux manpage) try to make this clear:
The type of inbuf and outbuf, char **, does not imply that the objects pointed to are interpreted as null-terminated C strings or arrays of characters. Any interpretation of a byte sequence that represents a character in a given character set encoding scheme is done internally within the codeset converters. (POSIX.2008)
So if you are planning on converting to UTF-16, you should provide an output buffer with an appropriate datatype for UTF-16. wchar_t is not an appropriate datatype; on many systems, it will be too big. uint16_t would be fine.
Note that there are actually three different UTF-16 encodings (the names are system-dependent; the ones here are recognized by Gnu iconv):
UTF16LE (or UTF-16LE): "Little endian" UTF-16. In this format, the low-order byte of each character is first, followed by the high-order byte. KOTEX is
{0x4B, 0x00, 0x4F, 0x00, 0x54, 0x00, 0x45, 0x00, 0x58, 0x00}
UTF16BE (or UTF-16BE): "Big endian" UTF-16. In this format, the high-order byte of each character is first, followed by the low-order byte. KOTEX is:
{0x00, 0x4B, 0x00, 0x4F, 0x00, 0x54, 0x00, 0x45, 0x00, 0x58}
UTF16 (or UTF-16): either UTF16BE or UTF16LE, depending on whether the machine is big-endian or little-endian; converted strings start with a Byte Order Mark (BOM). On a little-endian machine (mine), KOTEX is
{0xFF, 0xFE, 0x4B, 0x00, 0x4F, 0x00, 0x54, 0x00, 0x45, 0x00, 0x58, 0x00}
On a big-endian machine, it would be:
{0xFE, 0xFF, 0x00, 0x4B, 0x00, 0x4F, 0x00, 0x54, 0x00, 0x45, 0x00, 0x58}
The fact that UTF16 (unadorned with endian specification) always starts with a BOM means that you have to remember to provide an extra (2-byte) character in the output buffer. Otherwise, you'll end up with E2BIG.
In all three of these encodings, characters outside of the Basic Multilingual Plane (BMP) require two (two-byte) code units, a so-called surrogate pair. All ASCII characters are in the BMP, so you don't need to worry about this for ASCII-to-UTF-16 conversion, but you would if you were doing UTF-8-to-UTF-16.
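As a concrete illustration of an ASCII-to-UTF-16LE conversion, here is a minimal sketch (it assumes glibc-style iconv, whose buffer parameters are char**; on platforms where the input parameter is const char** the cast changes, and you may need to link with -liconv):
#include <iconv.h>
#include <cstdio>
#include <cstring>

int main() {
    const char ascii[] = "KOTEX";
    char out[64];                              // plenty of room for the UTF-16 output
    char *inbuf  = const_cast<char*>(ascii);   // glibc's iconv wants char**, not const char**
    char *outbuf = out;
    size_t inleft  = strlen(ascii);            // 5 input bytes
    size_t outleft = sizeof out;

    iconv_t cd = iconv_open("UTF-16LE", "ASCII");   // to-encoding, from-encoding
    if (cd == (iconv_t)-1) { perror("iconv_open"); return 1; }

    if (iconv(cd, &inbuf, &inleft, &outbuf, &outleft) == (size_t)-1) {
        perror("iconv");
        return 1;
    }
    iconv_close(cd);

    size_t produced = sizeof out - outleft;    // 10 bytes: 2 per character, no BOM for UTF-16LE
    for (size_t i = 0; i < produced; ++i)
        printf("0x%02X ", (unsigned)(unsigned char)out[i]);
    printf("\n");   // 0x4B 0x00 0x4F 0x00 0x54 0x00 0x45 0x00 0x58 0x00
    return 0;
}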

Output is not what it should be

So there is a program I am working on that requires me to access data from a char array containing hex values. In this example I have to use a function called func() in order to access the data. func() contains 3 pointer variables, each of a different type, and I can use any of them to access the data in the array. Whichever data type I choose will affect what values I get back through the pointer. So here's the code:
unsigned char data[] =
{
    0xBA, 0xDA, 0x69, 0x50,
    0x33, 0xFF, 0x33, 0x40,
    0x20, 0x10, 0x03, 0x30,
    0x66, 0x03, 0x33, 0x40,
};

void func()
{
    unsigned char *ch;
    unsigned int *i;
    unsigned short *s;
    unsigned int v;

    s = (unsigned short *)&data[0];
    v = s[6];                        /* bytes 12-13, little-endian: 0x0366 */
    printf("val:0x%x \n", v);
}
Output:
Val:0x366
The problem with this output is that it should be 0x0366 with the zero in front of the 3, but it gets cut off at the printf statement, and I'm not allowed to modify that. How else could I fix this?
Use a format that specifies leading zeros: %04x.
Without changing the format passed to printf or replacing it entirely I'm afraid there's no way to affect the output.
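For reference, a tiny sketch of what the suggested width specifier would print if the format string could be changed:
#include <cstdio>

int main() {
    unsigned int v = 0x366;
    std::printf("val:0x%x \n", v);     // prints val:0x366
    std::printf("val:0x%04x \n", v);   // prints val:0x0366 (zero-padded to 4 hex digits)
}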

C++ Send bytes from a string?

I am writing a little program that talks to the serial port. I got the program working fine with a line like this:
unsigned char send_bytes[] = { 0x0B, 0x11, 0x00, 0x02, 0x00, 0x69, 0x85, 0xA6, 0x0e, 0x01, 0x02, 0x3, 0xf };
However, the string to send is variable, so I want to make something like this:
char *blahstring;
blahstring = "0x0B, 0x11, 0x00, 0x02, 0x00, 0x69, 0x85, 0xA6, 0x0e, 0x01, 0x02, 0x3, 0xf"
unsigned char send_bytes[] = { blahstring };
It doesn't give me an error, but it also doesn't work. Any ideas?
A byte string is something like this:
char *blahString = "\x0B\x11\x00\x02\x00\x69\x85\xA6\x0E\x01\x02\x03\x0f";
Also, remember that this is not a regular text string. It would be wise to state it explicitly as an array of characters with a specific size.
Like so:
unsigned char blahString[14] = {"\x0B\x11\x00\x02\x00\x69\x85\xA6\x0E\x01\x02\x03\x0f"}; /* 13 data bytes plus room for the implicit terminating '\0' */
unsigned char sendBytes[13];
memcpy(sendBytes, blahString, 13); // and you've successfully copied 13 bytes from blahString to sendBytes
not the way you've defined it.
EDIT:
To answer why your first send_bytes works and the second doesn't:
The first one creates an array of individual bytes, whereas the second one creates a string of ASCII characters. So the length of the first send_bytes is 13 bytes, while the length of the second send_bytes is much higher, since the sequence of bytes is the ASCII equivalent of the individual characters in blahstring.
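A minimal sketch of that size difference, using the two forms from the question for illustration:
#include <cstdio>
#include <cstring>

int main() {
    unsigned char send_bytes[] = { 0x0B, 0x11, 0x00, 0x02, 0x00, 0x69, 0x85, 0xA6,
                                   0x0e, 0x01, 0x02, 0x03, 0x0f };
    const char* blahstring = "0x0B, 0x11, 0x00, 0x02, 0x00, 0x69, 0x85, 0xA6, "
                             "0x0e, 0x01, 0x02, 0x3, 0xf";
    std::printf("byte array: %zu bytes\n", sizeof send_bytes);            // 13 bytes
    std::printf("text form:  %zu characters\n", std::strlen(blahstring)); // many more
}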
blahstring is a string of characters.
Its 1st character is 0, 2nd character is x, 3rd character is 0, 4th character is B, etc. So the line
unsigned char send_bytes[] = { blahstring };
is an array that (assuming you perform a cast!) will have just one item.
But the example that works is an array whose 1st element has the value 0x0B, whose 2nd element has the value 0x11, and so on.
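If the byte sequence really does arrive as text at run time, one possible approach (a sketch, not the only way) is to parse the comma-separated hex values into a byte buffer with strtoul:
#include <cstdio>
#include <cstdlib>
#include <string>
#include <vector>

// Parse text such as "0x0B, 0x11, 0x3" into raw byte values.
std::vector<unsigned char> parseByteString(const std::string& text) {
    std::vector<unsigned char> bytes;
    const char* p = text.c_str();
    char* end = nullptr;
    while (*p != '\0') {
        unsigned long value = std::strtoul(p, &end, 16);  // reads one "0x.." token
        if (end == p) break;                              // no number found, stop
        bytes.push_back(static_cast<unsigned char>(value));
        p = end;
        while (*p == ',' || *p == ' ') ++p;               // skip separators
    }
    return bytes;
}

int main() {
    std::vector<unsigned char> send_bytes = parseByteString(
        "0x0B, 0x11, 0x00, 0x02, 0x00, 0x69, 0x85, 0xA6, 0x0e, 0x01, 0x02, 0x3, 0xf");
    for (unsigned char b : send_bytes)
        std::printf("0x%02X ", (unsigned)b);
    std::printf("\n");   // same 13 values as the working array
}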