I am writing a little program that talks to the serial port. I got the program working fine with a line like this:
unsigned char send_bytes[] = { 0x0B, 0x11, 0x00, 0x02, 0x00, 0x69, 0x85, 0xA6, 0x0e, 0x01, 0x02, 0x3, 0xf };
However, the string to send is variable, so I want to make something like this:
char *blahstring;
blahstring = "0x0B, 0x11, 0x00, 0x02, 0x00, 0x69, 0x85, 0xA6, 0x0e, 0x01, 0x02, 0x3, 0xf";
unsigned char send_bytes[] = { blahstring };
It doesn't give me an error, but it also doesn't work. Any ideas?
A byte-string is something like this:
char *blahString = "\x0B\x11\x00\x02\x00\x69\x85\xA6\x0E\x01\x02\x03\x0f";
Also, remember that this is not a regular string. It would be wise to state it explicitly as an array of characters with a specific size, like so:
unsigned char blahString[13] = {"\x0B\x11\x00\x02\x00\x69\x85\xA6\x0E\x01\x02\x03\x0f"};
unsigned char sendBytes[13];
memcpy(sendBytes, blahString, 13); // and you've successfully copied 13 bytes from blahString to sendBytes
not the way you've defined it.
EDIT:
To answer why your first send_bytes works and the second doesn't:
The first one creates an array of individual byte values, whereas the second one gives you a string of ASCII characters. The first send_bytes is 13 bytes long, but blahstring is far longer, because it holds the ASCII code of every character in the text ('0', 'x', '0', 'B', ',', ' ', ...) rather than the 13 byte values themselves.
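A quick way to see the difference is to compare the sizes directly; a small sketch using the values from the question:
#include <cstdio>
#include <cstring>

int main()
{
    unsigned char send_bytes[] = { 0x0B, 0x11, 0x00, 0x02, 0x00, 0x69, 0x85, 0xA6,
                                   0x0e, 0x01, 0x02, 0x3, 0xf };
    const char *blahstring = "0x0B, 0x11, 0x00, 0x02, 0x00, 0x69, 0x85, 0xA6, "
                             "0x0e, 0x01, 0x02, 0x3, 0xf";

    std::printf("%u\n", (unsigned)sizeof send_bytes);        // 13: one byte per value
    std::printf("%u\n", (unsigned)std::strlen(blahstring));  // far larger: one char per text symbol
    return 0;
}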
blahstring is a string of characters.
Its 1st character is '0', its 2nd character is 'x', its 3rd is '0', its 4th is 'B', and so on. So the line
unsigned char send_bytes[] = { blahstring };
creates an array that (assuming you perform a cast!) will have just one item.
But in the example that works, the array's 1st element has the value 0x0B, its 2nd element has the value 0x11, and so on.
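If the goal is to build the byte array from such a text string at run time, here is a rough sketch of one way to do it with strtoul (parse_hex_bytes is just an illustrative name, not an existing function):
#include <cstddef>
#include <cstdio>
#include <cstdlib>

// Parse a comma-separated list of hex literals such as "0x0B, 0x11, 0x00"
// into out[], returning the number of bytes parsed.
size_t parse_hex_bytes(const char *text, unsigned char *out, size_t max)
{
    size_t count = 0;
    char *end;
    while (*text != '\0' && count < max) {
        unsigned long value = std::strtoul(text, &end, 16);  // reads one "0x.." token
        if (end == text)
            break;                               // no digits found, stop
        out[count++] = (unsigned char)value;
        text = end;
        while (*text == ',' || *text == ' ')
            ++text;                              // skip separators
    }
    return count;
}

int main()
{
    const char *blahstring = "0x0B, 0x11, 0x00, 0x02, 0x00, 0x69, 0x85, 0xA6";
    unsigned char send_bytes[32];
    size_t n = parse_hex_bytes(blahstring, send_bytes, sizeof send_bytes);

    for (size_t i = 0; i < n; ++i)
        std::printf("%02X ", send_bytes[i]);     // prints 0B 11 00 02 00 69 85 A6
    std::printf("\n");
    return 0;
}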
Related
I am writing a DLL that passes a char array to a function. I define that char array with 22 elements here:
unsigned char data[22] = { 0x00, 0x0A, 0x00, 0x09, 0x70, 0x00, 0x72, 0x00,
0x6F, 0x00, 0x74, 0x00, 0x68, 0x00, 0x65, 0x00, 0x67, 0x00, 0x75, 0x00,
0x79, 0x00 };
Now, I try to pass this array to my function declared as:
bool sendData(unsigned char* sData, unsigned long sSize);
With these arguments:
sendData(data, 22);
This code compiles, but crashes the program when this function is called. Taking a closer look while debugging, I can see that there's an access violation in my function sendData. Looking even further, I see the values of data and sData at run-time:
data points to the 22 byte char array with correct values (obviously)
sData points to a char array that is null-terminated by the first byte, only containing one value (0)
It is clear to me that the compiler does not know to allocate 22 bytes for sData, simply because I do not specify any length for it. So my question is:
How do I specify the length of sData so that the argument passed won't terminate early?
If I'm wrong about the issue, please correct me and explain it further. Thanks for any help in advance!
EDIT:
I understand that \0 (the first byte and many more in data) is a null-terminator and will prematurely end the array. What I am asking is how to avoid this. My understanding is that sData is never given a specific length and therefore stops on \0, but I may be wrong.
I was asked to supply my sendData function:
bool sendData(unsigned char* sData, unsigned long sSize)
{
    try
    {
        Send(sData, sSize);
        return true;
    }
    catch (...)
    {
        return false;
    }
}
Send is calling a function from another module, but isn't relevant to the issue, as the error occurs beforehand when the sData argument is passed to sendData.
No allocation of sData is going to happen; it just points to your array. It displays as empty in the debugger because the debugger shows a char* as a string, and strings end at the first '\0', which is your first byte. This does not mean sData does not have the correct data. Watch sData[0], sData[1], etc. in your debugger to see the correct values.
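As a small sketch of that point (the Send() here is only a stand-in for the real one from the other module, and just dumps the buffer), the embedded zero bytes do not shorten what sData receives:
#include <cstdio>

// Stand-in for the real Send() from the other module: dump the buffer byte by byte.
// Embedded zeros are no problem because we rely on sSize, not on a null terminator.
void Send(unsigned char* sData, unsigned long sSize)
{
    for (unsigned long i = 0; i < sSize; ++i)
        std::printf("%02X ", sData[i]);
    std::printf("\n");
}

bool sendData(unsigned char* sData, unsigned long sSize)
{
    Send(sData, sSize);   // sData still points at all 22 bytes of data[]
    return true;
}

int main()
{
    unsigned char data[22] = { 0x00, 0x0A, 0x00, 0x09, 0x70, 0x00, 0x72, 0x00,
                               0x6F, 0x00, 0x74, 0x00, 0x68, 0x00, 0x65, 0x00,
                               0x67, 0x00, 0x75, 0x00, 0x79, 0x00 };
    return sendData(data, 22) ? 0 : 1;
}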
I am having a problem with a school assignment. The assignment is to write a metamorphic Hello World program. This program will produce 10 .com files that print "Hello World!" when executed. Each of the 10 .com files must be different from the others. I understand the concept of metamorphic vs oligomorphic vs polymorphic. My program currently creates 10 .com files and then writes the machine code to the files. I began by simply writing only the machine code to print hello world and tested it. It worked just fine. I then tried to add a decryption routine to the beginning of the machine code. Here is my current byte array:
#define ARRAY_SIZE(array) (sizeof((array))/sizeof((array[0])))
typedef unsigned char BYTE;
BYTE pushCS = 0x0E;
BYTE popDS = 0x1F;
BYTE movDX = 0xBA;
BYTE helloAddr1 = 0x1A;
BYTE helloAddr2 = 0x01;
BYTE movAH = 0xB4;
BYTE nine = 0x09;
BYTE Int = 0xCD;
BYTE tOne = 0x21;
BYTE movAX = 0xB8;
BYTE ret1 = 0x01;
BYTE ret2 = 0x4C;
BYTE movBL = 0xB3;
BYTE keyVal = 0x03; // Encrypt/Decrypt key
BYTE data[] = { 0x8D, 0x0E, 0x01, 0xB7, 0x1D, 0xB3, keyVal, 0x30, 0x1C, 0x46, 0xFE, 0xCF, 0x75, 0xF9,
movDX, helloAddr1, helloAddr2, movAH, nine, Int, tOne, movAX, ret1, ret2, Int, tOne,
0x48, 0x65, 0x6C, 0x6C, 0x6F, 0x20, 0x57, 0x6F, 0x72, 0x6C, 0x64, 0x21, 0x0D, 0x0D, 0x0A, 0x24 };
The decryption portion of the machine code is the first 14 bytes of "data". This decryption routine would take the obfuscated machine code bytes and decrypt them by xor-ing the bytes with the same key that was used to encrypt them. I am encrypting the bytes in my C++ code with this:
for (int i = 15; i < ARRAY_SIZE(data); i++)
{
data[i] ^= keyVal;
}
I have verified over and over again that my addressing is correct, considering that the code begins at offset 100. What I have noticed is that when keyVal is 0x00, my code runs fine and I get 10 .com files that print Hello World!. However, this does me no good, as XOR with 0x00 leaves everything unchanged. When I provide an actual key like 0x02, my program no longer works: it simply hangs until I close DOSBox. Any hints as to the cause of this would be a great help. I have some interesting plans for junk insertion (the actual metamorphic part), but I don't want to move on to that until I figure out this encrypt/decrypt issue.
The decryption portion of the machine code is the first 14 bytes of "data".
and
for (int i = 15; i < ARRAY_SIZE(data); i++)
do not match since in C++ array indexes start at 0.
In your array, data[15] == helloAddr1, which means you are not encrypting the data[14] == movDX element. Double-check which elements should be encrypted and start at i = 14 if required.
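In other words, assuming the stub really is data[0] through data[13], the corrected loop would be:
// data[0] through data[13] are the decryption stub; encrypt everything after it.
for (int i = 14; i < ARRAY_SIZE(data); i++)
{
    data[i] ^= keyVal;
}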
My array looks something like this:
unsigned char send_bytes[] = { 0x0B, 0x11, 0xA6, 0x05, 0x00, 0x00, 0x70 };
One of the values is a variable that can change all the time, so I tried something like this:
const char* input = "0x05";
unsigned char send_bytes[] = { 0x0B, 0x11, 0xA6, input, 0x00, 0x00, 0x70 };
When I compile I get a warning:
warning: initialization makes integer from pointer without a cast
I am a little confused about the conversion I need to do, since the array has hex strings in it and the input string is a char.
In the first line you are declaring a pointer to const char and initializing it to point at the beginning of the string "0x05". That's fine, but it is not what you are trying to do.
In the second line, you try to initialize the fourth array element (an unsigned char) with the value of the pointer you assigned to input in the first line. The compiler complains because you are trying to squeeze a pointer value (the address of the "0x05" string) into a char variable, and that is also not what you intend.
Also, take into account that since you are working with binary data (you are initializing arrays with hex numbers), you are better off using unsigned char, as signed char only covers the values -128 to +127 and you can otherwise run into unexpected behaviour. A declaration like typedef unsigned char byte; can make things easier:
typedef unsigned char byte;
byte send_bytes[] = { 0x0b, 0x11, 0xa6, 0x00, 0x00, 0x00, 0x70 };
byte &input = send_bytes[3]; /* input is an alias of send_bytes[3] */
Maybe explaining exactly what const char* input = "0x05"; does will clear things up for you.
First the compiler computes the string data and creates it as a static object:
const char data[5] = { 0x30, 0x78, 0x30, 0x35, 0x0 };
Then your variable is initialized:
const char *input = &data[0];
Note that input is a pointer with a value that depends entirely upon the location the compiler chooses to store the string data at, and has nothing to do with the contents of the string. So if you say char c = input; then c basically gets assigned a random number.
So you should be asking yourself "Where is the value 0x05 that I want to store in the send_bytes array?" In your code it's encoded as text, rather than as a number that your program can use directly. You need to figure out how to convert from a string of symbols following the hexadecimal scheme of representing numbers into C++'s native representation of numbers.
Here are a couple hints. Part of the operation involves associating values with each digit symbol. The symbol '0' is associated with the value zero, '1' with the value one, and so on, according to the usual hexadecimal system. Second, once you can get the associated value of a symbol, then you can use those values in some basic arithmetic operations to figure out the value of the number represented by the whole string of symbols.
For example, if you have the symbols '1', '2' and 'a', in that order from left to right, then the arithmetic to compute what number is represented is 1 * 16 * 16 + 2 * 16 + 10.
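A rough sketch of that idea in code (hex_value and parse_hex are illustrative names, not anything from the question):
#include <cstdio>

// Map one hex digit symbol to its value; anything else returns -1.
int hex_value(char c)
{
    if (c >= '0' && c <= '9') return c - '0';
    if (c >= 'a' && c <= 'f') return c - 'a' + 10;
    if (c >= 'A' && c <= 'F') return c - 'A' + 10;
    return -1;
}

// Convert a string of hex symbols (with an optional "0x" prefix) to a number.
unsigned parse_hex(const char *s)
{
    if (s[0] == '0' && (s[1] == 'x' || s[1] == 'X'))
        s += 2;                                  // skip the "0x" prefix
    unsigned result = 0;
    for (; hex_value(*s) >= 0; ++s)
        result = result * 16 + hex_value(*s);    // shift left one hex digit, add the next
    return result;
}

int main()
{
    std::printf("%u\n", parse_hex("12a"));       // prints 298 (1*16*16 + 2*16 + 10)
    std::printf("%u\n", parse_hex("0x05"));      // prints 5
    return 0;
}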
The warning is pretty much telling you exactly what's wrong.
input is of type const char* (a pointer to a const char), whereas your array send_bytes is of type unsigned char[] (an array of unsigned chars).
First, signed and unsigned values are still different types, though your warning isn't referring to that specifically.
In reality, your input value isn't a string (as there is no true string type in C++), but a pointer to a character. This means that the input string doesn't hold the byte x05, but rather the bytes {x30, x78, x30, x35, x00}.
The compiler is saying: Hey, I've no idea what you're trying to do, so I'm just converting the address of that string I don't understand (input) to an unsigned char and putting it in the array.
That means if the string "0x05" starts at location 0xAB, your array will ultimately contain { 0x0B, 0x11, 0xA6, 0xAB, 0x00, 0x00, 0x70 }.
You're going to either have to convert from a string to an integer using a radix of 16, or just not use a string at all.
I'd also recommend reading up on pointers.
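For the first option, a minimal sketch using the standard strtol with base 16 (which understands the optional "0x" prefix):
#include <cstdio>
#include <cstdlib>

int main()
{
    const char* input = "0x05";
    unsigned char value = (unsigned char)std::strtol(input, nullptr, 16);  // text -> number
    unsigned char send_bytes[] = { 0x0B, 0x11, 0xA6, value, 0x00, 0x00, 0x70 };

    std::printf("%02X\n", send_bytes[3]);   // prints 05
    return 0;
}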
The array doesn't have "hex strings" in it - if it did, they would be enclosed in quotation marks, like all strings.
The literals are integers written in hexadecimal notation, and are equivalent to
unsigned char send_bytes[] = { 11, 17, 166, input, 0, 0, 112 };
Since it's an array of unsigned char you should put an unsigned char there:
unsigned char input = 0x05;
unsigned char send_bytes[] = { 0x0B, 0x11, 0xA6, input, 0x00, 0x00, 0x70 };
You would be better off putting this in your code:
unsigned char send_bytes[] = { 0x0b, 0x11, 0xa6, 0x00, 0x00, 0x00, 0x70 };
unsigned char &input = send_bytes[3]; /* input is an alias of send_bytes[3] */
This way you can do things like:
input = 0x26;
send_packet(send_bytes);
So there is a program that I am working on that requires me to access data from a char array containing hex values. I have to use a function called func(), in this example, in order to access the data structure. func() contains 3 pointer variables, each of a different type, and I can use any of them to access the data in the array. Whichever datatype I choose will affect what values are stored through the pointer. So here's the code:
#include <stdio.h>

unsigned char data[] =
{
    0xBA, 0xDA, 0x69, 0x50,
    0x33, 0xFF, 0x33, 0x40,
    0x20, 0x10, 0x03, 0x30,
    0x66, 0x03, 0x33, 0x40,
};

void func()
{
    unsigned char *ch;
    unsigned int *i;
    unsigned short *s;
    unsigned int v;

    s = (unsigned short*)&data[0];
    v = s[6];
    printf("val:0x%x \n",v);
}
Output:
val:0x366
The problem with this output is that it should be 0x0366, with the zero in front of the 3, but the leading zero gets cut off at the printf statement, and I'm not allowed to modify that. How else could I fix this?
Use a format that specifies leading zeros: %04x.
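For comparison, a tiny sketch of the two formats side by side:
#include <cstdio>

int main()
{
    unsigned int v = 0x366;
    std::printf("val:0x%x \n", v);     // prints val:0x366
    std::printf("val:0x%04x \n", v);   // prints val:0x0366 (padded to four digits)
    return 0;
}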
Without changing the format passed to printf, or replacing the call entirely, I'm afraid there's no way to affect the output.
I am using MSVC++ 2010 Express, and I would love to know how to convert
BYTE Key[] = {0x50,0x61,0x73,0x73,0x77,0x6F,0x72,0x64};
to "Password" I am having a lot of trouble doing this. :( I will use this knowledge to take things such as...
BYTE Key[] = { 0xC2, 0xB3, 0x72, 0x3C, 0xC6, 0xAE, 0xD9, 0xB5, 0x34, 0x3C, 0x53, 0xEE, 0x2F, 0x43, 0x67, 0xCE };
And other various variables and convert them accordingly.
I'd like to end up with "Password" stored in a char*.
Key is an array of bytes. If you want to store it in a string, for example, you should construct the string using its range constructor, that is:
string key_string(Key, Key + sizeof(Key)/sizeof(Key[0]));
Or if you can compile using C++11:
string key_string(begin(Key), end(Key));
To get a char* I'd go the C way and use strndup:
char* key_string = strndup(reinterpret_cast<const char*>(Key), sizeof(Key)/sizeof(Key[0]));
However, if you're using C++ I strongly suggest you use string instead of char* and only convert to char const* when absolutely necessary (e.g. when calling a C API). See here for good reasons to prefer std::string.
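Putting the std::string route together as a self-contained sketch (std:: qualifiers written out for clarity):
#include <iostream>
#include <string>

typedef unsigned char BYTE;

int main()
{
    BYTE Key[] = { 0x50, 0x61, 0x73, 0x73, 0x77, 0x6F, 0x72, 0x64 };

    // Range constructor: copies exactly sizeof(Key) bytes, no terminator required.
    std::string key_string(Key, Key + sizeof(Key) / sizeof(Key[0]));
    std::cout << key_string << '\n';               // prints Password

    const char* as_c_string = key_string.c_str();  // convert only when a C API needs it
    std::cout << as_c_string << '\n';
    return 0;
}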
All you are lacking is a null terminator, so after doing this:
char Key_str[(sizeof Key)+1];
memcpy(Key_str, Key, sizeof Key);
Key_str[sizeof Key] = '\0';
Key_str will be usable as a regular char * style string.
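For example, as a complete sketch (the BYTE typedef is written out here, since the question doesn't show where it comes from):
#include <cstdio>
#include <cstring>

typedef unsigned char BYTE;

int main()
{
    BYTE Key[] = { 0x50, 0x61, 0x73, 0x73, 0x77, 0x6F, 0x72, 0x64 };

    char Key_str[(sizeof Key) + 1];
    std::memcpy(Key_str, Key, sizeof Key);  // copy the raw bytes
    Key_str[sizeof Key] = '\0';             // add the missing terminator
    std::printf("%s\n", Key_str);           // prints Password
    return 0;
}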