I have a function in C++ that accepts a char[] argument. If the char[] is hardcoded like
char t[] = { 0x41, 0x54, 0x2b, 0x4d,
0x53, 0x4c, 0x53, 0x45, 0x43, 0x55,
0x52, 0x3d, 0x31, 0x2c, 0x30 };
it works. I need to be able to get the char[] from cin. The data from cin will be the ASCII text AT+MSLSECUR=1,0, and I need to convert it to the equivalent of the hardcoded char[] I show above.
I don't know where to start. I tried simply making cin read into a char[], but that doesn't seem to work; the char[] data is wrong.
I am new to this, so please forgive my lack of knowledge.
I guess this is what you are asking for:
#include <cstring> // for strcpy
std::string input;
std::cin >> input; // note: >>, not <<; reads one whitespace-delimited token
char *YourCharArray = new char[input.size() + 1];
std::strcpy(YourCharArray, input.c_str());
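For completeness, here is a minimal runnable sketch of the whole flow; sendCommand is a hypothetical stand-in for the poster's function:
#include <cstddef>
#include <cstdio>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical stand-in for the function that takes a char[] and its length.
void sendCommand(const char *data, std::size_t len) {
    for (std::size_t i = 0; i < len; ++i)
        std::printf("0x%02x ", static_cast<unsigned char>(data[i]));
    std::printf("\n");
}

int main() {
    std::string input;
    std::getline(std::cin, input); // e.g. "AT+MSLSECUR=1,0"

    // The string's characters already ARE the bytes 0x41, 0x54, 0x2b, ...
    // so no conversion is needed, only a copy into a char buffer if one is required.
    std::vector<char> buffer(input.begin(), input.end());
    sendCommand(buffer.data(), buffer.size());
}
Typing AT+MSLSECUR=1,0 should print exactly the byte values of the hardcoded array above.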
So there is a program I am working on that requires me to access data from a char array containing hex values. I have to use a function called func(), in this example, in order to access the data structure. func() contains three pointer variables, each of a different type, and I can use any of them to access the data in the array. Whichever datatype I choose affects what values are read through the pointer. So here's the code:
unsigned char data[] =
{
    0xBA, 0xDA, 0x69, 0x50,
    0x33, 0xFF, 0x33, 0x40,
    0x20, 0x10, 0x03, 0x30,
    0x66, 0x03, 0x33, 0x40,
};

void func()
{
    unsigned char *ch;
    unsigned int *i;
    unsigned short *s;
    unsigned int v;
    s = (unsigned short *)&data[0];
    v = s[6]; /* bytes data[12] and data[13] read as one short */
    printf("val:0x%x \n", v);
}
Output:
val:0x366
The problem with this output is that it should be 0x0366, with the zero in front of the 3, but the leading zero gets cut off at the printf statement, and I'm not allowed to modify that. How else could I fix this?
Use a format that specifies leading zeros: %04x. The value itself is correct: s[6] reads data[12] and data[13] (0x66, 0x03), which a little-endian machine combines into the short 0x0366; plain %x just prints it without the leading zero.
Without changing the format passed to printf, or replacing the call entirely, I'm afraid there's no way to affect the output.
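A minimal illustration of the difference, using the value from the question:
#include <cstdio>

int main()
{
    unsigned int v = 0x366;
    std::printf("val:0x%x\n", v);   // prints val:0x366
    std::printf("val:0x%04x\n", v); // prints val:0x0366
}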
How could I pass text as an argument and automatically transform it into a hex byte array?
I tried with:
unsigned char aesKey[32] = argv[1];
but I get errors.
The result should be equivalent to this:
unsigned char aesKey[32] = {
0x53, 0x28, 0x40, 0x6e, 0x2f, 0x64, 0x63, 0x5d, 0x2d, 0x61, 0x77, 0x40, 0x76, 0x71, 0x77, 0x28,
0x74, 0x61, 0x7d, 0x66, 0x61, 0x73, 0x3b, 0x5d, 0x66, 0x6d, 0x3c, 0x3f, 0x7b, 0x66, 0x72, 0x36
};
unsigned char *buf;
aes256_context ctx;
aes256_init(&ctx, aesKey);
// text (the buffer being encrypted) and lSize (its length) are defined elsewhere
for (unsigned long i = 0; i < lSize/16; i++) {
buf = text + (i * 16);
aes256_encrypt_ecb(&ctx, buf);
}
aes256_done(&ctx);
Thanks in advance
In C and C++, when you have code like
char name[]="John Smith";
The compiler knows at compile time what the size of that char array, and all the values will be. So it can allocate it on the stack frame and assign it the value.
When you have code like
char * strptr = foo();
char str[] = strptr;
The compiler doesn't know the size or the value of the string pointed to by strptr. That is why this is not allowed in C/C++.
In other words, only string literals can be assigned to char arrays, and only at the time of declaration.
So
char name[] = "John Smith";
is allowed.
char name[32];
name = "John Smith";
is not allowed.
Use memcpy
So you could use memcpy (or a C++ alternative that others have alluded to):
unsigned char *aesKey;
size_t len = strlen(argv[1]) + 1; // +1 for the terminating '\0'
aesKey = (unsigned char *)malloc(len); // the cast is required in C++
memcpy(aesKey, argv[1], len);
The old solution
(this was my previous answer; the one above is better)
So you need to use strncpy.
unsigned char aesKey[32];
strncpy((char *)aesKey, argv[1], 32); // note: does not null-terminate if argv[1] is 32 chars or longer
Notice the routine is strncpy, not strcpy; strcpy is unsafe. (Thanks PRouleau for the arg fix.)
If strncpy is not available in Visual Studio, you may have to use strcpy_s instead. (Thanks Google user:427390.)
In C/C++, the compiler does not copy arrays for you. You have to specify how to copy them.
The good old way is with memcpy(). A more modern way is with std::copy(). In any case, you have to validate the length of argv[1] before copying into aesKey.
For the conversion into hex, you probably have to transform a string like "AAEE3311" (up to 2*32 chars) into bytes. You should use std::istringstream and fill your aesKey position by position.
Ex:
std::istringstream Input(argv[1]);
unsigned int byte;
Input >> std::hex >> byte; // read into an int: operator>> into a char reads one character and ignores std::hex
aesKey[0] = static_cast<unsigned char>(byte);
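A fuller, runnable sketch of that idea, assuming the key arrives as exactly 64 hex digits (two per key byte); the choice of strtol for the parsing is illustrative:
#include <cstdio>
#include <cstdlib>
#include <cstring>

int main(int argc, char *argv[])
{
    unsigned char aesKey[32] = {0};

    // Expect exactly two hex digits per key byte: 64 characters for 32 bytes.
    if (argc < 2 || std::strlen(argv[1]) != 64) {
        std::fprintf(stderr, "usage: prog <64 hex digits>\n");
        return 1;
    }
    for (int i = 0; i < 32; ++i) {
        // Convert each two-character slice with strtol, base 16.
        char pair[3] = { argv[1][2 * i], argv[1][2 * i + 1], '\0' };
        aesKey[i] = (unsigned char)std::strtol(pair, NULL, 16);
    }
    for (int i = 0; i < 32; ++i)
        std::printf("0x%02x ", aesKey[i]); // echo the parsed key to verify
    std::printf("\n");
    return 0;
}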
I would imagine a program being called as below -
myprog 0x53 0x28 0x40 0x6e 0x2f 0x64 0x63
Inside the program I would have a loop to assign the arguments to the array -
const int size = 32;
unsigned char aesKey[size];
char *p;
for (int i = 1; i < argc && i <= size; ++i)
{
    aesKey[i - 1] = (unsigned char)strtol(argv[i], &p, 16); // argv[1] fills aesKey[0], and so on
}
I am using MSVC++ 2010 Express, and I would love to know how to convert
BYTE Key[] = {0x50,0x61,0x73,0x73,0x77,0x6F,0x72,0x64};
to "Password" I am having a lot of trouble doing this. :( I will use this knowledge to take things such as...
BYTE Key[] { 0xC2, 0xB3, 0x72, 0x3C, 0xC6, 0xAE, 0xD9, 0xB5, 0x34, 0x3C, 0x53, 0xEE, 0x2F, 0x43, 0x67, 0xCE };
And other various variables and convert them accordingly.
Id like to end up with "Password" stored in a char.
Key is an array of bytes. If you want to store it in a string, for example, you should construct the string using its range constructor, that is:
string key_string(Key, Key + sizeof(Key)/sizeof(Key[0]));
Or if you can compile using C++11:
string key_string(begin(Key), end(Key));
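A quick check of either constructor, using the Key array from the question (assuming BYTE is the usual Windows typedef for unsigned char):
#include <iostream>
#include <string>

typedef unsigned char BYTE; // assumption: the Windows typedef

int main()
{
    BYTE Key[] = {0x50, 0x61, 0x73, 0x73, 0x77, 0x6F, 0x72, 0x64};
    std::string key_string(Key, Key + sizeof(Key) / sizeof(Key[0]));
    std::cout << key_string << '\n'; // prints Password
}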
To get a char* I'd go the C way and use strndup (note it is POSIX, not standard C++, and not available in MSVC):
char* key_string = strndup((const char *)Key, sizeof(Key)/sizeof(Key[0]));
However, if you're using C++ I strongly suggest you use string instead of char* and only convert to char const* when absolutely necessary (e.g. when calling a C API). See here for good reasons to prefer std::string.
All you are lacking is a null terminator, so after doing this:
char Key_str[(sizeof Key)+1];
memcpy(Key_str, Key, sizeof Key);
Key_str[sizeof Key] = '\0';
Key_str will be usable as a regular char * style string.
Can't figure out why I am getting seemingly random output from the Crypto++ RC2 decoder. The input is always the same, but the output is always different.
const char * cipher ("o4hk9p+a3+XlPg3qzrsq5PGhhYsn+7oP9R4j9Yh7hp08iMnNwZQnAUrZj6DWr37A4T+lEBDMo8wFlxliuZvrZ9tOXeaTR8/lUO6fXm6NQpa5P5aQmQLAsmu+eI4gaREvZWdS0LmFxn8+zkbgN/zN23x/sYqIzcHU");
int keylen (64);
unsigned char keyText[] = { 0x1a, 0x1d, 0xc9, 0x1c, 0x90, 0x73, 0x25, 0xc6, 0x92, 0x71, 0xdd, 0xf0, 0xc9, 0x44, 0xbc, 0x72, 0x00 };
std::string key((char*)keyText);
std::string data;
CryptoPP::RC2Decryption rc2(reinterpret_cast<const byte *>(key.c_str()), keylen);
CryptoPP::ECB_Mode_ExternalCipher::Decryption rc2Ecb(rc2);
CryptoPP::StringSource
( cipher
, true
, new CryptoPP::Base64Decoder
( new CryptoPP::StreamTransformationFilter
( rc2Ecb
, new CryptoPP::StringSink(data)
, CryptoPP::BlockPaddingSchemeDef::NO_PADDING
)
)
);
std::cout << data << '\n';
The parameters to the RC2::Decryption constructor are: (pointer to key-bytes, length of key-bytes). You are giving it a pointer to 16 bytes but using a length of 64 bytes. Crypto++ is reading uninitialized memory when reading the key, so you get random results.
If you want to indicate an effective key-length, you can use the other constructor like this:
CryptoPP::RC2Decryption rc2(keyText, 16, keylen);
Note that you should not use a std::string to hold your key. It is completely legal for a key to contain a 0x00 byte, and constructing a std::string from a char* stops at the first 0x00, silently truncating the key.
RC2Decryption should have been defined as:
CryptoPP::RC2Decryption rc2(reinterpret_cast<const byte *>(key.c_str()), key.size(), keylen);
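A minimal sketch of the corrected setup; CryptoPP::SecByteBlock is a better container for key material than std::string (this only rearranges the question's own code, header paths as in a typical install):
#include <cryptopp/rc2.h>
#include <cryptopp/secblock.h>

int main()
{
    // The key bytes from the question, without the std::string detour.
    const unsigned char keyText[] = {
        0x1a, 0x1d, 0xc9, 0x1c, 0x90, 0x73, 0x25, 0xc6,
        0x92, 0x71, 0xdd, 0xf0, 0xc9, 0x44, 0xbc, 0x72 };
    CryptoPP::SecByteBlock key(keyText, sizeof(keyText));

    // (key bytes, key length in bytes, effective key length), as in the answer above
    CryptoPP::RC2Decryption rc2(key, key.size(), 64);
    return 0;
}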
I am interfacing with an external device which is sending data in hex format. It is of form
> %abcdefg,+xxx.x,T,+yy.yy,T,+zz.zz,T,A*hhCRLF
CR LF is carriage return line feed
hh->checksum
%abcdefg -> header
Each character in the above packet is sent as a hex representation (the xx, yy, abcd etc. are replaced with actual numbers). The problem is that at my end I store it in a const char*, and during the implicit conversion the checksum, say 0x05, is converted to \0x05. Here \0, being the null character, terminates my string. Valid frames are thus perceived as incorrect when they are not. I could change the implementation to process raw bytes (in hex form), but I was wondering whether there is another way out, because string handling greatly simplifies the processing of bytes. And this is what programmers are meant to do.
Also, in cutecom (on Linux RHEL 4) I checked the data on the serial port, and there we also noticed \0x05 instead of 5 for the checksum.
Note that for storing incoming data I am using
//store data from serial here
unsigned char Buffer[SIZE];
// convert to a QString; here is where the problem arises
QString str((const char*)Buffer);
QString is Qt's "string" clone. The library is not the issue here; I could use the STL as well, but the C++ string library does the same thing. Has somebody tried this type of experiment before? Do share your views.
EDIT
This is the sample code you can check for yourself also:
#include <cstdio>
#include <iostream>
#include <string>
#include <QString>
#include <QApplication>
#include <QByteArray>
using std::cout;
using std::string;
using std::endl;
int main(int argc,char* argv[])
{
QApplication app(argc,argv);
int x = 0x05;
const char mydata[] = {
0x00, 0x00, 0x03, 0x84, 0x78, 0x9c, 0x3b, 0x76,
0xec, 0x18, 0xc3, 0x31, 0x0a, 0xf1, 0xcc, 0x99};
QByteArray data = QByteArray::fromRawData(mydata, sizeof(mydata));
printf("Hello %s\n",data.data());
string str("Hello ");
unsigned char ch[] = {22, 5, 6, 7, 4};
QString s((const char*)ch); // no terminating 0 in ch, so this constructor reads past the end of the array
qDebug("Hello %s",qPrintable(s));
cout << str << x ;
cout << "\nHello I am \0x05";
cout << "\nHello I am " << "0x05";
return app.exec();
}
QByteArray text = QByteArray::fromHex("517420697320677265617421");
text.data(); // returns "Qt is great!"
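The reverse direction works the same way with toHex(); a small self-contained check:
#include <QByteArray>
#include <QDebug>

int main()
{
    QByteArray text = QByteArray::fromHex("517420697320677265617421");
    qDebug() << text;         // "Qt is great!"
    qDebug() << text.toHex(); // "517420697320677265617421"
    return 0;
}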
If your 0x05 is converted to the char '\x05', then you don't have hexadecimal values (those only make sense for numbers represented as strings anyway), but binary ones. In C and C++, a char is basically just another integer type with very little added magic. So if you have a 5 and assign it to a char, what you get is whatever character your system's encoding defines as the fifth character. (In ASCII, that would be the ENQ char, whatever that means nowadays.)
If what you want instead is the char '5', then you need to convert the binary value into its string representation. In C++, this is usually done using streams:
const char ch = 5; // the character '\x05', not the digit '5'
std::ostringstream oss; // needs <sstream>
oss << static_cast<int>(ch); // cast so the value is streamed as a number
const std::string str = oss.str(); // str now contains "5"
Of course, the C std library also provides functions for this conversion. If streaming is too slow for you, you might try those.
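For instance, a minimal sketch of the C route with snprintf:
#include <cstdio>

int main()
{
    const char ch = 5;
    char buf[8];
    std::snprintf(buf, sizeof buf, "%d", ch); // buf now contains "5"
    std::printf("%s\n", buf);
}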
I think C++ string classes are usually designed around zero-terminated char sequences. If your data is of known length (as it appears to be), then you could use a std::vector. This provides some of the functionality of a string class whilst ignoring nulls within the data.
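A minimal sketch of that approach, assuming the buffer and byte count come from the serial read:
#include <cstddef>
#include <vector>

int main()
{
    unsigned char Buffer[] = { 'A', 'T', 0x05, 0x0d, 0x0a }; // a sample frame with a control byte
    std::size_t bytesRead = sizeof(Buffer);                  // would come from the serial port read

    // The vector keeps every byte, including 0x00 and other control characters,
    // and frame.size() is the true frame length.
    std::vector<unsigned char> frame(Buffer, Buffer + bytesRead);
    return 0;
}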
As I see it, you want to eliminate ASCII control characters. You could do it in the following way:
#include <algorithm>
#include <iostream>
#include <iterator>
#include <string>
#include <QtCore/QString>
#include <QtCore/QByteArray>
using namespace std;
// test data from your comment
char data[] = { 0x49, 0x46, 0x50, 0x4a, 0x4b, 0x51, 0x52, 0x43, 0x2c, 0x31,
0x32, 0x33, 0x2e, 0x34, 0x2c, 0x54, 0x2c, 0x41, 0x2c, 0x2b,
0x33, 0x30, 0x2e, 0x30, 0x30, 0x2c, 0x41, 0x2c, 0x2d, 0x33,
0x30, 0x2e, 0x30, 0x30, 0x2c, 0x41, 0x2a, 0x05, 0x0d, 0x0a };
// functor to remove control characters
struct toAscii
{
// values >127 will be eliminated too
char operator ()( char value ) const
{
    if ( value < 32 && value != 0x0d && value != 0x0a )
        return '.';
    else
        return value;
}
};
int main(int argc,char* argv[])
{
string s;
transform( data, data + sizeof(data), back_inserter(s), toAscii() );
cout << s; // STL string
// convert to QString ( if necessary )
QString str = QString::fromStdString( s );
QByteArray d = str.toAscii();
cout << d.data(); // QString
return 0;
}
The code above prints the following in console:
IFPJKQRC,123.4,T,A,+30.00,A,-30.00,A*.
If you have continuous stream of data you'll get something like:
IFPJKQRC,123.4,T,A,+30.00,A,-30.00,A*.
IFPJKQRC,123.4,T,A,+30.00,A,-30.00,A*.
IFPJKQRC,123.4,T,A,+30.00,A,-30.00,A*.
IFPJKQRC,123.4,T,A,+30.00,A,-30.00,A*.
IFPJKQRC,123.4,T,A,+30.00,A,-30.00,A*.