So I need a little help. I've currently got a text file with the following data in it:
myfile.txt
-----------
b801000000
What I want to do is read that "b801..." data as bytes, so that I end up with the values
0xb8 0x01 0x00 0x00 0x00.
Currently I'm reading that line into an unsigned string using the following typedef.
typedef std::basic_string <unsigned char> ustring;
ustring blah = reinterpret_cast<const unsigned char*>(buffer[1].c_str());
Where I keep falling down is trying to turn each pair of chars { 'b', '8', ... } into the actual byte values { 0xb8, 0x01, ... }.
Any help is appreciated.
Thanks.
I see two ways:
Open the file as std::ios::binary and use std::ifstream::operator>> to extract values after setting the std::ios_base::hex flag, extracting into a type large enough to hold each value (like stdint.h's (C++0x/C99) uint16_t or equivalent). See #neuro's comment to your question for an example using std::stringstream; std::ifstream works nearly identically.
Access the stream directly and perform the conversion manually. This is harder and more error-prone, and not necessarily faster either, but still quite possible; see the sketch below.
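For the manual route, here is a minimal sketch, assuming the file holds a single line of hex digits like the myfile.txt from the question (std::stoul is C++0x/C++11):

#include <fstream>
#include <iostream>
#include <string>
#include <vector>

int main()
{
    std::ifstream in("myfile.txt");   // file name taken from the question
    std::string line;
    std::vector<unsigned char> bytes;

    if (std::getline(in, line)) {
        // walk the line two hex digits at a time
        for (std::size_t i = 0; i + 1 < line.size(); i += 2) {
            unsigned long v = std::stoul(line.substr(i, 2), nullptr, 16);
            bytes.push_back(static_cast<unsigned char>(v));
        }
    }

    for (unsigned char b : bytes)
        std::cout << "0x" << std::hex << static_cast<int>(b) << ' ';
    std::cout << '\n';
    return 0;
}

For the sample input this prints 0xb8 0x1 0x0 0x0 0x0.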
strtol converts a string (it needs a null-terminated C string) to a long with a specified base.
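For example, a one-group sketch:

#include <cstdio>
#include <cstdlib>

int main()
{
    const char *hex = "b8";                     // one two-digit group
    long value = std::strtol(hex, nullptr, 16); // base 16 -> 184
    std::printf("0x%s = %ld\n", hex, value);
    return 0;
}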
Kind of a dirty way to do it:
#include <stdio.h>

int main ()
{
    unsigned int num;                  /* %x needs an unsigned int */
    const char *value = "b801000000";  /* string literals are const */

    while (*value) {
        sscanf (value, "%2x", &num);   /* parse two hex digits at a time */
        printf ("New number: %u\n", num);
        value += 2;
    }
    return 0;
}
Running this, I get:
New number: 184
New number: 1
New number: 0
New number: 0
New number: 0
How to send data in hex on SerialPort?
I used this function; I receive the "Yes, I can write to port!" message, but I do not receive the data I entered:
QByteArray send_data;

if(serialPort->isWritable())
{
    qDebug()<<"Yes, I can write to port!";
    int size = sizeof(send_data);
    serialPort->write(send_data,size);
}

send_data += static_cast<char>(0xAA);
serialPort->write(send_data);
Data are transmitted in binary (essentially a sequence of 0s and 1s), no matter what. Showing data in hexadecimal rather than as a string of characters is just a display choice.
In the following example, you can see that the array string_c is initialized with the same string that you are using in your code. Next, I print the data both as hex and as a string. You can see that the only difference is in the way I decided to print the data; the source data is the same for both.
#include <stdio.h>
#include <stdint.h>

void printCharInHexadecimal(const char* str, int len)
{
    for (int i = 0; i < len; ++i) {
        uint8_t val = str[i];
        char tbl[] = "0123456789ABCDEF";
        printf("0x");
        printf("%c", tbl[val / 16]);   // high nibble
        printf("%c", tbl[val % 16]);   // low nibble
        printf(" ");
    }
    printf("\n");
}

int main()
{
    char string_c[] = "Yes, i can write to port";

    // string printed in hex
    printCharInHexadecimal(string_c, 24);

    // same string printed as "text"
    printf("%s\n", string_c);

    return 0;
}
You can see the above code running here: https://onlinegdb.com/Y7fwaMTDoq
Note: I got the function printCharInHexadecimal from here: https://helloacm.com/the-c-function-to-print-a-char-array-string-in-hexadecimal/
As suspected, your use of sizeof is wrong. It does not return the size of the contained data; it returns a non-zero constant that is the size of a QByteArray object itself. Since that object was freshly constructed it is empty, and any size other than zero in the first write will lead to undefined behavior. Use:
int size = (int)send_data.size();
Skip the first write entirely, and use the above for your second write.
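Putting it together, a minimal sketch of the corrected sequence (assuming serialPort points to an open QSerialPort; the sendByte wrapper is hypothetical):

#include <QSerialPort>
#include <QByteArray>
#include <QDebug>

void sendByte(QSerialPort *serialPort)
{
    QByteArray send_data;
    send_data += static_cast<char>(0xAA);   // fill the buffer first

    if (serialPort->isWritable()) {
        qDebug() << "Yes, I can write to port!";
        // QByteArray carries its own length, so no sizeof is needed
        serialPort->write(send_data);
    }
}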
You need to be clear about what you expect. 0xAA in your source code is simply an integer value written in hex notation. It compiles to exactly the same code regardless of the source-code representation: 0xAA == 170 == 0252.
If you actually intended to output a string of characters at run time representing a value in hexadecimal, you need to convert that value from an integer to a string. For example;
char hexbyte[3] ;
sprintf( hexbyte, "%02X", 170 ) ;
serialPort->write( hexbyte, 2 ) ;
will output the ASCII characters "AA", whilst demonstrating the equivalence of 170 and 0xAA. That is, the hex notation in the source does not affect the value or how it is stored or represented in the compiled machine code.
I'm writing a Huffman encoding program in C++, and am using this website as a reference:
http://algs4.cs.princeton.edu/55compression/Huffman.java.html
I'm now at the writeTrie method, and here is my version:
// write bitstring-encoded tree to standard output
void writeTree(struct node *tempnode){
    if(isLeaf(*tempnode)){
        tempfile << "1";
        fprintf(stderr, "writing 1 to file\n");
        tempfile << tempnode->ch;
        //tempfile.write(&tempnode->ch,1);
        return;
    }
    else{
        tempfile << "0";
        fprintf(stderr, "writing 0 to file\n");
        writeTree(tempnode->left);
        writeTree(tempnode->right);
    }
}
Look at the commented-out line: let's say I'm writing to a text file, but I want to write the bytes that make up the char at tempnode->ch (which is an unsigned char, by the way). Any suggestions for how to go about doing this? The commented-out line gives an invalid conversion error from unsigned char* to const char*.
Thanks in advance!
EDIT: To clarify: for instance, I'd like my final text file to be in binary -- 1's and 0's only. If you look at the header of the link I provided, they give an example of "ABRACADABRA!" and the resulting compression. I'd like to take the char (such as 'A' in the example above), use its numeric value (A = 65), and write 65 in binary, as a byte.
A char is a single byte, so the preceding line tempfile << tempnode->ch; already does exactly what you seem to want.
There is no overload of write for unsigned char, but if you want, you can do
tempfile.write(reinterpret_cast< char * >( &tempnode->ch ),1);
This is rather ugly, but it does exactly the same thing as tempfile << tempnode->ch.
EDIT: Oh, you want to write a sequence of 1 and 0 characters for the bits in the byte. C++ has an obscure trick for that:
#include <bitset>
tempfile << std::bitset< 8 >( tempnode->ch );
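For example, using the 'A' = 65 case from your edit (a self-contained sketch that writes to std::cout instead of tempfile):

#include <bitset>
#include <iostream>

int main()
{
    unsigned char ch = 'A';                  // 65 decimal
    std::cout << std::bitset<8>(ch) << '\n'; // prints 01000001
}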
I am relatively new to C++ programming and I have hit one of my first major snags.
I am trying to figure out how to read a value/character from a generic ".txt" file created in Notepad. With that comparison I want to determine whether or not to read that entire line, but I can't seem to read just a single one- or two-digit number. I got it to read the whole line using 'buffername'.getline(variable, size), but when I try to change the 'size' to a specific number it gives me a comparison error saying that it's invalid to switch to 'int' or 'char' (depending on how I declare the variable).
Any help is appreciated.
Thanks
#include <fstream>
using namespace std;

int main()
{
    int length = 2;
    char * buffer;

    ifstream is;
    is.open ("test.txt", ios::binary );

    // allocate memory:
    buffer = new char [length];

    // read 2 chars
    is.read (buffer,length);

    // compare the characters and decide what to do with the line
    delete[] buffer;
    return 0;
}
You'll want to use an ifstream to get the value.
Something like the following should work. Here I use a word of type std::string, but you can replace that with other types to read them (i.e. int, double, etc.).
std::ifstream f("somefile.txt");
std::string word;
std::string line;

if(f >> word){
    if(<the comparison>){
        std::getline(f, line);  // read the rest of that line
    }
}
Here's an extended sketch of how you might use the ifstream (the file name, comparison value, and output are placeholders):
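#include <fstream>
#include <iostream>
#include <string>

int main()
{
    std::ifstream f("somefile.txt");   // placeholder file name
    std::string word;

    while (f >> word) {
        if (word == "42") {            // placeholder comparison
            std::string rest;
            std::getline(f, rest);     // grab the remainder of that line
            std::cout << "matched line:" << word << rest << '\n';
        }
    }
    return 0;
}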
First of all, for performance reasons it is a bad idea to read 1 byte at a time.
I suggest this alternative:
You would be better off reading in the whole line and then using a character array, as in the fragments below.
char variable[1000];
Read your line in from the file into variable, then test individual characters:
if (variable[1]=='c') { printf("Byte 2 (remember the 0 offset) is the letter c\n"); }
Getting a two-digit number:
number = ((variable[3]-48)*10) + (variable[4]-48);
You have to subtract 48 because in ASCII the character '0' is 48.
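A self-contained version of the fragments above, assuming a hypothetical test.txt whose first line looks like "ac 42" (letter at offset 1, digits at offsets 3 and 4):

#include <cstdio>

int main()
{
    char variable[1000];

    FILE *f = std::fopen("test.txt", "r");
    if (!f || !std::fgets(variable, sizeof variable, f))
        return 1;

    if (variable[1] == 'c')
        std::printf("Byte 2 (remember the 0 offset) is the letter c\n");

    // two-digit number at offsets 3 and 4; '0' is 48 in ASCII
    int number = (variable[3] - '0') * 10 + (variable[4] - '0');
    std::printf("number = %d\n", number);

    std::fclose(f);
    return 0;
}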
I'm a beginning user in C++ and I want to know how to do this:
How can I 'create' a byte from a string/int? So, for example, I have:
string some_byte = "202";
When I save that byte to a file, I want the file to be 1 byte instead of 3 bytes.
How is that possible?
Thanks in advance,
Tim
I would use C++'s string stream classes from <sstream> to convert the string to an unsigned char, and then write the unsigned char to a binary file.
So, something like this (a sketch, not real code):
#include <sstream>
#include <string>

std::string some_byte = "202";
std::istringstream str(some_byte);

int val;
if( !(str >> val) )
{
    // bad conversion
}
if( val > 255 )
{
    // too big
}

unsigned char ch = static_cast<unsigned char>(val);
printByteToFile(ch); // print the byte to file
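printByteToFile is a placeholder above; a minimal sketch of it using a binary std::ofstream might look like this (the file name is an assumption):

#include <fstream>

// hypothetical helper: append one raw byte to a binary file
void printByteToFile(unsigned char ch)
{
    std::ofstream out("byte.bin", std::ios::binary | std::ios::app);
    out.write(reinterpret_cast<const char*>(&ch), 1);
}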
The simple answer is...
int value = atoi( some_byte.c_str() ) ;
There are a few other questions though.
1) What size is an int and is it important? (for almost all systems it's going to be more than a byte)
int size = sizeof(int) ;
2) Is the Endianness important? (if it is look in to the htons() / ntohs() functions)
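For example, a quick way to see htons() at work (POSIX header assumed; on a little-endian machine the two bytes swap):

#include <arpa/inet.h>   // htons on POSIX systems
#include <cstdio>

int main()
{
    unsigned short host = 0x01B8;        // host byte order
    unsigned short net  = htons(host);   // network (big-endian) order
    std::printf("host: 0x%04X  net: 0x%04X\n", host, net);
    return 0;
}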
In C++, casting to/from strings is best done using string streams:
#include <sstream>
// ...
std::istringstream iss(some_string);
unsigned int ui;
iss >> ui;
if(!iss) throw some_exception('"' + some_string + "\" isn't an integer!");
unsigned char byte = ui;
To write to a file, you use file streams. However, streams usually write/read their data as strings, so you will have to open the file in binary mode and write binary, too:
#include <fstream>
// ...
std::ofstream ofs("test.bin", std::ios::binary);
ofs.write( reinterpret_cast<const char*>(&byte), sizeof(byte) );
Use boost::lexical_cast
#include "boost/lexical_cast.hpp"
#include <iostream>
int main(int, char**)
{
int a = boost::lexical_cast<int>("42");
if(a < 256 && a > 0)
unsigned char c = static_cast<unsigned char>(a);
}
You'll find the documentation at http://www.boost.org/doc/libs/1_43_0/libs/conversion/lexical_cast.htm
However, if the goal is to save space in a file, I don't think this is the right way to go. How will your program behave if you want to convert "257" into a byte? Just go for the simplest solution; you can work out any space concerns later if they turn out to be relevant (rule of thumb: always use "int" for integers and not other types unless there is a very specific reason other than premature optimization).
EDIT
As the comments point out, this only works for integers; using lexical_cast with a byte type directly won't work (it will throw an exception).
So what will happen if you try to parse "267"?
IMHO, it should go through an int, then do some bounds tests, and only then cast to a char. Going through atoi, for example, will be extremely bug-prone.
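A sketch of that checked path (toByte is a hypothetical helper name):

#include "boost/lexical_cast.hpp"
#include <stdexcept>
#include <string>

// hypothetical helper: parse, bounds-check, then narrow
unsigned char toByte(const std::string &s)
{
    int value = boost::lexical_cast<int>(s);   // throws bad_lexical_cast on garbage
    if (value < 0 || value > 255)
        throw std::out_of_range("not a byte: " + s);
    return static_cast<unsigned char>(value);
}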
Friends
I want to integrate the following code into the main application code. The junk characters that come back in the output string crash the application.
The following code snippet doesn't work:
#include <cstring>   // for strlen

void stringCheck(char*);

int main()
{
    char some_str[] = "Common Application FE LBS Serverr is down";
    stringCheck(some_str);
}

void stringCheck(char * newString)
{
    for(int i = 0; i < strlen(newString); i++)
    {
        if ((int)newString[i] > 128)
        {
            // TRACE is the application's logging macro
            TRACE(" JUNK Characters in Application Error message FROM DCE IS = "<<(char)newString[i]<<"++++++"<<(int)newString[i]);
        }
    }
}
Can someone please show me better approaches for finding junk characters in a string?
Many Thanks
Your char is probably signed. Cast it to unsigned char instead, to avoid it becoming a negative integer when cast to int:
if ((unsigned char)newString[i] >128)
Depending on your needs, isprint might do a better job, checking for a printable character, including space:
if (!isprint((unsigned char)newString[i]))
...
Note that you have to cast to unsigned char: isprint requires values between 0 and UCHAR_MAX (or EOF) as input.
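For example, a minimal scan over a string with one embedded junk byte (the \x90 is just a stand-in value):

#include <cctype>
#include <cstdio>

int main()
{
    const char *s = "abc\x90xyz";   // \x90 stands in for a junk byte

    for (int i = 0; s[i] != '\0'; ++i) {
        if (!std::isprint((unsigned char)s[i]))
            std::printf("junk character at index %d (value %d)\n",
                        i, (unsigned char)s[i]);
    }
    return 0;
}

This prints: junk character at index 3 (value 144).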