Is there a simple way to convert a binary bitset to hexadecimal? The function will be used in a CRC class and will only be used for standard output.
I've thought about using to_ulong() to convert the bitset to an integer, then converting the integers 10-15 to A-F with a switch case. However, I'm looking for something a little simpler.
I found this code on the internet:
#include <iostream>
#include <string>
#include <bitset>
using namespace std;

int main() {
    string binary_str("11001111");
    bitset<8> set(binary_str);
    cout << hex << set.to_ulong() << endl;
}
It works great, but I need to store the output in a variable then return it to the function call rather than send it directly to standard out.
I've tried to alter the code but keep running into errors. Is there a way to change the code to store the hex value in a variable? Or, if there's a better way to do this please let me know.
Thank you.
You can send the output to a std::stringstream, and then return the resultant string to the caller:
stringstream res;
res << hex << uppercase << set.to_ulong();
return res.str();
This would produce a result of type std::string.
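For instance, a minimal self-contained sketch of that idea (the helper name to_hex_string is just an illustration, not anything from the standard library):

#include <bitset>
#include <cstddef>
#include <sstream>
#include <string>

// Render a bitset as an uppercase hex string.
// Note: to_ulong() throws std::overflow_error if the bitset
// doesn't fit in an unsigned long.
template <std::size_t N>
std::string to_hex_string(const std::bitset<N>& bits)
{
    std::stringstream res;
    res << std::hex << std::uppercase << bits.to_ulong();
    return res.str();
}

Called as to_hex_string(std::bitset<8>("11001111")), this returns "CF".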
Here is an alternative for C:
unsigned int bintohex(char *digits){
    unsigned int res = 0;
    while (*digits)
        res = (res << 1) | (*digits++ - '0');
    return res;
}
//...
unsigned int myint = bintohex("11001111"); // store value as an int
printf("%X\n", bintohex("11001111"));      // prints hex-formatted output to stdout
// just use sprintf or snprintf similarly to store the hex string
Here is the easy alternative for C++:
bitset<32> data;
/* Perform operation on data */
cout << "data = " << hex << data.to_ulong() << endl;
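A complete program around that fragment might look like this (0xCAFE is just sample data standing in for the result of your operations):

#include <bitset>
#include <iostream>
using namespace std;

int main()
{
    bitset<32> data(0xCAFE); // sample data; real operations would go here
    cout << "data = " << hex << data.to_ulong() << endl; // prints "data = cafe"
}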
In C++, there is some behavior I don't understand, and I need your help.
In this topic, answers say to use to_string, but they describe to_string as converting a bitset to a string, and the cppreference docs do too.
So I wonder how to convert some data (a char or a string; ASCII for now, though can Unicode be converted as well?) into its individual bits so that they can be processed.
The question was "How to convert char to bits?", and the answers said "use to_string in bitset", but what I want is to get each bit of my input.
Can I split values of many types into their bits and process them? If I can, how?
#include <iostream>
#include <bitset>
#include <string>
using namespace std;

int main() {
    char letter;
    cout << "letter: " << endl;
    cin >> letter;
    cout << bitset<8>(letter).to_string() << endl;
    bitset<8> letterbit(letter);
    int lettertest[8];
    for (int i = 0; i < 8; ++i) {
        lettertest[i] = letterbit.test(i);
    }
    cout << "letter bit: ";
    for (int i = 0; i < 8; ++i) {
        cout << lettertest[i];
    }
    cout << endl;
    int test = letterbit.test(0);
}
When executing this code, I get the result I want, but I don't understand to_string.
The important point is the use of to_string: it is a function that converts a bitset to a string (the name says as much).
So is there a function that converts a string to a bitset?
Actually, in my code, I use bitset with a letter, which converts in the other direction (char to bitset), and that is the result I want.
Help me understand this behavior.
Q: What is a bitset?
https://www.cplusplus.com/reference/bitset/bitset/
A bitset stores bits (elements with only two possible values: 0 or 1,
true or false, ...).
The class emulates an array of bool elements, but optimized for space
allocation: generally, each element occupies only one bit (which, on
most systems, is eight times less than the smallest elemental type:
char).
In other words, a "bitset" is a binary object (like an "int", a "char", a "double", etc.).
Q: What is bitset<>.to_string()?
Bitsets have the feature of being able to be constructed from and
converted to both integer values and binary strings (see its
constructor and members to_ulong and to_string). They can also be
directly inserted and extracted from streams in binary format (see
applicable operators).
In other words, to_string() allows you to convert the binary bitset to text.
Q: How do I convert a char (or string, or other type) -> bits?
A: Per the above, simply construct a bitset from the value (e.g. bitset<8>(letter)), then use to_string() to see its bits.
Here is an example:
https://en.cppreference.com/w/cpp/utility/bitset/to_string
Code:
#include <iostream>
#include <bitset>
int main()
{
    std::bitset<8> b(42);
    std::cout << b.to_string() << '\n'
              << b.to_string('*') << '\n'
              << b.to_string('O', 'X') << '\n';
}
Output:
00101010
**1*1*1*
OOXOXOXO
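As for the other direction (string -> bitset), which the question also asks about, no separate function is needed: the bitset constructor itself accepts a string of '0' and '1' characters. A minimal sketch:

#include <bitset>
#include <iostream>
#include <string>

int main()
{
    std::string s = "00101010";
    std::bitset<8> b(s); // string -> bitset via the constructor
                         // (throws std::invalid_argument for characters other than '0'/'1')
    std::cout << b.to_ulong() << '\n'; // prints 42
}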
I have searched online, but there doesn't seem to be a solution to my problem. Basically, I have a std::string which contains a hexadecimal memory address (like 0x10FD7F04, for example). This number is read from a text file and saved as a std::string.
I need to convert this string to an int value, but keeping the hex notation, 0x. Is there any way to do this?
You can use C++11 std::stoi function:
#include <iostream>
#include <iomanip>
#include <string>
int main()
{
    std::string your_string_rep{"0x10FD7F04"};
    int int_rep = std::stoi(your_string_rep, nullptr, 16);
    std::cout << int_rep << '\n';
    std::cout << std::hex << std::showbase << int_rep << '\n';
}
Outputs:
285048580
0x10fd7f04
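(One caveat: std::stoi throws std::out_of_range if the address exceeds INT_MAX, e.g. 0xFFFFFFFF; in that case std::stoul or std::stoull can be used the same way.)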
I need to convert this string to an int value, but keeping the hex notation, 0x. Is there any way to do this?
There are two parts to your question:
1. Convert the string hexadecimal representation to an integer:
std::string your_string_rep{ "0x10FD7F04" };
std::istringstream buffer{ your_string_rep };
int value = 0;
buffer >> std::hex >> value;
2. Keeping the hex notation on the resulting value. This is neither necessary nor possible: an int simply stores a number; hexadecimal, decimal, and binary are just textual representations of that same value.
In other words, with the code above, you can just write:
assert(value == 0x10FD7F04); // will evaluate to true (assertion passes)
Alternatively you can use something like this:
std::string hexstring("0x10FD7F04");
unsigned int num;
sscanf(hexstring.data(), "%x", &num); // %x requires a pointer to unsigned int
I have a BitSet of 8 bits.
How would I convert those 8 bits to a byte and then write it to a file?
I have looked everywhere and only found examples converting the other way.
Thanks a lot!
Assuming that you are talking about C++ STL bitsets, the answer is to convert the bitset to an int (an unsigned long, to be precise), then cast the result to a char.
Example:
#include <bitset>
#include <iostream>
using namespace std;
int main()
{
    bitset<8> x;
    char byte;
    cout << "Enter an 8-bit bitset in binary: " << flush;
    cin >> x;
    cout << "x = " << x << endl;
    byte = (char) x.to_ulong();
    cout << "As byte: " << (int) byte << endl;
}
http://www.cplusplus.com/reference/stl/bitset/
They can also be directly inserted and extracted from streams in binary format.
You don't need to convert anything, you just write them to the output stream.
Aside from that, if you really wanted to extract them into something you're used to, to_ulong and to_string methods are provided.
If you have more bits in the set than an unsigned long can hold and don't want to write them out directly to the stream, then you're either going to have convert to a string and go that route, or access each bit using the [] operator and shift them into bytes that you're writing out.
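A sketch of that last approach, packing the bits into bytes by hand (the 16-bit size and MSB-first packing order here are arbitrary illustration choices):

#include <bitset>
#include <cstddef>
#include <cstdint>
#include <iostream>

int main()
{
    std::bitset<16> bits(std::string("0110100101110101")); // sample data
    for (std::size_t byte = 0; byte < bits.size() / 8; ++byte) {
        std::uint8_t out = 0;
        for (std::size_t bit = 0; bit < 8; ++bit)
            out = (out << 1) | bits[bits.size() - 1 - (byte * 8 + bit)];
        std::cout << static_cast<int>(out) << '\n'; // 105, then 117; write these to the file instead
    }
}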
You could use std::ofstream from <fstream>:
#include <bitset>
#include <fstream>

std::ofstream os("myfile.txt", std::ofstream::binary);
// Cast to char so a single byte is written (uint_fast8_t is not
// guaranteed to be a character type).
os << static_cast<char>(std::bitset<8>("01101001").to_ulong());
os.close();
I have a long long holding ASCII hex values and want to convert it to a string. I have this code:
char myBuffer[8];
long long myLongLong = 0x7177657274797569;
sprintf(myBuffer,"%c%c%c%c%c%c%c%c",myLongLong);
int x;
cout << myBuffer;
cin >> x;
return 0;
The output should be "qwertyui", but it always gives some other value.
I tried %c, %s, and %X, but none gives the output I need; the closest was %c, but it prints only one char.
That code is wrong in so many ways I don't know where to start...
myBuffer is too small to hold the 8 chars plus the NUL terminator, i.e. it should be myBuffer[9].
sprintf is expecting 8 arguments but you're only passing 1; the other required arguments will be whatever happens to be on the stack.
myLongLong is not a char
You don't take into account endianness.
You're using C functions and doing things in a C way in C++. Why don't you use std::strings as opposed to C-style strings and stringstreams as an alternative to sprintf?
The closest almost working example of what you want, as similar to your example, is something like:
#include <cstdio>
#include <iostream>
using namespace std;
int main(void)
{
    char myBuffer[9];
    long long myLongLong = 0x7177657274797569;
    char *c_ptr = (char*)&myLongLong;
    sprintf(myBuffer, "%c%c%c%c%c%c%c%c",
            c_ptr[0], c_ptr[1], c_ptr[2], c_ptr[3],
            c_ptr[4], c_ptr[5], c_ptr[6], c_ptr[7]);
    int x;
    cout << myBuffer;
    cin >> x;
    return 0;
}
Which will output "iuytrewq" on my little-endian machine. As I mentioned, that doesn't take into account the endianness. If the machine is little-endian then you could read/print the bytes in reverse.
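For example, this variation on the code above reverses the byte order; it is a sketch that assumes a little-endian machine and is not portable to big-endian systems:

#include <cstdio>

int main()
{
    char myBuffer[9];
    long long myLongLong = 0x7177657274797569;
    char *c_ptr = (char*)&myLongLong;
    // Walk the bytes backwards: on little-endian, c_ptr[7] holds 0x71 ('q').
    std::sprintf(myBuffer, "%c%c%c%c%c%c%c%c",
                 c_ptr[7], c_ptr[6], c_ptr[5], c_ptr[4],
                 c_ptr[3], c_ptr[2], c_ptr[1], c_ptr[0]);
    std::puts(myBuffer); // prints "qwertyui"
}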
I really don't understand why you're trying to do this though...
You could try:
union { char buf[9]; long long num; } u;
u.num = 0x7177657274797569LL;
u.buf[8] = '\0'; // NUL-terminate so the buffer can be printed as a C string
cout << u.buf << endl;
But I don't understand what you really want to do. And what about endianness?
Use a string stream:
long long myLongLong = 0x7177657274797569;
std::stringstream ss;
ss << std::hex << myLongLong;
std::cout << ss.str() << std::endl;
You want to print each byte of the long long as an ASCII char?
Then you need to loop over the long long, extracting one byte at a time; look at bit shifts and masking.
Hint: it's generally easier (if you know the length) to start from the last byte and shift right.
Or you could just memcpy the long long into the char array, except for any byte-ordering issues.
Try the following code.
#include <iostream>
using namespace std;

int main(void)
{
    char myBuffer[9]; // 8 chars plus the NUL terminator
    long long myLongLong = 0x7177657274797569;
    for (int i = 0; i < 8; i++)
    {
        myBuffer[i] = myLongLong >> (64 - (i + 1) * 8); // extract bytes MSB-first
    }
    myBuffer[8] = '\0';
    cout << myBuffer << endl;
    return 0;
}
I've got a homework assignment in my C++ programming class to write a function that outputs the binary value of a variable's value.
So for example, if I set a value of "a" to a char I should get the binary value of "a" output.
My C++ professor isn't the greatest in the whole world and I'm having trouble getting my code to work using the cryptic examples he gave us. Right now, my code just outputs a binary value of 11111111 no matter what I set it to (unless it's NULL, then I get 00000000).
Here is my code:
#include <cstdlib> // for system()
#include <iostream>
#define useavalue 1
using namespace std;

void GiveMeTehBinary(char bin);

int main() {
#ifdef useavalue
    char b = 'a';
#else
    char b = '\0';
#endif
    GiveMeTehBinary(b);
    system("pause");
    return 0;
}

void GiveMeTehBinary(char bin) {
    long s;
    for (int i = 0; i < 8; i++) {
        s = bin >> i;
        cout << s % 2; // note: this prints the bits least-significant first
    }
    cout << endl << endl;
}
Thanks a ton in advance guys. You're always extremely helpful :)
Edit: Fixed now - thanks a bunch :D The problem was that I was not storing the value from the bit shift. I've updated the code to its working state above.
The compiler should warn you about certain statements in your code that have no effect [1]. Consider
bin >> i;
This does nothing, since you don’t store the result of this operation anywhere.
Also, why did you declare tehbinary as an array? All you ever use is one element (the current one). It would be enough to store just the current bit.
Some other things:
NULL must only be used with pointer values. Your usage works but it’s not the intended usage. What you really want is a null character, i.e. '\0'.
Please use real, descriptive names. I vividly remember myself using variables called tehdataz etc., but this really makes the code hard to read, and once the initial humor wears off it's annoying both for you when you try to read your code and for whoever is grading it.
Formatting the code properly helps understanding a lot: make the indentation logical and consistent.
1 If you’re using g++, always pass the compiler flags -Wall -Wextra to get useful diagnostics about your code.
Try this:
#include <bitset>
#include <iostream>
int main()
{
    std::bitset<8> x('a');
    std::cout << x << std::endl;
}
It's actually really simple. To convert from decimal to binary you will need to include <bitset> in your program. It gives you a class template that produces the binary form for you, like this:
std::cout << std::bitset<8>(0b01000101) << std::endl;
The number 8 in the template argument is the length of the output string. The constructor argument is the value you want to convert. By the way, you can write a value in binary form by putting 0b in front of the number. Note that binary literals were only added in C++14, so any version lower than that won't accept them. Here is the full code if you want to test it out:
#include <iostream>
#include <bitset>
int main()
{
    std::cout << std::bitset<8>(0b01000101) << std::endl;
}
Note that you don't have to pass a binary literal to do this:
#include <iostream>
#include <bitset>
int main()
{
    std::cout << std::bitset<8>(34) << std::endl;
}
Output:
00100010
Why not just check each bit in the unsigned char variable?
unsigned char b = 0x80 | 0x20 | 0x01; // some test data
int bitbreakout[8] = {0};             // zero-initialize so unset bits print as 0
if (b & 0x80) bitbreakout[7] = 1;
// repeat above for 0x40, 0x20, etc.
for (int i = 7; i >= 0; --i)
    cout << bitbreakout[i];
There are a TON of ways to optimize this, but this should give you an idea of what to do.
#include <iostream>
#include <limits>
using namespace std;

int main() {
    int x = 255;
    for (int i = numeric_limits<int>::digits; i >= 0; i--) {
        cout << ((x >> i) & 1); // avoids shifting 1 into the sign bit
    }
}
It's actually really simple. If you know how to convert decimal to binary, then you can code it easily in C++. In fact, I have gone ahead and created a header file that lets you convert not only from decimal to binary, but from decimal to any number system. Here's the code:
#pragma once
#include <cstdint>
#include <string>

// Map a digit value 0-35 to its character: 0-9 -> '0'-'9', 10-35 -> 'A'-'Z'.
char valToChar(const uint32_t val)
{
    if (val <= 9)
        return '0' + val;
    if (val <= 35)
        return 'A' + (val - 10);
    return '?'; // out of range for a single digit
}

std::string baseConverter(uint32_t num, const uint32_t &base)
{
    if (num == 0)
        return "0"; // otherwise the loop below would produce an empty string
    std::string result;
    while (num != 0)
    {
        result = valToChar(num % base) + result; // prepend the next digit
        num /= base;
    }
    return result;
}
Now, here is how you can use it:
int main()
{
    std::cout << baseConverter(2021, 2) << "\n";
}
Output:
11111100101
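And since the base is a parameter, the same function handles the hexadecimal conversions discussed earlier in this thread: baseConverter(2021, 16), for instance, returns "7E5".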