Converting string of 1s and 0s into binary value - c++

I'm trying to convert an incoming string of 1s and 0s from stdin into their respective binary values (where a string such as "11110111" would be converted to 0xF7). This seems pretty trivial, but I don't want to reinvent the wheel, so I'm wondering if there's anything in the C/C++ standard libraries that can already perform such an operation?

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    char *ptr;
    long parsed = strtol("11110111", &ptr, 2);
    printf("%lX\n", parsed);
    return EXIT_SUCCESS;
}
For larger numbers, there is a long long version, strtoll.
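For illustration, a minimal sketch along the same lines (the 40-bit input string is just my own example):
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    char *end;
    /* 40 bits, too wide for a 32-bit long on some platforms */
    long long parsed = strtoll("1111011100001111011100001111011100001111", &end, 2);
    printf("%llX\n", (unsigned long long)parsed);
    return EXIT_SUCCESS;
}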

You can use std::bitset (if the length of your bits is known at compile time).
With some extra code you could also break a longer string into chunks and combine them (see the sketch after the example below).
#include <bitset>
#include <iostream>
#include <string>

int main()
{
    std::bitset<5> x(std::string("01011"));
    std::cout << x << ":" << x.to_ulong() << std::endl;
}
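A minimal sketch of that chunking idea, assuming the whole string fits in 64 bits (the parseBits name and the 32-bit chunk size are illustrative choices, not from the original answer):
#include <bitset>
#include <cstdint>
#include <string>

// Combine fixed-size chunks, most significant chunk first.
std::uint64_t parseBits(const std::string& s) {
    std::uint64_t result = 0;
    for (std::size_t pos = 0; pos < s.size(); pos += 32) {
        std::string chunk = s.substr(pos, 32);
        result = (result << chunk.size()) | std::bitset<32>(chunk).to_ulong();
    }
    return result;
}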

You can use strtol
char string[] = "1101110100110100100000";
char * end;
long int value = strtol (string,&end,2);

You can use Boost Dynamic Bitset:
#include <boost/dynamic_bitset.hpp>
#include <iostream>
#include <string>

boost::dynamic_bitset<> x(std::string("01011"));
std::cout << x << ":" << x.to_ulong() << std::endl;

#include <iostream>
#include <stdio.h>
#include <string>

using namespace std;

// Render value as a string of '0'/'1' of the given length;
// reverse == true gives most-significant-bit-first output.
string getBinaryString(int value, unsigned int length, bool reverse) {
    string output = string(length, '0');
    if (!reverse) {
        for (unsigned int i = 0; i < length; i++) {
            if ((value & (1 << i)) != 0) {
                output[i] = '1';
            }
        }
    }
    else {
        for (unsigned int i = 0; i < length; i++) {
            if ((value & (1 << (length - i - 1))) != 0) {
                output[i] = '1';
            }
        }
    }
    return output;
}

// Interpret the characters between the two indices as bits;
// lsbindex names the position of the least significant bit.
unsigned long getInteger(const string& input, size_t lsbindex, size_t msbindex) {
    unsigned long val = 0;
    unsigned int offset = 0;
    if (lsbindex > msbindex) {
        size_t length = lsbindex - msbindex;
        for (size_t i = msbindex; i <= lsbindex; i++, offset++) {
            if (input[i] == '1') {
                val |= (1 << (length - offset));
            }
        }
    }
    else { // lsbindex < msbindex
        for (size_t i = lsbindex; i <= msbindex; i++, offset++) {
            if (input[i] == '1') {
                val |= (1 << offset);
            }
        }
    }
    return val;
}

int main() {
    int value = 23;
    cout << value << ": " << getBinaryString(value, 5, false) << endl;
    string str = "01011";
    cout << str << ": " << getInteger(str, 1, 3) << endl;
}

Related

How to work with integer limits in C++. The number "-91283472332" is out of the range of a 32-bit signed integer

**Some problems in my code: I must convert a string to an integer, but it is limited to a 32-bit signed integer. I used the `stoi()` function and forgot about spaces in the string, and handling big numbers is impossible.**
My code:
#include <iostream>
#include <string>
#include <limits.h>

using namespace std;

int myAtoi(string str)
{
    int Res = 0;
    for (int i = 0; i < str.size(); i++)
    {
        if (str[0] >= 'a' && str[0] <= 'z')
        {
            return 0;
        }
    }
    for (int i = 0; i < str.size(); i++)
    {
        if (str[i] == ' ')
            i++;
        // if(str[i]>='0' || str[i]<='9' )
        //// if(str[i]==' ')
        ////     i++;
        Res = stoi(str);
        cout << "Res:" << Res << endl;
        if (Res <= INT_MIN)
        {
            return INT_MIN;
        }
        if (Res >= INT_MAX)
        {
            return INT_MAX;
        }
    }
    cout << "MIN=" << INT_MAX << endl;
    cout << "Res=" << Res;
    // return Res;
}
Use stoll(), which converts a std::string into a long long integer:
void printLongNumber(std::string str) {
    auto number = std::stoll(str);
    std::cout << number << std::endl;
}
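Since the question is about clamping to the 32-bit range, here is a minimal sketch of that idea on top of std::stoll (the clamping wrapper is my own addition, not part of the original answer):
#include <climits>
#include <string>

// Parse with 64-bit headroom, then clamp into the 32-bit signed range.
int myAtoi(const std::string& str)
{
    long long value = std::stoll(str); // may still throw for non-numeric or huge input
    if (value < INT_MIN) return INT_MIN;
    if (value > INT_MAX) return INT_MAX;
    return static_cast<int>(value);
}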

C++ binary input as a string to a decimal

I am trying to write code that takes a binary number input as a string and only accepts 1's or 0's; otherwise an error message should be displayed. Then it should go through a loop, digit by digit, to convert the binary string to decimal. I can't seem to get it right. The part that only accepts 1's or 0's works, but when it gets into the calculations something messes up and I can't get it correct. This is the closest I believe I have to getting it working. Could anyone give me a hint or help me with what I am doing wrong?
#include <iostream>
#include <string>

using namespace std;

string a;
int input();

int main()
{
    input();
    int decimal, x = 0, length, total = 0;
    length = a.length();
    // atempting to make it put the digits through a formula backwords.
    for (int i = length; i >= 0; i--)
    {
        // Trying to make it only add the 2^x if the number is 1
        if (a[i] = '1')
        {
            //should make total equal to the old total plus 2^x if a[i] = 1
            total = total + pow(x,2);
        }
        //trying to let the power start at 0 and go up each run of the loop
        x++;
    }
    cout << endl << total;
    int stop;
    cin >> stop;
    return 0;
}
int input()
{
    int x, x2, count, repeat = 0;
    while (repeat == 0)
    {
        cout << "Enter a string representing a binary number => ";
        cin >> a;
        count = a.length();
        for (x = 0; x < count; x++)
        {
            if (a[x] != '0' && a[x] != '1')
            {
                cout << a << " is not a string representing a binary number>" << endl;
                repeat = 0;
                break;
            }
            else
                repeat = 1;
        }
    }
    return 0;
}
pow is not suited to integer calculation; in this case you can use the shift operator instead.
a[i] = '1' assigns '1' to a[i] and returns '1', which is always true; you want == for comparison.
You shouldn't access a[length]; it is one past the last character of the string.
fixed code:
int main()
{
    input();
    int decimal, x = 0, length, total = 0;
    length = a.length();
    // atempting to make it put the digits through a formula backwords.
    for (int i = length - 1; i >= 0; i--)
    {
        // Trying to make it only add the 2^x if the number is 1
        if (a[i] == '1')
        {
            //should make total equal to the old total plus 2^x if a[i] = 1
            total = total + (1 << x);
        }
        //trying to let the power start at 0 and go up each run of the loop
        x++;
    }
    cout << endl << total;
    int stop;
    cin >> stop;
    return 0;
}
I would use this approach...
#include <iostream>
#include <string>

using namespace std;

int main()
{
    string str{ "10110011" }; // max length can be sizeof(int) X 8
    int dec = 0, mask = 1;
    for (int i = str.length() - 1; i >= 0; i--) {
        if (str[i] == '1') {
            dec |= mask;
        }
        mask <<= 1;
    }
    cout << "Decimal number is: " << dec;
    // system("pause");
    return 0;
}
Works for binary strings up to 32 bits. Swap the int out for a 64-bit type such as long long to handle 64 bits.
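A minimal sketch of that 64-bit variant (only the types and the example string change; the loop is the same as above):
#include <iostream>
#include <string>

using namespace std;

int main()
{
    string str{ "1011001110110011101100111011001110110011" }; // 40 bits as an example
    long long dec = 0, mask = 1;
    for (int i = str.length() - 1; i >= 0; i--) {
        if (str[i] == '1') {
            dec |= mask;
        }
        mask <<= 1;
    }
    cout << "Decimal number is: " << dec;
    return 0;
}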
I see multiple mistakes in your code.
Your for loop should start at i = length - 1 instead of i = length.
a[i] = '1' assigns '1' to a[i]; it does not compare.
pow(x, 2) means x squared, not 2 to the power x. pow is also not designed for integer operations; use 2*2*... or 1 << x instead.
Also, there are shorter ways to achieve it. Here is an example of how I would do it:
#include <cstddef>
#include <string>

std::size_t fromBinaryString(const std::string &str)
{
    std::size_t result = 0;
    for (std::size_t i = 0; i < str.size(); ++i)
    {
        // '0' - '0' == 0 and '1' - '0' == 1.
        // If you don't want to assume that, you can use if or switch
        result = (result << 1) + str[i] - '0';
    }
    return result;
}
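A short usage sketch, assuming fromBinaryString from above is in scope ("01011" is 11 in decimal):
#include <iostream>

int main()
{
    std::cout << fromBinaryString("01011") << std::endl; // prints 11
    return 0;
}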

C++: Change of base function (i.e. hex to octal, decimal, etc.) - Output slightly off for hex values

I need to create a generic function that converts from any starting base to any final base. I have everything down, except that my original function takes an int value for the number that it converts to another base, so I decided to just overload the function. I am OK with converting between every other base, but the result is slightly off when using my new function to take in a string hex value.
The code below should output 1235 for both functions. It does for the first one, but for the second I am currently getting 1347. Decimal to hex works fine; it's just the overloaded function (hex to anything else) that is slightly off.
Thanks.
#include <iostream>
#include <stack>
#include <string>
#include <cmath>

using namespace std;

void switchBasesFunction(stack<int> & myStack, int startBase, int finalBase, int num);
void switchBasesFunction(stack<int> & myStack, int startBase, int finalBase, string s);

int main()
{
    stack<int> myStack;
    string hexNum = "4D3";

    switchBasesFunction(myStack, 8, 10, 2323);
    cout << endl << endl;
    switchBasesFunction(myStack, 16, 10, hexNum);

    return 0;
}

void switchBasesFunction(stack<int> & myStack, int startBase, int finalBase, int num)
{
    int totalVal = 0;
    string s = to_string(num);

    for (int i = 0; i < s.length(); i++)
    {
        myStack.push(s.at(i) - '0');
    }

    int k = 0;
    while (myStack.size() > 0)
    {
        totalVal += (myStack.top() * pow(startBase, k++));
        myStack.pop();
    }

    string s1;
    while (totalVal > 0)
    {
        int temp = totalVal % finalBase;
        totalVal = totalVal / finalBase;

        char c;
        if (temp < 10)
        {
            c = temp + '0';
            s1 += c;
        }
        else
        {
            c = temp - 10 + 'A';
            s1 += c;
        }
    }

    for (int i = s1.length() - 1; i >= 0; i--)
    {
        cout << s1[i];
    }

    cout << endl << endl;
}

void switchBasesFunction(stack<int> & myStack, int startBase, int finalBase, string s)
{
    int totalVal = 0;

    for (int i = 0; i < s.length(); i++)
    {
        myStack.push(s.at(i) - '0');
    }

    int k = 0;
    while (myStack.size() > 0)
    {
        totalVal += (myStack.top() * pow(startBase, k++));
        myStack.pop();
    }

    string s1;
    while (totalVal > 0)
    {
        int temp = totalVal % finalBase;
        totalVal = totalVal / finalBase;

        char c;
        if (temp < 10)
        {
            c = temp + '0';
            s1 += c;
        }
        else
        {
            c = temp - 10 + 'A';
            s1 += c;
        }
    }

    for (int i = s1.length() - 1; i >= 0; i--)
    {
        cout << s1[i];
    }

    cout << endl << endl;
}
Sorry, but I'm having issues understanding your code, so I thought I'd simplify it.
Here's the algorithm / code (untested):
#include <iostream>
#include <string>

void convert_to_base(const std::string& original_value,
                     unsigned int original_base,
                     std::string& final_value_str,
                     unsigned int final_base)
{
    static const std::string digit_str =
        "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ";
    if ((original_base > digit_str.length()) || (final_base > digit_str.length()))
    {
        std::cerr << "Base value exceeds limit of " << digit_str.length() << ".\n";
        return;
    }

    // Parse the string from left to right, most significant digit first,
    // accumulating the value as a number.
    unsigned int original_number = 0;
    unsigned int digit_value = 0;
    for (std::string::size_type index = 0; index < original_value.length(); ++index)
    {
        std::string::size_type posn = digit_str.find(original_value[index]);
        if (posn == std::string::npos)
        {
            std::cerr << "unsupported digit encountered: " << original_value[index] << ".\n";
            return;
        }
        digit_value = posn;
        original_number = original_number * original_base + digit_value;
    }

    // Convert to a string of digits in the final base.
    while (original_number != 0)
    {
        digit_value = original_number % final_base;
        final_value_str.insert(0, 1, digit_str[digit_value]);
        original_number = original_number / final_base;
    }
}
**Warning: code not tested via compiler.**
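A short usage sketch, assuming the convert_to_base above (the question expects 1235 for both calls):
#include <iostream>
#include <string>

int main()
{
    std::string result;
    convert_to_base("2323", 8, result, 10);
    std::cout << result << "\n"; // expected: 1235

    result.clear();
    convert_to_base("4D3", 16, result, 10);
    std::cout << result << "\n"; // expected: 1235
    return 0;
}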

Quickly Converting uint32_t to binary

The main problem I'm having is reading out values in binary in C++ (Python had some really quick/easy functions to do this).
I just need the same. So at the moment I have:
ValWord<uint32_t> data1 = //[SOME READ FUNCTION]
When I use cout << data1; it gives me a number, e.g. 2147581953.
I want this to be in binary, and eventually each "bit" needs to be in its own bin, including all '0's, e.g.:
for (int i = 31; i >= 0; i--) {
    cout << binary[i];
}
This would give me the 32-bit-long binary number. When I've had it as a straightforward int, I've used:
int data[32];
bitset<32>(N) = data1;
for (int i = 31; i >= 0; i--) {
    data[i] = (bitset<32>(N[i]).to_ulong());
}
for (int i = 31; i >= 0; i--) {
    cout << data[i];
}
But this just gives me error messages. Any ideas?
Maybe this:
#define CPlusPlus11 0

#include <bitset>
#include <cstdint>
#include <iostream>

#if CPlusPlus11
int main()
{
    std::uint32_t value(42);
    std::bitset<32> bits(value);
    std::cout << bits.to_string() << std::endl;
    // Storing integral values in the string:
    for (auto i : bits.to_string(char(0), char(1))) {
        std::cout << (int)i;
    }
    std::cout << std::endl;
    return 0;
}
#else
int main()
{
    std::uint32_t value(42);
    std::bitset<32> bits(value);
    std::cout << bits.to_string() << std::endl;
    char data[32];
    for (unsigned i = 0; i < 32; ++i) {
        data[i] = bits[i];
    }
    for (unsigned i = 32; i; --i) {
        std::cout << int(data[i-1]);
    }
    std::cout << std::endl;
    return 0;
}
#endif
Note: Your expressions bitset<32>(N) = data1 and bitset<32>(N[i]) are a code smell.
In general, transforming a number into a string in a given base is a quite common task:
#include <cassert>
#include <cstddef>
#include <string>

static char const Digits[] = "0123456789abcdefghijklmnopqrstuvwxyz";
static size_t const DigitsSize = sizeof(Digits) - 1;
static size_t const BufferSize = 32;

std::string convert(unsigned number, unsigned base) {
    assert(base >= 2 and base <= DigitsSize);
    char buffer[BufferSize] = {};
    char* p = buffer + BufferSize;
    while (number != 0) {
        *(--p) = Digits[number % base];
        number /= base;
    }
    return std::string(p, (buffer + BufferSize) - p);
}
Note: BufferSize was computed for the minimum base of 2; base 1 and base 0 are nonsensical.
Note: if the number can be negative, the simplest approach is to test for that beforehand and then convert its opposite; a special caveat is that the opposite of the minimum value of a 32-bit integer cannot itself be represented in a 32-bit integer.
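A short usage sketch, assuming the convert function above (the expected output is noted in the comments):
#include <iostream>

int main() {
    std::cout << convert(42, 2) << "\n";  // 101010
    std::cout << convert(42, 16) << "\n"; // 2a
    return 0;
}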
I have some simple functions I use for this, using the STL; here is one for binary:
#include <iostream>
#include <string>
#include <algorithm>

using namespace std;

string binary(unsigned long n)
{
    string result;
    do result.push_back('0' + (n & 1));
    while (n >>= 1);
    reverse(result.begin(), result.end());
    return result;
}

int main()
{
    cout << binary(1024) << endl;
    cout << "Hello World" << endl;
    return 0;
}
Hope this is of use to you!
Let me know if you need something more performant and I can try to rustle up some Assembler code for you.

Missing punctuation from C++ hex2bin

While trying to duplicate PHP's bin2hex($s) and pack('H*',$s) (aka hex2bin($s) in PHP 5.4.3+) in GCC/Linux C++, I seem to have it figured out except that it's dropping punctuation for some strange reason. Can you figure out what I might be doing wrong in the hex2bin() function? I compared PHP's bin2hex() with mine and it appears to be working there properly, so the problem is in hex2bin().
#include <strings.h>
#include <string>
#include <stdio.h>
#include <stdlib.h>

using namespace std;

string bin2hex(string s) {
    int nLen = s.length();
    string sOut;
    char cBuff[2];
    for (int i = 0; i < nLen; i++) {
        sprintf(cBuff, "%.2x", s[i]);
        sOut.append(cBuff);
        cBuff[0] = '\0';
    }
    return sOut;
}

string hex2bin(string s) {
    int nLen = s.length();
    string sOut;
    char cBuff1[2];
    char cBuff2[2];
    char cBuff[1];
    int n, n1, n2;
    for (int i = 0; i <= nLen; i += 2) {
        sprintf(cBuff1, "%c", s[i]);
        sprintf(cBuff2, "%c", s[i+1]);
        n1 = atoi(cBuff1);
        n2 = atoi(cBuff2);
        n = (n1 * 16) + n2;
        sprintf(cBuff, "%c", n);
        sOut.append(cBuff);
        cBuff[0] = '\0';
        cBuff1[0] = '\0';
        cBuff2[0] = '\0';
    }
    return sOut;
}

int main() {
    string s;
    string sResult;

    s = "This is a 123 test.";
    sResult = bin2hex(s);
    printf("ENCODED: %s\n", sResult.c_str());

    sResult = hex2bin(sResult);
    printf("UNENCODED: %s\n", sResult.c_str());

    return 1;
}
This emits:
ENCODED: 5468697320697320612031323320746573742e
UNENCODED: This is a 123 test
Okay, sleeves rolled up: let's look at a C++ version:
Live on Coliru
Don't use C strings unless you need to (sprintf to build a two-char string is not... very efficient)
Use iostreams to encode/decode the hex digits (std::hex)
The hex2bin could be optimized, but I went for "simpler"
I added a modicum of input sanitizing in hex2bin
#include <string>
#include <sstream>
#include <iomanip>

std::string bin2hex(std::string const &s) {
    std::ostringstream oss;
    for (unsigned char ch : s)
        oss << std::hex << std::setw(2) << std::setfill('0') << (int) ch;
    return oss.str();
}

#include <cassert>

std::string hex2bin(std::string const& s) {
    assert(s.length() % 2 == 0);
    std::string sOut;
    sOut.reserve(s.length()/2);

    std::string extract;
    for (std::string::const_iterator pos = s.begin(); pos < s.end(); pos += 2)
    {
        extract.assign(pos, pos + 2);
        sOut.push_back(std::stoi(extract, nullptr, 16));
    }
    return sOut;
}

#include <iostream>

int main() {
    std::cout << "ENCODED: " << bin2hex("This is a 123 test.") << "\n";
    std::cout << "DECODED: " << hex2bin(bin2hex("This is a 123 test.")) << "\n";
}
Output:
ENCODED: 5468697320697320612031323320746573742e
DECODED: This is a 123 test.
With all but the period '.' you just got lucky: the hex digits didn't contain an actual hexadecimal letter. However, for the period you got 2e, and you tried to decode the e using, roughly, atoi("e"): that won't work, as atoi() requires a decimal value. You could use strtol(str, 0, 16) instead to decode the hexadecimal value.
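For instance, a minimal sketch of decoding one hex pair with strtol (the two-character buffer is just an illustration):
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    char pair[3] = {'2', 'e', '\0'};      /* one hex pair, NUL-terminated */
    long value = strtol(pair, NULL, 16);  /* 0x2e == 46 == '.' */
    printf("%ld -> %c\n", value, (char)value);
    return 0;
}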
Note that you have a few buffer overruns when you are using sprintf(): this function writes a terminating null character. In general, you are much better off using snprintf() to avoid buffer overruns. Also, in your decoding routine you access values beyond the end of your string (you use i <= nLen with nLen = s.length() and then access s[i] and s[i+1]). Of course, the code is far too complex:
#include <string>
#include <sstream>
#include <iostream>
#include <iomanip>

std::string bin2hex(std::string s) {
    std::ostringstream out;
    out << std::hex << std::setfill('0');
    for (char c: s) {
        out << std::setw(2) << int(c);
    }
    return out.str();
}

std::string hex2bin(std::string s) {
    std::string rc;
    int nLen = s.length();
    int tmp;
    for (int i(0); i + 1 < nLen; i += 2) {
        if (std::istringstream(s.substr(i, 2)) >> std::hex >> tmp) {
            rc.push_back(tmp);
        }
    }
    return rc;
}

int main() {
    std::string s;
    std::string sResult;

    s = "This is a 123 test.";
    sResult = bin2hex(s);
    std::cout << "ENCRYPTED: " << sResult << '\n';

    sResult = hex2bin(sResult);
    std::cout << "UNENCRYPTED: " << sResult << '\n';

    return 1;
}
Your code does not convert hexadecimal digits correctly because atoi can only handle decimal digits. Try this
sprintf(cBuff1,"%c",s[i]);
sprintf(cBuff2,"%c",s[i+1]);
n1 = strtoul(cBuff1, 0, 16);
n2 = strtoul(cBuff2, 0, 16);
Also your for loop should be
for (int i = 0; i < nLen; i+=2) {
n1 = atoi(cBuff1);
n2 = atoi(cBuff2);
n = (n1 * 16) + n2;
If cBuff1 is, say, "a", then this won't work, since 'a' is not a decimal digit. It works fine for the digits '0'-'9', but not 'a'-'f'.
You will need to translate non-digit hex characters to numeric values.
There are quite a few ways to convert a hex value string to a byte. I think this is pretty decent:
int hexchar(char c)
{
    if (c >= '0' && c <= '9') return c - '0';
    // if you need to support upper-case hex:
    // c = tolower(c);
    if (c >= 'a' && c <= 'f') return c - 'a' + 10;
    // If we get here, panic
    cout << "Error, invalid hex digit:" << c << endl;
    return -1;
}

void hexbyte(const string& s)
{
    for (size_t i = 0; i + 1 < s.length(); i += 2)
    {
        char c = hexchar(s[i]);
        c <<= 4;
        c += hexchar(s[i+1]);
        cout << c;
    }
}
Try these trivial routines, good for C and C++:
#include <stdio.h>
#include <string.h>

/*------------------------------------------+
|  bin2hex   bin2hex   bin2hex              |
+------------------------------------------*/
static char *bin2hex(unsigned char *s, long L)
{
    static char hex[2048];
    long i, l = 0;

    for (i = 0; i < L; i++) l += sprintf(&hex[l], "%02x", 0xFF & (*(s + i)));
    hex[l] = 0;
    return hex;
}

/*------------------------------------------+
|  hex2bin   hex2bin   hex2bin              |
+------------------------------------------*/
static char *hex2bin(char *s)
{
    static char bin[2048];
    unsigned int i, e, l = 0, L = strlen(s);

    for (i = 0; i < L; i += 2) { sscanf(s + i, "%02x", &e); bin[l++] = (char)e; }
    bin[l] = 0;
    return bin;
}
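A short usage sketch, assuming the two routines above; note that each returns a pointer to its own static buffer, so a result is overwritten by the next call to the same routine:
#include <stdio.h>
#include <string.h>

int main(void) {
    const char *msg = "This is a 123 test.";
    /* copy the returned string if you need it past the next call */
    char *hex = bin2hex((unsigned char *)msg, (long)strlen(msg));
    printf("ENCODED: %s\n", hex);
    printf("DECODED: %s\n", hex2bin(hex));
    return 0;
}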