Extracting int from char array in C++

I am trying to extract single characters from a char array and convert them into an integer.
I need to extract a number from the code. For example, if the user enters A23B, I need to extract 23 and store it in a single variable. Here is my code:
#include <iostream>
#include <stdlib.h>
#include <string.h>
using namespace std;
int main()
{
char code[5] ={'\0'};
cout << "Enter Your Four Digit Code\nExample A23B\n";
cin.getline(code,5);
cout << "You typed:\n" << code;
int a = atoi(code[1]);
int b = atoi(code[2]);
cout << endl <<a <<"\t"<<b;
//Other processing related to number a and b goes here
}
but it does not work and produces the following errors:
C:\helo\clan\Test\main.cpp||In function 'int main()':|
C:\helo\clan\Test\main.cpp|12|error: invalid conversion from 'char' to 'const char*'|
C:\helo\clan\Test\main.cpp|12|error: initializing argument 1 of 'int atoi(const char*)'|
C:\helo\clan\Test\main.cpp|13|error: invalid conversion from 'char' to 'const char*'|
C:\helo\clan\Test\main.cpp|13|error: initializing argument 1 of 'int atoi(const char*)'|
||=== Build finished: 4 errors, 0 warnings ===|

atoi takes a const char*, not char.
If you need to get '2' and '3' from "A23B":
int b = atoi(code + 2);
code[2] = 0;
int a = atoi(code + 1);
If you need to get '23' from "A23B" then:
int a = atoi(code + 1);
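For instance, a minimal sketch of the original program using this approach (same buffer and input format as in the question) might look like:
#include <iostream>
#include <cstdlib>
using namespace std;
int main()
{
char code[5] = {'\0'};
cout << "Enter Your Four Digit Code\nExample A23B\n";
cin.getline(code, 5);
// atoi needs a pointer to a null-terminated string, not a single char;
// code + 1 points at "23B", and atoi stops at the first non-digit
int a = atoi(code + 1); // 23
cout << endl << a;
}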

For the common case, why not use std::string and std::stringstream, like this:
#include <string>
#include <sstream>
template <class T>
std::string num2string (const T &in)
{
static std::stringstream out;
out.str ("");
out.clear();
out << in;
return out.str();
}
template <class T>
T string2num (const std::string &in)
{
T out;
static std::stringstream tmp;
tmp.str ("");
tmp.clear();
tmp << in;
tmp >> out;
return out;
}
You can use these two functions to convert between strings and numbers (int, double, ...).
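For example, a small usage sketch of these helpers applied to the question's input (the variable names are just illustrative):
std::string code = "A23B";
int a = string2num<int>(code.substr(1, 2)); // "23" -> 23
std::string s = num2string(a);              // 23 -> "23"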

Why not
int a = int(code[1]-'0') * 10 + int(code[2] - '0');
i.e. Convert the two ASCII characters to the appropriate integers and then do the maths.
EDIT
You should check that the string is 4 characters long and that characters 2 & 3 are digits (see the sketch below).
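A possible sketch of that check (using std::isdigit from <cctype>; the helper name is just illustrative):
#include <cctype>
#include <cstring>
bool validCode(const char* code)
{
return std::strlen(code) == 4 &&
       std::isdigit(static_cast<unsigned char>(code[1])) &&
       std::isdigit(static_cast<unsigned char>(code[2]));
}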

You pass a char to a function that expects a char*, and you can't do that. Why not do this:
int a = code[1] & 0xff;
int b = code[2] & 0xff;

I think you want something like this:
#include <iostream>
#include <stdlib.h>
#include <string.h>
using namespace std;
int main()
{
char code[5] ={'\0'};
cout << "Enter Your Four Digit Code\nExample A23B\n";
cin.getline(code,5);
cout << "You typed:\n" << code;
char aStr[2] = {code[1], 0};
char bStr[2] = {code[2], 0};
int a = atoi(aStr);
int b = atoi(bStr);
cout << endl <<a <<"\t"<<b;
//Other processing related to number a and b goes here
}
And if you want 23 in one variable then maybe this:
char aStr[3] = {code[1], code[2], 0};
int a = atoi(aStr);

Your code is horribly unsafe (what if the user enters more than four digits? What if a line break takes up more than one byte?)
Try something like this:
int a, b;
std::string line;
std::cout << "Enter Your Four Digit Code (Example: A23B): ";
while (std::getline(std::cin, line))
{
if (line.length() != 4 || !isNumber(line[1]) || !isNumber(line[2]))
{
std::cout << "You typed it wrong. Try again: ";
continue;
}
a = line[1] - '0';
b = line[2] - '0';
break;
}
We need the isNumber helper:
inline bool isNumber(char c) { return '0' <= c && c <= '9'; }

If anyone wants to do this without including too many libraries (some schools, like mine, only allow the basic ones):
this function is written with just <cmath> and <cctype>.
It will take a course number like cs163 and turn it into the int 163.
The ASCII value of '1' is 49, which is why the - 48 (the value of '0') is there.
Something like this will do it as well, without using much of the string class.
It isn't super optimised, I know, but it will work.
int courseIndexFunction(char * name){
int courseIndex = 0;
int courseNameLength = 5;
if (name[2] && name[3] && name[4]){
//This function will take cs163 and turn
//it into int 163
for (int i = 2; i < courseNameLength; ++i){
double x = name[i] - 48;
x = x * pow(10.0 , 4-i);
courseIndex += x;
}
}
return courseIndex;
}
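Combined with the function above, a quick usage sketch could be:
#include <iostream>
int main()
{
char name[] = "cs163";
std::cout << courseIndexFunction(name) << "\n"; // prints 163
}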

Related

C++: Convert CONTENT of String to char [duplicate]

I want to convert a hex string to a 32 bit signed integer in C++.
So, for example, I have the hex string "fffefffe". The binary representation of this is 11111111111111101111111111111110. The signed integer representation of this is: -65538.
How do I do this conversion in C++? This also needs to work for non-negative numbers. For example, the hex string "0000000A", which is 00000000000000000000000000001010 in binary, and 10 in decimal.
use std::stringstream
unsigned int x;
std::stringstream ss;
ss << std::hex << "fffefffe";
ss >> x;
the following example produces -65538 as its result:
#include <sstream>
#include <iostream>
int main() {
unsigned int x;
std::stringstream ss;
ss << std::hex << "fffefffe";
ss >> x;
// output it as a signed type
std::cout << static_cast<int>(x) << std::endl;
}
In the new C++11 standard, there are a few new utility functions you can make use of. Specifically, there is a family of "string to number" functions (http://en.cppreference.com/w/cpp/string/basic_string/stol and http://en.cppreference.com/w/cpp/string/basic_string/stoul). These are essentially thin wrappers around C's string-to-number conversion functions, but they know how to deal with a std::string.
So, the simplest answer for newer code would probably look like this:
std::string s = "0xfffefffe";
unsigned int x = std::stoul(s, nullptr, 16);
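These functions report failure via exceptions, so a hedged sketch of parsing the hex string and getting the signed 32-bit result the question asks for might be:
#include <cstdint>
#include <iostream>
#include <stdexcept>
#include <string>
int main()
{
std::string s = "fffefffe";
try {
// base 16; std::stoul also accepts an optional "0x" prefix
// narrowing to int32_t yields -65538 on typical two's-complement platforms
std::int32_t value = static_cast<std::int32_t>(std::stoul(s, nullptr, 16));
std::cout << value << std::endl;
} catch (const std::invalid_argument&) {
std::cout << "not a number" << std::endl;
} catch (const std::out_of_range&) {
std::cout << "out of range" << std::endl;
}
}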
NOTE: Below is my original answer, which, as the edit says, is not a complete answer. For a functional solution, stick with the code above the line :-).
The problem is that lexical_cast<> is defined to have stream conversion semantics, and sadly, streams don't understand the "0x" notation. So both boost::lexical_cast and my hand-rolled version don't deal well with hex strings. The solution above, which manually sets the input stream to hex, handles it just fine.
Boost has some facilities to do this as well, with some nice error-checking capabilities. You can use it like this:
try {
unsigned int x = lexical_cast<int>("0x0badc0de");
} catch(bad_lexical_cast &) {
// whatever you want to do...
}
If you don't feel like using boost, here's a light version of lexical cast which does no error checking:
template<typename T2, typename T1>
inline T2 lexical_cast(const T1 &in) {
T2 out;
std::stringstream ss;
ss << in;
ss >> out;
return out;
}
which you can use like this:
// though this needs the 0x prefix so it knows it is hex
unsigned int x = lexical_cast<unsigned int>("0xdeadbeef");
For a method that works with both C and C++, you might want to consider using the standard library function strtol().
#include <cstdlib>
#include <iostream>
using namespace std;
int main() {
string s = "abcd";
char * p;
long n = strtol( s.c_str(), & p, 16 );
if ( * p != 0 ) {
cout << "not a number" << endl;
}
else {
cout << n << endl;
}
}
Andy Buchanan, as far as sticking to C++ goes, I liked yours, but I have a few mods:
template <typename ElemT>
struct HexTo {
ElemT value;
operator ElemT() const {return value;}
friend std::istream& operator>>(std::istream& in, HexTo& out) {
in >> std::hex >> out.value;
return in;
}
};
Used like
uint32_t value = boost::lexical_cast<HexTo<uint32_t> >("0x2a");
That way you don't need one impl per int type.
A working example with strtoul would be:
#include <cstdlib>
#include <iostream>
using namespace std;
int main() {
string s = "fffefffe";
char * p;
long n = strtoul( s.c_str(), & p, 16 );
if ( * p != 0 ) {
cout << "not a number" << endl;
} else {
cout << n << endl;
}
}
strtol converts a string to long. On my computer numeric_limits<long>::max() gives 0x7fffffff, and 0xfffefffe is obviously greater than 0x7fffffff, so strtol returns LONG_MAX instead of the wanted value. strtoul converts the string to unsigned long, which is why there is no overflow in this case.
Note that strtol does not treat the input string as a 32-bit signed integer during the conversion. A fun example with strtol:
#include <cstdlib>
#include <iostream>
using namespace std;
int main() {
string s = "-0x10002";
char * p;
long n = strtol( s.c_str(), & p, 16 );
if ( * p != 0 ) {
cout << "not a number" << endl;
} else {
cout << n << endl;
}
}
The code above prints -65538 to the console.
Here's a simple and working method I found elsewhere:
string hexString = "7FF";
int hexNumber;
sscanf(hexString.c_str(), "%x", &hexNumber);
Please note that you might prefer using an unsigned long / long integer to receive the value.
Another note: the c_str() function just converts the std::string to a const char*.
So if you have a const char* ready, just go ahead and use that variable name directly, as shown below (I am also showing the use of an unsigned long variable for a larger hex number; do not confuse it with the case of having a const char* instead of a string):
const char *hexString = "7FFEA5"; //Just to show the conversion of a bigger hex number
unsigned long hexNumber; //In case your hex number is going to be sufficiently big.
sscanf(hexString, "%lx", &hexNumber); // %lx, since hexNumber is an unsigned long
This works just perfectly fine (provided you use appropriate data types per your need).
I had the same problem today; here's how I solved it so I could keep lexical_cast<>:
typedef unsigned int uint32;
typedef signed int int32;
class uint32_from_hex // For use with boost::lexical_cast
{
uint32 value;
public:
operator uint32() const { return value; }
friend std::istream& operator>>( std::istream& in, uint32_from_hex& outValue )
{
in >> std::hex >> outValue.value;
return in;
}
};
class int32_from_hex // For use with boost::lexical_cast
{
uint32 value;
public:
operator int32() const { return static_cast<int32>( value ); }
friend std::istream& operator>>( std::istream& in, int32_from_hex& outValue )
{
in >> std::hex >> outValue.value;
return in;
}
};
uint32 material0 = lexical_cast<uint32_from_hex>( "0x4ad" );
uint32 material1 = lexical_cast<uint32_from_hex>( "4ad" );
uint32 material2 = lexical_cast<uint32>( "1197" );
int32 materialX = lexical_cast<int32_from_hex>( "0xfffefffe" );
int32 materialY = lexical_cast<int32_from_hex>( "fffefffe" );
// etc...
(Found this page when I was looking for a less sucky way :-)
Cheers,
A.
Just use stoi/stol/stoll.
For example:
std::cout << std::stol("fffefffe", nullptr, 16) << std::endl;
output: 4294901758
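If the signed 32-bit value from the question is wanted, the result can be narrowed afterwards, e.g. (assuming <cstdint> is available):
std::cout << static_cast<std::int32_t>(std::stol("fffefffe", nullptr, 16)) << std::endl; // -65538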
This worked for me:
string string_test = "80123456";
unsigned long x;
signed long val;
std::stringstream ss;
ss << std::hex << string_test;
ss >> x;
// ss >> val; // if I try this val = 0
val = (signed long)x; // However, if I cast the unsigned result I get val = 0x80123456
Try this. This solution is a bit risky. There are no checks. The string must only have hex values and the string length must match the return type size. But no need for extra headers.
char hextob(char ch)
{
if (ch >= '0' && ch <= '9') return ch - '0';
if (ch >= 'A' && ch <= 'F') return ch - 'A' + 10;
if (ch >= 'a' && ch <= 'f') return ch - 'a' + 10;
return 0;
}
template<typename T>
T hextot(char* hex)
{
T value = 0;
for (size_t i = 0; i < sizeof(T)*2; ++i)
value |= hextob(hex[i]) << (8*sizeof(T)-4*(i+1));
return value;
};
Usage:
int main()
{
char str[4] = {'f','f','f','f'};
std::cout << hextot<int16_t>(str) << "\n";
}
Note: the length of the string must be divisible by 2
For those looking to convert number bases for unsigned numbers, it is pretty trivial to do yourself in both C and C++ with minimal dependencies (the only function not provided by the language itself is pow()).
In mathematical terms, a positive number with n digits d_(n-1) ... d_1 d_0 in base b can be converted to base 10 using:
value = d_(n-1)*b^(n-1) + ... + d_1*b^1 + d_0*b^0
Example: converting the base 16 number 00f looks like:
= 0*16^2 + 0*16^1 + 15*16^0 = 15
C/C++ Example:
#include <math.h>
#include <stdio.h>
unsigned int to_base10(char *d_str, int len, int base)
{
if (len < 1) {
return 0;
}
char d = d_str[0];
// chars 0-9 = 48-57, chars a-f = 97-102
int val = (d > 57) ? d - ('a' - 10) : d - '0';
int result = val * pow(base, (len - 1));
d_str++; // increment pointer
return result + to_base10(d_str, len - 1, base);
}
int main(int argc, char const *argv[])
{
char n[] = "00f"; // base 16 number of len = 3
printf("%d\n", to_base10(n, 3, 16));
}

C++: Constant Reference Parameters

Why do we use constant reference parameters in this code?
#include <iostream>
#include <string>
#include <cctype>
using namespace std;
// Converts a hex number as a string to decimal
int hex2Dec(const string& hex); // why use a reference and const here? It seems we do not change hex in the function body
// Converts a hex character to a decimal value
int hexCharToDecimal(char ch);
int main()
{
// Prompt the user to enter a hex number as a string
cout << "Enter a hex number: ";
string hex;
cin >> hex;
cout << "The decimal value for hex number " << hex
<< " is " << hex2Dec(hex) << endl;
return 0;
}
int hex2Dec(const string& hex)
{
int decimalValue = 0;
for (unsigned i = 0; i < hex.size(); i++)
decimalValue = decimalValue * 16 + hexCharToDecimal(hex[i]);
return decimalValue;
}
int hexCharToDecimal(char ch)
{
ch = toupper(ch); // Change it to uppercase
if (ch >= 'A' && ch <= 'F')
return 10 + ch - 'A';
else // ch is '0', '1', ..., or '9'
return ch - '0';
}
What is the use of passing by reference in this problem?
In this segment of code
int hex2Dec(const string& hex)
{
int decimalValue = 0;
for (unsigned i = 0; i < hex.size(); i++)
decimalValue = decimalValue * 16 + hexCharToDecimal(hex[i]);
return decimalValue;
}
we do not change hex.
What is the purpose of a const reference in this example, and in general?
The naive solution is to use int hex2Dec(string hex) here. However, because function arguments are copied in C++, calling this function would cause a new string to be created for hex, copied from the function argument, every time hex2Dec is called. This can lead to needless performance issues, especially if the strings are large.
The solution to this problem is to pass the argument by reference. Using int hex2Dec(string& hex) fixes the first problem: calling the function never causes a new string to be created. It always refers to whatever string was given.
This introduces a new problem. First, because the argument is a reference, it is possible for the function to change the argument. Because we can see the function implementation, we know it doesn't, but anyone trying to use the function can't know that. Second, because of this, it is not possible to call the function with a const string. The compiler knows it is forbidden to change a const string, and it sees that the function doesn't promise not to change it, so it will produce an error if you try. This is very limiting; for example, it wouldn't be possible to call the function with a string literal (e.g. hex2Dec("test")).
#include <string>
int hex2Dec(std::string& hex);
int main()
{
std::string foo = "foo";
const std::string bar = "bar";
hex2Dec(foo); // Okay
//hex2Dec(bar); Compilation Error
//hex2Dec("baz"); Compilation Error
}
The solution to this new problem is to add const to the argument type: int hex2Dec(const string& hex). The const means that this reference can never be used to modify the argument. Now, users of the function and the compiler both know it's safe to use the function with const strings, and calling the function never copies the argument.
#include <string>
int hex2Dec(const std::string& hex);
int main()
{
std::string foo = "foo";
const std::string bar = "bar";
hex2Dec(foo); // Okay
hex2Dec(bar); // Okay
hex2Dec("baz"); // Okay
}

What is the quickest way to translate char* to number?

What is the quickest way to translate a char* to a number? I need to convert 4 chars to an int, or two chars to a short int.
I tried something like
char* ar;
//fill ar with values
int x= ar[1]+1[2]<<8+ar[3]<<16+ar[4]<<24; // ar[0] number of chars for number (short 2, int 4)
but the result is always zero. (To explain: I convert numbers to char* and then send them over the network; on the other side I am trying to reverse the process.)
Use the atoi function:
#include <iostream>
#include <cstdlib>
int main ()
{
int i;
const char * num = "325";
i = atoi (num);
std::cout << i << std::endl;
return 0;
}
Edit
As pointed out in the comments, you should not use the atoi function, because you can't tell whether the conversion failed (atoi returns 0 on failure, but what about the case int i = atoi("0");?). As you are using C++, there is the option to use a stringstream:
#include <iostream>
#include <sstream>
using namespace std;
int main()
{
const char * num = "3443";
int result;
stringstream ss;
ss << num;
ss >> result;
if (!ss.fail()) {
cout << result << endl;
}
return 0;
}
Unfortunately, I don't have a C++11 compiler here, so I cannot try the variant with std::stoi.
Edit 2
I've done some quick research, and here is a topic that suggests using the strtol function: How to parse a string to an int in C++?
ar[1]+1[2]<<8+ar[3]<<16+ar[4]<<24;
With C++ operator precedence this is parsed as:
(ar[1]+1[2]) << (8+ar[3]) << (16+ar[4]) << 24
No wonder it's always 0. Use parens. You can also use |, but I would suggest parens anyway.
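For example, a corrected version of the expression with explicit parentheses (masking each byte first so char sign extension can't flip the result; this assumes ar stores the bytes least significant first, as in the question):
int x = (ar[1] & 0xff)
      | ((ar[2] & 0xff) << 8)
      | ((ar[3] & 0xff) << 16)
      | ((ar[4] & 0xff) << 24);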
Guessing based on your sample code, I think this is what you are looking for (do you really have a void*, not a char*?):
#include <cassert>
#include <iostream>

unsigned int getValue(char* c) {
if (c[0] == 0x2) {
return *(reinterpret_cast<unsigned short*>(c + 1));
} else if (c[0] == 0x4) {
return *(reinterpret_cast<unsigned int*>(c + 1));
} else {
assert(false);
return 0; // not reached; avoids a missing-return warning
}
}
int main() {
char c[5];
char d[5];
c[0] = 0x2;
d[0] = 0x4;
char* cStart = &c[1];
*(reinterpret_cast<unsigned short*>(cStart)) = 1000;
char* dStart = &d[1];
*(reinterpret_cast<unsigned int*>(dStart)) = 1123124;
std::cout << getValue(c) << std::endl;
std::cout << getValue(d) << std::endl;
return 0;
}

Parsing command line char arguments as ints with error checking in C++

I'm trying to write a program that takes in two ints as command line arguments. The ints both need to be greater than 0. I understand that I need to convert from char, but I have only ever done that using atoi, which I now know I shouldn't use. I've seen people use stringstreams and strtol, but I'm not sure how those would work in this case. What is the best way to accomplish this?
#include <iostream>
#include <string>
#include <string.h>
#include <stdlib.h>
#include <stdio.h>
using namespace std;
const int N = 7;
const int M = 8;//N is number of lines, M number of values
//--------------
//-----Main-----
//--------------
int main(int argc, char* argv[])
{
if((argc != 0) && (argv[0] != NULL) && (argv[1] != NULL))
{
N = argv[0];
M = argv[1];
}
else
{
cout << "Invalid or no command line arguments found. Defaulting to N=7 M=8.\n\n" << endl;
}
//Blah blah blah code here
return 0;
}
In C++11 there's stoi, stol, stoll for this: http://en.cppreference.com/w/cpp/string/basic_string/stol
Those throw invalid_argument or out_of_range exceptions if the string isn't in the right format.
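For instance, a minimal sketch of parsing the two arguments with std::stoi and falling back to the defaults on error (argument positions assumed to be argv[1] and argv[2]):
#include <iostream>
#include <stdexcept>
#include <string>
int main(int argc, char* argv[])
{
int n = 7, m = 8; // defaults
if (argc >= 3) {
try {
int a = std::stoi(argv[1]);
int b = std::stoi(argv[2]);
if (a > 0 && b > 0) { n = a; m = b; }
} catch (const std::exception&) {
// invalid_argument or out_of_range: keep the defaults
}
}
std::cout << "N=" << n << " M=" << m << "\n";
}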
There's nothing particularly wrong with using atoi, except that it doesn't have a mechanism to report errors because it's a C function. So you only have the return value, and the problem is that all return values of atoi are valid values, so there's no way to differentiate between a return value of 0 as the correct parsing of "0" and 0 as a parsing failure. Also, atoi doesn't do any checks on whether the value is outside the available value range. The first problem is easy to fix by doing the check yourself; the second is more difficult, because it involves actually parsing the string, which kind of defeats the point of using an external function in the first place.
You can use istringstream like this:
Pre-C++11:
int val;
std::istringstream iss(argv[i]);
iss >> val;
if (iss.fail()) {
//something went wrong
} else {
//apparently it worked
}
C++11:
int val;
std::istringstream iss(argv[i]);
iss >> val;
if(iss.fail()) {
if(val == 0) {
//wrong number format
} else if(val == std::numeric_limits<int>::max() ||
val == std::numeric_limits<int>::min()) // needs <limits>
{
//number too large or too small
}
} else {
//apparently it worked
}
The difference is that pre-C++11, only format errors were detected (according to the standard), and the value was not overwritten on error. In C++11, the value is overwritten with 0 on a format error, or with the max/min of the type if the number is too large or too small to fit. Both set the fail flag on the stream to indicate errors.
In this specific case, atoi will work fine. The problem with atoi is that you can't differentiate between its returning 0 to signify an error of some sort and its returning 0 to indicate that the input was 0.
In your case, however, a valid input must be greater than 0. You don't care whether the input was 0 or something else that couldn't be converted; either way you're going to set it to the default value.
As such, I'd do something like:
int convert(char *input, int fallback) { // "default" is a C++ keyword, so use another parameter name
int x = atoi(input);
return x == 0 ? fallback : x;
}
if (argc > 1)
N = convert(argv[1], 7);
if (argc > 2)
M = convert(argv[2], 8);
Note that argv[0] traditionally holds the name of the program being invoked. Arguments passed on the command line are received as argv[1] through argv[argc-1].
First, you can't use the const qualifier for M and N, since you will change their values:
int N = 7;
int M = 8;//N is number of lines, M number of values
Second, you don't need to check for (argv[0] != NULL) && (argv[1] != NULL); just check whether argc (the argument count) is greater than or equal to 3:
if(argc >= 3)
Then you need to convert these into integers. If you don't want to use atoi and you don't have a C++11 compiler, you can use C++'s stringstream or C's strtol:
stringstream ss;
int temp;
ss << argv[1]; // Put string into stringstream
ss >> temp; // Get integer from stringstream
// Check for the error:
if(!ss.fail())
{
M = temp;
}
// Repeat
ss.clear(); // Clear the stream's error/eof flags so it can be reused
ss << argv[2]; // Put string into stringstream
ss >> temp; // Get integer from stringstream
// Check for the error:
if(!ss.fail())
{
N = temp;
}
So the whole code will look like this:
#include <iostream>
#include <string>
#include <cstring>
#include <cstdlib>
#include <cstdio>
#include <sstream>
using namespace std;
int N = 7;
int M = 8;//N is number of lines, M number of values
//--------------
//-----Main-----
//--------------
int main(int argc, char* argv[])
{
if(argc >= 3)
{
stringstream ss;
int temp;
ss << argv[1]; // Put char into stringstream
ss >> temp; // Get integer from stringstream
// Check for the error:
if(!ss.fail())
{
M = temp;
}
// Repeat
ss.clear(); // Clear the error/eof flags before reusing the stream
ss << argv[2]; // Put char into stringstream
ss >> temp; // Get integer from stringstream
// Check for the error:
if(!ss.fail())
{
N = temp;
}
cout << M << " " << N;
}
else
{
cout << "Invalid or no command line arguments found. Defaulting to N=7 M=8.\n\n" <<
endl;
}
//Blah blah blah code here
return 0;
}
Also, in C++ the C header files are included with a c prefix and no .h suffix (<cstdio> instead of <stdio.h>).