Hexadecimal String to Int conversion - sml

I have a hexadecimal string like "0xff", and want to convert it to int, which is 255.
But Int.fromString "0xff" gives the answer 0.
I thought that Int.scan would help, but I'm new to SML.

You can do
StringCvt.scanString (Int.scan StringCvt.HEX) "0xff"
or
StringCvt.scanString (Int.scan StringCvt.HEX) "ff"
Both return SOME 255 : int option, because Int.scan with StringCvt.HEX accepts an optional "0x" or "0X" prefix before the hex digits.

Related

How do I convert a string of binary numbers to their signed decimal representation using C++?

I have a string:
1010
The unsigned representation of the string comes out to be 10 after doing:
string immediateValue = "1010";
char immediateChars[5];
strcpy(immediateChars, immediateValue.c_str());
char * ptr;
long parsedInteger = strtol(immediateChars, &ptr, 2);
As I understand it, strtol can only be used to get the unsigned representation. Is there a way to get the 2's complement value, which would be -6?
Check your first character: if it is 0, then use n = strtol normally; if it is 1, then flip the bits, e.g. "1010" to "0101", take strtol of the flipped string, and the negative of that value minus one is your answer.
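A minimal C++ sketch of that flip-and-negate trick (the function name signedFromBinary is just for illustration):

#include <cstdlib>
#include <iostream>
#include <string>

long signedFromBinary(const std::string &bits)
{
    // Leading 0: the value is non-negative, plain strtol is enough.
    if (bits.empty() || bits[0] == '0')
        return std::strtol(bits.c_str(), nullptr, 2);

    // Leading 1: flip every bit, parse, then negate and subtract one.
    std::string flipped(bits);
    for (char &c : flipped)
        c = (c == '0') ? '1' : '0';
    return -std::strtol(flipped.c_str(), nullptr, 2) - 1;
}

int main()
{
    std::cout << signedFromBinary("1010") << '\n';   // prints -6
}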
The numbers are binary strings and you have to convert them to denary.
Nuisance. The strings are likely too long to simply be packed into a long long, which would make life a lot easier (shove the bits into the long long, then call sprintf()). You'll have to write your own binary division routine: divide by binary ten ("1010") and take the remainder. That's your last digit. Then repeat until the number goes to zero. As a last step, reverse the denary digits to match the most-significant-digit-on-the-left convention.
It's quite a hunk of code, but reasonable as a learning exercise.
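For illustration, a rough C++ sketch of that long-division approach, assuming the input is a plain string of '0'/'1' characters and producing a decimal string (since the number may not fit in any built-in type); binaryToDecimal is a made-up name:

#include <algorithm>
#include <iostream>
#include <string>

std::string binaryToDecimal(std::string bits)
{
    std::string decimal;
    while (bits.find('1') != std::string::npos)     // repeat until the number is zero
    {
        int remainder = 0;
        std::string quotient;
        for (char c : bits)                         // one pass of binary long division by ten
        {
            int value = remainder * 2 + (c - '0');
            quotient += static_cast<char>('0' + value / 10);
            remainder = value % 10;
        }
        decimal += static_cast<char>('0' + remainder);          // remainder is the next denary digit
        std::size_t firstOne = quotient.find_first_not_of('0'); // trim leading zeros
        bits = (firstOne == std::string::npos) ? "0" : quotient.substr(firstOne);
    }
    if (decimal.empty()) decimal = "0";
    std::reverse(decimal.begin(), decimal.end());   // most significant digit on the left
    return decimal;
}

int main()
{
    std::cout << binaryToDecimal("1111011") << '\n';   // prints 123
}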

Why are '1' and (char)1 not equal when compared in C++?

My main goal is to convert an int to the char type. I used (char)1 to type cast, but it doesn't seem to work, given the following result:
When I compare '1' and (char)1 in C++ with the following code
if ('1' == (char)1)
{
return 1;
}
However, it seems that either the comparison is invalid because the types differ, or they are actually not the same thing. I always thought converting the integer 1 to a character was (char)1. Can anyone tell me how I can convert the integer 1 to the char '1'?
'1' is equal to (char)49 according to http://www.asciitable.com/
(char)1 is equal to SOH (start of heading) which is a non-printable character.
Because the ASCII equivalent of '1' is 49, not 1.
'1' == the character code value for the printable 1; traditionally the ASCII value, but today the code point value in whatever charset is used.
The old trick is (ch - '0') to get the numeric value.
Depending on the language, you should use a conversion function for a full string.
C++ - stoi, stol, strtol, or stringstream
C - atoi or atol (these work in C++ too)
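For instance, a small C++ sketch of both directions, building the character with '0' + n and going back with ch - '0', plus std::stoi for a full string:

#include <iostream>
#include <string>

int main()
{
    int n = 1;
    char digit = static_cast<char>('0' + n);   // integer 1 -> character '1' (code 49 in ASCII)
    int back = digit - '0';                    // character '1' -> integer 1

    std::string s = "123";
    int whole = std::stoi(s);                  // whole string -> int (C++11)

    std::cout << digit << ' ' << back << ' ' << whole << '\n';   // prints: 1 1 123
}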
As ibiza said, char(49) is in fact what '1' is. This is because char values are drawn from the ASCII table.
Because when you do (char)X, where X is a number, you are just converting X into the range of a char, either -128 to 127 or 0 to 255 (like a modulo).
For example, (char)300 gives 44 (because 300 % 256 = 44) and (char)1 gives 1. As said in the other comments, 1 is the ASCII code of SOH (Start of Heading), not of the character '1'.
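A tiny check of that narrowing behaviour; the exact result of (char)300 is implementation-defined, but with a typical 8-bit char it comes out as 44:

#include <iostream>

int main()
{
    std::cout << static_cast<int>(static_cast<char>(300)) << '\n'; // typically 44
    std::cout << static_cast<int>(static_cast<char>(1)) << '\n';   // 1, i.e. SOH, not '1'
    std::cout << static_cast<int>('1') << '\n';                    // 49 in ASCII
}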

Unknown integer conversion for string length > 10 in cocos2d/Xcode applications (iOS 7.0)

I am working with cocos2d 2.0 and iOS 7.0. While trying to get the integer value or float value of a string with a larger length (usually > 10), I always get unexpected output, as below:
when string length <= 10:
NSString *amount = @"1234567890";
NSLog(@"AmountStr=|%@|",amount);
NSLog(@"Amount =|%d|",[amount integerValue]);
Output (getting the correct integer value):
AmountStr=|1234567890|
Amount =|1234567890|
But when string length > 10, that is:
NSString *amount = @"12345678901"; -- added a '1' after the string, so length = 11
NSLog(@"AmountStr=|%@|",amount);
NSLog(@"Amount =|%d|",[amount integerValue]);
then I am getting the output as :
AmountStr=|12345678901| -- This is correct
Amount =|2147483647| -- But what about this..!!! :O
I have tried integerValue, intValue, and floatValue. Every time, the same error occurs. So how do I find out the int value of a string with length greater than 10? Please help me.
NSLog(@"Amount =|%lli|",[amount longLongValue]);
You're trying to print a number as an integer which is larger than the largest number an integer can hold. It's not even about number of digits. Trying to do this with 3000000000 would replicate the "error".
There's also doubleValue method for NSString, which will give you more significant digits than floatValue.
Moreover, I'm a little surprised that using %d with the call to integerValue even works. intValue returns an int. But integerValue returns an NSInteger. Normally, when using format specifiers with NSInteger, you need to use %ld and cast the NSInteger to a long...
And for up to 38-digits, you can always use NSDecimalNumber.
NSDecimalNumber *myNum = [NSDecimalNumber decimalNumberWithString:amount];
NSLog(@"%@", [myNum descriptionWithLocale:[NSLocale systemLocale]]);

Converting an unsigned char* to a readable string & what's this function doing

I have googled a lot to learn how to convert my unsigned char* to a printable hex string. So far I am starting to understand how it all works & the difference between signed & unsigned chars.
Can you tell me what this function I found does? And help me develop a function that converts an unsigned char* (which is a hashed string) to a printable string?
Does the following function do this:
- it iterates over every second character of the char array string
- on each loop it reads the char at position string[x], converts it to an unsigned number (with a precision of 2 decimal places), then copies that converted char (number?) to the variable uChar.
- finally it stores the unsigned char uChar in hexString
void AppManager :: stringToHex( unsigned char* hexString, char* string, int stringLength )
{
// Post:
unsigned char uChar = 0;
for ( int x = 0; x<stringLength; x+=2 )
{
sscanf_s(&string[x], "%02x", &uChar);
hexString[x] = uChar;
}
}
So I guess that means it converts the character in string to unsigned (& 2 dcp) to ensure that it can be stored correctly in hexString. Why to 2 decimal places, & won't a simple conversion from signed (if that character is signed) to unsigned result in a completely different string?
If I have an unsigned char*, how can I go about converting it to something that will let me print it out on screen?
Those aren't decimal places, they're digits. You're saying "don't give me a string shorter than 2; if it's shorter than 2 digits, then pad it with a zero."
This is so that if you have a hex sequence 0x0A it'll actually print 0A and not just A.
Also, there is no signed/unsigned conversion here. Hex strings are hex strings - they don't have a sign. They're a binary representation of the data, and depending on how they're interpreted may be read as two's complement signed integers, unsigned integers, strings, or anything else.
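Going the other way, to get the printable string the question actually asks for, here is a minimal C++ sketch that renders a raw byte buffer as hex; toPrintableHex is just an illustrative name:

#include <cstdio>
#include <iostream>
#include <string>

std::string toPrintableHex(const unsigned char *data, std::size_t length)
{
    std::string out;
    char buf[3];
    for (std::size_t i = 0; i < length; ++i)
    {
        // Two hex digits per byte, zero-padded so 0x0A prints as "0a" and not "a".
        std::snprintf(buf, sizeof buf, "%02x", static_cast<unsigned>(data[i]));
        out += buf;
    }
    return out;
}

int main()
{
    unsigned char hash[] = { 0x0a, 0xff, 0x10 };
    std::cout << toPrintableHex(hash, sizeof hash) << '\n';   // prints 0aff10
}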

converting string to int

Hello, I have a problem. I am trying to convert a string like "12314234" to int
so I will get the first number in the string.
From the example string shown above, I want to get '1' as an int.
I tried:
string line = "12314234";
int command = line.at(0);
but it puts into command the ASCII value of '1' and not the number 1.
Thanks in advance.
int command = line.at(0) - '0';
The numerical values of the digit characters are required to be contiguous, so this works everywhere and always, no matter what character set your compiler uses.
To convert a numerical character ('0' – '9') to its corresponding value, just subtract the ASCII code of '0' from it.
int command = line.at(0) - '0';
The standard function to convert an ASCII string to an integral value is strtol (string to long) from stdlib (#include <cstdlib>). For information, see http://en.wikipedia.org/wiki/Strtol and then the link off that page entitled Using strtol correctly and portably. With strtol you can convert one numeric character or a sequence of numeric characters, and they can be in multiple bases (dec, hex, etc).
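For illustration, a brief strtol sketch along those lines; the endptr argument is how you can tell whether anything was parsed:

#include <cstdlib>
#include <iostream>

int main()
{
    const char *text = "12314234";
    char *end = nullptr;
    long whole = std::strtol(text, &end, 10);   // whole string, base 10
    long first = text[0] - '0';                 // just the first digit

    std::cout << whole << ' ' << first << '\n'; // prints: 12314234 1
    if (end == text)
        std::cout << "no digits were parsed\n"; // strtol reports failure via endptr
}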
i am trying to convert a string like "12314234" to int
Boost has a library for this:
int i = boost::lexical_cast<int>(str);
Sorry to join this party late but what is wrong with:
int value;
std::string number = "12345";
std::istringstream iss(number);
iss >> value;
If you are passing hexadecimal around (and who isn't these days) then you can do:
int value;
std::string number = "0xff";
std::istringstream iss(number);
iss >> std::hex >> value;
It's C++ and has none of this hackish subtraction of ASCII stuff.
The boost::lexical_cast<>() way is nice and clean if you are using Boost. If you can't guarantee that the string being passed will be a number, then you should catch the thrown error, as sketched below.
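A minimal sketch of that last point, assuming Boost is available:

#include <boost/lexical_cast.hpp>
#include <iostream>
#include <string>

int main()
{
    std::string str = "12314234";
    try
    {
        int i = boost::lexical_cast<int>(str);   // throws if str is not a number
        std::cout << i << '\n';
    }
    catch (const boost::bad_lexical_cast &e)
    {
        std::cout << "not a number: " << e.what() << '\n';
    }
}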