Converting an integer to ASCII values in C++

I am attempting to turn a number into letters using ASCII; at the moment I can do it one letter at a time:
EDIT: The output of an RSA encryption that I've been working on is currently in the form of an integer, I'm trying to work out how to convert it to the word/sentence which was the original input. I've nearly finished but I'm completely stuck at the last "hurdle". I'm adding context due to a comment asking why I would want to do this (or words to that effect).
EDIT: If during the encryption process I used the ASCII value minus 87, all letters would be 2 digits long, eliminating the problem of some ASCII codes being 3 digits and some being 2. Does this make the problem more approachable? (It limits me to letters only, but that's fine for its purpose.)
#include <iostream>

// Convert an integer ASCII code to the corresponding character.
char returnChar(int x)
{
    return (char) x;
}

int main()
{
    std::cout << returnChar(119);
}
This converts 119 --> w (or, with the -87 offset from the second edit, 32 --> w).
How could I adapt this function to let me change "3232" --> "ww", or any other integer into ASCII characters, e.g. "32242713" --> "word"?
EDIT: I think using some kind of mod function to split the number into two-digit chunks, which could then be converted to characters, might work?
How do I overcome the problem of some ASCII codes having 2 digits and some having 3? I think this problem is solved by the approach described in the second edit, as sketched below.
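For concreteness, this is roughly what I have in mind (an untested sketch; it assumes the -87 offset from the second edit, so every lowercase letter encodes as exactly two digits):

#include <iostream>
#include <string>

// Decode an integer built from two-digit chunks (ASCII code minus 87),
// peeling off the last chunk with % 100 each time.
std::string decode(long long n)
{
    std::string result;
    while (n > 0)
    {
        result.insert(result.begin(), char(n % 100 + 87)); // last two digits -> char
        n /= 100;
    }
    return result;
}

int main()
{
    std::cout << decode(32242713LL); // prints "word"
}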
If you can see that I've approached this in entirely the wrong way, could you suggest a viable alternative approach for me to try please?
Thanks for any feedback.

What you're asking for is not possible. You have a few alternatives:
Change the int to a string and put white spaces/other characters inside the string:
std::string test = "119 119";
Convert the total value to binary, and parse byte by byte:
unsigned int test = 30583;   // 119*256 + 119
char a = (test >> 8) & 0xff; // high byte: 119 -> 'w'
char b = test & 0xff;        // low byte:  119 -> 'w'
Pass the data in a vector and convert one element at a time:
std::vector<char> returnChar(const std::vector<int> &data)
{
    std::vector<char> output;
    for (unsigned int i = 0; i < data.size(); i++)
        output.push_back(char(data[i]));
    return output;
}
I would probably stick with the second method, since (a wild guess here) it shouldn't require many changes where you actually generate the numbers.
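For example, a minimal sketch of the second method (my own illustration), unpacking one byte per character:

#include <iostream>
#include <string>

int main()
{
    unsigned int test = 30583; // 119*256 + 119, i.e. 'w' packed twice
    std::string s;
    while (test > 0)
    {
        s.insert(s.begin(), char(test & 0xff)); // low byte -> character
        test >>= 8;                             // move to the next byte
    }
    std::cout << s; // prints "ww"
}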

Related

Slicing string character correctly in C++

I'd like to count the number of 1s in my input. For example, 111 must return 3 (1+1+1) and 101 must return 2 (1+1).
To achieve this, I developed sample code as follows.
#include <iostream>
#include <string>
using namespace std;

int main(){
    string S;
    cout << "input number";
    cin >> S;
    cout << "S[0]:" << S[0] << endl;
    cout << "S[1]:" << S[1] << endl;
    cout << "S[2]:" << S[2] << endl;
    int T = (int) (S[0] + S[1] + S[2]);
    cout << "T:" << T << endl;
    return 0;
}
But when I execute this code and input 111, for example, my expected result is 3, but it returns 147.
[ec2-user@ip-10-0-1-187 atcoder]$ ./a.out
input number
111
S[0]:1
S[1]:1
S[2]:1
T:147
What is wrong here? I am a total novice, so if someone has an opinion, please let me know. Thanks
It's because S[0] is a char. You are adding the character values of these digits, rather than their numerical values. In ASCII, the numerical digits start at value 48. In other words, each of your 3 values is exactly 48 too big.
So instead of doing 1+1+1, you're doing 49+49+49.
The simplest way to convert from a digit's character value to its numeric value is to subtract 48, which is the value of '0'.
e.g., S[0] - '0'.
Since your goal is to count the occurrences of a character, it makes no sense to sum the characters together. I recommend this:
std::cout << std::ranges::count(S, '1'); // C++20, from <algorithm>
To explain the output that you get, characters are integers whose values represent various symbols (and non-printable control characters). The value that represents the symbol '1' is not 1. '1'+'1'+'1' is not '3'.
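Putting the first suggestion together, a minimal corrected version of the program (a sketch; it sums the digit values instead of the character codes):

#include <iostream>
#include <string>

int main(){
    std::string S;
    std::cout << "input number";
    std::cin >> S;

    int T = 0;
    for (char c : S)
        T += c - '0'; // digit character -> numeric value
    std::cout << "T:" << T << std::endl;
    return 0;
}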

I am having a problem with this question; my output is not correct

Write a program to display the sizes of the four basic data types, i.e. integer, double, float, character.
Input:
The first line of input contains integer T denoting the number of test cases. For each test case, the user can input any of the above data types.
Output:
For each test case, there is a single line output displaying the size of that data type.
Constraints:
1<=T<=100
Example:
Input:
4
1
#
7.98
9.985647851
Output:
4
1
4
8
I tried this
#include <iostream>
#include <string>
using namespace std;

int main() {
    int x;
    cin >> x;
    string s;
    for (int i = 0; i < x; i++) {
        cin >> s;
        cout << sizeof(s) << endl; // size of the string object, not of the input's type
    }
    return 0;
}
Output was
32
32
Wrong Answer
I started to work on this problem to provide the OP with some insight; however, I have come to the same conclusion that many others reached in the comment section of the OP's original question: this question or problem has an indeterminate solution! (As for the output above: sizeof(s) reports the size of the std::string object itself, typically 32 bytes on a 64-bit platform, not the size of the type the input represents.)
Reasoning:
If the user enters a single digit { 0, 1, ... 9 } into the console, it can at the least be interpreted as either an int or a char, and at worst even as a possible double, though without a '.' character present we can eliminate that candidate. This problem definitely has ambiguity to it. Checking whether the input is a float or a double is easy: if the string contains at least one '.', it is either a double or a float; then check the last character of the string, and if it is an 'f', it is a float. Checking whether the input is a non-digit character is also easy; distinguishing a single-digit character between a char and an int is the troublesome part!
Work around:
You could conclude that if the input string is a single character and is a non-digit, then it is definitely a char type.
You could conclude that if the input string is a single character and is a digit (ASCII 48-57), then it is an int type. This would be a constraint you impose.
You could conclude that if it is neither of the above, it is either a float or a double, and it is a float if and only if the last character of the string is an 'f'; a '.' must be present for it to be either of the two. Again, these are constraints you would place on the accepted input.
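A minimal sketch of those rules (my own illustration, not a guaranteed solution; note that by these rules the sample input 7.98 would classify as a double, even though the expected output is 4, which underlines the ambiguity):

#include <cctype>
#include <iostream>
#include <string>

// Guess a type for the token using the rules above and return its size.
int sizeOfGuessedType(const std::string &s)
{
    if (s.find('.') != std::string::npos)                    // float or double
        return (s.back() == 'f') ? sizeof(float) : sizeof(double);
    if (s.size() == 1 && !std::isdigit((unsigned char)s[0])) // single non-digit
        return sizeof(char);
    return sizeof(int);                                      // digits treated as int
}

int main()
{
    int t;
    std::cin >> t;
    std::string s;
    while (t-- > 0 && std::cin >> s)
        std::cout << sizeOfGuessedType(s) << '\n';
}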

How to convert an array of ASCII codes to int C++

First of all, I would like to read from plain text; I have read hundreds of webpages about it and I just can't manage it. I want to read every byte of the file, and every two bytes form a number that I want to store.
I want to read: 10 20.
I get: ASCII code of 1, ASCII code of 0, ASCII code of space etc. etc.
I tried several things, like stream.get or stream.read; I tried to convert with atoi, but then I can't concatenate the two digits; I tried sprintf, but all of them failed.
Array of ASCII codes:
char ASCII[] = "10 20";
Convert to integer variables:
std::istringstream iss(ASCII); // std::istringstream is from <sstream>
int x, y;
iss >> x >> y;
Done.
Here's the working sample: http://ideone.com/y8ZRGs
If you want to do this with your own code, there are only two things you need to be able to do.
First, you need to convert from the ASCII code of a digit to the number it represents. This is as simple as subtracting '0'.
Second, you need to convert from the numerical value of each digit of a two-digit number to the number it represents. This is simple: if T is the tens place and U is the units, it's 10T + U.
So, for example:
int twoDigitNumber(char tens, char units)
{
    return 10 * (tens - '0') + (units - '0');
}
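For example, a quick usage sketch (my own illustration):

#include <iostream>

int twoDigitNumber(char tens, char units)
{
    return 10 * (tens - '0') + (units - '0');
}

int main()
{
    std::cout << twoDigitNumber('1', '0') << ' '   // prints 10
              << twoDigitNumber('2', '0') << '\n'; // prints 20
}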

Casting an int to a char. Not storing the correct value

I'm trying to store a number as a character in a char vector named code
code->at(i) = static_cast<char>(distribution(generator));
However, it is not storing the value the way I think it should.
For example, shouldn't '\x4' be the ASCII value for 4? If not, how do I achieve that result?
Here's another vector whose values were entered correctly.
The cast converts the int to a char with the same numeric value; it does not map the digit 4 to the character '4'. You need:
code->at(i) = distribution(generator) + '0';
No. \xN does not give you the ASCII code for the character N.
\xN is the ASCII character† whose code is N (in hexadecimal form).
So, when you write '\x4', you get the [unprintable] character with the ASCII code 4. Upon conversion to an integer, this value is still 4.
If you wanted the ASCII character that looks like 4, you'd write '\x34', because hexadecimal 34 (decimal 52) is the ASCII code of '4'. You could also get there using some magic, based on the digits in ASCII being contiguous and starting from '0':
code->at(i) = '0' + distribution(generator);
† Ish.
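To illustrate the difference, a standalone sketch (my own, not the OP's code):

#include <iostream>

int main()
{
    int digit = 4;
    char wrong = static_cast<char>(digit); // '\x4', an unprintable control character
    char right = '0' + digit;              // '\x34', i.e. the visible character '4'
    std::cout << int(wrong) << ' ' << right << '\n'; // prints: 4 4
}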

Execution hangs for string input greater than 1000 characters

Code hangs for string input greater than 1000 characters
All characters in P & Q are lower-case English letters.
#include <iostream>
#include <string>
#include <cstdio>
using namespace std;

int main(){
    // 1st example
    string p, q;
    cin >> p >> q;

    /* 2nd example (only one example can be active at a time,
       since the names collide)
    char p[1500], q[1500];
    scanf("%s", p);
    cin >> q;
    */
    return 0;
}
Both run fine for strings less than 1000 characters.
Both alternative examples break for strings greater than 1000 characters.
I just need a way to input strings of at most 1500 characters.
Edit: Turns out Xcode was at fault. It works fine on the server.
If you pass Unicode input to this, the buffers will overflow & you will crash.
Edited to add:
If you're lucky. Note that in UTF-16, lowercase English letters are still two bytes each. If your input is from a file, you cannot tell just by looking whether it's Unicode; you need to open it in a hex editor to be sure.
To test this, make the buffers twice as large as the input (p[2001], q[2001]).
But using static buffers is the wrong way to do this; std::string is the right way: its upper limit is something like 2^32 characters.
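A minimal sketch of both options (my own illustration): std::string, which grows as needed, and, if a char buffer must be used, a scanf field width that bounds the read:

#include <cstdio>
#include <iostream>
#include <string>

int main()
{
    // std::string grows as needed; no fixed 1500-character limit.
    std::string q;
    std::cin >> q;

    // If a raw buffer is required, bound the read so scanf stops
    // before overflowing the buffer (1499 chars + terminating null).
    char p[1500];
    std::scanf("%1499s", p);

    return 0;
}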