Loop not processing the last character of a string - c++

Basically, the (Vigenère) decryption works perfectly except that it never includes the final letter. For instance, the decryption of m_text yields 48 letters instead of 49. I even tried to manipulate the loop, but that doesn't work out well since I get an out-of-range exception from .at(). Any help would be appreciated!
#include <string>
#include <iostream>
using namespace std;

int main()
{
    string ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
    string m_text = "ZOWDLTRTNENMGONMPAPXVUXADRIXUBJMWEWYDSYXUSYKRNLXU";
    int length = m_text.length();
    string key = "DA";
    string plainText = "";
    int shift = 0;
    int shift2 = 0;
    // Loop that decrypts
    for (int k = 0; k < length - 1; k += 2)
    {
        // Key 1 shift
        shift = m_text.at(k) - key.at(0);
        // Key 2 shift
        shift2 = m_text.at(k + 1) - key.at(1);
        if (shift >= 0)
        {
            plainText += ALPHABET.at(shift);
        }
        else
        {
            shift += 91;
            plainText += (char)shift;
        }
        if (shift2 >= 0)
        {
            plainText += ALPHABET.at(shift2);
        }
        else
        {
            shift2 += 91;
            plainText += (char)shift2;
        }
    }
    cout << plainText << endl;
}

By the looks of things, you are decoding two characters at a time. So when you have 49 characters in your string, there is one left over (which doesn't get processed). If you make m_text 48 characters long, you will notice you get the correct result.
It might be easier to replicate your key to match the length of the message, then do character-by-character decoding.
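A rough illustration of that approach (untested, and assuming the message contains only the uppercase letters A–Z): index the key with i % key.length() and decode one character per iteration, so an odd-length message is no longer a special case.
#include <iostream>
#include <string>
using namespace std;

int main()
{
    string m_text = "ZOWDLTRTNENMGONMPAPXVUXADRIXUBJMWEWYDSYXUSYKRNLXU";
    string key = "DA";
    string plainText;

    for (string::size_type i = 0; i < m_text.length(); ++i)
    {
        // Shift for the key letter that lines up with position i
        int shift = m_text.at(i) - key.at(i % key.length());
        if (shift < 0)
            shift += 26;              // wrap around the alphabet
        plainText += (char)('A' + shift);
    }
    cout << plainText << endl;
}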


Run-length decompression using C++

I have a text file with a string which I encoded.
Let's say it is: aaahhhhiii kkkjjhh ikl wwwwwweeeett
Here is the code for encoding, which works perfectly fine:
void Encode(std::string &inputstring, std::string &outputstring)
{
    for (int i = 0; i < inputstring.length(); i++) {
        int count = 1;
        while (inputstring[i] == inputstring[i+1]) {
            count++;
            i++;
        }
        if (count <= 1) {
            outputstring += inputstring[i];
        } else {
            outputstring += std::to_string(count);
            outputstring += inputstring[i];
        }
    }
}
Output is as expected: 3a4h3i 3k2j2h ikl 6w4e2t
Now, I'd like to decompress the output back to the original.
And I have been struggling with this for a couple of days now.
My idea so far:
void Decompress(std::string &compressed, std::string &original)
{
    char currentChar = 0;
    auto n = compressed.length();
    for (int i = 0; i < n; i++) {
        currentChar = compressed[i++];
        if (compressed[i] <= 1) {
            original += compressed[i];
        } else if (isalpha(currentChar)) {
            //
        } else {
            //
            int number = isnumber(currentChar).....
            original += number;
        }
    }
}
I know my Decompress function seems a bit messy, but I am pretty lost with this one.
Sorry for that.
Maybe there is someone out there on Stack Overflow who would like to help a lost beginner soul.
Thanks for any help, I appreciate it.
Assuming input strings cannot contain digits (your encoding cannot cover them anyway, as e.g. both the strings "3a" and "aaa" result in the encoded string "3a" – how would you ever decompress that unambiguously?), you can decompress as follows:
unsigned int num = 0;
for (auto c : compressed)
{
    if (std::isdigit(static_cast<unsigned char>(c)))
    {
        num = num * 10 + c - '0';
    }
    else
    {
        num += num == 0; // assume you haven't read a digit yet!
        while (num--)
        {
            original += c;
        }
        num = 0; // num wrapped around to UINT_MAX in the loop above; reset it for the next run
    }
}
Untested code, though...
Characters in a string actually are only numerical values, though. You can consider char (or signed char, unsigned char) as ordinary 8-bit integers, and you can store a numerical value in such a byte, too.
Usually, you do run-length encoding exactly that way: count up to 255 equal characters, store the count in one byte and the character in another byte. A single "a" would then be encoded as 0x01 0x61 (the latter being the ASCII value of a), "aa" would get 0x02 0x61, and so on. If you have to store more than 255 equal characters, you store two pairs: 0xff 0x61, 0x07 0x61 for a string containing 262 times the character a.
Decoding then gets trivial: you read characters pairwise, interpret the first byte as a number and the second one as a character – the rest is straightforward. And you nicely cover digits that way as well.
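A minimal sketch of that byte-count variant (my own illustration of the idea above, not code from the answer; the names EncodeBytes/DecodeBytes are made up):
#include <string>

// Encode runs as (count byte, character byte) pairs; the count is capped at 255.
void EncodeBytes(const std::string &in, std::string &out)
{
    for (std::string::size_type i = 0; i < in.size(); )
    {
        unsigned char count = 1;
        while (i + count < in.size() && in[i + count] == in[i] && count < 255)
            ++count;
        out += static_cast<char>(count);
        out += in[i];
        i += count;                 // a longer run simply produces another pair
    }
}

// Read the pairs back: first byte is the repeat count, second the character.
void DecodeBytes(const std::string &in, std::string &out)
{
    for (std::string::size_type i = 0; i + 1 < in.size(); i += 2)
        out.append(static_cast<unsigned char>(in[i]), in[i + 1]);
}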
#include "string"
#include "iostream"
void Encode(std::string& inputstring, std::string& outputstring)
{
for (unsigned int i = 0; i < inputstring.length(); i++) {
int count = 1;
while (inputstring[i] == inputstring[i + 1]) {
count++;
i++;
}
if (count <= 1) {
outputstring += inputstring[i];
}
else {
outputstring += std::to_string(count);
outputstring += inputstring[i];
}
}
}
bool alpha_or_space(const char c)
{
return isalpha(c) || c == ' ';
}
void Decompress(std::string& compressed, std::string& original)
{
size_t i = 0;
size_t repeat;
while (i < compressed.length())
{
// normal alpha charachers
while (alpha_or_space(compressed[i]))
original.push_back(compressed[i++]);
// repeat number
repeat = 0;
while (isdigit(compressed[i]))
repeat = 10 * repeat + (compressed[i++] - '0');
// unroll releat charachters
auto char_to_unroll = compressed[i++];
while (repeat--)
original.push_back(char_to_unroll);
}
}
int main()
{
std::string deco, outp, inp = "aaahhhhiii kkkjjhh ikl wwwwwweeeett";
Encode(inp, outp);
Decompress(outp, deco);
std::cout << inp << std::endl << outp << std::endl<< deco;
return 0;
}
The decompression can't possibly work in an unambiguous way because you didn't define a sentinel character; i.e. given the compressed stream it's impossible to determine whether a digit is an original single character or part of the repeat RLE command. I would suggest using '0' as the sentinel character: while encoding, if you see '0' you just output 010, and any run of a character X translates to 0NX, where N is the repeat byte counter. If a run goes over 255, just output a new RLE repeat command.
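A rough sketch of that sentinel idea (my own reading of the suggestion, untested; runs are capped at 255 so the count fits in one byte):
#include <string>

// '0' is the sentinel. A plain single character passes through unchanged; a literal '0'
// is escaped as the command '0', 1, '0'; a run of X is written as '0', N, X.
void EncodeSentinel(const std::string &in, std::string &out)
{
    for (std::string::size_type i = 0; i < in.size(); )
    {
        unsigned char n = 1;
        while (i + n < in.size() && in[i + n] == in[i] && n < 255)
            ++n;                              // a run longer than 255 simply emits another command
        if (n == 1 && in[i] != '0')
            out += in[i];                     // ordinary single character, copied as-is
        else {
            out += '0';                       // RLE command: sentinel, count byte, character
            out += static_cast<char>(n);
            out += in[i];
        }
        i += n;
    }
}

void DecodeSentinel(const std::string &in, std::string &out)
{
    for (std::string::size_type i = 0; i < in.size(); ++i)
    {
        if (in[i] == '0' && i + 2 < in.size()) {
            out.append(static_cast<unsigned char>(in[i + 1]), in[i + 2]);
            i += 2;                           // skip the count and character bytes
        } else {
            out += in[i];
        }
    }
}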

How to set my decode function to take in the encoded string?

I'm working on my decode function and I've hit a wall. I don't know if I should pass in the encode function or create a class. My encode function compresses a string; I need the decode function to take that encoded string and expand it.
I've been told that it was the same as doing the encode function. I'm not sure where to go here.
#include <iostream>
#include <string>
using namespace std;

string encode(string str)
{
    string encoding = "";
    int count;
    for (int i = 0; str[i]; i++)
    {
        count = 1;
        while (str[i] == str[i+1])
        {
            count++, i++;
        }
        encoding += to_string(count) + str[i];
    }
    return encoding;
}

// I'm trying to decode the encoded string:
// take in a string, count how many of the same characters there are and print.
// e.g. a3b4c1...... would be decoded as aaabbbbc
string decode(string in)
{
    string decoding = "";
    char s;
    int count;
    for (int i = 0; i<in; i++)
    {
        count = 1;
        if (in[i] == 'A')
            count++, i++;
    }
}

int main()
{
    string str = "ABBCC";
    cout << encode(str);
    //cout << decode(str);
}
// My encode function executes as needed: 1A2B2C
Your encoding is not valid: the encoding of "1a" produces "111a", which is also the encoding of 111 consecutive 'a'. You need to add a separator between the count and the character.
In your decode function you only handle the special case of 'A', and you do not extract the count the encoder wrote.
Note also in
for (int i = 0; i<in; i++)
{
    count = 1;
    if (in[i] == 'A')
        count++, i++;
}
you always reset count to 1
You need to first extract the count (subject to the problem I pointed out at the beginning of my answer), then duplicate the letter count times.
It is useless to do string encoding = ""; because the default constructor of std::string already makes it empty; it can just be string encoding;
You need to decode an encoded string; that is not what you do in your main, where you try to decode the initial string.
A corrected version can be :
#include <iostream>
#include <string>
#include <sstream>
using namespace std;

string encode(string str)
{
    stringstream encoding;
    int count;
    for (int i = 0; str[i]; i++)
    {
        count = 1;
        while (str[i] == str[i+1])
        {
            count++, i++;
        }
        encoding << count << ' ' << str[i];
    }
    return encoding.str();
}

string decode(string in)
{
    stringstream is(in);
    string decoding;
    int n;
    char c;
    while (is >> n >> c)
    {
        while (n--)
            decoding += c;
    }
    return decoding;
}

int main()
{
    cout << encode("ABBCC2a") << endl;
    cout << decode(encode("ABBCC2a")) << endl;
    return 0;
}
Compilation and execution:
pi@raspberrypi:/tmp $ g++ -pedantic -Wall -Wextra e.cc
pi@raspberrypi:/tmp $ ./a.out
1 A2 B2 C1 21 a
ABBCC2a
Run-length-encoding – but in a very strange way!
encoding += to_string(count) + str[i];
Let's encode string "sssssssssss"; it will result in a string with array representation of
{ '1', '1', 's', 0 } // string "11s"
(I chose this representation deliberately, you'll see later...)
The problem is that you wouldn't be able to encode strings containing digits: "1s" will result in
{ '1', '1', '1', 's', 0 } // string "111s"
but how would you distinguish whether to decode back to "1s" or to a string consisting solely of 111 s characters?
Try it differently: a character actually is nothing more than a number as well, e.g. the letter s is represented by the numerical value 115 (in ASCII and compatible encodings, at least), the digit 7 (as a character!) by the numerical value 55. So you can simply append the count value as a character:
encoding += static_cast<unsigned char>(count);
encoding += str[i];
There are some corner cases: unsigned char cannot hold numbers greater than 255, so a string with more consecutive equal characters would have to be encoded e.g. as
{ 255, 's', 7, 's', 0 } // 262 times letter s
Note the representation; 255 and 7 aren't even printable characters! Now let's assume we encoded a string with 115 times the letter s:
{ 115, 's', 0 } // guess, as string, this would be "ss"...
To catch that case, you would simply check explicitly whether your counter has reached the maximum value.
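For illustration, a complete encoder along those lines might look like this (my own sketch of the byte-pair idea, untested):
std::string encode(const std::string &str)
{
    std::string encoding;
    for (std::string::size_type i = 0; i < str.size(); )
    {
        unsigned char count = 1;
        while (i + count < str.size() && str[i + count] == str[i] && count < 255)
            ++count;                               // cap the run at 255 so it fits in one byte
        encoding += static_cast<char>(count);      // count byte first...
        encoding += str[i];                        // ...then the character itself
        i += count;                                // a longer run simply produces another pair
    }
    return encoding;
}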
Now decoding gets much simpler:
size_t i = 0;
while (i < encoded.size())
{
    unsigned char n = encoded[i];
    ++i;
    while (n--)
        decoded += encoded[i];
    ++i;
}
Totally simple: first byte always as number, second one as character...
If you insist on numbers being encoded as strings (and encode only strings not containing digits), you could use a std::istringstream:
std::istringstream s(encoded);
unsigned int n;
char c;
while (s >> n >> c)
{
    while (n--)
        decoded += c;
}
OK, it is not symmetric to your encoding function. You could adapt the latter to be so, though:
std::ostringstream s;
for (;;) // ...
{
    unsigned int count = 1;
    // ...
    s << count << str[i];
}
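Filled in, that stream-based pair might look roughly like this (my own completion of the sketch above, untested, and again only usable for inputs that contain no digits; I read the character with get() instead of >> so that spaces are not skipped):
#include <sstream>
#include <string>

std::string encode(const std::string &str)
{
    std::ostringstream s;
    for (std::string::size_type i = 0; i < str.size(); )
    {
        unsigned int count = 1;
        while (i + count < str.size() && str[i + count] == str[i])
            ++count;
        s << count << str[i];        // textual count directly followed by the character
        i += count;
    }
    return s.str();
}

std::string decode(const std::string &encoded)
{
    std::istringstream s(encoded);
    std::string decoded;
    unsigned int n;
    char c;
    while (s >> n && s.get(c))       // read a number, then the character right after it
        decoded.append(n, c);
    return decoded;
}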

String subscript out of range (Visual Studio 2013)

I understand that questions with this title/problem have been asked numerous times before (here, here and many others). Here is my code, followed by everything I have done to remove the error:
CaesarCipher.h
#ifndef CAESARCIPHER_H
#define CAESARCIPHER_H
#include <ctime>
#include <string>
using namespace std;

// Write your class CaesarCipher here.
class CaesarCipher
{
public:
    CaesarCipher();
    string Encode(string plainString);
    string Decode(string encryptedString);
private:
    int key1, key2;
    char Encode(char normalChar) const;
    char Decode(char encodedChar) const;
};
#endif
CaesarCipher.cpp
#include "stdafx.h"
#include "CaesarCipher.h"
using namespace std;
// Implement the member functions of class CaesarCipher.
CaesarCipher::CaesarCipher()
{
//Random initialization of integer key1
//srand(time(0));
srand((unsigned int)time(0));
int value1 = rand() % 10;
int sign1 = rand() % 2;
sign1 = sign1 == 0 ? -1 : 1;
int key1 = value1 * sign1;
//Random initialization of integer key2
//srand(time(0));
srand((unsigned int)time(0));
int value2 = rand() % 10;
int sign2 = rand() % 2;
sign2 = sign2 == 0 ? -1 : 1;
int key2 = value2 * sign2;
}
char CaesarCipher::Encode(char normalChar) const
{
int result=0;
int charValue = normalChar; //get the ASCII decimal value of character
if (charValue == 32) // if characeter is a space, we leave it
{
result = 32;
}
else
{
if (key1 > 0)
{
result = char(int(charValue + key1 - 97) % 26 + 97); // find the integer value of char after rotating it with key1(positive)
}
if (key1 < 0)
{
result = char(int(charValue -key1 - 97) % 26 + 97); // find the integer value of char after rotating it with key1(negative)
}
if (key2 > 0)
{
result += char(int(charValue + key2 - 97) % 26 + 97); // find the updated integer value of char after rotating it with key2(positive)
}
if (key2 < 0)
{
result += char(int(charValue - key2 - 97) % 26 + 97); // find the updated integer value of char after rotating it with key2(negative)
}
}
return result; // returning the integer value which will be typecasted into a char(encoded char)
}
char CaesarCipher::Decode(char encodedChar) const
{
int result = 0;
int charValue = encodedChar; //get the ASCII decimal value of encoded character
if (charValue == 32) // if characeter is a space, we leave it unchanged
{
result = 32;
}
else
{
if (key1 > 0)
{
result = char(int(charValue - key1 - 97) % 26 + 97); // find the integer value of encoded char after rotating it with key1(positive) in opposite direction
}
if (key1 < 0)
{
result = char(int(charValue + key1 - 97) % 26 + 97); // find the integer value of encoded char after rotating it with key1(negative) in opposite direction
}
if (key2 > 0)
{
result += char(int(charValue - key2 - 97) % 26 + 97); // find the updated integer value of encoded char after rotating it with key2(positive) in opposite direction
}
if (key2 < 0)
{
result += char(int(charValue + key2 - 97) % 26 + 97); // find the updated integer value of encoded char after rotating it with key2(negative) in opposite direction
}
}
return result; // returning the integer value which will be typecasted into a char(decrypted char)
}
string CaesarCipher::Encode(string plainString)
{
int length = plainString.length(); //gets the length of the
input string
string encodedString; // variable to hold the final encrypted string
for (int i = 0; i < length; i++)
{
encodedString[i] = Encode(plainString[i]); // encrypting the string one character at a time
}
return encodedString; // return the final encoded string
}
string CaesarCipher::Decode(string encryptedString)
{
int length = encryptedString.length(); //gets the length of the input encrypted string
string decodedString; // variable to hold the final decrypted string
for (int i = 0; i < length; i++)
{
decodedString[i] = Decode(encryptedString[i]); // decrypting the string one character at a time
}
return decodedString; // return the final decoded string
}
I am using two keys to cipher the text (key1 followed by key2), if it helps in any way.
Main.cpp
#include "stdafx.h"
#include "CaesarCipher.h"
#include <fstream>
#include <iostream>
int main() {
// File streams
ifstream fin("input.txt");
ofstream fout("output.txt");
if (!fin.good()) {
cout << "Error: file \"input.txt\" does not exist!" << endl;
return -1;
}
string original[20], encrypted[20], decrypted[20];
int i = 0; // will store the number of lines in the input file
CaesarCipher cipher; // an object of CaesarCipher class
// Read the sentences from the input file and save to original[20].
// Hint: use getline() function.
while (!fin.eof())
{
getline(fin, original[i]); // Reading a line from input.txt file
encrypted[i] = cipher.Encode(original[i]); // Encrypt the sentences and save to encrypted[20]
decrypted[i] = cipher.Decode(encrypted[i]); // Decrypt the sentences and save to decrypted[20]
i++;
}
//first output all the encrypted lines
for (int j = 0; j < i; j++)
{
fout << "Encrypted sentences:\n";
fout << encrypted[j]<<"\n";
}
//now output all the decrypted lines
for (int j = 0; j < i; j++)
{
fout << "Decrypted sentences:\n";
fout << decrypted[j] << "\n";
}
// Close the files and end the program.
fin.close();
fout.close();
cout << "done!";
return 0;
}
The error which I am getting is Expression: string subscript out of range. Now I understand that I am trying to iterate beyond the limits of the string (probably somewhere in CaesarCipher.cpp, in the Encode or Decode function).
I have tried to change the limits on i without any effect.
I have tried to use size() instead of length() (out of desperation, despite knowing they do the same thing).
I would really appreciate it if you could pinpoint anything in particular which might be causing this error; I will try to change it by myself and see the results.
And if you can also tell me how to avoid such errors in the future, that would be of great value to me.
CaesarCipher::Encode() is not allocating any memory for the character data of encodedString, so the loop has nothing valid to access with encodedString[i]. To fix that, either:
Use string encodedString = plainString; to make a copy of the input string, then the loop can manipulate the copied data:
string CaesarCipher::Encode(string plainString) {
    int length = plainString.length(); // gets the length of the input string
    string encodedString = plainString; // variable to hold the final encrypted string
    for (int i = 0; i < length; i++) {
        encodedString[i] = Encode(encodedString[i]); // encrypting the string one character at a time
    }
    return encodedString; // return the final encoded string
}
Use encodedString.resize(length) to pre-allocate the output string before entering the loop:
string CaesarCipher::Encode(string plainString) {
    int length = plainString.length(); // gets the length of the input string
    string encodedString; // variable to hold the final encrypted string
    encodedString.resize(length); // allocate memory for the final encoded string
    for (int i = 0; i < length; i++) {
        encodedString[i] = Encode(plainString[i]); // encrypting the string one character at a time
    }
    return encodedString; // return the final encoded string
}
Use encodedString += Encode(plainString[i]); to append characters to the output string and let it grow as needed:
string CaesarCipher::Encode(string plainString) {
    int length = plainString.length(); // gets the length of the input string
    string encodedString; // variable to hold the final encrypted string
    for (int i = 0; i < length; i++) {
        encodedString += Encode(plainString[i]); // encrypting the string one character at a time
    }
    return encodedString; // return the final encoded string
}
The same problem exists in CaesarCipher::Decode() with the decodedString variable.
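For instance, applying the second option to Decode() might look like this (a sketch along the same lines, not taken from the answer verbatim):
string CaesarCipher::Decode(string encryptedString) {
    int length = encryptedString.length();
    string decodedString;
    decodedString.resize(length); // pre-allocate so decodedString[i] is valid
    for (int i = 0; i < length; i++) {
        decodedString[i] = Decode(encryptedString[i]); // decrypting one character at a time
    }
    return decodedString;
}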
Also, main() has a buffer overflow if input.txt has more than 20 lines in it. Consider changing the code to use std::vector instead of fixed arrays.
And while (!fin.eof()) is wrong to use. Use while (getline(...)) instead:
// Read the sentences from the input file and save to original[20].
// Hint: use getline() function.
string line;
while (getline(fin, line)) { // Reading a line from input.txt file
    original[i] = line;
    encrypted[i] = cipher.Encode(original[i]); // Encrypt the sentences and save to encrypted[20]
    decrypted[i] = cipher.Decode(encrypted[i]); // Decrypt the sentences and save to decrypted[20]
    i++;
}
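A rough sketch of what the vector-based loop suggested above could look like (illustrative only, not part of the original answer):
#include <vector>
// ...
vector<string> original, encrypted, decrypted;
string line;
while (getline(fin, line)) {
    original.push_back(line);
    encrypted.push_back(cipher.Encode(line));             // grows as needed, no 20-line limit
    decrypted.push_back(cipher.Decode(encrypted.back()));
}
for (size_t j = 0; j < encrypted.size(); j++) {
    fout << "Encrypted sentences:\n" << encrypted[j] << "\n";
}
// (and similarly for the decrypted lines)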

C++ ::toupper not allowing equality comparison?

I am trying to convert a string to uppercase so I can manipulate it, but while I can successfully manipulate natural uppercase strings, as well as convert lowercase to uppercase, using this method of conversion fails to allow the manipulation.
For example, if I pass "hello" through the encryption, my encrypted string becomes "HELLO", but when I pass "HELLO" through (naturally capitalized), it correctly shifts.
Is there a different way of forcing uppercase that I need to be using or am I doing something wrong?
int Caesar::encrypt (const std::string &message, std::string &emessage) {
    int count = 0;
    emessage = message;
    std::transform(emessage.begin(), emessage.end(), emessage.begin(), ::toupper);
    for (std::string::size_type i = 0; i < message.size(); i++) {
        for (int j = 0; j < 26; j++) {
            if (emessage[i] == std_alphabet[j]) {
                std::replace(emessage.begin(), emessage.end(), message[i], c_alphabet[j]);
            }
        }
        count++;
    }
    return count;
}
constructor:
Caesar::Caesar (int shift) {
    // loop to populate vector with 26 letters of English alphabet
    // using ASCII uppercase letter codes
    for (int i = 0; i < 26; i++) {
        std_alphabet.push_back(i + 65);
    }
    // fills Caesar alphabet with standard generated alphabet
    c_alphabet = std_alphabet;
    // shifts Caesar alphabet based off the constructor parameter
    std::rotate(c_alphabet.begin(), c_alphabet.begin() + shift, c_alphabet.end());
}
test file:
void testCaesar() {
    Caesar test(4);
    std::string original = "HELLO";
    std::string encrypted = "";
    test.encrypt(original, encrypted);
    std::cout << encrypted << std::endl;
    std::cout << original << std::endl;
}

int main() {
    testCaesar();
    return 0;
}
Obviously there is a header with includes and such, but that is the basic code.
The header file declares the two private vectors.
The specific issue you are seeing is that you're replacing the wrong thing here:
std::replace(emessage.begin(), emessage.end(), message[i], c_alphabet[j]);
If message was lowercase, then emessage will be all upper-case letters – none of which will be message[i], so that replacement won't do anything. You meant:
std::replace(emessage.begin(), emessage.end(), emessage[i], c_alphabet[j]);
                                               ^^^^^^^^^^^
That said, your algorithm is totally wrong as HELLO encrypts as BCBBA with a shift of 4. There is a 1-1 mapping on letters, so H and L cannot both go to B. What you want to do is shift each letter as you go by just replacing it with what its next letter should be. That is:
for (std::string::size_type i = 0; i < emessage.size(); ++i) {
    emessage[i] = c_alphabet[emessage[i] - 'A'];
}
With which you don't actually need the initial transformation step:
emessage = message;
for (std::string::size_type i = 0; i < emessage.size(); ++i) {
    emessage[i] = c_alphabet[::toupper(emessage[i]) - 'A'];
}
The whole thing can be abridged quite a bit by just dropping your count (which is just the size anyway, so is redundant) and taking the message by-value:
std::string encrypt(std::string from) { // intentionally copying
    for (char& c : from) {
        c = c_alphabet[::toupper(c) - 'A'];
    }
    return from;
}
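For example, with the shift-4 c_alphabet built by the constructor above, that version should turn "hello" into "LIPPS" (assuming encrypt is made a member of Caesar alongside the existing overload):
void testCaesar() {
    Caesar test(4);
    std::cout << test.encrypt("hello") << std::endl; // prints LIPPS: H->L, E->I, L->P, O->S
}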

C++ Enigma::Decoding A Message

Hey all, I'm working on a program that gives you the option of converting a message (string) into 3 different schemes: a Prime Shift encoding scheme, a Shifty encoding scheme, and a Reverse encoding scheme. I have figured out how to encode and decode the Prime Shift and Reverse schemes, but I'm having a little bit of trouble with the Shifty encoding scheme. The rules for the Shifty scheme are these: each character in the message is converted to its ASCII value. The value is subtracted from 1000, resulting in a triplet of digits. Each digit of the triplet is then converted to the symbol on the keyboard above the numbers 1 – 0. The string of symbols is the encoded message.
For example, if the character’s value is 37, it is subtracted from 1000, resulting in a triplet of 963. The corresponding keyboard characters are (^#.
The encoded message is then stored in a text file along with the key and the number that corresponds to the scheme. When the user clicks on the decode button a file chooser opens up and the user selects the text file to read in. Once they select the file they want to open, the program reads in the encoded message, the key and the scheme. So the program must be able to take the encoded message and convert it back to its original form.
I have figured out the code to encode the Shifty scheme and it encodes the message perfectly, but I'm lost on how to decode the message. I know I have to somehow get each triplet of digits back from the encoded string and then subtract each one from 1000 so I can get the correct ASCII character, but I'm lost on how to do that. Any help would be appreciated.
So far I have this:
ShiftyEnigma::ShiftyEnigma()
{
    keyBoard[0] = ')';
    keyBoard[1] = '!';
    keyBoard[2] = '@';
    keyBoard[3] = '#';
    keyBoard[4] = '$';
    keyBoard[5] = '%';
    keyBoard[6] = '^';
    keyBoard[7] = '&';
    keyBoard[8] = '*';
    keyBoard[9] = '(';
}

void ShiftyEnigma::encode()
{
    stringstream ss;
    stringstream s1;
    int value = 0;
    for (unsigned int i = 0; i < codedMessage.length(); ++i)
    {
        int ascii = codedMessage.at(i);
        // subtracting the ASCII value of each character in the message from 1000
        value = 1000 - ascii;
        // putting the value into a string stream in order to convert each digit of the
        // triplet (e.g. 887) into the matching symbol from the keyboard array
        ss << value;
        for (unsigned int i = 0; i < ss.str().length(); ++i)
        {
            s1 << keyBoard[(int)ss.str().at(i) - 48];
        }
        ss.str("");
    }
    codedMessage = s1.str();
}

void ShiftyEnigma::decode()
{
    for (unsigned int i = 0; i < codedMessage.length(); ++i)
    {
    }
}
We start by creating a reverse lookup for the symbols. I use 0 for an invalid entry, since the array is initialized to 0 by default, and use 10 to mark the digit 0.
You loop over each character of the input and use the antiKeyboard array as a reverse lookup to see which digit it maps to. If the reverse lookup returns 0, then we found an invalid character; I choose to ignore it, but you could show an error message. Now we need to collect 3 valid digits and combine them together. To combine 3 digits into a 3-digit number we can do this: number = digit1 * 100 + digit2 * 10 + digit3. I do this in a loop.
int antiKeyboard[256] = {0};
antiKeyboard['!'] = 1;
antiKeyboard['@'] = 2;
antiKeyboard['#'] = 3;
antiKeyboard['$'] = 4;
antiKeyboard['%'] = 5;
antiKeyboard['^'] = 6;
antiKeyboard['&'] = 7;
antiKeyboard['*'] = 8;
antiKeyboard['('] = 9;
antiKeyboard[')'] = 10; // Note: this stands for 0, but I put it as 10

int digits = 0, digit, number = 0;
for (unsigned int i = 0; i < codedMessage.length(); ++i)
{
    digit = antiKeyboard[codedMessage.at(i)];
    if (digit >= 1 && digit <= 10) {
        ++digits;
        number = number * 10 + (digit % 10); // note: modulo 10 converts 10 back to 0
        if (digits == 3) {
            printf("%c", 1000 - number);
            digits = 0;
            number = 0;
        }
    } else { /* Invalid character, ignoring */ }
}
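To tie that back to the class in the question, the same logic could be dropped into ShiftyEnigma::decode() so it rebuilds codedMessage instead of printing (my own adaptation of the snippet above, untested; it assumes antiKeyboard has been built as shown, e.g. as a class member or a local array):
void ShiftyEnigma::decode()
{
    stringstream plain;
    int digits = 0, number = 0;
    for (unsigned int i = 0; i < codedMessage.length(); ++i)
    {
        int digit = antiKeyboard[(unsigned char)codedMessage.at(i)];
        if (digit >= 1 && digit <= 10)
        {
            ++digits;
            number = number * 10 + (digit % 10); // 10 stands for the digit 0
            if (digits == 3)
            {
                plain << (char)(1000 - number);  // undo the "subtract from 1000" step
                digits = 0;
                number = 0;
            }
        }
    }
    codedMessage = plain.str();
}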