MD5 hashed value (salt+password)
hashed: "89d1ed22aac58f5bbea53b2fde81a946" (String)(32 characters)
Original string value
original: "test" (String)
Encryption Key & IV (Bytes)
Key: "encryptionkeyhere" (String)
16-byte key: 646C646873766D666C766D0000000000 (Bytes)(Converted to 16 Bytes)
16-byte IV: 00000000000000000000000000000000 (Bytes)
Encrypted hex value
084987B6C979950A11EBE33A5499B091D127CD208E95BAE5C6B5DE5FAE65AFB68EB5C083BB808FDFD98C16694E6FCA9F2E15DF5A63FBD7E4E7EFFB242D0D56B1A3C35F76F977A70F5A9A1EAD0FC3A9E61242CBA0AF848FFC5C8C4342F4011E0436D81F6B064E086C802E175F662C43A798ADC38D25684E99E926ED30B900FF89CA66760B7DBDFBF3087378620A69981FB87346512DC6596E7420763C238EC7E2015941F3E613070E737A0D191F4730C3CF70C270B5FA13E44407B7D6D7567B6126241777D80874320B969B8818371CEC91DF97AA42ABE5BCF015D9B17BDC18F2F3E1E7E25794A97C44F760B68F1D24FD96036DFC92AF400D53D1292C1FBDD0296AFE28401D48B2EA486C781B4729E55C8794505C7AC3AB2A35C7B893DE4128A3A59335BC76071E404A9B7D600C19CBB0CB640071C7AF9CB135FC46DD29080707ED30BC9CDA6AE65ACDD84CF52B299408D82333F1A8DBD1F58213671063D7C8C1BB9002563A04BD53B03D802888F299024AD8C9BE2A0749EF920362A4C86129401DB29C2D2A17A960C2038C0DC6CCD18F361A629BBDCA17BFF0EB1EB737AE9F00CAB8CFBE2050749E6F0D6EAB7DC849380E85BC55DA82AF7354D31D8431A47FB385CC5CA01FE1DE064AE3F2426A5135938AD923C4A28661CBDA7F1397326D7921158FB06A2B4A1C6F8D141B2749D8A8B20883A7A168D811057CDA88D595EEC694558C446673B708BABFFB5FA584B2AFE527A9C3B162BD38405FC08250AB5EF1D15C7ED94AB71796A5582E25ADFBF83D7BD146C9BF5FE68FA3ADEA4C0D77A4314D06336F6D503A8C097A0FD54051CDD49ED8ECA34E3150D61C01FBB53208C62D701BEFEB15419776B9FB18437FE2C1B4A9BABFDE3CE83457FF6F3F87F83B9B702450D0409886B9D9490BD665BB1CB3E5E460ACDD7BC51958A3870985D58E4585B
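For context, the 32-character "hashed" value at the top is just the lowercase hex encoding of an MD5 digest of salt+password. A minimal Crypto++ sketch of that step (the real salt is not shown in the post, so salt here is a placeholder):

#define CRYPTOPP_ENABLE_NAMESPACE_WEAK 1
#include <cryptopp/md5.h>
#include <cryptopp/hex.h>
#include <cryptopp/filters.h>
#include <string>

// Returns the lowercase hex MD5 of salt+password, e.g. "89d1ed22..." for the
// (unknown) salt used in the question plus the password "test".
std::string Md5Hex(const std::string& salt, const std::string& password) {
    std::string digest;
    CryptoPP::Weak::MD5 md5;
    CryptoPP::StringSource(salt + password, true,
        new CryptoPP::HashFilter(md5,
            new CryptoPP::HexEncoder(new CryptoPP::StringSink(digest),
                                     false /* lowercase */)));
    return digest; // 32 hex characters
}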
C++ Encryption Code
void CRijndael::Encrypt(char const* in, char* result, size_t n, int iMode)
{
    if(false==m_bKeyInit)
        throw exception(sm_szErrorMsg1);
    //n should be > 0 and multiple of m_blockSize
    if(0==n || n%m_blockSize!=0)
        throw exception(sm_szErrorMsg2);
    int i;
    char const* pin;
    char* presult;
    if(CBC == iMode) //CBC mode, using the Chain
    {
        for(i=0,pin=in,presult=result; i<(int)( n/m_blockSize ); i++)
        {
            Xor(m_chain, pin);
            EncryptBlock(m_chain, presult);
            memcpy(m_chain, presult, m_blockSize);
            pin += m_blockSize;
            presult += m_blockSize;
        }
    }
    else if(CFB == iMode) //CFB mode, using the Chain
    {
        for(i=0,pin=in,presult=result; i<(int)( n/m_blockSize ); i++)
        {
            EncryptBlock(m_chain, presult);
            Xor(presult, pin);
            memcpy(m_chain, presult, m_blockSize);
            pin += m_blockSize;
            presult += m_blockSize;
        }
    }
    else //ECB mode, not using the Chain
    {
        for(i=0,pin=in,presult=result; i<(int)( n/m_blockSize ); i++)
        {
            EncryptBlock(pin, presult);
            pin += m_blockSize;
            presult += m_blockSize;
        }
    }
}
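As a point of comparison (this is not part of the question's code), the same CBC-with-zero-IV, no-padding encryption can be written with Crypto++'s stock classes; a sketch assuming the padded 16-byte key shown above and a 16-byte block size:

#include <cryptopp/aes.h>
#include <cryptopp/modes.h>
#include <cryptopp/filters.h>
#include <cryptopp/hex.h>
#include <string>

// CBC encryption with an all-zero IV and no padding; plain must be a
// multiple of 16 bytes (the 32-character MD5 string is).
std::string EncryptCbcZeroIv(const std::string& keyHex, const std::string& plain) {
    using namespace CryptoPP;
    std::string key;
    StringSource(keyHex, true, new HexDecoder(new StringSink(key)));

    byte iv[AES::BLOCKSIZE] = {0}; // zero IV, as in the question
    CBC_Mode<AES>::Encryption enc;
    enc.SetKeyWithIV(reinterpret_cast<const byte*>(key.data()), key.size(), iv);

    std::string cipher;
    StringSource(plain, true,
        new StreamTransformationFilter(enc, new StringSink(cipher),
                                       StreamTransformationFilter::NO_PADDING));
    return cipher; // raw bytes; hex-encode to compare with the value above
}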
Java/Kotlin AES Decryption Code
var key = buildKeyFromString(encryptionKey)
val iv = IvParameterSpec(key)
val secretKeySpec = SecretKeySpec(key, "AES")
val cipher = Cipher.getInstance("AES/CBC/NoPadding")
cipher.init(Cipher.DECRYPT_MODE, secretKeySpec, iv)
return cipher.doFinal(encryptedByte)
The problem is that when I decrypt using the Kotlin code, the original string is not recovered properly.
Decrypted String value
String: "????58f5bbea53b2fde81a946" (String)(Wrong value)
Decrypted Hex value (32 bytes)
Hex: "5C55005916125F540D170E353866356262656135336232666465383161393436"
Expected String value
Hashed: "89d1ed22aac58f5bbea53b2fde81a946" (String)(32 characters)
The cause of the problem is that different IVs are used for encryption and decryption: The encryption uses a zero vector (0x00000000000000000000000000000000) and the decryption uses the key as IV (0x646C646873766D666C766D0000000000).
By the way, the posted ciphertext is too long. Since the plaintext is 32 bytes in size and no padding is used, the ciphertext is also 32 bytes in size and just corresponds to the first 32 bytes of the posted ciphertext (0x084987B6C979950A11EBE33A5499B091D127CD208E95BAE5C6B5DE5FAE65AFB6).
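To see this concretely, here is a small stand-alone Crypto++ sketch (not taken from the question) that decrypts only the first 32 ciphertext bytes with the zero IV; per the analysis above it should print the hashed string:

#include <cryptopp/aes.h>
#include <cryptopp/modes.h>
#include <cryptopp/filters.h>
#include <cryptopp/hex.h>
#include <iostream>
#include <string>

int main() {
    using namespace CryptoPP;
    // Values taken from the question: the padded key and the first 32 ciphertext bytes.
    const std::string keyHex    = "646C646873766D666C766D0000000000";
    const std::string cipherHex = "084987B6C979950A11EBE33A5499B091"
                                  "D127CD208E95BAE5C6B5DE5FAE65AFB6";

    std::string key, cipher;
    StringSource(keyHex, true, new HexDecoder(new StringSink(key)));
    StringSource(cipherHex, true, new HexDecoder(new StringSink(cipher)));

    byte iv[AES::BLOCKSIZE] = {0}; // same all-zero IV as the encryption side
    CBC_Mode<AES>::Decryption dec;
    dec.SetKeyWithIV(reinterpret_cast<const byte*>(key.data()), key.size(), iv);

    std::string recovered;
    StringSource(cipher, true,
        new StreamTransformationFilter(dec, new StringSink(recovered),
                                       StreamTransformationFilter::NO_PADDING));
    std::cout << recovered << std::endl; // expected: 89d1ed22aac58f5bbea53b2fde81a946
    return 0;
}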
I had used the key variable for both the IV and the SecretKeySpec, so I changed the IV to 16 zero bytes.
var key = buildKeyFromString(encryptionKey)
val ivBytes = ByteArray(16) // 16 zero bytes, matching the IV used for encryption
val iv = IvParameterSpec(ivBytes)
val secretKeySpec = SecretKeySpec(key, "AES")
val cipher = Cipher.getInstance("AES/CBC/NoPadding")
cipher.init(Cipher.DECRYPT_MODE, secretKeySpec, iv)
return cipher.doFinal(encryptedByte)
Related
I have written a simple encrypt & decrypt function and everything works, except that sometimes the HexEncoder seems to produce a far shorter encoding of the cipher than expected, and then the decryption fails. This only happens rarely. I have isolated the bug to the HexEncoder, since the "Mismatch on encoding" log only appears when the problem occurs. A bad workaround is to call the encrypt function again, which solves the problem; I'm not sure why, perhaps because of a newly generated IV, but that seems unlikely to me.
Am I doing something wrong, is this undefined behaviour, or is it a bug in the HexEncoder?
Encrypt function:
// Encrypt data.
template <typename Encryption> static inline
int encrypt(Encryption& m_enc, str_t& encrypted, const str_t& data) {
    // Generate & set iv.
    CryptoPP::byte iv [CryptoPP::AES::BLOCKSIZE];
    rand.GenerateBlock(iv, CryptoPP::AES::BLOCKSIZE);
    m_enc.Resynchronize(iv, CryptoPP::AES::BLOCKSIZE);
    // Encrypt.
    str_t cipher;
    cipher.resize(data.len() + CryptoPP::AES::BLOCKSIZE);
    cipher.len() = data.len() + CryptoPP::AES::BLOCKSIZE;
    CryptoPP::ArraySink encrypt_sink(
        (Byte*) cipher.data(),
        cipher.len()
    );
    try {
        CryptoPP::StringSource(
            data.data(),
            true,
            new CryptoPP::StreamTransformationFilter(m_enc, new CryptoPP::Redirector(encrypt_sink))
        );
    } catch (...) {
        return crypto::error::encrypt;
    }
    //print("Cipher 1: ", cipher);
    // Set cipher text length now that it's known
    if (cipher.len() != encrypt_sink.TotalPutLength()) {
        print("Mismatch on ciper; expected: ", cipher.len(), " sinked: ", encrypt_sink.TotalPutLength());
    }
    cipher.resize(encrypt_sink.TotalPutLength());
    cipher.len() = encrypt_sink.TotalPutLength();
    //print("Cipher 2: ", cipher);
    // Encode.
    encrypted.resize(cipher.len() * 2);
    encrypted.len() = cipher.len() * 2;
    CryptoPP::ArraySink encode_sink((Byte*) encrypted.data(), encrypted.len());
    try {
        CryptoPP::StringSource(
            cipher.data(),
            true,
            new CryptoPP::HexEncoder(
                new CryptoPP::Redirector(encode_sink)
            )
        );
    } catch (...) {
        encrypted.len() = 0;
        return crypto::error::encode;
    }
    print("Encoded cipher 1: ", encrypted);
    // Set encrypted length.
    if (encrypted.len() != encode_sink.TotalPutLength()) {
        print("BUG -> Mismatch on encoding; expected: ", encrypted.len(), " sinked: ", encode_sink.TotalPutLength());
        //str_t shortened;
        //shortened.resize(encode_sink.TotalPutLength());
        //shortened.concat_no_resize(encrypted.data(), encode_sink.TotalPutLength());
        //encrypted.swap(shortened);
        return encrypt(m_enc, encrypted, data);
    }
    encrypted.resize(encode_sink.TotalPutLength());
    encrypted.len() = encode_sink.TotalPutLength();
    print("Encoded cipher 2: ", encrypted);
    // Encode IV.
    str_t encoded_iv;
    encoded_iv.resize(CryptoPP::AES::BLOCKSIZE * 2);
    encoded_iv.len() = CryptoPP::AES::BLOCKSIZE * 2;
    CryptoPP::ArraySink iv_sink((Byte*) encoded_iv.data(), encoded_iv.len());
    try {
        CryptoPP::ArraySource(
            iv,
            CryptoPP::AES::BLOCKSIZE,
            true,
            new CryptoPP::HexEncoder(
                new CryptoPP::Redirector(iv_sink)
            )
        );
    } catch (...) {
        encrypted.len() = 0;
        return crypto::error::encode;
    }
    // Set encoded iv length.
    if (encoded_iv.len() != iv_sink.TotalPutLength()) {
        print("Mismatch on encoding iv; expected: ", encoded_iv.len(), " sinked: ", iv_sink.TotalPutLength());
    }
    encoded_iv.resize(iv_sink.TotalPutLength());
    encoded_iv.len() = iv_sink.TotalPutLength();
    print("Encoded IV: ", encoded_iv);
    // Concat IV.
    encoded_iv.concat(encrypted);
    encrypted.swap(encoded_iv);
    return 0;
}
Decrypt function:
// Decrypt data.
template <typename Decryption> static inline
int decrypt(Decryption& m_dec, str_t& decrypted, const str_t& data) {
    // Decode.
    str_t decoded;
    decoded.resize(data.len() / 2);
    decoded.len() = data.len() / 2;
    try {
        CryptoPP::StringSource(
            data.data(),
            true,
            new CryptoPP::HexDecoder(
                new CryptoPP::ArraySink((Byte*) decoded.data(), decoded.len())
            )
        );
    } catch (...) {
        return crypto::error::decode;
    }
    // Skip for now.
    m_dec.Resynchronize((Byte*) decoded.data(), CryptoPP::AES::BLOCKSIZE);
    // Recovered text will be less than cipher text
    decrypted.resize(decoded.len() - CryptoPP::AES::BLOCKSIZE);
    decrypted.len() = decoded.len() - CryptoPP::AES::BLOCKSIZE;
    CryptoPP::ArraySink rs((Byte*) decrypted.data(), decrypted.len());
    try {
        CryptoPP::StringSource(
            (Byte*) decoded.data() + CryptoPP::AES::BLOCKSIZE,
            decoded.len(),
            true,
            new CryptoPP::StreamTransformationFilter(
                m_dec,
                new CryptoPP::Redirector(rs),
                CryptoPP::BlockPaddingSchemeDef::BlockPaddingScheme::NO_PADDING
            )
        );
    } catch (...) {
        decrypted.len() = 0;
        return crypto::error::decrypt;
    }
    // Set recovered text length now that it's known
    decrypted.resize(rs.TotalPutLength());
    decrypted.len() = rs.TotalPutLength();
    return 0;
}
Testing:
...
str_t data = "Hello World!", encrypted, decrypted;
aes.encrypt(encrypted, data);
aes.decrypt(decrypted, encrypted);
print("Encrypted: ", encrypted);
print("Decrypted: ", decrypted);
These are the normal logs (without the bug occurring):
Mismatch on ciper; expected: 28 sinked: 16
Encoded cipher 1: 04A2C4FB074F6ACBA996239224BD5F77
Encoded cipher 2: 04A2C4FB074F6ACBA996239224BD5F77
Encoded IV: 02466AEBCF4AC2066CE2E144FC5B71C8
Encrypted: 02466AEBCF4AC2066CE2E144FC5B71C804A2C4FB074F6ACBA996239224BD5F77
Decrypted: Hello World!
These are the logs when the bug occurs (with the recursive encrypt call commented out):
Decoded: Hello World!
Mismatch on ciper; expected: 28 sinked: 16
Encoded cipher 1: C97BFE
BUG -> Mismatch on encoding; expected: 32 sinked: 6
Encoded cipher 2: C97BFE
Encoded IV: 06E05C3DAE3E9D6DAC8971E5C9CA0A1A
Encrypted: 06E05C3DAE3E9D6DAC8971E5C9CA0A1AC97BFE
Decrypted:
These are the logs when the bug occurs, with the workaround applied:
Mismatch on ciper; expected: 28 sinked: 16
Encoded cipher 1: A0
BUG -> Mismatch on encoding; expected: 32 sinked: 2
Mismatch on ciper; expected: 28 sinked: 16
Encoded cipher 1: 883A8A644E5B50067A
BUG -> Mismatch on encoding; expected: 32 sinked: 18
Mismatch on ciper; expected: 28 sinked: 16
Encoded cipher 1: 58AD2010C761215422F89D42FC2F2396
Encoded cipher 2: 58AD2010C761215422F89D42FC2F2396
Encoded IV: BE361B7D8C9144052220E143AD54CB63
Encrypted: BE361B7D8C9144052220E143AD54CB6358AD2010C761215422F89D42FC2F2396
Decrypted: Hello World!
I don't know what a str_t is, but since it has a .data() method I assume that returns the underlying char*. Since you are not specifying a length in the CryptoPP::StringSource, it assumes the input stops at the first NUL byte. For plaintext input this is often fine but cipher data MAY contain embedded NULs.
Instead, either pass the length of your cipher explicitly:
CryptoPP::StringSource(
    cipher.data(),
    cipher.len(),
    true,
    new CryptoPP::HexEncoder(
        new CryptoPP::Redirector(encode_sink)
    )
);
Or, better, convert the str_t into a std::string that knows its own size and pass that as the argument.
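To make the failure mode concrete, here is a small illustrative sketch (not from the original post) showing how the char* overload of StringSource truncates at an embedded NUL while the length-aware overload does not:

#include <cryptopp/filters.h>
#include <cryptopp/hex.h>
#include <iostream>
#include <string>

int main() {
    // Three bytes of "cipher" with an embedded NUL in the middle.
    const std::string data("\x01\x00\x02", 3);
    std::string hexFromPointer, hexWithLength;

    // char* overload: the length comes from strlen(), so encoding stops at the NUL.
    CryptoPP::StringSource(data.c_str(), true,
        new CryptoPP::HexEncoder(new CryptoPP::StringSink(hexFromPointer)));

    // pointer+length overload: all three bytes are encoded.
    CryptoPP::StringSource(reinterpret_cast<const CryptoPP::byte*>(data.data()), data.size(), true,
        new CryptoPP::HexEncoder(new CryptoPP::StringSink(hexWithLength)));

    std::cout << hexFromPointer << std::endl; // prints "01"
    std::cout << hexWithLength << std::endl;  // prints "010002"
    return 0;
}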
So I have this piece of C# code:
void Decrypt(Stream input, Stream output, string password, int bufferSize) {
    using (var algorithm = Aes.Create()) {
        var IV = new byte[16];
        input.Read(IV, 0, 16);
        algorithm.IV = IV;
        var key = new Rfc2898DeriveBytes(password, algorithm.IV, 100);
        algorithm.Key = key.GetBytes(16);
        using(var decryptor = algorithm.CreateDecryptor())
        using(var cryptoStream = new CryptoStream(input, decryptor, CryptoStreamMode.Read)) {
            CopyStream(cryptoStream, output, bufferSize);
        }
    }
}
and I am trying to translate this into C++ with CryptoPP.
So this is what I have written:
void decrypt(std::ifstream& in_file, std::ofstream& out_file, std::string_view password, size_t bufSize) {
    using namespace CryptoPP;
    // Get IV
    byte iv[16];
    in_file.read(reinterpret_cast<char*>(iv), sizeof(iv));
    // Read cypher
    std::string cypher;
    while (in_file && cypher.size() != bufSize) {
        char c;
        in_file.read(&c, 1);
        cypher.push_back(c);
    }
    // Get key
    byte key[16];
    PKCS5_PBKDF2_HMAC<SHA1> pbkdf2;
    pbkdf2.DeriveKey(key, sizeof(key), 0, reinterpret_cast<const byte*>(password.data()), password.size(), iv, sizeof(iv), 100);
    // Decrypt
    CTR_Mode<AES>::Decryption decrypt(key, sizeof(key), iv);
    std::string output;
    StringSource(cypher, true, new StreamTransformationFilter(decrypt, new StringSink(output)));
    // Write output to file
    out_file.write(output.data(), output.size());
}
However, from this function, I am only getting back trash data. What could I be doing wrong?
Thanks
Tuxifan!
So I found the solution! First of all, as #mbd mentioned, C# uses CBC mode by default. Additionally, I need to trim the extra bytes that were read so the ciphertext length is a multiple of the block size, like this:
while ((cipher.size() % 16) != 0) {
    cipher.pop_back();
}
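For completeness, a sketch of what the "// Decrypt" part of the function above might look like after switching to CBC (C#'s Aes.Create() defaults to CBC with PKCS7 padding, which is also the StreamTransformationFilter default); this assumes cypher has already been trimmed as shown:

// Replacement for the "// Decrypt" section of the original function (untested
// sketch): CBC mode, letting the filter strip the PKCS #7 padding.
CryptoPP::CBC_Mode<CryptoPP::AES>::Decryption decryptor;
decryptor.SetKeyWithIV(key, sizeof(key), iv);

std::string output;
CryptoPP::StringSource(cypher, true,
    new CryptoPP::StreamTransformationFilter(decryptor,
        new CryptoPP::StringSink(output))); // throws CryptoPP::Exception on bad padding
out_file.write(output.data(), output.size());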
I need to encrypt big files (multiple GB) with Crypto++. I managed to find an example in the documentation that helped me create the two following functions:
bool AESEncryptFile(const std::string& clearfile, const std::string& encfile, const std::string& key) {
    try {
        byte iv[CryptoPP::AES::BLOCKSIZE] = {};
        CryptoPP::CBC_Mode<CryptoPP::AES>::Encryption encryptor;
        encryptor.SetKeyWithIV((unsigned char*)key.c_str(), CryptoPP::AES::DEFAULT_KEYLENGTH, iv);
        CryptoPP::StreamTransformationFilter filter(encryptor);
        CryptoPP::FileSource source(clearfile.c_str(), false);
        CryptoPP::FileSink sink(encfile.c_str());
        source.Attach(new CryptoPP::Redirector(filter));
        filter.Attach(new CryptoPP::Redirector(sink));
        const CryptoPP::word64 BLOCK_SIZE = 4096;
        CryptoPP::word64 processed = 0;
        while (!EndOfFile(source) && !source.SourceExhausted()) {
            source.Pump(BLOCK_SIZE);
            filter.Flush(false);
            processed += BLOCK_SIZE;
        }
        filter.MessageEnd();
        return true;
    } catch (const CryptoPP::Exception& ex) {
        return false;
    }
}
bool AESDecryptFile(const std::string& encfile, const std::string& clearfile, const std::string& key) {
    try {
        byte iv[CryptoPP::AES::BLOCKSIZE] = {};
        CryptoPP::CBC_Mode<CryptoPP::AES>::Decryption decryptor;
        decryptor.SetKeyWithIV((unsigned char*)key.c_str(), CryptoPP::AES::DEFAULT_KEYLENGTH, iv);
        CryptoPP::StreamTransformationFilter filter(decryptor);
        CryptoPP::FileSource source(encfile.c_str(), false);
        CryptoPP::FileSink sink(clearfile.c_str());
        source.Attach(new CryptoPP::Redirector(filter));
        filter.Attach(new CryptoPP::Redirector(sink));
        const CryptoPP::word64 BLOCK_SIZE = 4096;
        CryptoPP::word64 processed = 0;
        while (!EndOfFile(source) && !source.SourceExhausted()) {
            source.Pump(BLOCK_SIZE);
            filter.Flush(false);
            processed += BLOCK_SIZE;
        }
        filter.MessageEnd();
        return true;
    } catch (const CryptoPP::Exception& ex) {
        return false;
    }
}
This is working great; on 8 GB files I'm using very little memory.
But as you can see, the IV is hardcoded (empty for now) and I would like to:
While encrypting: put it at the end of the file.
While decrypting: get the IV from the file to init the decryptor.
Is there a way to do that with Crypto++, or should I handle it manually after/before the enc/decryption process?
Thanks to all the different comments, here is what I managed to do. As suggested by #Sam Mason, I put the IV at the beginning of the file instead:
So before starting to encrypt, I'm putting the IV at the beginning of the file:
CryptoPP::ArraySource(iv, sizeof(iv), true,
new CryptoPP::Redirector(sink)
);
// Encrypt
And then when decrypting, I'm getting the IV back like this:
unsigned char iv[CryptoPP::AES::BLOCKSIZE];
CryptoPP::ArraySink ivSink(iv, sizeof(iv));
source.Attach(new CryptoPP::Redirector(ivSink));
source.Pump(CryptoPP::AES::BLOCKSIZE);
// Decrypt
Note for future readers: don't use an empty IV like shown in my OP; instead, generate one randomly, for example:
CryptoPP::AutoSeededRandomPool prng;
unsigned char iv[CryptoPP::AES::BLOCKSIZE];
prng.GenerateBlock(iv, sizeof(iv));
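Putting the pieces together, the start of the encryption function could look roughly like this (a sketch reusing the variable names from AESEncryptFile above; the chunked pump loop stays exactly as it was):

// Sketch: random IV written as the first 16 bytes of the output file
// (error handling omitted, same key handling as in the original function).
CryptoPP::AutoSeededRandomPool prng;
unsigned char iv[CryptoPP::AES::BLOCKSIZE];
prng.GenerateBlock(iv, sizeof(iv));

CryptoPP::CBC_Mode<CryptoPP::AES>::Encryption encryptor;
encryptor.SetKeyWithIV((unsigned char*)key.c_str(), CryptoPP::AES::DEFAULT_KEYLENGTH, iv);

CryptoPP::StreamTransformationFilter filter(encryptor);
CryptoPP::FileSource source(clearfile.c_str(), false);
CryptoPP::FileSink sink(encfile.c_str());
source.Attach(new CryptoPP::Redirector(filter));
filter.Attach(new CryptoPP::Redirector(sink));

// Write the plain IV in front of the ciphertext.
CryptoPP::ArraySource(iv, sizeof(iv), true, new CryptoPP::Redirector(sink));

// ... pump the file in 4096-byte chunks exactly as in the original loop,
// then call filter.MessageEnd() ...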
I am using Crypto++ to encrypt an array of bytes using RSA. I have followed the Crypto++ wiki's samples with no luck getting them to work. Encryption and decryption in all the samples are done within a single process, but I am trying to decrypt content that was encrypted in another process.
Here is my code:
class FixedRNG : public CryptoPP::RandomNumberGenerator
{
public:
    FixedRNG(CryptoPP::BufferedTransformation &source) : m_source(source) {}
    void GenerateBlock(byte *output, size_t size)
    {
        m_source.Get(output, size);
    }
private:
    CryptoPP::BufferedTransformation &m_source;
};
uint16_t Encrypt()
{
    byte *oaepSeed = new byte[2048];
    for (int i = 0; i < 2048; i++)
    {
        oaepSeed[i] = (byte)i;
    }
    CryptoPP::ByteQueue bq;
    bq.Put(oaepSeed, 2048);
    FixedRNG prng(bq);
    Integer n("Value of N"),
        e("11H"),
        d("Value of D");
    RSA::PrivateKey privKey;
    privKey.Initialize(n, e, d);
    RSA::PublicKey pubKey(privKey);
    CryptoPP::RSAES_OAEP_SHA_Encryptor encryptor( pubKey );
    assert( 0 != encryptor.FixedMaxPlaintextLength() );
    byte blockSize = encryptor.FixedMaxPlaintextLength();
    int divisionCount = fileSize / blockSize;
    int proccessedBytes = 0;
    // Create cipher text space
    uint16_t cipherSize = encryptor.CiphertextLength( blockSize );
    assert( 0 != cipherSize );
    encryptor.Encrypt(prng, (byte*)plaintext, blockSize, (byte*)output);
    return cipherSize;
}
void Decrypt(uint16_t cipherSize)
{
    byte *oaepSeed = new byte[2048];
    for (int i = 0; i < 2048; i++)
    {
        oaepSeed[i] = (byte)i;
    }
    CryptoPP::ByteQueue bq;
    bq.Put(oaepSeed, 2048);
    FixedRNG prng(bq);
    Integer n("Value of N"),
        e("11H"),
        d("Value of D");
    RSA::PrivateKey privKey;
    privKey.Initialize(n, e, d);
    //RSA::PublicKey pubKey(privKey);
    CryptoPP::RSAES_OAEP_SHA_Decryptor decryptor( privKey );
    byte blockSize = decryptor.FixedMaxPlaintextLength();
    assert(blockSize != 0);
    size_t maxPlainTextSize = decryptor.MaxPlaintextLength( cipherSize );
    assert( 0 != maxPlainTextSize );
    void* subBuffer = malloc(maxPlainTextSize);
    CryptoPP::DecodingResult result = decryptor.Decrypt(prng, (byte*)cipherText, cipherSize, (byte*)subBuffer);
    assert( result.isValidCoding );
    assert( result.messageLength <= maxPlainTextSize );
}
Unfortunately, the value of isValidCoding is false. I think I am misunderstanding something about RSA encryption/decryption!
Note that privKey and pubKey have been validated using KEY.Validate(prng, 3).
I have also tried to use raw RSA instead of OAEP and SHA, with no luck. I have tried to debug through the Crypto++ code; what I am suspicious about is the prng variable. I think there is something wrong with it. I have also used AutoSeededRandomPool instead of FixedRNG, but it didn't help. It's worth noting that if I copy the decryption code right after the encryption code and execute it in the Encrypt() method, everything is fine and isValidCoding is true!
This is probably not correct:
byte blockSize = encryptor.FixedMaxPlaintextLength();
...
encryptor.Encrypt(prng, (byte*)plaintext, blockSize, (byte*)output);
return cipherSize;
Try:
size_t maxLength = encryptor.FixedMaxPlaintextLength();
size_t cipherLength = encryptor.CiphertextLength( blockSize );
...
SecByteBlock secBlock( cipherLength );
encryptor.Encrypt( prng, (byte*)plaintext, blockSize, secBlock );
FixedMaxPlaintextLength returns a size_t, not a byte.
You should probably be calling CiphertextLength on plaintext.
I'm not really sure how you are just returning a uint16_t from Encrypt().
You might do better by starting fresh and using an example from the Crypto++ wiki as a starting point. I'm not sure this design is worth pursuing.
If you start over, then Shoup's Elliptic Curve Integrated Encryption Scheme (ECIES) would be a good choice since it combines public key with symmetric ciphers and authentication tags.
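In that spirit, here is a fresh minimal sketch (close to the Crypto++ wiki samples, not the asker's code) of an RSA-OAEP round trip with generated keys and proper size_t types; it could be split across two processes by serializing the keys and the ciphertext:

#include <cryptopp/rsa.h>
#include <cryptopp/osrng.h>
#include <cryptopp/secblock.h>
#include <string>

int main() {
    using namespace CryptoPP;
    AutoSeededRandomPool prng;

    // Generated keys stand in for the hard-coded n/e/d values in the question.
    RSA::PrivateKey privKey;
    privKey.GenerateRandomWithKeySize(prng, 2048);
    RSA::PublicKey pubKey(privKey);

    RSAES_OAEP_SHA_Encryptor encryptor(pubKey);
    RSAES_OAEP_SHA_Decryptor decryptor(privKey);

    // Must be no longer than encryptor.FixedMaxPlaintextLength().
    const std::string plain = "one block of data";

    SecByteBlock cipher(encryptor.CiphertextLength(plain.size()));
    encryptor.Encrypt(prng, reinterpret_cast<const byte*>(plain.data()),
                      plain.size(), cipher);

    SecByteBlock recovered(decryptor.MaxPlaintextLength(cipher.size()));
    DecodingResult result = decryptor.Decrypt(prng, cipher, cipher.size(), recovered);
    recovered.resize(result.messageLength);

    return result.isValidCoding ? 0 : 1;
}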
I use the following code to encrypt a string with a key, using the 3-DES algorithm:
private bool Encode(string input, out string output, byte[] k, bool isDOS7)
{
    try
    {
        if (k.Length != 16)
        {
            throw new Exception("Wrong key size exception");
        }
        int length = input.Length % 8;
        if (length != 0)
        {
            length = 8 - length;
            for (int i = 0; i < length; i++)
            {
                input += " ";
            }
        }
        TripleDESCryptoServiceProvider des = new TripleDESCryptoServiceProvider();
        des.Mode = CipherMode.ECB;
        des.Padding = PaddingMode.Zeros;
        des.Key = k;
        ICryptoTransform ic = des.CreateEncryptor();
        byte[] bytePlainText = Encoding.Default.GetBytes(input);
        MemoryStream ms = new MemoryStream();
        CryptoStream cStream = new CryptoStream(ms,
            ic,
            CryptoStreamMode.Write);
        cStream.Write(bytePlainText, 0, bytePlainText.Length);
        cStream.FlushFinalBlock();
        byte[] cipherTextBytes = ms.ToArray();
        cStream.Close();
        ms.Close();
        output = Encoding.Default.GetString(cipherTextBytes);
    }
    catch (ArgumentException e)
    {
        output = e.Message;
        //Log.Instance.WriteToEvent("Problem encoding, terminalID= "+objTerminalSecurity.TerminalID+" ,Error" + output, "Security", EventLogEntryType.Error);
        return false;
    }
    return true;
}
I send the output parameter as-is to a WCF http-binding web service, and I noticed that the encoded string looks different on the server side: it looks like there are some \t and \n in it, but the characters are about the same.
What is going on? Why does the server get a different encoded string?
Usually cipher text is Base64-encoded in an effort to be binary-safe during transmission.
Also, I would not use 3DES with ECB; that is awful, you must have copy-pasted this from somewhere. Use AES with CBC mode and think about adding a CMAC or HMAC.
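For illustration (in Crypto++ terms, since most of this page uses it; the C# equivalents are Convert.ToBase64String / Convert.FromBase64String), Base64-encoding the raw cipher bytes before transmission looks like this:

#include <cryptopp/base64.h>
#include <cryptopp/filters.h>
#include <string>

// Encode raw cipher bytes as Base64 text that is safe to send over a
// text-based channel; decode with Base64Decoder on the other side.
std::string ToBase64(const std::string& cipherBytes) {
    std::string encoded;
    CryptoPP::StringSource(cipherBytes, true,
        new CryptoPP::Base64Encoder(new CryptoPP::StringSink(encoded),
                                    false /* no line breaks */));
    return encoded;
}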