Problems decompressing gzip - C++

I'm trying to use a gzip C++ library to decompress some text that I compressed using a website that had a tool to do it, but when I try to decompress it in my project, the library reports that the data is not compressed and the decompression fails. Am I just misunderstanding these compression formats because the names are the same, or is this some other issue that I'm not aware of?
//'test message' compressed using the website
std::string test_string = R"(eJwrSS0uUchNLS5OTE8FAB8fBMY=)";
//returns false
bool is_compressed = gzip::is_compressed(test_string.data(), test_string.size());
//crashes
std::string decompressed = gzip::decompress(test_string.data(), test_string.size());

The website outputs a Base64-encoded string (ASCII text) instead of the raw compressed byte array. I need to decode the Base64 first and then decompress the resulting bytes.
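For reference, this is roughly what the fix looks like. It is only a sketch: the gzip/decompress.hpp and gzip/utils.hpp headers assume the gzip-hpp style library used above, and the base64_decode helper is a hand-rolled illustration (any Base64 library would do the same job).

#include <iostream>
#include <stdexcept>
#include <string>
#include <gzip/decompress.hpp> // header names assume the gzip-hpp library from the question
#include <gzip/utils.hpp>

// Hand-rolled Base64 decoder, for illustration only.
std::string base64_decode(const std::string& in)
{
    static const std::string alphabet =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    std::string out;
    unsigned val = 0; // bit accumulator
    int bits = -8;    // how many bits are ready to be emitted
    for (char c : in)
    {
        if (c == '=' || c == '\r' || c == '\n')
            continue; // padding and line breaks carry no data
        std::size_t pos = alphabet.find(c);
        if (pos == std::string::npos)
            throw std::runtime_error("invalid Base64 input");
        val = (val << 6) | static_cast<unsigned>(pos);
        bits += 6;
        if (bits >= 0)
        {
            out.push_back(static_cast<char>((val >> bits) & 0xFF));
            bits -= 8;
        }
    }
    return out;
}

int main()
{
    // 'test message', compressed and then Base64-encoded by the website
    std::string test_string = "eJwrSS0uUchNLS5OTE8FAB8fBMY=";

    // Decode the Base64 text into the raw compressed bytes first.
    std::string raw = base64_decode(test_string);

    bool is_compressed = gzip::is_compressed(raw.data(), raw.size()); // should now report compressed data
    std::string decompressed = gzip::decompress(raw.data(), raw.size());
    std::cout << decompressed << '\n'; // "test message"
}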

Related

Trying to decode zlib compressed and base64 encoded data to a readable format in Python

import zlib, base64
exec(zlib.decompress(base64.b64decode('eJxsvUuu7UizpNfXKG5PJUANP==')))
# The code is shortened for example purposes only
How can I decode this?
This is a picture of the code
https://prnt.sc/26xf8ju
I don't really understand what issue you may be encountering; it should work as normal. Always be careful when using exec() for things like this, though, as it is used a lot by threat actors to run malicious payloads.
import zlib, base64
a = zlib.decompress(base64.b64decode('eF7zSM3JyVcIzy/KSVFU8FTPVcjPUwguSUzO9i9LLUrLyS9XBADJNQvD'))
print(a)
Output: b"Hello World! I'm on StackOverflow!"

7Zip CLI Compress file with LZMA

I'm trying to compress a file in the console with LZMA.
7z a -t7z output input
or
7z a -t7z -m0=lzma output input
However, I cannot open it on the client.
How can I compress a file as an LZMA archive in the console?
It is possible that the problem is that the above commands add the file to an archive, whereas I want to compress the data in a data file without any file structure.
Is there an option to compress a data file into a compressed data file with LZMA?
Edit
I see downvotes, which means the question is "not correct" in some way.
So I'll try to explain what I want to achieve.
I compress data server-side and use it in a client application. I successfully do this in Node.js like so:
const lzma = require('lzma');
lzma.compress(inputBuffer, 1, callback);
function callback(data, err) {
    writefile(outputPath, Buffer.from(data));
}
However, it is very slow. So I want to call 7Zip for the compression.
My .NET server also compresses it in a similar way.
byte[] barData;
using (var barStream = dukasDataHelper.SerializeLightBars(lightBars.ToArray()))
using (var zippedStream = zipLzma.Zip(barStream))
{
    barData = zippedStream.ToArray();
}
My problem is that I cannot set the correct options in CLI in order to be able to read the file in the client.
My C# client code is:
using (var blobStream = new MemoryStream(blobBytes))
using (var barStream = new ZipLzma().Unzip(blobStream))
{
    SaveDataSet(barStream, localPath);
}
I get this error message when compressing via the CLI:
$exception {"Data Error"}
Data: {System.Collections.ListDictionaryInternal}
at SevenZipLzma.LZMA.Decoder.Code(Stream inStream, Stream outStream, Int64 inSize, Int64 outSize, ICodeProgress progress)
at SevenZipLzma.ZipLzma.Unzip(Stream stream)
Since the code works when I compress with Node.js but not when compressing via the CLI, something must be different.
7zip makes an archive of files and directories, whereas LZMA generates a single stream of compressed data. They are not the same format. LZMA can be used inside a 7zip archive to compress an entry (as can LZMA2, Deflate, and several other compression methods).
You can try the xz command to generate LZMA streams with xz --format=lzma.

C++: Decode an HTTP response which is Base64 encoded and UTF-8 decoded

I have a C++ program which receives encoded binary data as an HTTP response. The response needs to be decoded and stored as a binary file. The HTTP server that sends the binary data is written in Python, and the following is an example of the code that performs the encoding.
#!/usr/bin/env python3
import base64
# content of the file is string "abc" for testing, the file could be an image file
with open('/tmp/abc', 'rb') as _doc:
    data = _doc.read()
# Get Base-64 encoded bytes
data_bytes = base64.b64encode(data)
# Convert the bytes to a string
data_str = data_bytes.decode('utf-8')
print(data_str)
Now, I want to decode the received data_str using a C++ program. I was able to get the Python equivalent below to work properly.
_data = data_str.encode('utf-8')
bin_data = base64.b64decode(_data)
But with C++, I tried to use the Boost library's from_utf8 method, to no avail. Could anyone please suggest the best way to decode and obtain the binary data in C++ (preferably using Boost, since it is portable)?
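One commonly used sketch with Boost's serialization/dataflow iterators (binary_from_base64 combined with transform_width) is shown below. The trailing-'\0' trim compensates for the '=' padding characters, which decode to zero bytes; note that it could in principle strip legitimate trailing zero bytes from binary data. The output file name is just a placeholder.

#include <boost/algorithm/string.hpp>
#include <boost/archive/iterators/binary_from_base64.hpp>
#include <boost/archive/iterators/transform_width.hpp>
#include <fstream>
#include <string>

// Decode a Base64 string into raw bytes using Boost's dataflow iterators.
std::string decode64(const std::string& val)
{
    using namespace boost::archive::iterators;
    using It = transform_width<binary_from_base64<std::string::const_iterator>, 8, 6>;
    std::string decoded(It(val.begin()), It(val.end()));
    // '=' padding decodes to zero bytes; trim them off the end.
    return boost::algorithm::trim_right_copy_if(decoded, [](char c) { return c == '\0'; });
}

int main()
{
    std::string data_str = "YWJj"; // Base64 for "abc", as produced by the Python server above
    std::string binary = decode64(data_str);

    // Store the decoded bytes as a binary file.
    std::ofstream out("/tmp/abc.decoded", std::ios::binary);
    out.write(binary.data(), static_cast<std::streamsize>(binary.size()));
}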

Reducing Size / Compressing JSON String in C++

Is there a way to compress a JSON string in C++, so that the overall size can be reduced?
In my case, a mobile app retrieves XML created by CCUserDefault and then converts that XML to JSON using rapidjson. Now I want to reduce its size or compress it using some C++ library.
Assuming you just want to minimise the size of the string (as opposed to general compression such as gzip), then a library such as rapidjson could be used.
There's an example in this unit test:
Roughly:
StringStream s("{ \"hello\" : \"world\" }");
StringBuffer buffer;
Writer<StringBuffer> writer(buffer);
Reader reader;
reader.Parse<0>(s, writer);
EXPECT_STREQ("{\"hello\":\"world\"}", buffer.GetString());
You could also use zlib to compress the JSON string in memory and decompress it back.
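A rough sketch of that zlib route, using the one-shot compress()/uncompress() helpers from zlib.h (buffer handling and error checking kept minimal; note that very short JSON strings may not shrink much):

#include <iostream>
#include <stdexcept>
#include <string>
#include <vector>
#include <zlib.h>

// Compress a string in memory with zlib's one-shot helper.
std::vector<unsigned char> compress_string(const std::string& in)
{
    uLongf destLen = compressBound(static_cast<uLong>(in.size()));
    std::vector<unsigned char> out(destLen);
    if (compress(out.data(), &destLen,
                 reinterpret_cast<const Bytef*>(in.data()),
                 static_cast<uLong>(in.size())) != Z_OK)
        throw std::runtime_error("zlib compress failed");
    out.resize(destLen);
    return out;
}

// Decompress it back; the caller must know (or store) the original size.
std::string decompress_string(const std::vector<unsigned char>& in, std::size_t originalSize)
{
    std::string out(originalSize, '\0');
    uLongf destLen = static_cast<uLongf>(originalSize);
    if (uncompress(reinterpret_cast<Bytef*>(&out[0]), &destLen,
                   in.data(), static_cast<uLong>(in.size())) != Z_OK)
        throw std::runtime_error("zlib uncompress failed");
    out.resize(destLen);
    return out;
}

int main()
{
    std::string json = "{\"hello\":\"world\"}"; // the minified JSON from above
    std::vector<unsigned char> packed = compress_string(json);
    std::cout << json.size() << " -> " << packed.size() << " bytes\n";
    std::cout << decompress_string(packed, json.size()) << '\n';
}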

How do I decode UTF-8?

I have a UTF-8-encoded string.
This string is first saved to a file and then sent via Apache to a process written in C++, which receives it using Curl.
How can I decode the string in the C++ process?
There is a very good article on CodeProject that shows how to read UTF-8. Alternatively, http://utfcpp.sourceforge.net/ also has routines for this (see C++ & Boost: encode/decode UTF-8).
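For example, here is a small sketch using the utfcpp header mentioned above. "Decoding" UTF-8 here means converting the received bytes into UTF-16 or UTF-32 code units so individual characters can be examined; the variable names and sample text are just placeholders.

#include <iterator>
#include <string>
#include <vector>
#include "utf8.h" // from http://utfcpp.sourceforge.net/

int main()
{
    // The raw bytes received via curl, assumed to be UTF-8 encoded text.
    std::string body = "caf\xC3\xA9"; // "café" as raw UTF-8 bytes

    // Check the bytes really are valid UTF-8 before converting.
    if (!utf8::is_valid(body.begin(), body.end()))
        return 1;

    // Convert to UTF-16 code units.
    std::vector<unsigned short> utf16;
    utf8::utf8to16(body.begin(), body.end(), std::back_inserter(utf16));

    // Or to UTF-32 code points.
    std::vector<unsigned int> utf32;
    utf8::utf8to32(body.begin(), body.end(), std::back_inserter(utf32));
    return 0;
}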