How to decompress gzipstream with zlib - c++

Can someone tell me which function I need to use to decompress a byte array that was compressed with VB.NET's GZipStream? I would like to use zlib.
I've included zlib.h, but I haven't been able to figure out which function(s) I should use.

You can take a look at the Boost.Iostreams library:
#include <fstream>
#include <boost/iostreams/filtering_stream.hpp>
#include <boost/iostreams/filter/gzip.hpp>
std::ifstream file;
file.exceptions(std::ios::failbit | std::ios::badbit);
file.open(filename, std::ios_base::in | std::ios_base::binary);
boost::iostreams::filtering_stream<boost::iostreams::input> decompressor;
decompressor.push(boost::iostreams::gzip_decompressor());
decompressor.push(file);
And then to decompress line by line:
for(std::string line; getline(decompressor, line);) {
    // decompressed a line
}
Or entire file into an array:
std::vector<char> data(
    (std::istreambuf_iterator<char>(decompressor)), // extra parentheses avoid the "most vexing parse"
    std::istreambuf_iterator<char>()
);
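Since the question is about a byte array rather than a file, note that Boost.Iostreams can also wrap in-memory data. A minimal sketch (my addition, not part of the original answer), assuming the compressed bytes are in a std::vector<char>:
#include <iterator>
#include <vector>
#include <boost/iostreams/device/array.hpp>
#include <boost/iostreams/filtering_stream.hpp>
#include <boost/iostreams/filter/gzip.hpp>

std::vector<char> gunzip(const std::vector<char>& compressed) {
    // array_source exposes the in-memory buffer as a Boost.Iostreams device
    boost::iostreams::array_source src(compressed.data(), compressed.size());
    boost::iostreams::filtering_stream<boost::iostreams::input> decompressor;
    decompressor.push(boost::iostreams::gzip_decompressor());
    decompressor.push(src);
    // drain the filtering stream into a plain byte vector
    return std::vector<char>(
        (std::istreambuf_iterator<char>(decompressor)),
        std::istreambuf_iterator<char>()
    );
}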

You need to use inflateInit2() to request gzip decoding. Read the documentation in zlib.h.
There is a lot of sample code in the zlib distribution. Also take a look at this heavily documented example of zlib usage. You can modify that one to use inflateInit2() instead of inflateInit().
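The change is small; a sketch of the relevant line in zpipe.c's inf() (variable names as in that example):
/* in zpipe.c's inf(), replace ret = inflateInit(&strm); with: */
ret = inflateInit2(&strm, 16 + MAX_WBITS); /* adding 16 requests gzip decoding */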

Here is a C function that does the job with zlib:
int gzip_inflate(char *compr, int comprLen, char *uncompr, int uncomprLen)
{
    int err;
    z_stream d_stream; /* decompression stream */

    d_stream.zalloc = (alloc_func)0;
    d_stream.zfree = (free_func)0;
    d_stream.opaque = (voidpf)0;

    d_stream.next_in = (unsigned char *)compr;
    d_stream.avail_in = comprLen;
    d_stream.next_out = (unsigned char *)uncompr;
    d_stream.avail_out = uncomprLen;

    /* 16+MAX_WBITS selects gzip decoding (see inflateInit2() in zlib.h) */
    err = inflateInit2(&d_stream, 16 + MAX_WBITS);
    if (err != Z_OK) return err;

    /* inflate until end of stream; bail out on any error instead of looping forever */
    do {
        err = inflate(&d_stream, Z_NO_FLUSH);
    } while (err == Z_OK);
    if (err != Z_STREAM_END) {
        inflateEnd(&d_stream);
        return err;
    }

    return inflateEnd(&d_stream);
}
The uncompressed data is returned in uncompr. Note that inflate() does not null-terminate its output, so if you want to treat the result as a C string (e.g. puts(uncompr)), zero the buffer beforehand or write a '\0' after the d_stream.total_out bytes produced. The function above only works if the output fits in the buffer and, for printing, if it is text. I have tested it and it works.
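A usage sketch (the buffer size is arbitrary; gz_data and gz_len are placeholders for your compressed input):
char out[65536] = {0};  /* zero-filled so the result stays NUL-terminated */
int err = gzip_inflate(gz_data, gz_len, out, sizeof(out) - 1);
if (err == Z_OK)
    puts(out);          /* only meaningful when the payload is text */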

Have a look at the zlib usage example: http://www.zlib.net/zpipe.c
The function that does the real work is inflate(), but you also need inflateInit() etc. to set up the stream; for gzip input, use inflateInit2() as described above.

Related

Azure TTS generating garbled result when requesting Opus encoding

The following sample code (C++, Linux, x64) uses the MS Speech SDK to request text-to-speech of a single sentence in Opus format with no container. It then uses the Opus library to decode to raw PCM. Everything seems to run with no errors, but the result sounds garbled, as if some of the audio were missing. The summary line Done, got 14880 bytes, decoded to 24000 bytes suggests this might be a decoding issue rather than an Azure issue, as I'd expect a much higher compression ratio.
Note that this generates a raw PCM file; play it back with: aplay out.raw -f S16_LE -r 24000 -c 1
#include <stdio.h>
#include <string>
#include <assert.h>
#include <vector>
#include <speechapi_cxx.h>
#include <opus.h>

using namespace Microsoft::CognitiveServices::Speech;

static const std::string subscription_key = "abcd1234"; // insert valid key here
static const std::string service_region = "westus";
static const std::string text = "Hi, this is Azure";
static const int sample_rate = 24000;

#define MAX_FRAME_SIZE 6*960 // from Opus trivial_example.c

int main(int argc, char **argv) {
    // create Opus decoder
    int err;
    OpusDecoder* opus_decoder = opus_decoder_create(sample_rate, 1, &err);
    assert(err == OPUS_OK);

    // create Azure client
    auto azure_speech_config = SpeechConfig::FromSubscription(subscription_key, service_region);
    azure_speech_config->SetSpeechSynthesisVoiceName("en-US-JennyNeural");
    azure_speech_config->SetSpeechSynthesisOutputFormat(SpeechSynthesisOutputFormat::Audio24Khz16Bit48KbpsMonoOpus);
    auto azure_synth = SpeechSynthesizer::FromConfig(azure_speech_config, NULL);

    FILE* fp = fopen("out.raw", "w");
    int in_bytes=0, decoded_bytes=0;

    // callback to capture incoming packets
    azure_synth->Synthesizing += [&in_bytes, &decoded_bytes, fp, opus_decoder](const SpeechSynthesisEventArgs& e) {
        printf("Synthesizing event received with audio chunk of %zu bytes\n", e.Result->GetAudioData()->size());
        auto audio_data = e.Result->GetAudioData();
        in_bytes += audio_data->size();

        // confirm that this is exactly one valid Opus packet
        assert(opus_packet_get_nb_frames((const unsigned char*)audio_data->data(), audio_data->size()) == 1);

        // decode the packet
        std::vector<uint8_t> decoded_data(MAX_FRAME_SIZE);
        int decoded_frame_size = opus_decode(opus_decoder, (const unsigned char*)audio_data->data(), audio_data->size(),
                                             (opus_int16*)decoded_data.data(), decoded_data.size()/sizeof(opus_int16), 0);
        assert(decoded_frame_size > 0); // confirm no decode error
        decoded_frame_size *= sizeof(opus_int16); // result size is in samples, convert to bytes
        printf("Decoded to %d bytes\n", decoded_frame_size);
        assert(decoded_frame_size <= (int)decoded_data.size());
        fwrite(decoded_data.data(), 1, decoded_frame_size, fp);
        decoded_bytes += decoded_frame_size;
    };

    // perform TTS
    auto result = azure_synth->SpeakText(text);
    printf("Done, got %d bytes, decoded to %d bytes\n", in_bytes, decoded_bytes);

    // cleanup
    fclose(fp);
    opus_decoder_destroy(opus_decoder);
}
I received no useful response to this question (I also asked here and here, and even tried Azure's paid support), so I gave up and switched from Audio24Khz16Bit48KbpsMonoOpus to Ogg48Khz16BitMonoOpus, which wraps the Opus encoding in an Ogg container and requires the rather cumbersome libopusfile API to decode. It was kind of a pain to implement, but it does the job.
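For reference, a minimal sketch of the libopusfile route (not the exact code used; it assumes the complete Ogg-wrapped synthesis result has been collected into one buffer first):
#include <vector>
#include <opusfile.h>

// Decode an in-memory Ogg/Opus stream to 16-bit PCM.
// Note: libopusfile always outputs at 48 kHz regardless of the encoded rate.
std::vector<opus_int16> decode_ogg_opus(const std::vector<unsigned char>& ogg) {
    int err = 0;
    OggOpusFile* of = op_open_memory(ogg.data(), ogg.size(), &err);
    if (!of) return {};                        // err holds the failure code

    std::vector<opus_int16> pcm;
    opus_int16 buf[5760];                      // 120 ms at 48 kHz, mono
    for (;;) {
        int n = op_read(of, buf, 5760, NULL);  // returns samples per channel
        if (n <= 0) break;                     // 0 = end of stream, <0 = error
        pcm.insert(pcm.end(), buf, buf + n);   // mono, so n samples total
    }
    op_free(of);
    return pcm;
}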

How to receive a JPEG image over serial port

So I am trying to send a JPEG image (4 KB) from a Raspberry Pi to my Mac wirelessly using XBee Series 1 radios. I have an image on the Raspberry Pi and can read it into binary format. I've used this binary data to save a second image file, and it creates a correct copy of the image, which tells me I am reading it correctly. So now I am trying to send that data over a serial port (to be transferred by the XBees) to my Mac. Side note: I think XBees can only transmit around 80 bytes of data per packet, but I don't know how that affects what I'm doing.
My problem is that I do not know how to read the data and properly store it in a JPEG file. Most of the read() functions I have found require you to pass a length to read, and I don't know how long the data is, since it's just a serial stream coming in.
Here is my code to send the jpeg.
#include "xSerial.hpp"
#include <iostream>
#include <cstdlib>
using namespace std;

int copy_file( const char* srcfilename, const char* dstfilename );

int main(){
    copy_file("tylerUseThisImage.jpeg", "copyImage.jpeg");
    return 0;
}

int copy_file( const char* srcfilename, const char* dstfilename )
{
    long len;
    char* buf = NULL;
    FILE* fp = NULL;

    // Open the source file
    fp = fopen( srcfilename, "rb" );
    if (!fp) return 0;

    // Get its length (in bytes)
    if (fseek( fp, 0, SEEK_END ) != 0) // This should typically succeed
    {                                  // (beware the 2Gb limitation, though)
        fclose( fp );
        return 0;
    }
    len = ftell( fp );
    std::cout << len;
    rewind( fp );

    // Get a buffer big enough to hold it entirely
    buf = (char*)malloc( len );
    if (!buf)
    {
        fclose( fp );
        return 0;
    }

    // Read the entire file into the buffer
    if (!fread( buf, len, 1, fp ))
    {
        free( buf );
        fclose( fp );
        return 0;
    }
    fclose( fp );

    // Open the destination file
    fp = fopen( dstfilename, "wb" );
    if (!fp)
    {
        free( buf );
        return 0;
    }

    // This is where I send the data out over the serial port.
    // serialWrite() is just the standard write() being used
    int fd;
    fd = xserialOpen("/dev/ttyUSB0", 9600);
    serialWrite(fd, buf, len);

    // This is where the file gets copied to another file as a test
    // Write the entire buffer to file
    if (!fwrite( buf, len, 1, fp ))
    {
        free( buf );
        fclose( fp );
        return 0;
    }

    // All done -- return success
    fclose( fp );
    free( buf );
    return 1;
}
On the receive side, I know I need to open the serial port for reading and use some sort of read(), but I don't know how that is done. The serial library I'm using has functions to check whether serial data is available and to return the number of characters available to read.
One question about the number of characters available to read: will that number grow as the serial stream comes in, or will it immediately report the entire length of the data to be read?
Finally, I know that after I open the serial port I need to read the data into a buffer and then write that buffer to a file, but I have not had any luck. This is what I have tried thus far:
// Loop, getting and printing characters
char temp;
bool readComplete = false;
int bytesRead = 0;
fp = fopen("copyImage11.jpeg", "rwb");
for (;;)
{
    if(xserialDataAvail(fd) > 0)
    {
        bytesRead = serialRead(fd, buf, len);
        readComplete = true;
    }
    if (readComplete)
    {
        if (!fwrite(buf, bytesRead, 1, fp))
        {
            free(buf);
            fclose(fp);
            return 0;
        }
        fclose(fp);
        free(buf);
        return 1;
    }
}
I don't get errors with my code; it just doesn't create the JPEG file correctly. Maybe I'm not transmitting it right, or maybe I'm not reading/writing the file correctly. Any help would be appreciated. Thanks everyone, you rock!
If you are defining your own protocol, then you need a method for sending the length first.
I would recommend testing your code by sending short blocks of ASCII text to confirm your I/O. Once that is working, you can use the ASCII to set up the transfer; i.e., send the length, and have your receiver ready for an expected block. A minimal sketch of that idea follows.
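This sketch assumes xSerial-style blocking helpers with signatures like int serialRead(int fd, char* buf, int len), returning the bytes actually read, and int serialWrite(int fd, const char* buf, int len); the real library's signatures may differ:
#include <stdint.h>
#include <stdio.h>

// Assumed prototypes, modeled on the code above; adjust to the real xSerial API.
int serialRead(int fd, char* buf, int len);
int serialWrite(int fd, const char* buf, int len);

// Sender: a 4-byte big-endian length header, then the payload.
void sendFile(int fd, const char* buf, uint32_t len) {
    unsigned char hdr[4] = {
        (unsigned char)(len >> 24), (unsigned char)(len >> 16),
        (unsigned char)(len >> 8),  (unsigned char)len
    };
    serialWrite(fd, (const char*)hdr, 4);
    serialWrite(fd, buf, len);
}

// Receiver: read the 4-byte header first, then keep reading until all len
// bytes have arrived; serial data trickles in, so a single read almost
// never returns the whole file.
int recvFile(int fd, FILE* fp) {
    unsigned char hdr[4];
    int got = 0;
    while (got < 4) {
        int n = serialRead(fd, (char*)hdr + got, 4 - got);
        if (n <= 0) return 0;               // error or dropped link
        got += n;
    }
    uint32_t len = ((uint32_t)hdr[0] << 24) | ((uint32_t)hdr[1] << 16)
                 | ((uint32_t)hdr[2] << 8)  |  (uint32_t)hdr[3];
    char chunk[256];
    uint32_t received = 0;
    while (received < len) {
        uint32_t want = len - received < sizeof chunk ? len - received : sizeof chunk;
        int n = serialRead(fd, chunk, (int)want);
        if (n <= 0) return 0;               // error or dropped link
        fwrite(chunk, 1, (size_t)n, fp);
        received += (uint32_t)n;
    }
    return 1;
}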

Save byte pointer to a file

I'm using Proton SDK (www.protonsdk.com) and trying to use this function:
byte *pData = GetFileManager()->Get("texturefile.rttex", &fileSizeBytes, true, true);
I want to save the output (the decompressed RTPACK) to a file, but it doesn't work.
I am fairly new to C++, but I have a background in PHP.
You can write it to a file with the C++ standard library.
http://www.cplusplus.com/doc/tutorial/files/
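For example, a minimal sketch with std::ofstream (the file name is illustrative; pData and fileSizeBytes come from the question):
#include <fstream>

// Write the fileSizeBytes bytes pData points to, in binary mode.
std::ofstream out("decompressed.rttex", std::ios::binary);
out.write(reinterpret_cast<const char*>(pData), fileSizeBytes);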
Or the C standard library.
Write the data that pData points to:
#include <stdio.h>
#include <stdlib.h>
/* Proton SDK header */
/* header with byte type (not part of C standard library) */

int main(int argc, char** argv)
{
    int fileSizeBytes;
    FILE* fp;
    byte *pData = GetFileManager()->Get(GetSavePath()+"test.txt", &fileSizeBytes, true);

    fp = fopen("put_filename_here", "wb"); /* binary mode, so bytes are written unaltered */
    if(fp == NULL) {
        fprintf(stderr, "Cannot open file!\n"); /* fprintf needs the stream argument */
        exit(-1);
    }
    fwrite(pData, sizeof(byte), fileSizeBytes, fp); /* fileSizeBytes elements of one byte each */
    fclose(fp);
    return 0;
}
To write the pointer value itself (the address, not the data it points to), which is rarely what you want:
fwrite(&pData, 1, sizeof(pData), fp);

How to create a gz-compatible file with zlib?

I want to use zlib to produce a gz-compatible output file from C++.
I installed the developer package for zlib, which, as I understand it, can be used to create gz-compatible files both on Unix and on Windows:
sudo aptitude install libz-dev
Although I'm writing a C++ program, I think I followed the usage example in the relevant points. I also compiled the zpipe.c example unchanged.
Alas, what I get is not gz-compatible output:
$ ./zpipe.x < data.txt > x.gz
$ file x.gz
x.gz: data
$ gunzip x.gz
gzip: x.gz: not in gzip format
I thought the reason might be that deflateSetHeader is not called, so I added it to my own source code, i.e. (excerpt; you can find the full code here):
struct DeflateWrap {           // RAII wrapper
    z_stream strm_;            // C struct from zlib.h
    explicit DeflateWrap() : strm_{} {
        strm_.zalloc = Z_NULL;
        strm_.zfree = Z_NULL;
        strm_.opaque = Z_NULL;
        auto ret = deflateInit2(&strm_, LEVEL,
            Z_DEFLATED, 15, 9, Z_DEFAULT_STRATEGY);
        if(ret != Z_OK) throw std::runtime_error("Error ZLib-Init");
    }
    // ...more, e.g. operator-> and *...
};
void pack(const string& infn) {
    DeflateWrap dwrap {};
    //...
    dwrap->avail_in = indata.size();
    dwrap->next_in = reinterpret_cast<unsigned char*>(indata.data());

    gz_header header {0};                            // <<< HEADER HERE
    header.name = const_cast<unsigned char*>(
        reinterpret_cast<const unsigned char*>(infn.c_str()));
    header.comment = Z_NULL;
    header.extra = Z_NULL;

    bool first = true;
    do {
        dwrap->avail_out = outdata.size();
        dwrap->next_out = reinterpret_cast<unsigned char*>(outdata.data());
        if(first) {
            cerr << deflateSetHeader(&(dwrap.strm_), &header); // <<< SET HDR HERE
            first = false;
        }
        deflate(&(dwrap.strm_), Z_FINISH);           // zlib.h: this packs
        auto toWrite = outdata.size() - dwrap->avail_out;
        outf.write(outdata.data(), toWrite);
    } while (dwrap->avail_out == 0);
}
As far as I can tell, I followed the manual for deflateSetHeader:
I even used deflateInit2 instead of deflateInit, probably unnecessarily
the call to deflateSetHeader comes immediately after deflateInit2
the call to deflateSetHeader comes before any call to deflate
...and still I get -2, i.e. Z_STREAM_ERROR, from the deflateSetHeader call. However, the output I produce can be decompressed with zpipe.c, so it can't be totally wrong, can it?
Any idea how to set a gz-compatible header?
Update:
As I see it, I use the C++ equivalent of
SET_BINARY_MODE(stdin);
SET_BINARY_MODE(stdout);
by opening the files like this:
ifstream inf{ infn, ifstream::binary };
ofstream outf { infn + ".gz", ofstream::binary };
Also, I wonder why the unmodified zpipe.c example does not produce a gunzip-compatible file either, as I described above. From what I read here, it should.
Although I read in the documentation of deflateSetHeader that the output should be gz-compatible, a bit further down there is a hint that it may not be so:
This library supports reading and writing files in gzip (.gz) format with an interface similar to that of stdio, using the functions that start with "gz". The gzip format is different from the zlib format. gzip is a gzip wrapper, documented in RFC 1952, wrapped around a deflate stream.
Thus, when I use the different set of gz* functions, I get gz-compatible output and simpler code:
struct GzWrite {                  // RAII wrapper
    gzFile gz_;                   // C struct from zlib.h
    explicit GzWrite(const string& filename)
        : gz_{gzopen(filename.c_str(),"wb9")}
    {
        if(gz_==NULL) throw std::runtime_error(strerror(errno));
    }
    ~GzWrite() {
        gzclose(gz_);
    }
    int write(const char* data, size_t len) {
        return gzwrite(gz_, data, len);
    }
    GzWrite(const GzWrite&) = delete;            // no copying
    GzWrite& operator=(const GzWrite&) = delete; // no assignment
};
void packe(const string& infn) {
    vector<char> indata = lese(infn);   // read the input
    GzWrite gz{infn+".gz"};             // initialize the output
    auto res = gz.write(indata.data(), indata.size());
    if(res==0) throw std::runtime_error("error while writing");
}
windowBits can also be -8..-15 for raw deflate. In this case, -windowBits determines the window size. deflate() will then generate raw deflate data with no zlib header or trailer, and will not compute an adler32 check value.
windowBits can also be greater than 15 for optional gzip encoding. Add 16 to windowBits to write a simple gzip header and trailer around the compressed data instead of a zlib wrapper. The gzip header will have no file name, no extra data, no comment, no modification time (set to zero), no header crc, and the operating system will be set to 255 (unknown). If a gzip stream is being written, strm->adler is a crc32 instead of an adler32.
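In other words, the deflate-based route from the question also works once the stream itself is initialized for the gzip wrapper. A minimal sketch (my reconstruction, not the asker's final code; the file name is illustrative): passing 15 + 16 as windowBits makes deflateSetHeader() return Z_OK instead of Z_STREAM_ERROR, since that call is only valid on a stream opened with the gzip wrapper.
#include <stdexcept>
#include <zlib.h>

void init_gzip(z_stream& strm) {
    strm = {};                                     // zalloc/zfree/opaque start out Z_NULL
    int ret = deflateInit2(&strm, Z_DEFAULT_COMPRESSION, Z_DEFLATED,
                           15 + 16,                // 15-bit window plus gzip wrapper
                           8, Z_DEFAULT_STRATEGY);
    if (ret != Z_OK) throw std::runtime_error("deflateInit2 failed");

    static gz_header header {};                    // must stay alive until deflate() has
    header.name = (Bytef*)"data.txt";              // written the header; name is illustrative
    if (deflateSetHeader(&strm, &header) != Z_OK)  // Z_OK now; Z_STREAM_ERROR is returned
        throw std::runtime_error("not a gzip stream"); // only for non-gzip streams
}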

Rendering files from C++ Node.js addon

I would like to render files in Node.js from a C++ addon.
I want to apply some file processing and render the output to the browser via Node.js.
Here is my C++ code:
std::ifstream in(filename, std::ios::binary);
in.seekg (0, in.end);
int length = in.tellg();
in.seekg (0, in.beg);
char * buffer = new char [length];
in.read (buffer,length);
in.close();
return buffer;
Following is the V8 code to add bindings for Node.js; here buffer is the output from the above C++ code.
Local<Function> cb = Local<Function>::Cast(args[1]);
const unsigned argc = 1;
Local<Value> argv[argc] = {Local<Value>::New(String::New(buffer))};
cb->Call(Context::GetCurrent()->Global(), argc, argv);
This code works well for normal text files. I'm getting problems when reading text files that contain Unicode characters.
For example, with this original text file:
test start
Billél
last
what I receive in Node is:
test start
Bill�l
last
Similarly, when reading JPG or PNG files, the output file differs from the original file.
Please help.
I was having problems with this as well. I found an implementation in the V8 examples from Google. The example I found that properly handles UTF-8 encoded files is here:
https://code.google.com/p/v8/source/browse/trunk/samples/shell.cc#218
I adapted the source to this:
const char* ReadFile(const char* fileName, int* fileSize)
{
    // reference to c-string version of file
    char *fileBuffer = 0;

    // attempt to open the file
    FILE* fd = fopen(fileName, "rb");

    // clear file size
    *fileSize = 0;

    // file was valid
    if(fd != 0)
    {
        // get size of file
        fseek(fd, 0, SEEK_END);
        *fileSize = ftell(fd);
        rewind(fd);

        // allocate file buffer for file contents
        fileBuffer = (char*)malloc(*fileSize + 1);
        fileBuffer[*fileSize] = 0;

        // copy file contents
        for (int charCount = 0; charCount < *fileSize;)
        {
            int charRead = static_cast<int>(fread(&fileBuffer[charCount], 1, *fileSize - charCount, fd));
            charCount += charRead;
        }

        // close the file
        fclose(fd);
    }
    return fileBuffer;
}
Also, when you create a V8 string, make sure you create a String::Utf8Value:
String::Utf8Value v8Utf8String(...);
Then, to use the String::Utf8Value as a char*, use the following function:
https://code.google.com/p/v8/source/browse/trunk/samples/shell.cc#91
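For reference, the helper behind that link looks roughly like this (a sketch reproduced from the V8 shell sample):
// Extract a C string from a V8 Utf8Value, guarding against a
// failed conversion (in which case *value is NULL).
const char* ToCString(const v8::String::Utf8Value& value) {
    return *value ? *value : "<string conversion failed>";
}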