libcurl built-in compression support or not - C++

I found an answer which told me that libcurl doesn't support compression.
libcurl (linux, C) - Is there a built-in way to POST/PUT a gzipped request?
But, somewhere else, I also found this:
Is there any compression available in libcurl
My question is, do I still have to compress strings on my own and then send them, or is that not necessary using libcurl?

Yes, if you send data you must compress it yourself before sending. There is no support for doing that "automatically" in, for example, HTTP (neither 1.1 nor HTTP/2).
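As an illustration of that manual approach, here is a minimal sketch: gzip the body with zlib, label it with a Content-Encoding: gzip request header, and POST it with libcurl. The URL and payload are placeholders, error handling is omitted, and the server must of course be prepared to inflate the body.

    // Sketch: gzip a request body with zlib, then POST it with libcurl.
    #include <curl/curl.h>
    #include <zlib.h>
    #include <string>
    #include <vector>

    // Compress `input` into gzip format (windowBits 15 + 16 selects the gzip wrapper).
    static std::vector<char> gzip_compress(const std::string& input) {
        z_stream zs{};
        deflateInit2(&zs, Z_DEFAULT_COMPRESSION, Z_DEFLATED,
                     15 + 16, 8, Z_DEFAULT_STRATEGY);
        std::vector<char> out(deflateBound(&zs, input.size()));
        zs.next_in   = (Bytef*)input.data();
        zs.avail_in  = (uInt)input.size();
        zs.next_out  = (Bytef*)out.data();
        zs.avail_out = (uInt)out.size();
        deflate(&zs, Z_FINISH);      // one pass; the buffer was sized by deflateBound()
        out.resize(zs.total_out);
        deflateEnd(&zs);
        return out;
    }

    int main() {
        curl_global_init(CURL_GLOBAL_DEFAULT);
        std::string body = "...large request payload...";
        std::vector<char> gz = gzip_compress(body);

        CURL* curl = curl_easy_init();
        struct curl_slist* hdrs = nullptr;
        // Tell the server the body is gzipped; it must inflate it itself.
        hdrs = curl_slist_append(hdrs, "Content-Encoding: gzip");

        curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/upload");
        curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
        curl_easy_setopt(curl, CURLOPT_POSTFIELDS, gz.data());
        curl_easy_setopt(curl, CURLOPT_POSTFIELDSIZE, (long)gz.size());
        CURLcode rc = curl_easy_perform(curl);

        curl_slist_free_all(hdrs);
        curl_easy_cleanup(curl);
        curl_global_cleanup();
        return rc == CURLE_OK ? 0 : 1;
    }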

Related

Compression library for Cross Platform

We need to compress and send the data from each endpoint client (iOS/Android/classic Windows) and decompress it on the server end using .NET.
Is there any common open-source compression/decompression library that can be used in this scenario (cross-platform)?
Please advise.
Just about any programming platform developed in the past 20 years supports zlib out of the box. Since they generally all incorporate the same free library, the data they generate is interoperable.
Look through API documentation for keywords like "zlib", "gzip", or "deflate". For example, in Android, check out Deflater and DeflaterOutputStream which implement zlib.
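To make the interoperability concrete, here is a minimal C++ sketch of zlib's one-shot API (hypothetical payload, error checks omitted). compress2() emits the standard zlib-wrapped DEFLATE format, so any of the zlib bindings mentioned above (e.g. java.util.zip.Inflater on Android) can inflate it.

    #include <zlib.h>
    #include <cstring>
    #include <iostream>
    #include <vector>

    int main() {
        const char* text = "a fairly repetitive payload, a fairly repetitive payload";
        uLong srcLen = (uLong)std::strlen(text) + 1;   // keep the terminator

        uLongf dstLen = compressBound(srcLen);         // worst-case output size
        std::vector<Bytef> packed(dstLen);
        compress2(packed.data(), &dstLen, (const Bytef*)text, srcLen,
                  Z_BEST_COMPRESSION);                 // level 0-9 trades speed for size

        // uncompress() must be told the original size up front, so a real
        // protocol would transmit srcLen alongside the compressed bytes.
        std::vector<Bytef> restored(srcLen);
        uLongf outLen = srcLen;
        uncompress(restored.data(), &outLen, packed.data(), dstLen);

        std::cout << srcLen << " -> " << dstLen << " bytes, round-trip: "
                  << (const char*)restored.data() << "\n";
    }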

When will NSURLConnection decompress a compressed resource?

I've read that NSURLConnection will automatically decompress a compressed (zipped) resource; however, I cannot find Apple documentation or official word anywhere that specifies the logic determining when this decompression occurs. I'm also curious to know how this relates to streamed data.
The Problem
I have a server that streams files to my app using chunked encoding, I believe; it is a WCF service. Incidentally, we're going with streaming because it should alleviate server load during high use and also because our files are going to be very large (hundreds of MB). The files could be compressed or uncompressed. I think that in my case, because we're streaming the data, the Content-Encoding header is not available, nor is Content-Length. I only see "Transfer-Encoding" = Identity in my response.
I am using the AFNetworking library to write these files to disk with AFHTTPRequestOperation's inputStream and outputStream. I have also tried AFDownloadRequestOperation, with similar results.
Now, the AFNetworking docs state that compressed files will automatically be decompressed (via NSURLConnection, I believe) after download, but this is not happening. I write them to my documents directory with no problems, yet they are still zipped. I can unzip them manually as well, so the files are not corrupted. Do they not auto-unzip because I'm streaming the data and because Content-Encoding is not specified?
What I'd like to know:
Why are my compressed files not decompressing automatically? Is it because of streaming? I know I could use another library to decompress afterward, but I'd like to avoid that if possible.
How exactly does NSURLConnection know when to decompress a downloaded file automatically? I can't find this anywhere in the docs. Is it tied to a header value?
Any help would be greatly appreciated.
NSURLConnection will decompress automatically when the appropriate Content-Encoding (e.g. gzip) is available in the response header. That's down to your server to arrange.

Is there any compression available in libcurl

I need to transfer a huge file from a local machine to a remote machine using libcurl with C++. Is there any compression option built into libcurl? As the data to be transferred is large (100 MB to 1 GB), it would be better if such an option were available in libcurl itself. I know we can compress the data ourselves and send it via libcurl, but I just want to know whether there is a better way of doing so.
Note: In my case, many client machines transfer such huge data to a remote server at regular intervals.
thanks,
Prabu
According to curl_setopt() and its CURLOPT_ENCODING option, you may specify:
The contents of the "Accept-Encoding: " header. This enables decoding
of the response. Supported encodings are "identity", "deflate", and
"gzip". If an empty string, "", is set, a header containing all
supported encoding types is sent.
Here are some examples (just hit search in your browser and type in "compression"), but I don't know how exactly it works or whether it expects already-gzipped data.
You may still use gzcompress() and send compressed chunks on your own (and I would do the task this way... you'll have better control over what's actually going on and you'll be able to change the algorithms used).
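Note that the quote above is from PHP's curl_setopt() documentation. In the libcurl C API, which the question asks about, the same option is CURLOPT_ACCEPT_ENCODING (named CURLOPT_ENCODING before libcurl 7.21.6), and it only affects decoding of the response, not the data you upload. A minimal sketch with a placeholder URL:

    // Sketch: ask the server for a compressed *response* and let libcurl
    // decode it transparently. This does not compress uploaded data.
    #include <curl/curl.h>

    int main() {
        curl_global_init(CURL_GLOBAL_DEFAULT);
        CURL* curl = curl_easy_init();
        curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/bigfile");
        // "" sends an Accept-Encoding header listing all built-in encodings
        // (identity, deflate, gzip) and enables automatic decompression.
        curl_easy_setopt(curl, CURLOPT_ACCEPT_ENCODING, "");
        CURLcode rc = curl_easy_perform(curl);
        curl_easy_cleanup(curl);
        curl_global_cleanup();
        return rc == CURLE_OK ? 0 : 1;
    }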
You need to send your file with zlib compression yourself. And perhaps some modifications are needed on the server side.

Simple compression in c++

We have a C++ MFC application and a C# web service. They communicate over HTTP, and since they exchange text data, compression would help a lot. But for various reasons, we can't use an external library.
We basically need to compress a byte array on one side and decompress it on the other side.
What should we use to compress our data? The best scenario would be if there is something in MFC/win32 api. Or is there some simplistic code with at most LGPL license that we could integrate into our project?
As has already been said, zlib is probably what you are looking for.
There are several options:
the deflate and inflate pair (raw DEFLATE streams)
zlib itself (DEFLATE with a small header and checksum)
LZO (a separate library)
The simplest is probably LZO (I advise passing the uncompressed size along on the side; see the sketch below), but zlib isn't very complicated either, and its compression level can be parameterized (a speed/size trade-off), which can be a plus depending on your constraints.
For XML data (since you were speaking of web services), LZO gave me a ~4x compression factor.
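To illustrate "passing the uncompressed size on the side" (the same trick helps zlib's one-shot uncompress(), which needs the destination size up front), here is a sketch with an assumed framing of a 4-byte length prefix. The prefix is a convention both ends must agree on, not part of zlib, and a real protocol would also pin the byte order.

    // Sketch: prepend the uncompressed size so the receiver can size its
    // buffer before inflating. Host byte order is used here for brevity.
    #include <zlib.h>
    #include <cstdint>
    #include <cstring>
    #include <vector>

    std::vector<uint8_t> pack(const std::vector<uint8_t>& raw) {
        uLongf len = compressBound((uLong)raw.size());
        std::vector<uint8_t> out(4 + len);
        uint32_t n = (uint32_t)raw.size();
        std::memcpy(out.data(), &n, 4);                 // length prefix
        compress(out.data() + 4, &len, raw.data(), (uLong)raw.size());
        out.resize(4 + len);                            // trim to actual size
        return out;
    }

    std::vector<uint8_t> unpack(const std::vector<uint8_t>& framed) {
        uint32_t n;
        std::memcpy(&n, framed.data(), 4);              // advertised original size
        std::vector<uint8_t> raw(n);
        uLongf outLen = n;
        uncompress(raw.data(), &outLen, framed.data() + 4,
                   (uLong)(framed.size() - 4));
        return raw;
    }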
Can't you just switch on HTTP compression? http://en.wikipedia.org/wiki/HTTP_compression
zlib has a very liberal license.

Any reasons not to gzip documents delivered via HTTP?

I remember someone telling me that gzipped content is not cached by some browsers. Is this true?
Are there any other reasons why I shouldn't gzip my content (pages, JavaScript and CSS files) with .htaccess?
The other reason is that it obviously increases CPU load, but whether this is a problem depends on your content type and your traffic.
If you are going to use GZip from within .htaccess, be sure to wrap it in a condition so that it is only executed if the mod_gzip module exists; this will make the site/app more portable if you move it to another server.
If you opt to serve GZipped content via .htaccess, the browser will receive compressed content if it supports it, or the normal uncompressed version if it doesn't.
If you are delivering mostly .gz files, then obviously you don't want to gzip them again. Otherwise it's probably a good idea, especially for cacheable content. I have never heard of caches not working with gzipped content.
I think you need to handle both gzipped and non-gzipped data, since IE6 and gzipping do not live together nicely.
Otherwise I can't think of an issue.
If you need to stream the content of a page, or want to use Response.Flush, then you can't use compression/gzip.