How can I compress my .php site with zlib and zpipe.c?

I am having a problem compressing my website. My host (www.one.com) does not support gzip, so I contacted them about other ways to compress my site. First I talked to a support agent who told me that it could not be done, since their server is set up with zlib. So I googled zlib, and there were some tips regarding this problem.
I tried adding:
php_flag zlib.output_compression on
php_value zlib.output_compression_level 5
to my .htaccess file, which caused the browser to show "Internal Server Error".
Then I tried talking to them again, and this time I was pointed to this site: http://www.zlib.net/zlib_how.html
The problem is that I have not come across either zlib or zpipe.c before. I have tried to google them and look around different forums, but I do not get it.
Can anyone give me some pointers on how to get this to work? How do I implement zpipe.c? I am a complete noob here, so I would appreciate a step-by-step guide.
My site is written in PHP.

This is how I did it; I am also using one.com as my hosting company.
I added these lines to my index file:
<?php
// Enable zlib output compression for everything this script outputs;
// must run before any output is sent to the browser.
ini_set("zlib.output_compression", "On");
// -1 means "use zlib's default level"; explicit levels run from 0 to 9.
ini_set("zlib.output_compression_level", "-1");
?>
This added compression to my whole site :) myvotebooth.com
Then I added this to my .htaccess file:
<IfModule mod_deflate.c>
# Also compress static JavaScript, CSS and HTML files
<FilesMatch "\.(js|css|html)$">
SetOutputFilter DEFLATE
</FilesMatch>
</IfModule>
By doing this I get a compressed response; an online compression checker reports:
Web page compressed? Yes
Compression type? gzip
Size, Markup (bytes) 11,329
Size, Compressed (bytes) 3,187
Compression % 71.9
I also tried to compress even more (-2, -3, -4, -5), but one.com only allows -1. (Note that -1 just means "use the default level"; the valid explicit levels are 0-9.)

First I talked to a supporter that told me that it could not be done,
since their server is set up with zlib.
It sounds to me like they don't know what they're talking about. What HTTP server are you using? Apache? Whatever it is, it should support pre-compression of your HTML using either the gzip or "deflate" content encoding. You need to find someone supporting your server who knows what they're talking about.
zlib is a library of compression routines that can compress to either the gzip or "deflate" content encoding.
The reason I keep putting deflate in quotes is that the HTTP standard calls the zlib format "deflate". It is sometimes confused with raw deflate, which is why you should always use gzip for HTTP delivery: there is no ambiguity in that case.
Update:
Since you are using Apache, you can probably get this working without the help of your hosting company. You can put mod_deflate directives in your .htaccess file.
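Here is a minimal sketch of what that could look like (the MIME types listed are just common choices; adjust them to your content):
<IfModule mod_deflate.c>
# Compress text-based responses on the fly before they leave the server
AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript
</IfModule>
Despite the module's name, mod_deflate serves the gzip content encoding by default, which sidesteps the "deflate" ambiguity described above.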

Related

How to enable gzip compression instead of brotli compression on OLS + DA?

I'm using the LiteSpeed plugin for WordPress (from LiteSpeed Technologies) with a LiteSpeed server, and no matter what I do it will not compress JavaScript and CSS; there is no option in the plugin for setting up JS and CSS, by the way. I have already tried other plugins, so I don't think the plugin is the problem; it is something on my server. I tried editing the files /usr/local/lsws/conf/httpd-tuning.conf and /usr/local/directadmin/custombuild/configure/openlitespeed/conf/httpd-tuning.conf.
I changed the compression level, and yet I cannot get this thing to compress. Compression does work, but only for text and HTML.
Even after changing the compressible types in those files from text/*, application/x-javascript, application/xml, application/javascript, image/svg+xml,application/rss+x$
to
text/*, application/x-javascript, application/xml, application/javascript, text/html, text/css, text/plain, text/xml, text/x-js, text/js, text/javascript
it still will not work.
Yes, I restart LiteSpeed every time I make changes, and I can even see from the LiteSpeed server administration that the new types were applied. I believe it might be something else. I saw someone saying he had the same problem and his problem was the wildcard (besides the types), but I don't know what that means.
For OLS, you can enable either gzip or brotli compression; please refer to https://openlitespeed.org/kb/using-gzip-brotli-compression/. Since you are using OLS on DirectAdmin, you cannot modify the value by logging in to the OLS admin console (because of the way DA's implementation splits the conf files); you need to modify the server-level or virtual-host-level config directly and add the directive (brStaticCompressLevel 0) there. By the way, it would be better to change the question title to "How to enable gzip compression instead of brotli compression on OLS + DA" to avoid confusion, since it is not about caching at all.
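As a rough sketch, the tuning block in httpd-tuning.conf could then end up looking something like this (directive names follow the OLS documentation linked above; the values here are assumptions for illustration):
tuning  {
  # gzip on, at a middle-of-the-road level
  enableGzipCompress      1
  gzipCompressLevel       6
  # 0 disables static brotli compression, so gzip is served instead
  brStaticCompressLevel   0
  compressibleTypes       text/*, application/x-javascript, application/javascript, text/css
}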

TYPO3 compression only works on js/css, not on html file itself

I'm trying to optimize the page speed and to figure out what does what.
I've managed to merge and compress the JS/CSS files, so Google PageSpeed doesn't bother me about those anymore.
But I still get the message that I can save some bytes by compressing "http://yourpage.com/", which basically means that the HTML/PHP file itself is not compressed (I think).
Any idea how I can solve this?
Some additional information:
Using TYPO3 6.2.21 with the default .htaccess file enabled.
There is an extension for that: https://typo3.org/extensions/repository/view/sourceopt
It works very well; I use it in many projects.
Apparently I had to remove the '<IfModule mod_filter.c>' condition from my .htaccess file; now it's working. I read that my Apache version might be too old: before Apache 2.3.7, AddOutputFilterByType is a core directive and mod_filter is often not loaded at all, so the '<IfModule mod_filter.c>' guard silently skips a directive that would have worked.
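In other words, a sketch of the change (the block below is simplified from TYPO3's default .htaccess; the MIME types are illustrative):
# Before: if mod_filter is not loaded, the inner guard skips everything
<IfModule mod_deflate.c>
<IfModule mod_filter.c>
AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
</IfModule>
# After: drop the inner guard so the directive also runs on older Apache,
# where AddOutputFilterByType is a core directive
<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>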

Requests issue decoding gzip

I'm trying to pull a large number of text files from a website using the requests package. Some of the files are available outright as text, and others are compressed text files.
tmpHtml = 'https://website.com/csv/pwr/someData.dat.gz'
tmpReq = requests.get(tmpHtml, proxies = proxy_w_auth, auth = (usr, pw))
When I pull the uncompressed files, everything works well; however, when I pull one of the compressed files I get lots of the following:
'\x1f\x8b\x08\x08\xe5\xc6\xd9A\x00\x03someData.dat\x00\xa5\x9d\xcbn\x1c\xb9\x19\x85\xf7\x01\xf2\x0e\xfd\x00Q\xa9X,^j\xa9\xc8\x9a\xb1\x9dX\x16dM\x12/\r\x8c\x0712\x19\x0f\xb2\t\x02\xf4\xc3\xa7\xba\xeeM\x9e\x9f<\xa46s\x93\xf1\r\x8b\xfd\x7fl\x9e\xe2E/\xcfwo\x1eNo\xee^\x1e\xceo\x7f\xfa\xf3\xf9\xe9\xf9\xe3\x9b\x9f\xee_\xce\x9f^\x9e\xdf=\x9d\xef?>\xbe<\xdf\x8d\xff\xba\xfe\xc3\xe9\xe5\xf3\xd3\xc3\xf4\xc3\xbf\x8c\x7f{xy\xf9\xeb\xc3\x87\x87\xc7\x97\xd3\xd3\xf3\xbb\xfb\x87\xf3\xe3\xc3\xcb\xe9\xfe\xed\xdd\xe3\x8f\x0f\xe7\x87\x7f<\xbd{\xbe{y\xf7\xf1qb\xff\xf1\x0f\xeaV\xdfvmk\xce\xf7\xdf~;\xff\xf0\xed\xb7\xd3\xa7\xff~\xf9\xfd\xe6\xe9\xeb\x97\x7f\xfd\xe9\xf4\xc3\xd3\xe9\x97\xef\xff9]\x10\xeaV-\x7f\xec\xdd\xe3\xf9\x87\xf3\xb9W\x8d\xf6\xe7\x1b\xd3\xf4n\xfc\x99\x9e\x7fH\xd3\xba\x90f\x1ak\xce7\xbaQ\xe3\x8f:_\x06\xd31ldu\xe3_tq\xc3z\x91\xd5\xdfvC\x19\xcb\x84,\xdd\xb8\x11\xa6\x9a\xce\x8c?+m\x99\ri\xf6\xc2\xb9i\xc7\xa6\xd9[\xdd\x96\xc1\\\x003vn\xda\xf8\x83\xd2\xa7\xf4\x12\xca\x17?\xe2\x10u\xd8\xe5\xf9\xc6\xa7\x1c\x8a\x1fP\xb5
I can see the file name at the beginning of the string that is returned, but I'm not sure how I can actually extract the content. According to the requests documentation, it should automatically decompress gz files?
http://requests.readthedocs.org/en/latest/community/faq/
The request headers include gzip as well:
{'Accept': '*/*', 'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'User-Agent': 'python-requests/2.7.0 CPython/2.7.10 Windows/7'}
Any suggestions would be much appreciated.
Sometimes web clients ask the server to compress a file before sending it (that is the Accept-Encoding: gzip header above). Not .gz files, mind you, since you wouldn't compress something twice. This cuts down the transfer size, especially for large text files, and the client then decompresses the response automatically. That transport-level compression is what the requests docs in your question describe, and you do not have to worry about it for your use case: your payload is a file that is itself a gzip archive, and requests will not unpack that for you.
To decompress a gzipped file, you have to either decompress it in memory using the gzip module (part of the standard library) or write it to disk in 'wb' mode and use the gzip utility.
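A minimal in-memory sketch; the URL is the one from your question, and the proxy and credential values are placeholders for the variables you already have:
import gzip
import io

import requests

proxy_w_auth = {'https': 'http://user:pass@proxyhost:8080'}  # placeholder
usr, pw = 'username', 'password'                             # placeholders

url = 'https://website.com/csv/pwr/someData.dat.gz'
resp = requests.get(url, proxies=proxy_w_auth, auth=(usr, pw))

# resp.content holds the raw bytes of the .gz file; wrap them in a
# file-like object and let gzip decompress them in memory
with gzip.GzipFile(fileobj=io.BytesIO(resp.content)) as gz:
    data = gz.read()  # the decompressed contents of someData.dat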

libcurl inbuilt compression support or not

I found an answer which told me that libcurl doesn't support compression.
libcurl (linux, C) - Is there a built-in way to POST/PUT a gzipped request?
But, somewhere else, I also found this:
Is there any compression available in libcurl
My question is, do I still have to compress strings on my own and then send them, or is that not necessary using libcurl?
Yes, if you send data you must compress it yourself before sending it. There is no support for doing that "automatically" in, for example, HTTP (neither 1.1 nor HTTP/2).
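A minimal sketch of doing it yourself, assuming you compress with zlib and POST the result with libcurl; the URL and payload are placeholders and error handling is trimmed for brevity:
#include <string.h>
#include <zlib.h>
#include <curl/curl.h>

int main(void)
{
    char body[] = "payload to send";   /* placeholder payload */
    unsigned char gz[4096];            /* assumes the output fits */

    /* gzip-compress the buffer; windowBits = 15 + 16 selects the
       gzip container instead of the bare zlib one */
    z_stream zs;
    memset(&zs, 0, sizeof(zs));
    deflateInit2(&zs, Z_DEFAULT_COMPRESSION, Z_DEFLATED, 15 + 16,
                 8, Z_DEFAULT_STRATEGY);
    zs.next_in   = (unsigned char *)body;
    zs.avail_in  = (uInt)strlen(body);
    zs.next_out  = gz;
    zs.avail_out = sizeof(gz);
    deflate(&zs, Z_FINISH);
    size_t gz_len = zs.total_out;
    deflateEnd(&zs);

    /* POST the compressed bytes, telling the server how they are encoded */
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    struct curl_slist *hdrs = curl_slist_append(NULL, "Content-Encoding: gzip");
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/upload");
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, (char *)gz);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDSIZE, (long)gz_len);
    curl_easy_perform(curl);
    curl_slist_free_all(hdrs);
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
}
Note that Content-Encoding on a request body is only honored if the receiving server is set up to accept gzip-encoded uploads.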

Any reasons not to gzip documents delivered via HTTP?

I remember someone telling me that gzipped content is not cached by some browsers. Is this true?
Are there any other reasons why I shouldn't gzip my content (pages, JavaScript and CSS files) via .htaccess?
The other reason is that it obviously increases CPU load, but whether this is a problem depends on your content type and your traffic.
If you are going to enable gzip from .htaccess, be sure to wrap it in a condition so it is only executed if the compression module exists; this will make the site/app more portable if you move it to another server.
With gzip enabled via .htaccess, the browser will receive compressed content if it supports it, and the normal uncompressed version if it doesn't.
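A sketch of that guard, using mod_deflate (the Apache 2.x successor to mod_gzip); the listed types are illustrative:
<IfModule mod_deflate.c>
# These directives only run when the module is loaded, so the same
# .htaccess keeps working on servers without it
AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>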
If you are delivering mostly .gz files, then obviously you don't want to gzip them. Otherwise it's probably a good idea, especially for cache-able content. I have never heard of caches not working with gzipped content.
I think you need to handle both gzipped and non-gzipped data, since IE6 and gzipping do not live together nicely.
Otherwise I can't think of an issue.
If you need to stream the content of a page, or want to use Response.Flush, then you can't use compression/gzip.