I'm having an issue with Cloudflare at the moment. When opening multiple image links at once (Ctrl+click), some pages return a 520. http://art.hespen.net (image links are, for example, 'Most popular' or 'Latest Wallpapers'). Opening around five links in new tabs at once triggers it.
I don't think the IP whitelist is the issue, as I allow everything (DigitalOcean server):
<Directory "/dir">
Options FollowSymLinks
AllowOverride All
Order allow,deny
Allow from all
</Directory>
The docs tell me to make a curl request, which always gave me a positive response.
I also captured a HAR file of the 520 result: https://pastebin.com/uRDURENP
I don't know where to look anymore. Any ideas?
The issue was Imagick. I used Imagick to determine the resolution of the wallpaper on the image page, which for some reason caused a 520 error on some occasions.
Switching to the PHP function getimagesize() seems to have resolved the issue.
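For anyone hitting the same thing, the change is small; a minimal PHP sketch (assuming $path holds the wallpaper's file path):

// Before: Imagick loaded the image just to read its dimensions
// $img = new Imagick($path);
// $width  = $img->getImageWidth();
// $height = $img->getImageHeight();

// After: getimagesize() only reads the image header
$size = getimagesize($path);  // [width, height, type, attr] or false on failure
if ($size !== false) {
    list($width, $height) = $size;
}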
I have followed several instructions here about editing the htaccess.conf file, plus other suggestions that come up when searching for how to add expires headers on this hosting setup (Bitnami/Lightsail/AWS). But nothing seems to make a difference: GTMetrix doesn't see the expires headers in its PageSpeed or YSlow reports.
I'm using current versions of Joomla and RocketTheme's Gantry 5 Myriad theme. I am using RokBoost, have the Page Cache plugin enabled, and my System Cache settings are Cache Handler: file, Path to Cache Folder: blank, Cache Time: 15, Platform Specific Caching: No, System Cache: Off.
Can anyone tell me how to get the expire headers working?
Thanks for any help you can give.
Bitnami Engineer here,
Depending on the results from the GTMetrix site, you will need to add different ExpiresByType lines to the htaccess.conf file. For example, if you want to set an expiry for .jpg files, you will need to add something similar to this:
<Directory "/opt/bitnami/apps/joomla/htdocs">
## EXPIRES CACHING ##
<IfModule mod_expires.c>
ExpiresActive On
# .jpg files are served with the MIME type image/jpeg
ExpiresByType image/jpeg "access plus 1 year"
</IfModule>
## EXPIRES CACHING ##
</Directory>
You will need to restart Apache after that:
sudo /opt/bitnami/ctlscript.sh restart apache
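You can then verify that the header is actually being sent by requesting one of your images and inspecting the response headers (the image path here is hypothetical):

curl -I http://www.example.com/images/sample.jpg
# mod_expires adds both of these:
# Expires: <a date about one year from the time of the request>
# Cache-Control: max-age=31536000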
Please note that you can't set expires headers on elements you don't own. You can expire the .jpg images that live on your server, but if you include images or any other elements from another site, there is nothing you can do to set their cache expiry.
We were chasing the wrong result. I was searching for a solution too, but here is what I found after reading the GTMetrix test results both with and without the expires-header code:
The expires-header code is working, but only for internal files.
It does not work for external files (those from other websites, such as the Google Analytics JS).
The test result will then show only the external files.
If you remove those lines of code, the result gets worse, with even more files lacking an expiry age.
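For reference, a broader version of the block shown above, covering the usual static types, might look like this (the lifetimes are illustrative, and as noted it only affects files served by your own Apache):

<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/jpeg "access plus 1 year"
ExpiresByType image/png "access plus 1 year"
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
</IfModule>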
I'm running a Django website and using Let's Encrypt for my SSL. The framework is configured such that I can't allow access at: http://url.com/.xxxx
What I can allow free access to is:
http://url.com/static/.xxxx
My /static/ URL can accept and host any random files Let's Encrypt needs. Is there a way to have certbot use /static/ instead of just / as the base URL?
Thanks
EDIT
I've found a workaround that is acceptable for me. Digging further, I found that /.well-known/ is always the base directory for the SSL check. That means we can add a static directory that works nicely with certbot. Here's how. First, add this to your Apache config:
Alias /.well-known/ /var/www/XXXXX/website/static/.well-known/
<Directory /var/www/XXXXX/website/static/.well-known/>
Require all granted
</Directory>
Then add this into your settings.py file:
STATIC_ENCRYPT_URL = '/.well-known/'
STATIC_ENCRYPT_ROOT = '/var/www/XXXXX/website/static/'
Add this to your urls.py (with the imports it needs):
from django.conf import settings
from django.conf.urls.static import static

urlpatterns = [
...
] + static(settings.STATIC_ENCRYPT_URL, document_root=settings.STATIC_ENCRYPT_ROOT)
Restart your webserver. Now you have a special URL /.well-known/ that will host any file certbot requires.
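With that in place, certbot's webroot plugin can drop its challenge files under the static directory; something along these lines, reusing the placeholders from above:

sudo certbot certonly --webroot -w /var/www/XXXXX/website/static/ -d url.com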
I'd still like a better answer than this.
In case other users come this way like I did from Google, here's how I improved this situation:
I was unsatisfied by my options when it came to creating ACME challenges for Let's Encrypt when running a Django application. So I rolled my own solution and created a Django app! Basically, you can manage your ACME challenges as just another object, and the app will serve them at the proper endpoint URL.
Simply pip install django-letsencrypt and follow the README to be on your way.
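For reference, the setup is roughly as follows (quoting from memory of the README, so treat the exact app label and URL module as assumptions and check the project page):

# settings.py
INSTALLED_APPS = [
# ...
'letsencrypt',
]

# urls.py (Django 1.x style, matching the project's era)
from django.conf.urls import url, include

urlpatterns = [
# ...
url(r'^\.well-known/', include('letsencrypt.urls')),
]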
EDIT: The cause of my problem was mod_userdir. If your host has mod_userdir enabled, as the Hostgator reseller package does for example (http://support.hostgator.com/articles/specialized-help/technical/apache-htaccess/mod_userdir), make sure your host can disable it. Apparently Hostgator refused to disable it for this particular hosting package.
Recently I received a phishing warning from Google about a file that doesn't exist on my server. It appears to be hosted on my server because I'm on a shared/reseller Apache hosting package. I discovered that I can access any file of any other website hosted on the same server as my site, as long as I know the username of that website's owner.
Meaning I can access
http://mywebsite.com/~somebodyelsesusername/any_path_to_their_files.php
This behavior is undesirable, so I want to deny access to others' websites through my domain using .htaccess.
How can I block every root folder starting with ~, for instance mydomain.com/~somefolder/, without knowing what follows the ~? Of course, access to any files or folders inside such a folder has to be blocked as well. I tried
<DirectoryMatch "^\~|\/\~">
Order allow,deny
Deny from all
</DirectoryMatch>
But I guess I'm not doing it right.
The answer below does in fact answer the question, but it doesn't fix my problem due to special circumstances. So I marked it as correct, and I will investigate the issue further.
<DirectoryMatch> can only be used in the server configuration file, or virtual host context, not through .htaccess.
You can possibly block access using mod_rewrite. Make sure the module is enabled, then use the following directives:
RewriteEngine on
RewriteRule ^~ - [F,L]
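With this in place, any request for a path beginning with ~ (for example http://mydomain.com/~somefolder/any_path_to_their_files.php) is answered with 403 Forbidden: the F flag returns the Forbidden response and the L flag stops further rewriting.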
I have a Django application running on Apache with mod_wsgi, and I would like to create a development server on the same machine.
I can reach my website at http://IP_ADD, and I would like to reach the development server at http://IP_ADD:8080 or another port.
But, as you will notice, I would like to prevent access to port 8080 for users who do not enter a predetermined username/password.
How can I achieve such protection? I could allow only certain IP addresses, but that is not a solution.
Another question is about the choice of port. I have chosen port 8080, but I will also set up an issue tracking system, SVN, etc., and I am not sure which ports I should open for them.
Thank you
For each of the sites you want to host, you could create a separate Apache site with a VirtualHost file along the following lines:
<VirtualHost *:8080>
    # Your name (if available)
    ServerName www.example.com:8080
    # Your IP
    ServerAlias 12.23.34.45
    # Your folder
    DocumentRoot /var/www/mydjangoapp
    <Directory /var/www/mydjangoapp>
        Order deny,allow
        Deny from all
        Allow from 127
        AuthName "Restricted area"
        AuthType Basic
        # Allowed users file
        AuthUserFile /etc/apache2/users_mydjangoapp
        Require valid-user
        # Let either condition suffice: local address or valid credentials
        Satisfy any
    </Directory>
</VirtualHost>
The user file itself can be generated with Apache's htpasswd utility. For each site, you can add a separate user file containing the access for that part of your system. For IP-based access, just add lines like Allow from 123.123.123.123 below the Allow from 127 line.
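For example, to create the file and add a first user (the username is a placeholder; -c creates the file, so omit it when adding further users):

sudo htpasswd -c /etc/apache2/users_mydjangoapp devuser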
Finally, additional sites can be created by adding more of these Apache site files (see for example here for more details). Just adapt the port (8080 in my example) to the one you want to host each additional site under.
You can also add basic authentication at the application level:
http://djangosnippets.org/snippets/1304/
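In case the snippet link goes stale, here is a minimal sketch of the idea (my own illustration rather than the linked snippet, with placeholder credentials):

import base64
from functools import wraps

from django.http import HttpResponse

def basic_auth_required(view_func, username='devuser', password='devpass'):
    """Wrap a view so it demands HTTP Basic credentials."""
    @wraps(view_func)
    def wrapper(request, *args, **kwargs):
        auth = request.META.get('HTTP_AUTHORIZATION', '')
        if auth.startswith('Basic '):
            decoded = base64.b64decode(auth.split(' ', 1)[1]).decode('utf-8')
            user, _, pwd = decoded.partition(':')
            if user == username and pwd == password:
                return view_func(request, *args, **kwargs)
        # No or bad credentials: challenge the browser
        response = HttpResponse('Unauthorized', status=401)
        response['WWW-Authenticate'] = 'Basic realm="Development"'
        return response
    return wrapper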
I have a Django web application running on Apache 2.2.14, and I want to run the admin application over HTTPS.
Having read considerable discussion about using a proxy, writing middleware, and running alternative WSGI scripts, the chaps in #httpd came to my rescue. The solution is so simple that I was surprised I didn't find it online, so I'm curious to see whether I've made some glaring assumptions or errors.
One complication was that I also wanted to run one of the Django apps on the site over https, namely everything under /checkout.
Essentially, if a user requests a URI starting with /admin or /checkout on http, they are to be redirected to that URI but on https. Conversely, if a user requests a URI that does not start with /admin or /checkout on https, they are to be redirected to that URI but on http.
The key to solving this problem was to use Redirect and RedirectMatch directives in my VirtualHost configuration.
<VirtualHost *:80>
... host config stuff ...
Redirect /admin https://www.mywebsite.com/admin
Redirect /checkout https://www.mywebsite.com/checkout
</VirtualHost>
<VirtualHost *:443>
... ssl host config stuff ...
RedirectMatch ^(/(?!admin|checkout).*) http://www.mywebsite.com$1
</VirtualHost>
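The negative lookahead (?!admin|checkout) makes the RedirectMatch pattern match every path except those beginning with /admin or /checkout, so only the non-secure pages get bounced back to plain HTTP.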
Another approach is to use the @secure_required decorator. This automatically rewrites the requested URL and redirects to the https://... version, so you don't need the Redirect directives in the *:80 configuration. The *:443 configuration may still be required for performance reasons if you want the rest of the traffic to go over plain HTTP.
I tried your solution but ran into several problems. First, the formatting on the admin site disappeared, as if it could not find the admin static files. Second, if I tried to reach the non-admin site over https, the browser could not find it and redirected me to a Yahoo search. Oddly, if I edited the Yahoo search URL to remove all text except my correct URL (minus the http://), it would keep searching Yahoo for my site; yet typing the exact same URL afresh took me to my site.
I solved all of these issues by simply removing the
RedirectMatch ^(/(?!admin|checkout).*) http://www.mywebsite.com$1
directive.
I should mention that I don't have a /checkout section on my site and am only trying to secure /admin. ... and yes, I did substitute my URL for "mywebsite.com"
What you described should work, but there may be a problem later if you need to change which paths are and are not HTTPS. Because this method depends on correctly modifying the Apache config file, you do not want novices in the loop: screw up the config file and your site can go 500-error in the blink of an eye.
We chose to have a simple text file listing the must-be-HTTPS paths. Anyone on the project can edit it, and it is checked for correctness when it is loaded. We handle any needed redirects to/from HTTPS in middleware (sketched below) and it seems to work just fine. This method will also work if you are running anything other than Apache.
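A minimal sketch of that middleware, written in the modern Django middleware style (the file name, its format, and all names here are assumptions):

# middleware.py
from django.http import HttpResponsePermanentRedirect

def load_secure_paths(filename='secure_paths.txt'):
    """Read the must-be-HTTPS path prefixes, skipping malformed lines."""
    with open(filename) as fh:
        return [line.strip() for line in fh if line.strip().startswith('/')]

class SecurePathMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response
        self.secure_paths = load_secure_paths()  # loaded and checked once at startup

    def __call__(self, request):
        secure = any(request.path.startswith(p) for p in self.secure_paths)
        host = request.get_host()
        if secure and not request.is_secure():
            return HttpResponsePermanentRedirect(
                'https://%s%s' % (host, request.get_full_path()))
        if not secure and request.is_secure():
            return HttpResponsePermanentRedirect(
                'http://%s%s' % (host, request.get_full_path()))
        return self.get_response(request)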