I'm reading that <Location '/'> actually matches the entire domain, not just the root location. I want to create a Location or LocationMatch block that matches everything but http://my.domain.com/ itself; that is, it should trigger only if any characters follow that final '/'.
Here is how I will be testing this:
<LocationMatch "REGEX GOES HERE">
AuthType Shibboleth
ShibRequireSession On
Require Shibboleth
</LocationMatch>
I think Shibboleth may change some behavior. Also note that I am using Apache 2.2, but a solution that works on 2.4 will suffice as well.
You can use LocationMatch with this regex:
<LocationMatch "^/.">
</LocationMatch>
A single dot after ^/ makes sure there is at least one character after http://my.domain.com/, so the landing page itself does not match.
More details about LocationMatch
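Applied to the auth block from the question, a sketch would look like this (reusing the Shibboleth directives exactly as given; every URL with at least one character after the leading / then requires a session, while the bare / stays open):
<LocationMatch "^/.">
# anything other than the landing page requires a Shibboleth session
AuthType Shibboleth
ShibRequireSession On
Require Shibboleth
</LocationMatch>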
Testing:
Create this directive (on Apache 2.4.8 and later, a named group such as (?<sitename>...) in LocationMatch is written to the environment as MATCH_SITENAME, which is what the RewriteRule below reads):
<LocationMatch "^/(?<sitename>.+)">
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^ http://www.%{HTTP_HOST}%{REQUEST_URI}?u=%{env:MATCH_SITENAME} [L,R=302]
</LocationMatch>
Now to test I am doing this:
curl -kI -A "Chrome" -L 'http://localhost/index.php'
HTTP/1.1 302 Found
Date: Mon, 11 Jul 2016 22:31:37 GMT
Server: Apache/2.4.12 (Unix) OpenSSL/1.0.1j PHP/5.6.9 mod_wsgi/3.5 Python/2.7.9
Location: http://www.localhost/index.php?u=index.php
Content-Type: text/html; charset=iso-8859-1
HTTP/1.1 200 OK
Date: Mon, 11 Jul 2016 22:31:37 GMT
Server: Apache/2.4.12 (Unix) OpenSSL/1.0.1j PHP/5.6.9 mod_wsgi/3.5 Python/2.7.9
X-Powered-By: PHP/5.6.9
Content-Type: text/html; charset=UTF-8
curl -kI -A "Chrome" -L 'http://localhost/user.php'
HTTP/1.1 302 Found
Date: Mon, 11 Jul 2016 22:33:57 GMT
Server: Apache/2.4.12 (Unix) OpenSSL/1.0.1j PHP/5.6.9 mod_wsgi/3.5 Python/2.7.9
Location: http://www.localhost/user.php?u=user.php
Content-Type: text/html; charset=iso-8859-1
HTTP/1.1 200 OK
Date: Mon, 11 Jul 2016 22:33:57 GMT
Server: Apache/2.4.12 (Unix) OpenSSL/1.0.1j PHP/5.6.9 mod_wsgi/3.5 Python/2.7.9
X-Powered-By: PHP/5.6.9
Content-Type: text/html; charset=UTF-8
curl -kI -A "Chrome" -L 'http://localhost'
HTTP/1.1 200 OK
Date: Mon, 11 Jul 2016 22:32:47 GMT
Server: Apache/2.4.12 (Unix) OpenSSL/1.0.1j PHP/5.6.9 mod_wsgi/3.5 Python/2.7.9
X-Powered-By: PHP/5.6.9
Content-Type: text/html; charset=UTF-8
You can clearly see that the www redirection doesn't happen when I request the landing page, but does happen when I request /index.php or /user.php.
I have a normal WordPress website on which I am trying to block requests to wp-json. I am aware that such requests go through WordPress core. My request looks like this:
[root@SV-CentOS-01 ~]# curl -i https://www.website.com/wp-json/wp/v2/users/1
When I add RewriteRule ^wp-json.*$ - [L,R=404] at the top of my .htaccess I get a 404 response status, yet the API still returns the users of my website. Is there actually a way to achieve what I want via .htaccess, or do we need to do it the WordPress way?
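For reference, this is roughly how that rule sits at the top of .htaccess, ahead of the stock WordPress rewrite block (the WordPress block shown here is the standard one and is assumed, not copied from the actual site):
# attempted block for the REST API routes
RewriteEngine On
RewriteRule ^wp-json.*$ - [L,R=404]
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress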
Example:
[root@SV-CentOS-01 ~]# curl -i https://www.website.com/wp-json/wp/v2/users/1
HTTP/1.1 404 Not Found
Date: Mon, 20 Sep 2021 14:14:13 GMT
Server: Apache
Vary: Accept-Encoding,Cookie,Origin
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate
Pragma: no-cache
X-Robots-Tag: noindex
Link: <https://www.website.com/wp-json/>; rel="https://api.w.org/"
X-Content-Type-Options: nosniff
Access-Control-Expose-Headers: X-WP-Total, X-WP-TotalPages, Link
Access-Control-Allow-Headers: Authorization, X-WP-Nonce, Content-Disposition, Content-MD5, Content-Type
Allow: GET
Set-Cookie: PHPSESSID=5c07eaa455457ca0ef4b358d016c3b8d; path=/
Upgrade: h2,h2c
Connection: Upgrade
Transfer-Encoding: chunked
Content-Type: application/json; charset=UTF-8
{"id":1,"name":"User One","url":"","description":"","link":"https:\/\/www.website.com\/author\/admin\/","slug":"admin","meta":[],"_links":{"self":[{"href":"https:\/\/www.website.com\/wp-json\/wp\/v2\/users\/1"}],"collection":[{"href":"https:\/\/www.website.com\/wp-json\/wp\/v2\/users"}]}}[root#SV-CentOS-01 ~]#
I want to allow access only from 127.0.0.1/localhost/0.0.0.0. I tried with Access Control, and also with this in .htaccess:
Order deny,allow
Deny from all
Allow from 127.0.0.1
It doesn't work!
That deny allow rule won't work in OpenLiteSpeed.
For access control, make sure the server-level access control list is empty, because a vhost-level list won't override it.
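In the plain-text vhost config this is an accessControl block, roughly like the sketch below; these entries correspond to the Allowed List / Denied List fields in the WebAdmin console, and the exact keys and accepted values should be checked against your OpenLiteSpeed version:
accessControl  {
  # allow only local clients
  allow                   127.0.0.1, ::1
  # deny everyone else
  deny                    *
}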
[root@test ~]# cat /etc/hosts
127.0.0.1 mask.domain
[root@test ~]# curl -I -XGET http://mask.domain
HTTP/1.1 200 OK
Etag: "5-5d42a8ce-e18f0;;;"
Last-Modified: Thu, 01 Aug 2019 08:54:38 GMT
Content-Type: text/html
Content-Length: 5
Accept-Ranges: bytes
Date: Thu, 01 Aug 2019 08:58:50 GMT
Server: LiteSpeed
Connection: Keep-Alive
[root@test ~]# echo "123.456.789.000 mask.domain" > /etc/hosts
[root@test ~]# curl -I -XGET http://mask.domain
HTTP/1.1 403 Forbidden
Content-Type: text/html
Cache-Control: private, no-cache, max-age=0
Pragma: no-cache
Content-Length: 1139
Date: Thu, 01 Aug 2019 08:59:14 GMT
Server: LiteSpeed
Connection: Keep-Alive
The access control works for me: when I use the hosts file to point the domain to 127.0.0.1, the request returns 200 OK, and when I point it at a public IP it gets a 403 error.
Alternative way:
Use a rewrite rule, like this:
RewriteEngine On
RewriteCond %{REMOTE_HOST} !^127\.0\.0\.1
RewriteRule .* - [F]
If you are going to use a rewrite rule, make sure you restart OpenLiteSpeed after changing the rules.
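After the restart, the same hosts-file trick shown above can confirm the rule: a request arriving from 127.0.0.1 should still get 200 OK, while a request arriving from any other address should now get 403 Forbidden, e.g.:
curl -I -XGET http://mask.domain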
Best regards,
Just for a test, I would like to use fail2ban to block all traffic to my website that does not come from an Android browser.
This is the string in the log file:
GET http://www.aaaaa.com/video/09_12_2014_spot_app.mp4 - ORIGINAL_DST/171.171.171.171 video/mp4
[
User-Agent: stagefright/1.2 (Linux;Android 5.0)
Cookie: _gat=1; _ga=GA1.2.909922659.1455111791
Range: bytes=705201-
Connection: Keep-Alive
Accept-Encoding: gzip
Host: www.aaaaa.com
]
[
HTTP/1.1 206 Partial Content
Date: Thu, 26 May 2016 15:27:16 GMT
Server: Apache/2.2.15 (CentOS)
Last-Modified: Tue, 09 Dec 2014 19:55:17 GMT
ETag: "2b739f-ec2b1-509cdec1610e2"
Accept-Ranges: bytes
Content-Length: 262144
Content-Range: bytes 705201-967344/967345
Connection: close
Content-Type: video/mp4
]
Any help? Thank you in advance!
I was writing code to download files over HTTP using Winsock in C++, and to get the file details I sent a HEAD request. This is what I actually did:
HEAD /files/ODBC%20Programming%20in%20C%2B%2B.pdf HTTP/1.0
Host: devmentor-unittest.googlecode.com
Response was:
HTTP/1.0 404 Not Found
Content-Type: text/html; charset=UTF-8
Set-Cookie: PREF=ID=feeed8106df5e5f1:TM=1370157208:LM=1370157208:S=10bN4nrXqkcCDN5n; expires=Tue, 02-Jun-2015 07:13:28 GMT; path=/; domain=devmentor-unittest.googlecode.com
X-Content-Type-Options: nosniff
Date: Sun, 02 Jun 2013 07:13:28 GMT
Server: codesite_downloads
Content-Length: 974
X-XSS-Protection: 1; mode=block
X-Frame-Options: SAMEORIGIN
But if I do:
GET /files/ODBC%20Programming%20in%20C%2B%2B.pdf HTTP/1.0
Host: devmentor-unittest.googlecode.com
The file successfully gets downloaded.
Then, after the download, if I fire the HEAD request again, it also brings up the following:
HTTP/1.0 200 OK
Content-Length: 320381
Content-Type: application/pdf
Content-Disposition: attachment; filename="ODBC Programming in C++.pdf"
Accept-Ranges: bytes
Date: Sun, 02 Jun 2013 05:47:11 GMT
Last-Modified: Sun, 11 Nov 2007 03:17:59 GMT
Expires: Sun, 09 Jun 2013 05:47:11 GMT
Cache-Control: public, max-age=604800
Server: DFE/largefile
//something like this.....
Question: why does HEAD initially return a false "Not Found" while the file downloads fine with GET, and why does HEAD also return the details I need after the download? Where have I gone wrong?
The file I am trying to download here is "http://devmentor-unittest.googlecode.com/files/ODBC%20Programming%20in%20C%2B%2B.pdf" (just for example)
The problem is not on your end. Google Code simply does not implement HEAD correctly. This was reported 5 years ago and is still an open issue:
Issue 660: support HTTP HEAD method for file download urls
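You can also reproduce the discrepancy outside your Winsock code with curl, assuming the file is still hosted there (-I sends a HEAD request, while the second command is a plain GET that saves the file):
curl -I 'http://devmentor-unittest.googlecode.com/files/ODBC%20Programming%20in%20C%2B%2B.pdf'
curl -o odbc.pdf 'http://devmentor-unittest.googlecode.com/files/ODBC%20Programming%20in%20C%2B%2B.pdf'
The first command should show the bogus 404, while the second downloads the PDF just as your GET request does.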
I have a web request that goes through a proxy (forward proxy) to a website, let's say www.example.com. This website sits behind a reverse proxy (Squid), and as a result all my requests are returned as MISS.
Is there any way to use the forward proxy and still retrieve the data?
Please understand that I am a newbie in this area.
After reading for a while I set no caching, and this is what I got.
This is Request
Command: GET
URI: http://www.example.com
ProtocolVersion: HTTP/1.1
UserAgent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5
Referer: http://www.example.com
Accept: */*
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Accept-Encoding: gzip,deflate
Accept-Language: en-us,en;q=0.5
Keep-Alive: 115
X-Requested-With: XMLHttpRequest
X-Prototype-Version: 1.7
Expires: Sun, 19 Nov 1978 05:00:00 GMT
Host: www.example.com
Cookie:
PHPSESSID: 249477191de6048739cd8690fabd6060
UTG: A-3345389704b26912f6d5422.73487509-0a3a0a26a100a113a119a24a1a4a77a7a6a
addOtr: 7L4L2
CLIENT_WIDTH: 1916
MAIN_WIDTH: 1726
Cache-Control: no-store,no-cache
Pragma: no-cache
ProxyConnection: Keep-Alive
HeaderEnd: CRLF
This is Response
ProtocolVersion: HTTP/1.0
StatusCode: 407, Proxy authentication required
Reason: Proxy Authentication Required
Server: squid/3.0.STABLE19
Mime-Version: 1.0
Date: Mon, 31 Jan 2011 19:04:44 GMT
ContentType: text/html
ContentLength: 2986
X-Squid-Error: ERR_CACHE_ACCESS_DENIED 0
ProxyAuthenticate: Basic realm="Anonymous proxy"
Authenticate: Basic realm="Anonymous proxy"
X-Cache: MISS from funky
X-Cache-Lookup: NONE from funky:2448
Via: 1.0 funky (squid/3.0.STABLE19)
ProxyConnection: close
HeaderEnd: CRLF
Thanks in advance
You need to set your Expires header to a date in the future.
Try changing your Expires, Cache-Control, and Pragma to look something like this:
Cache-Control:max-age=300
Date:Fri, 04 Feb 2011 04:52:58 GMT
Expires:Fri, 04 Feb 2011 04:57:58 GMT
...
(Remove Pragma entirely. You can do this by editing your .htaccess file.)
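A minimal .htaccess sketch that produces headers along those lines, assuming mod_expires and mod_headers are available on the origin server:
<IfModule mod_expires.c>
# sets both Expires and Cache-Control: max-age=300
ExpiresActive On
ExpiresDefault "access plus 5 minutes"
</IfModule>
<IfModule mod_headers.c>
# drop the Pragma: no-cache header added by the application
Header unset Pragma
</IfModule>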