fail2ban regex rule

Just for a test, I would like to block all traffic to my website that does not come from an Android browser, using fail2ban.
This is the entry in the log file:
GET http://www.aaaaa.com/video/09_12_2014_spot_app.mp4 - ORIGINAL_DST/171.171.171.171 video/mp4
[
User-Agent: stagefright/1.2 (Linux;Android 5.0)
Cookie: _gat=1; _ga=GA1.2.909922659.1455111791
Range: bytes=705201-
Connection: Keep-Alive
Accept-Encoding: gzip
Host: www.aaaaa.com
]
[
HTTP/1.1 206 Partial Content
Date: Thu, 26 May 2016 15:27:16 GMT
Server: Apache/2.2.15 (CentOS)
Last-Modified: Tue, 09 Dec 2014 19:55:17 GMT
ETag: "2b739f-ec2b1-509cdec1610e2"
Accept-Ranges: bytes
Content-Length: 262144
Content-Range: bytes 705201-967344/967345
Connection: close
Content-Type: video/mp4
]
Any help? Thank you in advance!
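One caveat first: fail2ban matches a single log line at a time and needs the client IP on that same line, so the multi-line excerpt above would first have to be collapsed into a one-line-per-request format carrying both the client address and the User-Agent. Assuming an Apache combined-style access log where the client IP is the first field and the User-Agent is the last quoted field, an untested minimal filter sketch could look like this (the filter name and log path below are made up):

# /etc/fail2ban/filter.d/non-android.conf (hypothetical filter)
[Definition]
# Ban any client whose final quoted field (the User-Agent) does not
# contain "Android". The negative lookahead is evaluated only at the
# opening quote of the last quoted field on the line.
failregex = ^<HOST> .*"(?!.*Android)[^"]*"$
ignoreregex =

It would be enabled with a jail entry along these lines (paths and limits are examples):

[non-android]
enabled  = true
filter   = non-android
logpath  = /var/log/httpd/access_log
maxretry = 1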

Related

Getting GSSException: Defective token detected error while calling HDFS API on a kerberised cluster

I have a Kerberised CDH v5.14 cluster with 3 nodes. I am trying to call the HDFS API using Python as below:
import kerberos
import requests

baseurl = "http://<host_name>:50070/webhdfs/v1/prod/?op=LISTSTATUS"
__, krb_context = kerberos.authGSSClientInit("HTTP/<host_name>")
#kerberos.authGSSClientStep(krb_context, "")
negotiate_details = kerberos.authGSSClientResponse(krb_context)
headers = {"Authorization": "Negotiate " + str(negotiate_details)}
r = requests.get(baseurl, headers=headers)
print r.status_code
The error below is returned:
GSSException: Defective token detected (Mechanism level: GSSHeader did not find the right tag)
HTTP ERROR 403
But the same works fine when I run it using curl
curl -i --negotiate -u: http://<host_name>:50070/webhdfs/v1/prod/?op=LISTSTATUS
HTTP/1.1 401 Authentication required
Cache-Control: must-revalidate,no-cache,no-store
Date: Wed, 30 May 2018 02:50:04 GMT
Pragma: no-cache
Date: Wed, 30 May 2018 02:50:04 GMT
Pragma: no-cache
Content-Type: text/html; charset=iso-8859-1
X-FRAME-OPTIONS: SAMEORIGIN
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth=; Path=/; HttpOnly
Content-Length: 1409

HTTP/1.1 200 OK
Cache-Control: no-cache
Expires: Wed, 30 May 2018 02:50:04 GMT
Date: Wed, 30 May 2018 02:50:04 GMT
Pragma: no-cache
Expires: Wed, 30 May 2018 02:50:04 GMT
Date: Wed, 30 May 2018 02:50:04 GMT
Pragma: no-cache
Content-Type: application/json
X-FRAME-OPTIONS: SAMEORIGIN
WWW-Authenticate: Negotiate YGYGCSqGSIb3EgECAgIAb1cwVaADAgEFoQMCAQ+iSTBHoAMCAReiQAQ+6Seu0SSYGmoqN4hdykSQ55ZcP+juBO/jk8/BGjoK5NCmdlBRFPMSbCZXvVjNHLg9iPACGvM8V0jqXTM5UfQ=
Set-Cookie: hadoop.auth="u=XXXX&p=XXXX#HOSTNAME&t=kerberos&e=1527684604664&s=tVsrEsDMBGV0To8hOPp8mLxyiSo="; Path=/; HttpOnly
Transfer-Encoding: chunked
and it gives the correct response. What am I missing here? Any help is appreciated.
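The commented-out authGSSClientStep call is the most likely culprit: without performing a context step, authGSSClientResponse() returns None, so the header sent is the literal string "Negotiate None", which the server rejects as a defective token. A minimal sketch of the corrected flow, assuming the pykerberos package and a valid ticket in the credential cache (note that pykerberos conventionally takes the service in "HTTP@host" form, and <host_name> remains a placeholder):

import kerberos
import requests

host = "<host_name>"  # placeholder from the question
baseurl = "http://" + host + ":50070/webhdfs/v1/prod/?op=LISTSTATUS"

# Initialise the GSSAPI context and run the first negotiation step;
# without the step call there is no token to send.
_, krb_context = kerberos.authGSSClientInit("HTTP@" + host)
kerberos.authGSSClientStep(krb_context, "")
negotiate_details = kerberos.authGSSClientResponse(krb_context)

headers = {"Authorization": "Negotiate " + negotiate_details}
r = requests.get(baseurl, headers=headers)
print(r.status_code)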

iCloud Calendar empty response

I have sent the request below to the iCloud calendar service to get the list of all calendars, but the response is always empty. Can anyone help me?
Request:
GET /10232836851/calendars/F41F7478-4345-4A4A-8CD5-548122EF2C22/ HTTP/1.1
HOST: pxx-caldav.icloud.com
user-agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.874.121 Safari/535.2
authorization: Basic XXXXXXXXXXXXXXXXXXXXXXXX==
depth: 1
content-type: text/xml
Response Header:
Server: AppleHttpServer/a6f3179
Date: Fri, 25 Nov 2016 07:35:53 GMT
Content-Length: 0
Connection: keep-alive
Last-Modified: Wed, 23 Nov 2016 19:32:45 GMT
Dav: 1, access-control, calendar-access, calendar-schedule, calendar-auto-schedule, calendar-managed-attachments, calendarserver-sharing, calendarserver-subscribed, calendarserver-home-sync
Accept-Ranges: bytes
X-Responding-Server: pv41p47ic-tydg09053001 23 a63660a6f7d1a25b5a7ed66dab0da843:44702101
X-Transaction-Id: 55b238a2-548b-41fc-acc0-b697adb4ab84
Cache-Control: private, max-age=0, no-cache
Strict-Transport-Security: max-age=31536000; includeSubDomains
Via: icloudedge:hk02p00ic-ztde010805:7401:16G8:Hong Kong
X-Apple-Request-Uuid: 55b238a2-548b-41fc-acc0-b697adb4ab84
Access-Control-Expose-Headers: X-Apple-Request-UUID; Via
To get the list of calendars on the server using the CalDAV protocol, you use the PROPFIND method, not a GET. This is described well on the SabreDAV "Building a CalDAV Client" web page.
Something like this:
PROPFIND /calendars/johndoe/ HTTP/1.1
Depth: 1
Content-Type: application/xml; charset=utf-8
Host: ...
Authorization: ...

<propfind xmlns="DAV:">
  <prop>
    <displayname />
  </prop>
</propfind>
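For reference, the same PROPFIND can be issued from Python with requests (a sketch only; the URL, account name, and password below are placeholders — iCloud expects an app-specific password, and with Depth: 1 you would list the calendar-home collection rather than a single calendar):

import requests

# Placeholders: substitute your own DSid and credentials.
url = "https://pxx-caldav.icloud.com/10232836851/calendars/"
body = """<propfind xmlns="DAV:">
  <prop><displayname /></prop>
</propfind>"""

r = requests.request(
    "PROPFIND", url,
    headers={"Depth": "1", "Content-Type": "application/xml; charset=utf-8"},
    auth=("user@example.com", "app-specific-password"),
    data=body,
)
print(r.status_code)  # expect 207 Multi-Status
print(r.text)         # one <response> per calendar, with its displayname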

Qt - QNetworkReply not showing Content-Length header

For some download URLs the QNetworkReply object does not contain the Content-Length header and reports the file size as -1. I tested with the following URL:
http://download-cf.jetbrains.com/webide/PhpStorm-EAP-141.332.tar.gz
The headers shown by Live HTTP Headers in Firefox are as follows:
HTTP/1.1 200 OK
Content-Type: application/x-tar
Content-Length: 135144452
Connection: keep-alive
Date: Mon, 30 Mar 2015 17:49:03 GMT
Content-Encoding: gzip
x-amz-meta-s3cmd-attrs: uid:572/gname:cds/uname:cds/gid:574/mode:33188/mtime:1427282503/atime:1427282968/md5:a2ccadce9ae0f356e9c11a6d5dd5a4f0/ctime:1427282503
Last-Modified: Wed, 25 Mar 2015 11:36:03 GMT
Etag: "db9a27ca51b84bac23080028b3e267ef-9"
Accept-Ranges: bytes
Server: AmazonS3
Age: 313
X-Cache: Miss from cloudfront
Via: 1.1 f94856caaa8ad33df4ddf975899fadd2.cloudfront.net (CloudFront)
X-Amz-Cf-Id: GFsaZTTMQ5eQ54JOUzBfJmIHL6AolKkXknb2HAcfbCKsbIYgdJng_Q==
And when I do the following:
qDebug() << reply->rawHeaderList();
The output is:
("Content-Type", "Connection", "Date", "Content-Encoding",
"x-amz-meta-s3cmd-attrs", "Last-Modified",
"ETag", "Accept-Ranges", "Server", "Age", "X-Cache",
"Via", "X-Amz-Cf-Id")
Clearly, Content-Length is missing. So, is there any solution for this?
I have logged a bug report for this. It can be tracked at the following URL:
https://bugreports.qt.io/browse/QTBUG-45322
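A likely explanation, given the Firefox headers above: the server answers with Content-Encoding: gzip, and QNetworkAccessManager decompresses such replies transparently, dropping Content-Length because it would no longer match the decompressed payload. A common workaround is to ask the server for an unencoded response. A sketch, written in PyQt5 for brevity (the equivalent C++ calls are the same):

from PyQt5.QtCore import QCoreApplication, QUrl
from PyQt5.QtNetwork import QNetworkAccessManager, QNetworkRequest

app = QCoreApplication([])
manager = QNetworkAccessManager()

url = "http://download-cf.jetbrains.com/webide/PhpStorm-EAP-141.332.tar.gz"
request = QNetworkRequest(QUrl(url))
# "identity" asks the server not to gzip the body, so Qt's transparent
# decompression never kicks in and Content-Length should survive.
request.setRawHeader(b"Accept-Encoding", b"identity")

reply = manager.head(request)  # HEAD is enough to inspect the headers

def done():
    print(reply.header(QNetworkRequest.ContentLengthHeader))
    app.quit()

reply.finished.connect(done)
app.exec_()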

Writing proper "HEAD" and "GET" requests in Winsock C++

I was writing code to download files over HTTP using Winsock in C++, and to get the file details I sent a "HEAD" request.
(This is what I actually did:)
HEAD /files/ODBC%20Programming%20in%20C%2B%2B.pdf HTTP/1.0
Host: devmentor-unittest.googlecode.com
Response was:
HTTP/1.0 404 Not Found
Content-Type: text/html; charset=UTF-8
Set-Cookie: PREF=ID=feeed8106df5e5f1:TM=1370157208:LM=1370157208:S=10bN4nrXqkcCDN5n; expires=Tue, 02-Jun-2015 07:13:28 GMT; path=/; domain=devmentor-unittest.googlecode.com
X-Content-Type-Options: nosniff
Date: Sun, 02 Jun 2013 07:13:28 GMT
Server: codesite_downloads
Content-Length: 974
X-XSS-Protection: 1; mode=block
X-Frame-Options: SAMEORIGIN
But if I do:
GET /files/ODBC%20Programming%20in%20C%2B%2B.pdf HTTP/1.0
Host: devmentor-unittest.googlecode.com
The file successfully gets downloaded.
Then, after the download, if I send the HEAD request again, it brings up the following:
HTTP/1.0 200 OK
Content-Length: 320381
Content-Type: application/pdf
Content-Disposition: attachment; filename="ODBC Programming in C++.pdf"
Accept-Ranges: bytes
Date: Sun, 02 Jun 2013 05:47:11 GMT
Last-Modified: Sun, 11 Nov 2007 03:17:59 GMT
Expires: Sun, 09 Jun 2013 05:47:11 GMT
Cache-Control: public, max-age=604800
Server: DFE/largefile
//something like this.....
Question: why does "HEAD" return a false "not found" error at first, while the file downloads fine with "GET", and why does "HEAD" also return what I need after the download? Where have I gone wrong?
The file I am trying to download here is "http://devmentor-unittest.googlecode.com/files/ODBC%20Programming%20in%20C%2B%2B.pdf" (just as an example).
The problem is not on your end. Google Code simply does not implement HEAD correctly. This was reported 5 years ago and is still an open issue:
Issue 660: support HTTP HEAD method for file download urls
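If you want to rule out your Winsock code entirely, the exchange is easy to reproduce with a few lines of plain sockets (Python here for brevity; the host and path are the ones from the question, and the send/recv sequence mirrors the Winsock flow):

import socket

host = "devmentor-unittest.googlecode.com"
request = ("HEAD /files/ODBC%20Programming%20in%20C%2B%2B.pdf HTTP/1.0\r\n"
           "Host: " + host + "\r\n"
           "\r\n")

with socket.create_connection((host, 80)) as s:
    s.sendall(request.encode("ascii"))
    response = b""
    while True:
        chunk = s.recv(4096)
        if not chunk:
            break
        response += chunk

print(response.decode("iso-8859-1"))  # status line and headers only; HEAD has no body

Getting the same spurious 404 from this confirms the problem is server-side.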

Problem with Squid - Proxy authentication required

I have a web request that goes through a forward proxy to a website, let's say www.example.com. This website sits behind a reverse proxy (Squid), and as a result all my requests are returned as MISS.
Is there any way to use the forward proxy and retrieve the data?
Please understand that I am a newbie in this area.
After reading for a while I disabled caching, and this is what I got.
This is Request
Command: GET
URI: http://www.example.com
ProtocolVersion: HTTP/1.1
UserAgent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5
Referer: http://www.example.com
Accept: */*
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Accept-Encoding: gzip,deflate
Accept-Language: en-us,en;q=0.5
Keep-Alive: 115
X-Requested-With: XMLHttpRequest
X-Prototype-Version: 1.7
Expires: Sun, 19 Nov 1978 05:00:00 GMT
Host: www.example.com
Cookie:
PHPSESSID: 249477191de6048739cd8690fabd6060
UTG: A-3345389704b26912f6d5422.73487509-0a3a0a26a100a113a119a24a1a4a77a7a6a
addOtr: 7L4L2
CLIENT_WIDTH: 1916
MAIN_WIDTH: 1726
Cache-Control: no-store,no-cache
Pragma: no-cache
ProxyConnection: Keep-Alive
HeaderEnd: CRLF
This is Response
ProtocolVersion: HTTP/1.0
StatusCode: 407, Proxy authentication required
Reason: Proxy Authentication Required
Server: squid/3.0.STABLE19
Mime-Version: 1.0
Date: Mon, 31 Jan 2011 19:04:44 GMT
ContentType: text/html
ContentLength: 2986
X-Squid-Error: ERR_CACHE_ACCESS_DENIED 0
ProxyAuthenticate: Basic realm="Anonymous proxy"
Authenticate: Basic realm="Anonymous proxy"
X-Cache: MISS from funky
X-Cache-Lookup: NONE from funky:2448
Via: 1.0 funky (squid/3.0.STABLE19)
ProxyConnection: close
HeaderEnd: CRLF
Thanks in advance
You need to set your Expires header to a date in the future.
Try changing your Expires, Cache-Control, and Pragma headers to look something like this:
Cache-Control:max-age=300
Date:Fri, 04 Feb 2011 04:52:58 GMT
Expires:Fri, 04 Feb 2011 04:57:58 GMT
...
(Remove Pragma. You can do this by editing your .htaccess file.)
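A sketch of what that could look like in .htaccess, assuming mod_expires and mod_headers are enabled (the directive values are illustrative, not from the original answer):

# Requires mod_expires and mod_headers.
ExpiresActive On
ExpiresDefault "access plus 5 minutes"

# Replace the no-cache directives with a short max-age and drop Pragma.
Header set Cache-Control "max-age=300"
Header unset Pragma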