How can I save the SSL keys for https when I use `urllib2`? - python-2.7

I need to save the SSL keys to a file, in order to decrypt the TLS traffic with Wireshark later.
What should I do?
#!/usr/bin/env python
# -*- coding: UTF-8 -*-
import urllib2
import json
data={}
data_json = json.dumps(data, encoding='UTF-8', ensure_ascii=False)
requrl = "https://52.31.41.56/test" # look, the protocol is https
req = urllib2.Request(url=requrl, data=data_json)
req.add_header('Content-Type', 'application/json')
# how can I record the SSL keys in a file, for Wireshark decryption
rsp_fp = urllib2.urlopen(req)
rsp_data = rsp_fp.read()
print(rsp_data)

Use sslkeylog
Example Usage
Use the sslkeylog package (pip install sslkeylog), which is compatible with both Python 2 and Python 3. Here is your code, modified to save the SSL key log while making a connection to Stack Overflow:
import urllib2
import sslkeylog
# Save SSL keys to "sslkeylog.txt" in this directory
# Note that you only have to do this once while this is in scope
sslkeylog.set_keylog("sslkeylog.txt")
# Make an HTTPS connection to Stack Overflow
requrl = "https://stackoverflow.com"
req = urllib2.Request(url=requrl)
rsp_fp = urllib2.urlopen(req)
Verification
Then if we check sslkeylog.txt, we can see that there is now an entry:
bash$ cat sslkeylog.txt
CLIENT_RANDOM a655a2e200ddc96c1571fe29af1962013ccbab1b9e9b865db112a9c1492c449a 3280c9fbee32df623074f80519f278420971aaa6eb91ab0f1f973d505a03ddbcc4fba2ca83f6d733addebdb0358e606d
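To have Wireshark actually use the file, point it at the key log; for example from the command line with tshark (the preference is named tls.keylog_file in recent Wireshark releases and ssl.keylog_file in older 2.x ones; capture.pcap is a placeholder capture file):
bash$ tshark -o tls.keylog_file:sslkeylog.txt -r capture.pcap -Y http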

Related

python SSLError("bad handshake: SysCallError(-1, 'Unexpected EOF')",),))

I am scraping this aspx website: https://gra206.aca.ntu.edu.tw/Temp/W2.aspx?Type=2 .
As the site requires, I have to pass __VIEWSTATE and __EVENTVALIDATION when sending a POST request, so I am trying to send a GET request first to obtain those two values and parse them out afterward.
However, no matter how many times I try, the GET request always throws this error:
requests.exceptions.SSLError: HTTPSConnectionPool(host='gra206.aca.ntu.edu.tw', port=443): Max retries exceeded with url: /Temp/W2.aspx?Type=2 (Caused by SSLError(SSLError("bad handshake: SysCallError(-1, 'Unexpected EOF')",),))
I have tried:
upgrading OpenSSL
installing requests[security]
However, neither of them works.
I am currently using:
env:
python 2.7
bs4 4.6.0
requests 2.18.4
openssl 1.0.2n
Here is my code:
import requests
from bs4 import BeautifulSoup

with requests.Session() as s:
    s.auth = ('user', 'pass')
    s.headers.update({'x-test': 'true'})
    url = 'https://gra206.aca.ntu.edu.tw/Temp/W2.aspx?Type=2'
    r = s.get(url, headers={'x-test2': 'true'})
    soup = BeautifulSoup(r.content, 'lxml')
    viewstate = soup.find('input', {'id': '__VIEWSTATE'})['value']
    # __VIEWSTATEGENERATOR is the third standard ASP.NET hidden field
    generator = soup.find('input', {'id': '__VIEWSTATEGENERATOR'})['value']
    validation = soup.find('input', {'id': '__EVENTVALIDATION'})['value']
    print viewstate, generator, validation
I am also looking for a solution to this. Some sites have deprecated TLSv1.0, and Requests + OpenSSL (on Windows 7) has trouble completing the handshake with such hosts. A Wireshark log showed that the client issued a TLSv1 Client Hello but the host did not answer correctly; that failure propagated up as the error message Requests showed. Even with the most up-to-date OpenSSL/pyOpenSSL/Requests, tried on both Py3.6 and Py2.7.12, no luck. Interestingly, when I replace the URL with another one such as "google.com", the log shows a TLSv1.2 Hello being issued and answered by the host (see the tlsv1 and tlsv1.2 screenshots).
Clearly the client has TLSv1.2 capability, so why does it use a v1.0 Hello in the former case?
[EDIT]
I was wrong in the previous statement. Wireshark misinterpreted an unfinished TLSv1.2 HELLO exchange as TLSv1. After digging into it further, I found that these hosts expect pure TLSv1, not a TLSv1 fallback from TLSv1.2, apparently because OpenSSL's Hello lacks some of the extension fields (maybe Supported Versions) that show up in the log from Chrome. I found a workaround: 1. force the use of TLSv1 negotiation; 2. change the default cipher suite back to the py3.4 style to re-enable 3DES.
import ssl
import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.util.ssl_ import create_urllib3_context

# py3.4 default cipher string (re-enables 3DES)
CIPHERS = (
    'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:'
    'DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:!aNULL:'
    '!eNULL:!MD5'
)

class DESAdapter(HTTPAdapter):
    """
    A TransportAdapter that re-enables 3DES support in Requests
    and forces the handshake down to TLSv1.
    """
    def create_ssl_context(self):
        #ctx = create_urllib3_context(ciphers=CIPHERS)
        ctx = ssl.create_default_context()
        # create_default_context() already disables SSLv2 and SSLv3;
        # additionally disable TLSv1.1 and TLSv1.2 so only TLSv1 is offered
        ctx.options |= ssl.OP_NO_TLSv1_2
        ctx.options |= ssl.OP_NO_TLSv1_1
        ctx.set_ciphers(CIPHERS)
        #ctx.set_alpn_protocols(['http/1.1', 'spdy/2'])
        return ctx

    def init_poolmanager(self, *args, **kwargs):
        kwargs['ssl_context'] = self.create_ssl_context()
        return super(DESAdapter, self).init_poolmanager(*args, **kwargs)

    def proxy_manager_for(self, *args, **kwargs):
        kwargs['ssl_context'] = self.create_ssl_context()
        return super(DESAdapter, self).proxy_manager_for(*args, **kwargs)

url = 'https://gra206.aca.ntu.edu.tw/Temp/W2.aspx?Type=2'
tmoval = 10
proxies = {}
hdr = {'Accept-Language': 'zh-TW,zh;q=0.8,en-US;q=0.6,en;q=0.4',
       'Cache-Control': 'max-age=0',
       'Connection': 'keep-alive',
       'Proxy-Connection': 'keep-alive',
       #'Cache-Control':'no-cache', 'Connection':'close',
       'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.85 Safari/537.36',
       'Accept-Encoding': 'gzip,deflate,sdch',
       'Accept': '*/*'}

ses = requests.session()
ses.mount(url, DESAdapter())
response = ses.get(url, timeout=tmoval, headers=hdr, proxies=proxies)
[EDIT2]
When your HTTPS URL contains any uppercase letters, the patch fails to work and you need to convert them to lowercase. Something in the requests/urllib3/OpenSSL stack causes the patched logic to be skipped, so negotiation reverts to its default TLSv1.2 fashion (most likely because requests matches mount prefixes against a lower-cased copy of the URL, so a prefix containing uppercase letters never matches).
[EDIT3]
from http://docs.python-requests.org/en/master/user/advanced/
The mount call registers a specific instance of a Transport Adapter to a prefix. Once mounted, any HTTP request made using that session whose URL starts with the given prefix will use the given Transport Adapter.
So, to make all HTTPS requests use the new adapter, including those issued after a server redirect, this line must be changed to:
ses.mount('https://', DESAdapter())
This also fixed the uppercase problem mentioned above, presumably because the all-lowercase prefix 'https://' matches any HTTPS URL.
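Once the GET succeeds through the DESAdapter-mounted session, the original goal, the ASP.NET postback, would look roughly like this (a sketch continuing the session above: the three hidden-field names are the standard ASP.NET ones, and any additional form fields the page expects are not shown):
from bs4 import BeautifulSoup

soup = BeautifulSoup(response.content, 'lxml')
payload = {
    '__VIEWSTATE': soup.find('input', {'id': '__VIEWSTATE'})['value'],
    '__VIEWSTATEGENERATOR': soup.find('input', {'id': '__VIEWSTATEGENERATOR'})['value'],
    '__EVENTVALIDATION': soup.find('input', {'id': '__EVENTVALIDATION'})['value'],
}
# re-use the session so the POST goes through the same TLSv1 adapter
r2 = ses.post(url, data=payload, timeout=tmoval, headers=hdr, proxies=proxies)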

Connect to secure web service with pfx certificate using python

Hi, I need help getting info from a web service on a secure site, using a .pfx certificate with a password.
I have tried more than one example.
code example:
import requests
wsdl_url = 'blabla'
requests.get(wsdl_url, cert='cert.pfx', verify=True)
other example:
import urllib3
import certifi

wsdl_url = 'blabla'
http = urllib3.PoolManager(
    cert_reqs='CERT_REQUIRED',  # Force certificate check.
    ca_certs=certifi.where()    # Path to the Certifi bundle.
)

# You're ready to make verified HTTPS requests.
try:
    r = http.request('GET', wsdl_url)
    print r
except urllib3.exceptions.SSLError as e:
    # Handle incorrect certificate error.
    print "wrong"
Error type:
connection aborted
an existing connection was forcibly closed by the remote host
help please
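One thing to check: the cert argument of requests expects PEM files, not a PKCS#12 (.pfx) bundle, so one approach is to extract the certificate and key first. A minimal sketch using pyOpenSSL (cert.pfx, the password, and the output file names are placeholders):
import OpenSSL.crypto
import requests

# Extract the certificate and private key from the .pfx bundle
with open('cert.pfx', 'rb') as f:
    p12 = OpenSSL.crypto.load_pkcs12(f.read(), 'password')
with open('cert.pem', 'wb') as f:
    f.write(OpenSSL.crypto.dump_certificate(
        OpenSSL.crypto.FILETYPE_PEM, p12.get_certificate()))
with open('key.pem', 'wb') as f:
    f.write(OpenSSL.crypto.dump_privatekey(
        OpenSSL.crypto.FILETYPE_PEM, p12.get_privatekey()))

# Hand the PEM pair to requests as a client certificate
wsdl_url = 'blabla'
r = requests.get(wsdl_url, cert=('cert.pem', 'key.pem'), verify=True)
print r.status_code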

build_opener vs. urlopen for self-signed SSL cert

I'm trying to get an opener in urllib2 to work with a self-signed SSL cert in Python 2.7.9. It works perfectly with a simple, direct urlopen, like so:
import urllib2
req = urllib2.Request('https://myurl.com')
r = urllib2.urlopen(req,cafile='/my/certs.pem')
r.read()
... but when I use (what I think is) effectively the same thing, setting up a handler instead ...
import ssl
import urllib2
s = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
s.load_verify_locations(cafile='/my/certs.pem')
sh = urllib2.HTTPSHandler(s)
o = urllib2.build_opener(sh)
r = o.open('https://myurl.com')
r.read()
... I get an error on the cert:
urllib2.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:581)>
EDIT: I've simplified my example for the purposes of the question, but I need to use the second version, because I need to add another handler (for digest authentication) before opening the connection. So, if there's an alternative option for using self-signed certs with digest authentication, I'm all ears!
Been a while since I've been on here ... Thanks in advance for your help.
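A likely culprit (a hedged guess, not verified against your exact setup): the first positional argument of urllib2.HTTPSHandler is debuglevel, not the SSL context, so HTTPSHandler(s) silently ignores your context and the default verification path rejects the self-signed cert. A sketch that passes the context by keyword and chains in a digest-auth handler (user/password are placeholders):
import ssl
import urllib2

# Context that trusts the self-signed certificate
ctx = ssl.create_default_context(cafile='/my/certs.pem')

# Digest authentication (credentials are placeholders)
pwd_mgr = urllib2.HTTPPasswordMgrWithDefaultRealm()
pwd_mgr.add_password(None, 'https://myurl.com', 'user', 'password')
auth_handler = urllib2.HTTPDigestAuthHandler(pwd_mgr)

# Pass the context by keyword; positionally it would be taken as debuglevel
https_handler = urllib2.HTTPSHandler(context=ctx)

o = urllib2.build_opener(https_handler, auth_handler)
r = o.open('https://myurl.com')
print r.read()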

How can I create an https connection on python 2.7?

The situation
I'm trying to connect to a server over the https protocol with a Python script. Could someone give me a working example that sends a GET request to an https server, or point me to a web resource on how to create an https connection with Python?
An attempt so far
I have read that the httplib module in Python supports creating HTTP connections, but not HTTPS connections. Is that correct? This is my attempt:
import httplib
conn = httplib.HTTPConnection('https://adsche.skplanet.com/api/startNewTurn')
header = {"Content-type" : "application/json"}
conn.request('GET', '/announce?info_hash=%da', '', header)
r1 = conn.getresponse()
print r1.status, r1.reason
data1 = r1.read()
print data1
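For reference, httplib does support HTTPS, via httplib.HTTPSConnection. The constructor takes a bare host name; the scheme is implied and the path goes into request(). A sketch reusing the host and path from the attempt above (note that on Python 2.7.9+ the server certificate is verified by default):
import httplib

# Host only; no scheme, no path
conn = httplib.HTTPSConnection('adsche.skplanet.com')
header = {"Content-type": "application/json"}
conn.request('GET', '/api/startNewTurn', '', header)
r1 = conn.getresponse()
print r1.status, r1.reason
print r1.read()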

Python requests 503 errors when trying to access localhost:8000

I am facing a bit of a situation.
Scenario: I have a Django REST API running on localhost:8000 and I want to access it from my command line. I have tried urllib2 and the Python requests library to talk to the API, but both fail (I'm getting a 503 error). Yet when I pass google.com as the URL, I get the expected response, so I believe my approach is correct but I'm doing something wrong. Please see the code below:
import urllib, urllib2, httplib
url = 'http://localhost:8000'
httplib.HTTPConnection.debuglevel = 1
print "urllib"
data = urllib.urlopen(url)
print "urllib2"
request = urllib2.Request(url)
opener = urllib2.build_opener()
feeddata = opener.open(request).read()
print "End\n"
Environment:
OS Win7
python v2.7.5
Django==1.6
Markdown==2.3.1
colorconsole==0.6
django-filter==0.7
django-ping==0.2.0
djangorestframework==2.3.10
httplib2==0.8
ipython==1.0.0
jenkinsapi==0.2.14
names==0.3.0
phonenumbers==5.8b1
requests==2.1.0
simplejson==3.3.1
termcolor==1.1.0
virtualenv==1.10.1
Thanks
I had a similar problem, and found that it was the company's proxy that was preventing me from reaching my own machine. See:
503 Response when trying to use python request on local website
Try:
>>> import requests
>>> session = requests.Session()
>>> session.trust_env = False
>>> r = session.get("http://localhost:5000/")
>>> r
<Response [200]>
>>> r.content
'Hello World!'
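The session.trust_env = False line is what matters here: it stops requests from picking up proxy settings from the environment. If you want the equivalent fix for the urllib2 code above, an empty ProxyHandler does the same job (a sketch, assuming the same localhost URL):
import urllib2

# An empty ProxyHandler overrides any proxy picked up from the environment
opener = urllib2.build_opener(urllib2.ProxyHandler({}))
print opener.open('http://localhost:8000').read()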
If you are registering your serializers with DefaultRouter, then your API will appear at:
http://localhost:8000/api/ for an html view of the index
http://localhost:8000/api/.json for a JSON view of the index
http://localhost:8000/api/appname for an html view of the individual resource
http://localhost:8000/api/appname/.json for a JSON view of the individual resource
You can check the response in your browser to make sure your URL is working as you expect.
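To confirm the JSON index is reachable from a script as well, something like this should work (a sketch: the path assumes the DefaultRouter layout listed above, with trust_env disabled as discussed earlier):
import requests

session = requests.Session()
session.trust_env = False  # skip environment proxies, as above
r = session.get('http://localhost:8000/api/.json')
print r.status_code
print r.content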