I SSH into my EC2 instance, where I can run these commands and they work perfectly:
aws sqs list-queues
aws s3 ls
I have a small Python script that pulls data from a database, formats it as XML, and then uploads the file to S3. This upload fails with this error:
Traceback (most recent call last):
File "./data_test/data_analytics/lexisnexis/async2.py", line 289, in <module>
insert_parallel(engine, qy, Create_Temp.profile_id, nworkers)
File "./data_test/data_analytics/lexisnexis/async2.py", line 241, in insert_parallel
s3upload(bucketname, keyname, f)
File "./data_test/data_analytics/lexisnexis/async2.py", line 89, in s3upload
bucket = conn.get_bucket(bucketname)
File "/usr/lib/python2.7/dist-packages/boto/s3/connection.py", line 506, in get_bucket
return self.head_bucket(bucket_name, headers=headers)
File "/usr/lib/python2.7/dist-packages/boto/s3/connection.py", line 525, in head_bucket
response = self.make_request('HEAD', bucket_name, headers=headers)
File "/usr/lib/python2.7/dist-packages/boto/s3/connection.py", line 668, in make_request
retry_handler=retry_handler
File "/usr/lib/python2.7/dist-packages/boto/connection.py", line 1071, in make_request
retry_handler=retry_handler)
File "/usr/lib/python2.7/dist-packages/boto/connection.py", line 1030, in _mexe
raise ex
SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)
Why does the script fail like this when the AWS CLI works?
To be clear, I'm running the Python script as the same user, and from the same EC2 instance, as the AWS CLI commands.
aws --version
aws-cli/1.11.176 Python/2.7.12 Linux/4.9.43-17.38.amzn1.x86_64 botocore/1.7.34
The last line of your error message tells you the problem:
SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)
Your issue could be one of the following:
1) There is an error with the certificate of the server you are connecting to.
2) The certificate chain is incomplete for the server you are connecting to.
3) You are missing "cacert.pem". This is a common problem, and there is plenty of information online about downloading and installing this file (see the sketch below).
See also: Certificate verification in Python
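If a missing or outdated CA bundle turns out to be the cause, here is a minimal sketch of a fix for boto 2 (the library in your traceback). It points boto at certifi's up-to-date bundle through boto's ca_certificates_file config option; the bucket name is a placeholder, and the option can equally be set in the [Boto] section of ~/.boto:
import boto
import certifi

# Tell boto 2 which CA bundle to verify server certificates against.
# Equivalent to putting this in ~/.boto:
#   [Boto]
#   ca_certificates_file = /path/to/cacert.pem
if not boto.config.has_section('Boto'):
    boto.config.add_section('Boto')
boto.config.set('Boto', 'ca_certificates_file', certifi.where())

conn = boto.connect_s3()
bucket = conn.get_bucket('my-bucket')  # placeholder bucket name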
I'm trying to use s3fs in python to connect to an s3 bucket. The associated credentials are saved in a profile called 'pete' in ~/.aws/credentials:
[default]
aws_access_key_id=****
aws_secret_access_key=****
[pete]
aws_access_key_id=****
aws_secret_access_key=****
This seems to work in AWS CLI (on Windows):
$>aws s3 ls s3://my-bucket/ --profile pete
PRE other-test-folder/
PRE test-folder/
But I get a permission-denied error when I use what should be equivalent code with the s3fs package in Python:
import s3fs
import requests
s3 = s3fs.core.S3FileSystem(profile = 'pete')
s3.ls('my-bucket')
I get this error:
Traceback (most recent call last):
File "C:\ProgramData\Anaconda3\lib\site-packages\s3fs\core.py", line 504, in _lsdir
async for i in it:
File "C:\ProgramData\Anaconda3\lib\site-packages\aiobotocore\paginate.py", line 32, in __anext__
response = await self._make_request(current_kwargs)
File "C:\ProgramData\Anaconda3\lib\site-packages\aiobotocore\client.py", line 154, in _make_api_call
raise error_class(parsed_response, operation_name)
ClientError: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "<ipython-input-9-4627a44a7ac3>", line 5, in <module>
s3.ls('ma-baseball')
File "C:\ProgramData\Anaconda3\lib\site-packages\s3fs\core.py", line 993, in ls
files = maybe_sync(self._ls, self, path, refresh=refresh)
File "C:\ProgramData\Anaconda3\lib\site-packages\fsspec\asyn.py", line 97, in maybe_sync
return sync(loop, func, *args, **kwargs)
File "C:\ProgramData\Anaconda3\lib\site-packages\fsspec\asyn.py", line 68, in sync
raise exc.with_traceback(tb)
File "C:\ProgramData\Anaconda3\lib\site-packages\fsspec\asyn.py", line 52, in f
result[0] = await future
File "C:\ProgramData\Anaconda3\lib\site-packages\s3fs\core.py", line 676, in _ls
return await self._lsdir(path, refresh)
File "C:\ProgramData\Anaconda3\lib\site-packages\s3fs\core.py", line 527, in _lsdir
raise translate_boto_error(e) from e
PermissionError: Access Denied
I have to assume it's not a config issue within S3, because I can access S3 through the CLI. So something must be off in my s3fs code, but I can't find much documentation on profiles in s3fs to figure out what's going on. Any help is of course appreciated.
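For comparison, here is a minimal boto3 sketch that exercises the same named profile outside s3fs (the bucket name is a placeholder). If this also returns AccessDenied, the problem lies with the profile or its IAM policy rather than with the s3fs call:
import boto3

# Use the 'pete' profile from ~/.aws/credentials, just like the CLI does.
session = boto3.Session(profile_name='pete')
s3 = session.client('s3')

# Roughly equivalent to: aws s3 ls s3://my-bucket/ --profile pete
resp = s3.list_objects_v2(Bucket='my-bucket', Delimiter='/')
for prefix in resp.get('CommonPrefixes', []):
    print(prefix['Prefix'])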
This is my first time trying AWS services. I have to integrate AWS Polly with Asterisk for text-to-speech.
Here is the example code I wrote to convert text to speech:
from boto3 import client
import boto3
import StringIO
from contextlib import closing

polly = client("polly", "us-east-1")
response = polly.synthesize_speech(
    Text="Good Morning. My Name is Rajesh. I am Testing Polly AWS Service For Voice Application.",
    OutputFormat="mp3",
    VoiceId="Raveena")
print(response)

if "AudioStream" in response:
    with closing(response["AudioStream"]) as stream:
        data = stream.read()
        fo = open("pollytest.mp3", "w+")
        fo.write(data)
        fo.close()
I am getting the following error:
Traceback (most recent call last):
File "pollytest.py", line 11, in <module>
VoiceId="Raveena")
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 253, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 530, in _make_api_call
operation_model, request_dict)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 141, in make_request
return self._send_request(request_dict, operation_model)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 166, in _send_request
request = self.create_request(request_dict, operation_model)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 150, in create_request
operation_name=operation_model.name)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 227, in emit
return self._emit(event_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 210, in _emit
response = handler(**kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/signers.py", line 90, in handler
return self.sign(operation_name, request)
File "/usr/local/lib/python2.7/dist-packages/botocore/signers.py", line 147, in sign
auth.add_auth(request)
File "/usr/local/lib/python2.7/dist-packages/botocore/auth.py", line 316, in add_auth
raise NoCredentialsError
botocore.exceptions.NoCredentialsError: Unable to locate credentials
I want to provide credentials directly in this script so that I can use it in the Asterisk system application.
UPDATE:
I created a file ~/.aws/credentials with the content below:
[default]
aws_access_key_id=XXXXXXXX
aws_secret_access_key=YYYYYYYYYYY
Now it works fine for my current login user, but it does not work for the Asterisk PBX.
Your code runs perfectly fine for me!
The last line is saying:
botocore.exceptions.NoCredentialsError: Unable to locate credentials
So, it is unable to authenticate against AWS.
If you are running this code on an Amazon EC2 instance, the simplest method is to assign an IAM Role to the instance. This automatically provides credentials that applications running on the instance can use -- no code changes required.
Alternatively, you could obtain an Access Key and Secret Key from IAM for your IAM User and store those credentials in a local file via the aws configure command.
It is bad practice to put credentials in source code, since they may become compromised.
See:
IAM Roles for Amazon EC2
Best Practices for Managing AWS Access Keys
Please note that Asterisk PBX usually runs under the asterisk user, so you have to set up the credentials for that user, not for root.
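If the script has to run as the asterisk user, a minimal sketch (the file path is an assumption; adjust it to your installation) is to point botocore at a credentials file that user can read, instead of embedding the keys in source code:
import os
import boto3

# Point botocore at a shared credentials file readable by the asterisk
# user; the path here is an assumption, not a standard location.
os.environ['AWS_SHARED_CREDENTIALS_FILE'] = '/etc/asterisk/aws_credentials'

polly = boto3.client('polly', region_name='us-east-1')
boto3.client() also accepts aws_access_key_id= and aws_secret_access_key= arguments directly, but as noted above, hard-coding secrets in source is best avoided.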
I am attempting to use the requests package in Python to access this site: https://egov.uscis.gov/casestatus/landing.do
When I ran this command:
requests.get('https://egov.uscis.gov/casestatus/landing.do')
I got the usual SSL error you see when certificate verification fails.
I read through Stack Overflow and adopted one of the solutions: download the certificate as a .crt file, use openssl to convert it to a .pem file, and then copy the contents of that .pem file to the end of cacert.pem. However, this did not work:
>>> requests.get('https://egov.uscis.gov/casestatus/landing.do')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Users\Sandra\Anaconda\lib\site-packages\requests\api.py", line 69, in get
return request('get', url, params=params, **kwargs)
File "C:\Users\Sandra\Anaconda\lib\site-packages\requests\api.py", line 50, in request
response = session.request(method=method, url=url, **kwargs)
File "C:\Users\Sandra\Anaconda\lib\site-packages\requests\sessions.py", line 465, in request
resp = self.send(prep, **send_kwargs)
File "C:\Users\Sandra\Anaconda\lib\site-packages\requests\sessions.py", line 573, in send
r = adapter.send(request, **kwargs)
File "C:\Users\Sandra\Anaconda\lib\site-packages\requests\adapters.py", line 431, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:581)
Any pointers as to how I can overcome this without resorting to verify=False?
Also, is there any difference between downloading the file via https://superuser.com/a/97203 and via https://superuser.com/a/176721?
Since I have no issue with requests.get('https://www.google.com'), do other websites place restrictions on the certificates you can download?
egov.uscis.gov does not provide a complete chain in its SSL handshake.
You'll need to employ a workaround similar to what's suggested here until the site administrator fixes the certificate chain issue. The intermediate certificate in your case can be obtained from https://ssl-tools.net/certificates/yuox7i-symantec-class-3-secure-server-ca
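Here is a minimal sketch of that workaround, assuming you have saved the intermediate certificate from the link above as intermediate.pem (file names are placeholders): build a local bundle containing both the default CAs and the missing intermediate, then pass it via verify= instead of disabling verification:
import shutil
import certifi
import requests

# Start from certifi's CA bundle and append the missing intermediate cert.
bundle = 'uscis_bundle.pem'
shutil.copyfile(certifi.where(), bundle)
with open('intermediate.pem') as intermediate, open(bundle, 'a') as out:
    out.write('\n' + intermediate.read())

# Verify against the augmented bundle rather than using verify=False.
resp = requests.get('https://egov.uscis.gov/casestatus/landing.do', verify=bundle)
print(resp.status_code)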
There are three ways to set up the CA cert for requests:
1) Install certifi ($ pip install certifi) and pass its bundle:
>>> requests.get(url, verify=certifi.where())
2) Pass a path to a certificate bundle file directly:
>>> requests.get(url, verify='/path/to/cert_bundle_file')
3) Set the REQUESTS_CA_BUNDLE environment variable:
>>> os.environ['REQUESTS_CA_BUNDLE'] = '/path/to/cert_bundle_file'
>>> requests.get(url)
I would like to ask whether it is currently possible to use the spark-ec2 script https://spark.apache.org/docs/latest/ec2-scripts.html with credentials that consist not only of aws_access_key_id and aws_secret_access_key, but also include an aws_security_token.
When I try to run the script I get the following error message:
ERROR:boto:Caught exception reading instance data
Traceback (most recent call last):
File "/Users/zikes/opensource/spark/ec2/lib/boto-2.34.0/boto/utils.py", line 210, in retry_url
r = opener.open(req, timeout=timeout)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 404, in open
response = self._open(req, data)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 422, in _open
'_open', req)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 382, in _call_chain
result = func(*args)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 1214, in http_open
return self.do_open(httplib.HTTPConnection, req)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 1184, in do_open
raise URLError(err)
URLError: <urlopen error [Errno 64] Host is down>
ERROR:boto:Unable to read instance data, giving up
No handler was ready to authenticate. 1 handlers were checked. ['QuerySignatureV2AuthHandler'] Check your credentials
Does anyone have an idea what could possibly be wrong? Is aws_security_token the problem?
It seems to me to be more of a boto problem than a Spark problem.
I have tried both:
1) setting credentials in ~/.aws/credentials and ~/.aws/config
2) setting credentials via these commands:
export aws_access_key_id=<my_aws_access_key>
export aws_secret_access_key=<my_aws_seecret_key>
export aws_security_token=<my_aws_security_token>
My launch command is:
./spark-ec2 -k my_key -i my_key.pem --additional-tags "mytag:tag1,mytag2:tag2" --instance-profile-name "profile1" -s 1 launch test
You can set up your credentials & config using the aws configure command.
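If your credentials are temporary ones, the shared credentials file also needs the session token. A sketch of what ~/.aws/credentials would then look like, using the key names the AWS CLI writes (values are placeholders):
[default]
aws_access_key_id = <my_aws_access_key>
aws_secret_access_key = <my_aws_secret_key>
aws_session_token = <my_aws_security_token>
Also note that environment variables are case-sensitive: boto looks for AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_SECURITY_TOKEN, so the lowercase exports shown in the question are not picked up.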
I had the same issue, but in my case my AWS_SECRET_ACCESS_KEY contained a slash. I regenerated the key until there was no slash, and then it worked.
The problem was that I did not use a profile called default; after renaming it, everything worked well.
Why does the Google AdWords API stop when calling this link:
https://adwords.google.com/api/adwords/mcm/v201502/CustomerService?wsdl
It fails with the error below; should I load some certificate first, and if so, how?
urllib2.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)>
Using Python 2.7.10.
Full source code:
create_adwords_client_without_yaml.py
Full error code:
Traceback (most recent call last):
File "C:/Users/Crezary Wagner/PycharmProjects/learn-adwords/src/examples/create_adwords_client_without_yaml.py", line 56, in <module>
CLIENT_CUSTOMER_ID)
File "C:/Users/Crezary Wagner/PycharmProjects/learn-adwords/src/examples/create_adwords_client_without_yaml.py", line 50, in main
customer = adwords_client.GetService('CustomerService').get()
File "C:\root\Python27\lib\site-packages\googleads\adwords.py", line 256, in GetService
proxy=proxy_option, cache=self.cache, timeout=3600)
File "C:\root\Python27\lib\site-packages\suds\client.py", line 115, in __init__
self.wsdl = reader.open(url)
File "C:\root\Python27\lib\site-packages\suds\reader.py", line 150, in open
d = self.fn(url, self.options)
File "C:\root\Python27\lib\site-packages\suds\wsdl.py", line 136, in __init__
d = reader.open(url)
File "C:\root\Python27\lib\site-packages\suds\reader.py", line 74, in open
d = self.download(url)
File "C:\root\Python27\lib\site-packages\suds\reader.py", line 92, in download
fp = self.options.transport.open(Request(url))
File "C:\root\Python27\lib\site-packages\suds\transport\https.py", line 62, in open
return HttpTransport.open(self, request)
File "C:\root\Python27\lib\site-packages\suds\transport\http.py", line 67, in open
return self.u2open(u2request)
File "C:\root\Python27\lib\site-packages\suds\transport\http.py", line 132, in u2open
return url.open(u2request, timeout=tm)
File "C:\root\Python27\lib\urllib2.py", line 431, in open
response = self._open(req, data)
File "C:\root\Python27\lib\urllib2.py", line 449, in _open
'_open', req)
File "C:\root\Python27\lib\urllib2.py", line 409, in _call_chain
result = func(*args)
File "C:\root\Python27\lib\urllib2.py", line 1240, in https_open
context=self._context)
File "C:\root\Python27\lib\urllib2.py", line 1197, in do_open
raise URLError(err)
urllib2.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)>
Python uses certificates from the system SSL certificate store to verify HTTPS connections; if there is no appropriate SSL certificate in the store, an error like this occurs.
Download the SSL certificate (open your HTTPS link in a browser and click the lock icon in the address bar > More Information > View Certificate > Details > Export) and install it on your system as described at http://windows.microsoft.com/en-us/windows/import-export-certificates-private-keys#1TC=windows-7
Not sure if that's the problem here, but it's worth checking.
Python 2.7.9 enabled certificate validation by default for HTTPS connections.
The server you're connecting to does not present a certificate that is trusted by your client. The SOAP client used here (suds, per the traceback) should configure SSL appropriately for this use case.
Try making your request like:
requests.get('https://adwords.google.com/api/adwords/mcm/v201502/CustomerService?wsdl', verify=False)
Try this, it helped me:
import ssl
ssl._create_default_https_context = ssl._create_unverified_context
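Note that this disables certificate verification for every HTTPS request in the process. If you would rather keep verification on, a similar sketch (the bundle path is a placeholder) installs a default context that still verifies, but against your own CA bundle:
import ssl

def _https_context_with_custom_ca(*args, **kwargs):
    # Same hook as above, but return a normal verifying context that
    # trusts a custom CA bundle instead of turning verification off.
    return ssl.create_default_context(cafile='/path/to/ca_bundle.pem')

ssl._create_default_https_context = _https_context_with_custom_ca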
I encountered this issue too. I had my machine set up to use the same DNS block list as my phone, and it wasn't immediately apparent after I'd enabled the tool and resumed work on this particular project. I suggest scrutinizing your setup and verifying that there aren't any ad-blockers enabled (DNS-level in my case, via NextDNS / a hosted Pi-hole). I spent hours upon hours trying out Python versions, certificates, and reinstalling things. Hope this helps someone!