This is the first time I am trying AWS services. I have to integrate AWS Polly with Asterisk for text-to-speech.
Here is the example code I wrote to convert text to speech:
import boto3
from contextlib import closing

polly = boto3.client("polly", region_name="us-east-1")
response = polly.synthesize_speech(
    Text="Good Morning. My Name is Rajesh. I am Testing Polly AWS Service For Voice Application.",
    OutputFormat="mp3",
    VoiceId="Raveena")
print(response)
if "AudioStream" in response:
    with closing(response["AudioStream"]) as stream:
        data = stream.read()
        # Write the MP3 bytes in binary mode.
        with open("pollytest.mp3", "wb") as fo:
            fo.write(data)
I am getting the following error:
Traceback (most recent call last):
File "pollytest.py", line 11, in <module>
VoiceId="Raveena")
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 253, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 530, in _make_api_call
operation_model, request_dict)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 141, in make_request
return self._send_request(request_dict, operation_model)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 166, in _send_request
request = self.create_request(request_dict, operation_model)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 150, in create_request
operation_name=operation_model.name)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 227, in emit
return self._emit(event_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 210, in _emit
response = handler(**kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/signers.py", line 90, in handler
return self.sign(operation_name, request)
File "/usr/local/lib/python2.7/dist-packages/botocore/signers.py", line 147, in sign
auth.add_auth(request)
File "/usr/local/lib/python2.7/dist-packages/botocore/auth.py", line 316, in add_auth
raise NoCredentialsError
botocore.exceptions.NoCredentialsError: Unable to locate credentials
I want to provide credentials directly in this script so that I can use it in an Asterisk system application.
UPDATE:
I created the file ~/.aws/credentials with the content below:
[default]
aws_access_key_id=XXXXXXXX
aws_secret_access_key=YYYYYYYYYYY
Now it works fine for my current login user, but it does not work for the Asterisk PBX.
Your code runs perfectly fine for me!
The last line is saying:
botocore.exceptions.NoCredentialsError: Unable to locate credentials
So, it is unable to authenticate against AWS.
If you are running this code on an Amazon EC2 instance, the simplest method is to assign an IAM Role to the instance when it is launched (it can't be added later). This will automatically supply credentials that can be used by applications running on the instance -- no code changes required.
Alternatively, you could obtain an Access Key and Secret Key from IAM for your IAM User and store those credentials in a local file via the aws configure command.
It is bad practice to put credentials in source code, since they may become compromised.
See:
IAM Roles for Amazon EC2
Best Practices for Managing AWS Access Keys
Please note, Asterisk PBX usually runs under the asterisk user,
so you have to configure the AWS authentication for that user, not root.
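If creating ~/.aws/credentials under the asterisk user's home directory is awkward, you can also point boto3 at an explicit credentials file or profile from inside the script. A minimal sketch, assuming a path readable by the asterisk user (the path and profile name are placeholders):
import os
import boto3

# Assumed location; the file must be readable by the asterisk user.
os.environ["AWS_SHARED_CREDENTIALS_FILE"] = "/etc/asterisk/aws-credentials"

# Build the client from an explicit profile rather than the default lookup chain.
session = boto3.Session(profile_name="default", region_name="us-east-1")
polly = session.client("polly")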
Related
I am trying to schedule a query using the BigQuery Data Transfer API, having granted the required BigQuery Admin permission and enabled the BigQuery Data Transfer API.
Permission Documentation:
https://cloud.google.com/bigquery-transfer/docs/enable-transfer-service
I also tried granting the Project Owner role to the service account, but it still gives the same error.
Code Documentation: (Setting up a scheduled query with a service account)
https://cloud.google.com/bigquery/docs/scheduling-queries
The part where the error occurs:
transfer_config = transfer_client.create_transfer_config(
    bigquery_datatransfer.CreateTransferConfigRequest(
        parent=parent,
        transfer_config=transfer_config,
        service_account_name=service_account_name,
    )
)
Error stack trace:
Traceback (most recent call last):
File "/home/ubuntu/prod/venv_trellai/lib/python3.6/site-packages/google/api_core/grpc_helpers.py", line 73, in error_remapped_callable
return callable_(*args, **kwargs)
File "/home/ubuntu/prod/venv_trellai/lib/python3.6/site-packages/grpc/_channel.py", line 946, in __call__
return _end_unary_response_blocking(state, call, False, None)
File "/home/ubuntu/prod/venv_trellai/lib/python3.6/site-packages/grpc/_channel.py", line 849, in _end_unary_response_blocking
raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.PERMISSION_DENIED
details = "The caller does not have permission"
debug_error_string = "{"created":"#1633536014.842657676","description":"Error received from peer ipv4:142.250.192.138:443","file":"src/core/lib/surface/call.cc","file_line":1070,"grpc_message":"The caller does not have permission","grpc_status":7}"
>
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "__main__.py", line 728, in <module>
mbc.schedule_query()
File "/home/ubuntu/prod/trell-ds-framework/data_engineering/data_migration/schedule_quries.py", line 62, in schedule_query
service_account_name=service_account_name,
File "/home/ubuntu/prod/venv_trellai/lib/python3.6/site-packages/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/client.py", line 647, in create_transfer_config
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
File "/home/ubuntu/prod/venv_trellai/lib/python3.6/site-packages/google/api_core/gapic_v1/method.py", line 145, in __call__
return wrapped_func(*args, **kwargs)
File "/home/ubuntu/prod/venv_trellai/lib/python3.6/site-packages/google/api_core/grpc_helpers.py", line 75, in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
File "<string>", line 3, in raise_from
google.api_core.exceptions.PermissionDenied: 403 The caller does not have permission
The service account has all of these roles:
BigQuery Admin
BigQuery Data Transfer Service Agent
Service Account Token Creator
Storage Admin
I am already setting the JSON authentication credentials in an environment variable, but it still gives the permission error.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = Constants.BIG_QUERY_SERVICE_ACCOUNT_CRED
Can anyone help me out here? Thanks in advance.
Take a look at this page on authentication: https://cloud.google.com/bigquery/docs/authentication/service-account-file#python
Assuming you're using a service account, you can provide the credentials explicitly to confirm they work as expected:
from google.cloud import bigquery
from google.oauth2 import service_account

# TODO(developer): Set key_path to the path to the service account key file.
# key_path = "path/to/service_account.json"

credentials = service_account.Credentials.from_service_account_file(
    key_path, scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
client = bigquery.Client(credentials=credentials, project=credentials.project_id)
I would recommend checking that the service account you are using belongs to the project you are working in and has all the permissions needed to schedule the query. My best guess is that the service account is pointing at another project.
Also, the service account needs one extra role: "Service Account Token Creator".
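If the environment variable is being picked up too late (for example, set after the client library first loads credentials), you can pass the credentials to the transfer client explicitly. A minimal sketch; the key file path is a placeholder:
from google.cloud import bigquery_datatransfer
from google.oauth2 import service_account

# Assumed path to the service account key file.
credentials = service_account.Credentials.from_service_account_file(
    "path/to/service_account.json",
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
transfer_client = bigquery_datatransfer.DataTransferServiceClient(
    credentials=credentials
)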
I went to Google Cloud and enabled a project, billing, and the Cloud Speech-to-Text API. Then I downloaded a .json key file and tried to execute this basic code in PyCharm:
import os
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = "instant-medium-282.json"
from google.cloud import speech_v1
from google.cloud.speech_v1 import enums
client = speech_v1.SpeechClient()
encoding = enums.RecognitionConfig.AudioEncoding.FLAC
sample_rate_hertz = 44100
language_code = 'en-US'
config = {'encoding': encoding, 'sample_rate_hertz': sample_rate_hertz, 'language_code': language_code}
uri = 'gs://bucket_name/file_name.flac'
audio = {'uri': uri}
response = client.recognize(config, audio)
However, I keep getting this error:
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 57, in error_remapped_callable
return callable_(*args, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/grpc/_channel.py", line 826, in __call__
return _end_unary_response_blocking(state, call, False, None)
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/grpc/_channel.py", line 729, in _end_unary_response_blocking
raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.PERMISSION_DENIED
details = "The billing account for the owning project is disabled in state absent"
debug_error_string = "{"created":"#1593884707.640503000","description":"Error received from peer ipv6:[2607:f8b0:4009:813::200a]:443","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"The billing account for the owning project is disabled in state absent","grpc_status":7}"
>
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/Users/sal.py", line 16, in <module>
response = client.recognize(config, audio)
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/google/cloud/speech_v1/gapic/speech_client.py", line 255, in recognize
return self._inner_api_calls["recognize"](
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py", line 143, in __call__
return wrapped_func(*args, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/google/api_core/retry.py", line 281, in retry_wrapped_func
return retry_target(
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/google/api_core/retry.py", line 184, in retry_target
return target()
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/google/api_core/timeout.py", line 214, in func_with_timeout
return func(*args, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
File "<string>", line 3, in raise_from
google.api_core.exceptions.PermissionDenied: 403 The billing account for the owning project is disabled in state absent
I also confirmed with Google Cloud customer support that billing is enabled as it should be. Any suggestions on how to fix this error?
I ran into this same error while uploading a file using the CLI: I was trying to copy into a bucket that did not actually exist, and it threw "AccessDeniedException: 403 The billing account for the owning project is disabled in state closed".
If all is well with billing, perhaps the bucket you were pointing to does not exist?
Just wanted to flag that there seem to be non-billing reasons this error can be thrown.
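A quick way to check is to list the bucket directly; the bucket name below is a placeholder:
gsutil ls gs://your-bucket-name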
The "The billing account for the owning project is disabled in state absent" error message,may happen when the API has just been enabled recently. So it is recommended to try again.
In my case,
"AccessDeniedException: 403 The billing account for the owning project is disabled in state closed"
happened because I hadn't added a billing account to the project I created.
You can link your project to a billing account on the Billing Projects tab.
Once you do so, wait some time and the error should be gone.
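If you prefer the command line, the project can also be linked with gcloud; a sketch with placeholder IDs (on older gcloud releases this command lives under gcloud beta billing):
gcloud billing projects link my-project-id --billing-account=000000-000000-000000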
I am attempting to retrieve a list of files with a specific prefix from S3 using an AWS Lambda function. I bundle the Lambda with boto3-1.9.244 (the latest version). When I run the Lambda, I receive a SyntaxError on the S3 resource assignment, although it could have something to do with the Boto3 session.
I'm using Python 3.6, and AWS Lambda provides boto3-1.9.221 and botocore-1.12.221. When I run the code without bundling the latest version of boto3, it works. My current workaround is to bundle boto3-1.9.221 with the Lambda code rather than the latest version of boto3.
import boto3
s3 = boto3.resource('s3')
I expect it to create an s3 resource, but I get this error:
invalid syntax (_base.py, line 414): SyntaxError
Traceback (most recent call last):
File "/var/task/lambda_function.py", line 20, in lambda_handler
s3 = boto3.resource('s3')
File "/var/task/boto3/__init__.py", line 100, in resource
return _get_default_session().resource(*args, **kwargs)
File "/var/task/boto3/session.py", line 389, in resource
aws_session_token=aws_session_token, config=config)
File "/var/task/boto3/session.py", line 263, in client
aws_session_token=aws_session_token, config=config)
File "/var/task/botocore/session.py", line 839, in create_client
client_config=config, api_version=api_version)
File "/var/task/botocore/client.py", line 80, in create_client
cls = self._create_client_class(service_name, service_model)
File "/var/task/botocore/client.py", line 110, in _create_client_class
base_classes=bases)
File "/var/task/botocore/hooks.py", line 356, in emit
return self._emitter.emit(aliased_event_name, **kwargs)
File "/var/task/botocore/hooks.py", line 228, in emit
return self._emit(event_name, kwargs)
File "/var/task/botocore/hooks.py", line 211, in _emit
response = handler(**kwargs)
File "/var/task/boto3/utils.py", line 61, in _handler
module = import_module(module)
File "/var/task/boto3/utils.py", line 52, in import_module
__import__(name)
File "/var/task/boto3/s3/inject.py", line 15, in <module>
from boto3.s3.transfer import create_transfer_manager
File "/var/task/boto3/s3/transfer.py", line 127, in <module>
from s3transfer.exceptions import RetriesExceededError as \
File "/var/task/s3transfer/__init__.py", line 134, in <module>
import concurrent.futures
File "/var/task/concurrent/futures/__init__.py", line 8, in <module>
from concurrent.futures._base import (FIRST_COMPLETED,
File "/var/task/concurrent/futures/_base.py", line 414
raise exception_type, self._exception, self._traceback
^
SyntaxError: invalid syntax
Yes, it is supported, so this issue is not related to the API version.
You can access a specific API version just by replacing latest with the version number you want in the URL.
Latest
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#bucket
1.9.244
https://boto3.amazonaws.com/v1/documentation/api/1.9.244/reference/services/s3.html#bucket
It turns out the issue was that I was installing the requirements with Python 2 rather than Python 3. Installing under Python 2 pulls in the futures backport of concurrent.futures, whose Python 2-only raise syntax (visible at the bottom of the traceback) shadows the Python 3 standard-library module. By installing the requirements with Python 3, I no longer received a syntax error.
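For example, when bundling dependencies for a Python 3.6 Lambda, install them with the matching interpreter (the version pin and target directory here are illustrative):
python3 -m pip install boto3==1.9.244 -t ./package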
It looks like your Lambda function doesn't have an IAM role that grants access to S3. You may specify the access key and secret key on the resource directly,
resource = boto3.resource(
    's3',
    # Hard-coded strings as credentials, not recommended.
    aws_access_key_id='AKIAIO5FODNN7E******',  # not real
    aws_secret_access_key='ABCDEF+c2L7yXeGvUyrPgYsDnWRRC1AYE******'  # not real
)
or you have to grant the right permissions to the Lambda function's execution role.
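For the recommended approach, attach an S3 read policy to the Lambda's execution role instead; a minimal sketch (the bucket name is a placeholder):
{
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:ListBucket", "s3:GetObject"],
        "Resource": [
            "arn:aws:s3:::my-bucket",
            "arn:aws:s3:::my-bucket/*"
        ]
    }]
}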
I SSH to my EC2 instance. I can run these commands and they work perfectly:
aws sqs list-queues
aws s3 ls
I have a small Python script that pulls data from a database, formats it as XML, and then uploads the file to S3. The upload fails with this error:
Traceback (most recent call last):
File "./data_test/data_analytics/lexisnexis/async2.py", line 289, in <module>
insert_parallel(engine, qy, Create_Temp.profile_id, nworkers)
File "./data_test/data_analytics/lexisnexis/async2.py", line 241, in insert_parallel
s3upload(bucketname, keyname, f)
File "./data_test/data_analytics/lexisnexis/async2.py", line 89, in s3upload
bucket = conn.get_bucket(bucketname)
File "/usr/lib/python2.7/dist-packages/boto/s3/connection.py", line 506, in get_bucket
return self.head_bucket(bucket_name, headers=headers)
File "/usr/lib/python2.7/dist-packages/boto/s3/connection.py", line 525, in head_bucket
response = self.make_request('HEAD', bucket_name, headers=headers)
File "/usr/lib/python2.7/dist-packages/boto/s3/connection.py", line 668, in make_request
retry_handler=retry_handler
File "/usr/lib/python2.7/dist-packages/boto/connection.py", line 1071, in make_request
retry_handler=retry_handler)
File "/usr/lib/python2.7/dist-packages/boto/connection.py", line 1030, in _mexe
raise ex
SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)
How can the script fail when the aws CLI works?
To be clear, I'm running the Python script as the same user, from the same EC2 instance, as the aws CLI commands.
aws --version
aws-cli/1.11.176 Python/2.7.12 Linux/4.9.43-17.38.amzn1.x86_64 botocore/1.7.34
The last line of your error message tells you the problem:
SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)
Your issue could be one of the following:
1) There is an error with the certificate of the server you are connecting to.
2) The certificate chain is incomplete for the server you are connecting to.
3) You are missing "cacert.pem". Do a web search for "cacert.pem"; this is a common problem, and there is a lot of information on downloading and installing this file.
Certificate verification in Python
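To confirm the diagnosis, you can temporarily disable certificate validation in boto, or point boto at an explicit CA bundle. A minimal sketch; the bundle path is an assumption, and skipping validation is for diagnosis only, never for production:
import boto

# Diagnosis only: if this succeeds, the CA bundle is the problem, not the credentials.
conn = boto.connect_s3(validate_certs=False)

# Proper fix: point boto at a valid CA bundle via ~/.boto:
# [Boto]
# ca_certificates_file = /etc/ssl/certs/ca-certificates.crt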
I created a new config file:
$ sudo vi ~/.boto
There I pasted my credentials (as described in the boto docs on Read the Docs):
[Credentials]
aws_access_key_id = YOURACCESSKEY
aws_secret_access_key = YOURSECRETKEY
I'm trying to check the connection:
import boto
boto.set_stream_logger('boto')
s3 = boto.connect_s3("us-east-1")
and this is the output:
2014-11-26 14:05:49,532 boto [DEBUG]:Using access key provided by client.
2014-11-26 14:05:49,532 boto [DEBUG]:Retrieving credentials from metadata server.
2014-11-26 14:05:50,539 boto [ERROR]:Caught exception reading instance data
Traceback (most recent call last):
File "/Library/Python/2.7/site-packages/boto/utils.py", line 210, in retry_url
r = opener.open(req, timeout=timeout)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 404, in open
response = self._open(req, data)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 422, in _open
'_open', req)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 382, in _call_chain
result = func(*args)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 1214, in http_open
return self.do_open(httplib.HTTPConnection, req)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 1184, in do_open
raise URLError(err)
URLError: <urlopen error timed out>
2014-11-26 14:05:50,540 boto [ERROR]:Unable to read instance data, giving up
Traceback (most recent call last):
File "/Users/user/PycharmProjects/project/untitled.py", line 8, in <module>
s3 = boto.connect_s3("us-east-1")
File "/Library/Python/2.7/site-packages/boto/__init__.py", line 141, in connect_s3
return S3Connection(aws_access_key_id, aws_secret_access_key, **kwargs)
File "/Library/Python/2.7/site-packages/boto/s3/connection.py", line 190, in __init__
validate_certs=validate_certs, profile_name=profile_name)
File "/Library/Python/2.7/site-packages/boto/connection.py", line 569, in __init__
host, config, self.provider, self._required_auth_capability())
File "/Library/Python/2.7/site-packages/boto/auth.py", line 975, in get_auth_handler
'Check your credentials' % (len(names), str(names)))
boto.exception.NoAuthHandlerFound: No handler was ready to authenticate. 1 handlers were checked. ['HmacAuthV1Handler'] Check your credentials
Why can't it find the credentials?
Did I do something wrong?
Your issue is:
The string "us-east-1" you provide as the first argument is treated as the aws_access_key_id.
What you want is:
First create a connection; note that a connection carries no region or location info.
conn = boto.connect_s3('your_access_key', 'your_secret_key')
Then, when you want to do something with a bucket, pass the region info as an argument.
from boto.s3.connection import Location
conn.create_bucket('mybucket', location=Location.USWest)
or:
conn.create_bucket('mybucket', location='us-west-1')
By default, the location is the empty string which is interpreted as the US Classic Region, the original S3 region. However, by specifying another location at the time the bucket is created, you can instruct S3 to create the bucket in that location.
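Since the credentials are already in ~/.boto, you can also call connect_s3() with no arguments and let boto pick them up from the [Credentials] section. A minimal sketch:
import boto

conn = boto.connect_s3()  # reads aws_access_key_id / aws_secret_access_key from ~/.boto
for bucket in conn.get_all_buckets():
    print(bucket.name)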