AWS boto3 code unable to connect to IoT endpoint

I am trying to send a 'hello world' message to an AWS IoT endpoint.
The Amazon documentation at
https://docs.aws.amazon.com/panorama/latest/dev/applications-awssdk.html
has this simple code sample:
import boto3
iot_client=boto3.client('iot-data')
topic = "panorama/panorama_my-appliance_Thing_a01e373b"
iot_client.publish(topic=topic, payload="my message")
This code works fine when I put it inside a Lambda function.
But when I try to run this code on my PC in a stand-alone Python application, I get the error message:
certificate verify failed: unable to get local issuer certificate
(_ssl.c:1125)
I do have an .aws/credentials file with entries like
[default]
aws_access_key_id = xxxxxxxxxx
aws_secret_access_key = xxxxxxxxxx
I checked that the endpoint is correct: the
aws iot describe-endpoint
command returns a valid ATS endpoint like:
"endpointAddress": "xxxxxxx-ats.iot.us-east-2.amazonaws.com"
If I specify this endpoint when creating the client:
iot_client = boto3.client('iot-data',
                          region_name='us-east-2',
                          endpoint_url='xxxxxxx-ats.iot.us-east-2.amazonaws.com')
I get the error:
ValueError: Invalid endpoint: xxxxxx-ats.iot.us-east-2.amazonaws.com
What am I missing? Do I need to download any certificate files? If so, this code does not seem to use any certificates.
The same setup is working with S3 or DynamoDB:
s3 = boto3.resource('s3')
and
dynamodb = boto3.resource('dynamodb')
are working fine on my PC.

I had this same issue, and adding https:// to the endpoint URL fixed it for me:
iot_client = boto3.client('iot-data',
                          region_name='us-east-2',
                          endpoint_url='https://xxxxxxx-ats.iot.us-east-2.amazonaws.com')

Related

Unable to execute HTTP request with aws sdk and localstack

I'm trying to hit the LocalStack S3 service with the AWS SDK. It works well with the AWS CLI, but the AWS SDK behaves strangely: it prepends the bucket name to the URL and then reports that it is unable to connect.
[![INTELLIJ debug][1]][1]
Code is as below
public void testS3() {
    final String localStackS3URL = "http://localhost:4566";
    final String REGION = "us-east-1";
    final AwsClientBuilder.EndpointConfiguration endpoint =
            new AwsClientBuilder.EndpointConfiguration(localStackS3URL, REGION);
    final AmazonS3 client = AmazonS3ClientBuilder.standard()
            .withEndpointConfiguration(endpoint)
            .build();

    if (!client.doesBucketExistV2("test")) {
        client.createBucket("test");
    }
}
Can anyone tell me what is wrong here? It works with the AWS CLI, but the AWS SDK is prefixing the bucket name strangely.
[![cmd aws cli][2]][2]
Thanks in advance
[1]: https://i.stack.imgur.com/wMI8D.png
[2]: https://i.stack.imgur.com/L0jLV.png
Try adding the HTTP client parameter while building the S3 client; it worked for me (note that this builder method is from the AWS SDK for Java v2):
httpClient(UrlConnectionHttpClient.builder().build())

Not possible to connect from Nextcloud to an AWS S3 bucket

I am trying to add external storage to my Nextcloud to use. That would be an AWS S3 bucket. However, this is not possible because I get the following error message:
Exception: Creation of bucket "nextcloud-modul346" failed. Error executing "CreateBucket" on "http://nextcloud-modul346.s3.eu-west-1.amazonaws.com/"; AWS HTTP error: Client error: `PUT http://nextcloud-modul346.s3.eu-west-1.amazonaws.com/` resulted in a `403 Forbidden` response: InvalidAccessKeyId (client): The AWS Access Key Id you provided does not exist in our records. AWSAccessKeyId: ASIARERFVIEWRBG5WD63, RequestId: M6BN3MC6F0214DQM
However, I cannot use IAM users or groups as this is blocked by my organization. Also, I work with the AWS Learner Lab and I have to use S3.
As credentials I have specified in Nextcloud the aws_access_key_id and aws_secret_access_key from Learner Lab. However, I cannot connect with them, and this post hasn't helped either.
Does anyone know a solution to this problem which does not involve IAM?
Thanks for any help!

Boto3 not directing to the endpoint_url

I'm currently trying to connect to my enterprise S3 URL (which is not Amazon Web Services) using boto3, and I get the following error:
EndpointConnectionError: Could not connect to the endpoint URL: "https://s3.fr-par.amazonaws.com/my_buket...."
which is absolutely not the endpoint given in the code.
s3 = boto3.resource(service_name='s3',
                    aws_access_key_id='XXXXXX',
                    aws_secret_access_key='YYYYYYY',
                    endpoint_url='https://my_buket.s3.my_region.my_company_enpoint_url')
my_bucket = s3.Bucket(s3_bucket_name)
bucket_list = []
for file in my_bucket.objects.filter(Prefix='boston.csv'):
    bucket_list.append(file.key)
As can be seen in the error image, boto3 tries to connect to an amazonaws URL, which is not that of my enterprise. Finally, I want to point out that I am able to connect to my enterprise S3 using MinIO (https://docs.min.io/), which indicates there are no errors in the aws_access_key_id, aws_secret_access_key, and endpoint_url I use with boto3.
I have executed the code in a Python 3.9 environment (boto3 1.22.1), an Anaconda 3.9 environment (boto3 1.22.0), and a Jupyter notebook, always with the same error. The OS is Ubuntu 20.04.4 LTS virtualized on Oracle VM VirtualBox.
https://my_buket.s3.my_region.my_company_enpoint_url is not the endpoint. The list of valid S3 endpoints is here. But normally you don't have to specify it explicitly; boto3 will "know" which endpoint to use for each region.
Since some people seem to have the same problem, I'm posting the solution I found.
For some reason the code in the question still doesn't work for me. Instead, I point to my enterprise's S3 by first creating a session and then creating the resource and client from it. Note that no bucket appears in endpoint_url.
Since there is no bucket in endpoint_url, you have access to all buckets associated with the credentials passed, and it is therefore necessary to specify the bucket in the resource and client method calls.
session = boto3.Session(region_name=my_region)
resource = session.resource('s3',
                            endpoint_url='https://s3.my_region.my_company_enpoint_url',
                            aws_access_key_id='XXXXXX',
                            aws_secret_access_key='YYYYYY')
client = session.client('s3',
                        endpoint_url='https://s3.my_region.my_company_enpoint_url',
                        aws_access_key_id='XXXXXX',
                        aws_secret_access_key='YYYYYY')
client.upload_file(path_to_local_file, bucket_name, upload_path,
                   Callback=call,
                   ExtraArgs=ExtraArgs)

boto3 not able to access given region name while taking region provided by AWS Lambda

I have a boto3 client like this:
client = boto3.client('rekognition', region_name="us-east-1")
I am using this client to detect text in an image. The code is deployed in an AWS region where the Rekognition API is not available, but I passed a region where it is available when creating the client. On executing/testing the Lambda function, it gives:
errorMessage": "Could not connect to the endpoint URL: \"https://rekognition.ap-south-1.amazonaws.com/"
Why is it picking ap-south-1 when I provided "us-east-1" in the client?
When I run the same code locally (my default region is ap-south-1, with us-east-1 in the client), it runs fine, but it does not run on AWS Lambda. It does run successfully on Lambda when both regions are the same (us-east-1).
It would be great if anyone could provide a suggestion; help needed soon!
As of March 15th, 2018, AWS Rekognition is not supported in Mumbai (ap-south-1).
See supported regions: Amazon Rekognition - Available Regions

Amazon AWS 403 InvalidAccesskey Error when I run the Amazon S3 Sample

I'm trying to test out AWS S3 with Eclipse using Java by running the Amazon S3 sample, but it doesn't recognise my credentials, and I'm sure my credentials are legitimate. It gives me the following error:
===========================================
Getting Started with Amazon S3
===========================================
Listing buckets
Caught an AmazonServiceException, which means your request made it to Amazon S3, but was rejected with an error response for some reason.
Error Message: Status Code: 403, AWS Service: Amazon S3, AWS Request ID: 057D91D336C1FASC, AWS Error Code: InvalidAccessKeyId, AWS Error Message: The AWS Access Key Id you provided does not exist in our records.
HTTP Status Code: 403
AWS Error Code: InvalidAccessKeyId
Error Type: Client
Request ID: 057D91D336C1FASC
A little update here:
There is a credentials file that AWS creates on the system; in my case it was '/Users/macbookpro/.aws/credentials'.
The file in this location determines the default accessKeyId and related settings, so go ahead and update it.
So I ran into the same issue, but I think I figured it out.
I was using Node.js, but the problem should be the same, since the issue is how they have structured their config object.
In JavaScript, if you run this in the backend:
var aws = require('aws-sdk');
aws.config.accessKeyId = "Key bablbalab";
console.log(aws.config.accessKeyId);
you will find it prints out something different, because the correct way of setting the accessKeyId isn't what the official website tutorial provides, i.e.:
aws.config.accessKeyId = "balbalb";
or
aws.config.loadFromPath = ('./awsConfig.json');
or any of that.
If you log the entire aws.config, you will find the correct way is:
console.log(aws.config);
console.log(aws.config.credentials.secretAccessKey);
aws.config.credentials.secretAccessKey = "Key balbalab";
You see the structure of the object? That's the inconsistency.