Following the sample code provided in the boto3 documentation for using the workmailmessageflow service:
import boto3
client = boto3.client('workmailmessageflow')
triggers an UnknownServiceError: "Unknown service: 'workmailmessageflow'. Valid service names are..."
The boto3 version reported by AWS Lambda Python 3.7 is 1.9.221. Any thoughts on why workmailmessageflow is not recognized?
Ref: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/workmailmessageflow.html
Because workmailmessageflow first appeared in Boto3 1.9.228.
You have to use a custom version of Boto3: either use a layer or package the latest version together with your code.
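Once a newer Boto3 is bundled, a quick way to confirm which SDK the function actually resolves is a minimal sketch like this (the 1.9.228 threshold is the one mentioned above):
import boto3

# The stock Lambda Python 3.7 runtime ships 1.9.221; workmailmessageflow
# needs an SDK packaged with the function (or in a layer) at >= 1.9.228
print(boto3.__version__)

client = boto3.client('workmailmessageflow')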
I want to launch a Spark job on EMR Serverless from Airflow. I want to use Spark 3.3.0 and Scala 2.13 but the 6.9.0 EMR Release ships with Scala 2.12. I created a FAT jar including all Spark dependencies and it won't work either. As an alternative, I am trying to use an EMR custom image by creating an application using --image-configuration with the Airflow operator but it won't just pass through all the arguments from the boto API.
create_app = EmrServerlessCreateApplicationOperator(
    task_id="create_my_app",
    job_type="SPARK",
    release_label="emr-6.9.0",
    config={
        "name": "data-ingestion",
        "imageConfiguration": {
            "imageUri": "xxxxxxx.dkr.ecr.eu-west-1.amazonaws.com/emr-custom-image:0.0.1"
        },
    },
)
Airflow gives the following error message:
Unknown parameter in input: "imageConfiguration", must be one of:
name, releaseLabel, type, clientToken, initialCapacity, maximumCapacity, tags, autoStartConfiguration, autoStopConfiguration, networkConfiguration
This other config won't work either:
config={"name": "data-ingestion",
"imageUri": "xxxxxxx.dkr.ecr.eu-west-1.amazonaws.com/emr-custom-image:0.0.1"})
Does anybody have any ideas other than downgrading my Scala version?
The Airflow operator passes the arguments to the boto3 client, and this client creates the application.
The imageConfiguration parameter was added to the boto3 client in 1.26.44 (PR), and the other parameters were added in different versions (please check the changelog).
So you can try to upgrade the boto3 version on your Airflow server, provided that it is compatible with the other dependencies; if not, you may need to upgrade your Airflow version.
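For reference, a minimal sketch of the underlying boto3 call the operator ends up making once the SDK is new enough (the ECR URI is the placeholder from the question):
import boto3

client = boto3.client("emr-serverless")

# imageConfiguration is only accepted by boto3 >= 1.26.44
response = client.create_application(
    name="data-ingestion",
    releaseLabel="emr-6.9.0",
    type="SPARK",
    imageConfiguration={
        "imageUri": "xxxxxxx.dkr.ecr.eu-west-1.amazonaws.com/emr-custom-image:0.0.1"
    },
)
print(response["applicationId"])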
I want to authenticate a user to AWS cognito from within pyscript.
In Python, one would use the boto3 module. In pyscript, however, I have to make the calls to AWS in async mode and await the answer, and boto3 doesn't support asyncio calls.
I found the python aioboto3 module.
Unfortunately, pyscript reports that it's unable to install the package:
(PY1001): Unable to install package(s) 'aioboto3'. Reason: Can't find a pure Python 3 Wheel for package(s) 'aioboto3'
The same error (for another package) is mentioned in:
Couldn't find a pure Python 3 wheel for 'tensorflow'. You can use `micropip.install(..., keep_going=True)` to get a list of all packages
Does anybody have an idea how I can interface with AWS from pyscript?
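For comparison, this is the await pattern being asked for, as an aioboto3 sketch that runs under regular CPython (not pyscript); the client ID and region are hypothetical placeholders:
import asyncio
import aioboto3

async def sign_in(username, password):
    session = aioboto3.Session()
    # cognito-idp clients in aioboto3 are async context managers
    async with session.client("cognito-idp", region_name="eu-west-1") as cognito:
        # USER_PASSWORD_AUTH must be enabled on the app client
        resp = await cognito.initiate_auth(
            ClientId="YOUR_APP_CLIENT_ID",  # hypothetical placeholder
            AuthFlow="USER_PASSWORD_AUTH",
            AuthParameters={"USERNAME": username, "PASSWORD": password},
        )
        return resp["AuthenticationResult"]["AccessToken"]

token = asyncio.run(sign_in("user@example.com", "secret"))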
I'm trying to get data from S3 with pyspark on an AWS EMR cluster.
I keep getting this error: An error occurred while calling o27.parquet. : java.lang.NoClassDefFoundError: com/amazonaws/services/s3/model/MultiObjectDeleteException.
I have tried with different versions of jars/clusters, still no results.
from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext

conf = SparkConf().set(
    "spark.jars",
    "/usr/lib/spark/jars/hadoop-aws-3.2.1.jar,/usr/lib/spark/aws-java-sdk-s3-1.11.873.jar")
sc = SparkContext(conf=conf)
sqlContext = SQLContext(sc)
df2 = sqlContext.read.parquet("s3a://stackdev2prq/Badges/*")
I'm using hadoop-aws-3.2.1.jar and aws-java-sdk-s3-1.11.873.jar.
Spark 3.0.1 on Hadoop 3.2.1 YARN
I know I need the proper version of the aws-java-sdk, but how can I check which version I should download?
mvnrepository provides the information.
I don't see it for 3.2.1. Looking at the hadoop-project pom and the JIRA versions, HADOOP-15642 says "1.11.375"; the move to 1.11.563 only went in with Hadoop 3.2.2.
Do put the whole (huge) aws-java-sdk-bundle on the classpath. That shades everything and avoids version-mismatch hell with jackson, httpclient, etc.
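Putting that together, the SparkConf would look roughly like this (a sketch; it assumes you downloaded aws-java-sdk-bundle-1.11.375.jar, the version HADOOP-15642 names for Hadoop 3.2.1, into /usr/lib/spark/jars/):
from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext

# One shaded bundle jar instead of individual aws-java-sdk-* artifacts
conf = SparkConf().set(
    "spark.jars",
    "/usr/lib/spark/jars/hadoop-aws-3.2.1.jar,"
    "/usr/lib/spark/jars/aws-java-sdk-bundle-1.11.375.jar")
sc = SparkContext(conf=conf)
sqlContext = SQLContext(sc)
df2 = sqlContext.read.parquet("s3a://stackdev2prq/Badges/*")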
That said: if you are working with EMR, you should just use the s3:// URLs and pick up the EMR team's S3 connector.
I am creating a DeepLens project to recognise when one of a select group of people is scanned by the camera.
The project uses a Lambda, which processes the images and calls the AWS Rekognition API.
When I trigger the API from my local machine, I get a good response.
When I trigger the API from the AWS console, I get a failed response.
Problem
After much digging, I found that boto3 (the AWS Python library) is at version:
1.9.62 - on my local machine
1.8.9 - on AWS console
Question
Can I upgrade the boto3 library version on the AWS Lambda console? If so, how?
If you don't want to package a more recent boto3 version with your function, you can download boto3 with each invocation of the Lambda. Remember that /tmp/ is the only directory that Lambda will allow you to write to, so you can use it to temporarily download boto3:
import sys
from pip._internal import main

# Install boto3 into /tmp/, the only writable path in the Lambda sandbox
main(['install', '-I', '-q', 'boto3', '--target', '/tmp/',
      '--no-cache-dir', '--disable-pip-version-check'])

# Put /tmp/ first on the path so the fresh install wins over the runtime SDK
sys.path.insert(0, '/tmp/')

import boto3
from botocore.exceptions import ClientError

def handler(event, context):
    print(boto3.__version__)
You can achieve the same with either a Python deployment package with dependencies or with a virtual environment.
These are the available options; beyond that, you can also try to contact the Amazon team to ask whether they can help you with an upgrade.
I know you're asking for a solution through the console, but as far as I know this is not possible.
To solve this you need to provide the boto3 version you require to your lambda (either with the solution from user1998671 or with what Shivang Agarwal is proposing). A third solution is to provide the required boto3 version as a layer for the lambda. The big advantage of the layer is that you can re-use it for all your lambdas.
This can be achieved by following the guide from AWS (the following is mainly copied from the linked guide from AWS):
IMPORTANT: Make sure to replace boto3-mylayer with a name that suits you.
Create a lib folder by running the following commands:
LIB_DIR=boto3-mylayer/python
mkdir -p $LIB_DIR
Install the library to LIB_DIR by running the following command:
pip3 install boto3 -t $LIB_DIR
Zip all the dependencies to /tmp/boto3-mylayer.zip by running the following command:
cd boto3-mylayer
zip -r /tmp/boto3-mylayer.zip .
Publish the layer by running the following command:
aws lambda publish-layer-version --layer-name boto3-mylayer --zip-file fileb:///tmp/boto3-mylayer.zip
The command returns the new layer's Amazon Resource Name (ARN), similar to the following one:
arn:aws:lambda:region:$ACC_ID:layer:boto3-mylayer:1
To attach this layer to your lambda execute the following:
aws lambda update-function-configuration --function-name <name-of-your-lambda> --layers <layer ARN>
To verify the boto3 and botocore versions in your lambda, you can simply add the following two print statements:
print(boto3.__version__)
print(botocore.__version__)
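Wrapped in a complete handler, the check looks like this (sketch):
import boto3
import botocore

def handler(event, context):
    # Both versions should now come from the attached layer,
    # not from the SDK baked into the Lambda runtime
    print(boto3.__version__)
    print(botocore.__version__)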
I'd like to use the Boto3 put_bucket_encryption inside a lambda function, but the current Lambda execution environment is at botocore version 1.7.37, and put_bucket_encryption was introduced in botocore 1.7.41.
So I'd like to package up my local version of boto3/botocore.
I've included pip packages in lambda functions using the Serverless Framework, along with serverless-python-requirements, but it doesn't seem to work for boto3/botocore.
The function responds to a CreateBucket event and tries to put_bucket_encryption, but fails with
'S3' object has no attribute 'put_bucket_encryption': AttributeError
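For context, the failing call looks roughly like this (a sketch of such a handler; the event shape is an assumption, since CreateBucket notifications can arrive in different formats):
import boto3

def handler(event, context):
    # Bucket name as delivered by a CloudTrail-based CreateBucket event
    # (assumed shape -- adjust to your actual trigger)
    bucket = event["detail"]["requestParameters"]["bucketName"]

    s3 = boto3.client("s3")
    # put_bucket_encryption requires botocore >= 1.7.41;
    # the Lambda runtime here shipped 1.7.37
    s3.put_bucket_encryption(
        Bucket=bucket,
        ServerSideEncryptionConfiguration={
            "Rules": [
                {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
            ]
        },
    )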
How can I force my lambda function to use a more up to date botocore?
I was able to resolve this with kichik's help.
What I missed was the section about omitting packages in the serverless-python-requirements docs. Specifically:
By default, this will not install the AWS SDKs that are already installed on Lambda.
So in my serverless.yml I added
custom:
  pythonRequirements:
    noDeploy:
      - pytest
Once I deployed, it was using my packaged versions of boto3/botocore.
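For completeness, a minimal serverless.yml with the plugin wired in might look like this (a sketch; the service name and runtime are placeholders):
service: bucket-encryption  # placeholder

provider:
  name: aws
  runtime: python3.6

plugins:
  - serverless-python-requirements

custom:
  pythonRequirements:
    noDeploy:
      - pytest  # overriding the default list stops the plugin from excluding boto3/botocore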