"import requests" in AWS Lambda without a layer? - amazon-web-services

I have a Lambda function in Python 3.9 which is pretty simple; it just sends a POST request. It has worked fine for the past few weeks.
I've deleted and remade the stack several times in CloudFormation and have had no issues.
Randomly, today it started throwing the error:
"errorMessage": "Unable to import module 'index': No module named 'requests'",
"errorType": "Runtime.ImportModuleError"
index.py is the file containing the Lambda function's handler.
So I looked it up, and everyone says that 'requests' is not prepackaged into Lambda, and you need to create a layer where you manually install it into a zip etc.
How is this possible? It worked for weeks without having to create a layer, and the code was not touched.
But now 'import requests' just stops working?
There are of course alternatives to the requests module; 'urllib' comes to mind. But this is bugging me: how is it possible that it worked for weeks, but randomly just stops being able to import this module?
It's like some kind of Mandela effect. Every single person says you need to create a venv and package the dependencies manually. I've never had to do that, and it worked for AGES. I feel like I'm going crazy...

Did you possibly change the version of Python used by the function?
From Upcoming changes to the Python SDK in AWS Lambda | AWS Compute Blog:
In response to customer feedback, we have decided to cancel the change described in this blog post. The version of the AWS SDK included in the AWS Lambda runtimes for Python 2.7, Python 3.6 and Python 3.7 will continue to include the ‘requests’ module in Botocore. No action is required for customers using these runtimes. The Lambda runtimes for Python 3.8 and later do not include the ‘requests’ module. Customers using ‘requests’ with these runtimes should package the ‘requests’ module with their function code or as a Lambda layer.
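If repackaging isn't convenient, the standard-library urllib mentioned in the question can send the same POST request without any layer. A minimal sketch, assuming a placeholder endpoint and JSON payload:

import json
import urllib.request

def handler(event, context):
    # Placeholder endpoint and payload -- replace with the real ones.
    payload = json.dumps({"message": "hello"}).encode("utf-8")
    req = urllib.request.Request(
        "https://example.com/endpoint",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # urlopen raises on HTTP/network errors; the timeout keeps the function from hanging.
    with urllib.request.urlopen(req, timeout=10) as resp:
        return {"statusCode": resp.status, "body": resp.read().decode("utf-8")}

Otherwise, package requests with the function code or publish it as a layer, as the blog post describes.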

Related

No module named 'openpyxl' in AWS Lambda

I am trying to write code in an AWS Lambda function to extract PDF invoice data from the AWS Textract service and save the data to Excel. To do this, I installed the openpyxl library, created a zip file for it, and created a layer for the Lambda function that uses the openpyxl library. I am getting the following error: No module named 'openpyxl'. I would appreciate your assistance in resolving it.
Have you tried Textractor (pip install amazon-textract-textractor)? It comes with built-in export-to-Excel features and with pre-built Lambda layers on the official GitHub repository: https://aws-samples.github.io/amazon-textract-textractor/using_in_lambda.html
Note that the available Lambda layer uses XlsxWriter instead of openpyxl.
Disclaimer: I am a maintainer of Textractor.
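For completeness, a rough sketch of how that could look inside a handler. The constructor argument and the method names (start_document_analysis, export_tables_to_excel) are assumptions on my part based on the linked documentation, so verify them there; the S3 path is a placeholder.

from textractor import Textractor
from textractor.data.constants import TextractFeatures

def handler(event, context):
    # Assumed Textractor API -- check the docs linked above.
    extractor = Textractor(region_name="us-east-1")
    # Asynchronous analysis is typically needed for multi-page PDFs stored in S3.
    document = extractor.start_document_analysis(
        file_source="s3://my-bucket/invoices/invoice.pdf",  # placeholder object
        features=[TextractFeatures.TABLES],
    )
    # Writes the detected tables into one workbook (XlsxWriter under the hood).
    document.export_tables_to_excel("/tmp/invoice_tables.xlsx")
    return {"statusCode": 200}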

ModuleNotFoundError: No module named 'aiohttp' in AWS Glue

I am using AWS Glue to create an ETL workflow, where I fetch data from an API and load it into RDS. In AWS Glue, I use a PySpark script. In the same script, I use the Python modules 'aiohttp' and 'asyncio' to call my API asynchronously. But AWS Glue throws a "Module not found" error, and only for aiohttp.
I have already tried different versions of the aiohttp module and tested them in the Glue job, but it still throws the same error. Can someone please help me with this?
Glue 2.0
AWS Glue version 2.0 lets you provide additional Python modules or different versions at the job level. You can use the --additional-python-modules job parameter with a list of comma-separated Python modules to add a new module or change the version of an existing module.
Also, within the --additional-python-modules option you can specify an Amazon S3 path to a Python wheel module.
This link to official documentation lists all modules already available. If you need a different version or need one to be installed, it can be specified in the parameter mentioned above.
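As a sketch of where that parameter goes when the job is created with boto3 (the job name, role, script location, and pinned version below are placeholders):

import boto3

glue = boto3.client("glue")

glue.create_job(
    Name="api-to-rds-etl",                               # placeholder job name
    Role="arn:aws:iam::123456789012:role/GlueJobRole",   # placeholder role ARN
    GlueVersion="2.0",
    Command={
        "Name": "glueetl",
        "ScriptLocation": "s3://my-bucket/scripts/etl.py",
        "PythonVersion": "3",
    },
    DefaultArguments={
        # Comma-separated modules; versions can be pinned, or an S3 wheel path given.
        "--additional-python-modules": "aiohttp==3.8.4",
    },
)

The same key/value pair can also be set under the job parameters in the Glue console instead of through the API.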
Glue 1.0 & 2.0
You can zip the Python library, upload it to S3, and specify the path in the --extra-py-files job parameter.
See the link to the official documentation for more information.
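And a minimal sketch of passing such a zip at run time (the job name, bucket, and file name are placeholders):

import boto3

glue = boto3.client("glue")

glue.start_job_run(
    JobName="api-to-rds-etl",  # placeholder job name
    Arguments={
        # Zip of the library, uploaded to S3 beforehand; the path is a placeholder.
        "--extra-py-files": "s3://my-bucket/libs/aiohttp.zip",
    },
)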

Unable to import module Error in AWS Lambda Nodejs

I am working on an AWS Lambda function where I need to use AWS Textract. For this I have used the AWS SDK; with the AWS SDK I was able to import S3, but AWS Textract is not working. When I deploy and test, it shows:
Unable to import module 'src/functions/routes/handler': Error
I think it has something to do with packaging; maybe it is not packaging the relevant file. But I'm not sure, and I don't know how to make it package the file if that is the problem. Any suggestions are appreciated.
Since you have not given any code to go on, I will consider the possibility that you have properly written your import code and this error is a valid response to an issue. In that case, you should note that there are issues with importing Textract in Lambda via the aws-sdk. See this reported issue:
https://github.com/aws/aws-sdk-js/issues/2728
For a thorough and thoughtful example of using Textract with a Lambda service, consider this excellent article posted with complete details:
https://medium.com/@sumindaniro/aws-textract-with-lambda-walkthrough-ed4473aedd9d

Dynamically append function to AppSync Pipeline Resolver via CloudFormation

I am currently developing an AppSync-based API in a domain-driven manner, so we need to add a function to an already created pipeline resolver. Does anybody know if there is any chance of doing this via CloudFormation without using a custom resource?
Thanks in advance, Sven
Terraform can do this neatly if you can build this pull request yourself or vote in this issue to get it merged into a public release.
The new syntax is described here.
The build process is actually quite simple. It took me about 30 min end-to-end.
Install GoLang.
Clone the repo with the changes and sync it with the main (upstream) repo.
Make sure you clone it into the go\src\github.com\terraform-providers\terraform-provider-aws folder.
Run go build from go\src\github.com\terraform-providers\terraform-provider-aws
Replace .terraform\plugins\...\terraform-provider-aws-* executable with the one you compiled.
Run terraform init
Test by trying to import a function: terraform import aws_appsync_function.example xxxxx-yyyyy
I hope the pull request gets merged by the time you read this.

Problem creating Lambda function that has a Layer using boto3

If I try to use the boto3 Lambda create_function() call to create a Lambda function, and I try to include layers via the Layers=['string'] parameter, I get the following error message:
Unknown parameter in input: "Layers", must be one of: FunctionName, Runtime, Role, Handler, Code, Description, Timeout, MemorySize, Publish, VpcConfig, DeadLetterConfig, Environment, KMSKeyArn, TracingConfig, Tags
... any ideas? The documentation suggests that this should work, but something is clearly off here. NOTE: I have a similar problem with "Layers" in update_function_configuration() as well.
My guess is that the version of boto3 that the AWS Lambda console uses has not been updated/refreshed yet to support Layers, because when I run the same code locally on a machine with a fairly recent version of boto3, it runs without any problems. I have already tried both Python runtimes (3.6 and 3.7) listed in the AWS console, but neither worked. These runtimes have boto3 versions 1.7.74 and 1.9.42, respectively, while my local machine has 1.9.59. So perhaps the addition of Lambda Layers occurred between 1.9.42 and 1.9.59.
My guess is that the version of boto3 that the AWS Lambda console uses has not been updated/refreshed yet to support Layers.
That's completely right. AWS usually updates the available libraries on AWS Lambda regularly, but hasn't updated them for several months now for unknown reasons.
The supported API endpoints are actually not defined in boto3, but in botocore.
Currently botocore 1.10.74 is available on AWS Lambda, while support for AWS Lambda Layers got added in botocore 1.12.56.
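A quick way to see which versions a given runtime actually ships is to return them from a trivial handler:

import boto3
import botocore

def handler(event, context):
    # Layers support requires botocore >= 1.12.56; older bundled versions
    # reject the "Layers" parameter with the error shown in the question.
    return {"boto3": boto3.__version__, "botocore": botocore.__version__}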
To avoid such incompatibilities between your code and the versions of the available libraries, you should create a deployment package containing boto3 and botocore in addition to your AWS Lambda function code, so your code uses your bundled versions instead of the ones AWS provides. That's what AWS suggests as part of their best practices as well:
Control the dependencies in your function's deployment package.
The AWS Lambda execution environment contains a number of libraries such as the AWS SDK for the Node.js and Python runtimes (a full list can be found here: Lambda Execution Environment and Available Libraries). To enable the latest set of features and security updates, Lambda will periodically update these libraries. These updates may introduce subtle changes to the behavior of your Lambda function. To have full control of the dependencies your function uses, we recommend packaging all your dependencies with your deployment package.
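With an up-to-date boto3/botocore bundled into the deployment package (or running locally, as noted above), the call from the question works as written. A sketch with placeholder names and ARNs:

import boto3

client = boto3.client("lambda")

client.create_function(
    FunctionName="my-function",                              # placeholder
    Runtime="python3.7",
    Role="arn:aws:iam::123456789012:role/lambda-exec-role",  # placeholder role ARN
    Handler="index.handler",
    Code={"S3Bucket": "my-bucket", "S3Key": "function.zip"}, # placeholder package
    Layers=["arn:aws:lambda:us-east-1:123456789012:layer:my-layer:1"],  # placeholder layer ARN
)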