Error importing AWS Lambda function
The following error was encountered when attempting to import your function
spawn unzip ENOENT
That is the particular error on import. I am using the Ubuntu AMI.
Is there any method in the AWS CLI to download the Lambda function and upload it again with the libraries installed?
I had this error when trying to import too - fixed by installing unzip on the server running Cloud9 (it was Ubuntu, so sudo apt-get install unzip).
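On the second part of the question: there is no single AWS CLI command that bundles libraries for you, but the usual pattern is to download the current deployment package, install the dependencies next to the handler, and upload a new zip. A minimal sketch, where my-function, requirements.txt and lambda_function.py are placeholders for your own names:
# Get a pre-signed URL for the current deployment package
aws lambda get-function --function-name my-function --query 'Code.Location' --output text
# (download that URL with curl/wget if you want the existing code)
# Install the libraries alongside the handler and re-upload
pip install -r requirements.txt -t package/
cp lambda_function.py package/
cd package && zip -r ../function.zip . && cd ..
aws lambda update-function-code --function-name my-function --zip-file fileb://function.zip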
I am attempting to create an AWS Elastic Beanstalk environment using the EB CLI; however, I am running into an issue when running the command
eb create environment-name
the error is
ERROR: FileNotFoundError - [Errno 2] No such file or directory: './homebrew/Library/Homebrew/os/mac/pkgconfig/fuse/osxfuse.pc'
However, when I try to install osxfuse with brew install --cask osxfuse, I receive the error
installer: Error - The FUSE for macOS installation package is not compatible with this version of macOS.
I have also tried to install macfuse, with no change.
I am using macOS Ventura.
I have tried installing both osxfuse and macfuse, and nothing changes this.
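One thing worth noting: the path in the error is relative (./homebrew/...), which looks like the EB CLI is trying to package a local homebrew directory (containing a broken osxfuse.pc file) into the application bundle, rather than FUSE actually being required. If that guess is right, excluding that directory from the bundle may be enough; a minimal sketch, assuming the homebrew folder sits inside the project directory:
# .ebignore controls what the EB CLI packages (when present it is used instead of .gitignore)
printf 'homebrew/\n' >> .ebignore
eb create environment-name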
I'm trying to run AlphaFold v2 on AWS via their Batch architecture sample (GitHub link: https://github.com/aws-samples/aws-batch-architecture-for-alphafold)
When running the first command to install dependencies:
%pip install -r notebook-requirements.txt -q -q
I'm getting the following error message:
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed.
This behaviour is the source of the following dependency conflicts.
awscli 1.24.10 requires botocore==1.26.10, but you have botocore 1.23.54 which is incompatible.
aiobotocore 1.3.0 requires botocore<1.20.50,>=1.20.49, but you have botocore 1.23.54 which is incompatible
I'm a bit confused as to how I can meet both requirements, botocore==1.26.10 and botocore<1.20.50,>=1.20.49 - is it possible to have both installed at the same time?
Thanks for the help!
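For context: pip can only keep one version of botocore in a given environment, so those two pins cannot both be satisfied at once; the usual way out is to bring awscli and aiobotocore to releases whose botocore ranges overlap. A rough sketch, assuming a clean virtual environment and that newer releases of both packages accept the botocore pulled in by the requirements file:
python3 -m venv alphafold-env
source alphafold-env/bin/activate
pip install --upgrade pip
# Let pip resolve awscli and aiobotocore against a single botocore
pip install -r notebook-requirements.txt awscli aiobotocore
pip check   # lists any pins that are still in conflict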
I'm using a Shell Command Activity that calls a Python script. This Python script uses boto3 to perform some functions. In the shell script in the Shell Command Activity, I'm trying to install boto3 onto the machine before calling my Python script. I'm also installing pip.
In spite of installing boto3, I keep getting the error
"ImportError: No module named boto3" while it is executing my Python script when the data pipeline is activated.
This is what my shell script looks like:
#!/bin/bash
sudo yum -y install python37
curl -O https://bootstrap.pypa.io/get-pip.py
python3 get-pip.py --user
pip install boto3 --upgrade --user
aws s3 cp s3://path-to-my-script/python_script.py /tmp/python_script.py
python /tmp/python_script.py
I also tried the recommendation mentioned here, which also failed with the same error: https://stackoverflow.com/a/44225052/4549186
(All of the Data Pipeline activities run on the configured EC2 resource that is created during pipeline activation.)
What is the right way to install pip/boto3 on the EC2 resource and refer to them in the Python code?
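One detail that stands out in the script above: boto3 is installed via a pip that was bootstrapped for python3, but the last line runs the script with plain python, which on Amazon Linux typically points to Python 2, so the interpreter executing the script may not be the one boto3 was installed for. A minimal sketch of a more consistent version, assuming Amazon Linux and reusing the same S3 path from the question; not tested against Data Pipeline itself:
#!/bin/bash
set -e
sudo yum -y install python37
curl -O https://bootstrap.pypa.io/get-pip.py
# Install pip and boto3 for the same interpreter that will run the script
python3 get-pip.py --user
python3 -m pip install --user --upgrade boto3
aws s3 cp s3://path-to-my-script/python_script.py /tmp/python_script.py
python3 /tmp/python_script.py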
This might be a really silly question, but I'm trying to train this model: https://github.com/Rayhane-mamah/Tacotron-2 on an AWS instance. I'm using an AWS Educate account, so I was unable to launch an EC2 instance with a Deep Learning AMI; instead I launched a regular Amazon Linux 2 AMI.
As per the repo's machine setup instructions, I installed Python 3, pip, and TensorFlow onto the instance. However, I am unable to run the command:
sudo yum install -y libasound-dev portaudio19-dev libportaudio2 libportaudiocpp0 ffmpeg libav-tools
(the repo lists the command with apt-get instead of yum)
When I run that command, most of the packages are unavailable. The output I get is:
No package libasound-dev available.
No package portaudio19-dev available.
No package libportaudio2 available.
No package libportaudiocpp0 available.
No package ffmpeg available.
No package libav-tools available.
How can I install these packages onto my ec2 instance? Thanks
EDIT: I see now that my issue is that EC2's Amazon Linux 2 AMI is CentOS-based rather than Debian-based. I would have to manually install each of these packages (I think). It might be easier to launch an Ubuntu server, or Amazon Linux 1, and use the Dockerfile included in the repo.
You can use a CloudFormation template to install the packages inside EC2. That way, whenever the EC2 instance comes up, it will come up with all the packages.
https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/deploying.applications.html
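For example, the user-data part of such a template could run something along these lines on Amazon Linux 2. This is only a sketch: it assumes the Debian package names map to RPM equivalents such as alsa-lib-devel and portaudio-devel (available once EPEL is enabled); ffmpeg and libav are not in the default repositories at all, so those would need a static build or a third-party repo:
#!/bin/bash
# Runs once at first boot via the instance's UserData
amazon-linux-extras install epel -y
yum install -y alsa-lib-devel portaudio portaudio-devel
# ffmpeg / libav-tools are not in the Amazon Linux 2 or EPEL repos;
# install them from a static build or a third-party repository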
I recently installed PostgreSQL server on my AWS EC2 instance (Amazon Linux AMI).
I used this answer to guide me through the right RPM package.
Everything is working fine; however, now when I try to install PostGIS with sudo yum install postgis23_95.x86_64 I receive:
Error: Package: postgis23_95-2.3.7-1.rhel6.x86_64 (pgdg95)
Requires: libpcre.so.0()(64bit)
I cannot manage to fix this error. I also tried sudo yum install pcre-tools.x86_64, but nothing changed.
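If it helps with narrowing this down: the missing item is a shared-library capability rather than a package name, so yum can be asked directly what (if anything) in the enabled repositories provides it. A small diagnostic sketch, nothing PostGIS-specific:
# Ask yum which package provides the missing capability
yum provides 'libpcre.so.0()(64bit)'
# See which pcre libraries are already on the box
ls -l /usr/lib64/libpcre.so*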