How to give the local zip path in AWS CloudFormation YAML CodeUri?

I have exported a Lambda function's YAML using its export function, via Download AWS SAM file.
I have also downloaded the code zip file via Download deployment package.
In the YAML file we need to give the CodeUri.
In the downloaded YAML it is . as shown in the picture below.
So when I upload it to AWS CloudFormation it says:
'CodeUri' is not a valid S3 Uri of the form 's3://bucket/key' with
optional versionId query parameter.
I need to know whether there is a way to give the zip file in CodeUri from a local file path rather than uploading it to S3.
I have tried with the name of the zip file I downloaded as well, and I still get the same error.

You have to first run the package command. It may not work with the zip itself, so you may try with the unpacked source code.
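For example, a minimal sketch using the AWS CLI, assuming the downloaded SAM template is saved as template.yaml next to the unpacked code (the bucket and stack names here are hypothetical). The package command uploads the local CodeUri to S3 and writes out a template whose CodeUri is a valid s3:// URI:
# Upload local code and rewrite CodeUri to an s3:// URI:
aws cloudformation package \
    --template-file template.yaml \
    --s3-bucket my-deployment-bucket \
    --output-template-file packaged.yaml
# Deploy the packaged template:
aws cloudformation deploy \
    --template-file packaged.yaml \
    --stack-name my-lambda-stack \
    --capabilities CAPABILITY_IAM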

Related

Why can't my GCP script/notebook find my file?

I have a working script that finds the data file when it is in the same directory as the script. This works both on my local machine and in Google Colab.
When I try it on GCP, though, it cannot find the file. I tried 3 approaches:
PySpark Notebook:
Upload the .ipynb file, which includes a wget command. This downloads the file without error, but I am unsure where it saves it to, and the script cannot find the file either (I assume because I am telling it the file is in the same directory, and presumably wget on GCP saves it somewhere else by default).
PySpark with bucket:
I did the same as the PySpark notebook above, but first I uploaded the dataset to the bucket and then used the two links provided in the file details when you click the file name inside the bucket on the console (neither worked). I would like to avoid this though, as wget is much faster than downloading over my slow Wi-Fi and then re-uploading to the bucket through the console.
GCP SSH:
Create cluster
Access VM through SSH.
Upload .py file using the cog icon
wget the dataset and move both into the same folder
Run script using python gcp.py
This just gives me an error saying the file was not found.
Thanks.
As per your first and third approaches, if you are running PySpark code on Dataproc, irrespective of whether you use an .ipynb file or a .py file, please note the points below:
If you use the wget command to download the file, it will be downloaded to the current working directory where your code is executed.
When you try to access the file through PySpark code, it will check HDFS by default. If you want to access the downloaded file from the current working directory, use the file:/// URI with the absolute file path.
If you want to access the file from HDFS, then you have to first move the downloaded file to HDFS and then access it from there using an absolute HDFS file path. Please refer to the example below:
hadoop fs -put <local file_name> </HDFS/path/to/directory>
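As a rough sketch of the SSH flow under these points (the dataset URL and paths are hypothetical):
# wget saves to the current working directory of the shell:
wget https://example.com/dataset.csv
# Option A: read it locally from PySpark with the file:/// URI, e.g.
#   spark.read.csv("file:///home/me/dataset.csv", header=True)
# Option B: move it into HDFS and read it with an HDFS path:
hadoop fs -mkdir -p /user/me/data
hadoop fs -put dataset.csv /user/me/data/
#   spark.read.csv("hdfs:///user/me/data/dataset.csv", header=True)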

CodeDeploy pipeline not finding AppSpec.yml - but is clearly available

I've had this running months ago, so I know it works, but I have created a new EC2 instance to deploy my code to and am stuck at the first hurdle.
My Deployment Details runs as follows:
Application Stop - succeeded
Download Bundle - succeeded
BeforeInstall - Failed
Upon looking at the failed event, I get:
The CodeDeploy agent did not find an AppSpec file within the unpacked revision directory at revision-relative path "appspec.yml". The revision was unpacked to directory "C:\ProgramData/Amazon/CodeDeploy/57f7ec1b-0452-444e-840c-4deb4566e82d/d-WH9HTZAW0/deployment-archive", and the AppSpec file was expected but not found at path "C:\ProgramData/Amazon/CodeDeploy/57f7ec1b-0452-444e-840c-4deb4566e82d/d-WH9HTZAW0/deployment-archive/appspec.yml". Consult the AWS CodeDeploy Appspec documentation for more information at http://docs.aws.amazon.com/codedeploy/latest/userguide/reference-appspec-file.html
Thing is, if I jump onto my EC2 instance and copy and paste the full path, sure enough I see the YML file, along with the files that were in the ZIP file in my S3 bucket, so they've been successfully sent to the EC2 and unzipped.
So I'm sure it's not a permissions thing; the connection is clearly being made, and the S3 bucket, CodeDeploy and my EC2 are all happy.
I read various posts on Stack Overflow about renaming the AppSpec.yml file to "appspec.yml", "AppSpec.yaml", "appspec.yaml", and still nothing works.
Anything obvious to try out?
OK, after a few days of back and forth, the solution was incredibly annoying (and embarrassing)...
On my EC2 instance, "File name extensions" was unticked in Explorer, so my AppSpec.yml was actually AppSpec.yml.txt.
If anyone else has a similar issue, do check this first!
How are you zipping the file? A lot of times users end up "double-zipping". To check: if you unzip the .zip file, does it give you the files or a folder?
When we zip a folder on Windows, it basically creates a folder inside the zip, and thus the CodeDeploy agent cannot read it. So to zip the artifact, select all the files and then right-click to zip them in the same location, as in the sketch below. This avoids creating a new folder inside the zip.
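A minimal sketch with the zip CLI (the directory and archive names are hypothetical):
# Zip the *contents* of the revision directory, not the directory itself,
# so that appspec.yml ends up at the root of the archive:
cd my-app
zip -r ../my-app.zip .
# Verify: appspec.yml must appear with no leading folder name.
unzip -l ../my-app.zip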

google cloud functions command to package without deploying

I must be missing something because I can't find this option here: https://cloud.google.com/sdk/gcloud/reference/beta/functions/deploy
I want to package and upload my function to a bucket: --stage-bucket
But not actually deploy the function
I'm going to deploy multiple functions (different handlers) from the same package with a Deployment Manager template: type: 'gcp-types/cloudfunctions-v1:projects.locations.functions'
gcloud beta functions deploy insists on packaging AND deploying the function.
Where is the gcloud beta functions package command?
Here is an example of the DM template I plan to run:
resources:
- name: resource-name
  type: 'gcp-types/cloudfunctions-v1:projects.locations.functions'
  properties:
    labels:
      testlabel1: testlabel1value
      testlabel2: testlabel2value
    parent: projects/my-project/locations/us-central1
    location: us-central1
    function: function-name
    sourceArchiveUrl: 'gs://my-bucket/some-zip-i-uploaded.zip'
    environmentVariables:
      test: '123'
    entryPoint: handler
    httpsTrigger: {}
    timeout: 60s
    availableMemoryMb: 256
    runtime: nodejs8
EDIT: I realized I have another question. When I upload a zip, does that zip need to include dependencies? Do I have to run npm install or pip install first and include those packages in the zip, or does Cloud Functions read my requirements.txt and package.json and do that for me?
The SDK CLI does not provide a command to package your function.
This link will provide you with detail on how to zip your files together. There are just two points to follow:
The file type should be a zip file.
The file size should not exceed the 100MB limit.
Then you need to call an API, which returns a signed URL for uploading the package.
Once uploaded, you can specify the URL, minus the extra query parameters, as the location.
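A hedged sketch of that flow with curl and jq against the v1 generateUploadUrl API (the project, region, and file names are hypothetical; the two PUT headers are the ones the API documentation asks for):
# Ask the Cloud Functions v1 API for a signed upload URL:
URL=$(curl -s -X POST -d '{}' \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  "https://cloudfunctions.googleapis.com/v1/projects/my-project/locations/us-central1/functions:generateUploadUrl" \
  | jq -r .uploadUrl)
# PUT the zip to that URL:
curl -X PUT \
  -H "Content-Type: application/zip" \
  -H "x-goog-content-length-range: 0,104857600" \
  --upload-file function.zip "$URL"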
There is no gcloud functions command to "package" your deployment, presumably because this amounts to just creating a zip file and putting it into the right place, then referencing that place.
Probably the easiest way to do this is to generate a zip file and copy it into a GCS bucket, then set the sourceArchiveUrl on the template to the correct location.
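For instance (the bucket and file names are hypothetical):
# Zip the function source and stage it in your own bucket:
zip -r function.zip index.js package.json
gsutil cp function.zip gs://my-bucket/function.zip
# then in the template:
#   sourceArchiveUrl: 'gs://my-bucket/function.zip'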
There are 2 other methods:
You can point to source code in a source repository (this would use the sourceRepository part of the template).
You can get a URL (using this API) to which you upload a ZIP file with a PUT request, upload the code there, and then pass this same URL to the sourceUploadUrl field of the template. This is the method discussed in John's answer above. It does not require you to do any signing yourself, and likewise does not require you to create your own bucket to store the code in (the "signed URL" refers to a private Cloud Functions location).
At least with the two zip file methods, you do not need to include the (publicly available) dependencies -- the package.json (or requirements.txt) file will be processed by Cloud Functions to install them. I don't know about the sourceRepository method, but I would expect it to work similarly. There's documentation about how Cloud Functions installs dependencies during deployment of a function for Node and Python.

Lambda function throws class not found exception when deployed with Jenkins generated zip file

I'm working on an AWS Lambda function. I deploy it by uploading a zip file of the source code (project), written in Java 8.
The project is built using Gradle. Upon a successful build, it generates the deployment zip.
This works perfectly fine when I deploy the locally generated zip to the Lambda function.
Working scenario:
Zip generated through the Gradle build locally in my workspace -> copied to an AWS S3 location -> specify the S3 zip path in the Lambda upload/specify URL path field.
But when I generate the Gradle build from Jenkins, the zip which is generated does not work in the Lambda function. It throws "class not found exception".
Exception scenario:
Zip generated through Gradle in Jenkins -> copied to the AWS S3 location -> specify the S3 zip path in the Lambda upload/specify URL path field.
Class not found: com.sample.HelloWorld: java.lang.ClassNotFoundException
java.lang.ClassNotFoundException: com.sample.HelloWorld
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
I suspected this could be an issue with the file permissions of the content inside the zip file. I verified this by comparing both zips' contents in a Linux environment. I could see that the files from the zip generated by Jenkins lacked some permissions, so I set permissions for the zip contents in my Gradle build code:
task zip(type: Zip) {
    archiveName 'lambda-project.zip'
    fileMode 0777
    from sourceSets.main.output.files.each { zipTree(it) }
    from (configurations.runtime) {
        into 'lib'
    }
}
But I'm still getting the same error. I can see the file contents now have full permissions, yet the error persists.
Note:
I tried making the deployment package a jar and tested it; I still get the same error.
I have configured the Lambda handler correctly. Example: if the class name is "HelloWorld.java" and the package name is com.sample, then my Lambda handler configuration is com.sample.HelloWorld (a minimal matching class is sketched after these notes). I'm pretty confident about this point because, with the same configuration, it works fine when the zip is generated locally.
I have compared the zip contents (locally generated and Jenkins generated) and could not see any difference in them.
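For reference, a minimal sketch of a handler class matching that configuration (package com.sample, class HelloWorld); the method name and request/response types are assumptions for illustration:
package com.sample;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

// Referenced from the Lambda handler configuration as "com.sample.HelloWorld":
public class HelloWorld implements RequestHandler<String, String> {
    @Override
    public String handleRequest(String input, Context context) {
        return "Hello " + input;
    }
}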
The directories inside the zip file were lacking permissions. I had tried providing file permissions earlier, but it worked only after also providing permissions for directories in the Gradle build:
dirMode 0777
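Putting it together with the task from the question, a sketch of the fixed build code:
task zip(type: Zip) {
    archiveName 'lambda-project.zip'
    fileMode 0777
    dirMode 0777   // directory permissions were the missing piece
    from sourceSets.main.output.files.each { zipTree(it) }
    from (configurations.runtime) {
        into 'lib'
    }
}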
I would recommend using the Serverless Framework for Lambda deployment; it helps us deploy Lambda functions without much hassle.

Cannot upload deployment package to Lambda

I am constantly getting this error when trying to upload my deployment package to Lambda, on my Windows 7 Pro box.
--zip-file must be a zip file with fileb:// prefix.
I have googled and found very little help. I have tried with a full path, with quotes and without, and with file:// instead of fileb://, all without any luck.
My publish batch file:
del emailer.zip
cd emailer
"C:\Program Files\WinRAR\rar.exe" a -r emailer.zip
move /y emailer.zip ../emailer.zip
cd ..
aws lambda update-function-code --function-name emailer --zip-file fileb://emailer.zip
I have uploaded the deployment package here in case there is an issue with how I have constructed the package.
Why am I constantly getting this error? What do I need to do or research to resolve this issue?
Your file is not a valid zip file; you have created it through WinRAR, which has created another type of archive.
When downloading your file:
fhenri@machine:~/Downloads$ file emailer.zip
emailer.zip: RAR archive data, v1d, os: Win32
When I create a zip file (using the zip CLI) I get:
fhenri@machine:~/Downloads$ file emailer_zip.zip
emailer_zip.zip: Zip archive data, at least v1.0 to extract
If you need to use WinRAR, check how to use the WinRAR command line to create zip archives so you produce a correct zip archive; otherwise just use WinZip or another zip program.
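For the publish script in the question, a sketch of the fix, assuming a standard WinRAR install: the console rar.exe only writes RAR archives, but WinRAR.exe accepts the -afzip switch to produce a real zip:
rem Replace the rar.exe line in the batch file with:
"C:\Program Files\WinRAR\WinRAR.exe" a -afzip -r emailer.zip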