GCloud Error: Source code size exceeds the limit - google-cloud-platform

I'm following the basic fulfillment and conversation setup tutorial for api.ai to make a chat bot, and when I try to deploy the function with the command:
gcloud beta functions deploy --stage-bucket venky-bb7c4.appspot.com --trigger-http
(where 'venky-bb7c4.appspot.com' is the bucket name)
it returns the following error message:
ERROR: (gcloud.beta.functions.deploy) OperationError: code=3, message=Source code size exceeds the limit
I've searched but haven't found an answer, and I don't know where the error is.
This is the JS file that appears in the tutorial:
/**
 * HTTP Cloud Function.
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
exports.helloHttp = function helloHttp (req, res) {
  var response = "This is a sample response from your webhook!"; // Default response from the webhook to show it's working
  res.setHeader('Content-Type', 'application/json'); // Requires application/json MIME type
  res.send(JSON.stringify({ "speech": response, "displayText": response
  // "speech" is the spoken version of the response, "displayText" is the visual version
  }));
};
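For reference, once deployed, an HTTP function like this can be invoked with curl (a minimal sketch; the URL assumes the standard HTTPS-trigger format and the default us-central1 region):
curl -X POST "https://us-central1-venky-bb7c4.cloudfunctions.net/helloHttp" -H "Content-Type: application/json" -d '{}'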

Neither of these worked for me. The way I was able to fix this was to make sure I was running the deploy from my project directory (the directory containing index.js).

The command creates a zip with the whole content of your current directory (except the node_modules subdirectory), not just the JS file (this is because your function may use other resources).
The error you see occurs because the size of the (uncompressed) files in the directory is bigger than 512 MB.
The easiest way to solve this is by moving the .js file to its own directory and deploying from there (you can use --local-path to point to the directory containing the source file if you want your working directory to be different from the directory with the function source).
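For example, a minimal sketch of that workflow (the paths are hypothetical):
mkdir /tmp/my-function && cp index.js /tmp/my-function/
cd /tmp/my-function
gcloud beta functions deploy helloHttp --stage-bucket venky-bb7c4.appspot.com --trigger-http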

I tried the --source option and deploying from the index.js folder, and still ran into a different problem.
This error usually happens if the code being uploaded is large. In my tests I found that more than 100 MB leads to the mentioned error.
To resolve this, there are two solutions:
1. Update .gcloudignore to ignore the folders which aren't required for your function (see the sample below).
2. If option 1 doesn't resolve it, create a bucket in Storage and pass it with the --stage-bucket option.
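For option 1, a .gcloudignore might look like this (a minimal sketch; the entries are examples of folders typically not needed at runtime):
.git
.gitignore
node_modules
coverage
test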
For option 2, create a new bucket for deployment (one time):
gsutil mb gs://my-cloud-functions-deployment-bucket
The bucket name needs to be globally unique, otherwise it throws an "already exists" error.
Deploy
gcloud functions deploy subscribers-firestoreDatabaseChange \
  --trigger-topic firestore-database-change \
  --region us-central1 \
  --runtime nodejs10 \
  --update-env-vars "REDIS_HOST=10.128.0.2" \
  --stage-bucket my-cloud-functions-deployment-bucket

I had similar problems while deploying cloud functions. What worked for me was specifying the source folder of the JS files:
gcloud functions deploy functionName --trigger-http --source path_to_project_root_folder
Also be sure to list all unnecessary folders in .gcloudignore.

Ensure the package folder has a .gitignore file (excluding node_modules).
The most recent version of gcloud requires it in order to not upload node_modules. My code size went from 119 MB to 17 KB.
Once I added the .gitignore file, the log also printed:
created .gcloudignore file. See `gcloud topic gcloudignore` for details.
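For reference, the generated .gcloudignore typically looks something like this (contents may vary by gcloud version; the #!include directive pulls in your .gitignore entries, which is how node_modules gets excluded):
.gcloudignore
.git
.gitignore
#!include:.gitignore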

Related

Can you deploy a Gen2 cloud function from below the top level of a Cloud Source repository?

It appears that you cannot deploy a Gen2 cloud function using gcloud from a cloud source repo unless it is at the top level.
Here's a sample redacted deploy command for a gen 1 python function that works:
gcloud beta functions deploy funcname --source https://source.developers.google.com/projects/projectname/repos/reponame/moveable-aliases/main/paths/pathname --runtime python310 --trigger-http --project=projectname
If you add the --gen2 flag, it fails because it can't find main.py. The error is:
OperationError: code=3, message=Build failed with status: FAILURE and message: missing main.py and GOOGLE_FUNCTION_SOURCE not specified. Either create the function in main.py or specify GOOGLE_FUNCTION_SOURCE to point to the file that contains the function.
If you add main.py to the root of the repo and run the same command, it finds main.py, which indicates to me that it isn't honoring the paths.
There is an additional problem (which doesn't matter unless the first one is fixed): if pathname is below the top level (folder/subfolder), gcloud treats that as a syntax error when the --gen2 flag is set, but not without it.
Is there any way around this? It is very inconvenient.
Answering as community wiki, as per the comments above:
There is a bug raised for this in the issue tracker, which is still open; further progress can be tracked there.

How can I find the implementation code for streaming data from Cloud Storage into BigQuery?

When I try to run the relevant code in Cloud Shell that would allow the streaming function to be deployed, it claims that the source folder containing the streaming function itself does not exist.
The relevant buckets have already been created; it's the function itself which appears not to be there. Would it be possible to install this separately, maybe?
The original code followed by the error message is given below:
gcloud functions deploy streaming \
--source=./functions/streaming --runtime=python37 \
--stage-bucket=${FUNCTIONS_BUCKET} \
--trigger-bucket=${FILES_SOURCE}
(gcloud.functions.deploy) argument '--source': Provided directory does not exist
As shown in the error message, the path './functions/streaming' does not exist in your Cloud Shell. You have to pass the absolute path to the directory (in your Cloud Shell) where your Python code is located. Please refer to the documentation below:
The source parameter can take 3 different locations:
--source=SOURCE Location of source code to deploy. Location of the source can be one of the following three options:
Source code in Google Cloud Storage (must be a .zip archive),
Reference to source repository or,
Local filesystem path (root directory of function source).
Note that, depending on your runtime type, Cloud Functions will look for files with specific names for deployable functions. For Node.js, these filenames are index.js or function.js. For Python, this is main.py.
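For example, deploying with an absolute local path would look like this (a minimal sketch; the clone location ~/streaming-tutorial is hypothetical):
gcloud functions deploy streaming \
  --source="$HOME/streaming-tutorial/functions/streaming" \
  --runtime=python37 \
  --stage-bucket=${FUNCTIONS_BUCKET} \
  --trigger-bucket=${FILES_SOURCE}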

google cloud functions command to package without deploying

I must be missing something because I can't find this option here: https://cloud.google.com/sdk/gcloud/reference/beta/functions/deploy
I want to package and upload my function to a bucket: --stage-bucket
But not actually deploy the function
I'm going to deploy multiple functions (different handlers) from the same package with a Deployment Manager template: type: 'gcp-types/cloudfunctions-v1:projects.locations.functions'
gcloud beta functions deploy insists on packaging AND deploying the function.
Where is the gcloud beta functions package command?
Here is an example of the DM template I plan to run:
resources:
- name: resource-name
  type: 'gcp-types/cloudfunctions-v1:projects.locations.functions'
  properties:
    labels:
      testlabel1: testlabel1value
      testlabel2: testlabel2value
    parent: projects/my-project/locations/us-central1
    location: us-central1
    function: function-name
    sourceArchiveUrl: 'gs://my-bucket/some-zip-i-uploaded.zip'
    environmentVariables:
      test: '123'
    entryPoint: handler
    httpsTrigger: {}
    timeout: 60s
    availableMemoryMb: 256
    runtime: nodejs8
EDIT: I realized I have another question. When I upload a zip, does that zip need to include dependencies? Do I have to run npm install or pip install first and include those packages in the zip, or does Cloud Functions read my requirements.txt and package.json and do that for me?
The SDK CLI does not provide a command to package your function.
This link will provide you with detail on how to zip your files together. There are just two points to follow:
The file type should be a zip file.
The file size should not exceed the 100 MB limit.
Then you need to call an API, which returns a Signed URL to upload the package.
Once uploaded you can specify the URL minus the extra parameters as the location.
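A hedged sketch of that flow with curl (this uses the Cloud Functions v1 generateUploadUrl method; project, region, and file names are placeholders):
# Request a signed upload URL (the response contains an "uploadUrl" field):
curl -X POST -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://cloudfunctions.googleapis.com/v1/projects/<PROJECT_ID>/locations/<REGION>/functions:generateUploadUrl"
# Upload the package to the returned URL with a PUT request:
curl -X PUT -H "Content-Type: application/zip" \
  -H "x-goog-content-length-range: 0,104857600" \
  --upload-file function.zip "<UPLOAD_URL>"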
There is no gcloud functions command to "package" your deployment, presumably because this amounts to just creating a zip file and putting it into the right place, then referencing that place.
Probably the easiest way to do this is to generate a zip file and copy it into a GCS bucket, then set the sourceArchiveUrl on the template to the correct location.
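For example (a minimal sketch; the bucket and file names match the template above):
zip -r some-zip-i-uploaded.zip index.js package.json
gsutil cp some-zip-i-uploaded.zip gs://my-bucket/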
There are 2 other methods:
You can point to source code in source repository (this would use the sourceRepository part of the template).
You can get a direct URL (using this API) to upload a ZIP file to using a PUT request, upload the code there, and then pass this same URL to the signedUploadUrl on the template. This is the method discussed in @John's answer. It does not require you to do any signing yourself, and likewise does not require you to create your own bucket to store the code in (the "Signed URL" refers to a private cloud functions location).
At least with the two zip file methods you do not need to include the (publicly available) dependencies: the package.json (or requirements.txt) file will be processed by Cloud Functions to install them. I don't know about the SourceRepository method, but I would expect it to work similarly. There's documentation about how Cloud Functions installs dependencies during deployment of a function for Node and Python.
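For example, a minimal package.json like this is enough for Cloud Functions to install the dependency at deploy time (a sketch; the dependency is illustrative):
{
  "name": "my-function",
  "version": "1.0.0",
  "dependencies": {
    "lodash": "^4.17.21"
  }
}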

Google Cloud Function Environment Variables Directory

I've got a Cloud Function working properly, but now I'd like to obfuscate some credentials with environment variables. When I try running this command:
gcloud beta functions deploy my-function --trigger-http --set-env-vars user=username,pass=password --runtime nodejs6 --project my-project
I get this error:
ERROR: (gcloud.beta.functions.deploy) OperationError: code=3, message=Function load error: File index.js or function.js that is expected to define function doesn't exist in the root directory.
I created the function using the GCP web UI, and I can't find the directory where the function lives to cd into. Presumably running the same command from the directory that the function lives in would work.
Where do cloud functions live in my project?
If you check the details of the function deployment on the logs, you'll notice field protoPayload.request.function.sourceUploadUrl contains a URL where your source code is uploaded during the deployment.
This URL is in the form of https://storage.googleapis.com/gcf-upload-<region>-<random>/<random>.zip. This means that the function is uploaded to that GCS bucket. That GCS bucket is not in your project (it's from Google), so you won't have direct access to the files. You can download the files stored in that bucket through the console (the "Download zip" button on the source page).
The upload bucket can also be found through
gcloud functions list --format='table[](name,sourceUploadUrl)'
Knowing this, you have 2 paths:
Use another way of deploying the function (e.g. from a source repo)
Use the API to patch the function
I'm partial to the second option, since it's really easy to execute:
curl -XPATCH -H"Authorization: Bearer $(gcloud auth print-access-token)" -H'content-type:application/json' 'https://cloudfunctions.googleapis.com/v1/projects/<PROJECT_ID>/locations/<REGION>/functions/<FUNCTION_ID>?updateMask=environmentVariables' -d'{"environmentVariables":{"user":"<USERNAME>", "password":"<PASSWORD>"}}'
However, if you need to access anything Google-related, rather than passing a user/password, it's best to use the application default credentials and provide access to the service account on the resource. To find out the service account used by your function, you can run:
gcloud functions list --format='table[](name,serviceAccountEmail)'
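For example, to let the function read a bucket without embedding credentials, you could grant its service account access like this (a hedged sketch; the account and bucket names are placeholders):
gsutil iam ch serviceAccount:<PROJECT_ID>@appspot.gserviceaccount.com:roles/storage.objectViewer gs://<MY_BUCKET>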

Lambda function throws class not found exception when deployed with Jenkins generated zip file

I'm working on an AWS Lambda function. I deploy it by uploading a zip file containing the source code (project), written in Java 8.
The project is built using Gradle; upon a successful build, it generates the deployment zip.
This works perfectly fine when I deploy the locally generated zip to the Lambda function.
Working scenario:
Zip generated through a local Gradle build in the workspace -> copied to an AWS S3 location -> specify the S3 zip path in the Lambda upload/specify URL path field.
But when I generate the Gradle build from Jenkins, the resulting zip does not work in the Lambda function; it throws a "class not found" exception.
Exception scenario:
Zip generated through Gradle in Jenkins -> copied to an AWS S3 location -> specify the S3 zip path in the Lambda upload/specify URL path field.
Class not found: com.sample.HelloWorld: java.lang.ClassNotFoundException
java.lang.ClassNotFoundException: com.sample.HelloWorld
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
I suspected this could be an issue with file permissions of the content inside the zip file. I verified this by comparing both zips' contents in a Linux environment. I could see that files from the zip generated by Jenkins lacked some permissions, hence I handled permission provisioning for the zip contents in my Gradle build code:
task zip(type: Zip) {
    archiveName 'lambda-project.zip'
    fileMode 0777  // grant full permissions to files inside the zip
    from sourceSets.main.output.files.each { zipTree(it) }
    from (configurations.runtime) {
        into 'lib'  // bundle runtime dependencies under lib/
    }
}
But I'm still getting the same error. I can see the file contents now have full permissions, yet the error persists.
Note:
I tried making the deployment package a jar and tested it; still the same error.
I have configured the Lambda handler correctly. Example: if the class name is "HelloWorld.java" and the package name is com.sample, then my Lambda handler configuration is com.sample.HelloWorld. I'm pretty confident about this point because the same configuration works fine when the zip is generated locally.
I have compared the zip contents (locally generated and Jenkins generated) and could not see any difference in them.
The directories inside the zip files were lacking permissions. I had tried providing file permissions earlier, but it worked after also providing permissions for the directories in the Gradle build:
dirMode 0777
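Putting it together, the zip task that worked looks something like this (the same task as in the question, with dirMode added):
task zip(type: Zip) {
    archiveName 'lambda-project.zip'
    fileMode 0777  // permissions for files inside the zip
    dirMode 0777   // permissions for directories inside the zip
    from sourceSets.main.output.files.each { zipTree(it) }
    from (configurations.runtime) {
        into 'lib'
    }
}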
I would recommend using the Serverless Framework for Lambda deployment; it helps you deploy Lambda functions without much hassle, and it also simplifies setting up CI, CD, monitoring, and logging.