Google Cloud Function Environment Variables Directory

I've got a Cloud Function working properly, but now I'd like to hide some credentials using environment variables. When I try running this command:
gcloud beta functions deploy my-function --trigger-http --set-env-vars user=username,pass=password --runtime nodejs6 --project my-project
I get this error:
ERROR: (gcloud.beta.functions.deploy) OperationError: code=3, message=Function load error: File index.js or function.js that is expected to define function doesn't exist in the root directory.
I created the function using the GCP web UI, and I can't find the directory where the function lives to cd into. Presumably running the same command from the directory that the function lives in would work.
Where do cloud functions live in my project?

If you check the details of the function deployment in the logs, you'll notice the field protoPayload.request.function.sourceUploadUrl contains a URL where your source code is uploaded during deployment.
This URL is of the form https://storage.googleapis.com/gcf-upload-<region>-<random>/<random>.zip, which means the function source is uploaded to that GCS bucket. That bucket is not in your project (it belongs to Google), so you won't have direct access to the files. You can, however, download the files stored in that bucket through the console (the "Download zip" button on the function's source page).
The upload bucket can also be found through
gcloud functions list --format='table[](name,sourceUploadUrl)'
Knowing this, you have 2 paths:
Use another way of deploying the function (e.g. from a source repo)
Use the API to patch the function
I'm partial to the second option, since it's really easy to execute:
curl -X PATCH \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H 'Content-Type: application/json' \
  'https://cloudfunctions.googleapis.com/v1/projects/<PROJECT_ID>/locations/<REGION>/functions/<FUNCTION_ID>?updateMask=environmentVariables' \
  -d '{"environmentVariables":{"user":"<USERNAME>", "password":"<PASSWORD>"}}'
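If you'd rather do this from code than from curl, here is a minimal Python sketch of the same PATCH request, assuming the google-auth and requests libraries and Application Default Credentials (the placeholders mirror the curl example above):
import google.auth
import google.auth.transport.requests
import requests

# Get an access token from Application Default Credentials.
creds, _ = google.auth.default()
creds.refresh(google.auth.transport.requests.Request())

url = ("https://cloudfunctions.googleapis.com/v1/projects/<PROJECT_ID>"
       "/locations/<REGION>/functions/<FUNCTION_ID>"
       "?updateMask=environmentVariables")
body = {"environmentVariables": {"user": "<USERNAME>", "password": "<PASSWORD>"}}

# Same call as the curl above: PATCH only the environmentVariables field.
resp = requests.patch(url, json=body,
                      headers={"Authorization": "Bearer " + creds.token})
resp.raise_for_status()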
However, if you need to access anything Google-related, rather than passing a username/password it's best to use the Application Default Credentials and grant the function's service account access to the resource. To find out the service account used by your function, you can run:
gcloud functions list --format='table[](name,serviceAccountEmail)'
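Once that service account has access, code inside the function picks up its credentials implicitly, with no secrets in environment variables at all; here's a minimal sketch using the Cloud Storage client library (the choice of client is just an illustration):
from google.cloud import storage

# No explicit credentials: the client uses the function's service account.
client = storage.Client()
buckets = list(client.list_buckets())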

Related

Can you deploy a Gen2 cloud function from below the top level of a Cloud Source repository?

It appears that you cannot deploy a Gen2 cloud function using gcloud from a cloud source repo unless it is at the top level.
Here's a sample redacted deploy command for a gen 1 python function that works:
gcloud beta functions deploy funcname --source https://source.developers.google.com/projects/projectname/repos/reponame/moveable-aliases/main/paths/pathname --runtime python310 --trigger-http --project=projectname
If you add the --gen2 flag, it fails because it can't find main.py. The error is:
OperationError: code=3, message=Build failed with status: FAILURE and message: missing main.py and GOOGLE_FUNCTION_SOURCE not specified. Either create the function in main.py or specify GOOGLE_FUNCTION_SOURCE to point to the file that contains the function.
If you add main.py to the root of the repo and run the same command, it finds main.py, which indicates to me that it isn't honoring the paths.
There is an additional problem, which doesn't matter unless the first one is fixed: if pathname is below the top level (folder/subfolder), gcloud treats that as a syntax error when the --gen2 flag is set, but not without it.
Is there any way around this? It is very inconvenient.
Answering as community wiki, as per the comments above:
There is a bug raised for this in the issue tracker, which is still open; further progress can be tracked there.
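As an aside, the gen2 error message itself suggests setting GOOGLE_FUNCTION_SOURCE, which can be passed as a build environment variable at deploy time. This is an untested sketch based only on the error text (the path is an assumption, and the remaining flags are the same as in the original command); whether it helps for source-repo deployments while the bug is open is unclear:
gcloud beta functions deploy funcname --gen2 --set-build-env-vars GOOGLE_FUNCTION_SOURCE=pathname/main.py ...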

How do I programmatically download a file from a private Google Cloud Source Repository with a service account?

I have a Google Cloud Source Repository I want my application to download files from. I have a specific use case where I want to get files from a Google Cloud Source Repository programmatically- not GCS or another location.
I want to control permissions to the repo with standard Google IAM. Can I grant a GCP service account access to read from a Cloud Source Repository?
In Bitbucket you can download a file directly from a private repo with a REST call like this: curl -s -S --user username:apppassword -L -O https://bitbucket.org/<ORG_NAME>/<REPO>/src/master/<FOLDER>/file.txt
How can I use a GSA to download a file like this from a private Google Cloud Source Repository?
I am doing this in code, so I do not have access to ssh, curl, or the gcloud CLI. I'll be using Python to fetch this file.
I was also looking into whether the SDK supports this. I did not see anything in the docs about a Python API for interacting with Google Cloud Source Repositories this way. I'm wondering how I can pull down this file with the requests library, or even something like GitPython, while authenticating with the GSA.
EDIT
Per the comments, I tried creating a token in Python and with gcloud, but it does not work. The token is generated fine, but the file download doesn't work.
I tried this (and the equivalent via Python):
curl -s -S -H "Authorization: Bearer $(gcloud auth print-access-token)" -L -O https://source.cloud.google.com/MY_GCP_PROJECT/MY_REPO/master/README.md
This downloads a huge HTML page that seems to be showing auth errors.
Maybe the HTTP path is wrong? What is the correct path to the file in the source repo via HTTP GET?
I confirmed I have permissions, because this works: gcloud source repos clone MY_REPO --project=MY_PROJECT
EDIT
This is where I am right now; I can't figure out the right URL to point to a specific branch and file:
import google.auth
import google.auth.transport.requests
import requests
# Generate a token from current security context
creds, project = google.auth.default()
auth_req = google.auth.transport.requests.Request()
creds.refresh(auth_req)
# Set token in Authorization header of http request
headers = {'Authorization':'Bearer {}'.format(creds.token)}
# Repo URL with branch and file specified (trying to download README.md in the root of the repo)
# What is the right URL here?
url = "https://source.developers.google.com/p/<GCP PROJECT>/r/<REPO NAME>/<BRANCH NAME>/README.md"
response = requests.get(url, headers=headers)
# I get a big mess of html with auth errors
print(response.content)
If I use this URL "https://source.developers.google.com/<GCP PROJECT>/<REPO NAME>/<BRANCH NAME>/README.md" I get back a page that includes PERMISSION_DENIED: The caller does not have permission
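For what it's worth, since Cloud Source Repositories is fundamentally a git remote rather than a raw-file HTTP endpoint, one option that avoids guessing the URL is a shallow git clone with the access token passed via git's http.extraHeader setting, then reading the file from the checkout. This is only a sketch, and it assumes a git binary is available to the process:
import os
import subprocess
import tempfile

import google.auth
import google.auth.transport.requests

# Token from the service account's Application Default Credentials.
creds, _ = google.auth.default()
creds.refresh(google.auth.transport.requests.Request())

repo_url = "https://source.developers.google.com/p/<GCP PROJECT>/r/<REPO NAME>"

with tempfile.TemporaryDirectory() as tmp:
    # Shallow-clone a single branch, authenticating with the bearer token.
    subprocess.run(
        ["git", "-c", "http.extraHeader=Authorization: Bearer " + creds.token,
         "clone", "--depth", "1", "--branch", "<BRANCH NAME>", repo_url, tmp],
        check=True)
    with open(os.path.join(tmp, "README.md")) as f:
        print(f.read())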

How to download jar from artifact registry (GCP)?

I have a Maven Artifact Registry and am able to add a dependency in pom.xml and get the JAR.
I have another use case where I would like to download only the JAR using the CLI, something you can easily do with other external Maven repos, e.g. curl https://repo1.maven.org/maven2/org/apache/iceberg/iceberg-spark-runtime/0.7.0-incubating/iceberg-spark-runtime-0.7.0-incubating.jar --output temp.jar
I don't see any instructions about how to do this.
I needed this too.
I have configured a service account following the GCP guide.
Then, I executed the following command to get basic-auth credentials:
gcloud artifacts print-settings gradle \
[--project=PROJECT] \
[--repository=REPOSITORY] \
[--location=LOCATION] \
--json-key=KEY-FILE \
[--version-policy=VERSION-POLICY] \
[--allow-snapshot-overwrites]
In the output you have the artifactRegistryMavenSecret.
Finally you get your artifact with :
curl -L -u _json_key_base64:{{ artifactRegistryMavenSecret }} https://{{ region }}-maven.pkg.dev/{{ projectId }}/{{ repository }}/path/of/artifact/module/{{ version }}/app-{{ version }}.jar -o file.jar
It seems like this feature does not exist yet for Artifact Registry, based on this open feature request (which currently has no ETA). However, you can set up a Cloud Build automation that not only saves your built artifact in Artifact Registry, but also stores it in Google Cloud Storage or another storage repository; that way you can easily access the JARs (since Cloud Storage supports direct downloads).
In order to do this, you would need to integrate Cloud Build with Artifact Registry. The documentation page has instructions to use Maven projects with Cloud Build and Artifact Registry. In addition, you can configure Cloud Build to store built artifacts in Cloud Storage.
Both of these integrations are configured through a Cloud Build configuration file. In this file, the steps for building a project are defined, including integrations to other serverless services. This integration would involve defining a target Maven repository:
steps:
- name: gcr.io/cloud-builders/mvn
  args: ['deploy']
And a location to deploy the artifacts into Cloud Storage:
artifacts:
  objects:
    location: [STORAGE_LOCATION]
    paths: [[ARTIFACT_PATH], [ARTIFACT_PATH], ...]
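Put together, a complete cloudbuild.yaml along these lines might look like the following (the bucket and artifact paths are illustrative):
steps:
- name: gcr.io/cloud-builders/mvn
  args: ['deploy']
artifacts:
  objects:
    location: 'gs://my-artifacts-bucket/jars/'
    paths: ['target/*.jar']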
In addition to @Nicolas Roux's answer:
artifactRegistryMavenSecret is basically a base64 encoding of the service account JSON key.
So instead of running gcloud artifacts print-settings gradle and curl -u _json_key_base64:{{ artifactRegistryMavenSecret }}, another way is to use the token from gcloud auth print-access-token directly, and pass that token to curl.
For example:
1. gcloud auth activate-service-account SERVICE_ACCOUNT@DOMAIN.COM \
--key-file=/path/key.json --project=PROJECT_ID
2. curl --oauth2-bearer "$(gcloud auth print-access-token)" \
-o app-{{ version }}.jar \
-L https://{{ region }}-maven.pkg.dev/{{ projectId }}/{{ repository }}/path/of/artifact/module/{{ version }}/app-{{ version }}.jar
That way, if you're working with the Google Auth action (google-github-actions/auth@v0) in a GitHub Actions workflow, you can easily run the curl command without needing to extract artifactRegistryMavenSecret.
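The same download can also be done from Python instead of curl; here's a minimal sketch with google-auth and requests, reusing the URL layout from the curl examples above (the {{ ... }} placeholders are illustrative, as before):
import google.auth
import google.auth.transport.requests
import requests

# Access token from Application Default Credentials.
creds, _ = google.auth.default()
creds.refresh(google.auth.transport.requests.Request())

url = ("https://{{ region }}-maven.pkg.dev/{{ projectId }}/{{ repository }}"
       "/path/of/artifact/module/{{ version }}/app-{{ version }}.jar")

resp = requests.get(url, headers={"Authorization": "Bearer " + creds.token})
resp.raise_for_status()
with open("file.jar", "wb") as f:
    f.write(resp.content)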

How to specify the root folder to deploy an app using the Cloud SDK?

I'm using "Google App Engine" from GCP to host a static website. I already created the website files (HTML, JS) and yaml using Visual Studio Code. I have the folder with those files stored locally in my local computer.
I downloaded the Cloud SDK Shell for Windows. I logged in to my account, and selected the project. According to videos and tutorials, I need to deploy the app using "gcloud app deploy".
However, I got an error saying that an "app.yaml" file is required to deploy this directory...
I'm trying to follow this tutorial:
https://cloud.google.com/appengine/docs/standard/python/getting-started/hosting-a-static-website#before_you_begin
I'm also trying to follow the steps contained in this video:
https://www.youtube.com/watch?v=mlcO7nfQzSg
How do I specify the root folder where I have my "app.yaml" file?
Thanks in advance!!
I already tried many commands and unfortunately none of them worked.
The particular case in which gcloud app deploy works without additional arguments is for single-service applications only, and only if the command is executed in the directory in which the service's app.yaml configuration file exists (and it must have exactly that name; you can't use a different one).
For other cases deployables can/must be specified. From gcloud app deploy:
SYNOPSIS
gcloud app deploy [DEPLOYABLES …] [--bucket=BUCKET] [--image-url=IMAGE_URL] [--no-promote] [--no-stop-previous-version]
[--version=VERSION, -v VERSION] [GCLOUD_WIDE_FLAG …]
DESCRIPTION
This command is used to deploy both code and configuration to the App
Engine server. As an input it takes one or more DEPLOYABLES that
should be uploaded. A DEPLOYABLE can be a service's .yaml file or a
configuration's .yaml file (for more information about configuration
files specific to your App Engine environment, refer to
https://cloud.google.com/appengine/docs/standard/python/configuration-files
or
https://cloud.google.com/appengine/docs/flexible/python/configuration-files).
Note, for Java Standard apps, you must add the path to the
appengine-web.xml file inside the WEB-INF directory. gcloud app
deploy skips files specified in the .gcloudignore file (see gcloud
topic gcloudignore for more information).
So, apart from running the command with no arguments in the directory in which your app.yaml exists, the alternative is to specify the app.yaml (with a full or relative path if needed) as a deployable:
gcloud app deploy path/to/your/app.yaml
IMHO doing this is a good habit: specifying deployables is more reliable, and it is the only way to deploy apps with multiple services or apps that use routing via a dispatch.yaml file.
gcloud app deploy looks in the current directory first for app.yaml. Generally you will change to the directory with app.yaml and your other files before deploying.
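For reference, a minimal app.yaml for a static website along the lines of the tutorial linked in the question might look like this (the www directory name and runtime version are illustrative):
runtime: python39
handlers:
- url: /
  static_files: www/index.html
  upload: www/index.html
- url: /(.*)
  static_files: www/\1
  upload: www/(.*)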

GCloud Error: Source code size exceeds the limit

I'm doing the basic fulfillment and conversation setup from the api.ai tutorial to make a chat bot, and when I try to deploy the function with the command:
gcloud beta functions deploy --stage-bucket venky-bb7c4.appspot.com --trigger-http
(where 'venky-bb7c4.appspot.com' is the bucket name)
It returns the following error message:
ERROR: (gcloud.beta.functions.deploy) OperationError: code=3, message=Source code size exceeds the limit
I've searched but haven't found any answer; I don't know where the error is.
This is the JS file that appears in the tutorial:
/**
 * HTTP Cloud Function.
 *
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
exports.helloHttp = function helloHttp (req, res) {
  const response = "This is a sample response from your webhook!"; // Default response from the webhook to show it's working
  res.setHeader('Content-Type', 'application/json'); // Requires application/json MIME type
  // "speech" is the spoken version of the response, "displayText" is the visual version
  res.send(JSON.stringify({ "speech": response, "displayText": response }));
};
Neither of these worked for me. The way I was able to fix this was to make sure I was running the deploy from my project directory (the directory containing index.js).
The command creates a zip with the whole content of your current directory (except the node_modules subdirectory), not just the JS file (this is because your function may use other resources).
The error you see is because the size of the (uncompressed) files in the directory is bigger than 512MB.
The easiest way to solve this is by moving the .js file to its own directory and deploying from there (you can use --local-path to point to the directory containing the source file if you want your working directory to be different from the directory with the function source).
I tried the source option and deploying from the index.js folder, and still a different problem exists.
This error usually happens if the code that is being uploaded is large. In my tests I found that more than 100MB leads to the mentioned error.
To resolve this, there are two solutions:
Update .gcloudignore to ignore the folders which aren't required for your function.
If option 1 doesn't resolve it, you need to create a bucket in Storage and pass it with the --stage-bucket option.
Create a new bucket for deployment (one time)
gsutil mb gs://my-cloud-functions-deployment-bucket
The bucket name you choose needs to be globally unique, otherwise the command fails because the bucket already exists.
Deploy
gcloud functions deploy subscribers-firestoreDatabaseChange \
  --trigger-topic firestore-database-change \
  --region us-central1 \
  --runtime nodejs10 \
  --update-env-vars "REDIS_HOST=10.128.0.2" \
  --stage-bucket my-cloud-functions-deployment-bucket
I had similar problems while deploying Cloud Functions. What worked for me was specifying the source folder of the JS files:
gcloud functions deploy functionName --trigger-http --source path_to_project_root_folder
Also be sure to list all unnecessary folders in .gcloudignore so they are excluded from the upload.
Ensure the package folder has a .gitignore file (excluding node_modules).
The most recent version of gcloud requires it in order not to upload node_modules. My code size went from 119MB to 17KB.
Once I added the .gitignore file, the log also printed:
created .gcloudignore file. See `gcloud topic gcloudignore` for details.
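For reference, the generated .gcloudignore for a Node.js function typically contains entries along these lines (shown here as a sketch; adjust to your project):
.gcloudignore
.git
.gitignore
node_modules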