Redeploy Google Cloud Function from command line using Source Repositories - google-cloud-platform

I have a fairly simple Google Cloud Function that I'm deploying from Cloud Source Repositories.
I'm using the Google Cloud Shell as my development machine.
When I make updates to the function as I'm developing, I use the CLI to push updates to my Source Repository. However, running the gcloud functions deploy ... command from the command line doesn't seem to force GCF to pull in the latest source.
Occasionally, after I push new source code, the deploy command will simply state "Nothing to update" (which is incorrect).
More often, it will go through the deployment process but the function will still run the previous version of the code.
When this happens, the only way I can get the function to update is to use the dashboard, "Edit" the function, and then hit the Deploy button (even though I didn't change anything).
Am I forgetting to do some kind of versioning or tagging that is required? Is there a way to force the CLI to pull the most current commit from the source repo?

I think you're looking for the --source=SOURCE gcloud functions deploy option to point to a source repository instead of the current directory (the default):
--source=SOURCE
Location of source code to deploy. Location of the source can be one
of the following three options:
Source code in Google Cloud Storage (must be a .zip archive),
Reference to source repository or,
Local filesystem path (root directory of function source).
Note that if you do not specify the --source flag:
Current directory will be used for new function deployments.
If the function was previously deployed using a local filesystem path, then the function's source code will be updated using the current directory.
If the function was previously deployed using a Google Cloud Storage location or a source repository, then the function's source code will not be updated.
The value of the flag will be interpreted as a Cloud Storage location,
if it starts with gs://.
The value will be interpreted as a reference to a source repository,
if it starts with https://.
Otherwise, it will be interpreted as the local filesystem path. When
deploying source from the local filesystem, this command skips files
specified in the .gcloudignore file (see gcloud topic
gcloudignore for more information). If the .gcloudignore file
doesn't exist, the command will try to create it.
The minimal source repository URL is:
https://source.developers.google.com/projects/${PROJECT}/repos/${REPO}
By using the URL above, sources from the root directory of the
repository on the revision tagged master will be used.
If you want to deploy from a revision different from master, append
one of the following three sources to the URL:
/revisions/${REVISION},
/moveable-aliases/${MOVEABLE_ALIAS},
/fixed-aliases/${FIXED_ALIAS}.
If you'd like to deploy sources from a directory different from the
root, you must specify a revision, a moveable alias, or a fixed alias,
as above, and append /paths/${PATH_TO_SOURCES_DIRECTORY} to the URL.
Overall, the URL should match the following regular expression:
^https://source\.developers\.google\.com/projects/
(?<accountId>[^/]+)/repos/(?<repoName>[^/]+)
(((/revisions/(?<commit>[^/]+))|(/moveable-aliases/(?<branch>[^/]+))|
(/fixed-aliases/(?<tag>[^/]+)))(/paths/(?<path>.*))?)?$
An example of a validly formatted source repository URL is:
https://source.developers.google.com/projects/123456789/repos/testrepo/moveable-aliases/alternate-branch/paths/path-to=source
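So, for your case, a redeploy that explicitly pulls the latest commit of the master branch from the repository might look like the sketch below (the project, repo, and function names are placeholders, and the trigger flag should be whichever one your function actually uses):
gcloud functions deploy myFunction \
--source=https://source.developers.google.com/projects/my-project/repos/my-repo/moveable-aliases/master \
--trigger-http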

Related

GCP bucket reachable in UI but not by gcsfuse in the cloud shell

Hi, I want to reach some files in a GCP bucket from the Cloud Shell terminal (for sftp reasons). gcsfuse successfully mounts the parent dir and it has all the directories except the one I need. Any ideas what I am doing wrong?
In Google Cloud Storage, object names ending in a slash (/) represent a directory, and all other object names represent a file. By default, directories are not implicitly defined; they exist only if a matching object ending in a slash (/) exists.
Since the usual file system operations like mkdir will do the right thing, if someone set up a bucket's structure using only gcsfuse then they will not notice anything odd about this. However, if someone uses some other tool to set up objects in Google Cloud Storage (such as the storage browser in the Google Cloud Console), they may notice that not all objects are visible until they create leading directories for them.
For example, let's say someone uploaded an object demo/start.txt by choosing the folder upload option in the storage browser section in Google Cloud Console, then mounted it with gcsfuse. The file system will initially appear empty, since there is no demo/ object. However if they subsequently run mkdir demo, they will now see a directory named demo containing a file named start.txt.
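One way to check whether such a placeholder object exists for the directory you are missing is with gsutil (bucket and directory names below are illustrative):
gsutil stat gs://my-bucket/demo/
gsutil ls gs://my-bucket/demo/
gsutil stat succeeds only if an object literally named demo/ exists, while gsutil ls shows any objects stored under that prefix either way.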
To mitigate this issue gcsfuse supports a flag called --implicit-dirs. When this flag is enabled, name lookup requests use the Google Cloud Storage API's Objects.list operation to search for objects that would implicitly define the existence of a directory with the name in question. So, in the example above, a directory named demo containing a file start.txt would appear.
So in your case I suspect the file you are not able to see is a folder which you uploaded to the Google Cloud Storage bucket. As you have already mounted the bucket with gcsfuse on a directory, mounting it again using the flag --implicit-dirs will throw an error. So I would suggest you unmount the directory by running the following command:
fusermount -u /path/to/mount/directory
Then mount the directory again by running the following command:
gcsfuse --implicit-dirs BUCKET_NAME /path/to/mount/directory
You can also create a new directory and mount that directory with gcsfuse without unmounting the existing mounted directory.
Please note that the flag --implicit-dirs has some drawbacks. I would recommend going through this GitHub issue for detailed information about it.

How can I find the implementation code for streaming data from Cloud Storage into BigQuery?

When I try to run the relevant code in Cloud Shell that would allow the streaming function to be deployed, it claims that the source folder containing the streaming function itself does not exist.
The relevant buckets have already been created; it's the function itself which appears not to be there. Would it be possible to install this separately, maybe?
The original code followed by the error message is given below:
gcloud functions deploy streaming \
--source=./functions/streaming --runtime=python37 \
--stage-bucket=${FUNCTIONS_BUCKET} \
--trigger-bucket=${FILES_SOURCE}
(gcloud.functions.deploy) argument '--source': Provided directory does not exist
As shown in the error message, the path './functions/streaming' does not exist in your Cloud Shell. You have to pass the absolute path to the directory (in your Cloud Shell) where your Python code is located. Please refer to the documentation below:
The source parameter can take 3 different locations:
--source=SOURCE Location of source code to deploy. Location of the source can be one of the following three options:
Source code in Google Cloud Storage (must be a .zip archive),
Reference to source repository or,
Local filesystem path (root directory of function source).
Note that, depending on your runtime type, Cloud Functions will look
for files with specific names for deployable functions. For Node.js,
these filenames are index.js or function.js. For Python, this is
main.py.
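For example, if you cloned the tutorial code into your home directory in Cloud Shell (the repository folder name below is just a placeholder), the deploy command could look like this:
gcloud functions deploy streaming \
--source=$HOME/your-tutorial-repo/functions/streaming \
--runtime=python37 \
--stage-bucket=${FUNCTIONS_BUCKET} \
--trigger-bucket=${FILES_SOURCE}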

How to specify the root folder to deploy an app using the Cloud SDK?

I'm using "Google App Engine" from GCP to host a static website. I already created the website files (HTML, JS) and yaml using Visual Studio Code. I have the folder with those files stored locally in my local computer.
I downloaded the Cloud SDK Shell for Windows. I logged in to my account, and selected the project. According to videos and tutorials, I need to deploy the app using "gcloud app deploy".
However, I got an error saying that an "app.yaml" file is required to deploy this directory...
I'm trying to follow this tutorial:
https://cloud.google.com/appengine/docs/standard/python/getting-started/hosting-a-static-website#before_you_begin
I'm also trying to follow the steps contained in this video:
https://www.youtube.com/watch?v=mlcO7nfQzSg
How do I specify the root folder where I have my "app.yaml" file?
Thanks in advance!!
I already tried many commands and unfortunately none of them worked.
The particular case in which gcloud app deploy works without additional arguments is for single-service applications only and only if the command is executed in the directory in which the service's app.yaml configuration file exists (and has that exact name, can't use a different name).
For other cases deployables can/must be specified. From gcloud app deploy:
SYNOPSIS
gcloud app deploy [DEPLOYABLES …] [--bucket=BUCKET] [--image-url=IMAGE_URL] [--no-promote] [--no-stop-previous-version]
[--version=VERSION, -v VERSION] [GCLOUD_WIDE_FLAG …]
DESCRIPTION
This command is used to deploy both code and configuration to the App
Engine server. As an input it takes one or more DEPLOYABLES that
should be uploaded. A DEPLOYABLE can be a service's .yaml file or a
configuration's .yaml file (for more information about configuration
files specific to your App Engine environment, refer to
https://cloud.google.com/appengine/docs/standard/python/configuration-files
or
https://cloud.google.com/appengine/docs/flexible/python/configuration-files).
Note, for Java Standard apps, you must add the path to the
appengine-web.xml file inside the WEB-INF directory. gcloud app
deploy skips files specified in the .gcloudignore file (see gcloud
topic gcloudignore for more information).
So, apart from running the command with no arguments in the directory in which your app.yaml exists, the alternative is to specify the app.yaml (with a full or relative path if needed) as a deployable:
gcloud app deploy path/to/your/app.yaml
IMHO doing this is a good habit - specifying deployables is more reliable and is the only way to deploy apps with multiple services or using routing via a dispatch.yaml file.
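As an illustration of that multi-service case (the service names and file layout below are hypothetical), a single command can deploy several deployables at once:
gcloud app deploy default/app.yaml api/app.yaml dispatch.yaml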
gcloud app deploy will look in the current directory first for app.yaml. Generally you will change to the directory with app.yaml and your other files before deploying.
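For instance, on Windows (the path is a placeholder for wherever your app.yaml actually lives):
cd C:\Users\you\my-website
gcloud app deploy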

Google Container Registry build trigger on folder change

I can set up a build trigger on GCR to build my Docker image every time my Git repository gets updated. However, I have a single repository with multiple folders, and a Dockerfile in each folder.
Ex:
my_app
-- service-1
   Dockerfile-1
-- service-2
   Dockerfile-2
How do I only build Dockerfile-1 when the service-1 folder gets updated?
This is a variation on this GitHub feature request -- in your case, differential behavior based on the changed files (folders) rather than the branch.
We are considering this feature as part of the development of support for more advanced workflow control and will post back on that GitHub issue when it becomes available.
The work-around available to you today is to use a bash script that conditionally builds (or doesn't) based on an inspection of the files changed in the $COMMIT_SHA that triggered the build. Note that the git builder can be used to get the list of files changed via git diff-tree --no-commit-id --name-only -r $COMMIT_SHA.
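A rough sketch of such a script, assuming the build step has the repository history available and that $COMMIT_SHA and $PROJECT_ID are passed in (the folder and image names follow the example layout above):
#!/bin/bash
# Files touched by the commit that triggered the build
CHANGED=$(git diff-tree --no-commit-id --name-only -r "$COMMIT_SHA")

# Build service-1 only if something under service-1/ changed
if echo "$CHANGED" | grep -q "^service-1/"; then
  docker build -t "gcr.io/$PROJECT_ID/service-1" -f service-1/Dockerfile-1 service-1/
else
  echo "service-1 unchanged; skipping build"
fi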

GCloud Error: Source code size exceeds the limit

I'm following the api.ai basic fulfillment and conversation setup tutorial to make a chat bot, and when I try to deploy the function with the command:
gcloud beta functions deploy --stage-bucket venky-bb7c4.appspot.com --trigger-http
(where 'venky-bb7c4.appspot.com' is the bucket_name)
It returns the following error message:
ERROR: (gcloud.beta.functions.deploy) OperationError: code=3, message=Source code size exceeds the limit
I've searched but haven't found any answer; I don't know where the error is.
This is the JS file that appears in the tutorial:
/**
 * HTTP Cloud Function.
 *
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
exports.helloHttp = function helloHttp (req, res) {
  // Default response from the webhook to show it's working
  const response = "This is a sample response from your webhook!";
  // Requires application/json MIME type
  res.setHeader('Content-Type', 'application/json');
  // "speech" is the spoken version of the response, "displayText" is the visual version
  res.send(JSON.stringify({ "speech": response, "displayText": response }));
};
Neither of these worked for me. The way I was able to fix this was to make sure I was running the deploy from my project directory (the directory containing index.js).
The command creates a zip with the whole content of your current directory (except the node_modules subdirectory), not just the JS file (this is because your function may use other resources).
The error you see is because the size of the (uncompressed) files in the directory is bigger than 512 MB.
The easiest way to solve this is by moving the .js file to its own directory and deploying from there (you can use --local-path to point to the directory containing the source file if you want your working directory to be different from the directory with the function source).
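A minimal sketch of that approach for the tutorial code above (the directory name is arbitrary; the bucket is the one from the question):
mkdir ~/function-source
cp index.js ~/function-source/
cd ~/function-source
gcloud beta functions deploy helloHttp --stage-bucket venky-bb7c4.appspot.com --trigger-http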
I tried the --source option and deploying from the index.js folder, and still a different problem exists.
This error usually happens if the code that is being uploaded is large. In my tests I found that more than 100 MB led to the mentioned error.
To resolve this, there are two solutions:
1. Update .gcloudignore to ignore the folders which aren't required for your function.
2. If option 1 doesn't resolve it, create a bucket in storage and pass it with the --stage-bucket option.
Create a new bucket for deployment (one time)
gsutil mb gs://my-cloud-functions-deployment-bucket
The bucket name needs to be globally unique; otherwise it throws an error saying the bucket already exists.
Deploy
gcloud functions deploy subscribers-firestoreDatabaseChange \
--trigger-topic firestore-database-change \
--region us-central1 \
--runtime nodejs10 \
--update-env-vars "REDIS_HOST=10.128.0.2" \
--stage-bucket my-cloud-functions-deployment-bucket
I had similar problems while deploying cloud functions. What worked for me was specifying the source folder of the JS files:
gcloud functions deploy functionName --trigger-http --source=path_to_project_root_folder
Also be sure to list any unnecessary folders in .gcloudignore so they are excluded from the upload.
Ensure the package folder has a .gitignore file (excluding node_modules).
The most recent version of gcloud requires it in order not to load node_modules. My code size went from 119 MB to 17 KB.
Once I added the .gitignore file, the log printed as well:
created .gcloudignore file. See `gcloud topic gcloudignore` for details.
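For reference, a typical .gcloudignore for a Node.js function might contain entries like the following (the exact contents depend on your project; node_modules is the usual cause of oversized uploads):
.gcloudignore
.git
.gitignore
node_modules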