Why doesn't my Cloud Function (2nd gen) trigger? - google-cloud-platform

After getting some help deploying the sample code for a Cloud Function with a storage trigger, the function now deploys just fine, but it won't trigger :-(
The function shows as deployed.
The deployed code (per the tutorial):
'use strict';
// [START functions_cloudevent_storage]
const functions = require('@google-cloud/functions-framework');
// Register a CloudEvent callback with the Functions Framework that will
// be triggered by Cloud Storage.
functions.cloudEvent('imageAdded', cloudEvent => {
console.log(`Event ID: ${cloudEvent.id}`);
console.log(`Event Type: ${cloudEvent.type}`);
const file = cloudEvent.data;
console.log(`Bucket: ${file.bucket}`);
console.log(`File: ${file.name}`);
console.log(`Metageneration: ${file.metageneration}`);
console.log(`Created: ${file.timeCreated}`);
console.log(`Updated: ${file.updated}`);
});
// [END functions_cloudevent_storage]
The deploy command I used:
gcloud functions deploy xxx-image-handler --gen2 --runtime=nodejs16 --project myproject --region=europe-west3 --source=. --entry-point=imageAdded --trigger-event-filters='type=google.cloud.storage.object.v1.finalized' --trigger-event-filters='bucket=xxx_report_images'
To test, I just uploaded a text file from the command line:
gsutil cp test-finalize.txt gs://xxx_report_images/test-finalize.txt
This worked fine, and the file was uploaded to the xxx_report_images bucket (which is also in the europe-west3 region), but the function is never triggered, as shown by 0 items in the log:
gcloud beta functions logs read xxx-image-handler --gen2 --limit=100 --region europe-west3
Listed 0 items.
What is going on here? This seems very straightforward and I fail to see what I'm missing, so hopefully I can get some guidance.
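For reference, the underlying Eventarc trigger can presumably be inspected with something like the following (names taken from the deploy command above):
gcloud functions describe xxx-image-handler --gen2 --region=europe-west3 --project=myproject
gcloud eventarc triggers list --location=europe-west3 --project=myproject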
**** EDIT 1 ****
Regarding comment 1 below: I can see in the Eventarc trigger list that the service account used by Eventarc is xxx-compute@developer.gserviceaccount.com.
And I can see in the IAM principals list that this indeed is the default compute service account, which has the Editor role. I also explicitly added the Eventarc Connection Publisher role to this service account, but without success (I assume the Editor role already contains this role, but just to be sure...). So I guess the issue is not related to your suggestion (?).
BTW, I tried the suggested gcloud compute project-info describe --format="value(defaultServiceAccount)" but just got Could not fetch resource: - Required 'compute.projects.get' permission for 'projects/myproject', and I couldn't figure out which role to add to which service account to enable this. However, as seen above, I found the info in the Eventarc section of the GCP console instead.
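(For what it's worth, compute.projects.get is included in the Compute Viewer role, so a hedged sketch of a grant that should let my own account run that describe command would look something like the following; the user email is a placeholder:)
# me@example.com is a placeholder for the account running gcloud
gcloud projects add-iam-policy-binding myproject \
  --member="user:me@example.com" \
  --role="roles/compute.viewer"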
**** EDIT 2 ****
Indeed, after more testing I think I have nailed it down to a permission issue on Eventarc, just as suggested in the comment. I've posted a new question which is more specific. A sketch of the grants involved is below.
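A hedged sketch (not verified here, worth double-checking against the current docs) of the IAM grants the documentation for 2nd-gen Cloud Storage triggers generally calls for; PROJECT_NUMBER is a placeholder for the project's numeric ID:
# 1) The Cloud Storage service agent must be able to publish to Pub/Sub.
gcloud projects add-iam-policy-binding myproject \
  --member="serviceAccount:service-PROJECT_NUMBER@gs-project-accounts.iam.gserviceaccount.com" \
  --role="roles/pubsub.publisher"
# 2) The trigger's service account (the default compute SA in my case) must be
#    able to receive Eventarc events.
gcloud projects add-iam-policy-binding myproject \
  --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
  --role="roles/eventarc.eventReceiver"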

Related

GKE cluster creator in GCP

How can we get the cluster owner details in GKE? The logs only contain entries for service account operations, and there is no entry with the principal email of a user anywhere.
It seems very difficult to get the name of the user who created the GKE cluster.
We have exported the complete JSON file of logs but could not find the user entry for whoever actually clicked the create cluster button. I think knowing the GKE cluster creator is a very common use case, so I'm not sure if we are missing something.
Query:
resource.type="k8s_cluster"
resource.labels.cluster_name="clusterName"
resource.labels.location="us-central1"
-protoPayload.methodName="io.k8s.core.v1.configmaps.update"
-protoPayload.methodName="io.k8s.coordination.v1.leases.update"
-protoPayload.methodName="io.k8s.core.v1.endpoints.update"
severity=DEFAULT
-protoPayload.authenticationInfo.principalEmail="system:addon-manager"
-protoPayload.methodName="io.k8s.apiserver.flowcontrol.v1beta1.flowschemas.status.patch"
-protoPayload.methodName="io.k8s.certificates.v1.certificatesigningrequests.create"
-protoPayload.methodName="io.k8s.core.v1.resourcequotas.delete"
-protoPayload.methodName="io.k8s.core.v1.pods.create"
-protoPayload.methodName="io.k8s.apiregistration.v1.apiservices.create"
I have referred to the link below, but it did not help either.
https://cloud.google.com/blog/products/management-tools/finding-your-gke-logs
What you need are Audit Logs, and specifically Admin Activity audit logs.
And, there's a "trick": The activity audit log entries include the API method. You can find the API method that interests you. This isn't super straightforward but it's relatively easy. You can start by scoping to the service. For GKE, the service is container.googleapis.com.
NOTE: In APIs Explorer, see the Kubernetes Engine API (but really container.googleapis.com) and projects.locations.clusters.create. The mechanism breaks down a little here as the protoPayload.methodName is a variant of the underlying REST method name.
And so you can use logs explorer with the following very broad query:
logName="projects/{PROJECT}/logs/cloudaudit.googleapis.com%2Factivity"
container.googleapis.com
NOTE: replace {PROJECT} with your project ID.
And then refine this based on what's returned:
logName="projects/{PROJECT}/logs/cloudaudit.googleapis.com%2Factivity"
protoPayload.serviceName="container.googleapis.com"
protoPayload.methodName="google.container.v1beta1.ClusterManager.CreateCluster"
NOTE: I mentioned that it isn't super straightforward because, as you can see above, I'd used gcloud beta container clusters create and so I need the google.container.v1beta1.ClusterManager.CreateCluster method; but it was easy to determine this from the logs.
And, who dunnit?
protoPayload: {
authenticationInfo: {
principalEmail: "{me}"
}
}
So:
PROJECT="[YOUR-PROJECT]"
FILTER="
logName=\"projects/${PROJECT}/logs/cloudaudit.googleapis.com%2Factivity\"
protoPayload.serviceName=\"container.googleapis.com\"
protoPayload.methodName=\"google.container.v1beta1.ClusterManager.CreateCluster\"
"
gcloud logging read "${FILTER}" \
--project=${PROJECT} \
--format="value(protoPayload.authenticationInfo.principalEmail)"
For those who are looking for a quick answer: use the following log filter in Logs Explorer to check the creator of the cluster.
resource.type="gke_cluster"
protoPayload.authorizationInfo.permission="container.clusters.create"
resource.labels.cluster_name="your-cluster-name"
From the gcloud command line, you can get the creation date of the cluster:
gcloud container clusters describe YOUR_CLUSTER_NAME --zone ZONE
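A hedged variant that prints only the creation timestamp (cluster name and zone are placeholders):
gcloud container clusters describe YOUR_CLUSTER_NAME --zone ZONE --format="value(createTime)"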

GCP Vertex AI Training Custom Job: User does not have bigquery.jobs.create permission

I'm struggling to execute a query with the BigQuery Python client from inside a Vertex AI custom training job on Google Cloud Platform.
I have built a Docker image which contains this Python code, then pushed it to Container Registry (eu.gcr.io).
I am using this command to deploy:
gcloud beta ai custom-jobs create --region=europe-west1 --display-name="$job_name" \
--config=config_custom_container.yaml \
--worker-pool-spec=machine-type=n1-standard-4,replica-count=1,container-image-uri="$docker_img_path" \
--args="${model_type},${env},${now}"
I have even tried to use the --service-account option to specify a service account with the BigQuery Admin role; it did not work.
According to this link
https://cloud.google.com/vertex-ai/docs/general/access-control?hl=th#granting_service_agents_access_to_other_resources
the Google-managed service accounts for the AI Platform Custom Code Service Agent (Vertex AI) already have the right to access BigQuery, so I do not understand why my job fails with this error:
google.api_core.exceptions.Forbidden: 403 POST https://bigquery.googleapis.com/bigquery/v2/projects/*******/jobs?prettyPrint=false:
Access Denied: Project *******:
User does not have bigquery.jobs.create permission in project *******.
I have replaced the id with *******
Edit:
I have tried several configurations; my last config YAML file only contains this:
baseOutputDirectory:
  outputUriPrefix:
Using the serviceAccount field does not seem to change the actual configuration, unlike the --service-account option.
Edit 14-06-2021: Quick fix
As @Ricco.D said:
try explicitly defining the project_id in your BigQuery code if you have not done this yet.
bigquery.Client(project=[your-project])
This has fixed my problem. I still do not know the cause.
To fix the issue, you need to explicitly specify the project ID in the BigQuery code.
Example:
bigquery.Client(project=[your-project], credentials=credentials)
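If the targeted project is right and the job's runtime service account still lacks access, note that bigquery.jobs.create is carried by the BigQuery Job User role; a hedged sketch of that grant (the project ID and service account email are placeholders):
# PROJECT_ID and the service account email below are placeholders
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:training-job-sa@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"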

GCP project creation via deploymentmanager

So I'm trying to create a project with Google Cloud Deployment Manager.
I've structured the setup roughly as below:
# Structure
Org -> Folder1 -> Seed-Project (location where I am running Deployment Manager from)
Organization:
IAM:
-> {Seed-Project-Number}@cloudservices.gserviceaccount.com:
- Compute Network Admin
- Compute Shared VPC Admin
- Organisation Viewer
- Project Creator
# DeploymentManager Resource:
type: cloudresourcemanager.v1.project
name: MyNewProject
parent:
  id: '{folder1-id}'
  type: folder
projectId: MyNewProject
The desired result is that MyNewProject should be created under Folder1.
However, it appears as if the Deployment Manager service account does not have sufficient permissions:
$ CLOUDSDK_CORE_PROJECT=Seed-Project gcloud deployment-manager deployments \
create MyNewDeployment \
--config config.yaml \
--verbosity=debug
Error message:
- code: RESOURCE_ERROR
location: /deployments/MyNewDeployment/resources/MyNewProject
message: '{"ResourceType":"cloudresourcemanager.v1.project",
"ResourceErrorCode":"403","ResourceErrorMessage":{"code":403,"message":"The
caller does not have permission","status":"PERMISSION_DENIED","statusMessage":"Forbidden","requestPath":"https://cloudresourcemanager.googleapis.com/v1/projects/MyNewProject","httpMethod":"GET"}}'
I've done some digging, and it appears to be calling the resourcemanager.projects.get method. The 'Compute Shared VPC Admin (roles/compute.xpnAdmin)' role should provide this permission, as documented here: https://cloud.google.com/iam/docs/understanding-roles
Except that doesn't seem to be the case. What's going on?
Edit
I'd like to add some additional information gathered from debugging efforts.
These are the API requests from the deployment manager (from the seed project).
You can see that the caller is an anonymous service account, which isn't what I'd expect to see. (I'd expect to see {Seed-Project-Number}@cloudservices.gserviceaccount.com as the calling account here.)
Edit-2
config.yaml
imports:
- path: composite_types/project/project.py
  name: project.py
resources:
- name: MyNewProject
  type: project.py
  properties:
    parent:
      type: folder
      id: "{folder1-id}"
    billingAccountId: billingAccounts/REDACTED
    activateApis:
    - compute.googleapis.com
    - deploymentmanager.googleapis.com
    - pubsub.googleapis.com
    serviceAccounts: []
composite_types/project/* is an exact copy of the templates found here:
https://github.com/GoogleCloudPlatform/deploymentmanager-samples/tree/master/community/cloud-foundation/templates/project
The key thing is that this is a GET operation, not an attempt to create the project. It is there to verify the global uniqueness of the requested project ID, and if the ID is not unique, PERMISSION_DENIED is thrown.
Lousy error message, and lots of wasted developer hours!
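You can see the same behaviour outside Deployment Manager with a hedged one-liner (project ID taken from the question):
# Returns PERMISSION_DENIED both when you lack access and when the
# project ID already belongs to someone else.
gcloud projects describe MyNewProject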
Probably late, but just to share that I ran into a similar issue today. I double-checked every permission mentioned in the README for the service account under which the Deployment Manager job runs ({Seed-Project-Number}@cloudservices.gserviceaccount.com in the question), and it turned out that the Billing Account User role was not assigned, contrary to what I thought earlier. Granting that and running it again worked.
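A hedged sketch of one way to add that grant at the organization level (granting it on the billing account itself is an alternative); the org ID and project number are placeholders:
# ORG_ID and SEED_PROJECT_NUMBER are placeholders
gcloud organizations add-iam-policy-binding ORG_ID \
  --member="serviceAccount:SEED_PROJECT_NUMBER@cloudservices.gserviceaccount.com" \
  --role="roles/billing.user"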

What does "Unknown Error" mean when `gcloud datastore export`?

I want to back up data in Cloud Datastore to Cloud Storage, and executed a command like this:
gcloud datastore export gs://some_bucket/path/ \
--namespaces=foo --kinds='Bar' --project some_project
But it just returns:
ERROR: (gcloud.datastore.export) UNKNOWN: Unknown Error.
I cannot figure out what is wrong, and cannot find a solution to this either.
What does this error mean?
Unknown Error most likely means bad parameter(s)...
The / at the end of the path could be the reason, or possibly that the --namespaces value lacks the single quotes; those values are also case-sensitive. That's at least what the documentation would hint at:
gcloud datastore export gs://some_bucket/path \
--namespaces='foo' --kinds='Bar' --project some_project
Also, there's a --verbosity parameter.
Your command is correct. The reason you are receiving this error is most likely related to permissions.
For all export requests, both the account making the request and the App Engine default service account for the GCP project must have an IAM role that grants the following permissions for your Cloud Storage bucket (a hedged sketch of one way to grant them follows the list):
storage.buckets.get
storage.objects.create
storage.objects.list
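# A sketch assuming the bucket and project from the question; the App Engine
# default service account is PROJECT_ID@appspot.gserviceaccount.com, and
# roles/storage.admin is broader than the three permissions above but covers them.
gsutil iam ch \
  serviceAccount:some_project@appspot.gserviceaccount.com:roles/storage.admin \
  gs://some_bucket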
In my case this was a testing project, and apparently I hadn't enabled Cloud Storage in that Firebase project... wasted a good 4 hours on this 🤦🏽‍♂️

Is it possible to deploy a background Function "myBgFunctionInProjectB" in "project-b" triggered by my topic "my-topic-project-a" from "project-a"?

It's possible to create a topic "my-topic-project-a" in project "project-a" so that it is publicly visible (this is done by granting the "Pub/Sub Subscriber" role to "allUsers" on it).
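(A hedged sketch of that grant, using the names above:)
gcloud pubsub topics add-iam-policy-binding my-topic-project-a \
  --project=project-a \
  --member="allUsers" \
  --role="roles/pubsub.subscriber"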
Then from project "project-b" I can create a subscription to "my-topic-project-a" and read the events from "my-topic-project-a". This is done using the following gcloud commands:
(these commands are executed on project "project-b")
gcloud pubsub subscriptions create subscription-to-my-topic-project-a --topic projects/project-a/topics/my-topic-project-a
gcloud pubsub subscriptions pull subscription-to-my-topic-project-a --auto-ack
So OK, this is possible when creating a subscription in "project-b" linked to "my-topic-project-a" in "project-a".
In my use case I would like to be able to deploy a background function "myBgFunctionInProjectB" in "project-b" that is triggered by my topic "my-topic-project-a" from "project-a".
But... this doesn't seem to be possible, since the gcloud CLI is not happy when you provide the full topic name while deploying the cloud function:
gcloud beta functions deploy myBgFunctionInProjectB --runtime nodejs8 --trigger-topic projects/project-a/topics/my-topic-project-a --trigger-event google.pubsub.topic.publish
ERROR: (gcloud.beta.functions.deploy) argument --trigger-topic: Invalid value 'projects/project-a/topics/my-topic-project-a': Topic must contain only Latin letters (lower- or upper-case), digits and the characters - + . _ ~ %. It must start with a letter and be from 3 to 255 characters long.
Is there a way to achieve that, or is this actually not possible?
Thanks
So, it seems that this is not actually possible. I have confirmed it by checking in 2 different ways:
If you try to create a function through the API Explorer, you will need to fill in the location where you want to run it, for example projects/PROJECT_FOR_FUNCTION/locations/PREFERRED-LOCATION, and then provide a request body like this one:
{
  "eventTrigger": {
    "resource": "projects/PROJECT_FOR_TOPIC/topics/YOUR_TOPIC",
    "eventType": "google.pubsub.topic.publish"
  },
  "name": "projects/PROJECT_FOR_FUNCTION/locations/PREFERRED-LOCATION/functions/NAME_FOR_FUNCTION"
}
This will result in a 400 error code, with a message saying:
{
  "field": "event_trigger.resource",
  "description": "Topic must be in the same project as function."
}
It will also say that you are missing the source code but, nonetheless, the API already shows that this is not possible.
There is an already open issue in the Public Issue Tracker for this very same issue. Bear in mind that there is no ETA for it.
I also tried to do this from gcloud, as you did, and obviously got the same result. I then tried to remove the projects/project-a/topics/ bit from my command, but this just creates a new topic in the same project where the function is created, so it's not what you want.
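In other words, --trigger-topic only accepts a bare topic name, which is resolved in the function's own project; a hedged sketch of the only form it accepts:
# The bare name is resolved in project-b, not project-a, so this does not
# give you the cross-project trigger you want.
gcloud beta functions deploy myBgFunctionInProjectB --runtime nodejs8 \
  --project project-b --trigger-topic my-topic-project-a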