I am trying to fetch the data access logs for the Cloud Profiler API, which is created using a VM instance. I can see in Logs Explorer that the profile is created successfully and the logName contains data_access.
Now I am trying to fetch those logs programmatically. I tried the entries.list API from a Cloud Function in a number of ways; I am not getting any error, but no logs are returned. All logs other than the data access logs are visible, but when I filter for data access logs the output is empty, even though the same filter in the Console shows results.
I tried the same with the gcloud logging read command and still get no output:
gcloud beta logging read 'timestamp>="2021-05-13T12:09:05Z" AND logName:"projects/******/logs/cloudaudit.googleapis.com%2Fdata_access"' --limit=10 --format=json --order=asc
I have tried changing the order to desc and tried different filters, but nothing works.
I am getting a proper response from the Google API Explorer.
Update: I got it working after re-authenticating, but my Cloud Function still doesn't work. How would I re-authenticate in a Cloud Function?
headers = {"Authorization": "Bearer "+ credentials.token}
r = requests.post("https://logging.googleapis.com/v2/entries:list", params=payload, headers=headers)
This is how I am running my code in the Cloud Function. With the same parameters as in gcloud, the output I get is {}.
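A minimal sketch of this approach with refreshed default credentials and the request sent as a JSON body (the filter, scope, and project handling below are placeholders, not the exact code used). Note that reading Data Access audit logs also requires the Private Logs Viewer role (roles/logging.privateLogViewer) on the service account the function runs as:

import google.auth
import google.auth.transport.requests
import requests

def list_data_access_logs(request):
    # Obtain the function's default service account credentials and refresh
    # them so that credentials.token is populated.
    credentials, project_id = google.auth.default(
        scopes=["https://www.googleapis.com/auth/logging.read"]
    )
    credentials.refresh(google.auth.transport.requests.Request())

    # entries.list expects a JSON request body, not query parameters.
    body = {
        "resourceNames": [f"projects/{project_id}"],
        "filter": (
            'timestamp>="2021-05-13T12:09:05Z" AND '
            'logName:"cloudaudit.googleapis.com%2Fdata_access"'
        ),
        "orderBy": "timestamp asc",
        "pageSize": 10,
    }
    headers = {"Authorization": "Bearer " + credentials.token}
    r = requests.post(
        "https://logging.googleapis.com/v2/entries:list",
        json=body,
        headers=headers,
    )
    return r.json()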
This is part of my journey to get a clear overview of which users/service accounts are in my GCP Project and when they last logged in.
End goal: to be able to clean up users/service accounts when they haven't been active on GCP for a long time.
First question:
How can I find in the logs when a specific user used resources, so I can determine when this person last logged in?
You need the audit logs, and to see them you can run the following query in Cloud Logging:
protoPayload.@type="type.googleapis.com/google.cloud.audit.AuditLog"
protoPayload.authenticationInfo.principalEmail="your_user_name_email_or_your_service_account_email"
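If you want to fetch the same entries programmatically, a minimal sketch with the google-cloud-logging Python client could look like this (the project ID and email are placeholders):

from google.cloud import logging

client = logging.Client(project="your-project-id")  # placeholder project

# Same filter as above: audit log entries for one user or service account.
audit_filter = (
    'protoPayload.@type="type.googleapis.com/google.cloud.audit.AuditLog" '
    'AND protoPayload.authenticationInfo.principalEmail='
    '"your_user_name_email_or_your_service_account_email"'
)

for entry in client.list_entries(filter_=audit_filter,
                                 order_by=logging.DESCENDING,
                                 page_size=50):
    print(entry.timestamp, entry.log_name)

The most recent entry returned gives you an approximation of when that principal last did something in the project.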
You can also check the Activity logs and filter on a user:
https://console.cloud.google.com/home/activity
Related questions + answers:
Pull "last access" information on projects from Google Cloud Platform (GCP)
IAM users and last login date in google cloud
How to list, find, or search iam policies across services (APIs), resource types, and projects in google cloud platform (GCP)?
There is now also the newly added Log Analytics.
This allows you to use SQL to query your logs.
Your logging buckets _Default and _Required need to be upgraded to be able to use Log Analytics:
https://cloud.google.com/logging/docs/buckets#upgrade-bucket
After that you can, for example, use the console to run SQL on your logs:
https://console.cloud.google.com/logs/analytics
Unfortunately, at the moment you can only query the logs that were created after you've switched on Log Analytics.
Example query in the Log Analytics:
SELECT
  timestamp,
  proto_payload.audit_log.authentication_info.principal_email,
  auth_info.resource,
  auth_info.permission,
  auth_info.granted
FROM
  `logs__Default_US._AllLogs`
  LEFT JOIN UNNEST(proto_payload.audit_log.authorization_info) AS auth_info
WHERE
  timestamp > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  AND proto_payload.type = "type.googleapis.com/google.cloud.audit.AuditLog"
  AND proto_payload.audit_log.authentication_info.principal_email IN ("name_of_your_user")
ORDER BY
  timestamp
I need your help. I am trying to create a dashboard in GCP that shows TTR time and TTR response by fetching the logs from Cloud Logging that I am writing with a script, but I have been unable to achieve it.
Below is the command I am using:
gcloud logging write logging/user/TTR4 '{"Gremblin_correlation_exec_id": "correlation_id","SenerioName": "Senerio1","ServiceName": "Service1","SubsystemName": "subsystem1","TTRTime": 500,"EndTimestamp": "2020-11-30 06:06:56+00:00","Node_ipfirst": "10.128.0.55:80","node_ipsecound": "10.128.0.6:80","starttimestamp": "2020-11-30 05:58:08+00:00" }' --payload-type=json
I am getting the JSON data, but I am not able to show it on a dashboard (for example, TTRTime above 500) using a filter based on ServiceName and SubsystemName.
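One possible approach (a sketch only; the metric name and filter values are assumptions, not from the original post): create a logs-based metric over those jsonPayload fields, which can then be charted on a Monitoring dashboard.

from google.cloud import logging

client = logging.Client(project="your-project-id")  # placeholder project

# Logs-based metric counting entries whose TTRTime exceeds 500 for one
# service/subsystem; the field names match the jsonPayload written above.
metric = client.metric(
    "ttr_above_500",
    filter_=(
        'logName="projects/your-project-id/logs/logging%2Fuser%2FTTR4" '
        'AND jsonPayload.ServiceName="Service1" '
        'AND jsonPayload.SubsystemName="subsystem1" '
        'AND jsonPayload.TTRTime>500'
    ),
    description="Count of TTR entries above 500",
)
if not metric.exists():
    metric.create()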
I am trying to identify the users who created tables in BigQuery.
Is there any command line tool or API that would provide this information? I know that the audit logs provide it, but I was looking for a command I could wrap in a shell script and run against all the tables at once. The same goes for Google Cloud Storage buckets. I did try
gsutil iam get gs://my-bkt and looked for the "role": "roles/storage.admin" binding, but I do not find the admin role on all buckets. Any help?
This is a use case for audit logs. BigQuery tables don't report metadata about the original resource creator, so scanning via tables.list or inspecting the ACLs doesn't really expose who created the resource, only who currently has access.
What's the use case? You could certainly export the audit logs back into BigQuery and query for table creation events going forward, but that's not exactly the same.
You can find it out using the audit logs. You can access them either via the Console (Logs Explorer) or with the gcloud tool from the CLI.
The log filter that you're interested in is this one:
resource.type = ("bigquery_project" OR "bigquery_dataset")
logName="projects/YOUR_PROJECT/logs/cloudaudit.googleapis.com%2Factivity"
protoPayload.methodName = "google.cloud.bigquery.v2.TableService.InsertTable"
protoPayload.resourceName = "projects/YOUR_PROJECT/datasets/curb_tracking/tables/YOUR_TABLE"
If you want to run it from the command line, you'd do something like this:
gcloud logging read \
'
resource.type = ("bigquery_project" OR "bigquery_dataset")
logName="projects/YOUR_PROJECT/logs/cloudaudit.googleapis.com%2Factivity"
protoPayload.methodName = "google.cloud.bigquery.v2.TableService.InsertTable"
protoPayload.resourceName = "projects/YOUR_PROJECT/datasets/curb_tracking/tables/YOUR_TABLE"
'\
--limit 10
You can then post-process the output to find out who created the table; look for the principalEmail field.
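As a rough sketch of that post-processing (assuming the same placeholder project/dataset/table names as above), the JSON output of gcloud logging read can be parsed in Python to pull out that field:

import json
import subprocess

# Same filter as the gcloud command above, with placeholder names.
FILTER = (
    'resource.type = ("bigquery_project" OR "bigquery_dataset") '
    'AND logName="projects/YOUR_PROJECT/logs/cloudaudit.googleapis.com%2Factivity" '
    'AND protoPayload.methodName = "google.cloud.bigquery.v2.TableService.InsertTable" '
    'AND protoPayload.resourceName = "projects/YOUR_PROJECT/datasets/curb_tracking/tables/YOUR_TABLE"'
)

out = subprocess.run(
    ["gcloud", "logging", "read", FILTER, "--limit", "10", "--format=json"],
    capture_output=True, text=True, check=True,
).stdout

# Each entry is an audit log record; principalEmail identifies the creator.
for entry in json.loads(out):
    email = entry.get("protoPayload", {}).get("authenticationInfo", {}).get("principalEmail")
    print(entry.get("timestamp"), email)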
Is there a way to determine whether a backend service is ready? I ask because I run a script that creates a backend and then a URL map that uses that backend. The problem is I sometimes get errors saying the backend is not ready for use. I need to be able to pause until the backend is ready before I create the URL map. I could check the error response for the phrase 'is not ready', but that isn't reliable across future versions of gcloud. This is somewhat related to another post I recently made on how to reliably check for gcloud errors.
The same applies to the URL map: when I create a proxy that uses it, I sometimes get an error saying the URL map is not ready.
Here's an example of what I'm experiencing:
gcloud compute url-maps add-path-matcher app-url-map \
  --path-matcher-name=web-path-matcher \
  --default-service=web-backend \
  --new-hosts="example.com" \
  --path-rules="/*=web-backend"
ERROR: (gcloud.compute.url-maps.add-path-matcher) Could not fetch resource:
- The resource 'projects/my-project/global/backendServices/web-backend' is not ready
gcloud compute target-https-proxies create app-https-proxy \
  --url-map app-url-map \
  --ssl-certificates app-ssl-cert
ERROR: (gcloud.compute.target-https-proxies.create) Could not fetch resource:
- The resource 'projects/my-project/global/urlMaps/app-url-map' is not ready
gcloud -v
Google Cloud SDK 225.0.0
beta 2018.11.09
bq 2.0.37
core 2018.11.09
gsutil 4.34
would assume it's gcloud alpha resources list ...
see the Error Messages of the Resource Manager and scroll down to the bottom, there it reads:
notReady The API server is not ready to accept requests.
which equals HTTP 503, SERVICE_UNAVAILABLE.
adding the --verbosity option might provide some more details.
see the documentation.
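If you only need the script to pause until the dependent resource is usable, one simple workaround (a sketch, not an official readiness check) is to retry the failing gcloud command with backoff until it succeeds; run_with_retry below is a hypothetical helper:

import subprocess
import time

def run_with_retry(cmd, max_attempts=8, initial_delay=5):
    # Retry the command until it exits 0, backing off between attempts.
    delay = initial_delay
    for attempt in range(1, max_attempts + 1):
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout
        print(f"Attempt {attempt} failed: {result.stderr.strip()}")
        time.sleep(delay)
        delay *= 2
    raise RuntimeError(f"Command failed after {max_attempts} attempts: {' '.join(cmd)}")

run_with_retry([
    "gcloud", "compute", "url-maps", "add-path-matcher", "app-url-map",
    "--path-matcher-name=web-path-matcher",
    "--default-service=web-backend",
    "--new-hosts=example.com",
    "--path-rules=/*=web-backend",
])

This avoids matching on the exact 'is not ready' wording, since it only relies on the command's exit code.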
We have created a Cloud Spanner instance and databases in the Google Cloud console.
The following is the code snippet we are executing:
def getDatabaseList(self):
    try:
        # Build the parent resource path and list the databases via the REST API.
        parent = "projects/" + self._PROJECT_NAME + "/instances/" + self._INSTANCE_NAME
        response = self.service.projects().instances().databases().list(parent=parent).execute()
    except Exception as e:
        logging.info("Exception while getDatabaseList %s", e)
        return False
    return response
In the above code snippet, self.service is the service object built with the googleapiclient library.
We are getting the below exception while executing the above code snippet using a service account ID.
Exception while getDatabaseList <HttpError 403 when requesting https://spanner.googleapis.com/v1/projects/<projectName>/instances/<instanceName>/databases?alt=json&key=<APIKEY>
returned "Resource projects/<projectName>/instances/<instanceName> is missing IAM permission: spanner.databases.list.">
Reference document: Cloud Spanner IAM
The following link shows an example of listing the databases in an instance using the Python Spanner client library:
https://github.com/googleapis/python-spanner/blob/main/samples/samples/snippets.py#L144
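A minimal sketch based on that sample, assuming placeholder project and instance IDs (the service account also needs a role that grants spanner.databases.list, e.g. roles/spanner.viewer):

from google.cloud import spanner

# Placeholder project and instance IDs.
spanner_client = spanner.Client(project="your-project-id")
instance = spanner_client.instance("your-instance-id")

# List every database in the instance.
for database in instance.list_databases():
    print(database.name)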
Regarding the IAM permission issue, it seems you have not set GOOGLE_APPLICATION_CREDENTIALS. @ACimander's answer is correct.
You can also use gcloud to authenticate with a service account:
gcloud auth activate-service-account SERVICE_ACCOUNT@DOMAIN.COM --key-file=/path/key.json --project=PROJECT_ID
More information on this can be found in https://cloud.google.com/sdk/gcloud/reference/auth/activate-service-account
A little late, but hopefully this helps: did you set the path to your service account's JSON file correctly? I wasted half a day playing with the permissions until I figured out that I had simply missed an env key.
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/your/service_account/key.json
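Alternatively (a sketch, assuming a hypothetical key path and scope), you can load the service account key explicitly and pass the credentials to the client instead of relying on the environment variable:

from google.oauth2 import service_account
from googleapiclient.discovery import build

# Load the key file directly rather than via GOOGLE_APPLICATION_CREDENTIALS.
credentials = service_account.Credentials.from_service_account_file(
    "/path/to/your/service_account/key.json",
    scopes=["https://www.googleapis.com/auth/spanner.admin"],
)
service = build("spanner", "v1", credentials=credentials)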