I need your help: I need to create a dashboard in GCP to show TTR time and TTR response by fetching logs from Cloud Logging that I am writing with a script, but I am unable to achieve it.
Below is the command I am using:
gcloud logging write logging/user/TTR4 '{"Gremblin_correlation_exec_id": "correlation_id","SenerioName": "Senerio1","ServiceName": "Service1","SubsystemName": "subsystem1","TTRTime": 500,"EndTimestamp": "2020-11-30 06:06:56+00:00","Node_ipfirst": "10.128.0.55:80","node_ipsecound": "10.128.0.6:80","starttimestamp": "2020-11-30 05:58:08+00:00" }' --payload-type=json
I am getting the JSON data, but I am not able to show it on a dashboard, e.g. TTRTime above 500, using a filter based on ServiceName and SubsystemName.
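For reference, here is a minimal sketch of writing the same structured entry with the google-cloud-logging Python client (log ID simplified to "TTR4"; field values taken from the command above). Once the fields land in jsonPayload, a Logs Explorer filter such as jsonPayload.TTRTime > 500 AND jsonPayload.ServiceName="Service1" AND jsonPayload.SubsystemName="subsystem1" can back a logs-based metric that a dashboard chart reads from.

# Minimal sketch, assuming the google-cloud-logging library is installed
# (pip install google-cloud-logging) and default credentials are available.
from google.cloud import logging

client = logging.Client()
logger = client.logger("TTR4")  # simplified log ID

# The dict becomes the entry's jsonPayload, so TTRTime, ServiceName and
# SubsystemName can be used in filters, logs-based metrics and charts.
logger.log_struct(
    {
        "SenerioName": "Senerio1",
        "ServiceName": "Service1",
        "SubsystemName": "subsystem1",
        "TTRTime": 500,
        "EndTimestamp": "2020-11-30 06:06:56+00:00",
        "starttimestamp": "2020-11-30 05:58:08+00:00",
    },
    severity="INFO",
)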
I'm having trouble with a job I've set up on Dataflow.
Here is the context: I created a dataset on BigQuery using the following path
bi-training-gcp:sales.sales_data
In the properties I can see that the data location is "US".
Now I want to run a job on Dataflow, so I enter the following command into Cloud Shell:
gcloud dataflow sql query \
  'SELECT country, DATE_TRUNC(ORDERDATE, MONTH), SUM(sales)
   FROM bi-training-gcp.sales.sales_data
   GROUP BY 1, 2' \
  --job-name=dataflow-sql-sales-monthly \
  --region=us-east1 \
  --bigquery-dataset=sales \
  --bigquery-table=monthly_sales
The query is accepted by the console and returns a confirmation message.
After that I go to the Dataflow dashboard. I can see a new job as queued, but after 5 minutes or so the job fails and I get the following error messages:
Error
2021-09-29T18:06:00.795Z Invalid/unsupported arguments for SQL job launch: Invalid table specification in Data Catalog: Could not resolve table in Data Catalog: bi-training-gcp.sales.sales_data
Error
2021-09-29T18:10:31.592036462Z Error occurred in the launcher container: Template launch failed. See console logs.
My guess is that it cannot find my table, maybe because I specified the wrong location/region. Since my table's location is "US", I thought it would be on a US server (which is why I specified us-east1 as the region), but I tried all US regions with no success.
Does anybody know how I can solve this?
Thank you
This error occurs if the Dataflow service account doesn't have access to the Data Catalog API. To resolve this issue, enable the Data Catalog API in the Google Cloud project that you're using to write and run queries. Alternatively, assign the roles/datacatalog.viewer role to the Dataflow service account.
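For example, something along these lines should do it (YOUR_PROJECT and DATAFLOW_SERVICE_ACCOUNT are placeholders for your project ID and the Dataflow service account email):
gcloud services enable datacatalog.googleapis.com --project=YOUR_PROJECT
gcloud projects add-iam-policy-binding YOUR_PROJECT \
  --member="serviceAccount:DATAFLOW_SERVICE_ACCOUNT" \
  --role="roles/datacatalog.viewer"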
I am trying to fetch Data Access logs for the Cloud Profiler API, which is used from a VM instance. I can see the profile is created successfully in Logs Explorer, and the logName contains data_access.
Now I am trying to fetch those logs programmatically. I tried the entries.list API from a Cloud Function in a number of ways: I am not getting any error, but no logs are returned either. All other logs are visible except the Data Access logs; when I filter for them the output is empty, but when I run the same filter in the Console, the entries are there.
I tried the same with the gcloud logging read command and still get no output.
gcloud beta logging read 'timestamp>="2021-05-13T12:09:05Z" AND logName:"projects/******/logs/cloudaudit.googleapis.com%2Fdata_access"' --limit=10 --format=json --order=asc
I have tried changing the order to desc and different filters, but nothing works.
I am getting a proper response from the Google APIs Explorer.
Update: I got it working after re-authenticating, but my Cloud Function still doesn't work. How would I re-authenticate in a Cloud Function?
headers = {"Authorization": "Bearer "+ credentials.token}
r = requests.post("https://logging.googleapis.com/v2/entries:list", params=payload, headers=headers)
This is how I am running my code in the Cloud Function. With the same parameters as in gcloud, the output I get is {}.
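For what it's worth, here is a minimal sketch of how the same call might look inside a Cloud Function, assuming the function's runtime service account has a role that can read Data Access logs (for example roles/logging.privateLogViewer); the function name and filter values are only illustrative. Note that entries:list expects its parameters in the JSON request body, not as URL query parameters.

import google.auth
import google.auth.transport.requests
import requests

def list_data_access_logs(request):
    # Use the Cloud Function's runtime service account; refreshing the
    # credentials yields a valid access token, so no manual re-auth is needed.
    credentials, project_id = google.auth.default(
        scopes=["https://www.googleapis.com/auth/logging.read"]
    )
    credentials.refresh(google.auth.transport.requests.Request())

    body = {
        "resourceNames": ["projects/" + project_id],
        "filter": (
            'timestamp>="2021-05-13T12:09:05Z" AND '
            'logName:"cloudaudit.googleapis.com%2Fdata_access"'
        ),
        "orderBy": "timestamp asc",
        "pageSize": 10,
    }
    headers = {"Authorization": "Bearer " + credentials.token}
    # Parameters go in the request body (json=...), not in params=...
    r = requests.post(
        "https://logging.googleapis.com/v2/entries:list",
        json=body,
        headers=headers,
    )
    return r.json()

The google-cloud-logging client library (client.list_entries(filter_=...)) would handle authentication and paging for you, which is usually simpler than calling the REST endpoint directly.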
Is it possible to download a Log Storage (log bucket) from Google Cloud Platform, specifically the one created by default? If someone knows how, could they explain it?
A possible solution: choose the required logs, select a time period of up to 1 day, and download them in JSON or CSV format.
Step 1: From the Logging console, go to the advanced filtering mode.
Step 2: To choose the log type, use a filtering query, for example:
resource.type="audited_resource"
logName="projects/xxxxxxxx/logs/cloudaudit.googleapis.com%2Fdata_access"
resource.type="audited_resource"
logName="organizations/xxxxxxxx/logs/cloudaudit.googleapis.com%2Fpolicy"
Step 3: Download them in JSON or CSV format.
If you have a huge number of audit logs generated per day, the above approach will not work. In that case you need to export the logs to Cloud Storage or BigQuery for further analysis, for example with a log sink (see the sketch below). Please note that Cloud Logging doesn't charge to export logs, but destination charges might apply.
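For example, a sink to a Cloud Storage bucket could be created roughly like this (the sink and bucket names are placeholders):
gcloud logging sinks create my-audit-sink \
  storage.googleapis.com/my-audit-logs-bucket \
  --log-filter='logName:"cloudaudit.googleapis.com%2Fdata_access"'
The command prints the sink's writer service account, which then needs write access (for example roles/storage.objectCreator) on the destination bucket.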
Another option: you can use the following gcloud command to download the logs.
gcloud logging read "logName : projects/Your_Project/logs/cloudaudit.googleapis.com%2Factivity" --project=Project_ID --freshness=1d >> test.txt
Maybe I missed something. Under https://console.cloud.google.com/apis/api/sqladmin.googleapis.com/overview
I saw there are a lot of errors, but when I go to the Logs Viewer I can't find anything. Is there any way I can obtain the error logs?
Basically, you should create a query in the Logs Viewer UI to obtain the data you need: specify the type of resource and the instance name whose logs you want to view.
GCP Console => Operations => Logging => Logs Viewer
=> Query builder => Resource
Cloud SQL Database = my-project:my-sql-instance
The query builder will show a query preview like below:
resource.type="cloudsql_database"
resource.labels.database_id="my-project:my-sql-instance"
Once you click the "Run Query" button, the log entries will appear. By default, log entries for the last hour are shown; you can use the "Edit time" option to change this.
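If the command line is more convenient, roughly the same query can be run with gcloud (instance ID as in the example above; severity>=ERROR added since the question is about errors):
gcloud logging read \
  'resource.type="cloudsql_database"
   resource.labels.database_id="my-project:my-sql-instance"
   severity>=ERROR' \
  --freshness=1h --limit=20 --format=json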
Please see Cloud Logging > Doc > Basic logs queries for more details.
I am trying to identify the users who created tables in BigQuery.
Is there any command line tool or API that would provide this information? I know that audit logs provide it, but I was looking for a command I could wrap in a shell script and run against all the tables at once. The same goes for Google Cloud Storage buckets. I did try
gsutil iam get gs://my-bkt and looked for the "role": "roles/storage.admin" entry, but I do not find the admin role on all buckets. Any help?
This is a use case for audit logs. BigQuery tables don't report metadata about the original resource creator, so scanning via tables.list or inspecting the ACLs doesn't really expose who created the resource, only who currently has access.
What's the use case? You could certainly export the audit logs back into BigQuery and query for table creation events going forward, but that's not exactly the same.
You can find it out using Audit Logs. You can access them both via Console/Log Explorer or using gcloud tool from the CLI.
The log filter that you're interested in is this one:
resource.type = ("bigquery_project" OR "bigquery_dataset")
logName="projects/YOUR_PROJECT/logs/cloudaudit.googleapis.com%2Factivity"
protoPayload.methodName = "google.cloud.bigquery.v2.TableService.InsertTable"
protoPayload.resourceName = "projects/YOUR_PROJECT/datasets/curb_tracking/tables/YOUR_TABLE"
If you want to run it from the command line, you'd do something like this:
gcloud logging read \
'
resource.type = ("bigquery_project" OR "bigquery_dataset")
logName="projects/YOUR_PROJECT/logs/cloudaudit.googleapis.com%2Factivity"
protoPayload.methodName = "google.cloud.bigquery.v2.TableService.InsertTable"
protoPayload.resourceName = "projects/YOUR_PROJECT/datasets/curb_tracking/tables/YOUR_TABLE"
'\
--limit 10
You can then post-process the output to find out who created the table; look for the principalEmail field.
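For example, the creator emails can be pulled out directly with a format expression (this assumes the standard audit log field path, protoPayload.authenticationInfo.principalEmail):
gcloud logging read \
  'resource.type = ("bigquery_project" OR "bigquery_dataset")
   logName="projects/YOUR_PROJECT/logs/cloudaudit.googleapis.com%2Factivity"
   protoPayload.methodName = "google.cloud.bigquery.v2.TableService.InsertTable"' \
  --limit 10 \
  --format='value(protoPayload.authenticationInfo.principalEmail)'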