How to get info about gcloud logs similar to Logs Explorer? - google-cloud-platform

I am using the @google-cloud/logging package to get logs from Google Cloud, and it works nicely: you can get logs and events (and query them if needed). But how can I get the same info as Logs Explorer? I mean the different types of fields that can be queried, etc.:
In the screenshot you can see Log fields such as FUNCTION NAME, which may be a list of values. It seems that @google-cloud/logging can't get this metadata (or field info). So is it possible to obtain it using some other API?

If I understand your question correctly, you're asking how Logs Viewer determines the values that allow it to present you with the various log fields to filter/refine your log queries.
I suspect (but don't know) that the viewer builds these lists from the properties as it parses the logs. This would suggest that the lists are imperfect and that, e.g., FUNCTION_NAME values would only appear once a log entry including the function's name had been parsed.
There is a way to enumerate definitive lists of GCP resources: using list (or equivalent) methods available in the service-specific libraries (SDKs), e.g. @google-cloud/functions.
The easiest way to understand what functionality is provided by a given Google service is to browse the service using Google's APIs Explorer. Here's Cloud Logging API v2 and here's Cloud Functions API.
You can prove to yourself that there's no method under Cloud Logging that allows enumeration of all a project's Cloud Functions. But there is a method in Cloud Functions: projects.locations.functions.list. The latter returns a response body that includes a list of functions, each of type CloudFunction with a name.
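As a sketch of what that enumeration looks like (shown in Python rather than Node.js, since the client libraries mirror each other; the project ID is a placeholder):

from google.cloud import functions_v1  # the google-cloud-functions package

client = functions_v1.CloudFunctionsServiceClient()

# "-" wildcards the location so functions from every region are returned.
parent = "projects/your-project-id/locations/-"  # placeholder project ID

for function in client.list_functions(request={"parent": parent}):
    # Each result is a CloudFunction whose name is the definitive identifier.
    print(function.name)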
Another way to understand how these APIs ("libraries") are used is to add --log-http to any gcloud command to see which API calls the command makes.
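For instance, using the function-listing command above as an arbitrary example:

gcloud functions list --log-http

This prints each HTTP request and response the command issues, including the full request URL, which maps directly to the underlying REST method.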

Related

Update GCP asset labels

What is the most efficient way to update all asset labels per project?
I can list all project resources and their labels with gcloud asset search-all-resources --project=SomeProject. The command also returns the labels for those assets.
Is there something like gcloud asset update-labels?
I'm unfamiliar with the service, but APIs Explorer (Google's definitive service documentation) shows a single list method.
I suspect that you will need to iterate over all your resource types and update instances of them using whatever update (PATCH) method (there may not be one) permits label changes for that resource type.
This seems like a reasonable request, and you may wish to submit a feature request using Google's issue tracker.
gcloud does not seem to have an update-labels command.
You could try the Cloud Resource Manager API. For example, call the REST or Python API: https://cloud.google.com/resource-manager/docs/creating-managing-labels#update-labels
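For project labels specifically, a rough sketch with the Python client (google-cloud-resource-manager, v3 API; the project name and label are placeholders):

from google.cloud import resourcemanager_v3
from google.protobuf import field_mask_pb2

client = resourcemanager_v3.ProjectsClient()

# Fetch the current project, modify its labels, then patch only the labels field.
project = client.get_project(name="projects/your-project-id")  # placeholder
project.labels["environment"] = "production"  # example label

operation = client.update_project(
    project=project,
    update_mask=field_mask_pb2.FieldMask(paths=["labels"]),
)
print(operation.result().labels)  # update_project is a long-running operation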

Available filters for client.get_products function in Boto3

I am trying to develop a Python script that gets different parameters of any AWS service (for EC2, e.g., those parameters would be operating system, billing type, etc.). Where can I find a listing of all the available Filters that can be used with the get_products function in boto3 for each supported service?
Actually, there is no direct API or doc available for getting all the attributes. At least I didn't find any.
What you can do is combine various API calls:
- First, use DescribeServices to get the attributes of all the services, or provide a service code to get them for one particular service. The Boto3 call is describe_services, which "returns the metadata for one service or a list of the metadata for all services".
- Then use GetAttributeValues to determine the possible values of those attributes. The Boto3 call is get_attribute_values.
- Finally, depending on the attributes collected in the earlier steps, you can build a Filters list for get_products, as in the sketch below.
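A minimal end-to-end sketch, assuming the Pricing API endpoint in us-east-1 and AmazonEC2 as the example service (the attribute names and values shown are illustrative):

import json
import boto3

# The Pricing API is only served from a few regions, e.g. us-east-1.
pricing = boto3.client("pricing", region_name="us-east-1")

# 1. Discover the filterable attribute names for a service.
svc = pricing.describe_services(ServiceCode="AmazonEC2")
print(svc["Services"][0]["AttributeNames"])  # e.g. ['instanceType', 'operatingSystem', ...]

# 2. Discover the possible values of one attribute.
values = pricing.get_attribute_values(ServiceCode="AmazonEC2", AttributeName="operatingSystem")
print([v["Value"] for v in values["AttributeValues"]])

# 3. Build Filters from those names/values and query the products.
products = pricing.get_products(
    ServiceCode="AmazonEC2",
    Filters=[
        {"Type": "TERM_MATCH", "Field": "operatingSystem", "Value": "Linux"},
        {"Type": "TERM_MATCH", "Field": "instanceType", "Value": "t3.micro"},
    ],
    MaxResults=1,
)
for item in products["PriceList"]:  # each entry is a JSON string
    print(json.loads(item)["product"]["attributes"])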

Production Access Controls for Google Cloud using Stackdriver

How have people implemented Production Access Controls (i.e. logging and reporting on access to compute instances by services and humans over SSH)? Our goal is to forward all user logon entries to our SIEM consistently across projects, and ideally to avoid having project-specific Stackdriver sinks (and the associated setup and maintenance).
We've tried the following (a sketch of the fluentd piece follows this list):
- Enabled auth log forwarding in fluentd, as only syslog is collected by default
- Enabled organization-level sinks, including all children, that send to a topic (to forward on to the SIEM via an HTTP subscriber)
- Can see syslog/auth at the project level for non-Container OS images (i.e. Ubuntu)
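For what it's worth, the auth-log piece on Ubuntu with google-fluentd amounts to a tail source roughly like the following; the file name and pos_file path are my assumptions about a typical setup, not an official config:

# /etc/google-fluentd/config.d/auth-log.conf (hypothetical file name)
<source>
  @type tail
  format syslog
  path /var/log/auth.log
  pos_file /var/lib/google-fluentd/pos/auth-log.pos
  tag auth
</source>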
Issues we're seeing:
- Limited documentation on the filter format at the org level (it seems to differ from the project level for things like logName); the log_id function does appear to work
- Some log types appear at the org level (things like cloudapis activity), but syslog does not appear to get processed
- Container OS appears not to enable ssh/sudo forwarding by default in fluentd (or I haven't found which log type has this data); I do see this logged to journalctl on a test node
Does anyone have a consistent way to achieve this?
In case anyone else comes across this, we found the following:
- It is possible to set up Stackdriver sinks at the org level through the CLI (see the example command after this list); they are not visible through the Cloud Console UI, and the CLI also does not allow you to list log types at the org level
- Filters can be defined on the sinks, in addition to logName, but the format can differ from project-level filters
- You need to enable auth log collection in fluentd, which is platform specific (i.e. the process for google-fluentd on Ubuntu differs from the Stackdriver setup on Container OS)
- sshd for some reason does not log the initial entry stating the user and IP through syslog (and thus fluentd), so it is not visible to Stackdriver
- Using org sinks to topics in a child project, with a subscription to forward to your SIEM of choice, works well
- Still trying to get logs of gcloud ssh commands
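A hedged example of that org-level sink setup (the sink name, project, topic, org ID, and filter are all placeholders):

gcloud logging sinks create org-auth-sink \
  pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID \
  --organization=ORG_ID --include-children \
  --log-filter='log_id("syslog")'

The log_id() filter function is the one noted above as working at the org level.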
A way to approach this could be to export your logs through a sink to BigQuery. Note that when setting up a sink to export BigQuery logs for all projects under the organization, the includeChildren parameter defaults to 'False' and must be set to 'True'. Once set to true, logs from all the projects, folders, and billing accounts contained in the sink's parent resource are also available for export; if set to false, only the logs owned by the sink's parent resource are available. You can then filter the logs you need from BigQuery.
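For this route, the same kind of command pointed at a BigQuery dataset works, with --include-children being the CLI spelling of includeChildren (IDs are placeholders):

gcloud logging sinks create org-bq-sink \
  bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID \
  --organization=ORG_ID --include-children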
Another way to approach this would be to script it by listing all the projects with the command gcloud projects list | tail -n +2 | awk -F" " '{print $1}'. This can be made into an array that can be iterated over, and the logs for each project can be retrieved using a command similar to the one in this doc.
Not sure if all this helps solve or work around your question; hope so.

Dynamically creating cron jobs at particular intervals

I'm looking to dynamically create cron jobs that get created and configured using the request parameters sent by Cloud Functions or a normal HTTP request.
There is already a manual way via the Google Cloud console, but I want to automate this manual task, configuring and creating jobs according to the request parameters.
I am already aware that we can provide a cron.yaml file that can hold all the configuration, but I need some help or a reference that describes in detail how to achieve this.
I am also a beginner, so do correct me or provide any alternative solution.
You'll want to use the Cloud Scheduler API. Specifically, this is a REST API that lets you do everything you could do via the console or the gcloud command.
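As a rough sketch with the Python client library (google-cloud-scheduler), where the project, region, job name, schedule, and target URL are all placeholders you would fill from the incoming request parameters:

from google.cloud import scheduler_v1

client = scheduler_v1.CloudSchedulerClient()
parent = client.common_location_path("your-project-id", "us-central1")  # placeholders

job = scheduler_v1.Job(
    name=f"{parent}/jobs/my-dynamic-job",  # placeholder job ID
    schedule="*/15 * * * *",               # standard cron syntax: every 15 minutes
    time_zone="Etc/UTC",
    http_target=scheduler_v1.HttpTarget(
        uri="https://example.com/handler",  # hypothetical endpoint to invoke
        http_method=scheduler_v1.HttpMethod.POST,
        body=b'{"source": "dynamic"}',
    ),
)

created = client.create_job(parent=parent, job=job)
print(created.name)

An HTTP-triggered Cloud Function could run exactly this kind of code, turning its own request parameters into the job's schedule and target.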

How do I know what key-value pairs are available for Deployment Manager?

For example, when I tried to figure out what properties I can put into Deployment Manager for creating a BigQuery table, the best place I could find parameters and required fields was the REST API docs.
Is there a good place, within the gcloud command or the online docs, that is specific to Deployment Manager YAML? I would like to be able to reference the required and optional fields for creating GCP resources. Currently it's very difficult to figure out.
From the documentation at: https://cloud.google.com/deployment-manager/docs/configuration/supported-resource-types
You can get a list of the supported resource types by running:
gcloud deployment-manager types list
That said, the YAML reference in the documentation on that page looks pretty complete.
Edit: Refer to this GitHub link for a list of Deployment Manager examples.
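To make that concrete, a config for the BigQuery example might look roughly like this; the type names come from the supported-resource-types list, while the dataset/table IDs and schema are made up for illustration:

resources:
- name: example-dataset
  type: bigquery.v2.dataset
  properties:
    datasetReference:
      datasetId: example_dataset
- name: example-table
  type: bigquery.v2.table
  properties:
    datasetId: example_dataset
    tableReference:
      tableId: example_table
    schema:
      fields:
      - name: id
        type: INTEGER
      - name: created_at
        type: TIMESTAMP

The properties block mirrors the corresponding REST resource (here, the BigQuery Tables representation), which is why the REST docs end up being the practical reference.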
If anything you need is not described in the documentation or example schemas, there is a brute-force workaround.
You can make an API call with the browser's developer console open (F12) and look at the network activity, where your call will be described with all of the used and available properties.
It will not provide any additional information about the implementation beyond the parameter names themselves, so you will have to follow the rules governing similar parameters.