Can I see the logs of ElasticCloud in Stackdriver Logging? - google-cloud-platform

My service is currently deployed on GCP, and all the monitoring & alerts are aggregated to Google Cloud Logging.
Recently I have been planning to introduce a full-text search feature to my service, and ElasticCloud is an option to achieve it because it supports deployment on GCP.
However, as far as I know, the documentation says that it supports importing logs from Google Cloud Logging, but there is no description of how to export logs to Google Cloud Logging.
Is there a good way to export logs from ElasticCloud to Google Cloud Logging?
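As far as I know there is no turnkey export in that direction, but one workaround is a small shipper that polls your Elastic Cloud deployment and writes the documents into Cloud Logging with the client library. A minimal sketch, assuming a hypothetical deployment endpoint, API key, index pattern, and log name:

```python
# Hypothetical shipper: poll Elastic Cloud, forward hits to Cloud Logging.
# The endpoint, API key, index pattern, and log name are all placeholders.
from elasticsearch import Elasticsearch
import google.cloud.logging

es = Elasticsearch(
    "https://my-deployment.es.us-central1.gcp.cloud.es.io",  # placeholder
    api_key="ELASTIC_API_KEY",                               # placeholder
)
gcp = google.cloud.logging.Client()
logger = gcp.logger("elastic-cloud")  # arbitrary log name

# Pull everything from the last five minutes (placeholder window)
resp = es.search(
    index="filebeat-*",                                      # placeholder pattern
    query={"range": {"@timestamp": {"gte": "now-5m"}}},
    size=1000,
)
for hit in resp["hits"]["hits"]:
    # log_struct writes the document as a jsonPayload entry in Cloud Logging
    logger.log_struct(hit["_source"], severity="INFO")
```

You would run something like this on a schedule (e.g. Cloud Scheduler triggering Cloud Run) and persist the last-seen timestamp so entries aren't shipped twice.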

Related

Cloud Services Orchestration

How can I orchestrate and call cloud services from different providers: Google Cloud, AWS and Azure?
We did check workflows, but it seems that workflows are limited in each provider's platform, so if I build a workflow in Azure, I can only include services from Azure, but not from Google Cloud or AWS.
Cloud Workflows is an orchestrator that calls APIs. It integrates best with Google Cloud services (and with Google security), but you can also call other APIs, such as other cloud providers' service APIs.
You can also use Cloud Composer, based on Apache Airflow, which has a large library of operators that help you call different services on different platforms. Because it was initially (and still is) an open source project, it's portable and there is no cloud preference in that solution.
If you want an orchestration tool for multi-cloud usage, Airflow should be the best approach because it provides many operators for each cloud provider.
It's open source and well documented, and there is plenty of support and many examples for each cloud provider.
It's also the best tool if you want to stay cloud agnostic and be able to migrate easily to any cloud provider if needed.
If you host Airflow in Google Cloud, you can use a managed solution with Cloud Composer 2 (GKE Autopilot).
Cloud Workflows is a good solution if you stay in the GCP world; it's serverless and based on API calls, but it is less documented than Airflow for usage outside of GCP. It's also not cloud agnostic and is strongly coupled to GCP.
I recommend using Airflow for your needs.
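To illustrate the multi-cloud point, here is a minimal DAG sketch for a recent Airflow 2.x install that mixes an AWS sensor with a Google operator. The bucket names, keys, connection IDs, and query are placeholders, and the `amazon` and `google` provider packages must be installed:

```python
# Minimal multi-cloud DAG sketch; bucket names, connection IDs,
# and the query are placeholders, not a working pipeline.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.sensors.s3 import S3KeySensor
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG("multi_cloud_demo", start_date=datetime(2023, 1, 1), schedule=None) as dag:
    # Wait for a file to land in an AWS S3 bucket (AWS provider)
    wait_for_s3 = S3KeySensor(
        task_id="wait_for_s3_file",
        bucket_name="my-aws-bucket",        # placeholder
        bucket_key="exports/latest.csv",    # placeholder
        aws_conn_id="aws_default",
    )

    # Then run a query in BigQuery (Google provider)
    run_bq = BigQueryInsertJobOperator(
        task_id="run_bigquery_job",
        configuration={
            "query": {
                "query": "SELECT 1",        # placeholder
                "useLegacySql": False,
            }
        },
        gcp_conn_id="google_cloud_default",
    )

    wait_for_s3 >> run_bq
```

The point is that both tasks live in one DAG with one scheduler, even though they talk to two different clouds.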

Google Cloud service account monitoring alerts

Background
I have a Google Cloud project running my N applications. Each application has an exclusive IAM service account (total N service account) with minimal permissions.
Scenario
Let's imagine that one of the service accounts was leaked. An attacker will try to take advantage of these credentials. Because he doesn't know exactly which permissions this account has, he will try to make calls and see if they work for him.
Question
I want to "listen" to audit logs. Once I will see the log from kind "access denied", I will know that something is wrong with this service account.
Is this possible to write all those access denied incidents to Google Cloud Stackdriver?
How you recommend implementing it?
Thank you
Is it possible to write all those access denied incidents to Google Cloud Stackdriver?
Most but not all Google Cloud services support this. However, successful access will also be logged.
You will need to enable Data Access Audit Logs.
This could generate a massive amount of logging information.
Access logs for Org and Folder are only available via API and not the console.
Review pricing before enabling data access audit logs.
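For reference, Data Access Audit Logs are controlled by the auditConfigs section of the project's IAM policy (console: IAM & Admin > Audit Logs). A hedged sketch using the Resource Manager v1 API; "my-project" is a placeholder, and a real script should merge into the existing audit configs rather than overwrite them:

```python
# Sketch: enable DATA_READ/DATA_WRITE audit logs for all services.
# "my-project" is a placeholder; a real script should merge these
# auditConfigs into the existing policy rather than replace them.
from googleapiclient import discovery

crm = discovery.build("cloudresourcemanager", "v1")
project = "my-project"

# Read the current IAM policy, then attach audit configs to it
policy = crm.projects().getIamPolicy(resource=project, body={}).execute()
policy["auditConfigs"] = [
    {
        "service": "allServices",
        "auditLogConfigs": [
            {"logType": "DATA_READ"},
            {"logType": "DATA_WRITE"},
        ],
    }
]
crm.projects().setIamPolicy(
    resource=project,
    body={"policy": policy, "updateMask": "auditConfigs"},
).execute()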
How would you recommend implementing it?
This question is not really suitable for Stack Overflow as it seeks recommendations and opinions. In general, you would export your logs to Google Cloud Pub/Sub to be processed by a service such as Cloud Functions, Cloud Run, etc. There are also commercial services such as DataDog designed for this type of monitoring. Exporting logs to Google BigQuery is another popular option.
Read this article published on DataDog's website on Data Access Audit Logging and their services. I am not recommending their service, just providing a link to more information. Their article is very good.
Best practices for monitoring GCP audit logs
To understand the fields that you need to process read this document:
AuthorizationInfo
This link will also help:
Understanding audit logs
Here is one way to go about it:
Create a new Cloud Pub/Sub topic.
Create a new log routing sink whose destination is the Pub/Sub topic created in the previous step. Set a filter like protoPayload.authenticationInfo.principalEmail="<service-account-name>@<project-name>.iam.gserviceaccount.com" AND protoPayload.authorizationInfo.granted="false" to only get messages about unsuccessful auth actions for your service account.
Create a Cloud Function that's triggered when a new message is published to the Pub/Sub topic; this function can do whatever you desire, like send an email, page you, or anything else you can come up with in the code. (A sketch of all three steps follows.)
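A hedged sketch of steps 1 and 2 with the Python clients; the project, topic, sink, and service account names are placeholders:

```python
# Steps 1-2: create the topic and the log routing sink (placeholder names).
from google.cloud import pubsub_v1
import google.cloud.logging

PROJECT = "my-project"                                        # placeholder
SA_EMAIL = "my-sa@my-project.iam.gserviceaccount.com"         # placeholder

# Step 1: the Pub/Sub topic that will receive matching log entries
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, "denied-sa-calls")
publisher.create_topic(request={"name": topic_path})

# Step 2: the sink, using the filter from the list above
log_client = google.cloud.logging.Client(project=PROJECT)
sink = log_client.sink(
    "denied-sa-sink",
    filter_=(
        f'protoPayload.authenticationInfo.principalEmail="{SA_EMAIL}" '
        'AND protoPayload.authorizationInfo.granted="false"'
    ),
    destination=f"pubsub.googleapis.com/{topic_path}",
)
sink.create()
```

Note that you must also grant the sink's writer identity (sink.writer_identity after creation) the Pub/Sub Publisher role on the topic before messages will flow. For step 3, a minimal first-generation background-function handler might look like this; the alerting action is left as a comment:

```python
# Step 3: Cloud Functions (1st gen) Pub/Sub background handler sketch.
import base64
import json

def handle_denied(event, context):
    # Pub/Sub delivers the log entry base64-encoded in event["data"]
    entry = json.loads(base64.b64decode(event["data"]))
    proto = entry.get("protoPayload", {})
    print("Denied call:", proto.get("methodName"),
          proto.get("authorizationInfo"))
    # ...send an email, page someone, open a ticket, etc.
```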

GCP APIs requests without using them

When I have a look at one of my GCP projects under APIs & Services > Dashboard, I can see requests for "Cloud Pub/Sub API" although I do not use this API. All requests are errors, so a 100% error rate.
I can see similar situations in other projects with other APIs, the above is only an example.
So far I have tried to find out more within Logging > Logs Explorer, but it seems that APIs are not included in these logs. In addition, I have tried to click on the API within APIs & Services > Dashboard and investigate the menu items "Overview", "Metrics", "Quotas" and "Credentials", but I could not find anything that would describe who/what uses this API.
Any ideas?
EDIT (12. Nov 2020):
I am using the following resources in this project:
App Engine
Compute Engine
Cloud Storage
BigQuery
Cloud SQL
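One sketch for narrowing this down (not an official troubleshooting path): the serviceruntime.googleapis.com/api/request_count metric is recorded against the consumed_api resource, whose labels include a credential_id identifying the calling credential, and you can query it with the Cloud Monitoring API. "my-project" is a placeholder:

```python
# Sketch: list per-method request counts for the Pub/Sub API over
# the last hour. "my-project" is a placeholder project ID.
import time
from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
now = int(time.time())
interval = monitoring_v3.TimeInterval(
    {
        "start_time": {"seconds": now - 3600},
        "end_time": {"seconds": now},
    }
)
results = client.list_time_series(
    request={
        "name": "projects/my-project",
        "filter": (
            'metric.type="serviceruntime.googleapis.com/api/request_count" '
            'AND resource.labels.service="pubsub.googleapis.com"'
        ),
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)
for series in results:
    # credential_id often points at the service account or key in use;
    # the metric labels include the response code of the failing calls.
    print(series.resource.labels.get("method"),
          series.resource.labels.get("credential_id"),
          dict(series.metric.labels))
```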

Find out who activated which feature in Google Kubernetes Engine resp. Google Cloud Platform

Istio has been recently activated on the Google Kubernetes Engine cluster we are using. However, we don't know which of our team members did it.
Is there a way to find out who activated which features on our Google Kubernetes Engine cluster, or on the Google Cloud Platform in general?
You might want to check the "Activity" tab within the GCP Console. It registers all the actions that have been performed in the project. Also, you can take a look at Stackdriver Logging and check the cluster activity logs; they will show you the user who executed the Istio activation.
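If the Activity view has rolled over, the Admin Activity audit logs can also be queried directly. A hedged sketch; the exact methodName GKE records for enabling the Istio addon may vary (SetAddonsConfig is an educated guess), so inspect a sample entry in Logs Explorer first:

```python
# Sketch: find who changed cluster addons (e.g. enabled Istio).
# "my-project" is a placeholder; the methodName is an educated guess.
import google.cloud.logging

client = google.cloud.logging.Client(project="my-project")
query = (
    'logName="projects/my-project/logs/cloudaudit.googleapis.com%2Factivity" '
    'AND resource.type="gke_cluster" '
    'AND protoPayload.methodName:"SetAddonsConfig"'
)
for entry in client.list_entries(filter_=query):
    payload = entry.payload  # the audit log's protoPayload as a dict
    print(payload["authenticationInfo"]["principalEmail"],
          payload["methodName"])
```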

Talend connector to Google Cloud Dataproc

Is it possible to connect Talend to Google Cloud Dataproc? And are there any connectors available for it?
On [1] it says it does, but I can't find any documentation related to it.
If the above is true, I would also like to know if it is possible to run a job on Spark/Hadoop in Dataproc using Talend.
Thanks.
Yes, it should be possible to connect Talend with multiple Google Cloud products.
Specifically, Talend should support:
Google BigQuery
Google Cloud Dataproc
Google Cloud Storage
Google Pub/Sub
I don't have a specific link on hand to walk through the install and configuration process; however, Talend should be able to provide such details if you contact them.