How to generate uptime reports through Google Cloud Stackdriver?

I am a new user of Google Cloud (Stackdriver).
I would like to set up and generate monthly uptime reports, covering the past four weeks and delivered by e-mail, but I have not been able to find where I could do this.
I have done some research but have not managed to find what I am looking for. The closest I got was Trace, but it is still not what I would like to have.

It's not possible to generate that kind of report using the tools available in Google Cloud.
Using traces is probably the best you can do for now, although you can try the Cloud Trace API, which may give you a way to extract the information in a more structured way.
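If you do go down the Trace route, the Cloud Trace v1 REST API lets you list traces for an arbitrary time window, which you could run from a scheduled job each month. A minimal sketch in TypeScript using google-auth-library for authentication (the project ID is a placeholder, and the four-week window matches what you describe):

import { GoogleAuth } from 'google-auth-library';

// List traces from the last four weeks via the Cloud Trace v1 REST API.
async function listRecentTraces(projectId: string) {
  const auth = new GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/trace.readonly'],
  });
  const client = await auth.getClient();

  const end = new Date();
  const start = new Date(end.getTime() - 28 * 24 * 60 * 60 * 1000);
  const url =
    `https://cloudtrace.googleapis.com/v1/projects/${projectId}/traces` +
    `?startTime=${start.toISOString()}&endTime=${end.toISOString()}`;

  const res = await client.request<{ traces?: object[] }>({ url });
  return res.data.traces ?? [];
}

You would still have to aggregate the traces into whatever report format you need and handle the e-mail delivery yourself.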
If you want this feature included in GCP, please go to the Issue Tracker and create a new feature request with a detailed explanation of your goal, mentioning the time span you want to be able to get data for.

Related

Analyze Number value in Different Conditions with google cloud platform logging

I'm struggling to find out how to use GCP logging to log a number value for analysis; I'm looking for a link to a tutorial or something (or a better third-party service to do this).
Context: I have a service for which I'd like to test how different conditions affect the function execution time, and analyze that with google-cloud-platform logging.
Example log: { condition: 1, duration: 1000 }
Desire: create a graph from the GCP logs comparing conditions 1 and 2.
Is there a tutorial somewhere for this? Or maybe there is a better 3rd party service to use?
PS: I'm using the Node google cloud logging client which only talks about text logs.
PPS: I considered doing this in Loggly, but ended up getting lost in their documentation and UI.
There are many tools that you could use to solve this problem. However, your question suggests a willingness to use Google Cloud Platform services (e.g. Stackdriver Monitoring), so I'll provide some guidance using them.
NOTE: Please read around the topic and understand the costs involved with e.g. Cloud Monitoring before you commit to an approach.
Conceptually, the data you're logging (!) more closely matches a metric. However, this approach would require you to add some form of metrics library (see OpenTelemetry for Node.js) to your code and instrument it to record the values that interest you.
You could then use e.g. Google Cloud Monitoring to graph your metric.
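As a rough sketch of that instrumentation step using the OpenTelemetry JavaScript API (the meter and histogram names are placeholders, and you would still need to register a MeterProvider with an exporter for the data to actually reach Cloud Monitoring):

import { metrics } from '@opentelemetry/api';

// Obtain a meter and create a histogram for execution time.
const meter = metrics.getMeter('my-service'); // placeholder instrumentation name
const durationHistogram = meter.createHistogram('function.duration', {
  unit: 'ms',
  description: 'Function execution time, labeled by condition',
});

// In the code path you want to measure:
const start = Date.now();
// ... run the function under test ...
durationHistogram.record(Date.now() - start, { condition: '1' });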
Since you're already producing a log with the data you wish to analyze, you can use log-based metrics to create a metric from your logs. You may be interested in reviewing the content on distribution metrics.
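For the log-based route, the key detail is to write the payload as structured JSON rather than text: the Node.js client accepts a plain object as the entry payload, which becomes the entry's jsonPayload. A minimal sketch (the log name is a placeholder):

import { Logging } from '@google-cloud/logging';

// Write one structured entry; a log-based distribution metric can then
// extract the "duration" field and group or filter by "condition".
async function logMeasurement(condition: number, duration: number) {
  const logging = new Logging(); // uses Application Default Credentials
  const log = logging.log('condition-experiments'); // placeholder log name
  const entry = log.entry(
    { resource: { type: 'global' }, severity: 'INFO' },
    { condition, duration }, // object payload is written as jsonPayload
  );
  await log.write(entry);
}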
Once you have a metric (created either directly or from logs), you can then graph the resulting data in Cloud Monitoring. For logs-based metrics, see the Monitoring documentation.
For completeness, and as an alternative approach to producing and analyzing metrics, see the open-source tool Prometheus. Using a third-party Prometheus client library for Node.js, you could instrument your code to produce a metric. You would then configure Prometheus to scrape your app for its metrics and graph the results for you.
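A minimal sketch of that approach with the community prom-client library (the metric name, buckets, and port are placeholders):

import * as client from 'prom-client';
import * as http from 'node:http';

// Histogram of execution time, labeled by experimental condition.
const duration = new client.Histogram({
  name: 'function_duration_ms',
  help: 'Function execution time in milliseconds',
  labelNames: ['condition'],
  buckets: [100, 250, 500, 1000, 2500, 5000],
});

duration.labels('1').observe(1000); // e.g. condition 1 took 1000 ms

// Expose /metrics for Prometheus to scrape.
http.createServer(async (_req, res) => {
  res.setHeader('Content-Type', client.register.contentType);
  res.end(await client.register.metrics());
}).listen(9464);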

Google Cloud APIs usage data by projects

Is there any way to programmatically get data similar to the APIs overview of the Google Cloud dashboard? Specifically, I'm interested in the list of APIs enabled for the project and their usage/error stats for some predefined timeframe. I believe there's an API for that, but I struggle to find it.
There's currently no API that gives you a report similar to the one you can see through the Google Cloud Console.
The Compute API can retrieve some quotas with its get method, but it's somewhat limited (only Compute Engine quotas) and, from what I understood of your question, not quite what you're looking for.
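For completeness, here's what that limited quota retrieval looks like with the googleapis Node.js client (the project ID is a placeholder):

import { google } from 'googleapis';

// Fetch Compute Engine quotas for one project via projects.get.
async function printComputeQuotas(projectId: string) {
  const auth = new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/compute.readonly'],
  });
  const compute = google.compute({ version: 'v1', auth });
  const { data } = await compute.projects.get({ project: projectId });
  for (const quota of data.quotas ?? []) {
    console.log(`${quota.metric}: ${quota.usage} of ${quota.limit}`);
  }
}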
However, I've found in Google's Issue Tracker a feature request that's close to what you're asking for.
If you need something more specific or want to file the feature request yourself, check the "Report feature requests" documentation and create your own. The GCP team will evaluate it and consider it for implementation.

Is there any way to print to stdout, stderr, or log files in google deployment manager?

I would like to write debug information in my DM templates, but I cannot find a way to generate print statements, logs, or anything else to aid in debugging when something goes wrong with my template.
How do I add print or logging to deployment manager?
I checked; currently the only way to troubleshoot is to rely on the expanded template from the Deployment Manager dashboard. You can check it at the following URL for a given deployment, though I guess you were already aware of this possibility:
https://console.cloud.google.com/dm/deployments/details/DEPLOYMENTNAME?project=PROJECTID
However, a feature request has been opened, and the engineering team is currently discussing the best way to provide this capability to customers.
https://issuetracker.google.com/80368273
I advise you to star the feature request to receive updates via e-mail, and to leave a comment to show the community's interest.
All the official communication regarding that feature will be posted there.
Disclaimer: I work for Google Cloud Platform Support

Getting data from local running java app to google cloud app and back

I wanted to dive into the world of distributed systems, cloud computing, IoT, etc., and I gotta be honest, I imagined everything being a little more intuitive than it finally turned out to be.
I had a tiny testing architecture in mind that I'd like to set up with Google Cloud and its services, but I am kinda stuck since I can't get my head around some concepts.
What I basically wanted to do (as a first step) is write a simple Java application that would run locally on my computer. This application should just generate random numbers and send those numbers somehow to the Google cloud. In the cloud I wanted another Java application to manipulate those random numbers in some way (what it does doesn't actually matter). Afterwards, the output should somehow get back to me, of course. And actually, at the moment, I don't even care how exactly. It could go back to my local app somehow (with some kind of listener; would that be possible?). But it could also simply be stored somewhere on the Google cloud? Or maybe uploaded to my Google Drive?
I guess you already noticed that, at some points, I don't even know what I want exactly, since I'm not sure of what is possible and what is not.
Could you provide me some help to get this set up?
The most important questions for me right now are:
Do I need to use a Pub/Sub system that my generated numbers are sent to, and which then forwards them to the cloud app that transforms my data?
How do I get my data from the local app to the cloud services?
Would my data-transforming app run on Google Dataflow?
Above I wrote "as a first step" because later I would also like to send config files (for example in JSON or XML format) to the cloud, and the cloud application should transform those config files. If I get the first scenario running, then I guess this would also be no problem, right?
Those are just a few of the questions that are on my mind currently. The most important ones I guess.
It would be a big help. Sorry, if the questions are not very precise, but I really need some kind of pointing into the right direction.
Thank you in advance!
I think it would be good to read up on some of the technologies you mention here:
Google Cloud Pub/Sub: Pub/Sub enables you to publish messages to a topic and consume them elsewhere in the (Google) Cloud. You can see some different examples of publishers and consumers in the link. In your case you could, for example, write a Java application that writes random numbers to a Pub/Sub topic, where they will be retained for up to 7 days to be consumed by another component (for example, Google Cloud Dataflow). To get started developing, you can find the SDKs here (there is a Java SDK).
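Your producer would be in Java and the Java SDK follows the same pattern, but to sketch the idea, here is the publish flow with the Node.js client (the topic name is a placeholder and is assumed to exist already):

import { PubSub } from '@google-cloud/pubsub';

// Publish one random number to a Pub/Sub topic.
async function publishRandomNumber() {
  const pubsub = new PubSub(); // uses Application Default Credentials
  const topic = pubsub.topic('random-numbers'); // placeholder topic name
  const value = Math.random();
  const messageId = await topic.publishMessage({
    data: Buffer.from(JSON.stringify({ value })),
  });
  console.log(`Published ${value} as message ${messageId}`);
}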
Google Cloud Dataflow is a managed service that runs Apache Beam pipelines to process your data at scale. You can learn about the different concepts here and get started designing your pipeline here. I suggest taking a look at some examples first, though, which will make it easier to grasp what is actually going on. Dataflow has a Pub/Sub connector, so in your pipeline you will be able to read from the topic you created before. In Dataflow you can, for example, multiply all your random numbers and write them to a certain sink (for example Google Cloud Storage, or even BigQuery or Pub/Sub again).
Google Cloud Storage is cloud object storage where you can put files, for example the output of your Dataflow pipeline. You will be able to download the files manually using the Cloud Console UI, or you can use one of the SDKs to download the output programmatically.
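For example, once the pipeline has written its output, fetching it programmatically looks roughly like this with the Node.js client (the bucket and object names are placeholders):

import { Storage } from '@google-cloud/storage';

// Download one output file produced by the Dataflow pipeline.
async function downloadResult() {
  const storage = new Storage(); // uses Application Default Credentials
  await storage
    .bucket('my-results-bucket') // placeholder bucket name
    .file('output/results-00000-of-00001') // placeholder object name
    .download({ destination: './results.txt' });
}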
Hope this gives you an overview and some pointers to start. Whenever you are ready and have a more concrete use case in mind, you can start looking at some more components.

What is the best tool to use for real-time web statistics?

I operate a number of content websites that have several million user sessions and need a reliable way to monitor some real-time metrics on particular pieces of content (key metrics being: pageviews/unique pageviews over time, unique users, referrers).
The use case here is for the stats to be visible to authors/staff on the site, as well as to act as source data for real-time content popularity algorithms.
We already use Google Analytics, but this does not update quickly enough (4-24 hours depending on traffic volume). Google Analytics does offer a real-time reporting API, but this is currently in closed beta (I have requested access several times, but no joy yet).
New Relic appears to offer a few analytics products, but they are quite expensive ($149/500k pageviews - we have several times this).
Other answers I found on StackOverflow suggest building your own, but this was 3-5 years ago. Any ideas?
I've heard some good things about Woopra, and they offer 1.2m page views for the same price as New Relic.
https://www.woopra.com/pricing/
If that's too expensive, then the alternative is live-loading your logs and using an Elasticsearch service to read them to get the data you want, but you will need access to your logs while they are being written.
A service like Loggly might suit you, which would enable you to "live tail" your logs (view them while they are being written), but again there is a cost to that.
Failing that, you could do something yourself or get someone on Freelancer to knock something up for you, enabling logs to be read and displayed in a format you recognise.
https://www.portent.com/blog/analytics/how-to-read-a-web-site-log-file.htm
If the metrics you need to track are limited to the ones you have listed (page views, unique users, referrers), you might consider collecting your web servers' logs and using a log analyzer.
There are several free tools available on the Internet to get real-time statistics out of those logs.
Take a look at www.elastic.co, for example.
Hope this helps!
Google Analytics offers real-time data viewing now, if that's what you want:
https://support.google.com/analytics/answer/1638635?hl=en
I believe their API is now released, as we are looking at incorporating it ourselves!
If you have access to your web server logs, then you can set up Elasticsearch as the search engine, with Logstash as the log parser and Kibana as the front-end tool for analyzing the data.
For more information, please go through the Elasticsearch documentation.
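To give a concrete idea of what the ELK route buys you, here is a sketch using the @elastic/elasticsearch Node.js client to pull near-real-time pageview and unique-visitor counts; the node address, index pattern, and field names all depend on how Logstash parses your logs and are placeholders here:

import { Client } from '@elastic/elasticsearch';

// Count pageviews per URL and unique client IPs over the last five minutes.
async function recentTraffic() {
  const client = new Client({ node: 'http://localhost:9200' }); // placeholder address
  const result = await client.search({
    index: 'weblogs-*', // placeholder index pattern fed by Logstash
    size: 0,
    query: { range: { '@timestamp': { gte: 'now-5m' } } },
    aggs: {
      pageviews_by_url: { terms: { field: 'url.keyword', size: 10 } },
      unique_visitors: { cardinality: { field: 'client_ip.keyword' } },
    },
  });
  console.log(result.aggregations);
}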