I'm looking to use Google Charts to generate charts with JavaScript, and I read that the old Google Chart API was limited in the number of queries per day per user (read the post here).
So I was wondering whether this limitation applies to Google Charts (using JavaScript) or only to the image-based API, and/or whether there are other limitations (free for non-commercial use only, etc.).
Basically, the interactive charts are not limited: with most charts, the workload is handled entirely on the client side, so there is no workload for Google to limit. For the few charts that do access Google's servers (geocoding requests from GeoCharts/GeoMaps are the only ones that come to mind), there is currently no limit.
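As a minimal sketch of what that means in practice (assuming the Google Charts loader script is already included in the page, and that a div with the hypothetical id chart_div exists), everything below runs in the browser; Google's servers are only contacted to download the library itself:

```javascript
// Load the core chart package and register a callback for when it is ready.
google.charts.load('current', { packages: ['corechart'] });
google.charts.setOnLoadCallback(drawChart);

function drawChart() {
  // All of the data lives, and is rendered, on the client;
  // no query is sent to Google's servers.
  var data = google.visualization.arrayToDataTable([
    ['Year', 'Sales'],
    ['2019', 1000],
    ['2020', 1170],
    ['2021', 660]
  ]);

  var chart = new google.visualization.LineChart(
    document.getElementById('chart_div') // hypothetical container element
  );
  chart.draw(data, { title: 'Sales over time' });
}
```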
The API is free to use for both commercial and non-commercial purposes.
Read the Terms of Service for details.
Related
I am implementing a public API using Google Cloud Endpoints and Google Cloud Functions. This API will later be used in a web application, and I need a way to throttle the number of requests to prevent people from flooding the API, which could increase project maintenance costs. I don't care if the API becomes unavailable due to throttling; protecting myself from those costs takes higher priority.
What should I do or which tools should I use to achieve this on Google Cloud?
If we look at the Google Cloud Endpoints documentation, in the section called About Quotas we will find a description of a capability to limit the number of requests from calling applications. The article then goes into depth on how to set it all up, which amounts to adding additional attributes to your API's exposed OpenAPI spec.
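As a hedged sketch of what those attributes look like (the metric name, limit name, and path below are placeholders; see the About Quotas page for the authoritative syntax), you define a metric and a limit at the top level of the spec, then charge each method a cost against that metric:

```yaml
# Top level of the OpenAPI spec: define a metric and a per-project limit.
x-google-management:
  metrics:
    - name: "read-requests"          # placeholder metric name
      displayName: "Read requests"
      valueType: INT64
      metricKind: DELTA
  quota:
    limits:
      - name: "read-limit"           # placeholder limit name
        metric: "read-requests"
        unit: "1/min/{project}"      # per minute, per consumer project
        values:
          STANDARD: 1000             # allowed calls per minute

paths:
  /items:                            # placeholder path
    get:
      operationId: listItems
      # Each call to this method consumes 1 unit of the metric above.
      x-google-quota:
        metricCosts:
          read-requests: 1
```

Once deployed, Endpoints rejects over-quota calls with HTTP 429 before they ever reach your Cloud Function, which is exactly what you want for cost protection.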
There is also a great article called Rate-limiting strategies and techniques that provides a rich set of alternatives and ideas. My suggestion would be to read this article in depth, which will arm you with an overview of each of the choices at your disposal. There is also a rich set of additional references at the end of the article for further reading.
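To make one of those strategies concrete, here is a minimal token-bucket sketch in JavaScript (the capacity and refill rate are made-up numbers). Note that in-memory state like this only survives per instance; Cloud Functions scale out, so a real deployment would keep the bucket in a shared store such as Redis or Firestore:

```javascript
// Minimal in-memory token bucket: allows short bursts up to CAPACITY,
// and a sustained rate of REFILL_PER_SEC requests per second.
const CAPACITY = 10;       // maximum burst size (placeholder)
const REFILL_PER_SEC = 5;  // sustained requests/second (placeholder)

let tokens = CAPACITY;
let lastRefill = Date.now();

function allowRequest() {
  const now = Date.now();
  // Refill proportionally to elapsed time, capped at bucket capacity.
  tokens = Math.min(CAPACITY, tokens + ((now - lastRefill) / 1000) * REFILL_PER_SEC);
  lastRefill = now;
  if (tokens >= 1) {
    tokens -= 1;
    return true;  // request may proceed
  }
  return false;   // reject, e.g. respond with HTTP 429
}

// Example use inside an HTTP Cloud Function handler:
// if (!allowRequest()) return res.status(429).send('Too Many Requests');
```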
Is there any way to programmatically get data similar to the APIs overview in the Google Cloud dashboard? Specifically, I'm interested in the list of APIs enabled for the project and their usage/error stats over some predefined timeframe. I believe there's an API for that, but I struggle to find it.
There's currently no API that gives you a report similar to the one you can see through the Google Cloud Console.
The Compute API can retrieve some quotas with the get method, but it's somewhat limited (only Compute Engine quotas) and, from what I understood of your question, not quite what you're looking for.
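For reference, a hedged sketch of that call with the googleapis Node.js client looks like this (the project ID is a placeholder, and credentials are assumed to come from the environment):

```javascript
const { google } = require('googleapis');

async function listComputeQuotas() {
  // Application Default Credentials are assumed to be configured.
  const auth = await google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/compute.readonly'],
  });
  const compute = google.compute({ version: 'v1', auth });

  const res = await compute.projects.get({ project: 'my-project-id' }); // placeholder
  // Each quota entry looks like { metric: 'CPUS', limit: 24, usage: 3 }.
  for (const q of res.data.quotas) {
    console.log(`${q.metric}: ${q.usage}/${q.limit}`);
  }
}

listComputeQuotas().catch(console.error);
```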
However, I've found a feature request in Google's Issue Tracker that's close to what you're asking for.
If you need something more specific or want to file the feature request yourself, check the "Report feature requests" documentation and create your own. The GCP team will take a look at it to evaluate and consider implementation.
Our product requires a high-level dashboard highlighting metrics pertaining to the business modeled by the product. It's a B2B product with a web application as a front end.
We use Amazon RDS to store business data. What is the best approach to build a customizable dashboard in AWS?
The traditional approach would be to define metrics, process business data (analysis), and store the metrics/results in output tables, then use a fancy charting library in the web application to build a panel/chart/graphic view per metric, support drill-downs, etc.
However, I believe this approach is outdated and too much work. Are there any ready-made solutions available? Ideally, I should just be able to push results data to a third-party solution, and the third-party solution should allow for the creation of custom dashboards ("custom" meaning the user can choose which metrics/panels to see in the dashboard, and their relative order and position) and for embedding them in the product's web application, so the end user can both view and customize dashboards.
I'm aware of AWS QuickSight, but it does not support embedding output views/dashboards into a web application.
I like Azure Power BI; at least you get to embed the dashboard in the web application, but I do not believe the end user can customize the embedded view. And of course, Azure Power BI is not on AWS ;).
Is there a mature third-party solution that we can evaluate? The scale of the data is very small, so we are not looking for a high-performance enterprise solution (which might be too expensive anyway!). However, the need for customizability of the dashboard view is high.
Embedded dashboards were added to QuickSight in November 2018, so you can now use that tool if you'd prefer.
https://docs.aws.amazon.com/quicksight/latest/user/embedded-dashboards-setup.html
Tableau is the first possibility, but it is expensive.
Infogram may also be an option: https://infogram.com/examples/dashboards
If you have small data volumes and you want customisation, you may unfortunately have to look into writing your own solution, or at least customising an existing one.
I operate a number of content websites with several million user sessions, and I need a reliable way to monitor some real-time metrics on particular pieces of content (key metrics: pageviews/unique pageviews over time, unique users, referrers).
The use case here is for the stats to be visible to authors/staff on the site, as well as to act as source data for real-time content popularity algorithms.
We already use Google Analytics, but this does not update quickly enough (4-24 hours depending on traffic volume). Google Analytics does offer a real-time reporting API, but this is currently in closed beta (I have requested access several times, but no joy yet).
New Relic appears to offer a few analytics products, but they are quite expensive ($149/500k pageviews - we have several times this).
Other answers I found on StackOverflow suggest building your own, but this was 3-5 years ago. Any ideas?
I've heard some good things about Woopra, and they offer 1.2m pageviews for the same price as New Relic.
https://www.woopra.com/pricing/
If that's too expensive, the alternative is live-loading your logs and using an Elasticsearch service to read them to get the data you want, but you will need access to your logs while they are being written to.
A service like Loggly might suit you, which would enable you to "live tail" your logs (view them while they are being written), but again there is a cost to that.
Failing that, you could do something yourself, or get a freelancer to knock something up for you that reads the logs and displays them in a format you recognise.
https://www.portent.com/blog/analytics/how-to-read-a-web-site-log-file.htm
If the metrics you need to track are limited to the ones you have listed (pageviews, unique users, referrers), you might consider collecting your web server logs and using a log analyzer.
There are several free tools available on the Internet to get real-time statistics out of those logs.
Take a look at www.elastic.co, for example.
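If you want to prototype before committing to a tool, a minimal sketch in Node.js might look like the following (the log path is a placeholder, and the regex assumes the common "combined" access-log format):

```javascript
// Count pageviews and referrers from an Apache/nginx "combined" access log.
// Not production code: a sketch of what a log analyzer does under the hood.
const fs = require('fs');
const readline = require('readline');

// e.g. 1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /post/42 HTTP/1.1" 200 512 "https://ref.example" "UA"
const LINE = /^\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)[^"]*" \d+ \S+ "([^"]*)"/;

const pageviews = new Map();
const referrers = new Map();

const rl = readline.createInterface({
  input: fs.createReadStream('/var/log/nginx/access.log'), // placeholder path
});

rl.on('line', (line) => {
  const m = LINE.exec(line);
  if (!m) return;
  const [, path, referrer] = m;
  pageviews.set(path, (pageviews.get(path) || 0) + 1);
  if (referrer && referrer !== '-') {
    referrers.set(referrer, (referrers.get(referrer) || 0) + 1);
  }
});

rl.on('close', () => {
  console.log('Pageviews:', pageviews);
  console.log('Referrers:', referrers);
});
```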
Hope this helps!
Google Analytics offers real-time data viewing now, if that's what you want:
https://support.google.com/analytics/answer/1638635?hl=en
I believe their API has now been released, as we are looking at incorporating this!
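For anyone who does get access, a hedged sketch of a Real Time Reporting API (v3) call with the googleapis Node.js client might look like this (the view ID is a placeholder, and your Analytics account still needs access to the API):

```javascript
const { google } = require('googleapis');

async function activeUsers() {
  const auth = await google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/analytics.readonly'],
  });
  const analytics = google.analytics({ version: 'v3', auth });

  const res = await analytics.data.realtime.get({
    ids: 'ga:12345678',         // placeholder view (profile) ID
    metrics: 'rt:activeUsers',  // users on the site right now
    dimensions: 'rt:pagePath',  // broken down by page
  });
  console.log(res.data.rows);
}

activeUsers().catch(console.error);
```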
If you have access to your web server logs, then you can set up Elasticsearch as the search engine, with Logstash as the log parser and Kibana as the front-end tool for analyzing the data.
For more information, please go through the Elasticsearch link:
Elasticsearch weblink
I want a simple hosted data store licensed for business applications. I want the following features:
* REST-like access for CRUD operations (primarily adding records)
* private and authenticated
* makes for easy integration with a front-end charting client like the Google Visualization APIs
* easy to use and set up
What about:
* Google Fusion Tables
* Google Cloud Services
* Google BigQuery
* Google Cloud SQL
Or other non-Google products. But I am imagining a cleaner integration between Google Charts and one of their backend data services.
Pros, Cons, Advice?
First, since this is Stack Overflow, I won't attempt to provide a judgement about "easy to use and set up" - that can be done by you reading the documentation for each product.
That being said, overall, the "right" answer really depends on what you are trying to do, and how much data you have. It also depends on what type of application you are building (this is Stack Overflow, so I am assuming you are a developer).
Relational databases (like Google Cloud SQL) are great for maintaining transactional consistency, but once your data grows massive it becomes difficult, expensive, or impossible to run analysis queries in a reasonable timeframe.
Google BigQuery is an analysis tool that allows developers to ask questions about really big datasets using an SQL-like language. It is 100% cloud-based and is accessed via a RESTful API - but it only allows appending data, not changing individual records.
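To illustrate the append-only model, here is a hedged sketch using the @google-cloud/bigquery Node.js client (the dataset, table, and row fields are placeholders). Streaming inserts add rows; there is no equivalent call to modify a single existing record:

```javascript
const { BigQuery } = require('@google-cloud/bigquery');
const bigquery = new BigQuery(); // credentials assumed from the environment

async function appendRows() {
  // Streaming insert: rows are appended to the table, never updated in place.
  await bigquery
    .dataset('analytics')  // placeholder dataset
    .table('events')       // placeholder table
    .insert([
      { user_id: 'u1', event: 'signup', ts: new Date().toISOString() },
      { user_id: 'u2', event: 'login', ts: new Date().toISOString() },
    ]);
  console.log('Rows appended.');
}

appendRows().catch(console.error);
```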