Trying to figure out Google Cloud Platform quota / G Suite support - google-cloud-platform

I am new to Google Cloud Platform. I built an app that uses Google Sheets, and I have several scripts attached to the sheet.
I am getting an error that I am invoking URLFetch too many times, so I know I am calling the URL too many times. I need help with two things:
How do I find out exactly how many URL fetches I am making?
If I need to make more calls, which G Suite subscription do I need to get?
Thanks

Looks like you are developing with Google Apps Script.
You can check the Apps Script quota limits at the link; it also points out that the quota for URL Fetch calls is 20,000 / day, which you already knew.
To your questions:
To my knowledge, there's no way to check how much of the daily quota has been used.
You can consider at least G Suite Basic for the 100,000 / day quota.
If you need more than 100,000 / day, you'll need at least G Suite Business to apply for the Early Access flexible quota.
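Since there's no built-in readout of remaining quota, one workaround is to count your own calls. A minimal sketch (Apps Script written in TypeScript, as used with clasp; the wrapper name countedFetch and the 19,000 cutoff are my own, not anything official):

    // Route every fetch through a wrapper that keeps a daily counter in
    // Script Properties, since Apps Script exposes no quota readout.
    function countedFetch(url: string): GoogleAppsScript.URL_Fetch.HTTPResponse {
      const props = PropertiesService.getScriptProperties();
      const today = Utilities.formatDate(new Date(), 'Etc/UTC', 'yyyy-MM-dd');
      const key = 'urlfetch-count-' + today;
      const count = Number(props.getProperty(key) || '0') + 1;
      props.setProperty(key, String(count));
      if (count > 19000) { // arbitrary safety margin under the 20,000/day quota
        throw new Error('UrlFetch budget nearly exhausted: ' + count + ' calls today');
      }
      return UrlFetchApp.fetch(url);
    }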

Related

Get Google Cloud Endpoints API usage data by API key

I'm looking for recommended ways to extract usage data by API key for a given set of Google Cloud Endpoints APIs. The project bills customers for their usage of a certain API.
My goal is to know how many times each client calls each of the ESP APIs in a given month, so that they can be billed based on their usage.
Example:

    client_a:
      api-key: 12345
    client_b:
      api-key: 67890
    ESP1:
      api_esp1
    ESP2:
      api_esp2_foo
      api_esp2_bar

Ultimately, I want to know how many times client_a used each of the available ESP APIs, and how many times client_b used them.
One (partial) solution is to create a GCP project per client, as outlined here. I didn't figure out how to go from there, though.
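One possible starting point, sketched under assumptions: Endpoints (ESP) reports the serviceruntime.googleapis.com/api/request_count metric to Cloud Monitoring, and a credential_id label can identify the calling API key. Whether that label is populated this way in your deployment is something to verify in Metrics Explorer first; PROJECT_ID is a placeholder.

    // Hedged sketch: sum a month of request_count per credential with the
    // @google-cloud/monitoring Node.js client.
    import {MetricServiceClient} from '@google-cloud/monitoring';

    const client = new MetricServiceClient();
    const PROJECT_ID = 'my-endpoints-project'; // placeholder

    async function monthlyRequestCounts(): Promise<void> {
      const nowSec = Math.floor(Date.now() / 1000);
      const [series] = await client.listTimeSeries({
        name: `projects/${PROJECT_ID}`,
        filter: 'metric.type = "serviceruntime.googleapis.com/api/request_count"',
        interval: {
          startTime: {seconds: nowSec - 30 * 24 * 3600}, // roughly the last month
          endTime: {seconds: nowSec},
        },
        view: 'FULL',
      });
      for (const ts of series) {
        // The credential may surface as a resource label or a metric label.
        const credential =
          ts.resource?.labels?.credential_id ??
          ts.metric?.labels?.credential_id ??
          'unknown';
        const total = (ts.points ?? []).reduce(
          (sum, p) => sum + Number(p.value?.int64Value ?? 0),
          0
        );
        console.log(`${credential}: ${total} requests`);
      }
    }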

Google Cloud hard quota limit

I've been trying for a while to figure out if Google Cloud has a mechanism for a "crowbar" limit on API usage, as a security measure.
The scenario I'm concerned about is, say I have an API keyfile on a server, and a hacker steals it and plugs it into their system that (for the purposes of some nebulous scam) is running an unbounded number of translate requests as quickly as possible. I then receive a $25,000 bill. I don't see any mechanisms to really prevent this.
I can limit the roles for that key, but if the hacker's interest is in the same API I use for my project, that doesn't help. I can set a billing limit, but that just sends me an email, and I am not a person who notices incoming email, so in practice it would take me days to realize this had happened.
I can set a quota, but all the quotas in the dashboard seem to be per-minute or per-day limits. I need a per-month quota, and I don't see any way to do it. Am I just missing something, or does Google simply provide no option for this?
I understand that "well just don't let your API key get compromised" is probably the common wisdom, but that's unacceptable to my use cases for a number of reasons, so I'm not interested in discussing it. Thanks!
Edit: I should note, Google's documentation says "you can set requests per day caps" - but there are no instructions on that page for how to do this, and I can see no mechanism for it. I don't want a per-day cap, I want a per-month cap, but I'd take a per-day cap if I could find one. These are the only quotas I see for Cloud Vision, for instance:
[Screenshots: Cloud Vision quotas, parts 1 and 2]
As per link 1, there is no hard quota limit for the Vision API on a monthly basis. If you need this feature, you can request it using link 2.
In the meantime, or as a workaround, you can control your Vision API budget by using the Cloud Billing Budget API, following link 3.
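If an email alert isn't enough (as the question notes), a budget can also publish notifications to Pub/Sub, and a Cloud Function on that topic can detach billing entirely, a "kill switch" along the lines of Google's documented example. A hedged sketch (PROJECT_ID is a placeholder; detaching billing disables all paid services in the project, so test this carefully):

    // Hedged sketch: a Cloud Function subscribed to a budget's Pub/Sub topic
    // that detaches billing once spend exceeds the budget.
    import {google} from 'googleapis';

    const PROJECT_ID = 'my-project-id'; // placeholder
    const billing = google.cloudbilling('v1');

    export async function stopBilling(pubsubMessage: {data: string}): Promise<void> {
      // Budget notifications arrive as base64-encoded JSON.
      const alert = JSON.parse(
        Buffer.from(pubsubMessage.data, 'base64').toString()
      );
      if (alert.costAmount <= alert.budgetAmount) {
        console.log('Spend is still under budget; nothing to do.');
        return;
      }
      const auth = await google.auth.getClient({
        scopes: ['https://www.googleapis.com/auth/cloud-billing'],
      });
      // Setting billingAccountName to '' detaches the billing account,
      // which disables all paid services in the project.
      await billing.projects.updateBillingInfo({
        auth,
        name: `projects/${PROJECT_ID}`,
        requestBody: {billingAccountName: ''},
      });
      console.log(`Billing disabled for ${PROJECT_ID}`);
    }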

Deploy multiple agents with dialogflow

I'm developing a Dialogflow agent for bookings. My problem is that I need to deploy the agent for multiple clients, each with their own calendar. Unfortunately, on Google Cloud Platform it is only possible to have one agent per project, and at the same time the number of projects is limited. How can I solve this? I have 3 possible solutions, but I'm open to suggestions.
1. Ask Google for more projects and associate one project with each of my clients. I would be able to manage the projects with a service account. But how much will it cost? Can I request more than 1,000 projects?
2. Create a new Google Cloud Platform account for every client and create a project in each account (like the Qwiklabs accounts in the Google courses). The problem is that I don't know how to scale this solution, since I'd need to automate the process and I don't want to create an account manually each time.
3. Use the same GCP account and the same agent for multiple clients. This might require entering a unique code when starting the chat to identify which calendar we are referring to. That way, though, I won't be able to integrate the chat on each client's website or Facebook page unless I give the same credentials to everyone.
What do you think could be the best solution? Do you have any other ideas to solve this problem?
Thank you guys
In terms of the best solution, it would be best to create a project for each client. When using Dialogflow products, each project can have at most one agent, so you need multiple projects if you need multiple agents either way.
Additionally, when it comes to the number of projects you can have in GCP, the limit for the average user is 30 projects. However, you can always increase that by requesting a higher limit. You can do so by referencing this document here.
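If you go the project-per-client route, project creation can be automated with the Resource Manager client, so you don't have to click through the console for each client. A rough sketch (the org ID, the project-ID naming scheme, and the createClientProject helper are placeholders of mine):

    // Hedged sketch: create one project per client with the Resource Manager
    // v3 Node.js client (@google-cloud/resource-manager).
    import {ProjectsClient} from '@google-cloud/resource-manager';

    const client = new ProjectsClient();
    const ORG_ID = 'organizations/123456789'; // placeholder

    async function createClientProject(clientName: string): Promise<void> {
      const [operation] = await client.createProject({
        project: {
          projectId: `booking-agent-${clientName}`, // must be globally unique
          displayName: `Booking agent for ${clientName}`,
          parent: ORG_ID,
        },
      });
      await operation.promise(); // wait for the long-running operation
      console.log(`Created project booking-agent-${clientName}`);
    }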

What is the best tool to use for real-time web statistics?

I operate a number of content websites that have several million user sessions and need a reliable way to monitor some real-time metrics on particular pieces of content (key metrics being: pageviews/unique pageviews over time, unique users, referrers).
The use case here is for the stats to be visible to authors/staff on the site, as well as to act as source data for real-time content popularity algorithms.
We already use Google Analytics, but this does not update quickly enough (4-24 hours depending on traffic volume). Google Analytics does offer a real-time reporting API, but this is currently in closed beta (I have requested access several times, but no joy yet).
New Relic appears to offer a few analytics products, but they are quite expensive ($149/500k pageviews - we have several times this).
Other answers I found on Stack Overflow suggest building your own, but those answers are 3-5 years old. Any ideas?
Heard some good things about Woopra, and they offer 1.2m page views for the same price as New Relic.
https://www.woopra.com/pricing/
If that's too expensive, then the alternative is live-loading your logs and using an Elasticsearch service to read them to get the data you want, but you will need access to your logs while they are being written to.
A service like Loggly might suit you, which would enable you to "live tail" your logs (view them while they are being written), but again there is a cost to that.
Failing that, you could build something yourself, or get someone on Freelancer to knock something up for you that enables the logs to be read and displayed in a format you recognise.
https://www.portent.com/blog/analytics/how-to-read-a-web-site-log-file.htm
If the metrics that you need to track are limited to the ones you have listed (page views, unique users, referrers), you might consider collecting your web server logs and using a log analyzer.
There are several free tools available on the Internet to get real-time statistics out of those logs.
Take a look at www.elastic.co, for example.
Hope this helps!
Google Analytics offers real-time data viewing now, if that's what you want?
https://support.google.com/analytics/answer/1638635?hl=en
I believe their API has now been released, as we are now looking at incorporating this!
If you have access to your web server logs, then you can set up Elasticsearch as the search engine, along with Logstash as the log parser and Kibana as the front-end tool for analyzing the data.
For more information, please go through the Elasticsearch link:
Elasticsearch weblink
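To make the ELK suggestion concrete, here is a rough sketch of querying such an index for the metrics the question lists, using the official Node.js client. The index name and field names (page, user_id, referrer, timestamp) are assumptions about how Logstash parsed the logs, not a standard schema:

    // Hedged sketch: near-real-time stats for one piece of content from an
    // ELK-style index of access-log events, via @elastic/elasticsearch.
    import {Client} from '@elastic/elasticsearch';

    const es = new Client({node: 'http://localhost:9200'}); // placeholder address

    async function contentStats(page: string) {
      const result = await es.search({
        index: 'weblogs-*',
        size: 0, // aggregations only, no raw hits
        query: {
          bool: {
            filter: [
              {term: {page}},
              {range: {timestamp: {gte: 'now-15m'}}}, // near-real-time window
            ],
          },
        },
        aggs: {
          unique_users: {cardinality: {field: 'user_id'}},
          top_referrers: {terms: {field: 'referrer', size: 10}},
        },
      });
      return {
        pageviews: result.hits.total, // matching events in the window
        uniqueUsers: result.aggregations?.unique_users,
        topReferrers: result.aggregations?.top_referrers,
      };
    }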

What's the easiest way to do a one-time mass geocode? (580,000 addresses)

I am working on a civics related project and I need to be able to display all the properties in the City of Philadelphia on a map, so I'll need to get the latitude & longitude for all 580,000 properties. (Only once)
Most APIs like Google/Yahoo have limits of 5,000 per day, and even BatchGeo has a similar limit.
Is there a way I can do a one-time geocoding of all these addresses?
You can find a list of free and paid geocoding services at the USC site.
Also check Microsoft's Geocode Dataflow API; it allows up to 200,000 entries / 300 MB and takes up to 14 days.
Another possibility is to combine several services at once: use 4 services that each allow 5,000 entries a day, and you'll finish your task in about a month.
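A sketch of that combining idea, with hypothetical provider names and a placeholder geocodeWith(), since every real service has its own endpoint, auth, and terms of use to check (some forbid bulk use or storing results):

    // Hedged sketch: round-robin a day's batch across four providers,
    // each capped at 5,000 requests. 4 x 5,000 = 20,000 addresses/day,
    // so 580,000 addresses finish in about 29 days.
    type Provider = 'serviceA' | 'serviceB' | 'serviceC' | 'serviceD';

    const DAILY_CAP = 5000;
    const providers: Provider[] = ['serviceA', 'serviceB', 'serviceC', 'serviceD'];

    async function geocodeWith(provider: Provider, address: string) {
      // Placeholder: call the provider's HTTP geocoding endpoint here and
      // return {lat, lng}. Each provider's request format differs.
      return {lat: 0, lng: 0};
    }

    async function geocodeDailyBatch(addresses: string[]) {
      const results = [];
      for (const [i, address] of addresses.entries()) {
        const provider = providers[Math.floor(i / DAILY_CAP)];
        if (!provider) break; // daily budget exhausted; resume tomorrow
        results.push({address, ...(await geocodeWith(provider, address))});
      }
      return results;
    }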
You can use MapQuest or CloudMade.
I have created a small utility to help compare these API's.
The utility is hosted at the URL below:
http://ankit-zalani.appspot.com/GeoCode/index.jsp
Tobias, I work for an address verification (and recently, geocoding) company called SmartyStreets.
Many services have usage restrictions based on volume and license agreements which prevent users from storing the results of geocoding queries. There are some vendors, however, which don't have limits or restrictions like that.
I would recommend something like LiveAddress which will not only geocode the addresses but also perform CASS-Certified verification to make sure your addresses are correct before giving you potentially faulty coordinates. You can run 580,000 or even millions at a time in a few minutes, and we allow you to store your results.
Hope this helps. If you have any more questions about addresses, I'll personally assist.
This thread is pretty old by now, but there have been some developments in recent years that make bulk geocoding very cheap. My favorite option is to just obtain a geocoding server on AWS (Google: "geocoding on AWS"); there are many options there, some free and some with low hourly rates (the total cost depends on the server you choose, of course).