How do I get the current total cost for Google Cloud Platform programmatically so I can optimize costs? - google-cloud-platform

The Google Cloud Platform billing interface (at https://console.cloud.google.com/billing/<account_number>/reports?project=<project_name>&organizationId=<org_id>), which you reach by clicking Billing in the Cloud Console, has a table with the current month-to-date cost -- the column labeled Subtotal, whose value I scratched out in blue in my screenshot.
I would like to get that value (or the aggregate cost for a project over some other period) using the RPC API (or other appropriate programmatic method). The keywords subtotal and total don't seem to come up anywhere in the API.
I want to use this for benchmarking. I want to run version X of my code with a certain configuration and load and see how much it cost by looking at the total cost before and after the run. Then I can compare configurations and versions and see the effects of various optimizations.
I did some basic Google searches and checked Stack Overflow, but the closest I got were this answer to a question about programmatic costs and this answer to a similar question. Neither helps, because the billing export only runs once a day and I'd like to measure the change over an interval of a few minutes.
In the worst-case scenario, I can scrape the billing web page. But I'm hoping there's a better way since scraping is usually brittle (and Google might not like it).
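For what it's worth, I'm not aware of any official endpoint that returns that Subtotal figure directly; the closest supported route I know of is the Cloud Billing export to BigQuery, which you can then query yourself. A minimal sketch, assuming the export is already enabled -- the project, dataset, and table names are placeholders for your own setup:

```python
# A minimal sketch, not an official API: sum month-to-date cost from
# the Cloud Billing BigQuery export. Assumes the export is enabled;
# `my_project.my_dataset.gcp_billing_export_v1_XXXXXX` is a
# placeholder for your own export table.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT
      SUM(cost)
        + SUM(IFNULL((SELECT SUM(c.amount) FROM UNNEST(credits) c), 0))
        AS total
    FROM `my_project.my_dataset.gcp_billing_export_v1_XXXXXX`
    WHERE invoice.month = FORMAT_DATE('%Y%m', CURRENT_DATE())
"""
row = list(client.query(query).result())[0]
print("Month-to-date cost (credits included):", row.total)
```

Note this still won't solve the minutes-granularity problem: export rows can arrive hours after the usage they describe, so diffing this total immediately before and after a short benchmark run won't be reliable.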

Related

Google Cloud: how to get billing breakdown (compute time, build time, etc.)?

Edit (solution), summarized from answer below:
Cloud Console > Billing > Reports > Preset views (at right) > Current month, all services (instead of "All projects")
--
I'm trying to see which aspects of my project are incurring costs, specifically the cost for build time and compute time on a given day.
I can access my invoice, but this only gives me a total per day per project, without itemizing resource costs.
There's this question: Where to find Google Cloud Compute Cost Breakdown?
But that doesn't point to specifics. Is there not some easy way to just see a chart or cost list per project?
I see in the docs there used to be a way to download a file, which is now deprecated (!).
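If you'd rather script the breakdown than use the console preset view from the edit above, a hedged sketch of the same per-service grouping via the Cloud Billing BigQuery export (the table name is a placeholder, and the export has to be enabled first):

```python
# Sketch: per-service cost breakdown for a given day from the Cloud
# Billing BigQuery export. The table name is a placeholder for your
# own export table.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT service.description AS service, ROUND(SUM(cost), 2) AS cost
    FROM `my_project.my_dataset.gcp_billing_export_v1_XXXXXX`
    WHERE DATE(usage_start_time) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
    GROUP BY service
    ORDER BY cost DESC
"""
for row in client.query(query).result():
    print(f"{row.service}: ${row.cost}")
```

Grouping by `sku.description` instead of `service.description` gives the finer split (e.g. build minutes vs. compute time within one service).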

Google Cloud hard quota limit

I've been trying for a while to figure out if Google Cloud has a mechanism for a "crowbar" limit on API usage, as a security measure.
The scenario I'm concerned about is, say I have an API keyfile on a server, and a hacker steals it and plugs it into their system that (for the purposes of some nebulous scam) is running an unbounded number of translate requests as quickly as possible. I then receive a $25,000 bill. I don't see any mechanisms to really prevent this.
I can limit the roles for that key, but if the hacker's interest is in the same API I use for my project, that doesn't help. I can set a billing limit, but that just sends me an email, and I am not a person who notices incoming email, so in practice it would take me days to realize this had happened.
I can set a quota, but all the quotas in the dashboard seem to be per-minute or per-day limits. I need a per-month quota, and I don't see any way to do it. Am I just missing something, or does Google simply provide no option for this?
I understand that "well just don't let your API key get compromised" is probably the common wisdom, but that's unacceptable to my use cases for a number of reasons, so I'm not interested in discussing it. Thanks!
Edit: I should note, Google's documentation says "you can set requests per day caps" - but there are no instructions on that page for how to do this, and I can see no mechanism for it. I don't want a per-day cap, I want a per-month cap, but I'd take a per-day cap if I could find one. These are the only quotas I see for Cloud Vision, for instance:
[Screenshots: the Cloud Vision quota pages, showing only per-minute and per-day limits]
As per link 1, there is no hard monthly quota limit for the Vision API. If you need this feature, you can file a feature request using link 2.
In the meantime, as a workaround, you can control your Vision API budget by using the Cloud Billing Budget API, following link 3.
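A sketch of that workaround using the Budget API's Python client (google-cloud-billing-budgets); the billing account ID and the amount below are placeholders. Worth stressing: a budget only notifies -- a true hard stop needs something listening to those notifications (e.g. a Cloud Function that disables billing on the project when the cap is hit).

```python
# Sketch: create a $100 monthly budget with alert thresholds via the
# Cloud Billing Budget API. The billing account ID is a placeholder.
# Budgets alert (email / Pub/Sub) but do not cut off usage by themselves.
from google.cloud.billing import budgets_v1
from google.type import money_pb2

client = budgets_v1.BudgetServiceClient()

budget = budgets_v1.Budget(
    display_name="monthly-spend-alarm",
    amount=budgets_v1.BudgetAmount(
        specified_amount=money_pb2.Money(currency_code="USD", units=100)
    ),
    threshold_rules=[
        budgets_v1.ThresholdRule(threshold_percent=0.5),
        budgets_v1.ThresholdRule(threshold_percent=0.9),
        budgets_v1.ThresholdRule(threshold_percent=1.0),
    ],
)

created = client.create_budget(
    parent="billingAccounts/000000-000000-000000", budget=budget
)
print("Created budget:", created.name)
```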

google prediction api pricing

For the Google Prediction API, the documentation shows different quota limits for the $10 plan in different places.
The first link says the prediction limit is 10,000/day:
https://cloud.google.com/prediction/
whereas the next link says the limit is 10,000/month:
https://cloud.google.com/prediction/pricing?csw=1
If anybody who has used this could tell me which one is correct, I would really appreciate it.
I'd say it's a typo on the first link. 10K free per day seems way too high, considering you need to contact Google if you're going to do more than 40K per day (see "Usage Limits").
For now, until a Googler can confirm, I'd go with per month, i.e. https://cloud.google.com/prediction/pricing?csw=1

What's the easiest way to do a one-time mass geocode? (580,000 addresses)

I am working on a civics-related project and I need to be able to display all the properties in the City of Philadelphia on a map, so I'll need the latitude & longitude for all 580,000 properties. (Only once.)
Most APIs like Google/Yahoo have limits of 5,000 per day, and even BatchGeo has a similar limit.
Is there a way I can do a one-time geocoding of all these addresses?
You can find a list of free and paid geocoding services at the USC site.
Also check Microsoft's Geocode Dataflow API; it allows up to 200,000 entries / 300 MB and takes up to 14 days.
Another possibility is to combine several services at once: use four services that allow 5,000 entries a day each and you'll finish the task in about a month (580,000 / 20,000 ≈ 29 days). A sketch of that rotation follows below.
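Here's a rough sketch of that rotation. The service names and `geocode_one` are hypothetical placeholders for whatever providers you actually sign up with, and the caps are whatever those providers allow:

```python
# Sketch of the "combine several services" idea: rotate addresses
# across providers, each with its own daily cap. `geocode_one` is a
# hypothetical stand-in for each provider's real client call.
import time

DAILY_CAPS = {"service_a": 5000, "service_b": 5000,
              "service_c": 5000, "service_d": 5000}

def geocode_one(service: str, address: str) -> tuple[float, float]:
    """Hypothetical placeholder: call the given provider's API here."""
    raise NotImplementedError

def geocode_batch(addresses: list[str]) -> dict[str, tuple[float, float]]:
    results: dict[str, tuple[float, float]] = {}
    used = dict.fromkeys(DAILY_CAPS, 0)
    queue = list(addresses)
    while queue:
        progressed = False
        for service, cap in DAILY_CAPS.items():
            if queue and used[service] < cap:
                address = queue.pop()
                results[address] = geocode_one(service, address)
                used[service] += 1
                progressed = True
        if not progressed:            # every provider's daily cap is spent
            time.sleep(24 * 60 * 60)  # wait for the caps to reset
            used = dict.fromkeys(DAILY_CAPS, 0)
    return results
```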
You can use MapQuest or CloudMade.
I have created a small utility to help compare these APIs. It is hosted at the URL below:
http://ankit-zalani.appspot.com/GeoCode/index.jsp
Tobias, I work for an address verification (and recently, geocoding) company called SmartyStreets.
Many services have usage restrictions based on volume and license agreements which prevent users from storing the results of geocoding queries. There are some vendors, however, which don't have limits or restrictions like that.
I would recommend something like LiveAddress which will not only geocode the addresses but also perform CASS-Certified verification to make sure your addresses are correct before giving you potentially faulty coordinates. You can run 580,000 or even millions at a time in a few minutes, and we allow you to store your results.
Hope this helps. If you have any more questions about addresses, I'll personally assist.
This thread is pretty old by now, but there have been some developments in recent years making bulk geocoding very cheap. My favorite option is to obtain a geocoding server on AWS (Google "geocoding on AWS"); there are many options, some free, some with low hourly rates (total cost depends on the server you choose, of course).
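One concrete self-hosted option -- my assumption, not something the answer above names -- is running Nominatim (OpenStreetMap data) on your own instance, where no per-day cap applies. A sketch, with the hostname as a placeholder:

```python
# Sketch: bulk geocoding against a self-hosted Nominatim server.
# The hostname is a placeholder for your own instance; since you run
# the server yourself, there is no external daily quota.
import requests

NOMINATIM_URL = "http://my-geocoder.example.com/search"  # placeholder

def geocode(address: str) -> tuple[float, float] | None:
    resp = requests.get(
        NOMINATIM_URL,
        params={"q": address, "format": "json", "limit": 1},
        timeout=10,
    )
    resp.raise_for_status()
    hits = resp.json()
    if not hits:
        return None  # address not found
    return float(hits[0]["lat"]), float(hits[0]["lon"])

print(geocode("1234 Market St, Philadelphia, PA"))
```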

How do sites count other sites' visitors and "value", and how can they tell users' location?

Hi, actually this is a simple question that just came up out of curiosity...
I recently came across an online website-evaluation tool called teqpad.com, and I have lots of questions about it:

1. How do they do it -- e.g. page views, daily visitors, etc. -- without instrumenting the real website?
2. Website worth: does this come anywhere near the real value of any site?
3. I don't know how they got daily revenue.
4. I like the traffic-by-country view; it looks just like the one in Google Analytics. How did they get that info?
5. Another one is the ISP info and the Google Maps location of the server.

Has anyone here written similar scripts? If so, what is your opinion?
1. They may be tracking user browser stats like Alexa does (more info on Wikipedia): a panel of users installs a plug-in that reports which sites each user visits, the way TV ratings work in most (all?) countries. This method is obviously not very reliable, and often nowhere near the actual numbers of visitors.
2. This is usually based on bullshit pseudo-scientific calculations and never a viable basis for evaluating the "value" of a web site, even though it may be possible to guesstimate the approximate ad revenue a site yields (see 3). But that is only one revenue stream; it says nothing about how expensive the site's daily maintenance is: servers, staff, content creation...
3. It should be possible to very roughly estimate daily revenue by taking the guesses on daily visitors/page views, counting the frequency with which ads are shown, and looking at what those ads usually yield per view (e.g. 100,000 page views/day × two ads per page × a $1 CPM would be about $200/day). It is probably pretty easy to get some rough numbers on what an ad view is worth on a big site if you're in the market.
4. and 5. It is possible to track most IP addresses down to the visitor's country and sometimes even the city. See the Geo targeting article on Wikipedia.
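For the IP-to-location part of 4 and 5, a small sketch using the geoip2 library against a MaxMind GeoLite2 database (the free .mmdb file must be downloaded from MaxMind separately; the path is a placeholder):

```python
# Sketch: map an IP address to country/city with MaxMind's GeoLite2
# database via the geoip2 library. The .mmdb path is a placeholder;
# accuracy is good at country level, rougher at city level.
import geoip2.database

with geoip2.database.Reader("/path/to/GeoLite2-City.mmdb") as reader:
    response = reader.city("128.101.101.101")
    print(response.country.iso_code)  # e.g. "US"
    print(response.city.name)         # e.g. "Minneapolis"
```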