I am trying to extract price information about the virtual machines supplied by the Compute Engine service in Google Cloud. I have successfully extracted a JSON file with some pricing information in it using an HTTP GET request to Google Cloud's Pricing API; however, all of the pricing data in the file is mapped to individual machines via SKU number, and there are no machine specifications associated with any of the SKU numbers. Here is the request:
GET https://cloudbilling.googleapis.com/v1/services?key=API_KEY
Ideally, I would like to find a way to have the machine specifications included in the JSON file returned by the HTTP request, but if that isn't possible I'd like to find a way to use the SKU to look up a machine's specifications. For example, if my SKU is 19E4-D27B-7C12, I'd like to use that code to look up the machine it specifies and see details about it such as the amount of RAM, the number of CPUs, etc. Does anyone know of any Google Cloud resources that would allow me to do such a thing? And if not, is there any other way to accomplish this task? The process needs to be programmatic, so I cannot use the built-in calculator in Google Cloud.
Thank you!
It looks like there is currently no API that returns the features of a given machine type; however, there is an API:
https://cloud.google.com/compute/docs/reference/rest/v1/instances/get
That API allows you to get the detailed features of an existing virtual machine, given the project, the zone, and the instance name.
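For reference, here is a minimal sketch of calling that method over plain HTTPS (illustrative only; it assumes Node 18+ with the global fetch, an OAuth access token in ACCESS_TOKEN, and placeholder project/zone/instance names):

const ACCESS_TOKEN = process.env.ACCESS_TOKEN; // OAuth access token (assumed)

// Sketch: fetch one instance via the Compute Engine instances.get REST method.
async function getInstance(project: string, zone: string, instance: string) {
  const url =
    `https://compute.googleapis.com/compute/v1/projects/${project}` +
    `/zones/${zone}/instances/${instance}`;
  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${ACCESS_TOKEN}` },
  });
  const body: any = await res.json();
  // machineType is a full URL ending in the type name, e.g. ".../machineTypes/n1-standard-4".
  console.log(body.machineType, body.status, body.cpuPlatform);
}

getInstance("my-project", "us-central1-a", "my-instance").catch(console.error);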
Related
I'm working on a project which requires me to generate a list of the machine types available in GCP Compute Engine along with their prices.
I'm able to generate the list of machine types for a particular region using the compute client, but I am unable to get their prices. I'm exploring the billing client to see if that is possible.
Can anyone suggest what would be best for this problem?
I would like to know if there is a way to collect the technical specifications of a virtual machine from Google Cloud (CPU, frequency, memory, storage)?
I am using the billing API (https://cloudbilling.googleapis.com/v1/services/6F81-5844-456A/skus?key=API_KEY) to get the pricing information, but it contains no technical specifications.
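For illustration, here is roughly what paging through that endpoint looks like (a sketch only, assuming Node 18+ with the global fetch; API_KEY is a placeholder and 6F81-5844-456A is the Compute Engine service ID):

const API_KEY = process.env.API_KEY; // placeholder API key (assumed)

// Sketch: page through Compute Engine SKUs from the Cloud Billing Catalog API.
async function listComputeSkus(): Promise<void> {
  let pageToken = "";
  do {
    const url =
      "https://cloudbilling.googleapis.com/v1/services/6F81-5844-456A/skus" +
      `?key=${API_KEY}&pageToken=${pageToken}`;
    const body: any = await (await fetch(url)).json();
    for (const sku of body.skus ?? []) {
      // The description and category name the billed resource (cores, RAM, region, ...),
      // but there is nothing like "n1-standard-4: 4 vCPUs, 15 GB" in the response.
      console.log(sku.skuId, sku.description, sku.category?.resourceGroup);
    }
    pageToken = body.nextPageToken ?? "";
  } while (pageToken);
}

listComputeSkus().catch(console.error);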
Is there a way to get that data? Maybe using the product's SKU or something? I know the AWS and Azure SDKs/APIs allow developers to get this technical information, but I did not find the GCP equivalent.
I searched for a while for something like this, and it seems a lot of people have had the same issue, but no one had a working answer.
The compute API offers you several operations that you can use to obtain the desired information.
In particular, consider reviewing the instances get operation; it will provide you with all the details of the instance.
If required, you can obtain the list of instances in a given zone using the instances list operation.
If you are trying to obtain information about machine families, GCP provides a rich API for that as well.
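For example, here is a short sketch of reading a machine type's vCPU and memory figures through the machineTypes get method (illustrative only; it assumes Node 18+ with the global fetch, an OAuth access token in ACCESS_TOKEN, and placeholder project/zone/type names):

const ACCESS_TOKEN = process.env.ACCESS_TOKEN; // OAuth access token (assumed)

// Sketch: machineTypes.get returns the hardware specs (vCPUs, memory) of a machine type in a zone.
async function getMachineType(project: string, zone: string, machineType: string) {
  const url =
    `https://compute.googleapis.com/compute/v1/projects/${project}` +
    `/zones/${zone}/machineTypes/${machineType}`;
  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${ACCESS_TOKEN}` },
  });
  const body: any = await res.json();
  console.log(`${body.name}: ${body.guestCpus} vCPUs, ${body.memoryMb} MB RAM`);
}

getMachineType("my-project", "us-central1-a", "n1-standard-4").catch(console.error);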
I am a new user of Google Cloud (Stackdriver).
I would like to set up and generate monthly uptime reports covering the past 4 weeks, delivered by e-mail, but I have not been able to find where I could do this.
I have done some research but have not managed to find what I am looking for. The closest I got was Trace, but that is still not what I would like to have.
It's not possible to generate that kind of report using the tools available in Google Cloud.
Using traces is probably the best you can do right now, although you can try the Cloud Trace API, which may let you extract the information in a more structured way.
If you want this feature included in GCP, please go to IssueTracker and create a new feature request with a detailed explanation of your goal, and mention the time span you want to be able to get data from.
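As a rough illustration of the structured route, something along these lines lists the traces for a chosen time window via the Cloud Trace v1 API (a sketch only; it assumes Node 18+ with the global fetch, an OAuth access token in ACCESS_TOKEN, and a placeholder project ID):

const ACCESS_TOKEN = process.env.ACCESS_TOKEN; // OAuth access token (assumed)

// Sketch: list traces for the last 4 weeks via the Cloud Trace v1 traces.list method.
async function listRecentTraces(projectId: string) {
  const end = new Date();
  const start = new Date(end.getTime() - 28 * 24 * 60 * 60 * 1000); // 4 weeks back
  const url =
    `https://cloudtrace.googleapis.com/v1/projects/${projectId}/traces` +
    `?startTime=${start.toISOString()}&endTime=${end.toISOString()}&pageSize=100`;
  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${ACCESS_TOKEN}` },
  });
  const body: any = await res.json();
  console.log(`${(body.traces ?? []).length} traces returned for the last 4 weeks`);
}

listRecentTraces("my-project").catch(console.error);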
We are trying to look into the details of Google Cloud Vision transactions. We are interested in the Cloud Vision requests where the returned result is unsatisfactory (e.g. an empty JSON response). In general: what input did the service receive, and what did GCV produce from it?
I had assumed this would be auto-logged?
It seems that the default logging solution does not provide much information about the value of the transaction other than the time or error type. (Is there a way to dig deeper into the log?)
Is there a way to log (or somehow view the uploaded url of) the original image that the service received and/or the results of the processed request?
Could you provide an example of how to retrieve the detected results and/or the input image, say, for "DOCUMENT_TEXT_DETECTION"?
Can you be a bit more specific? Which specific Google Cloud Vision service are you trying to use (Image Classification, Object Detection)? Are you using the GCP console (i.e. the UI), the API, ...? Which kind of information do you want to get?
In any case, you can use advanced logs to have a look at your Google Cloud Vision logs. For instance, you can use the following filter to see the error logs:
protoPayload.serviceName="vision.googleapis.com"
severity>=ERROR
Or remove the second line to get all the logs related to Cloud Vision. You can then click on "Expand" to see all the information about the job.
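If you would rather pull the same entries programmatically, here is a minimal sketch against the Cloud Logging entries.list endpoint (assumptions: Node 18+ with the global fetch, an OAuth access token in ACCESS_TOKEN, and a placeholder project ID):

const ACCESS_TOKEN = process.env.ACCESS_TOKEN; // OAuth access token (assumed)

// Sketch: run the same Cloud Vision filter against the Logging API (entries.list).
async function listVisionErrorLogs(projectId: string) {
  const res = await fetch("https://logging.googleapis.com/v2/entries:list", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${ACCESS_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      resourceNames: [`projects/${projectId}`],
      filter: 'protoPayload.serviceName="vision.googleapis.com" AND severity>=ERROR',
      pageSize: 50,
    }),
  });
  const body: any = await res.json();
  for (const entry of body.entries ?? []) {
    console.log(entry.timestamp, entry.severity, entry.protoPayload?.status?.message);
  }
}

listVisionErrorLogs("my-project").catch(console.error);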
I'm not sure if this is the correct platform to ask an architecture-related question. I have a web application developed in Node.js & TypeScript, hosted in AWS, with MongoDB as the backend. My requirement is to include a search box with wildcard and auto-suggest functionality, so that when I start typing in the text box it suggests completions, just like Google search does. How would I achieve this? Querying MongoDB on every keystroke will be slow, and if hundreds of users start doing that, my application might start struggling, so I need your suggestions.
I have not tried anything yet, as this is more of an architecture question.
It's not a very detailed answer, but it may point you in a direction.
I just built something similar using AWS Lambda, ElasticSearch and API Gateway.
ElasticSearch is great for text searches but needs to be populated with indexed data.
If your dataset changes, you will have to remember to update ElasticSearch.
API Gateway routes HTTP requests to Lambda functions, of which there are two:
one for analysing data in my data warehouse and producing indices for ElasticSearch, and the other for doing the actual search and returning results.
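As a rough illustration of the search Lambda's core, a prefix-style query against ElasticSearch works for simple auto-suggest (illustrative only; the "products" index, the "name" field, and ES_URL are hypothetical placeholders, and it assumes Node 18+ with the global fetch):

const ES_URL = process.env.ES_URL ?? "http://localhost:9200"; // ElasticSearch endpoint (assumed)

// Sketch: return up to 10 name suggestions matching the typed prefix.
async function suggest(prefix: string): Promise<string[]> {
  const res = await fetch(`${ES_URL}/products/_search`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      size: 10,
      query: { match_phrase_prefix: { name: prefix } },
    }),
  });
  const body: any = await res.json();
  return (body.hits?.hits ?? []).map((hit: any) => hit._source.name);
}

suggest("goog").then(console.log).catch(console.error);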