Is there any way that I can read from BigQuery using a Dialogflow chatbot - google-cloud-platform

I want to implement a function that displays, in the Dialogflow chatbot, data retrieved from BigQuery with a SELECT statement. Is this possible? Kindly help.

Based on what was mentioned in the comments section, Dialogflow Fulfillment seems to be exactly what you need here. When the user types an expression, Dialogflow matches the intent and sends a webhook request according to the configured fulfillment; the webhook service then performs the intended action, such as calling API services or running some other business logic.
Integrating Dialogflow with BigQuery likewise requires fulfillment code, i.e. a GCP Cloud Function that handles communication with the BigQuery API. That said, you can use the built-in Inline Editor to write your fulfillment function; however, it accepts no language other than Node.js.
As for the implementation, you can follow the codelabs tutorial, which walks through the general workflow in detail; you can inject your own code by replacing the tutorial's addToBigQuery() function in index.js. For that purpose, see the nodejs-bigquery GitHub repository, which contains many useful code samples, in particular the generic query() function that matches your initial aim.
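For illustration only, here is a minimal sketch of the BigQuery query step in Java (the Inline Editor itself takes only Node.js, so this assumes a separately deployed webhook service); it uses the google-cloud-bigquery client library, and the dataset, table, and column names are made up:

```java
// Hedged sketch: running a SELECT against BigQuery from a fulfillment webhook.
// The `my_dataset.orders` table and its columns are hypothetical placeholders.
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FieldValueList;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.QueryParameterValue;
import com.google.cloud.bigquery.TableResult;

public class OrderLookup {

  /** Returns a plain-text answer the webhook can place in the fulfillment response. */
  public static String lookupOrder(String orderId) throws InterruptedException {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

    // Parameterized query so user input is never concatenated into the SQL string.
    QueryJobConfiguration query =
        QueryJobConfiguration.newBuilder(
                "SELECT status, eta FROM `my_dataset.orders` WHERE order_id = @orderId")
            .addNamedParameter("orderId", QueryParameterValue.string(orderId))
            .build();

    TableResult result = bigquery.query(query);
    StringBuilder answer = new StringBuilder();
    for (FieldValueList row : result.iterateAll()) {
      answer.append("Status: ").append(row.get("status").getStringValue())
            .append(", ETA: ").append(row.get("eta").getStringValue());
    }
    return answer.length() > 0 ? answer.toString() : "No matching order found.";
  }
}
```

The returned string would then be set as the fulfillment text of the webhook response, so the bot displays the query result to the user.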

Related

POST Request to REST API with Apache Beam

I have a use case where we're pulling messages from Pub/Sub, and the idea is to POST those messages to the PowerBI REST API. We want to create a Live Report using the PushDatasets feature.
The main idea should be something like this:
PubSub -> Apache Beam -> POST REST API -> PowerBI Dashboard
I haven't found any implementation of a POST request inside an Apache Beam job (the runner isn't a concern right now), only a GET request inside a DoFn. I don't even know if this is possible.
Has anyone done something like this? Or is there another framework/tool that might be more helpful?
Thanks.
Sending POST requests to an external API is certainly possible, but requires some care. It could be as simple as making the POST inside the body of a DoFn, but be aware that this could lead to duplicates since messages within your pipeline belong to a batch and the Beam model allows entire batches to be reprocessed in case of worker failures, exceptions, etc.
There is some advice in the beam docs on grouping elements for efficient external service calls.
Choosing the best course of action here largely depends on the details of the API you're calling. Does it take message IDs that can be used for deduplication on the PowerBI side? Can the API accept batches of messages? Is there rate limiting?
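To make the idea concrete, here is a hedged sketch (assuming the Beam Java SDK and Java's built-in HTTP client) of a DoFn that POSTs each element to an external endpoint; the URL is a placeholder, and real code would add the batching, deduplication, and error handling discussed above:

```java
// Hedged sketch: POSTing each element to an external REST API from inside a DoFn.
// The endpoint URL is a placeholder; production code should add batching
// (e.g. GroupIntoBatches), retry/backoff, and dead-lettering for failed calls.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import org.apache.beam.sdk.transforms.DoFn;

public class PostToApiFn extends DoFn<String, String> {

  private transient HttpClient httpClient;

  @Setup
  public void setup() {
    // One client per DoFn instance, reused across bundles.
    httpClient = HttpClient.newHttpClient();
  }

  @ProcessElement
  public void processElement(@Element String json, OutputReceiver<String> out) throws Exception {
    HttpRequest request =
        HttpRequest.newBuilder()
            // Placeholder; replace with the push-rows endpoint of your target API.
            .uri(URI.create("https://example.com/powerbi/push-rows"))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(json))
            .build();

    HttpResponse<String> response =
        httpClient.send(request, HttpResponse.BodyHandlers.ofString());

    // Emit the status so failures can be inspected or routed downstream.
    out.output(json + " -> HTTP " + response.statusCode());
  }
}
```

Keep in mind that because a bundle may be reprocessed, the same element can be POSTed more than once, which is why deduplication on the receiving side matters.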

GCP Best way to manage multiple cloud function flow

I'm studying GCP, and while reading about the different ways to communicate with and manage Cloud Functions, I ended up wondering when to use each of the services GCP offers.
I have been reading about Cloud Composer, GCP Workflows, and Cloud Pub/Sub, and it isn't clear to me when to use each one, or when to just use simple HTTP calls.
I understand that it depends a lot on the application you are building, but take an example: if I'm building a payment gateway and some functions should fire after the payment is verified, such as sending emails, running unrelated business logic, or adding the purchase to a sales platform, which service should I use to manage this flow, and in which cases would the others be a better choice? Should I use events to create an async flow with Pub/Sub, use more complex solutions like Composer and Workflows, or just simple HTTP calls?
As always, it depends! Even in your use case, it depends. OK, after a payment you want to send an email, run some business logic, add the order to your databases, and so on.
But can all these actions be done in parallel, or do you need to execute them in a certain order and stop the process if a step fails?
In the first case, you can use Cloud Pub/Sub: publish one message (payment OK) and fan it out to several functions that run in parallel. Otherwise, you can use Workflows to test the response of a function and decide whether or not to call the following functions. With Composer you can perform many more checks and actions.
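As a minimal sketch of that fan-out (assuming the google-cloud-pubsub Java client; the project, topic, and attribute names are made up), the payment service publishes a single event, and every Cloud Function subscribed to the topic reacts to it independently and in parallel:

```java
// Hedged sketch of the fan-out trigger: publish one "payment OK" event; each
// Cloud Function (email, invoicing, sales platform, ...) has its own Pub/Sub
// trigger on this topic and runs in parallel.
import com.google.api.core.ApiFuture;
import com.google.cloud.pubsub.v1.Publisher;
import com.google.protobuf.ByteString;
import com.google.pubsub.v1.PubsubMessage;
import com.google.pubsub.v1.TopicName;

public class PaymentEvents {

  public static void publishPaymentOk(String orderJson) throws Exception {
    // Placeholder project and topic IDs.
    TopicName topic = TopicName.of("my-project", "payment-ok");
    Publisher publisher = Publisher.newBuilder(topic).build();
    try {
      PubsubMessage message = PubsubMessage.newBuilder()
          .setData(ByteString.copyFromUtf8(orderJson))
          .putAttributes("event", "payment_ok")
          .build();
      ApiFuture<String> messageId = publisher.publish(message);
      System.out.println("Published message " + messageId.get());
    } finally {
      publisher.shutdown();
    }
  }
}
```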
You could also imagine sending another email 24 hours later to thank the customer for their order, and use Cloud Tasks to delay that action.
You talked about Cloud Functions, but you also have other options for hosting code on GCP: App Engine and Cloud Run. A Cloud Function is, most of the time, single purpose; sending an email is a perfect fit for a function.
Now, if you have a "set of functions" to browse your stock, view item details, review prices, and book an item (validating an order "books" its contents in your warehouse), the "functions" are all single purpose but related to the same domain: warehouse management. So you can create a web server that exposes different paths to manage the warehouse (a warehouse microservice, if you prefer) and host it on Cloud Run or App Engine.
Each product has its strengths and weaknesses. You will also see this when you learn about storage on GCP. Most of the time, you can achieve the same thing with several products, but if you don't use the right one, it will be slower or cost much more.

How can I create a combined response from multiple microservices (Cloud Run containers) in a single API endpoint using Google Cloud Endpoints (gateway)?

I am familiar with the Firebase platform, but I am relatively new to Google Cloud Platform as a whole.
I am working on a project built with a microservices structure, and I have many questions for which I cannot find an answer, or rather, cannot find any examples.
Unfortunately, all the examples I can find are far too simple to extrapolate a viable answer for my issues.
I adopted the new Cloud Run offering and decided to play with the fully managed version (not Kubernetes). I built a few microservices (each built using Express for Node or Flask for Python, depending on what the service does). Each microservice exposes its own endpoint and has its own API for calling its methods, and I use a service account to allow the application to perform the internal calls.
I now want to expose the application externally (specifically to my client, built with Vue.js), and I was trying to leverage another Google product to create and expose an API: Google Cloud Endpoints.
My question (specifically regarding the Cloud Run setup) is whether it is possible, and what I need to do, to create an API endpoint that communicates with the client app and internally calls multiple services, combining their responses into one.
Just to be clear, let's make an example:
Cloud run service 1 -> crud user api
Cloud run service 2 -> crud product api
Cloud Endpoints externally visible API -> get the user from service 1, then get the products from service 2, and return the combined response: all green products for user Jane Doe.
How can I aggregate the responses directly in the Endpoints gateway, check for failures, and, if everything goes smoothly, send the aggregated response to the client?
Do I need to build the aggregation endpoint somewhere else, like a Cloud Function for example, or can I do it directly in the Google Cloud Endpoints gateway?
Note that for Cloud Run, the Google Cloud Endpoints gateway is itself another Cloud Run container.
Thanks for any help; I'm running pretty much out of options here.
As I understand it, an API gateway should just work as a proxy, presenting all microservices as a single endpoint. For this scenario, I think you have the following two approaches:
1: Implement a new microservice (or extend one of the existing ones) that performs the invocations and aggregates the responses.
2: The client (e.g. the UI) can invoke the services and do the aggregation on its side.
I feel it is not a good idea to do this at the API gateway.
In my opinion, from an architectural point of view, the best option for you is to create a new microservice that takes the responses from the other two services and aggregates them.
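A minimal sketch of such an aggregating service, assuming Java's built-in HTTP client and placeholder service URLs, could look like this; real code would add authentication (for example an ID token for Cloud Run service-to-service calls), timeouts, and error handling:

```java
// Hedged sketch of an aggregating microservice: fetch the user and the products
// concurrently, then combine both payloads into one response body.
// The Cloud Run service URLs below are placeholders.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.CompletableFuture;

public class Aggregator {

  private static final HttpClient CLIENT = HttpClient.newHttpClient();

  public static String getUserWithProducts(String userId) {
    CompletableFuture<String> user =
        fetch("https://user-service.example.com/users/" + userId);
    CompletableFuture<String> products =
        fetch("https://product-service.example.com/products?user=" + userId + "&color=green");

    // Wait for both calls; if either fails, join() surfaces the error so the
    // caller can map it to an appropriate HTTP status.
    return "{\"user\": " + user.join() + ", \"products\": " + products.join() + "}";
  }

  private static CompletableFuture<String> fetch(String url) {
    HttpRequest request = HttpRequest.newBuilder().uri(URI.create(url)).GET().build();
    return CLIENT.sendAsync(request, HttpResponse.BodyHandlers.ofString())
        .thenApply(HttpResponse::body);
  }
}
```

The gateway in front of this service then only proxies a single endpoint, while the aggregation logic stays in ordinary application code.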
I understand that you want to aggregate the responses in an API gateway and that you are unable to find code examples for it. I found a guide on what you want to implement; the full code implementation can be found in this repository.
Keep in mind, though, that this implementation approach is not a best practice.
It is acceptable only if the two services being combined are independent, meaning there is no functional/business relation between them, so no concurrency or inconsistency problems will occur while aggregating.

Is there any way to have a multi-intent code flow in a chatbot created using Amazon Lex?

I am trying to build a conversational chatbot with branching conversations, where different intents are called depending on the results. Something like:
A
|\
| C
|/
D
So intent A is the trigger for the bot, and based on the inputs collected in A, it might go to C for further data or go directly to D. I went through the documentation but couldn't find anything for implementing this kind of flow. How can I implement this kind of chat interface in Lex?
Thanks for your time and help
This is possible by using Lambda.
You have to write a middleware service using AWS Lambda; your Lex bot calls the Lambda function for results, and that result can then trigger the next intent via the Lambda code hook.
It might need a lot of workarounds, but it is possible.
For more information: http://docs.aws.amazon.com/lex/latest/dg/using-lambda.html
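As a hedged illustration (the intent and slot names are made up, and the field names follow the Lex V1 Lambda input/output format described in the linked documentation), a Java code-hook Lambda that branches the conversation could look roughly like this:

```java
// Hedged sketch of a Lex (V1) Lambda code hook that branches the conversation:
// depending on a slot collected in intent A, it either hands off to intent C
// (to gather more data) or closes out so the flow can move to D.
// Intent and slot names are hypothetical.
import java.util.HashMap;
import java.util.Map;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

public class BranchingHook implements RequestHandler<Map<String, Object>, Map<String, Object>> {

  @Override
  @SuppressWarnings("unchecked")
  public Map<String, Object> handleRequest(Map<String, Object> event, Context context) {
    Map<String, Object> currentIntent = (Map<String, Object>) event.get("currentIntent");
    Map<String, Object> slots = (Map<String, Object>) currentIntent.get("slots");

    Map<String, Object> dialogAction = new HashMap<>();
    Map<String, Object> message = new HashMap<>();
    message.put("contentType", "PlainText");

    if ("yes".equalsIgnoreCase(String.valueOf(slots.get("needsMoreData")))) {
      // Branch to intent C: ask Lex to confirm switching to the next intent.
      dialogAction.put("type", "ConfirmIntent");
      dialogAction.put("intentName", "IntentC");
      dialogAction.put("slots", new HashMap<String, Object>());
      message.put("content", "I need a few more details. Shall we continue?");
    } else {
      // Skip C: close the current intent as fulfilled so the flow proceeds to D.
      dialogAction.put("type", "Close");
      dialogAction.put("fulfillmentState", "Fulfilled");
      message.put("content", "Okay, moving on to the final step.");
    }
    dialogAction.put("message", message);

    Map<String, Object> response = new HashMap<>();
    response.put("dialogAction", dialogAction);
    return response;
  }
}
```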

Google Cloud Spanner: Want Java API for doing my own retries

This is really a question for the Google Cloud Spanner Java API team...
Looking at the new Google Cloud Spanner service, it appears that the only way to perform read/write transactions is by providing a callback, via the TransactionRunner interface.
I understand that the API is trying to hide the details of the need to automatically retry transactions as a convenience to the programmer, but this limitation is a serious problem, at least for me. I need to be able to manage the transaction lifecycle myself, even if that means I have to perform my own retries (e.g., based on catching some sort of "retryable" exception).
To make this problem more concrete, suppose you wanted to implement Spring's PlatformTransactionManager for Google Cloud Spanner, so as to fit in with your existing code, and use your existing retry logic. It appears impossible to do that with the current Java API.
It seems like it would be easy to augment the API in a backward compatible way, to add a method returning a TransactionContext to the user, and let the user handle the retries.
Am I missing something? Can this alternate (more traditional) transaction API style be added to the Java API?
You are right that TransactionRunner is the only way to do read/write transactions in the Java client for Cloud Spanner. We believe most users would prefer that over hand-rolling their own retry logic, but we realize it might not fit the needs of all users, and we would love to hear about such use cases. Can you please file a feature request so we can discuss further there?
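For reference, here is a minimal sketch of the callback style being discussed (the table, column, and resource names are made up): the client invokes the callable and transparently re-invokes it when the transaction is aborted, which is exactly the retry lifecycle the question wants to control manually.

```java
// Hedged sketch of TransactionRunner usage: the runner may execute the callable
// more than once, so the body must be idempotent with respect to anything
// outside the transaction itself. Project/instance/database and table names
// are placeholders.
import java.util.Arrays;
import com.google.cloud.spanner.DatabaseClient;
import com.google.cloud.spanner.DatabaseId;
import com.google.cloud.spanner.Key;
import com.google.cloud.spanner.Mutation;
import com.google.cloud.spanner.Spanner;
import com.google.cloud.spanner.SpannerOptions;
import com.google.cloud.spanner.Struct;

public class DebitExample {

  public static void debitAccount() {
    Spanner spanner = SpannerOptions.getDefaultInstance().getService();
    try {
      DatabaseClient client =
          spanner.getDatabaseClient(DatabaseId.of("my-project", "my-instance", "my-database"));

      // The retry loop lives inside run(); the caller never sees the aborts.
      client.readWriteTransaction().run(txn -> {
        Struct row = txn.readRow("Accounts", Key.of(1L), Arrays.asList("Balance"));
        long newBalance = row.getLong("Balance") - 10;
        txn.buffer(Mutation.newUpdateBuilder("Accounts")
            .set("AccountId").to(1L)
            .set("Balance").to(newBalance)
            .build());
        return null;
      });
    } finally {
      spanner.close();
    }
  }
}
```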