How to use multiple Dialogflow agents in one firebase app - google-cloud-platform

I have multiple agents in Dialogflow and would like to connect them into a single app. However, since each agent is created in its own GCP project, how do I connect the other agents to the project I am using to run Firebase Hosting?

This sounds like you could be looking for a feature such as Mega Agents.
It allows you to run multiple agents under one parent agent, combining multiple bots into a single experience.
Do note that it is a beta feature at this time, so some of its behavior could change in the future.
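However many sub-agents a mega agent routes to, your Firebase app would still only talk to the parent agent's project, so the calling code stays the same. A minimal sketch of that single entry point with the Dialogflow ES Python client; the parent project ID and session ID are hypothetical placeholders:

```python
# Sketch only: the app calls the *parent* (mega) agent's project; Dialogflow
# routes the request to the right sub-agent internally.
# "my-mega-agent-project" and "demo-session" are placeholders.
from google.cloud import dialogflow  # pip install google-cloud-dialogflow

def detect_intent(text: str, session_id: str = "demo-session") -> str:
    client = dialogflow.SessionsClient()
    session = client.session_path("my-mega-agent-project", session_id)

    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=text, language_code="en")
    )
    response = client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    return response.query_result.fulfillment_text

if __name__ == "__main__":
    print(detect_intent("Hi there"))
```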

Related

Simultaneous Deploys from Github with Cloud Build for Multi-tenant architecture

My company is developing a web application and we have decided that a multi-tenant architecture would be most appropriate for isolating individual client installs. An install would represent an organization (a nonprofit, for example) and not an individual user's account. Each install would consist of several Cloud Run applications bucketed into an individual GCP project.
We want to be able to take advantage of Cloud Build's GitHub support to deploy from our main branch in GitHub to each individual client install. So far, I've been able to get this setup working across two individual GCP projects, where Cloud Build runs in each project individually and deploys to the individual GCP project Cloud Runs at roughly the same time and with the same duration. (Cloud Build does some processing unique to each client install so the build processes in each install are not performing redundant work)
My specific question is: can we scale this deployment technique up? Is there any constraint that prevents using Cloud Build in multiple GCP projects to deploy to our client installs, or will we hit issues as we continue to add more GCP projects? I know that so far this technique works for 2 installs, but will it work for 20, or 200?
You are limited to 10 concurrent builds per project. But if you run one Cloud Build per project, there are no known limitations or issues.
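As a rough sketch of how the same deploy fans out over many client projects (one build per project), here is one way to drive it with the Cloud Build Python client; the project IDs, service name, image and region are placeholders, and in practice you would more likely rely on per-project GitHub triggers than a script like this:

```python
# Sketch only: submit the same build (which deploys to Cloud Run) in each
# client project. The 10-concurrent-builds quota applies per project,
# not across all installs.
from google.cloud.devtools import cloudbuild_v1  # pip install google-cloud-build

CLIENT_PROJECTS = ["client-install-001", "client-install-002"]  # placeholders

def submit_deploy_build(project_id: str) -> None:
    client = cloudbuild_v1.CloudBuildClient()
    build = cloudbuild_v1.Build(
        steps=[
            cloudbuild_v1.BuildStep(
                name="gcr.io/google.com/cloudsdktool/cloud-sdk",
                entrypoint="gcloud",
                args=[
                    "run", "deploy", "client-app",
                    "--image", f"gcr.io/{project_id}/client-app:latest",
                    "--region", "us-central1",
                    "--platform", "managed",
                ],
            )
        ]
    )
    client.create_build(project_id=project_id, build=build)
    print(f"Started build in {project_id}")

for project in CLIENT_PROJECTS:
    submit_deploy_build(project)
```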

Using cloud functions vs cloud run as webhook for dialogflow

I don't know much about web development and cloud computing. From what I've read, when using Cloud Functions as the webhook service for Dialogflow, you are limited to writing code in just one source file. I would like to create a really complex Dialogflow agent, so it would be handy to have an organized code structure to make development easier.
I've recently discovered Cloud run which seems like it can also handle webhook requests and makes it possible to develop a complex code structure.
I don't want to use Cloud Run just because it is inconvenient to write everything in one file, but on the other hand it would be strange to have a cloud function with a single file with thousands of lines of code.
Is it possible to have multiple files in a single cloud function?
Is cloud run suitable for my problem? (create a complex dialogflow agent)
Is it possible to have multiple files in a single cloud function?
Yes. When you deploy to Google Cloud Functions you create a bundle with all your source files or have it pull from a source repository.
But Dialogflow only allows index.js and package.json in the Built-In Editor
For simplicity, the built-in code editor only allows you to edit those two files. But the built-in editor is mostly just meant for basic testing. If you're doing serious coding, you probably already have an environment you prefer to use to code and deploy that code.
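For instance, a Python Cloud Function deployed with the gcloud CLI (rather than through the inline editor) can be split across as many modules as you like; the file layout, module name and intent names below are just placeholders:

```python
# main.py -- sketch of a webhook entry point split across multiple files.
# "handlers/orders.py" and the intent name are placeholders.
from handlers.orders import handle_order_status  # second file in the same bundle

def dialogflow_webhook(request):
    """HTTP Cloud Function: routes Dialogflow fulfillment requests to handlers."""
    body = request.get_json(silent=True) or {}
    intent = body.get("queryResult", {}).get("intent", {}).get("displayName", "")

    if intent == "order.status":
        reply = handle_order_status(body)
    else:
        reply = "Sorry, I can't help with that yet."

    # Returning a dict is serialized to JSON by the underlying Flask runtime.
    return {"fulfillmentText": reply}


# handlers/orders.py -- a second source file bundled into the same function.
def handle_order_status(body: dict) -> str:
    order_id = body.get("queryResult", {}).get("parameters", {}).get("order_id", "?")
    return f"Order {order_id} is on its way."
```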
Is Cloud Run suitable?
Certainly. The biggest thing Cloud Run will get you is complete control over your runtime environment, since you're specifying the details of that environment in addition to the code.
The biggest downside, however, is that you also have to specify the details of that environment yourself. Cloud Functions provides an HTTPS server without you having to worry about those details, as long as the rest of the environment is suitable.
What other options do I have?
Anywhere you want! Dialogflow only requires that your webhook:
Be at a public address (i.e., one that Google can resolve and reach)
Run an HTTPS server at that address with a non-self-signed certificate
During testing, it is common to run it on your own machine via a tunnel such as ngrok, but this isn't a good idea in production. If you're already familiar with running an HTTPS server in another environment, and you wish to continue using that environment, you should be fine.
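As one concrete illustration of those two requirements, here is a minimal Flask server that could back the webhook on Cloud Run (where Google terminates TLS and gives you the public address) or locally behind an ngrok tunnel; the route and reply text are placeholders:

```python
# Sketch of a minimal webhook server. On Cloud Run, HTTPS and the public
# address are handled for you; locally, a tunnel such as ngrok provides them.
import os
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/webhook", methods=["POST"])  # the path you register in Dialogflow
def webhook():
    body = request.get_json(silent=True) or {}
    user_text = body.get("queryResult", {}).get("queryText", "")
    return jsonify({"fulfillmentText": f"You said: {user_text}"})

if __name__ == "__main__":
    # Cloud Run injects PORT; default to 8080 for local testing.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```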

Separate URL for each git branch in Cloud Run

I am looking into Cloud Run to host my new app, and I am wondering if it is possible to generate a separate URL for each git branch.
I use Netlify to host my other app. When it is connected to GitHub (or other VCS services), it builds the source code in a branch and deploys it to a URL that is specific to that branch.
Can it be done easily or do I have to write some logic?
Or do you think AWS amplify or some other services are of better fit?
The concept of Cloud Run and URLs is quite simple:
https://<service-name>-<project hash>.<region>.run.app
If your project and region are the same for all the branches, you simply have to deploy a different service for each branch to get a different URL.
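A rough sketch of that idea, deriving the service name from the branch and shelling out to gcloud (the service prefix, image and region are placeholders; in practice this would live in your CI step rather than be run by hand):

```python
# Sketch: deploy one Cloud Run service per git branch so each branch gets its
# own https://<service-name>-<project hash>.<region>.run.app URL.
import re
import subprocess

def deploy_branch(branch: str, image: str, region: str = "us-central1") -> None:
    # Service names must be lowercase letters, digits and dashes.
    service = "myapp-" + re.sub(r"[^a-z0-9-]", "-", branch.lower())
    subprocess.run(
        [
            "gcloud", "run", "deploy", service,
            "--image", image,
            "--region", region,
            "--platform", "managed",
            "--allow-unauthenticated",
        ],
        check=True,
    )

deploy_branch("feature/login-form", "gcr.io/my-project/myapp:feature-login-form")
```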
That was for Cloud Run. Now, I'm not sure whether Netlify can work with Cloud Run; I found no documentation on this.
This answer won't be directly useful to you but I think it's relevant and worth mentioning
The open-source Knative API (and implementation) actually exposes a "tag" feature while splitting traffic between multiple revisions: https://github.com/knative/docs/blob/master/docs/serving/spec/knative-api-specification-1.0.md#traffictarget
This feature is not currently supported on Cloud Run fully managed, but it will be.
By tagging releases this way, you could define tag: v1 and tag: v2 in your traffic configuration, and you would get new URLs like:
https://v1-SERVICE_NAME...run.app
https://v2-SERVICE_NAME...run.app
that directly go to these specific versions.
And the interesting thing is, the revisions you specify in the traffic: block of the Service object do not have to receive any traffic (you can set their traffic percentage to 0), but Cloud Run would still create a domain name like the ones shown above for those inactive revisions of your app.
So once Cloud Run fully managed supports tag fields, you can actually achieve this, although it will be a less out-of-the-box experience than Netlify.
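For reference, a sketch of what such a traffic: block looks like in a Knative Service manifest, built here as a Python dict and dumped to YAML purely for illustration; the service and revision names are placeholders, and the exact fields are in the spec linked above:

```python
# Illustrative only: the shape of a Knative Service "traffic" block.
import yaml  # pip install pyyaml

service = {
    "apiVersion": "serving.knative.dev/v1",
    "kind": "Service",
    "metadata": {"name": "my-service"},
    "spec": {
        "traffic": [
            # 100% of live traffic goes to the latest revision...
            {"latestRevision": True, "percent": 100},
            # ...while tagged revisions get 0% traffic but still receive
            # their own addressable URLs, as described above.
            {"revisionName": "my-service-rev-v1", "tag": "v1", "percent": 0},
            {"revisionName": "my-service-rev-v2", "tag": "v2", "percent": 0},
        ]
    },
}

print(yaml.safe_dump(service, sort_keys=False))
```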

Running a background task involving a third-party service in Django

The use case is this: we need to pull in data from a third-party service and update the database with fresh records every week. The different ways I have been able to explore are
either creating a custom django-admin command
or running a background task using Celery (and probably ELK for logging)
I just want to know which way is more feasible and simpler, and whether there's another way I can explore. What I want is to monitor the task for the first few runs and then just rely on the logs.
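A minimal sketch of the first option (a custom management command that could be run weekly from cron or a scheduler); the sync app, ThirdPartyClient and Record model are hypothetical placeholders:

```python
# sync/management/commands/pull_records.py -- sketch of a custom django-admin
# command. "ThirdPartyClient" and "Record" stand in for the real service and model.
import logging

from django.core.management.base import BaseCommand

from sync.clients import ThirdPartyClient   # hypothetical API wrapper
from sync.models import Record              # hypothetical model

logger = logging.getLogger(__name__)

class Command(BaseCommand):
    help = "Pull fresh records from the third-party service and upsert them."

    def handle(self, *args, **options):
        client = ThirdPartyClient()
        updated = 0
        for item in client.fetch_records():
            Record.objects.update_or_create(
                external_id=item["id"],
                defaults={"payload": item},
            )
            updated += 1
        logger.info("Synced %s records", updated)
        self.stdout.write(self.style.SUCCESS(f"Synced {updated} records"))
```

You would then invoke it with python manage.py pull_records on whatever weekly schedule you prefer, and its logging gives you something to watch during the first few runs.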

aws alternative console/management web ui (stop/start/create/list)

I've been searching for the last few days for an alternative to the AWS Console web UI.
We want to give this to our employees so they can manage their development/test environments themselves, without intervention from the IT team.
What we want is an extra web UI that handles the following "management tasks":
List all instances (maybe based on tags)
Stop / Start instances
Create / Destroy instances (from specific AMIs)
Also, we would like to have logs + authentication (preferably LDAP).
I found a few, but none of them was actually that simple.
We would also prefer a Django/Python-based application, but Sinatra is also fine.
Alternatives that I found:
Asgard from Netflix
Spurios (has no EC2 instances)
We also found Flask AppBuilder to build our own app, but it would be nice if some things already existed. I believe many companies want the same thing, but are they keeping it for internal use only?
Maybe you know of more projects that I, for some reason, did not stumble upon.
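Behind whatever UI ends up in front of it, the operations on that wish list map to a handful of boto3 calls; a sketch, with the tag key, AMI ID and instance type as placeholders:

```python
# Sketch of the EC2 operations such a UI would wrap (list/stop/start/create/
# destroy). The "Team" tag, AMI ID and instance type are placeholders.
import boto3

ec2 = boto3.client("ec2")

def list_instances(team: str) -> list[dict]:
    """List instances carrying a given 'Team' tag."""
    paginator = ec2.get_paginator("describe_instances")
    instances = []
    for page in paginator.paginate(Filters=[{"Name": "tag:Team", "Values": [team]}]):
        for reservation in page["Reservations"]:
            instances.extend(reservation["Instances"])
    return instances

def stop(instance_ids: list[str]) -> None:
    ec2.stop_instances(InstanceIds=instance_ids)

def start(instance_ids: list[str]) -> None:
    ec2.start_instances(InstanceIds=instance_ids)

def create_from_ami(ami_id: str, team: str) -> str:
    response = ec2.run_instances(
        ImageId=ami_id,
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "Team", "Value": team}],
        }],
    )
    return response["Instances"][0]["InstanceId"]

def destroy(instance_ids: list[str]) -> None:
    ec2.terminate_instances(InstanceIds=instance_ids)
```

Each call maps directly onto one of the listed tasks, so a small Django or Flask view layer (plus LDAP authentication and logging) on top of them would cover the requirements above.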