I'm writing a Cloud Function that processes a Pub/Sub message containing a gRPC message. I would like the Cloud Function, at install/(re)deploy time, to perform some action: pull the protobuf definition from some GitHub repository and generate the corresponding Python code with grpcio-tools, roughly along the same lines as this.
So far I can only find documentation on how to add dependencies; however, I'm looking for some sort of on-install "hook": something that would allow me to perform some actions before the function is actually deployed.
Is this possible? Any advice?
Cloud Functions has no such hook. You can perform the work in a script on the machine that performs the deployment. It's not uncommon to write scripts to automate work like this.
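If it helps, here is a minimal sketch of such a pre-deploy script in Python, run on the deploying machine before gcloud is invoked. The repository URL, .proto file name, topic, and function name are all placeholders; adjust the gcloud flags to your runtime and trigger.

    # predeploy.py -- run on the machine that performs the deployment.
    # Placeholders: repo URL, proto file, function name, topic.
    import os
    import subprocess
    from grpc_tools import protoc  # pip install grpcio-tools

    PROTO_REPO = "https://github.com/your-org/your-protos.git"  # placeholder
    PROTO_DIR = "protos"
    OUT_DIR = "generated"

    # 1. Pull the protobuf definitions from GitHub.
    subprocess.run(["git", "clone", "--depth", "1", PROTO_REPO, PROTO_DIR], check=True)

    # 2. Generate the Python code with grpcio-tools.
    os.makedirs(OUT_DIR, exist_ok=True)
    protoc.main([
        "protoc",
        f"-I{PROTO_DIR}",
        f"--python_out={OUT_DIR}",
        f"--grpc_python_out={OUT_DIR}",
        f"{PROTO_DIR}/my_message.proto",  # placeholder .proto file
    ])

    # 3. Deploy the function, bundling the generated code with the source.
    subprocess.run([
        "gcloud", "functions", "deploy", "my-function",   # placeholder name
        "--runtime", "python39",
        "--trigger-topic", "my-topic",                     # placeholder topic
        "--entry-point", "handle_message",
    ], check=True)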
I am new to AWS.
I need to create a Lambda function in AWS, but before that I need to review some code from previously created functions. However, when I try to review a function's code, there is a message:
The deployment package of your Lambda function "tes-GetInfo" is too large to enable inline code editing. However, you can still invoke your function.
Does anyone know if it is possible to somehow review it in AWS?
I have looked around a lot but still haven't found a way to do it.
You can download your function code by exporting it, assuming your function was developed in some interpreted language like JavaScript/Python.
This can be done by exporting the function:
Go to your function and in the Actions dropdown select Export function:
Choose Download deployment package.
This will download the deployed function locally and you will be able to investigate your code.
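If you prefer to script it instead of using the console, here is a small sketch with boto3 (the function name comes from the error message above; the output file name is arbitrary):

    # Sketch: download a Lambda deployment package programmatically.
    # Assumes boto3 is installed and AWS credentials are configured.
    import boto3
    import urllib.request

    client = boto3.client("lambda")
    response = client.get_function(FunctionName="tes-GetInfo")

    # Code.Location is a short-lived pre-signed URL for the package.
    url = response["Code"]["Location"]
    urllib.request.urlretrieve(url, "tes-GetInfo.zip")
    print("Deployment package saved to tes-GetInfo.zip")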
I don't know much about web development and cloud computing. From what I've read, when using Cloud Functions as the webhook service for Dialogflow, you are limited to writing code in just one source file. I would like to create a really complex Dialogflow agent, so it would be handy to have an organized code structure to make development easier.
I've recently discovered Cloud Run, which seems like it can also handle webhook requests and makes it possible to develop a complex code structure.
I don't want to use Cloud Run just because it is inconvenient to write everything in one file, but on the other hand it would be strange to have a Cloud Function with a single file containing thousands of lines of code.
Is it possible to have multiple files in a single Cloud Function?
Is Cloud Run suitable for my problem (creating a complex Dialogflow agent)?
Is it possible to have multiple files in a single Cloud Function?
Yes. When you deploy to Google Cloud Functions you create a bundle with all your source files or have it pull from a source repository.
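As an illustration with the Python runtime (module and function names below are hypothetical), everything in the deployed directory is uploaded, and the entry-point file can import from sibling modules as usual:

    # Layout of the deployed directory (hypothetical):
    #   main.py                 <- contains the entry point named at deploy time
    #   intents/__init__.py
    #   intents/weather.py      <- helper module
    #   requirements.txt

    # main.py
    from intents import weather  # regular import from a sibling package

    def dialogflow_webhook(request):
        """HTTP Cloud Function acting as the Dialogflow webhook."""
        body = request.get_json(silent=True) or {}
        reply = weather.handle(body)  # hypothetical helper
        return {"fulfillmentText": reply}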
But Dialogflow only allows index.js and package.json in the Built-In Editor
For simplicity, the built-in code editor only allows you to edit those two files. But the built-in editor is mostly just meant for basic testing. If you're doing serious coding, you probably already have an environment you prefer to use to code and deploy that code.
Is Cloud Run suitable?
Certainly. The biggest thing Cloud Run will get you is complete control over your runtime environment, since you're specifying the details of that environment in addition to the code.
The biggest downside, however, is that you also have to determine the details of that environment. Cloud Functions provides an HTTPS server without you having to worry about those details, as long as the rest of the environment is suitable.
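To illustrate the difference, on Cloud Run you bring the HTTP server yourself; a minimal sketch in Python with Flask (Cloud Run supplies the PORT environment variable and terminates TLS for you):

    # Sketch of a minimal Cloud Run webhook service (Flask assumed).
    import os
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    @app.route("/webhook", methods=["POST"])
    def webhook():
        body = request.get_json(silent=True) or {}
        # ... route the Dialogflow request to your own modules here ...
        return jsonify({"fulfillmentText": "Hello from Cloud Run"})

    if __name__ == "__main__":
        # Cloud Run passes the listening port via the PORT environment variable.
        app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))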
What other options do I have?
Anywhere you want! Dialogflow only requires that your webhook:
Be at a public address (i.e., one that Google can resolve and reach)
Run an HTTPS server at that address with a certificate that is not self-signed
During testing, it is common to run it on your own machine via a tunnel such as ngrok, but this isn't a good idea in production. If you're already familiar with running an HTTPS server in another environment, and you wish to continue using that environment, you should be fine.
I once created Firebase Cloud Functions in multiple files, deployed them, and it was pretty straightforward.
Now I am creating Google Cloud Functions with Node.js, without Firebase, and I am facing issues with organizing multiple files and deploying multiple functions.
I have looked at many links on Google; it seems I am either missing some basic concept or not finding the right resource.
So far, everything I have found demonstrates only a single function written in an index.js file and deployed by specifying the function name.
Any help is appreciated.
Thanks
When you deploy functions with gcloud, you can only deploy one function per invocation of the CLI. It doesn't work at all like the Firebase CLI, where you can collect a bunch of functions into a single file for simultaneous deployment. If you want to deploy multiple functions with gcloud, you will just have to run a command for each function. If you want to deploy them all together, I suggest creating some sort of shell or batch script that runs the commands in one go.
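As a sketch of that kind of script (Python used here just as the wrapper; function names, entry points, runtime, and trigger flags are placeholders to adapt):

    # deploy_all.py -- one `gcloud functions deploy` call per function.
    import subprocess

    FUNCTIONS = [
        {"name": "create-user", "entry_point": "createUser"},   # placeholders
        {"name": "delete-user", "entry_point": "deleteUser"},
    ]

    for fn in FUNCTIONS:
        subprocess.run([
            "gcloud", "functions", "deploy", fn["name"],
            "--entry-point", fn["entry_point"],
            "--runtime", "nodejs16",
            "--trigger-http",
            "--source", ".",
        ], check=True)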
I am looking to dynamically create cron jobs that are created and configured using the request parameters sent by a Cloud Function or a normal HTTP request.
There is already a manual way, by visiting the Google Cloud console, but I want to automate this manual task by configuring and creating jobs according to the request parameters.
I am already aware that we can provide a cron.yaml file that holds all the configuration, but I need some help or a reference that describes in detail how to achieve this.
I am also a beginner, so please correct me or suggest any alternative solution.
You'll want to use the Cloud Scheduler API. Specifically, this is a REST API that lets you do everything you could do via the console or the gcloud command.
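As a rough sketch with the Python client library (the google-cloud-scheduler package is assumed; project, location, job name, URL, and schedule would come from your request parameters):

    # Sketch: create a Cloud Scheduler job that hits an HTTP endpoint.
    from google.cloud import scheduler_v1

    def create_http_job(project, location, name, url, schedule):
        client = scheduler_v1.CloudSchedulerClient()
        parent = f"projects/{project}/locations/{location}"
        job = scheduler_v1.Job(
            name=f"{parent}/jobs/{name}",
            schedule=schedule,          # cron syntax, e.g. "0 2 * * *"
            time_zone="UTC",
            http_target=scheduler_v1.HttpTarget(
                uri=url,
                http_method=scheduler_v1.HttpMethod.POST,
            ),
        )
        return client.create_job(parent=parent, job=job)

    # e.g. create_http_job("my-project", "us-central1", "nightly-report",
    #                      "https://example.com/run", "0 2 * * *")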
I have many scripts that I use to manage a multi-server infrastructure. Some of these scripts require root access, some require access to databases, and most of them are Perl based. I would like to convert all these scripts into very simple web services that can be executed from different applications. These web services would take regular request inputs and would output JSON as the result of being executed. I'm thinking that I should set up a simple Perl dispatcher, call it action, that would handle logging, check credentials, and execute these simple scripts. Something like:
http://host/action/update-dns?server=www.google.com&ip=192.168.1.1
This would invoke the action Perl driver, which in turn would call the update-dns script with the appropriate parameters (perhaps cleaned in some way) and return an appropriate JSON response. I would like this infrastructure to have the following attributes:
All scripts reside in a single place. If a new script is dropped there, then it automatically becomes callable.
All scripts need to have some form of manifest that describes who can call them (membership in some LDAP group), what parameters they take, what the response is, etc., so that each script is self-describing.
All scripts are logged in terms of who did what and what was the response.
It would be great if there were a command-line way to do something like # action update-dns --server=www.google.com --ip=192.168.1.1
Do I have to build this from scratch, or is there something existing that I can piggyback on?
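To make the idea concrete, here is a very rough sketch of the dispatcher I have in mind (Python just for illustration; the script directory, manifest format, and auth/logging hooks are all hypothetical):

    # Sketch of the "action" dispatcher idea (illustration only).
    import json
    import subprocess
    from pathlib import Path
    from flask import Flask, request, jsonify

    SCRIPT_DIR = Path("/opt/actions")   # hypothetical: drop scripts here
    app = Flask(__name__)

    @app.route("/action/<name>")
    def action(name):
        script = SCRIPT_DIR / name
        manifest = json.loads((SCRIPT_DIR / f"{name}.manifest.json").read_text())
        # ... check the caller against manifest["allowed_groups"] (LDAP) here ...
        args = [f"--{k}={v}" for k, v in request.args.items()
                if k in manifest.get("parameters", [])]
        result = subprocess.run([str(script), *args], capture_output=True, text=True)
        # ... log who called what and what the response was ...
        return jsonify({"exit_code": result.returncode, "output": result.stdout})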
You might want to check out my framework Sub::Spec. The documentation is still sparse, but I'm already using it for several projects, including for my other modules in CPAN.
The idea is that you write your code as functions, decorate/add enough metadata to these functions (including a summary, a specification of arguments, etc.), and there will be toolchains to take care of what you need, e.g. running your functions on the command line (using Sub::Spec::CmdLine) and over HTTP (using Sub::Spec::HTTP::Server and Sub::Spec::HTTP::Client).
There is a sample project in its infancy. Also take a look at http://gudangapi.com/. For example, the function GudangAPI::API::finance::currency::id::bca::get_bca_exchange_rate() will be accessible as an API function via HTTP API.
Contact me if you are interested in deploying something like this.