google cloud function in multiple files without firebase - google-cloud-platform

I once created Firebase Cloud Functions in multiple files and deployed them, and it was pretty straightforward.
Now I am creating Google Cloud Functions with Node.js without Firebase, and I am having trouble organizing multiple files and deploying multiple functions.
I have looked through many links on Google; it seems either I am missing some basic concept or I haven't found the right resource.
Everything I have found so far demonstrates only a single function written in an index.js file, deployed by specifying the function name.
Any help is appreciated.
Thanks

When you deploy functions with gcloud, you can only deploy one function per invocation of the CLI. It doesn't work at all like the Firebase CLI, where you can collect a bunch of functions into a single file for simultaneous deployment. If you want to deploy multiple functions with gcloud, you are just going to have to run a command for each function. If you'd like to deploy them all together, I suggest creating some sort of shell or batch script that runs the commands one after another.
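
For example, a small shell script along these lines could handle it (the function names, runtime, and trigger below are placeholders, not anything mandated by gcloud):

    #!/usr/bin/env bash
    # Deploy several HTTP functions from the same source directory, one gcloud call per function.
    # Function/entry-point names and runtime are placeholders; adjust to your project.
    set -euo pipefail

    for fn in createUser deleteUser listUsers; do
      gcloud functions deploy "$fn" \
        --runtime nodejs18 \
        --trigger-http \
        --entry-point "$fn" \
        --source .
    done

Each function can then live in its own source file and be re-exported from index.js, while the script deploys them one by one.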

Related

AWS CloudFormation nested stack templates - how to handle local development and versioning?

I just started a simple AWS serverless project to test it out, so I'm developing locally and hosting the project on GitLab.
I want to try nested stacks just to split the current template file into smaller pieces, but the TemplateURL property value must be a URL to a template file located in an S3 bucket, so I can't simply move my stack resources to another local YAML file and include it in the parent one.
Manually uploading the nested stack template files to an S3 bucket and then running sam sync from my console looks too intricate IMHO, and setting up a pipeline that takes care of the whole process also looks like too much work for a simple personal learning project.
The fastest solution seems to be replacing a deployment pipeline with a script that can be run locally.
I know AWS cloud services are meant for enterprise-grade projects, but I'm wondering if there is a simpler, built-in/official way to handle all of this.
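
As a rough sketch of such a local script, assuming the aws cloudformation package command is used to upload the nested templates (the bucket and stack names below are placeholders):

    #!/usr/bin/env bash
    # Upload local nested-stack templates to S3, rewrite the TemplateURL references
    # in the parent template, then deploy the packaged result.
    # Bucket and stack names are placeholders.
    set -euo pipefail

    aws cloudformation package \
      --template-file template.yaml \
      --s3-bucket my-artifacts-bucket \
      --output-template-file packaged.yaml

    aws cloudformation deploy \
      --template-file packaged.yaml \
      --stack-name my-learning-stack \
      --capabilities CAPABILITY_IAM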

Is a single serverless.yml enough for all Lambda functions?

I am new to the Serverless Framework; I used to create and write Lambda code through the AWS console. But I am a little confused about the Serverless Framework project structure. Should I create a serverless.yml for each Lambda function, or can I use a single YML file for all my AWS Lambda functions? I don't know which is the best way to start, because each API Gateway endpoint will point to a different Lambda function. Please suggest the best way to start.
My few cents here, based on some experience over the last 6 months. I started with a single Git repo containing all the Lambda functions, each in its own folder. During the first 2 months, I had a single YML file defining all the functions.
Then, after some issues, I moved to a separate YML file inside each function's folder.
This lets you use plugins specific to each function. In my case, I had a Node.js GraphQL function for which I used Webpack to compress the package.
Webpack didn't work for my other function, which uses a pre-built binary that runs specifically in AWS.
Not all functions need to be deployed every time, so managing them in separate folders gave a lot of flexibility in CI/CD and deployment version management. Otherwise, every release would redeploy all functions. A sketch of this layout is below.
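
Something like the following layout, where each folder is its own Serverless service (folder and service names here are invented for the example):

    # Hypothetical repo layout: one Serverless service per function folder.
    #
    # functions/
    #   graphql-api/
    #     serverless.yml    # its own service; can use serverless-webpack
    #     handler.js
    #   binary-worker/
    #     serverless.yml    # separate service; no webpack plugin
    #     handler.py
    #
    # Deploy only the service that changed:
    cd functions/graphql-api && sls deploy --stage production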

Using Cloud Functions vs Cloud Run as a webhook for Dialogflow

I don't know much about web development and cloud computing. From what I've read, when using Cloud Functions as the webhook service for Dialogflow, you are limited to writing code in just one source file. I would like to create a really complex Dialogflow agent, so it would be handy to have an organized code structure to make development easier.
I've recently discovered Cloud Run, which seems like it can also handle webhook requests and makes it possible to develop a more complex code structure.
I don't want to use Cloud Run just because it is inconvenient to write everything in one file, but on the other hand it would be strange to have a Cloud Function consisting of a single file with thousands of lines of code.
Is it possible to have multiple files in a single cloud function?
Is cloud run suitable for my problem? (create a complex dialogflow agent)
Is it possible to have multiple files in a single cloud function?
Yes. When you deploy to Google Cloud Functions you create a bundle with all your source files or have it pull from a source repository.
But Dialogflow only allows index.js and package.json in the Built-In Editor
For simplicity, the built-in code editor only allows you to edit those two files. But the built-in editor is mostly just meant for basic testing. If you're doing serious coding, you probably already have an environment you prefer to use to code and deploy that code.
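
As an illustration, a multi-file HTTP function for a Dialogflow webhook might be laid out like this; the file layout and names (dialogflowWebhook, intents.js, handleIntent) are invented for the example:

    // index.js — entry point of the deployed function; the deploy command would
    // reference "dialogflowWebhook" as the entry point.
    const { handleIntent } = require('./intents'); // a second source file in the same bundle

    exports.dialogflowWebhook = (req, res) => {
      // Dialogflow v2 sends the matched intent under queryResult.intent.displayName.
      const intentName = req.body.queryResult.intent.displayName;
      res.json({ fulfillmentText: handleIntent(intentName) });
    };

    // intents.js — helper module deployed alongside index.js:
    // exports.handleIntent = (intentName) => `You asked for: ${intentName}`;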
Is Cloud Run suitable?
Certainly. The biggest thing Cloud Run will get you is complete control over your runtime environment, since you're specifying the details of that environment in addition to the code.
The biggest downside, however, is that you also have to determine the details of that environment. Cloud Functions provides an HTTPS server without you having to worry about those details, as long as the rest of the environment is suitable.
What other options do I have?
You can host it anywhere you want! Dialogflow only requires that your webhook:
Is at a public address (i.e., one that Google can resolve and reach)
Runs an HTTPS server at that address with a certificate that is not self-signed
During testing, it is common to run it on your own machine via a tunnel such as ngrok, but this isn't a good idea in production. If you're already familiar with running an HTTPS server in another environment, and you wish to continue using that environment, you should be fine.
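
For example, local testing is often as simple as something like this (the port is arbitrary and assumes your webhook server is listening on it):

    # Expose a local webhook server on port 8080 through an HTTPS tunnel;
    # the https URL that ngrok prints can be set as the Dialogflow fulfillment URL.
    ngrok http 8080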

Perform action on Google Cloud Function install/setup

I'm writing a Cloud Function that processes a Pub/Sub message containing a gRPC message. I would like the Cloud Function, at install/(re)deploy time, to perform some actions: pull the protobuf definition from a GitHub repository and generate the corresponding Python code with grpcio-tools, roughly along the lines of this.
So far I have only found how to add dependencies in the documentation; however, I'm looking for some sort of on-install "hook": something that would allow me to perform some actions before the function is actually deployed.
Is this possible? Any advice?
Cloud Functions has no such hook. You can perform the work in a script on the machine that performs the deployment. It's not uncommon to write scripts to automate work like this.
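
A minimal sketch of such a pre-deploy script, assuming a Python function (the repository URL, proto file, topic, and function/entry-point names are placeholders):

    #!/usr/bin/env bash
    # Pull the protobuf definitions and generate Python gRPC code, then deploy the function.
    # All names below are placeholders.
    set -euo pipefail

    git clone https://github.com/example-org/example-protos.git /tmp/protos
    python -m grpc_tools.protoc \
      -I /tmp/protos \
      --python_out=. \
      --grpc_python_out=. \
      /tmp/protos/message.proto

    gcloud functions deploy my-pubsub-function \
      --runtime python310 \
      --trigger-topic my-topic \
      --entry-point handle_message

The generated _pb2.py files land in the function's source directory, so they are included in the bundle that gcloud uploads.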

How to avoid deployment of all five functions in a Serverless Framework service if only one function is changed

I have a Serverless Framework service with (say) five AWS Lambda functions written in Python. Using GitHub, I have created a CodePipeline for CI/CD.
When I push code changes, it deploys all the functions, even if only one function has changed.
I want to avoid deploying all the functions; the CI/CD should determine which function changed and deploy only that one. The rest of the functions should not be deployed again.
Moreover, is there any way to deal with such problems using AWS SAM? At this stage I still have the option to switch to SAM and quit the Serverless Framework.
Unfortunately there is no "native" way to do it. You would need to write a bash script that loops through the changed files and calls sls deploy function -s production -f <function> for each of them, along the lines of the sketch below.
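
A rough sketch of what such a script could look like; the mapping from changed paths to function names (a functions/<name>/ folder per function) is hypothetical and would need to match your repo layout:

    #!/usr/bin/env bash
    # Deploy only the functions whose source changed in the last commit.
    # Assumes each function lives under functions/<name>/ ; adjust to your layout.
    set -euo pipefail

    changed_functions=$(git diff --name-only HEAD~1 HEAD \
      | grep '^functions/' \
      | cut -d/ -f2 \
      | sort -u)

    for fn in $changed_functions; do
      sls deploy function -s production -f "$fn"
    done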
I also faced this issue, and eventually it drove me to create an alternative.
Rocketsam takes advantage of sam local to allow deploying only changed functions instead of the entire microservice.
It also supports other cool features such as:
Fetching live logs for each function
Sharing code between functions
Template per function instead of one big template file
Hope it solves your issue :)