How can I disable an Event Hubs namespace? - azure-eventhub

How can I disable an Event Hubs namespace? I don't want to be charged for this service, because it is only for testing. And I don't want to remove it.

There is no disable option for an Event Hubs namespace. However, you can export the namespace as an ARM template and then delete it. When you need it back, redeploy it from the previously exported template.
See more on export here - https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/export-template-portal
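If you would rather script the round trip instead of using the portal, a rough sketch with the Azure SDK for Python might look like the following. It assumes you already exported template.json via the link above; the subscription, resource group, and namespace values are placeholders, and exact method names can vary between azure-mgmt-* package versions:

```python
# Sketch only: assumes recent azure-identity, azure-mgmt-eventhub and
# azure-mgmt-resource packages. "my-rg", "my-namespace" and template.json
# are placeholders for your own resources and exported ARM template.
import json

from azure.identity import DefaultAzureCredential
from azure.mgmt.eventhub import EventHubManagementClient
from azure.mgmt.resource import ResourceManagementClient

subscription_id = "<subscription-id>"
credential = DefaultAzureCredential()

eventhub_client = EventHubManagementClient(credential, subscription_id)
resource_client = ResourceManagementClient(credential, subscription_id)

# Stop incurring charges: delete the namespace (after exporting its template).
eventhub_client.namespaces.begin_delete("my-rg", "my-namespace").result()

# Later, bring it back by redeploying the previously exported template.
with open("template.json") as f:
    template = json.load(f)

resource_client.deployments.begin_create_or_update(
    "my-rg",
    "restore-eventhub-namespace",
    {"properties": {"mode": "Incremental", "template": template}},
).result()
```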

Related

How to review the code of a previously created function in AWS Lambda

I am new to AWS.
I need to create a Lambda function in AWS, but before that I need to review some code from previously created functions. However, when I try to review a function's code, there's a message:
The deployment package of your Lambda function "tes-GetInfo" is too large to enable inline code editing. However, you can still invoke your function.
Does anyone know if it is somehow possible to review it in AWS?
I have looked around a lot but still haven't found any way to do it.
You can download your function code by exporting it, assuming your function was developed in some interpreted language like JavaScript/Python.
This can be done by exporting the function:
Go to your function and in the Actions dropdown select Export function.
Choose Download deployment package.
This will download the deployed function locally and you will be able to investigate your code.
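If you prefer to script it, the same deployment package can be fetched with boto3: get_function returns a short-lived presigned S3 URL for the package. The snippet below is a sketch using the function name from the question:

```python
# Scripted alternative to the console export. get_function returns a presigned
# S3 URL for the deployment package, valid for a few minutes.
import urllib.request

import boto3

lambda_client = boto3.client("lambda")

response = lambda_client.get_function(FunctionName="tes-GetInfo")
package_url = response["Code"]["Location"]

# Download the zipped deployment package locally for review.
urllib.request.urlretrieve(package_url, "tes-GetInfo.zip")
print("Runtime:", response["Configuration"]["Runtime"])
```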

Is the "dialogflow-fulfillment-nodejs" library still maintained or do I need to switch to the "Dialogflow API: Node.js Client" library?

I noticed on the GitHub page of the "dialogflow-fulfillment-nodejs" library that there are no new updates and many discussions about whether the library will continue. Even in the "README.md" they wrote: "Warning: This library is no longer maintained. It should only be used when using the inline editor."
I've been testing with the Inline Editor in Dialogflow. Because Firebase says that support for Node 8 will end, I changed the Cloud Functions runtime from Node 8 to Node 10, but then I had a lot of problems deploying from the Inline Editor, so I wondered whether the problem is the library itself, which still uses Node 6 in its package.json.
Is the problem that this library still targets Node 6, so deployments stopped working when I switched Cloud Functions to Node 10?
What should I use in my webhook service?
As you can read from the public repository, the library is no longer maintained. However, it also says:
... it should only be used when using the inline editor
Also, when looking at the Dialogflow console under the Fulfillment section, the Inline Editor option, when enabled, states:
Newly created cloud functions now use Node.js 10 as runtime engine.
Check migration guide for more details.
I created my latest Cloud Function recently and can confirm that, looking at the package.json file, the engine version is properly set.
So even though the library is no longer maintained, support within the Inline Editor remains available, and I don't see anything about it being deprecated in their documentation. My conclusion is that you can use it with confidence.
Finally, regarding your issue deploying the Cloud Function using the Inline Editor, it may be caused by something else. My guess is that you, or someone with the required permissions, made a change to the Cloud Function directly rather than through the Inline Editor, thus falling into a scenario mentioned in the limitations section, which states the following:
If you modify your code with the Cloud Functions console, you can no longer use the inline editor to modify your code. Your function will continue to provide fulfillment for your agent, but future edits must be made in the Cloud Functions console.
If you would like to keep using the Inline Editor to deploy your future changes, I suggest you back up your Cloud Function and create a new one using the Inline Editor (for that you may need to disable the Inline Editor and manually remove the previously created Cloud Function; remember to back up your code and configuration).
The Dialogflow API: Node.js Client is not a library for use in the fulfillment webhook. It is meant to be used as a client that calls Dialogflow to either build/edit agents or submit content to determine a matching Intent.
For webhooks, you're expected to parse the JSON yourself and send validly formatted JSON as part of the response. While the dialogflow-fulfillment-nodejs library isn't deprecated, as noted, it also isn't maintained, so if Dialogflow ES ever does get updates, the library likely will not. There are third-party libraries such as multivocal that are being worked on to provide fulfillment, and these can work in the inline editor.
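To make the "parse the JSON yourself" part concrete, here is a minimal webhook sketch. It uses Python/Flask purely for illustration (the inline editor itself runs Node.js), and the intent name and reply text are made up; the only Dialogflow ES conventions it relies on are reading queryResult from the request and returning fulfillmentText in the response:

```python
# Minimal Dialogflow ES webhook sketch (Flask used for illustration only).
# The intent name "get.weather" and the replies are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)


@app.route("/webhook", methods=["POST"])
def webhook():
    body = request.get_json(force=True)
    intent = body["queryResult"]["intent"]["displayName"]
    params = body["queryResult"].get("parameters", {})

    if intent == "get.weather":  # hypothetical intent
        reply = f"Looking up the weather for {params.get('city', 'your city')}."
    else:
        reply = "Sorry, I can't handle that intent yet."

    # Dialogflow ES only needs a JSON body with fulfillmentText
    # (optionally fulfillmentMessages, outputContexts, etc.).
    return jsonify({"fulfillmentText": reply})


if __name__ == "__main__":
    app.run(port=8080)
```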

What is the proper way to build many Lambda functions and update them later?

I want to make a bot that makes other bots on the Telegram platform. I want to use AWS infrastructure; their Lambda functions look like a perfect fit, since you pay for them only when they are active. In my concept, each bot equals one Lambda function, and they all share the same codebase.
At first, I thought about creating each new Lambda function programmatically, but I think this will bring me problems later, like needing to attach many services programmatically via the AWS SDK: API Gateway, DynamoDB. But the main problem is: how will I update the codebase for these 1000+ functions later? I think a bash script is a bad idea here.
So I moved on and found SAM (AWS Serverless Application Model) and CloudFormation, which should help me, I guess. But I can't understand the concept. I can make a stack with all the required resources, but how will I make new bots from this one stack? Or should I build a template and create new stacks for each new bot programmatically via the AWS SDK from this template?
Next, how do I update them later? For example, I want to update all bots that have version 1.1 to version 1.2. How will I replace them? Should I make a new stack, or can I update the older ones? I don't see any options for that in the CloudFormation UI or any related methods in the AWS SDK.
Thanks
But the main problem is: how will I update the codebase for these 1000+ functions later?
You don't. You use a Lambda alias. This allows you to fully decouple your Lambda versions from your clients. This works because your clients' code (or API Gateway) references an alias of your function. The alias is fixed and does not change.
However, an alias is like a pointer: it can point to different versions of your Lambda function. Therefore, when you publish a new Lambda version, you just point the alias to it. It's fully transparent to your clients, and the alias they use does not require any change.
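As a rough illustration of that flow with boto3 (the function and alias names below are placeholders, not anything from your setup):

```python
# Sketch of the version/alias flow with boto3.
import boto3

lambda_client = boto3.client("lambda")

# 1. Publish the currently deployed code as an immutable version.
version = lambda_client.publish_version(
    FunctionName="telegram-bot",
    Description="bots v1.2",
)["Version"]

# 2. Repoint the alias that clients/API Gateway invoke (e.g. "live") to it.
lambda_client.update_alias(
    FunctionName="telegram-bot",
    Name="live",
    FunctionVersion=version,
)
# Clients keep invoking ...:function:telegram-bot:live and never change.
```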
I agree with @Marcin. It would also be worth checking out the Serverless Framework. It seems like you are still experimenting, so most likely you are deploying using bash scripts with AWS SDK/SAM commands. This is fine, but once you start getting the gist of what your architecture looks like, I think you will appreciate what Serverless can offer. You can deploy/tear down CloudFormation stacks in a matter of seconds. You can also use serverless-offline to run a local build of your AWS Lambda architecture on your own machine.
All this has saved me hours of grunt work.

AWS Cloudformation - How to manually add/delete an export?

Is there a way to add/delete an export manually, say via the console? I could not find any info regarding this.
No. Everything in CloudFormation is controlled via the template file.
You would need to edit the template to add/remove the export, then update the stack to apply the change.
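As a sketch of what that looks like in practice: the export lives in the template's Outputs section, and you push the edited template with an update-stack call. Everything below (stack name, logical IDs, export name) is made up for illustration:

```python
# Illustrative only: the template, stack name and export name are hypothetical.
# Add or remove the Export block in Outputs, then update the stack with the
# modified template.
import boto3

template_body = """
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  MyBucket:
    Type: AWS::S3::Bucket
Outputs:
  MyBucketName:
    Value: !Ref MyBucket
    Export:
      Name: shared-bucket-name   # delete this block (and update) to remove the export
"""

cloudformation = boto3.client("cloudformation")
cloudformation.update_stack(
    StackName="my-stack",
    TemplateBody=template_body,
)
```

Note that CloudFormation will refuse to remove an export while another stack still imports it via Fn::ImportValue.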

Force Discard AWS Lambda Container

How can I manually force AWS Lambda to discard a function's existing containers, using the AWS console or AWS CLI, for development and testing purposes?
If you redeploy the function, it'll terminate all existing containers. It could be as simple as assigning the current date/time to the description of the Lambda function and redeploying. This lets you redeploy as many times as you need, because something in the configuration is always unique, and it will tear down all existing containers each time you do the deployment.
With that said, Lambda functions are supposed to be stateless. You should keep that in mind when you write your code (e.g., avoid relying on global variables, use random file names if creating something temporary, etc.). From the sounds of things, I think you might have an issue with your design if you require the Lambda container to be torn down.
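For completeness, the "assign the current date/time to the description and redeploy" approach above can be scripted with boto3; the function name is a placeholder:

```python
# Sketch: force Lambda to create fresh execution environments by changing the
# function configuration (function name is a placeholder).
import datetime

import boto3

lambda_client = boto3.client("lambda")

lambda_client.update_function_configuration(
    FunctionName="my-function",
    Description=f"force refresh {datetime.datetime.utcnow().isoformat()}",
)
# Subsequent invocations run in new containers once the update completes.
```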
If you're using the UI, then a simple way to do this is to add or alter an environment variable on the function configuration page.
When you click "Save" the function will be reloaded.
Note: this won't work if you're using the versioned functions feature.