Re-deploy AWS CloudWatch Logs with serverless framework

I deployed an AWS service using the serverless framework, which created and deployed all the resources automatically.
But later on I accidentally deleted the log group for this service, and no logs are being created anymore.
Is there a way to recreate / redeploy just the log-group resource? Running serverless remove followed by serverless deploy would delete all existing resources and create new ones, which would be bad in my case.
Any ideas?

Comment out the function in serverless.yml, deploy, uncomment the function and deploy again. As far as I can tell, serverless tries to remove the log group when you "remove" the function and then create it when you "add" the function.
Also, afaict, you have to actually deploy. You can't just save, you have to do the full stack update for serverless to notice.
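For illustration, a sketch of that workaround in serverless.yml (the function name and handler below are placeholders):

functions:
  # Step 1: comment out the affected function and run 'serverless deploy'
  #         so CloudFormation removes the function and its log group from the stack.
  # Step 2: uncomment it and deploy again so both are recreated.
  brokenFunction:
    handler: handler.broken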

sls deploy -f <function name>
Or you can manually just create the log groups for now
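If you go the manual route, the log group for a Lambda function follows the /aws/lambda/<service>-<stage>-<function> naming convention, so recreating it could look like this (the name below is a placeholder):

aws logs create-log-group --log-group-name /aws/lambda/my-service-dev-hello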

Related

Can we do something like Terraform Plan in Serverless?

When I change something in my lambda repo and redeploy the lambda with the Serverless Framework, it applies the changes. I want to know what changes are going to happen prior to deploying the lambda.
I tried the serverless changeset plugin, but it doesn't show a comparison between my current lambda configuration and the changes that deploying would make after I modify my lambda repo [e.g. the lambda name, tags, etc.].
You can enable change sets with deploymentMethod: changesets so that serverless deploy doesn't actually execute the changes, but instead creates a change set inside CloudFormation, which you can inspect in the console and then execute from there.
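For reference, a sketch of how that setting could sit in serverless.yml (the service name is a placeholder; deploymentMethod goes under provider in Serverless Framework v3+):

service: my-service
provider:
  name: aws
  deploymentMethod: changesets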

Serverless new stage to use existing resources and only deploy the api gateway

I am trying to create new stages for feature-branch testing.
The problem is that when I deploy the new stage with sls deploy -s FEATURE,
it creates everything in the serverless configuration. (I tried to find an option on resources to skip creation if they already exist, but didn't find any.)
When I create a new environment, it fails because the resources I point to already exist, and CloudFormation says they cannot be created.
Thanks!
There is no such option to skip existing resources in Serverless Framework.
One option would be splitting the resources that you want to share into a separately deployed service.
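A common way to do that split (sketch only; service, stack, and resource names are placeholders) is to export the shared resource from the shared service's CloudFormation outputs and reference it from the per-stage services via the cf variable:

# shared-resources/serverless.yml - deployed once, independently of feature stages
service: shared-resources
provider:
  name: aws
resources:
  Resources:
    SharedTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: shared-table
        BillingMode: PAY_PER_REQUEST
        AttributeDefinitions:
          - AttributeName: id
            AttributeType: S
        KeySchema:
          - AttributeName: id
            KeyType: HASH
  Outputs:
    SharedTableName:
      Value:
        Ref: SharedTable

# feature service's serverless.yml - deployed per stage, only references the table
# (assumes the shared service was deployed to the dev stage, so its stack is shared-resources-dev)
provider:
  environment:
    TABLE_NAME: ${cf:shared-resources-dev.SharedTableName}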

Automatically run AWS Glue job when the job is created or updated

I have AWS Glue jobs set up to upload test data to our database. Uploading takes place only one time, and no additional runs are required unless additions or changes are made to the test data. However, we have multiple environments where the upload needs to happen. One way is to deploy the jobs using CDK and manually run them in each environment. I am looking for pointers to automatically trigger a run when the job is either updated or created.
Use a CustomResource to invoke it via command line or the Glue SDK.
It is important to remember that CDK is not a deployment solution - it is an infrastructure as code solution. CDK does not actually do any deployment - the cdk deploy command is just a shortcut for sending the template to CloudFormation.
CDK is just a way to layer code on top of the creation of CloudFormation templates and give developers far more options. All it really does is generate a CloudFormation template - everything else is window dressing.
As such, anything that has to happen after the CloudFormation template is synthed and deployed is not something CDK itself can interact with. You need to make a custom resource that can watch for stack updates and, when the stack is done deploying, trigger whatever else you want.
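A minimal sketch of that custom-resource approach using CDK's AwsCustomResource construct (TypeScript; the stack and job names are placeholders, and it assumes the Glue job itself is defined elsewhere in the same stack):

import { Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import {
  AwsCustomResource,
  AwsCustomResourcePolicy,
  PhysicalResourceId,
} from 'aws-cdk-lib/custom-resources';

export class GlueSeedStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // ...Glue job definition lives here (e.g. a glue.CfnJob named 'seed-test-data')...

    // Custom resource that calls glue:StartJobRun when the stack is created
    // or updated, so the seed job runs right after each deployment.
    new AwsCustomResource(this, 'RunSeedJob', {
      onCreate: {
        service: 'Glue',
        action: 'startJobRun',
        parameters: { JobName: 'seed-test-data' }, // placeholder job name
        physicalResourceId: PhysicalResourceId.of('seed-test-data-initial-run'),
      },
      onUpdate: {
        service: 'Glue',
        action: 'startJobRun',
        parameters: { JobName: 'seed-test-data' },
        // A changing physical id alters the custom resource's properties on every
        // synth, so CloudFormation sends it an update (and re-runs the job) each deployment.
        physicalResourceId: PhysicalResourceId.of(`seed-run-${Date.now()}`),
      },
      policy: AwsCustomResourcePolicy.fromSdkCalls({
        resources: AwsCustomResourcePolicy.ANY_RESOURCE,
      }),
    });
  }
}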
Alternatively, this is a perfect use of CodePipeline - run your CDK in the pipeline (either with separate synth and deploy stages, or a single CodeBuild step that just runs cdk deploy) and then, in a stage after it, have a Lambda that triggers your jobs.

AWS Cloud9: deploy only one Lambda function at a time

I am trying to deploy Lambda functions using AWS Cloud9. When I press deploy, all of my functions are deployed/synced at the same time rather than just the one I selected when deploying. The same thing happens when right-clicking the function and pressing deploy. I find this quite annoying and am wondering if there is any workaround.
When you click deploy Cloud9 runs aws cloudformation package and aws cloudformation deploy on your template.yaml file in the background. (source: I developed the Lambda integration for AWS Cloud9).
Because all your files are bundled into one serverless application and there is only one CloudFormation stack, they can only all be deployed at once with CloudFormation.
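Roughly, that background deploy amounts to something like the following two commands (bucket and stack names are placeholders):

aws cloudformation package --template-file template.yaml --s3-bucket <deployment-bucket> --output-template-file packaged.yaml
aws cloudformation deploy --template-file packaged.yaml --stack-name <stack-name> --capabilities CAPABILITY_IAM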
If you're only making a code change for one function and are not modifying any configuration settings, you could update that one function from the command line with a command like:
zip -r /tmp/function.zip . && aws lambda update-function-code --function-name <function-name> --zip-file fileb:///tmp/function.zip
Run this in the same folder as your template.yaml file, replacing <function-name> with its full generated name, e.g. cloud9-myapp-myfunction-ABCD1234 (you can see the full name under the remote functions list in the AWS Resources panel).
In AWS Cloud9, Lambda functions are created within serverless applications and are therefore deployed via CloudFormation. With CloudFormation, the whole stack is deployed at once, so all functions are deployed together (see this discussion for more info).

A way to automate cloudformation templates deployment

Is there any way to automate CloudFormation template deployment? It would be great if I could just push changes to the code, have something watch for those changes, and, once they appear, deploy the updated template.
Yes, with AWS CodePipeline!
http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/continuous-delivery-codepipeline.html
With AWS CloudFormation and AWS CodePipeline, you can use continuous delivery to automatically build and test changes to your AWS CloudFormation templates [...]
AWS CodePipeline has built-in integration with AWS CloudFormation, so you can specify AWS CloudFormation-specific actions, such as creating, updating, or deleting a stack, within a pipeline.
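As an illustration of such a CloudFormation action (a sketch of a single pipeline stage, not a complete pipeline definition; all names, the role ARN, and the artifact name are placeholders):

- Name: Deploy
  Actions:
    - Name: CreateChangeSet
      ActionTypeId:
        Category: Deploy
        Owner: AWS
        Provider: CloudFormation
        Version: '1'
      Configuration:
        ActionMode: CHANGE_SET_REPLACE
        StackName: my-app-stack
        ChangeSetName: my-app-changeset
        TemplatePath: BuildOutput::packaged.yaml
        Capabilities: CAPABILITY_IAM
        RoleArn: arn:aws:iam::123456789012:role/cfn-deploy-role
      InputArtifacts:
        - Name: BuildOutput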
CodePipeline is a great way to do what you're looking for.
At Giftbit we do this by having a GitHub repo that contains our CloudFormation template.
When we want to make a change, we make the changes on a branch in the repo and create a pull request into the staging branch. CodePipeline monitors the staging branch, then runs a CodeBuild step to validate the templates and package any sub-stacks, then creates a change set and executes it.
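The validate/package part of that CodeBuild step might boil down to commands along these lines (the bucket name is a placeholder):

aws cloudformation validate-template --template-body file://template.yaml
aws cloudformation package --template-file template.yaml --s3-bucket <artifact-bucket> --output-template-file packaged.yaml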
Below are some examples to help quick-start anyone interested:
Continuous Integration CloudFormation Template Example
Serverless Application Model (SAM) that gets deployed