How would I make a webhook (which I'll be using from my CMS) to invoke my CodePipeline?
I've got a simple CodePipeline setup that looks like this:
Source (GitHub) -> AWS CodeBuild -> AWS S3
It's a Node app that builds an application based on resources from a CMS.
At the moment, only committing to the GitHub repository invokes CodePipeline to rebuild my app. However, I want a webhook or URL trigger that invokes the CodePipeline to rebuild the app when certain changes occur in the CMS.
How do I create a webhook to invoke CodePipeline?
I'm not sure if it's the cleanest way, but you could create an API Gateway endpoint that invokes a Lambda. In the Lambda you could use one of the AWS SDKs to start the pipeline. For example, in Python with boto3 you would call start_pipeline_execution() on the CodePipeline client (or start_build() on the CodeBuild client if you only need the build step).
That is just the first thought I had.
Outside of an SDK or the CLI, I think pipeline source stages only work with ECR, S3, CodeCommit, Bitbucket, and GitHub.
Related
I want to write a script/serverless script that, when run, adds a trigger to a specified Lambda function.
I can deploy a Lambda function using a script, and I have added a trigger by going to the AWS console and clicking the options there, also adding a POST method to trigger the Lambda.
I want to do this using a piece of code/script that can run on the AWS CLI or anywhere.
Is it possible to do that?
Instead of using the CLI or scripts, you can use the AWS CDK to deploy infrastructure as code.
For creating and deploying a Lambda with API Gateway, check this example.
You can get familiar with the AWS CDK from this workshop.
There are many other resources available on the internet as well.
I need my AWS CodePipeline to run only when certain files are checked in to a certain branch of my Bitbucket repository.
I managed to achieve this with AWS CodeBuild, but as I understand it, it's not as simple with CodePipeline.
I understand I need to attach a webhook to a Lambda function which will trigger the pipeline (if needed), but I couldn't find anything about it in the management console; searching for it only led me in circles.
It should be pretty straightforward. Does anyone have experience with this?
So what you need to implement is the following workflow:
Create an API Gateway endpoint.
Create your CodePipeline pipeline.
Create your Lambda function that will perform any checks against the code and trigger your CodePipeline.
Create a method/resource in API Gateway that triggers your Lambda.
Then add the API Gateway endpoint to your Bitbucket webhook.
I'd like to know if I can use my Git repository with the AWS console. I made a Rails project in the AWS console but could not find how to transfer it to my GitHub repository.
You can use AWS and GitHub together through:
Custom webhooks: where you have your code stored in GitHub and serve it through AWS services like EC2, Lambda, etc.
AWS CodePipeline with CodeCommit: the AWS CodeCommit service can be configured to listen to GitHub webhook requests; from there you can trigger events when the webhook is received, e.g. build, test, deploy.
Recently the AWS CodeBuild webhook added support for the PULL_REQUEST_MERGED event, which triggers the build every time a pull request is merged.
However, we have to enable the merged event in the Bitbucket webhook settings if we create a webhook filter group that uses the PULL_REQUEST_MERGED event type.
I am confused about what the server URL of AWS CodeBuild should be.
I have read many tutorials, but most of them cover integrating Bitbucket with Jenkins; I am not sure how to configure this with AWS.
Thank you for using AWS CodeBuild. For Bitbucket webhook integration, you can check out this CodeBuild Bitbucket webhook sample in the CodeBuild user guide.
Basically, when you create a Bitbucket webhook through the CodeBuild console or by calling CodeBuild's CreateWebhook API, CodeBuild will create the webhook and fill in the URL for you. You can also configure filter groups with the PULL_REQUEST_MERGED event type on the CodeBuild console.
After the webhook is created, you can check its settings on the Bitbucket site, and you can edit which events will trigger Bitbucket to deliver a webhook payload.
I'm building an API using AWS API Gateway and AWS Lambda. I would like to achieve continuous delivery for this API. The path I've chosen is to use CloudFormation through AWS CodePipeline. I've managed to do it for another project using Lambdas (without API Gateway); it works perfectly and is really pleasant to use.
The issue I'm facing when deploying is that the Lambdas are properly updated but the API definition is not. From what I understand, AWS::ApiGateway::Deployment resources are immutable, which means that for each deployment of the API I need to create a new AWS::ApiGateway::Deployment resource. This is not practical at all, because each of these AWS::ApiGateway::Deployment resources has a new invoke URL. This is not acceptable, since I would have to either change my DNS record to the newly deployed API's invoke URL or ask our API users to change the URL in their applications.
What I would like is to be able to change the API definition and the Lambdas implementations without my API users having to change anything in their applications.
How can I achieve this behavior?
I created a tutorial to highlight my issue. You can find it at: https://github.com/JonathanGailliez/aws-api-gateway-lambda-example
As per: https://forums.aws.amazon.com/thread.jspa?messageID=789869
joey-aws says:
We are currently in the process of rolling out a solution which
addresses this exact problem. In the meantime, a common workaround
would be to update something small, such as a "description" field
which could then be used to "trigger" an API Gateway deployment when
updating the CloudFormation stack.
I'll update this answer and the example repo once it's rolled out.
You could run a CloudFormation update from the command line or in the AWS console. This would change the API definitions and any Lambda code without changing the unique ID used to access your gateway.
The other option is to put your API behind a custom domain name; then you could deploy a new API or stage and switch over the custom domain mapping when you are ready. The users wouldn't notice any change.
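The custom-domain approach can be sketched in CloudFormation. The domain name, certificate parameter, and logical IDs below are placeholders; the point is that the Deployment's logical ID is renamed on each release (forcing CloudFormation to create a fresh, immutable deployment) while the base path mapping keeps the public URL stable:

```yaml
Resources:
  # Rename this logical ID on every release (e.g. ApiDeployment20240102)
  # so CloudFormation creates a new, immutable deployment resource.
  ApiDeployment20240101:
    Type: AWS::ApiGateway::Deployment
    Properties:
      RestApiId: !Ref MyRestApi   # placeholder
      StageName: prod

  ApiDomainName:
    Type: AWS::ApiGateway::DomainName
    Properties:
      DomainName: api.example.com             # placeholder
      CertificateArn: !Ref ApiCertificateArn  # placeholder parameter

  # Clients always call api.example.com; this mapping survives redeploys.
  ApiBasePathMapping:
    Type: AWS::ApiGateway::BasePathMapping
    Properties:
      DomainName: !Ref ApiDomainName
      RestApiId: !Ref MyRestApi
      Stage: prod
```

With a DNS record pointing at the custom domain, API users never see the per-deployment invoke URLs at all.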
A way to achieve that is by leveraging existing frameworks like
AWS SAM
Serverless
Claudia
I was able to achieve this by using a CloudFormation template generated by troposphere and the boto3 API in Python, as follows:
1. Split the template into two parts:
   (a) API definition, method(s), IAM roles, ApiKey, and Lambda
   (b) Deployment, UsagePlan, and UsagePlanKey
2. The changed Lambda code is zipped up and uploaded to S3 using the boto3 API.
3. Stack (b) is deleted.
4. Stack (a) is updated with the new resource ID for the GET method connected to the Lambda.
5. Stack (b) is created anew.
Steps 3, 4, and 5 are performed using the CloudFormation boto3 API, blocking until each operation completes.
Most importantly, after all the steps are done, the ApiKey value and the stage invoke URL remain the same, and the updated Lambda code runs, as verified with curl.
Note: it may take an additional 30-60 s for the API to become fully functional after the CloudFormation update completes.