Commit a file to GitHub in AWS CodePipeline

I'm creating an AWS CodePipeline with the following phases:
Source: Get the code from GitHub when a change occurs in the staging branch.
Build: Read the buildspec.yml to execute "mvn clean package", docker build and docker push.
Deploy: Deploy to the ECS cluster.
Now I need to create a fourth phase (an AfterDeploy phase) that should commit some files to GitHub. So, after all these phases complete successfully, the AfterDeploy phase should commit the files generated by the Build phase to GitHub.
Any idea how I can do this?

Should I have 2 buildspec files, because I have 2 build phases?
Yes, you can do this. For example, you can have a primary buildspec.yml for your first build, and a secondary buildspec_postdeploy.yml for the second build stage.
How to use multiple buildspec files is documented under "Buildspec file name and storage location":
https://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html#build-spec-ref-name-storage
I don't have an example to share, but it would just execute whatever git commit and git push commands are needed. Its exact structure is very use-case specific, so it is difficult to speculate on it.
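For illustration only, a minimal sketch of such a secondary buildspec (here called buildspec_postdeploy.yml). The repository, branch, file names, and the GITHUB_TOKEN environment variable are all placeholders you would replace with your own values (the token could come from Secrets Manager, for example):

version: 0.2
phases:
  build:
    commands:
      # Clone the target branch (placeholder repository and branch names)
      - git clone --branch staging "https://${GITHUB_TOKEN}@github.com/my-org/my-repo.git" repo
      # Copy the files produced by the earlier Build phase into the working copy (placeholder path)
      - cp build-output/generated-file.txt repo/
      # Commit and push the changes back to GitHub
      - cd repo
      - git config user.email "pipeline@example.com"
      - git config user.name "codepipeline-bot"
      - git add .
      - git commit -m "Add files generated by the Build phase" || echo "Nothing to commit"
      - git push origin staging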

Related

Google Cloud Build pipeline in Mono-repository architecture with single cloudbuild

We are using multiple Python deployments in a single GitHub repository with the following folder structure. Each directory contains a separate scripts module.
service-1/
    deployment-1/
        app/
        Dockerfile
        cloudbuild.yaml
    deployment-2/
        app/
        Dockerfile
        cloudbuild.yaml
service-2/
    deployment-1/
        app/
        Dockerfile
        cloudbuild.yaml
service-3/
    deployment-1/
        app/
        Dockerfile
        cloudbuild.yaml
    deployment-2/
        app/
        Dockerfile
        cloudbuild.yaml
.gitignore
README.md
requirements.txt
Here deployment-1 works as one deployment and deployment-2 as another deployment for each service.
We are planning to use a single trigger in the pipeline that builds only the deployment where the latest commit is found.
Could anyone please suggest how to keep a single YAML file and structure the build better with Cloud Build, so that we don't have to manage multiple triggers?
Sadly, nothing is magic! The dispatch is done either by configuration (multiple triggers) or by code.
If you want to avoid multiple triggers, you need to code the dispatch yourself:
Detect the code that has changed in Git (several services could change at the same time).
Iterate over the updated folders and run a new Cloud Build for each of them (see the sketch below).
It's a small piece of shell code. It's not difficult, but you have to maintain/test/debug it. Is it easier than multiple triggers? That's up to you, depending on your team's DevOps skills.
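A minimal sketch of what that dispatch script might look like, assuming it runs somewhere with the full Git history available, that each deployment folder contains its own cloudbuild.yaml, and that comparing against the previous commit (HEAD~1) is enough for your workflow:

#!/usr/bin/env bash
set -euo pipefail

# Collect the service/deployment folders touched by the latest commit
changed_dirs=$(git diff --name-only HEAD~1 HEAD | awk -F/ 'NF>=2 {print $1"/"$2}' | sort -u)

for dir in $changed_dirs; do
  # Only dispatch builds for folders that carry their own cloudbuild.yaml
  if [ -f "$dir/cloudbuild.yaml" ]; then
    echo "Triggering Cloud Build for $dir"
    gcloud builds submit "$dir" --config "$dir/cloudbuild.yaml"
  fi
done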

overwrite existing files from codepipeline deployment

I am trying to deploy some new code using AWS CodePipeline. The first time it works with no problem; the second time the deployment fails because of existing files. How can I instruct my flow to overwrite existing files?
Error Message:
The deployment failed because a specified file already exists at this location:
I think the best way to deploy is to delete the project directory and kill the running process before deploying. This keeps the project directory clean and in sync with the original repository.
As you can see from this link (appspec.yml hooks for deploying to EC2), CodeDeploy downloads the artifacts during the Install phase, and we cannot hook into that step. The Install phase comes after the BeforeInstall hook, so you should delete the directory and kill the processes before the Install phase is executed.
hooks:
  BeforeInstall:
    - location: codedeploy-scripts/deleteAndKill.sh
      # runas: root # this might be needed depending on your setting.
Define codedeploy-scripts/deleteAndKill.sh appropriately (a rough sketch follows) and try running CodePipeline and CodeDeploy again.
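A minimal sketch of what such a script might contain; the install path /opt/my-app and the process name my-app are placeholders for your own setup:

#!/usr/bin/env bash
# Hypothetical BeforeInstall cleanup script; path and process name are placeholders.

# Stop the running process, if any
pkill -f my-app || true

# Remove the previous deployment so CodeDeploy can copy files without conflicts
rm -rf /opt/my-app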
P.S. Deleting the project directory and killing the processes is somewhat cumbersome, so if you use Docker, all you have to do is docker stop {container name} and docker run {image name}.

aws: .net Core: zip the built code and copy to s3 output bucket

I am a .NET developer using a .NET Core 2.x application to build and upload the release code to an S3 bucket. Later that code will be used to deploy to an EC2 instance.
I am new to CI/CD on AWS and still in the learning phase.
To create CI/CD for my sample project, I went through some AWS tutorials and was able to create the following buildspec.yml file. With that file I am able to run a successful build.
The problem comes in the UPLOAD_ARTIFACTS phase: I am unable to understand how to create a zip file that will be uploaded to the S3 bucket specified in the build project.
My buildspec.yml file contains the following code. Please help me find what is wrong or what I am missing.
version: 0.2
phases:
  build:
    commands:
      - dotnet restore
      - dotnet build
artifacts:
  files:
    - target/cicdrepo.zip
    - .\bin\Debug\netcoreapp2.1\*
I think I have to add a post_build phase and some commands that will generate the zip file, but I don't know the commands.
Your file is good. All you need to do is create an S3 bucket, then configure your CodeBuild project to zip (or not) the artifacts for you and store them in that S3 bucket.
This is configured in the artifacts settings of your CodeBuild project.
Edit:
If you want all your files to be copied to the root of your zip file, you can use:
artifacts:
  files:
    - ...
  discard-paths: yes
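Putting it together, a minimal sketch of a buildspec that publishes the app and lets CodeBuild package the output; the publish folder name is an assumption, and this relies on the project's "Artifacts packaging" setting being Zip so CodeBuild uploads the result to the configured S3 bucket as a zip:

version: 0.2
phases:
  build:
    commands:
      - dotnet restore
      # Publish the release build to a known folder (path is illustrative)
      - dotnet publish -c Release -o ./publish
artifacts:
  files:
    - publish/**/*
  discard-paths: no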

Error in AWS codepipeline when deploying ElasticBeanstalk

I've created a pipeline which does the following:
Git changes trigger the next action (CodeBuild)
CodeBuild starts and builds a Docker image from the Git source
The latest Docker image is deployed to Elastic Beanstalk
The first 2 steps work fine: Git changes initiate a CodeBuild run, CodeBuild builds a Docker image, and then the pipeline tries to deploy it to Elastic Beanstalk, which fails. The following error is thrown:
Invalid action configuration: The action failed because either the artifact or the Amazon S3 bucket could not be found. Name of artifact bucket: MY_BUCKET_NAME. Verify that this bucket exists. If it exists, check the life cycle policy, then try releasing a change.
In my CodeBuild project, I've set the artifact location to MY_BUCKET_NAME and named it aws-test-artifact. Is this all I have to do?
I've looked around and am unable to find anything on this issue.
I had the same problem. I just changed the input artifacts from BuildArtifact to SourceArtifact in the build stage, and everything worked.
As Adam Loving commented, we must add an artifacts section. Adding this section to your buildspec.yml file will make this work:
artifacts:
  files:
    - '**/*'
Per the documentation (https://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html#build-spec.artifacts.files), adding '**/*' will include all files in the build output artifact.
So I found the fix to this issue! What I had to do was go to CodeBuild => Edit project => Show advanced settings => Artifacts packaging.
From there I changed Artifacts packaging to Zip!
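If you prefer doing this outside the console, a sketch of what the equivalent change might look like with the AWS CLI; the project name is hypothetical, and the bucket and artifact name are the ones mentioned in the question:

aws codebuild update-project \
  --name my-build-project \
  --artifacts '{"type": "S3", "location": "MY_BUCKET_NAME", "name": "aws-test-artifact", "packaging": "ZIP"}'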

Docker image build as AWS CodePipeline step

I was able to set up the integration between GitHub and AWS CodePipeline, so now my code is uploaded to S3 after a push event by a Lambda function. That works very well.
A new ZIP with source code on S3 triggers the pipeline, which builds the code. That's fine. Now I'd also like to build a Docker image for the project.
The first problem is that you can't mix a project (Node.js) build and a Docker build. That's fine, it makes sense. The next issue is that you can't have another buildspec.yml for the Docker build; you have to specify the build commands manually. OK, that works as a workaround.
The biggest problem, though, or my lack of understanding, is how to make the Docker build part of the pipeline. The first build step builds the project, then the next build step builds the Docker image: two standalone AWS CodeBuild projects.
The thing is that a pipeline build step has to produce an artifact as output. But a Docker build doesn't produce any files, and it looks like the final docker push after docker build does not qualify as an artifact for the pipeline service.
Is there a way to do this?
Thanks
A bit late, but hopefully this will be helpful for someone. You should publish the Docker image as part of your post_build phase commands. Here's an example of a buildspec.yml:
version: 0.1
phases:
  pre_build:
    commands:
      - echo Logging in to Amazon ECR...
      - $(aws ecr get-login --region $AWS_REGION)
  build:
    commands:
      - echo Build started on `date`
      - echo Building the Docker image...
      - docker build -t $IMAGE .
      - "docker tag $IMAGE $REPO/$IMAGE:${CODEBUILD_BUILD_ID##*:}"
  post_build:
    commands:
      - echo Build completed on `date`
      - echo Pushing the Docker image...
      - "docker push $REPO/$IMAGE:${CODEBUILD_BUILD_ID##*:}"
      - "echo {\\\"image\\\":\\\"$REPO/$IMAGE:${CODEBUILD_BUILD_ID##*:}\\\"} > image.json"
artifacts:
  files:
    - 'image.json'
As you can see, the CodeBuild project expects a few parameters (AWS_REGION, REPO and IMAGE) and publishes the image to AWS ECR (but you can use a registry of your choice). It also uses the existing CODEBUILD_BUILD_ID environment variable to derive a dynamic value for the image tag. After the image is pushed, it creates a JSON file with the full path to the image and publishes it as an artifact for CodePipeline to use.
For this to work, the CodeBuild project's environment image should be of type "docker" with the "privileged" flag activated. When creating the CodeBuild project in your pipeline, you can also specify the environment variables that are used in the buildspec file above.
There is a good tutorial on this topic here:
http://queirozf.com/entries/using-aws-codepipeline-to-automatically-build-and-deploy-your-app-stored-on-github-as-a-docker-based-beanstalk-application
Sorry about the inconvenience. Making this less restrictive is on our roadmap. Meanwhile, in order to use the CodeBuild action, you can use a dummy file as the output artifact; a sketch of that workaround follows.
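A minimal sketch of that workaround, assuming the build itself only runs docker build and docker push (the image and registry names are placeholders); the dummy file exists purely to satisfy the output artifact requirement:

version: 0.2
phases:
  build:
    commands:
      - docker build -t my-registry/my-image:latest .
      - docker push my-registry/my-image:latest
      # Create a throwaway file so the CodeBuild action has an output artifact
      - touch dummy.txt
artifacts:
  files:
    - dummy.txt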