Currently, I have a CI/CD pipeline for one project set up through CodeDeploy.
Now a new project is finished, and we want to deploy it to the same EC2 instance.
How can I deploy two or more Git repositories to a single EC2 instance?
Below are the contents of the appspec.yml file that is currently working.
version: 0.0
os: linux
files:
  - source: /
    destination: /home/ubuntu/build/
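One common approach (assuming each repository gets its own CodeDeploy application and deployment group targeting the same instance) is to give each project its own appspec.yml with a distinct destination directory, for example:

```yaml
# appspec.yml for the second project -- the destination path here is an
# assumption; pick any directory that does not collide with the first app
version: 0.0
os: linux
files:
  - source: /
    destination: /home/ubuntu/build-project2/
```

Each pipeline then deploys independently, and the CodeDeploy agent keeps the two revisions in separate directories on the instance.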
I am trying to deploy a Nuxt app on Elastic Beanstalk, I have set up a CodePipeline pipeline which first grabs the source code from GitHub when a new commit on main is detected, then passes it on to a CodeBuild process, which builds the app using a buildspec.yml file. It's dead simple, but I'll include it anyway:
version: 0.2
phases:
  build:
    commands:
      - yarn
      - yarn build
artifacts:
  files:
    - '**/*'
The BuildArtifact from the above is passed on to the Deploy step, which then basically puts the results of the build process inside /var/app/current on the EC2 instance attached to the Elastic Beanstalk environment.
And here lies my problem: the app won't start:
Dec 5 09:08:14 ip-172-31-30-230 web: > nuxt start
Dec 5 09:08:14 ip-172-31-30-230 web: internal/modules/cjs/loader.js:905
Dec 5 09:08:14 ip-172-31-30-230 web: throw err;
Dec 5 09:08:14 ip-172-31-30-230 web: ^
Dec 5 09:08:14 ip-172-31-30-230 web: Error: Cannot find module '../package.json'
However, if I do a yarn build on my local machine from the same source and use eb deploy to push the build, it works!
I have downloaded the build artifact produced by CodeBuild and compared the contents of the zip files produced both by my locally run eb deploy and CodeBuild and... they're virtually identical.
I have also inspected the contents of /var/app/current on the instance via ssh both when there was a working deployment from eb deploy and when it was not functional after having been deployed via CodeBuild. The contents are the same, I could not find any difference.
Also, running yarn start over SSH worked when the app had been deployed from my local terminal, but gave the same error I saw in the Elastic Beanstalk logs when it had been deployed via CodeBuild.
My hair is turning white, I can't figure out what's causing this mess.
I have a web and a worker environment on Elastic Beanstalk. I have a cron.yaml file that has a task that runs every hour. It was working fine in the beginning when I was deploying using eb-cli.
Then I decided to use AWS code pipeline to deploy code and that's when the cron job stopped working. The way it works is that the build stage of the pipeline creates a docker image and pushes it to my ECR repo. This image is then used (using Dockerrun.aws.json) in the next stage to run the app in both environments. But this approach does not schedule the tasks defined in cron.yaml.
I think with eb-cli, when elastic beanstalk unzips the archive uploaded to s3 for deployment by eb-cli, it finds that cron.yaml file and schedules the task. But with ECR, it's just the docker image and that cron.yaml is not available. If this is the case, how can I provide the cron.yaml file to elastic beanstalk when deploying from ECR repo?
What am I missing?
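For reference, a worker-tier cron.yaml looks something like this (the task name, URL, and schedule below are placeholders):

```yaml
version: 1
cron:
  - name: "hourly-task"        # unique task name
    url: "/scheduled/hourly"   # POST endpoint on the worker app
    schedule: "0 * * * *"      # standard cron expression: every hour
```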
I was able to solve this issue myself. All you need to do is add cron.yaml file to artifacts in your buildspec.yml file, like so:
artifacts:
files:
- Dockerrun.aws.json
- cron.yaml
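In context, a minimal buildspec.yml for this setup might look like the sketch below (the build command is a placeholder for whatever builds and pushes your Docker image):

```yaml
version: 0.2
phases:
  build:
    commands:
      - echo "build and push the Docker image to ECR here"
artifacts:
  files:
    - Dockerrun.aws.json
    - cron.yaml   # included so Elastic Beanstalk can schedule the worker tasks
```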
I've created a next.js application and have installed the serverless framework so that I can deploy the application.
I can do this via cli with either a .env file containing the AWS credentials or with no .env file but with my AWS credentials sitting in the .aws folder locally. However I can't have the AWS credentials sitting in the root folder permanently as this poses security risks.
What I need to achieve now is for the application to be deployed as part of an Azure devops pipeline(YAML).
The challenge I'm facing is passing the AWS credentials to the serverless deployment without making it part of version control.
This is the current state of my yaml pipeline.
trigger: none

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '12.x'
  displayName: 'Install Node.js'

- script: |
    npm install
    npx next build
    npx serverless
  displayName: 'npm install, build and deploy'
You could try uploading the credentials to Azure DevOps as a secure file, then use the Download secure file task to download them during the pipeline run.
In this case, the secure file will not be part of the repo.
Here are the steps:
Step 1: Upload the credentials under Pipelines -> Library -> Secure files.
Step 2: Add the Download secure file task to the YAML definition.
For example:
- task: DownloadSecureFile@1
  displayName: 'Download secure file'
  inputs:
    secureFile: 'file name'
Note: the secure file will be downloaded to $(Agent.TempDirectory). When the build is complete, this file will be automatically deleted.
In addition, if you want to keep this file or put it elsewhere, you can add a Copy file task.
For example:
- task: CopyFiles@2
  displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
  inputs:
    SourceFolder: '$(Agent.TempDirectory)'
    TargetFolder: '$(build.artifactstagingdirectory)'
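Tying this back to the serverless deployment: one way to hand the downloaded file to the AWS SDK is the AWS_SHARED_CREDENTIALS_FILE environment variable. The task name awsCreds below is an assumption; giving the task a name is what makes $(awsCreds.secureFilePath) resolvable:

```yaml
- task: DownloadSecureFile@1
  name: awsCreds                     # referenced below as awsCreds.secureFilePath
  inputs:
    secureFile: 'aws-credentials'    # placeholder secure-file name

- script: |
    npm install
    npx next build
    npx serverless
  displayName: 'npm install, build and deploy'
  env:
    AWS_SHARED_CREDENTIALS_FILE: $(awsCreds.secureFilePath)
```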
Hope this helps.
I'm trying to hook my GitHub repo with S3 so every time there's a commit, AWS CodePipeline will deploy the ./<path>/public folder to a specified S3 bucket.
So far in my pipeline, the Source works (hooked to GitHub and picks up new commits) but the Deploy failed because: Action execution failed
BundleType must be either YAML or JSON.
This is how I set them up:
CodePipeline
Action name: Source
Action provider: GitHub
Repository: account/repo
Branch: master
GitHub webhooks
CodeDeploy
Compute type: AWS Lambda
Service role: myRole
Deployment settings: CodeDeployDefault.LambdaAllAtOnce
IAM Role: myRole
AWS Service
Choose the service that will use this role: Lambda / CodeDeploy
Select your use case: CodeDeploy
Policies: AWSCodeDeployRole
I understand that there must be a buildspec.yml file in the root folder. I've tried using a few files I could find but they don't seem to work. What did I do wrong or how should I edit the buildspec file to do what I want?
Update
Thanks to @Milan Cermak. I understand I need to do:
CodePipeline:
Stage 1: Source: hook with GitHub repo. This one is working.
Stage 2: Build: use CodeBuild to grab only the wanted folder using a buildspec.yml file in the root folder of the repo.
Stage 3: Deploy: use
Action Provider: S3
Input Artifacts: OutputArtifacts (result of stage 2).
Bucket: the bucket that hosts the static website.
CodePipeline works. However, the output contains only files (.html) not folders nested inside the public folder.
I've checked this and figured how to remove path of a nested folder with discard-paths: yes but I'm unable to get all the sub-folders inside the ./<path>/public folder. Any suggestion?
CodeBuild uses a buildspec, but CodeDeploy uses an appspec.
Do you have an appspec file?
You shouldn't use CodeDeploy, as that's a service for automation of deployments of applications, but rather CodeBuild, which executes commands and prepares the deployment artifact for further use in the pipeline.
These commands are in the buildspec.yml file (typically in the root directory of the repo, but this is configurable). For your use case, it won't be too complicated, as you're not compiling anything or running tests, etc.
Try this as a starting point:
version: 0.2
phases:
  build:
    commands:
      - ls
artifacts:
  files:
    - public/*
The phases section is required, which is why it's included (at least, thanks to the ls command, you'll see what files are present in the CodeBuild environment), but it's not interesting for your case. What is interesting is the artifacts section. That's where you define the output of the CodeBuild phase, i.e. what gets passed on to the next step in the pipeline.
Depending on how you want to have the files structured (for example, do you want to have the public directory also in the artifact or do you only want to have the files themselves, without the parent dir), you might want to use other configuration that's possible in the artifacts section. See the buildspec reference for details.
Remember to use the output artifact of the CodeBuild step as the input artifact of the Deploy to S3 step.
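If the goal is to keep the sub-folders of public but drop the public/ prefix itself, combining base-directory with a recursive glob is one option documented in the buildspec reference (the paths below are the ones from this question):

```yaml
artifacts:
  files:
    - '**/*'              # everything under base-directory, sub-folders included
  base-directory: public  # strips the leading public/ from artifact paths
```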
Buildspec is for CodeBuild as t_yamo pointed out.
You are using CodeDeploy which uses an appspec.yml file, which looks something like this for my config.
version: 0.0
os: linux
files:
  - source: /
    destination: /path/to/destination
hooks:
  BeforeInstall:
    - location: /UnzipResourceBundle.sh
  ApplicationStart:
    - location: /RestartServer.sh
      timeout: 3600
UnzipResourceBundle.sh is just a bash script which can be used to do any number of things.
#!/bin/bash
# Do something
You can find a sample for the AppSpec.yml file from Amazon Documentation here - https://docs.aws.amazon.com/codedeploy/latest/userguide/reference-appspec-file-example.html#appspec-file-example-lambda
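Since the question selected AWS Lambda as the compute type, note that a Lambda-targeted appspec has a different shape from the EC2 one above; a sketch, with placeholder function, alias, and version values:

```yaml
version: 0.0
Resources:
  - myLambdaFunction:
      Type: AWS::Lambda::Function
      Properties:
        Name: "myLambdaFunction"   # placeholder function name
        Alias: "live"              # alias whose traffic gets shifted
        CurrentVersion: "1"
        TargetVersion: "2"
```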
CodePipeline recently announced a deploy to S3 action: https://aws.amazon.com/about-aws/whats-new/2019/01/aws-codepipeline-now-supports-deploying-to-amazon-s3/
I am new to DevOps and creating a sample CI/CD pipeline in AWS. Once CI is successful, the code should be moved to S3 bucket. I have to write an appspec.yml file to deploy artifacts from S3 to IIS.
Here I have few queries:
(1) Once CI is successful, are the files moved to S3 bucket as .zip?
(2) Where should I keep the appspec.yml?
(3) What should the appspec.yml code look like so that CodeDeploy reads it and deploys the artifacts to IIS?
If you are using CodeDeploy, then you should zip the files locally, upload the archive to S3, and then register the revision against your deployment group.
The appspec.yml is always found in the root folder of the deploy package.
Here is a sample appspec.yml
version: 0.0
os: windows
files:
  - source: \index.html
    destination: C:\inetpub\wwwroot
This will deploy index.html to the default document root.
Your deployment package / zip file will contain appspec.yml and index.html in the root folder.
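If you also need to stop and restart IIS around the file copy, hooks can point at scripts bundled in the same package. A sketch, where the script paths and names are assumptions (on Windows instances, hook scripts run via PowerShell or batch):

```yaml
version: 0.0
os: windows
files:
  - source: \index.html
    destination: C:\inetpub\wwwroot
hooks:
  BeforeInstall:
    - location: \scripts\stop-iis.ps1   # e.g. runs "iisreset /stop"
  AfterInstall:
    - location: \scripts\start-iis.ps1  # e.g. runs "iisreset /start"
```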