Serverless Deploy From another Directory

Here is my root folder, and I want to deploy my AWS Lambda functions from the Backend folder (the code lives in CodeCommit).
So I wrote this buildspec, but AWS CodeBuild gives this error: "This command can only be run in a Serverless service directory".
version: 0.1
phases:
  install:
    commands:
      - npm install -g serverless@1.20.2
  post_build:
    commands:
      - cd Backend
      - serverless deploy --region eu-west-1 --verbose
How can I deploy it from the Backend folder?
Edit: I forgot to update the buildspec version. I have changed it to version: 0.2 and now it works fine.

Can you change it to the following?
- cd Backend && serverless deploy --region eu-west-1 --verbose
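In context, the post_build phase would then look roughly like this, keeping the original version: 0.1 (chaining with && matters there because buildspec version 0.1 runs each command in its own shell instance, so a bare cd would not carry over to the next command):

version: 0.1
phases:
  install:
    commands:
      - npm install -g serverless@1.20.2
  post_build:
    commands:
      # cd and deploy chained in one command so the directory change applies to the deploy
      - cd Backend && serverless deploy --region eu-west-1 --verbose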

I forgot to update the version in buildspec.yml. Now I have changed it to version: 0.2 and it works fine; from version 0.2 on, CodeBuild runs the commands in the same shell instance, so the cd persists for the serverless deploy command.
version: 0.2
phases:
  install:
    commands:
      - npm install -g serverless@1.20.2
  post_build:
    commands:
      - cd Backend
      - serverless deploy --region eu-west-1 --verbose

Related

AWS Elastic Beanstalk - deploy by CodePipeline not working - health status - Severe

I have created a pipeline in CodePipeline. The pipeline is running fine and every stage succeeds.
Even the deploy stage is in the succeeded state, but there is an issue with the app running on Elastic Beanstalk. I don't know why; maybe it is the enhanced health authorization.
When I upload the .war file manually to Elastic Beanstalk, the app runs well. But when CodePipeline uploads the .war file to Elastic Beanstalk, it does not work.
My buildspec.yml
version: 0.2
phases:
  pre_build:
    commands:
      - echo In the pre_build phase...
  build:
    commands:
      - echo Build started on `date`
      - mvn clean package
  post_build:
    commands:
      - echo Build completed on `date`
artifacts:
  files:
    - target/myapp.war
  discard-paths: yes
I would guess that this is because the wrong artifact format is being deployed. You can check this by going to the S3 bucket the pipeline uses and looking at the artifact it produces. In any case, I recommend uploading a zip file rather than a single war: that allows you to include an .ebextensions folder and to deploy more than one war if needed. See here for details.
You can deploy a zip file like this:
version: 0.2
phases:
  pre_build:
    commands:
      - echo In the pre_build phase...
  build:
    commands:
      - echo Build started on `date`
      - mvn clean package
      - mkdir build-output
      - cp target/myapp.war build-output/myapp.war
  post_build:
    commands:
      - echo Build completed on `date`
artifacts:
  files:
    - build-output/**/*
  base-directory: build-output
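If you do want to take advantage of an .ebextensions folder inside the zip, a minimal sketch could look like the following; the file name options.config and the environment property are hypothetical, and the folder would need to be copied into build-output (for example with cp -r .ebextensions build-output/) before the artifacts step:

# build-output/.ebextensions/options.config  (hypothetical file name)
# Elastic Beanstalk applies any *.config file found under .ebextensions in the deployed zip
option_settings:
  aws:elasticbeanstalk:application:environment:
    EXAMPLE_SETTING: example-value  # hypothetical environment property, for illustration only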

After code deployment, an updated aws cloudformation nested stack custom parameter from console gets its default value

I am trying to update a custom parameter on a CloudFormation nested stack from the console. It is updated, but after the next code deployment it reverts to its default value. I want the last updated value to be kept after deployments. I use a bash script as the build command in buildspec.yml. It works for a single stack, but not for a stack plus a nested stack. Does anyone have any idea about this? I want to be able to update a parameter from the console without a code deployment and keep that value for further use if it is not changed.
publish.sh
mkdir -p build
# SAM Package
aws cloudformation package --template template-build.yml --s3-bucket ${DEPLOYMENT_BUCKET} --output-template build/template.yml --debug
# SAM Deploy
aws cloudformation deploy --template-file build/template.yml --stack-name ${STACK_NAME} --capabilities CAPABILITY_IAM CAPABILITY_AUTO_EXPAND --role-arn ${ROLE_ARN} \
--s3-bucket $DEPLOYMENT_BUCKET \
--parameter-overrides \
DeploymentBucketName=${DEPLOYMENT_BUCKET} \
NodeModulesZipFileName=${packageJsonHash}.zip
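For reference, here is a hypothetical invocation of publish.sh, assuming the variables it references (DEPLOYMENT_BUCKET, STACK_NAME, ROLE_ARN, packageJsonHash) are exported by the build environment; all of the values below are placeholders:

export DEPLOYMENT_BUCKET=my-deployment-bucket               # placeholder bucket name
export STACK_NAME=my-stack                                  # placeholder stack name
export ROLE_ARN=arn:aws:iam::123456789012:role/my-cfn-role  # placeholder role ARN
export packageJsonHash=$(md5sum package.json | cut -d' ' -f1)  # one way such a hash might be derived
bash publish.sh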
buildspec.yml
version: 0.2
phases:
  install:
    runtime-versions:
      python: 3.7
      nodejs: 10
    commands:
      - yum install python3-pip -y
      - whereis pip
      - pip3 install aws-sam-cli --upgrade --user
      - pip3 install awscli --upgrade --user
  pre_build:
    commands:
      - mkdir -p build/
  build:
    commands:
      - echo Build started on `date`
      - npm install --only=prod
      - chmod +x publish.sh
      - bash publish.sh
  post_build:
    commands:
      - echo "no post build needed..."
artifacts:
  type: zip
  files:
    - build/template.yml

I need to deploy on codebuild in a Different Region

I'm trying to deploy to Ohio (us-east-2) from São Paulo. I configured the buildspec and the config in .elasticbeanstalk to set a variable that receives us-east-2 as a parameter.
I have tried many things to make this work, but I always get the error "exit status 4". This is my last attempt:
COMMAND_EXECUTION_ERROR: Error while executing command: eb deploy Logoneagendamento-teste --region us-east-2. Reason: exit status 4
And the buildspec.yml is as follows
version: 0.2
phases:
  install:
    runtime-versions:
      java: corretto8
    commands:
      - pip install --upgrade awsebcli awscli
  build:
    commands:
      - echo Iniciando build...
      - mvn package
      - echo eb list --region
      - eb list --region us-east-2
      - echo Inciando deploy
      - eb deploy $DEPLOY_ENV -r $AWS_DEFAULT_REGION
  post_build:
    commands:
      #- command
      #- command
artifacts:
  files:
    - 'target/LogOne-Agendamento.jar'
    # - location
  #name: $(date +%Y-%m-%d)
  discard-paths: yes
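For reference, the variable the question mentions would typically be declared in the buildspec's env block, roughly like this (the environment name is taken from the error message above; treat both values as placeholders for whatever the pipeline actually passes in):

env:
  variables:
    AWS_DEFAULT_REGION: us-east-2        # Ohio, the Region used by the eb/aws CLI calls
    DEPLOY_ENV: Logoneagendamento-teste  # Elastic Beanstalk environment passed to "eb deploy"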

`npm` and `node` not working on AWS CodeBuild

Story:
I have built a Docker image locally based on laradock/workspace.
I can use npm and node inside the Docker image on my local machine.
I uploaded the same image to Amazon EC2 Container Registry and use it in AWS CodeBuild.
Problem:
But when I run node -v from the buildspec.yml, it does not work and always returns status 127.
Reference:
Here is the simple code for my buildspec.yml:
version: 0.2
phases:
  install:
    commands:
      - npm -v
      - node -v
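As a side note, exit status 127 means the shell could not find the command, so a minimal diagnostic sketch (an assumption about where to look, not a fix) is to print the PATH the build container actually uses:

version: 0.2
phases:
  install:
    commands:
      # status 127 = command not found; show the PATH the container is using
      - echo $PATH
      - which node || echo "node not found on PATH"
      - which npm || echo "npm not found on PATH"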

How can I get AWS CodeBuild's outputted zip extracted in ElasticBeanstalk?

I'm trying to get AWS CodePipeline working with an S3 source, CodeBuild, and Elastic Beanstalk (Node.js environment).
My problem lies between CodeBuild and Beanstalk.
I have CodeBuild outputting a zip file of the final Node.js app via the artifacts. Here is my CodeBuild buildspec.yml:
version: 0.1
phases:
  install:
    commands:
      - echo Installing Node Modules...
      - npm install -g mocha
      - npm install
  post_build:
    commands:
      - echo Performing Test
      - npm test
      - zip -r app-api.zip .
artifacts:
  files:
    - app-api.zip
When I run CodeBuild manually, it successfully puts the zip into S3. When I run CodePipeline, it puts the zip on each Elastic Beanstalk instance in /var/app/current as app-api.zip.
What I would like is for it to extract app-api.zip into /var/app/current, just like a manual deploy via the Elastic Beanstalk console.
First, a quick explanation. CodePipeline packages whatever files you specify as artifacts and sends them to Elastic Beanstalk; in your case you are sending app-api.zip, so the instances end up with the zip itself rather than its contents.
What you probably want to do instead is to send all the files without wrapping them in a zip yourself: CodePipeline already zips the artifact, and Elastic Beanstalk extracts that archive on deploy.
Let's change your buildspec.yml to not create app-api.zip and instead send the raw files to CodePipeline.
version: 0.1
phases:
  install:
    commands:
      - echo Installing Node Modules...
      - npm install -g mocha
      - npm install
  post_build:
    commands:
      - echo Performing Test
      - npm test
      # - zip -r app-api.zip .   << Remove this line
artifacts:
  files:
    - '**/*'
  # Replace artifacts/files with the value shown above