Based on the AWS CodeBuild documentation, we can pass EnvironmentVariables with the PARAMETER_STORE type.
I've ensured the Parameter Store name is typed correctly and that the parameter exists.
I've tried logging in via the AWS CLI, but that seems unrelated and I still get the wrong result.
Here is my cloudformation yaml snippet:
- Name: Build
  Actions:
    - Name: HelloWord
      ActionTypeId:
        Category: Build
        Owner: AWS
        Provider: CodeBuild
        Version: "1"
      Configuration:
        ProjectName: HelloWoldBuild
        EnvironmentVariables:
          Fn::Sub:
            - '[{"name": "NGINX_BASE_IMAGE_TAG","value": "${NGINX_BASE_IMAGE_TAG}","type": "PARAMETER_STORE"}]'
            - NGINX_BASE_IMAGE_TAG: "/nginx/base"
And here is my buildspec.yaml snippet:
version: 0.2
phases:
  install:
    runtime-versions:
      docker: 18
  pre_build:
    commands:
      - echo "${NGINX_BASE_IMAGE_TAG}"
When I check the CodeBuild log, the output is '***'. It should be the value from my Parameter Store.
How could this happen? I still don't get it. I've tested with the PLAINTEXT type and it works fine.
This is by design.
Values from Parameter Store are considered sensitive.
So CodeBuild masks them so they don't appear in plain text in its logs.
To deal with that, you can assign the value to a different variable and print that. The following would be my first attempt:
pre_build:
  commands:
    - NEW_VAR=${NGINX_BASE_IMAGE_TAG}
    - echo "${NEW_VAR}"
Alternatively, save it to a file and print the file:
pre_build:
  commands:
    - echo ${NGINX_BASE_IMAGE_TAG} > /tmp/test.value
    - cat /tmp/test.value
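If you only need to verify the value while debugging, another option is to read the parameter directly with the AWS CLI, bypassing the masked environment variable entirely. A minimal sketch, assuming the build role is allowed ssm:GetParameter and using the /nginx/base name from the question:
pre_build:
  commands:
    # Read the parameter straight from SSM; the output is not a PARAMETER_STORE
    # env var expansion, so it should not be subject to the same log masking
    - aws ssm get-parameter --name /nginx/base --query Parameter.Value --output text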
I've set up a build job that uses batch builds.
Two batch jobs each build something, upload it to S3, and write the location to a JSON file.
The last batch job is supposed to pick up the two JSON files and use them for some further steps.
My problem: I can't find the artifacts in the last job.
When I use ls in the first two jobs, the files are there, but not in the last one.
Here is my buildspec, with unimportant parts removed.
version: 0.2
batch:
  fast-fail: true
  build-graph:
    - identifier: template_examplehook
    - identifier: s3_checkbucketencryptionhook
    - identifier: stackset
      buildspec: automation/assemble-template.yaml
      depend-on:
        - template_examplehook
        - s3_checkbucketencryptionhook
phases:
  install:
    runtime-versions:
      python: 3.7
  pre_build:
    commands:
      - echo "Starting ..."
      - ...
  build:
    commands:
      - echo "Building with $(python --version)"
      - cd $CODEBUILD_BATCH_BUILD_IDENTIFIER
      - ---
      - echo $S3_URI_PACKAGE > hash.json
      - ---
  post_build:
    commands:
      - echo Build completed on $(date)
artifacts:
  files:
    - '*/hash.json'
I expected to find each hash.json file in its respective folder, but they don't exist in the last batch job.
Update after talking to our AWS tech support:
Unexpected behavior: it should work the way we thought it would, but it doesn't.
We ended up rewriting it and went with two separate build steps.
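For anyone hitting the same wall before rewriting: since the jobs already upload to S3, one workaround is to hand the JSON files between steps through S3 yourself instead of relying on batch artifacts. A rough sketch, with a hypothetical bucket name and the identifiers from the buildspec above:
# In the two producer jobs: push hash.json to a shared (hypothetical) bucket
build:
  commands:
    - echo $S3_URI_PACKAGE > hash.json
    - aws s3 cp hash.json s3://my-shared-artifacts/$CODEBUILD_BATCH_BUILD_IDENTIFIER/hash.json

# In the final job: pull both files back down before assembling
pre_build:
  commands:
    - aws s3 cp s3://my-shared-artifacts/template_examplehook/hash.json template_examplehook/hash.json
    - aws s3 cp s3://my-shared-artifacts/s3_checkbucketencryptionhook/hash.json s3_checkbucketencryptionhook/hash.json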
I'm currently using AWS Amplify to manage my front end. I've been manually injecting the environment variables through the console.
While the environment variables are correctly protected as described in the AWS docs (at least in this case), I wanted to know if it is possible to set per-branch variables in the amplify.yml file that do not necessarily need protection.
Something like this:
version: 0.1
env:
  variables:
    myvarOne:
      branch: master
      value: ad
      branch: dev
      value: otherval
frontend:
  phases:
    preBuild:
      commands:
        - yarn install
        - yarn lint
        - yarn test
    build:
      commands:
        - yarn build
  artifacts:
    baseDirectory: build
    files:
      - '**/*'
  cache:
    paths:
      - node_modules/**/*
So far, it seems there is no ideal solution to your problem.
However, a workaround can get you something close to it.
You cannot have per-branch environment variables, but you can have per-branch commands.
So you can define different variables for different branches and export the appropriate one at build time:
version: 0.1
env:
  variables:
    myvarOne:
      value_master: val
      value_dev: otherval
frontend:
  phases:
    preBuild:
      commands:
        - if [ "${AWS_BRANCH}" = "master" ]; then export VALUE=${value_master}; fi
        - if [ "${AWS_BRANCH}" = "dev" ]; then export VALUE=${value_dev}; fi
        - yarn install
        - yarn lint
        - yarn test
    build:
      commands:
        - yarn build
  artifacts:
    baseDirectory: build
    files:
      - '**/*'
  cache:
    paths:
      - node_modules/**/*
I might be pretty late with the answer, but as of the time of writing, there is an out-of-the-box solution for your case.
The documentation asks you, almost verbatim, to do the following:
Sign in to the AWS Management Console.
Navigate to App Settings > Environment variables > Manage variables.
In the Manage variables section, under Variable, enter your key. For Value, enter your value.
Choose Actions and then choose Add variable override. You can override the environment variable for the key based on branches.
You now have a set of environment variables specific to your branch.
I don't have enough Stack Overflow reputation to add an image, so please refer to the fifth point in the documentation, where it describes how to Add variable override for specific branches.
https://docs.aws.amazon.com/amplify/latest/userguide/environment-variables.html
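If you prefer scripting over console clicks, the same per-branch override can be set with the AWS CLI; this is a sketch with a placeholder app id:
# Override myvarOne for the dev branch only (the app id is a placeholder)
aws amplify update-branch \
  --app-id d1a2b3c4example \
  --branch-name dev \
  --environment-variables myvarOne=otherval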
Environment variables declared in the CodeBuild console are not getting resolved.
I've tried as many variations of variable usage in YAML as I could think of:
$VARIABLE
${VARIABLE}
"$VARIABLE"
version: 0.2
env:
  variables:
    AWS_S3_BUCKET_NAME: "AWS_S3_BUCKET_NAME"
    API_URL: "API_URL"
  parameter_store:
    aws_access_key: "my-access-key"
    aws_secret_key: "my-secret-key"
phases:
  install:
    runtime-versions:
      python: 3.7
  post_build:
    commands: >-
      AWS_REGION=${AWS_REGION}
      SOURCE_REPO_URL=${CODEBUILD_SOURCE_REPO_URL}
      SOURCE_BRANCH=${CODEBUILD_SOURCE_VERSION}
      AWS_S3_BUCKET_NAME=${AWS_S3_BUCKET_NAME}
      AWS_ACCESS_KEY=${aws_access_key}
      AWS_SECRET_KEY=${aws_secret_key}
      // custom script
The docs specify that the env block takes key/value pairs, where the key is the name I want to use and the value is the name of the variable or parameter to read the value from.
When running in CodeBuild, the output simply shows exactly what's written, rather than substituting the variables...
Just remove the block chomping indicator, >-, after commands:. It folds everything below it into a single string instead of a list of commands, which prevents variable substitution.
You can use either $VARIABLE or ${VARIABLE} in your buildspec.
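For example, the post_build section from the question could be rewritten as a plain command list. In this sketch, ./run.sh is a hypothetical stand-in for the elided custom script:
post_build:
  commands:
    # Each list item is a separate shell command, so ${...} expands normally
    # (./run.sh is hypothetical; substitute your actual script)
    - AWS_REGION=${AWS_REGION} AWS_S3_BUCKET_NAME=${AWS_S3_BUCKET_NAME} ./run.sh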
Try using echo first to confirm you can see the value. The following worked for me:
version: 0.2
env:
  variables:
    key: "value"
phases:
  pre_build:
    commands:
      - echo ${key}
I'm trying to build & deploy a simple sample project to learn AWS. C# / .NET Core.
My buildspec looks like this:
version: 0.2
phases:
  install:
    runtime-versions:
      dotnet: 2.2
  pre_build:
    commands:
      - echo Restore started on `date`
      - dotnet restore AWSServerless1.csproj
  build:
    commands:
      - echo Build started on `date`
      - dotnet publish -c release -o ./build_output AWSServerless1.csproj
artifacts:
  files:
    - ./build_output/**/*
    - scripts/**/*
    - appspec.yml
  discard-paths: yes
My appspec looks like this:
version: 0.0
Resources:
  - myStack-AspNetCoreFunction-1HPKUEU7I6GFW:
      Type: AWS::Lambda::Function
      Properties:
        Name: "myStack-AspNetCoreFunction-1HPKUEU7I6GFW"
        Alias: "AWSServerless1"
        CurrentVersion: "1"
        TargetVersion: "2"
The pipeline completes successfully, but when I try to run the lambda, I get a 502. I checked the logs and it says:
Could not find the required 'AWSServerless1.deps.json'. This file should be present at the root of the deployment package.: LambdaException
When I download the package from S3, it looks to me like everything is there. It's a zip file with no paths anywhere; everything is in the root of the zip, including AWSServerless1.deps.json.
Any ideas?
Use dotnet lambda package instead of dotnet publish.
See https://github.com/aws/aws-extensions-for-dotnet-cli
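In buildspec terms, that change might look like the sketch below; it assumes the Amazon.Lambda.Tools global tool is installed during the build, and it reuses the project and output names from the question:
pre_build:
  commands:
    # Install the .NET Lambda CLI tooling that provides 'dotnet lambda'
    - dotnet tool install -g Amazon.Lambda.Tools
build:
  commands:
    # Produces a Lambda-ready zip that includes AWSServerless1.deps.json
    - dotnet lambda package -c Release -o ./build_output/AWSServerless1.zip AWSServerless1.csproj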
I have googled and searched within Stack Overflow and found some suggestions, but still no success. My build process in AWS CodeBuild runs and gives me a success output, but the log shows 'No artifact files specified', and as a result no files are being copied to my S3 bucket. Could anybody help me figure this out? Here is my yml setting:
version: 0.1
phases:
  build:
    commands:
      - echo Nothing to do yet
addons:
  artifacts:
    s3_region: "eu-central-1"
    files:
      - '**/*'
I suggest you refer to the Build Specification Reference.
Specifically, you should remove addons: and s3_region:, as neither is a valid CodeBuild key. You should also use version: 0.2, as version: 0.1 has been deprecated.
Here is what your buildspec.yml should look like:
version: 0.2
phases:
  build:
    commands:
      - echo Nothing to do yet
artifacts:
  files:
    - '**/*'
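One follow-up worth noting: the destination S3 bucket (and therefore its region) is configured in the CodeBuild project's artifact settings, not in the buildspec, which is why s3_region has no effect there.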