Updating env variables and parameters via a task in a job in GoCD

In GoCD we are able to define env variables and parameters for the entire pipeline, a stage, and individual jobs. Is it possible to update those already defined env variables and parameters via a task in a job when running the pipeline?

Would you like to update them permanently or just for a specific execution?
If you would like the former, I suppose you have to work with the config XML, using the Pipeline Config REST API.
For the latter case you can trigger the pipeline with the given environment variables set.
In any case, I would recommend the yagocd library, which lets you do both tasks with ease.
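For the one-off (latter) case, a minimal sketch of triggering a run with overridden environment variables through GoCD's pipeline schedule API (server URL, credentials, pipeline name and the API version in the Accept header are placeholders; check your server's API docs):
# Hypothetical example: schedule a pipeline run with an overridden environment variable
curl -u admin:password \
  -H 'Accept: application/vnd.go.cd.v1+json' \
  -H 'Content-Type: application/json' \
  -X POST 'https://gocd.example.com/go/api/pipelines/my_pipeline/schedule' \
  -d '{"environment_variables": [{"name": "DEPLOY_ENV", "value": "uat"}]}'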

Related

How to have a simple manual ECS deployment in CodePipeline / CodeDeploy?

Basically I would like to have a simple manual deploy step that's not directly linked to a build. When using containers, I wouldn't like to perform a separate build per environment (e.g. once my build puts an image tag in ECR, I would like to deploy that to any number of environments).
Now, I know in CodePipeline I can have a number of actions and I can precede them with manual approval.
The problem with that is that, should I not want to perform the last manually approved deploy, subsequent executions will pile up - the pipeline execution doesn't complete and what comes next will just have to wait. I can set a timeout, for sure, but there are moments when 20 builds come in fast and I don't know which one of them I may want to deploy to which environment (they generally all go to some QA/staging, but some need to be manually deployed to a particular dev-related environment or even to production).
Manually updating task definitions all around in ECS is tedious.
I have a solution where I can manually patch a task definition using the awscli and yq, but is there a way to have a simple pipeline with one step that takes a manual input (i.e. an image tag) and either uses an ECS deploy step (the only place where you can provide a clean JSON patch for the task definition) or uses my yq script to deploy?
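For reference, a rough sketch of the kind of awscli + yq patch flow described above (cluster, service, task and image names are placeholders; assumes the Go implementation of yq v4):
# Hypothetical patch script: swap the image in the current task definition and roll the service
export IMAGE="123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:$1"   # tag passed as manual input

# Export the current task definition, strip the read-only fields, swap the image
aws ecs describe-task-definition --task-definition my-task --query 'taskDefinition' --output json \
  | yq -o=json 'del(.taskDefinitionArn) | del(.revision) | del(.status) | del(.requiresAttributes) | del(.compatibilities) | del(.registeredAt) | del(.registeredBy) | .containerDefinitions[0].image = strenv(IMAGE)' \
  > new-taskdef.json

# Register the new revision and point the service at it
NEW_ARN=$(aws ecs register-task-definition --cli-input-json file://new-taskdef.json --query 'taskDefinition.taskDefinitionArn' --output text)
aws ecs update-service --cluster my-cluster --service my-service --task-definition "$NEW_ARN"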

In AWS CodePipeline do we have an option to provide a parameter at run time

I have an Angular project and I want a single pipeline that builds for UAT, develop and production. I know that in CodeBuild we can provide an environment variable, but if this is hardcoded I need to edit the CodeBuild project each time.
Like Jenkins, is there any option that asks for a parameter to inject into CodeBuild?
You cannot pass a variable "from outside" to CodePipeline, for example passing a variable 'Environment' such as dev or uat when starting a pipeline. The StartPipelineExecution API has no such provision.
Instead, actions within the pipeline can generate and pass variables to subsequent actions. This is useful for, say, a CodeBuild action generating a comment that is later consumed by a Manual approval action. Please see the following links for the Variables feature in CodePipeline:
https://docs.aws.amazon.com/codepipeline/latest/userguide/reference-variables.html
https://docs.aws.amazon.com/codepipeline/latest/userguide/actions-variables.html

How to deploy to different environments (dev, uat, prod) using CDK pipeline?

When I commit to the develop branch it must deploy code to a specific environment (dev). Similarly, when I push to the uat branch it must deploy to the uat environment. How do I achieve this functionality in an AWS CDK pipeline?
There are stages that can be deployed to multiple regions, but I need to define that a push to a given branch deploys to the corresponding environment.
The best approach depends on a few factors including whether your stack is environment agnostic or not (i.e. whether it needs to look up resources from within a given account.)
For simply switching between different accounts and regions, the CDK team has a decent writeup here which recommends a small wrapper script for each environment that injects the configuration by way of CDK_DEPLOY_ACCOUNT and CDK_DEPLOY_REGION environment variables.
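As a rough sketch of that wrapper-script idea (the account and region values are placeholders; your CDK app is expected to read CDK_DEPLOY_ACCOUNT / CDK_DEPLOY_REGION when it defines its env):
#!/usr/bin/env bash
# cdk-deploy-to-dev.sh — hypothetical per-environment wrapper script
export CDK_DEPLOY_ACCOUNT=111111111111
export CDK_DEPLOY_REGION=eu-west-1
npx cdk deploy "$@"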
If you want to provide other synth time context then you can do so via the context API, which allows you to provide configuration 'in six different ways':
Automatically from the current AWS account.
Through the --context option to the cdk command.
In the project's cdk.context.json file.
In the project's cdk.json file.
In the context key of your ~/.cdk.json file.
In your AWS CDK app using the construct.node.setContext method.
My team uses the inline context args to define an environment name, and from that name it reads a JSON config file that defines many environment-dependent parameters:
cdk deploy --context env=Dev
We let the environment name determine the branch name and set it accordingly on the 'Branch' property of the 'GitHubSourceAction'. (C# code)
// Read the environment name passed via --context and use it as the source branch
string env = (string)this.Node.TryGetContext("env");

var pipeline = new CdkPipeline(this, "My-Pipeline", new CdkPipelineProps()
{
    SourceAction = new GitHubSourceAction(new GitHubSourceActionProps()
    {
        // Owner, Repo, OauthToken and Output (plus the synth action) are omitted for brevity
        Branch = env
    })
});

Is there a way of running AWS Step Functions locally when defined by CDK?

AWS Step Functions may be run in a local Docker environment using Step Functions Local Docker. However, the step functions need to be defined using the JSON-based Amazon States Language. This is not at all convenient if your AWS infrastructure (Step Functions plus lambdas) is defined using AWS CDK/CloudFormation.
Is there a way to create the Amazon States Language definition of a state machine from the CDK or CloudFormation output, such that it’s possible to run the step functions locally?
My development cycle is currently taking me 30 minutes to build/deploy/run my Lambda-based step functions in AWS in order to test them and there must surely be a better/faster way of testing them than this.
We have been able to achieve this by the following:
Download Step Functions Local (the .jar distribution):
https://docs.aws.amazon.com/step-functions/latest/dg/sfn-local.html
To run Step Functions Local, run this in the directory where you extracted the files:
java -jar StepFunctionsLocal.jar --lambda-endpoint http://localhost:3003
To create a state machine, you need a JSON definition. It can be pulled from the generated CloudFormation template, or you can use the AWS Toolkit plugin for VS Code (type "step functions" and select from a template) as a starter. You can also get it from the Definition tab of the state machine in the AWS console.
Run this command in the same directory as the definition JSON:
aws stepfunctions create-state-machine --endpoint-url http://localhost:8083 --definition "$(cat step-function.json)" --name "local-state-machine" --role-arn "arn:aws:iam::012345678901:role/DummyRole"
You should be able to hit the SF now (hopefully) :)
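Once created, you can start an execution against the local endpoint; a small sketch, assuming the ARN returned by the create-state-machine call above:
# Use the stateMachineArn printed by create-state-machine (Step Functions Local
# defaults to region us-east-1 and account 123456789012)
aws stepfunctions start-execution --endpoint-url http://localhost:8083 \
  --state-machine-arn arn:aws:states:us-east-1:123456789012:stateMachine:local-state-machine \
  --input '{}'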
You can use cdk watch or the --hotswap option to deploy your updated state machine or Lambda functions without a CloudFormation deployment.
https://aws.amazon.com/blogs/developer/increasing-development-speed-with-cdk-watch/
If you want to test with Step Functions local, cdk synth generates the CloudFormation code containing the state machine's ASL JSON definition. If you get that and replace the CloudFormation references and intrinsic functions, you can use it to create and execute the state machine in Step Functions Local.
How some people have automated this:
https://nathanagez.com/blog/mocking-service-integration-step-functions-local-cdk/
https://github.com/kenfdev/step-functions-testing
Another solution that might help is to use LocalStack, which supports tools such as CDK and CloudFormation and lets developers run the stack locally.
There are a variety of ways to run it; one of them is to run it manually in a Docker container, following the getting-started instructions.
Next, following the "what's next" instructions, configure the aws-cli or use awslocal.
All further steps and templates should then be the same as against the AWS API in the cloud.
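A minimal sketch of that flow, assuming the localstack, awscli-local and aws-cdk-local packages:
# Install the CLI helpers (awslocal / cdklocal wrap the regular tools and point them at LocalStack)
pip install localstack awscli-local
npm install -g aws-cdk aws-cdk-local

localstack start -d                           # start LocalStack in a Docker container
cdklocal bootstrap && cdklocal deploy         # synth and deploy the CDK app against LocalStack
awslocal stepfunctions list-state-machines    # then use awslocal just like the aws CLI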

Using AWS Batch can a docker image be specified dynamically in a job definition?

I want to create jobs in AWS Batch that vary on the image that is used to launch the container. I'd like to do this without creating a different Job Definition for each image. Is it possible to parameterize the image property using job definition parameters? If not, what's the best way to achieve this or do I have to just create job definitions on the fly in my application?
I would really love this functionality as well. Sadly, it appears the current answer is no.
Batch allows parameters, but they're only for the command.
AWS Batch Parameters
You may be able to find a workaround by using a :latest tag, but then you're buying a ticket to :latest hell.
My current solution is to use my CI pipeline to update all dev job definitions using the aws cli (describe-job-definitions then register-job-definition) on each tagged commit.
To keep my infrastructure-as-code consistent, I've moved the version for batch job definitions into an environment variable that I retrieve before running any terraform commands.
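As a rough illustration of that describe/register flow (the job definition name and image URI are placeholders; assumes jq is available):
# Hypothetical sketch: take the latest active revision, swap the image, register a new revision
IMAGE="123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:${CI_COMMIT_TAG}"

aws batch describe-job-definitions --job-definition-name my-job --status ACTIVE \
  --query 'jobDefinitions | sort_by(@, &revision)[-1]' --output json \
  | jq --arg img "$IMAGE" '{jobDefinitionName, type, containerProperties} | .containerProperties.image = $img' \
  > new-job-def.json

aws batch register-job-definition --cli-input-json file://new-job-def.json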
Typically you make a job definition per Docker image.
However, that job definition and Docker image can certainly do anything you've programmed them to do, so they can be multi-purpose, and you can pass in whatever parameters or command line you would like to execute.
You can override most of the parameters in a job definition when you submit the job.
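For example, a hedged sketch of reusing one job definition with per-submission overrides (queue, definition and values are placeholders; note the image itself cannot be overridden at submit time):
# Override command and environment per submission; the image stays as defined in the job definition
aws batch submit-job \
  --job-name nightly-report \
  --job-queue my-queue \
  --job-definition my-job:3 \
  --container-overrides '{
    "command": ["python", "report.py", "--date", "2024-01-01"],
    "environment": [{"name": "REPORT_ENV", "value": "dev"}]
  }'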