What is the difference between AWS SSM GetParameter and GetParameters?
I have a machine with an IAM policy that allows GetParameters, and I try to read a parameter with Terraform using the following code:
data "aws_ssm_parameter" "variable" {
  name = "variable"
}
I get an error indicating I'm not authorized to perform GetParameter.
As the names suggest:
GetParameter returns details about a single parameter per API call.
GetParameters returns details about multiple parameters in one API call.
The parameter details returned are exactly the same for both calls, as both return the same Parameter object:
"Parameter": {
"ARN": "string",
"DataType": "string",
"LastModifiedDate": number,
"Name": "string",
"Selector": "string",
"SourceResult": "string",
"Type": "string",
"Value": "string",
"Version": number
}
The key benefit of GetParameters is that you can fetch many parameters in a single API call, which saves time.
Example use of GetParameter:
aws ssm get-parameter --name /db/password
{
"Parameter": {
"Name": "/db/password",
"Type": "String",
"Value": "secret password",
"Version": 1,
"LastModifiedDate": 1589285865.183,
"ARN": "arn:aws:ssm:us-east-1:xxxxxxxxx:parameter/db/password",
"DataType": "text"
}
}
Example use of GetParameters with two parameters:
aws ssm get-parameters --names /db/password /db/url
{
"Parameters": [
{
"Name": "/db/password",
"Type": "String",
"Value": "secret password",
"Version": 1,
"LastModifiedDate": 1589285865.183,
"ARN": "arn:aws:ssm:us-east-1:xxxxxxxxx:parameter/db/password",
"DataType": "text"
},
{
"Name": "/db/url",
"Type": "String",
"Value": "url to db",
"Version": 1,
"LastModifiedDate": 1589285879.912,
"ARN": "arn:aws:ssm:us-east-1:xxxxxxxxx:parameter/db/url",
"DataType": "text"
}
],
"InvalidParameters": []
}
Example use of GetParameters with a non-existing second parameter (/db/wrong):
aws ssm get-parameters --names /db/password /db/wrong
{
"Parameters": [
{
"Name": "/db/password",
"Type": "String",
"Value": "secret password",
"Version": 1,
"LastModifiedDate": 1589285865.183,
"ARN": "arn:aws:ssm:us-east-1:xxxxxxxxx:parameter/db/password",
"DataType": "text"
}
],
"InvalidParameters": [
"/db/wrong"
]
}
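Back to the error in the question: Terraform's aws_ssm_parameter data source reads a single parameter, so it calls GetParameter, which a GetParameters-only policy doesn't allow (that's exactly what the error message says). Allowing both actions covers it. A minimal policy sketch, where the account ID and parameter ARN are placeholders:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["ssm:GetParameter", "ssm:GetParameters"],
      "Resource": "arn:aws:ssm:us-east-1:123456789012:parameter/variable"
    }
  ]
}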
This is more of a lack of understanding on my part, but I cannot seem to debug this.
I have created a CodePipeline pipeline which runs terraform apply (which in turn creates the AWS infrastructure for me). The pipeline seems to be working.
I need to implement the same pipeline in another account. How can I do so?
I tried to get the pipeline's JSON definition using the command below:
aws codepipeline get-pipeline --name
I converted the JSON to YAML.
When I try to deploy the YAML template in the other account, I get the error below:
Template format error: At least one Resources member must be defined.
ISSUES:
1.) What is the best way to export a CodePipeline to a CloudFormation template?
2.) The approach I used didn't work; how do I fix it?
{
"pipeline": {
"name": "my-code-pipeline",
"roleArn": "arn:aws:iam::aws-account-id:role/service-role/AWSCodePipelineServiceRole-aws-region-my-code-pipeline",
"artifactStore": {
"type": "S3",
"location": "codepipeline-aws-region-45856771421"
},
"stages": [
{
"name": "Source",
"actions": [
{
"name": "Source",
"actionTypeId": {
"category": "Source",
"owner": "ThirdParty",
"provider": "GitHub",
"version": "1"
},
"runOrder": 1,
"configuration": {
"Branch": "master",
"OAuthToken": "****",
"Owner": "github-account-name",
"PollForSourceChanges": "false",
"Repo": "repo-name"
},
"outputArtifacts": [
{
"name": "SourceArtifact"
}
],
"inputArtifacts": [],
"region": "aws-region",
"namespace": "SourceVariables"
}
]
},
{
"name": "codebuild-for-terraform-init-and-plan",
"actions": [
{
"name": "codebuild-for-terraform-init",
"actionTypeId": {
"category": "Build",
"owner": "AWS",
"provider": "CodeBuild",
"version": "1"
},
"runOrder": 1,
"configuration": {
"ProjectName": "my-code-pipeline-build-stage"
},
"outputArtifacts": [],
"inputArtifacts": [
{
"name": "SourceArtifact"
}
],
"region": "aws-region"
}
]
},
{
"name": "manual-approve",
"actions": [
{
"name": "approval",
"actionTypeId": {
"category": "Approval",
"owner": "AWS",
"provider": "Manual",
"version": "1"
},
"runOrder": 1,
"configuration": {
"NotificationArn": "arn:aws:sns:aws-region:aws-account-id:Email-Service"
},
"outputArtifacts": [],
"inputArtifacts": [],
"region": "aws-region"
}
]
},
{
"name": "codebuild-for-terraform-apply",
"actions": [
{
"name": "codebuild-for-terraform-apply",
"actionTypeId": {
"category": "Build",
"owner": "AWS",
"provider": "CodeBuild",
"version": "1"
},
"runOrder": 1,
"configuration": {
"ProjectName": "codebuild-project-for-apply"
},
"outputArtifacts": [],
"inputArtifacts": [
{
"name": "SourceArtifact"
}
],
"region": "aws-region"
}
]
}
],
"version": 11
},
"metadata": {
"pipelineArn": "arn:aws:codepipeline:aws-region:aws-account-id:my-code-pipeline",
"created": "2020-09-17T13:12:50.085000+05:30",
"updated": "2020-09-21T15:46:19.613000+05:30"
}
}
The JSON above is the get-pipeline output that I converted to YAML and used as the CloudFormation template.
The aws codepipeline get-pipeline --name CLI command returns information about the pipeline structure and pipeline metadata, but it is not in the same format as a CloudFormation template (or the Resources part of one).
There is no built-in support for exporting existing AWS resources to create a CloudFormation template, though you do have a couple of options.
Use former2 (built and maintained by AWS Hero, Ian Mckay) to generate a CloudFormation template from the resources you select.
Take the JSON output from the aws codepipeline get-pipeline --name command you used and manually craft a CloudFormation template. The pipeline will be one resource in the list of resources in the full template. The info it contains is pretty close, but needs some adjustments to conform to the CloudFormation resource specification for a CodePipeline, which you can find here. You'll also need to do the same for any other resources you need to bring into the template, using the corresponding aws <service name> describe calls.
If you go with option 2 (and even if you don't), I recommend using cfn-lint with your code editor to help adhere to the spec.
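For context on the error you saw: a CloudFormation template must have a top-level Resources section, and the get-pipeline output has none, hence "At least one Resources member must be defined." A rough sketch of the shape the YAML needs, with your converted pipeline definition nested under Properties and adjusted to the resource spec (the logical ID is arbitrary, the stages are abbreviated, and the Secrets Manager secret name is a placeholder):
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  MyCodePipeline:                       # arbitrary logical ID
    Type: AWS::CodePipeline::Pipeline
    Properties:
      RoleArn: arn:aws:iam::aws-account-id:role/service-role/AWSCodePipelineServiceRole-aws-region-my-code-pipeline
      ArtifactStore:
        Type: S3
        Location: codepipeline-aws-region-45856771421
      Stages:
        - Name: Source
          Actions:
            - Name: Source
              ActionTypeId:
                Category: Source
                Owner: ThirdParty
                Provider: GitHub
                Version: '1'
              Configuration:
                Branch: master
                Owner: github-account-name
                Repo: repo-name
                PollForSourceChanges: 'false'
                OAuthToken: '{{resolve:secretsmanager:github-token}}'   # don't paste the real token into the template
              OutputArtifacts:
                - Name: SourceArtifact
              RunOrder: 1
        # ...the remaining stages follow the same pattern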
I imported an OpenAPI definition JSON into AWS API Gateway, and I noticed that none of the path or query parameter validations work. I want to call /card/{type}?userId={userId}, where type belongs to a set of enum values and userId must match a regex pattern, as follows:
"paths: {
"/card/{type}": {
"get": {
"operationId": "..",
"parameters": [
{
"name": "type",
"in": "path",
"schema": {"$ref": "#/components/schemas/type}
},
{
"name": "userId"
"in": "query",
"schema": {
"type": "string",
"pattern": "<some regex>"
}
},
...
]
}
}
}
Turns out I can input whatever values I want for both path and query parameters. So I tried exporting the OpenAPI file from the AWS Console, and I got:
...
"parameters": [
{
"name": "type",
"in": "path",
"schema": {
"type": "string"
}
},
{
"name": "userId"
"in": "query",
"schema": {
"type": "string"
}
For API Gateway, do the validators not work for the parts of the URL? Requests with a body seem to validate fine. Or is there something I am missing?
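One thing worth checking (judging from the export above, which has dropped the schemas): API Gateway only enforces request validation when a request validator is attached to the method, which an OpenAPI import expresses through the x-amazon-apigateway-request-validators extension, roughly like this (the validator name is arbitrary):
"x-amazon-apigateway-request-validators": {
"params-only": {
"validateRequestBody": false,
"validateRequestParameters": true
}
},
"x-amazon-apigateway-request-validator": "params-only"
Even with a validator attached, parameter validation only checks that required path/query/header parameters are present; enum and pattern constraints on URL parameters are not enforced by API Gateway's built-in validation, which matches the behaviour described.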
I'm using the AWS Parameter Store to save parameters to be used by my Lambda functions (as env variables), 4 parameters actually. But I'm seeing some performance issues when loading them: it takes between 0.2 and 0.6 seconds to load a single parameter, which is a lot of time for my web app.
I measured the time by running this command:
time aws ssm get-parameter --name "sample_parameter"
I would expect it to take less time to load a parameter value, since I need to get 4 parameters. So here is my question: is it good practice to load parameters as JSON text, so I could put all 4 parameters within a single JSON object?
Is there anything I can do to improve performance when calling the get-parameter function?
Thanks
You can get all the parameters at once using get-parameters. In my tests it averages the same time to get all 4 parameters in a single call as it does to get 1.
$ time aws ssm get-parameter --name w1
{
"Parameter": {
"Name": "w1",
"Type": "String",
"Value": "say anything",
"Version": 1,
"LastModifiedDate": 1566914540.044,
"ARN": "arn:aws:ssm:us-east-1:1234567890123:parameter/w1"
}
}
real 0m0.811s
user 0m0.509s
sys 0m0.095s
$ time aws ssm get-parameters --names w1 w2 w3 w4
{
"Parameters": [
{
"Name": "w1",
"Type": "String",
"Value": "say anything",
"Version": 1,
"LastModifiedDate": 1566914540.044,
"ARN": "arn:aws:ssm:us-east-1:1234567890123:parameter/w1"
},
{
"Name": "w2",
"Type": "String",
"Value": "say nothing",
"Version": 1,
"LastModifiedDate": 1566914550.377,
"ARN": "arn:aws:ssm:us-east-1:1234567890123:parameter/w2"
},
{
"Name": "w3",
"Type": "String",
"Value": "say what",
"Version": 1,
"LastModifiedDate": 1566914561.301,
"ARN": "arn:aws:ssm:us-east-1:1234567890123:parameter/w3"
},
{
"Name": "w4",
"Type": "String",
"Value": "say hello",
"Version": 1,
"LastModifiedDate": 1566914574.716,
"ARN": "arn:aws:ssm:us-east-1:1234567890123:parameter/w4"
}
],
"InvalidParameters": []
}
real 0m0.887s
user 0m0.561s
sys 0m0.097s
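As for packing the four values into a single JSON parameter: that also works and gets you down to one call. A minimal sketch, with a hypothetical parameter name and the values from above:
$ aws ssm put-parameter --name /myapp/config --type String \
    --value '{"w1":"say anything","w2":"say nothing","w3":"say what","w4":"say hello"}'
$ aws ssm get-parameter --name /myapp/config --query Parameter.Value --output text
The trade-off is that you lose per-value versioning and per-value IAM scoping, and the 4 KB standard parameter size limit now applies to the whole blob.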
I'm using AWS SSM to run a long script on an EC2 instance.
I would like to configure the execution timeout (execution time, not launch time), and I can't find how to do this in the official documentation (the information there is contradictory or doesn't work).
I'm using only the CLI interface.
This value is a document parameter that can be passed with the --parameters option using the executionTimeout key. You can use aws ssm describe-document to find this and the other document-specific parameters:
aws ssm describe-document --name "AWS-RunShellScript"
{
"Document": {
"Hash": "99749de5e62f71e5ebe9a55c2321e2c394796afe7208cff048696541e6f6771e",
"HashType": "Sha256",
"Name": "AWS-RunShellScript",
"Owner": "Amazon",
"CreatedDate": "2017-08-21T22:25:02.029000+02:00",
"Status": "Active",
"DocumentVersion": "1",
"Description": "Run a shell script or specify the commands to run.",
"Parameters": [
{
"Name": "commands",
"Type": "StringList",
"Description": "(Required) Specify a shell script or a command to run."
},
{
"Name": "workingDirectory",
"Type": "String",
"Description": "(Optional) The path to the working directory on your instance.",
"DefaultValue": ""
},
{
"Name": "executionTimeout",
"Type": "String",
"Description": "(Optional) The time in seconds for a command to complete before it is considered to have failed. Default is 3600 (1 hour). Maximum is 172800 (48 hours).",
"DefaultValue": "3600"
}
],
"PlatformTypes": [
"Linux",
"MacOS"
],
"DocumentType": "Command",
"SchemaVersion": "1.2",
"LatestVersion": "1",
"DefaultVersion": "1",
"DocumentFormat": "JSON",
"Tags": []
}
}
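So, to give a long-running script two hours instead of the default one, pass executionTimeout alongside commands when sending the command (the instance ID and script path below are placeholders):
aws ssm send-command \
    --document-name "AWS-RunShellScript" \
    --instance-ids "i-0123456789abcdef0" \
    --parameters '{"commands":["/opt/scripts/long-script.sh"],"executionTimeout":["7200"]}'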
I'm setting up a pipeline to automate the deployment of CloudFormation stack templates.
The pipeline itself is created in the eu-west-1 region, but the CloudFormation stack templates could be deployed in any other region.
I know how to execute a pipeline action in a different account, but I don't see where to specify the region I would like my template to be deployed in, the way we do with the AWS CLI: aws --region <region> cloudformation deploy ...
Is there any way to trigger a pipeline in one region and execute a deploy action in another region?
The action configuration properties don't offer such a possibility...
A workaround would be to run the aws cloudformation deploy command from the CodeBuild container and specify the desired region, but I would like to know if there is a more elegant way to do it.
If you're looking to deploy to multiple regions, one after the other, you could create a CodePipeline pipeline in every region you want to deploy to, and set up S3 cross-region replication so that the output of the first pipeline becomes the input to a pipeline in the next region.
Here's a blog post explaining this further: https://aws.amazon.com/blogs/devops/building-a-cross-regioncross-account-code-deployment-solution-on-aws/
Since late Nov 2018, CodePipeline has supported cross-region deploys. However, it still leaves a lot to be desired, as you need to create artifact buckets in each region and copy the deployment artifacts over to them (e.g. in the CodeBuild container, as you mentioned) before the Deploy action is triggered. So it's not as automated as it could be, but if you go through the process of setting it up, it works well.
CodePipeline now supports cross-region deployment. To trigger the deployment in a different region, specify the "Region": "us-west-2" property in the CloudFormation action stage, which will run the deployment in that specific region.
Steps to follow for this setup:
Create two buckets in two different regions, for example one in us-east-1 and one in us-west-2 (you can also reuse the buckets CodePipeline created when you first set up a pipeline in each region).
Configure the pipeline so that it uses the respective bucket when taking an action in the respective region.
Specify the region in the action for CodePipeline.
Note: the sample CloudFormation template below will help you do a cross-region CloudFormation deployment.
{
"Parameters": {
"BranchName": {
"Description": "CodeCommit branch name for all the resources",
"Type": "String",
"Default": "master"
},
"RepositoryName": {
"Description": "CodeComit repository name",
"Type": "String",
"Default": "aws-account-resources"
},
"CFNServiceRoleDeployA": {
"Description": "CFN service role for create resourcecs for account-A",
"Type": "String",
"Default": "arn:aws:iam::xxxxxxxxxxxxxx:role/CloudFormation-service-role-cp"
},
"CodePipelineServiceRole": {
"Description": "Service role for codepipeline",
"Type": "String",
"Default": "arn:aws:iam::xxxxxxxxxxxxxx:role/AWS-CodePipeline-Service"
},
"CodePipelineArtifactStoreBucket1": {
"Description": "S3 bucket to store the artifacts",
"Type": "String",
"Default": "bucket-us-east-1"
},
"CodePipelineArtifactStoreBucket2": {
"Description": "S3 bucket to store the artifacts",
"Type": "String",
"Default": "bucket-us-west-2"
}
},
"Resources": {
"AppPipeline": {
"Type": "AWS::CodePipeline::Pipeline",
"Properties": {
"Name": {"Fn::Sub": "${AWS::StackName}-cross-account-pipeline" },
"ArtifactStores": [
{
"ArtifactStore": {
"Type": "S3",
"Location": {
"Ref": "CodePipelineArtifactStoreBucket1"
}
},
"Region": "us-east-1"
},
{
"ArtifactStore": {
"Type": "S3",
"Location": {
"Ref": "CodePipelineArtifactStoreBucket2"
}
},
"Region": "us-west-2"
}
],
"RoleArn": {
"Ref": "CodePipelineServiceRole"
},
"Stages": [
{
"Name": "Source",
"Actions": [
{
"Name": "SourceAction",
"ActionTypeId": {
"Category": "Source",
"Owner": "AWS",
"Version": 1,
"Provider": "CodeCommit"
},
"OutputArtifacts": [
{
"Name": "SourceOutput"
}
],
"Configuration": {
"BranchName": {
"Ref": "BranchName"
},
"RepositoryName": {
"Ref": "RepositoryName"
},
"PollForSourceChanges": true
},
"RunOrder": 1
}
]
},
{
"Name": "Deploy-to-account-A",
"Actions": [
{
"Name": "stage-1",
"InputArtifacts": [
{
"Name": "SourceOutput"
}
],
"ActionTypeId": {
"Category": "Deploy",
"Owner": "AWS",
"Version": 1,
"Provider": "CloudFormation"
},
"Configuration": {
"ActionMode": "CREATE_UPDATE",
"StackName": "cloudformation-stack-name-account-A",
"TemplatePath":"SourceOutput::accountA.json",
"Capabilities": "CAPABILITY_IAM",
"RoleArn": {
"Ref": "CFNServiceRoleDeployA"
}
},
"RunOrder": 2,
"Region": "us-west-2"
}
]
}
]
}
}
}
}