AccessDeniedException while starting pipeline - amazon-iam

I'm using AWS Lambda with AWS CDK to start my pipeline:
const PipelinesParams = {
  name: "GatsbyPipelineLolly",
};
try {
  const pipeline = new AWS.CodePipeline();
  await docClient.put(params).promise();
  pipeline.startPipelineExecution(PipelinesParams, function (err: any, data: any) {
    if (err) {
      console.log(err);
    } else {
      console.log(data);
    }
  });
} catch (err) {
  console.log(err);
}
and this is the action I authorized:
const policy = new PolicyStatement();
policy.addActions('s3:*');
policy.addResources('*');
policy.addActions('codepipeline:*');
I'm still getting an unauthorized error. An image of the error is also attached for review.

Are you sure the policy is attached to the role with which you are deploying the pipeline?

It looks like you've created a policy but haven't attached it to the role you are using (from your error message). Please see:
https://docs.aws.amazon.com/cdk/api/latest/docs/aws-iam-readme.html#using-existing-roles and https://docs.aws.amazon.com/cdk/api/latest/docs/#aws-cdk_aws-iam.Policy.html#roles
const role = iam.Role.fromRoleArn(this, 'Role', 'arn:aws:iam...')
policy.attachToRole(role)
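Note that in the question's snippet, policy is a PolicyStatement, which grants nothing until it is attached to the Lambda's execution role. A minimal sketch of one way to wire that up in CDK, assuming handler is the hypothetical lambda.Function that starts the pipeline:
const statement = new PolicyStatement();
statement.addActions('codepipeline:StartPipelineExecution');
statement.addResources('*'); // ideally scope this down to the pipeline's ARN
// addToRolePolicy attaches the statement to the function's execution role
handler.addToRolePolicy(statement);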

Related

How to update Secrets Manager secret value from Lambda function?

I have an access token retrieved from an API, and its lifetime is 5 hours. I saw that Secrets Manager can rotate a secret, but the minimum rotation interval is 1 day. Because of this, I thought I could write a Lambda function that runs every 5 hours, takes a new token from the API, and updates the secret with it. This answer https://stackoverflow.com/a/66610124/11902308 mentions that it can be done, but I couldn't find any resource about how to update a secret from a Lambda function.
Note:
I have multiple tasks, and a token becomes invalid as soon as a new one is generated, so the tasks have to share it.
You didn't mention which AWS SDK you are using, but here is an example for the AWS SDK for JavaScript that creates or updates a secret.
Additionally, it handles the case of deleted secrets (Secrets Manager doesn't delete secrets immediately; it marks them as scheduled for deletion) by restoring and updating them.
import { AWSError } from 'aws-sdk';
import SecretsManager, { SecretARNType } from 'aws-sdk/clients/secretsmanager';

const secretsManager = new SecretsManager({ apiVersion: '2017-10-17' });

const putSecret = async (name: string, secret: Record<string, unknown>, secretId?: SecretARNType): Promise<SecretARNType> => {
  try {
    const result = secretId
      ? secretsManager.updateSecret({
          SecretId: secretId,
          SecretString: JSON.stringify(secret),
        })
      : secretsManager.createSecret({
          Name: name,
          SecretString: JSON.stringify(secret),
        });
    const { ARN } = await result.promise();
    if (!ARN) throw new Error(`Error saving secret ${name}`);
    return ARN;
  } catch (error) {
    const awsError = error as AWSError;
    // InvalidRequestException indicates that the secret is scheduled for deletion
    if (awsError.code === 'InvalidRequestException') {
      // restore the existing secret
      await secretsManager
        .restoreSecret({
          SecretId: secretId || name,
        })
        .promise();
      // update the restored secret
      const { ARN } = await secretsManager
        .updateSecret({
          SecretId: secretId || name,
          SecretString: JSON.stringify(secret),
        })
        .promise();
      if (!ARN) throw new Error(`Error restoring secret ${name}`);
      return ARN;
    }
    throw error;
  }
};
The code is in TypeScript but should be easy to convert into the language of your choice, because the AWS SDK functions have the same names in other languages.
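For the five-hour refresh itself, putSecret can be called from a handler scheduled by an EventBridge rule (e.g. rate(5 hours)). A rough sketch, where fetchToken and the secret name are hypothetical placeholders:
// Hypothetical scheduled handler; fetchToken and 'my-api-token' are
// placeholders, not part of the answer above.
export const handler = async (): Promise<void> => {
  const token = await fetchToken();           // fetch a fresh token from your API
  await putSecret('my-api-token', { token }); // create, update, or restore the secret
};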

How to update a Step Function in AWS CLI without Deleting the Logs

I have been working with AWS for the last month, and I need to know how to update a Step Function without changing its name.
The approach the documentation provides for making changes to a Step Function is to rename the existing one and put the changes in the file, but that would eliminate the execution logs that have accumulated.
For example, if I replace the definition with something else, I would have to change the whole dynamic of the project to make the changes appear in the AWS CLI.
Can somebody please provide a solution for this?
The update can be done through the AWS SDK (the same UpdateStateMachine API is exposed in the AWS CLI as aws stepfunctions update-state-machine). Follow the code below; it keeps all the existing execution logs as well.
let aws = require("aws-sdk");

let roleArn = `roleARN goes here`;
let params = {
  name: stepFunctionName,
  roleArn: roleArn,
  definition: JSON.stringify(definitionGoesHere),
};

let stepFunctions = new aws.StepFunctions();
stepFunctions.createStateMachine(params, function (err, data) {
  if (err) {
    // an error occurred
    console.log("error occurred while creating the step function");
    console.log(err, err.stack);
    if (err.code === "StateMachineAlreadyExists" && err.statusCode === 400) {
      let paramsUpdate = {
        stateMachineArn: "stateMachine ARN for the existing stateMachine",
        definition: JSON.stringify(definitionGoesHere),
        loggingConfiguration: {
          includeExecutionData: true,
        },
        roleArn: roleArn,
      };
      stepFunctions.updateStateMachine(paramsUpdate, function (error, updateData) {
        if (error) {
          console.log("error occurred while updating the step function.");
          console.log("Error", error.stack);
          return;
        }
        console.log("step function updated successfully");
        console.log("response", updateData);
      });
    } else {
      console.log("step function does not exist and the function creation and update failed in the process.");
      console.log("definition", JSON.stringify(definitionGoesHere));
    }
  } else {
    console.log(data); // successful response
  }
});
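One gap in the snippet above is the hardcoded state machine ARN in paramsUpdate. If you only know the name, a small lookup sketch like this can resolve it (AWS SDK v2; assumes at most 1000 state machines, otherwise keep paging with nextToken):
// Sketch: resolve an existing state machine's ARN by name so that
// updateStateMachine does not need a hardcoded ARN.
async function getStateMachineArn(name) {
  const { stateMachines } = await stepFunctions
    .listStateMachines({ maxResults: 1000 })
    .promise();
  const match = stateMachines.find((sm) => sm.name === name);
  return match && match.stateMachineArn;
}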

Getting 'The security token included in the request is invalid.' when trying to get a pre signed url for s3

Basically, I'm trying to get a pre-signed URL for the putObject method in S3. I send this link back to my frontend, which then uses it to upload the file directly from the client.
Here's my code:
const AWS = require("aws-sdk");

const s3 = new AWS.S3({
  accessKeyId: process.env.AWS_IAM_ACCESS,
  secretAccessKey: process.env.AWS_IAM_SECRET,
  region: 'ap-southeast-1',
});

const getPreSignedUrlS3 = async (data) => {
  try {
    // DO SOMETHING HERE TO GENERATE KEY
    const options = {
      Bucket: process.env.AWS_USER_CDN,
      ContentType: data.type,
      Key: key,
      Expires: 5 * 60,
    };
    return new Promise((resolve, reject) => {
      s3.getSignedUrl("putObject", options, (err, url) => {
        if (err) {
          reject(err);
        } else {
          resolve({ url, key });
        }
      });
    });
  } catch (err) {
    console.log(err);
    return {
      status: 500,
      msg: 'Failed to sync with CDN, Please try again later!',
    };
  }
};
I'm getting the following error from the AWS SDK: "The security token included in the request is invalid."
Things I have tried:
Double-checked the permissions on my IAM user, and even made the bucket public for testing. My IAM user has the full S3 access policy attached.
Tried using my root user's security key and access details; still got the same error.
Regenerated new security credentials for my IAM user. I don't have MFA turned on.
I'm following this documentation.
SDK version: 2.756.0
I've been stuck on this for a while now. Any help is appreciated. Thank you.
Pre-signed URLs are created locally in the SDK, so there's no need to use the asynchronous calls.
Instead, use a synchronous call to simplify your code, something like this:
const getPreSignedUrlS3 = (Bucket, Key, ContentType, Expires = 5 * 60) => {
  const params = {
    Bucket,
    ContentType,
    Key,
    Expires,
  };
  return s3.getSignedUrl("putObject", params);
};
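Usage is then a plain synchronous call (the key and content type here are hypothetical):
// The returned URL is handed back to the frontend, which must PUT the file
// with the same Content-Type that was used when signing.
const url = getPreSignedUrlS3(process.env.AWS_USER_CDN, 'uploads/avatar.png', 'image/png');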

Problems accessing storage from Lambda function in Amplify

What I want to do
I want to create a REST API that returns data from my DynamoDB table, which is created by a GraphQL model.
What I've done
Create GraphQL model
type Public #model {
id: ID!
name: String!
}
Create REST API with Lambda Function with access to my PublicTable
$ amplify add api
? Please select from one of the below mentioned services: REST
? Provide a friendly name for your resource to be used as a label for this category in the project: rest
? Provide a path (e.g., /book/{isbn}): /items
? Choose a Lambda source Create a new Lambda function
? Provide an AWS Lambda function name: listPublic
? Choose the runtime that you want to use: NodeJS
? Choose the function template that you want to use: Hello World
Available advanced settings:
- Resource access permissions
- Scheduled recurring invocation
- Lambda layers configuration
? Do you want to configure advanced settings? Yes
? Do you want to access other resources in this project from your Lambda function? Yes
? Select the category storage
? Storage has 8 resources in this project. Select the one you would like your Lambda to access Public:#model(appsync)
? Select the operations you want to permit for Public:#model(appsync) create, read, update, delete
You can access the following resource attributes as environment variables from your Lambda function
API_MYPROJECT_GRAPHQLAPIIDOUTPUT
API_MYPROJECT_PUBLICTABLE_ARN
API_MYPROJECT_PUBLICTABLE_NAME
ENV
REGION
? Do you want to invoke this function on a recurring schedule? No
? Do you want to configure Lambda layers for this function? No
? Do you want to edit the local lambda function now? No
Successfully added resource listPublic locally.
Next steps:
Check out sample function code generated in <project-dir>/amplify/backend/function/listPublic/src
"amplify function build" builds all of your functions currently in the project
"amplify mock function <functionName>" runs your function locally
"amplify push" builds all of your local backend resources and provisions them in the cloud
"amplify publish" builds all of your local backend and front-end resources (if you added hosting category) and provisions them in the cloud
Succesfully added the Lambda function locally
? Restrict API access No
? Do you want to add another path? No
Successfully added resource rest locally
Edit my Lambda function
/* Amplify Params - DO NOT EDIT
  API_MYPROJECT_GRAPHQLAPIIDOUTPUT
  API_MYPROJECT_PUBLICTABLE_ARN
  API_MYPROJECT_PUBLICTABLE_NAME
  ENV
  REGION
Amplify Params - DO NOT EDIT */
const AWS = require("aws-sdk");

const region = process.env.REGION;
AWS.config.update({ region });

const docClient = new AWS.DynamoDB.DocumentClient();

const params = {
  TableName: "PublicTable",
};

async function listItems() {
  try {
    const data = await docClient.scan(params).promise();
    return data;
  } catch (err) {
    return err;
  }
}

exports.handler = async (event) => {
  try {
    const data = await listItems();
    return { body: JSON.stringify(data) };
  } catch (err) {
    return { error: err };
  }
};
Push my updates
$ amplify push
Open my REST API endpoint /items
{
"message": "User: arn:aws:sts::829736458236:assumed-role/myprojectLambdaRolef4f571b-dev/listPublic-dev is not authorized to perform: dynamodb:Scan on resource: arn:aws:dynamodb:us-east-1:8297345848236:table/Public-ssrh52tnjvcdrp5h7evy3zdldsd-dev",
"code": "AccessDeniedException",
"time": "2021-04-21T21:21:32.778Z",
"requestId": "JOA5KO3GVS3QG7RQ2V824NGFVV4KQNSO5AEMVJF66Q9ASUAAJG",
"statusCode": 400,
"retryable": false,
"retryDelay": 28.689093010346657
}
Problems
What did I do wrong?
How do I access my table, and why didn't I get access when I created the function?
Why are API_MYPROJECT_PUBLICTABLE_NAME and the other constants needed?
Solution
The problem turned out to be either the Node.js version or the amplify-cli version. After updating amplify-cli and installing Node 14.16.0, everything worked.
I also changed the table name to the one Amplify creates for us (read from the environment variable), although this code did not work before the updates. The code became:
/* Amplify Params - DO NOT EDIT
  API_MYPROJECT_GRAPHQLAPIIDOUTPUT
  API_MYPROJECT_PUBLICTABLE_ARN
  API_MYPROJECT_PUBLICTABLE_NAME
  ENV
  REGION
Amplify Params - DO NOT EDIT */
const AWS = require("aws-sdk");

const region = process.env.REGION;
const tableName = process.env.API_MYPROJECT_PUBLICTABLE_NAME;
AWS.config.update({ region });

const docClient = new AWS.DynamoDB.DocumentClient();

const params = {
  TableName: tableName,
};

async function listItems() {
  try {
    const data = await docClient.scan(params).promise();
    return data;
  } catch (err) {
    return err;
  }
}

exports.handler = async (event) => {
  try {
    const data = await listItems();
    return { body: JSON.stringify(data) };
  } catch (err) {
    return { error: err };
  }
};
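One more detail worth noting: if the REST path is wired as a Lambda proxy integration, API Gateway also expects a statusCode alongside the body. A minimal sketch of that response shape (the CORS header is an assumption, not something from the original code):
exports.handler = async (event) => {
  const data = await listItems();
  return {
    statusCode: 200,
    headers: { "Access-Control-Allow-Origin": "*" }, // assumption: browser clients need CORS
    body: JSON.stringify(data),
  };
};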

How to return a YAML file as an output artifact from Lambda to CodePipeline

I am trying to return a YAML CloudFormation template file to CodePipeline so it can be passed to CloudFormation. The file is uploaded to S3, and can be accessed and read by Lambda just fine. The issue I have is turning this file into an output artifact.
I am currently able to output the raw text as an 'Output variable', which I've confirmed in the CodePipeline execution history in the AWS Console. On that same page an artifact is also listed.
However, the artifact's link takes me to an error page in S3, leading me to believe the artifact is never actually populated.
This is the Lambda function I have at the moment:
var AWS = require("aws-sdk");

var s3 = new AWS.S3();
var codepipeline = new AWS.CodePipeline();

exports.handler = (events, context) => {
  var jobId = events["CodePipeline.job"].id; // the job ID arrives on the event
  const params = {
    Bucket: "myBucket",
    Key: "myFile.yml",
  };
  s3.getObject(params, function (err, data) {
    if (err) {
      // Handling
    } else {
      var yamlString = data.Body.toString("utf-8");
      var params = {
        jobId: jobId,
        outputVariables: {
          s3Artifact: yamlString,
        },
      };
      codepipeline.putJobSuccessResult(params, function (err, data) {
        if (err) {
          context.fail(err);
        } else {
          context.succeed("Template forwarded as output variable");
        }
      });
      return;
    }
  });
};
What confuses me is that CodePipeline clearly allows me to specify an output artifact for this Lambda function, yet I cannot work out how to do so.
Any help is much appreciated.
Ultimately it seems I was going about this the wrong way. I changed the pipeline to have only an S3 source action, triggered by a custom CloudWatch event that fires when a new image is pushed to ECR. I then manually disabled the S3 event created by CodePipeline so that only an ECR push triggers a deployment.
See more here.
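As an aside, if a Lambda action does need to populate its output artifact, the job event carries the artifact's S3 location plus temporary credentials scoped to it. A rough sketch, assuming zipBuffer is a zip archive of the template built elsewhere (CodePipeline artifacts are zip files):
const AWS = require("aws-sdk");

exports.handler = async (event) => {
  const job = event["CodePipeline.job"];
  const artifact = job.data.outputArtifacts[0];
  const { bucketName, objectKey } = artifact.location.s3Location;

  // Artifacts must be written with the job's scoped credentials.
  const s3 = new AWS.S3({ credentials: job.data.artifactCredentials });
  await s3.putObject({ Bucket: bucketName, Key: objectKey, Body: zipBuffer }).promise();

  const codepipeline = new AWS.CodePipeline();
  await codepipeline.putJobSuccessResult({ jobId: job.id }).promise();
};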