Cross account codepipeline with S3 source in different account - amazon-web-services

I have two AWS accounts:
Account A: Codepipeline
Account B: S3 containing zip
The requirement is that the CodePipeline in Account A needs to use the S3 zip file in Account B as its source stage.
The pipeline should also detect changes in the S3 path whenever a new zip file is uploaded and trigger a run.
Can anyone help me with the cross-account roles and steps required for the above, please?

Create a pipeline in CodePipeline that uses resources from another AWS account -- https://docs.aws.amazon.com/codepipeline/latest/userguide/pipelines-create-cross-account.html
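As a minimal sketch of the first piece (all account IDs and bucket names here are placeholders): the bucket in Account B needs a bucket policy that grants Account A read access to the zip object, along the lines of:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowPipelineAccountRead",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::ACCOUNT_A_ID:root" },
      "Action": [
        "s3:GetObject",
        "s3:GetObjectVersion",
        "s3:GetBucketVersioning",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::my-source-bucket-in-b",
        "arn:aws:s3:::my-source-bucket-in-b/*"
      ]
    }
  ]
}
```

Two further points from the linked guide: the source bucket must have versioning enabled for an S3 source action, and a cross-account pipeline generally needs a customer-managed KMS key on the artifact store that both accounts can use. For change detection across accounts, the usual pattern is an EventBridge rule (fed by CloudTrail data events on the bucket) in Account B that forwards the object-created event to Account A to start the pipeline.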

Related

AWS cross-account access to a CodeCommit repository as the source for a CodeBuild job

I have two AWS accounts, A and B.
All my CodeCommit repositories are in account A.
Now I want to create a CodeBuild job in account B for repositories in account A.
I am trying to figure out how to get the list of repositories from account A while selecting the source for the CodeBuild job in account B.
I am not sure how to make the repositories from account A appear in the source Repository field in account B.
I have followed the tutorial below only up to the second topic.
https://docs.aws.amazon.com/codecommit/latest/userguide/cross-account.html
Any help will be appreciated.
You can configure access to CodeCommit repositories for IAM users and groups in another AWS account. This is often referred to as cross-account access.
Mainly, you need to do the following:
Create a policy and role in the repository account with the needed permissions.
Create a policy attached to your CodeBuild role that allows sts:AssumeRole with the created role as the Resource,
eg.
"Resource": "arn:aws:iam::REPO_ACCOUNT:role/MyCrossAccountRepositoryContributorRole"
This will enable CodeBuild to access the needed CodeCommit repository.
This page explains it very well: Configure cross-account access to an AWS CodeCommit repository using roles.
Also check this blog post, which explains what you want in a little more detail:
AWS CodePipeline with a Cross-Account CodeCommit Repository.
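Concretely, the two policies might look like this (account IDs and role names are placeholders). The policy attached to the CodeBuild role in the build account:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::REPO_ACCOUNT:role/MyCrossAccountRepositoryContributorRole"
    }
  ]
}
```

and the trust policy on MyCrossAccountRepositoryContributorRole in the repository account, so the build account is allowed to assume it:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::BUILD_ACCOUNT:root" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```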

Can I run S3 Batch copy operation job from source account

I am trying to run an S3 Batch Operations copy job to copy a large amount of data from one S3 bucket to another.
Source account: contains the S3 bucket with the objects.
Destination account: contains the S3 bucket with the manifest, and the destination S3 bucket for the objects.
I need to run the Batch Operations job from the source account, or from a third account altogether.
So far, I am able to succeed in the following:
Run s3 batch job within same aws account https://docs.aws.amazon.com/AmazonS3/latest/userguide/batch-ops-managing-jobs.html
Run s3 batch job from destination s3 bucket https://aws.amazon.com/blogs/storage/cross-account-bulk-transfer-of-files-using-amazon-s3-batch-operations/
However, when I try to create a batch job in the source account, I get errors.
When I enter the manifest file from the destination account, I get the error:
Unable to get the manifest object’s ETag. Specify a different object to continue.
When I enter the destination S3 bucket from the destination account, I get the error:
Insufficient permissions to access <s3 bucket>
Is there a way to change configurations to enable running batch job from source account?
Each Amazon S3 Batch Operation job is associated with an IAM Role.
The IAM Role would need permission to access the S3 bucket in the other AWS Account (or permission to access any S3 bucket).
In addition, the Destination Bucket (in the other AWS Account) will also need a Bucket Policy that permits that IAM Role to access the bucket (at minimum GetObject).
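A sketch of that destination bucket policy (bucket and role names are placeholders; the copy operation itself also needs PutObject there, and the manifest bucket needs to allow GetObject on the manifest in the same way):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBatchOperationsRole",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::SOURCE_ACCOUNT:role/MyBatchOperationsRole" },
      "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::destination-bucket",
        "arn:aws:s3:::destination-bucket/*"
      ]
    }
  ]
}
```

The "Unable to get the manifest object's ETag" error is typically this same problem on the manifest bucket: the Batch Operations role in the source account has no cross-account permission to read the manifest object.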

I want my lambda code to directly upload files into an Amazon S3 bucket of a different account

So I have a Lambda function that triggers an Amazon SageMaker processing job, and this job currently writes a few files to my Amazon S3 bucket. I have set my output_uri = 's3://outputbucket-in-my-acc/'. Now I want the same files to be uploaded directly to a different AWS account and not to mine. How do I achieve this? I want no traces of the files to be stored in my account.
I found a similar solution here, but it copies the files into the other account while the originals remain in the source account:
AWS Lambda put data to cross account s3 bucket
Your Lambda function (Account A) needs to assume a role in the other account (Account B) that has permission to write to the S3 location. For that, you need to establish trust between the accounts with a cross-account role.
Afterwards you assume the role in Account B from your Lambda function's code and execute the S3 command.
Find an example with boto3 here: https://aws.amazon.com/premiumsupport/knowledge-center/lambda-function-assume-iam-role/
The SageMaker APIs for job creation typically (always?) include a RoleARN which will be the IAM role that SageMaker assumes to do work on your behalf. That IAM role must have the necessary permissions so that Amazon SageMaker can successfully complete its task (e.g. have PutObject permission to the relevant S3 bucket) and must have the necessary trust policy allowing the SageMaker service (sagemaker.amazonaws.com) to assume the role.

CodeDeploy step of CodePipeline fails because of insufficient role permissions

I have a 3 stage CodePipeline on AWS.
Source: Checks out upon commit a specific branch of CodeCommit (success)
Build: Runs some tests on a docker image via CodeBuild (success)
Deploy: Performs a deployment on a deployment group (a.k.a. some specifically tagged EC2 instances) via CodeDeploy (failure).
Step 3 fails with
Unable to access the artifact with Amazon S3 object key
'someitem-/BuildArtif/5zyjxoZ' located in the Amazon S3
artifact bucket 'codepipeline-eu-west-1-somerandomnumber'. The provided
role does not have sufficient permissions.
Which role is the latter referring to?
The service role of CodePipeline or the service role of CodeDeploy?
I am almost certain I have attached the appropriate policies to both though ...
Here is a snippet of my CodePipeline service role
Try giving the role the "CodeDeploy" policy with full access; it should work.
This could also be due to the actual BuildArtifact not existing. Check the specified path in your S3 bucket to see whether the object actually exists. CodePipeline just gives CodeDeploy a reference to an artifact it thinks has been built and uploaded, but it doesn't really know.
This issue is not related to the roles assigned to either CodePipeline or CodeBuild. If you investigate, you will find that in the S3 bucket 'codepipeline-eu-west-1-somerandomnumber' there is no folder "BuildArtif" and certainly no file "5zyjxoZ".
The issue is that CodeBuild is not sending any artifact to CodeDeploy; change the 'Input artifacts' for CodeBuild to the output artifact of the Source stage of the pipeline and the issue should be resolved.
The error message should be referring to the CodeDeploy role. The CodeDeploy action passes the S3 artifact by reference to CodeDeploy, so the CodeDeploy role needs to have read access to the CodePipeline artifact.
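If it is indeed the permissions side, a sketch of what the CodeDeploy-side role (e.g. the EC2 instance profile fetching the bundle) would need, using the artifact bucket name from the error:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:GetObjectVersion", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::codepipeline-eu-west-1-somerandomnumber",
        "arn:aws:s3:::codepipeline-eu-west-1-somerandomnumber/*"
      ]
    }
  ]
}
```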

Permission of images in bucket with command aws s3 sync

I have two AWS accounts, and I synced an S3 bucket between them using the command below, but when I check the images in both accounts they do not have the same permissions: the source account has public images in a particular folder while the target account does not.
aws s3 sync s3://sourcebucket s3://destinationbucket
I want the target to contain exactly the same images as the source, with the same permissions. Can anyone help?
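One likely explanation (hedged, since the question does not show the bucket setup): aws s3 sync does not replicate per-object ACLs; it writes new objects with the destination account's default permissions. You can apply a canned ACL to everything copied in a run, for example:

```shell
# Applies the ACL to every object written by this sync run; it does not
# copy the individual ACLs that objects had in the source bucket
aws s3 sync s3://sourcebucket s3://destinationbucket --acl public-read
```

If only some folders should be public, run a separate sync with --acl for those prefixes, or set the ACLs afterwards per object with aws s3api put-object-acl on the destination bucket.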