Our AWS accounts are set up so that users log in to one account and then 'assume role' into different accounts to access various services.
We have TravisCI set up so that it runs an integration test against a test account and then uploads a build artifact to S3.
Currently this is done using a single set of IAM user credentials, with the IAM user living in the test account. I would like to move the user into a different account, have TravisCI assume the correct role in the test account to run the tests, and then assume a different role in another account to upload the build artifact. I do not want to add users to the accounts themselves.
I cannot see this functionality built into the S3 deployment provider, and I have not had any luck finding anyone else trying to do this.
I think this may be possible by dynamically populating environment variables during a setup phase and then passing those variables on to later stages, but I cannot work out whether that is possible.
Does anyone have assume role working with TravisCI?
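The environment-variable approach suggested above can work: call STS AssumeRole in a setup step and export the returned temporary credentials so that later stages (the tests, then the S3 upload) pick them up. A minimal Python sketch, assuming boto3 is available and using a placeholder role ARN — this is not a built-in Travis feature, just one way to wire it up:

```python
def export_lines(credentials):
    """Format an STS AssumeRole 'Credentials' dict as shell export
    statements, suitable for writing to a file that later build
    stages can source."""
    return [
        f"export AWS_ACCESS_KEY_ID={credentials['AccessKeyId']}",
        f"export AWS_SECRET_ACCESS_KEY={credentials['SecretAccessKey']}",
        f"export AWS_SESSION_TOKEN={credentials['SessionToken']}",
    ]

def assume_role_credentials(role_arn, session_name="ci-build"):
    """Call STS AssumeRole and return the temporary Credentials dict.
    Needs boto3 and valid base credentials at runtime; the role ARN
    passed in would be e.g. the test-account role (placeholder)."""
    import boto3  # local import so the formatting helper above has no deps
    sts = boto3.client("sts")
    resp = sts.assume_role(RoleArn=role_arn, RoleSessionName=session_name)
    return resp["Credentials"]
```

Running the same pattern twice with two different role ARNs (test role, then artifact-upload role) would cover both stages described in the question.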
This is my first post here. I am working on an AWS CodePipeline which creates new AWS accounts and assigns users through AWS SSO, with permission sets containing specific managed IAM policies and an inline policy as a permissions boundary for the user groups. I would like to use a test pipeline that tests a specific user role in the vended AWS account: check whether the user(s) are able to perform certain actions, such as enabling internet access or creating a policy, and proceed with further pipeline steps based on the results.
Example: the pipeline runs in the POC environment and creates an account; it should then run a test against the SSO user / local IAM user to check whether the user can create an internet gateway, etc. Usually this can be done with the IAM policy simulator CLI, which reports whether the user's action is allowed or not. Depending on the test results, my pipeline should either promote the source to the "master" branch for the production environment, or discard the change if the tests fail.
I have tried a few tools such as TaskCat, but most of them do not perform this kind of functional test; they only check for the existence of a resource.
Any suggestions for tools that would let me perform such functional tests as part of the pipeline would be appreciated.
Thanks in advance.
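The policy-simulator check mentioned above can also be driven directly from the IAM API (SimulatePrincipalPolicy). A sketch, assuming boto3 and a placeholder principal ARN — the offline helper just interprets the response shape:

```python
def denied_actions(evaluation_results):
    """Given the EvaluationResults list returned by IAM
    SimulatePrincipalPolicy, return the actions that were not allowed
    (EvalDecision is 'allowed', 'explicitDeny', or 'implicitDeny')."""
    return [
        r["EvalActionName"]
        for r in evaluation_results
        if r["EvalDecision"] != "allowed"
    ]

def simulate(principal_arn, actions):
    """Run the IAM policy simulator for a principal. Needs boto3 and
    credentials permitted to call iam:SimulatePrincipalPolicy; the
    principal ARN would be the vended account's user or role."""
    import boto3
    iam = boto3.client("iam")
    resp = iam.simulate_principal_policy(
        PolicySourceArn=principal_arn, ActionNames=actions
    )
    return denied_actions(resp["EvaluationResults"])
```

A pipeline stage could fail the build whenever `simulate(...)` returns a non-empty list.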
I managed to use "awspec" to achieve the functional tests for AWS resources. The IAM policy simulator check I was specifically looking for is covered by the "awspec" resource below:
describe iam_role('my-iam-role') do
it { should be_allowed_action('ec2:DescribeInstances') }
it { should be_allowed_action('s3:Put*').resource_arn('arn:aws:s3:::my-bucket-name/*') }
end
I am setting up an Amplify project for a certain project. In the near future, I will want the project to be transferred to a different AWS account but with the exact same configuration. What's the best way to achieve this? Is there any way I can create some sort of script that would set up the same project in a different AWS account?
I do something very similar, leveraging AWS Organizations with multiple member accounts and AWS SSO. At a high level, here are some things you will want to think about...
You can find a high level architecture diagram about this here: https://aws.amazon.com/blogs/mobile/fintech-startup-creditgenie-ultimate-speed-from-mvp-to-growth/
I've been meaning to write a blog post about this, but at a high level...
Create an AWS organization from your root AWS account and set up AWS SSO.
Create multiple member AWS accounts within the organization. e.g., customer1, customer2, etc.
Create branches in your repository that match your account structure e.g., origin/customer1, origin/customer2.
From each member AWS account, create an Amplify app in the Amplify console with one environment that points to the correct branch, e.g., AWS account customer1 should have an Amplify app with one environment called customer1 that points to the branch origin/customer1.
As you develop and merge changes into your main branch, you will want to also merge main into your "production" branches, e.g., merge origin/main -> origin/customer1, etc.
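That last branch-sync step is easy to forget, so it is worth scripting. A sketch that just emits the git commands for each per-customer branch (branch names follow the illustrative customer1/customer2 convention above):

```python
def sync_commands(customers, main_branch="main"):
    """Return the git commands that merge the main branch into each
    per-customer production branch and push it. Purely illustrative:
    the caller (or a CI job) is expected to run these in the repo."""
    cmds = []
    for name in customers:
        cmds += [
            f"git checkout {name}",
            f"git merge origin/{main_branch}",
            f"git push origin {name}",
        ]
    return cmds
```

Running this for `["customer1", "customer2"]` yields the six commands to fan main out to both production branches; each Amplify app then rebuilds from its own branch automatically.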
I'm not sure whether or not this is possible. We have a couple of different Amazon Web Services accounts; in this case, let us just call them:
Test environment 1
Test environment 2
Production environment
I really want to manage as much as possible with our Google accounts. Let us say that, as a developer, I have access to all three accounts in AWS, and I want to be able to access all of them with this one email (is this even possible?).
I have tried following this guide: https://wheniwork.engineering/how-to-setup-google-sso-and-aws-4496f054a707
The SAML login with Google works just fine, but I'm not really sure how I can change accounts while logged in.
I would love for it to show all the organizations I have access to and let me switch between them.
What you are looking for is already provided by AWS; it's called 'AWS Landing Zone'.
Using Landing Zone, you can spin up multiple AWS accounts and log in using one credential (e.g., your existing AD credentials).
If you are new to this, I suggest looking at this AWS tutorial to get an idea of it.
So we have this AWS account with some permissions, and it was working fine at first: we were able to deploy to AWS using the Serverless Framework. But then the client decided to set up an organization, since they have other AWS accounts as well, and to consolidate the billing under one account they added the account they gave us to the organization. Now the problem is that when we deployed using Serverless again, it could no longer see the deployment bucket, failing with an access denied error. But when the account was removed from the organization, Serverless was able to locate the bucket. Are there some additional permissions, or changes to the permissions, that need to be made when an account is linked to an organization? Can someone explain this to me? I can't seem to find any example of my scenario in a Google search. I am new to AWS, and this is the first time I have encountered Organizations in AWS.
The only implication for permissions from joining an OU (organizational unit) would be via a Service Control Policy (SCP). Verify that the SCPs attached to the organization do not block the actions you are attempting to execute.
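A quick way to triage an SCP is to look for explicit Deny statements whose action patterns match what Serverless needs (e.g., the S3 calls against the deployment bucket). A rough sketch; it ignores Condition and NotAction clauses, so a hit only means "worth a closer look":

```python
import fnmatch

def explicit_denies(scp_document, action):
    """Return the Sids (or positional labels) of Deny statements in an
    SCP document whose Action wildcard pattern matches the given
    action. IAM action matching is case-insensitive with * and ?
    wildcards, which fnmatch approximates well enough here."""
    hits = []
    for i, stmt in enumerate(scp_document.get("Statement", [])):
        if stmt.get("Effect") != "Deny":
            continue
        patterns = stmt.get("Action", [])
        if isinstance(patterns, str):
            patterns = [patterns]
        if any(fnmatch.fnmatch(action.lower(), p.lower()) for p in patterns):
            hits.append(stmt.get("Sid", f"statement-{i}"))
    return hits
```

The SCP documents themselves can be fetched with the Organizations API (`list_policies` with the `SERVICE_CONTROL_POLICY` filter, then `describe_policy`) from the management account.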
We would love to get more information if possible, but I would maybe start looking in the following places in your consolidated account:
Trusted access for AWS services - https://console.aws.amazon.com/organizations/home?#/organization/settings
https://console.aws.amazon.com/organizations/home?#/policies
See if anything was changed there: whether someone added a policy, or whether AWS Resource Access Manager is disabled.
OK, so I have the build script to build my Node.js app.
The idea is to set up a CI/CD pipeline that will build our app across different AWS accounts, such as DEV, STAGING, and PROD.
I have a repo in Account A (Dev) that hosts a Node.js app.
I want to be able to create a pipeline in Account B that checks out code from the repo in Account A.
And finally, when the pipeline is complete, it should deploy the built/compiled Node.js app to Account C (QA) and Account D (Prod).
My issue is not how to build a Node.js app, but rather how to allow CodePipeline in Account B to check out the repo in Account A and then deploy the built app to staging and prod.
My ideas from reading around:
Create IAM roles for the pipeline in Account A which allow it to
check out CodeCommit repos
Have the pipeline in Account B assume the role from Account A somehow
It's still not clear to me how to go about doing this; I'm just getting into AWS.
I will update this post if I come across a solution, but maybe someone has a tutorial, or could point me to documentation, or list the steps or an example here.
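For the "assume the role from Account A somehow" idea, the key piece is the trust policy on the role in Account A: it must name Account B (the pipeline account) as a trusted principal. A sketch that builds such a document; the account ID is a placeholder:

```python
def cross_account_trust_policy(trusted_account_id):
    """Build the trust (assume-role) policy for a role in Account A
    that lets principals in the pipeline account (Account B) call
    sts:AssumeRole on it. Using the :root principal delegates to
    Account B's own IAM policies to decide who may assume it."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {
                    "AWS": f"arn:aws:iam::{trusted_account_id}:root"
                },
                "Action": "sts:AssumeRole",
            }
        ],
    }
```

The role's own permissions policy would then grant the CodeCommit read actions on the repo, and the pipeline's role in Account B needs a matching `sts:AssumeRole` allowance on that role's ARN.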
From my understanding, it is not possible to build a cross-account pipeline using CodePipeline alone. What you need to do is build your pipeline in your central account first.
The central-account pipeline would include the appropriate number of stages, resulting in a release candidate for the dev, staging, and prod AWS accounts.
As you get to the different account stages, you should push your artifacts to S3 buckets in the related accounts, and these S3 buckets should be the sources of the CodePipelines in those accounts.
This way you create "deployment" pipelines in each account which start in S3 and end in whatever environment you are thinking of. The S3 buckets of these accounts can be given bucket policies so that they only receive files from your central account.
This is, of course, not ideal, but it's how I solved this issue beforehand: build in one account and deliver to deployment pipelines in the other accounts. If someone knows a better solution, I would love to hear it.
Good luck!
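The bucket-policy part of the answer above can be sketched out as well. A minimal example that builds a policy allowing only the central build account to put artifacts into a target account's deployment bucket; the bucket name and account ID are placeholders, and a real setup would likely also constrain encryption and object ownership:

```python
def deployment_bucket_policy(bucket_name, central_account_id):
    """Build an S3 bucket policy that lets the central build account
    (and only it) upload objects into this account's deployment
    bucket. The deployment pipeline in this account then triggers
    off new objects in the bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowCentralAccountPut",
                "Effect": "Allow",
                "Principal": {
                    "AWS": f"arn:aws:iam::{central_account_id}:root"
                },
                "Action": "s3:PutObject",
                "Resource": f"arn:aws:s3:::{bucket_name}/*",
            }
        ],
    }
```

The resulting dict can be serialized with `json.dumps` and applied via the S3 `put-bucket-policy` API from each target account.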
AWS provides a guide with source code that does something close to what you are trying to do.
It should get you close enough, and it covers the permissions needed for the account to assume a role to check out your repo in another account.