Auto-generate AWS credentials file from AWS SSO?

I use AWS Single Sign-On (SSO) to get programmatic access keys to our various AWS accounts (production, staging, dev, etc.) that I can then use on the command line.
I often need to hop between multiple environments, so I have to manually add several sets of credentials to my ~/.aws/credentials file, one at a time, from the SSO page.
This isn't a huge problem, but it is inconvenient and irritating: it takes time; it has to be repeated a few times a day as the tokens expire; and the profile name on each ~/.aws/credentials snippet has to be manually changed from the account number and SSO role that AWS includes by default (e.g. [123456789012_AWSReadOnlyAccess]) to the account name (e.g. [dev]) so it works with our other tools (in this case Terraform workspaces).
I'd like a way to auto-generate user-friendly content for my ~/.aws/credentials file covering all the SSO accounts I use day to day.
Is there such a facility/tool/script?

I couldn't find an existing example of something I could use to do this, so I put together this gist: a bookmarklet that adds a button to the AWS SSO landing page which, when clicked, generates the ~/.aws/credentials content and copies it to the clipboard ready to use!
https://gist.github.com/bennyrw/4c6b18221611332605ea91474ae04f10
I hope it helps someone with the same problem I had :)
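For anyone who prefers a local script to a bookmarklet: if you use AWS CLI v2 and have already run aws sso login, the access token it caches under ~/.aws/sso/cache can be fed to boto3's sso client to enumerate your accounts and roles and write friendly profile names. A rough Python sketch along those lines (the ACCOUNT_NAMES mapping and the SSO region are placeholders for your own values; pagination, token-expiry checks and error handling are omitted):

import glob
import json
import configparser
from pathlib import Path

import boto3

# Hypothetical mapping from account ID to the profile name your tooling expects.
ACCOUNT_NAMES = {"123456789012": "dev"}

# Pick up the access token cached by `aws sso login` (AWS CLI v2).
token = None
for cache_file in sorted(glob.glob(str(Path.home() / ".aws/sso/cache/*.json"))):
    cached = json.loads(Path(cache_file).read_text())
    token = cached.get("accessToken", token)
if not token:
    raise SystemExit("No cached SSO token found - run `aws sso login` first")

sso = boto3.client("sso", region_name="eu-west-1")  # your SSO region
credentials = configparser.ConfigParser()

for account in sso.list_accounts(accessToken=token)["accountList"]:
    account_id = account["accountId"]
    profile = ACCOUNT_NAMES.get(account_id, account["accountName"])
    roles = sso.list_account_roles(accessToken=token, accountId=account_id)["roleList"]
    if not roles:
        continue
    role_creds = sso.get_role_credentials(
        roleName=roles[0]["roleName"], accountId=account_id, accessToken=token
    )["roleCredentials"]
    credentials[profile] = {
        "aws_access_key_id": role_creds["accessKeyId"],
        "aws_secret_access_key": role_creds["secretAccessKey"],
        "aws_session_token": role_creds["sessionToken"],
    }

with open(Path.home() / ".aws/credentials", "w") as f:
    credentials.write(f)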

Related

Where should private service account key for Google be stored on Mac

I've created a public/private key pair as described here for the Google Cloud Platform.
The problem: I can't find a shred of documentation describing where to put it. This isn't the typical SSH key pair, but rather a JSON file.
Where should it be stored on a Mac so that the gcloud command can authenticate and push to GCP?
If you are authenticating locally with a service account in order to build/push with gcloud, you should set an environment variable in your Mac terminal that points to the JSON key file:
export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/service-account-file.json"
Once this environment variable is defined, all requests will be authenticated as that service account using the key info from the JSON file.
Please consider looking at the doc below for reference:
https://cloud.google.com/docs/authentication/production
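To make the effect of that variable concrete for the client libraries, a minimal Python sketch (the path is a placeholder; similar factory methods exist in the other language clients):

from google.cloud import storage

# With GOOGLE_APPLICATION_CREDENTIALS exported, the client resolves the key file
# automatically through Application Default Credentials.
adc_client = storage.Client()

# Or point at the key file explicitly instead of using the environment variable
# (path is a placeholder):
explicit_client = storage.Client.from_service_account_json(
    "/Users/you/keys/service-account-file.json")

for bucket in explicit_client.list_buckets():
    print(bucket.name)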
The CaioT answer is the right one if you want to use a service account key file locally.
However, ideally this question shouldn't need to be asked, because using service account key files is bad practice. They should be used in only a few cases; otherwise they are a security weakness in your projects.
Take a closer look at this key file. In the end, it's only a file, stored on your Mac (or elsewhere) without any special security protections. You can copy it, edit it, or copy its content without any problem. You can send it by email, or push it to a Git repository (which might be public!)...
If several developers work on the same project, it quickly becomes a mess to know who manages the keys. When there is a leak, it's hard to know which key was used and which needs to be revoked...
So, have a closer look at this part of the documentation. I've also written some articles proposing alternatives to using them. Let me know if you are interested.

Google Cloud Platform: project appears in billing reports but doesn't show in the list of projects

Our organization uses Google Cloud APIs for integrating Maps and other services in a number of websites.
We have often used the same API key, without creating a distinct Google Cloud project (and credentials) for each website/project.
We are trying to better organize our API usage, but we are facing an issue.
While we can consult the reports of our billing account and see the quota for the single API project used for every implementation, we cannot see or manage this project (it does not appear in the list), even though it seems to belong to the same organization. (EDIT: I am not sure that the organization ID is the same, but the name of the organization appears as a prefix to the project name in the billing reports.)
This project was created years ago (and the person who created it appears not to have access to it either), but we need to access it to get a clear understanding of where and how the APIs are used.
The connected APIs are still in use and working, so we assume the project exists.
Can someone point out the possible reasons why a project is not shown even though it belongs to an organization for which we have access as administrators?
Thank you in advance
In order to see a project in lists, you need the resourcemanager.projects.list IAM permission on the project, and to get its metadata, the resourcemanager.projects.get permission.
How did you find that it has the same organizationId? If you managed to get the metadata via gcloud projects describe, you are likely missing the list permission.
In any case, if the project is indeed part of the organization, an org admin should be able to use gcloud projects add-iam-policy-binding to add a new owner/editor.
There is a special case with Apps Script: those create hidden projects.
If all else fails, reach out to GCP Support. Keep in mind though that they will not be able to help you if the project is not within your organization (e.g. created with an unrelated gmail.com account or similar).
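If you'd rather probe this from code than from the console, here is a small sketch using the google-cloud-resource-manager Python client (the project ID is a placeholder); get_project exercises the get permission, while search_projects only returns projects your credentials are allowed to see:

from google.api_core import exceptions
from google.cloud import resourcemanager_v3

client = resourcemanager_v3.ProjectsClient()

# Requires resourcemanager.projects.get on the project.
try:
    project = client.get_project(name="projects/the-billed-project-id")
    print(project.display_name, project.parent)  # parent shows the owning org/folder
except (exceptions.PermissionDenied, exceptions.NotFound):
    print("This identity cannot read that project's metadata")

# Only projects visible to the caller are returned, which is why a project can
# show up on billing reports yet be missing here.
for p in client.search_projects():
    print(p.project_id, p.parent)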

Don't want to log in to Google Cloud with a service account

I am new to Google Cloud and this is my first experience with the platform (before this I was using Azure).
I am working on a C# project that needs to save images online, and for that I created a Cloud Storage bucket.
Now, to use the service, I found out that I have to download a service account credential file and set the path to that file in an environment variable,
which is working fine:
RxStorageClient = StorageClient.Create();
But the problem is that my whole project is a collection of 27 different projects, all in the same solution, there are multiple Cloud Storage accounts involved, and I also want to use them with Docker.
So I was wondering: is there any alternative to this service account system, like an API key or connection string the way Azure provides?
I saw that this initialization function has some other options to authenticate, but I didn't see any examples:
RxStorageClient = StorageClient.Create();
Can anyone please provide a proper example of connecting to Cloud Storage without this service account file system?
Instead of relying on the environment variable, you can download a credential file for each project you need to access.
So for example, if you have three projects whose storage you want to access, you'd need code paths that initialize the StorageClient with the appropriate service account key for each of those projects.
StorageClient.Create() can take an optional GoogleCredential object to authorize it (if you don't specify one, it uses the Application Default Credentials, which can be set, among other ways, via the GOOGLE_APPLICATION_CREDENTIALS env var).
So on GoogleCredential, check out the FromFile(String) static method, where the String is the path to the service account JSON file.
There are no examples. Service accounts are absolutely required, even if hidden from view, to work with Google Cloud products. They are part of the IAM system for authenticating and authorizing various pieces of software for use with various products. I strongly suggest that you become familiar with the mechanisms for providing a service account to a given program. For code running outside of Google Cloud compute and serverless products, the current preferred solution involves using environment variables to point to files that contain credentials. For code running on Google Cloud (like Cloud Run, Compute Engine, Cloud Functions), it's possible to provide service accounts through configuration so that the code doesn't need to do anything special.

How to set up ray project autoscaling on GCP

I am having real difficulty setting up Ray autoscaling on Google Cloud Compute. I can get it to work on AWS with no problem, but I keep running into the following error when running ray up:
googleapiclient.errors.HttpError: https://cloudresourcemanager.googleapis.com/v1/projects?alt=json returned "Service accounts cannot create projects without a parent.">
My project is part of an organization, so I don't understand where this is coming from, or why it would need to create a project in the first place. I have entered my project id in the yaml file like I normally do for AWS.
Thank you very much. I appreciate any help I can get!!
The error message referring to a service account, together with the fact that the project already exists, suggests that the googleapiclient used by the Ray autoscaler is authenticated as a service account that doesn't have access to the project.
If this is true, then here's what I believe happens. Typically, when running Ray GCP Autoscaler, it will first check if the project with the given id exists. In your case, this request returns "not found" because there's no project with the given id associated with the service account. Now, because the project did not exist, Ray will automatically try to create one for you. Typically, if we created a new GCP project with a user account (i.e. non-service account), the newly created project would be associated with the user account's default organization. Service accounts, however, must specify a parent organization explicitly when creating a new project. If we look at the ray.autoscaler.config._create_project function, we see that the arguments passed to the projects.create method omit the 'parent' argument, which explains why you see the error.
To verify if this is true (and hopefully fix the problem), you could change the account used for authenticating with the googleapiclient. I believe the credentials used for these requests are the same as those used by the Google Cloud SDK, so you should be able to configure the account using the gcloud auth login command.
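To see which identity the Google API client picks up by default, a quick check (assuming Ray resolves credentials through the standard Application Default Credentials chain, which is my assumption here):

import google.auth

# Resolves the Application Default Credentials chain: GOOGLE_APPLICATION_CREDENTIALS,
# then gcloud's application-default credentials, then the metadata server.
credentials, project_id = google.auth.default()

# Service-account credentials expose the account email; end-user credentials don't.
print("identity:", getattr(credentials, "service_account_email", "end-user credentials"))
print("default project:", project_id)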
I think the Ray Autoscaler could be improved by either allowing user to explicitly specify the parent organization when creating a new project, or at least by providing a more elaborate error message for this particular case.
I hope this fixes your problem. If it doesn't, and you believe it's a problem with the autoscaler, don't hesitate to open an issue or feature request on the Ray issues page!

AWS Cognito - how to create a backup?

We are currently moving our auth services to AWS Cognito. As it's crucial to keep the user profiles and data safe, we need a backup of the main user pool. We've noticed that there is an option to import users via a .csv file with headers matching the pool attributes, but there is no option to create that .csv automatically. Does anyone know of a solution which automatically generates such a file? The point is to protect the user profiles against accidental deletion of the whole user pool (say, by a tired developer on a Friday night). I've personally tried to implement a workaround by doing all the work manually (getting the headers, listing the users, mapping them and creating the CSV), but that is not very reliable.
I know I am late to the party but leaving this here for future searches.
I too faced the same issue while working with Cognito and thus made a tool to take backups and restore them to userpools.
You can find it here: https://www.npmjs.com/package/cognito-backup-restore
This can be used via the CLI or via imports (in case you want to write your own wrapper or script).
Please suggest any improvements: https://github.com/rahulpsd18/cognito-backup-restore
This is still under development: I plan to use the Cognito User Pool Import Job instead of the aws-sdk adminCreateUser call to create users when restoring, to improve on the current implementation. But it works fine for now.
Cross-Region Cognito Replication will be implemented too once I fine tune the restore process.
Amazon has released a Cognito User Profiles Export Reference Architecture for exporting/importing users from a user pool. There are limitations:
Passwords not backed up; users will need to reset
Pools using MFA are not supported
Cognito sub attributes will be new, so if the system depends on them, they need to be copied to a custom user attribute
Federated users also pose challenges WRT sub
Advanced security - no user history is exported
No support for pools that allow the option of either phone or email usernames
No support for tracked devices
I also created a tool for this, which also supports backing up and restoring groups and relations to users:
https://github.com/mifi/cognito-backup
You can install it like this:
npm i -g cognito-backup
and use it like this:
cognito-backup backup-users eu-west-1_12345
cognito-backup backup-groups eu-west-1_12345
cognito-backup backup-all-users eu-west-1_12345
cognito-backup restore-groups eu-west-1_12345
cognito-backup restore-users eu-west-1_12345 Abcd.1234 --file eu-west-1_12345.json
Note that passwords cannot be backed up due to an AWS limitation.
To prevent accidental pool deletion you could create a Service Control Policy at the org level.
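And if all you need is the raw export the question describes (a periodic dump of user attributes, with passwords excluded by AWS as noted above), here is a minimal boto3 sketch; the pool ID and output file are placeholders, and a real Cognito import job expects the specific header layout returned by get-csv-header, so treat this as a starting point:

import csv
import boto3

USER_POOL_ID = "eu-west-1_example"  # placeholder

cognito = boto3.client("cognito-idp")
rows = []
for page in cognito.get_paginator("list_users").paginate(UserPoolId=USER_POOL_ID):
    for user in page["Users"]:
        row = {attr["Name"]: attr["Value"] for attr in user["Attributes"]}
        row["cognito:username"] = user["Username"]
        rows.append(row)

# The union of attribute names across all users becomes the CSV header.
fieldnames = sorted({key for row in rows for key in row})
with open("user-pool-backup.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)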