I already have Google Authenticator installed on my iPhone and I'm using it to sign in to my AWS root account. I want to add the ability to log in with MFA from my Android phone as well, using a corresponding token-generator Android app.
Is it possible to add a second device, and how exactly? Or is AWS root account MFA bound to one (virtual) device?
🚨 AWS finally provides support for adding additional MFA devices. 🚨
As of November 16, 2022:
https://aws.amazon.com/blogs/security/you-can-now-assign-multiple-mfa-devices-in-iam
I'm leaving the old answer below for reference, but it should no longer be needed.
You can only have one MFA device tied to your root account. You would need to set up a separate IAM user account for your separate device.
From the FAQ:
Q. Can I have multiple authentication devices active for my AWS account?
Yes. Each IAM user can have its own authentication device. However, each identity (IAM user or root account) can be associated with only one authentication device.
Update: So while it's not officially supported, here is one guy who claims he was able to register Google Authenticator on two devices by doing both at the exact same time with the same QR code. Granted he's not doing this with AWS, but it could be worth a try.
https://www.quora.com/Can-Google-Authenticator-be-used-on-multiple-devices
Update 2: I've started using Authy for MFA rather than Google Authenticator. One of the cool things Authy now supports is multiple devices for all your MFA tokens. I currently have both my phone and my tablet set up with access to my AWS account using Authy Multi Device.
http://blog.authy.com/multi-device
Here is the solution:
When the AWS MFA page shows the barcode, scan the barcode from the different devices (I've tried with 3) at the same time. They all generate the same codes; fill in the form with those codes and it works.
This is not really a new answer, but it tries to clarify and explain a little better (or at least differently) why different virtual devices can be considered to be one virtual device.
At the moment (2020-05-07) you cannot have two different authentication devices for the same user (i.e. more than one of the following: a U2F USB key / a virtual device / a hardware device).
However, you can install the same virtual device application on multiple devices (mobile phones / tablets / PCs) if you initialize them all with the same initialization code (QR code).
The virtual MFA device is just an implementation of the TOTP algorithm ( https://en.wikipedia.org/wiki/Time-based_One-time_Password_algorithm ).
Each TOTP application has to be initialized with a 'secret' code (the QR code).
So if you scan the same QR code with different TOTP apps, then all of these apps can authenticate (they will behave identically).
When initializing at AWS you are asked to enter two consecutive codes generated by your TOTP app.
(Just enter them from any of the apps that you initialized with the QR code. Or, if you are feeling adventurous, create one code with one app and the next code with the other app; just make sure you enter the code that was generated first, first.)
Afterwards all virtual devices will work and are completely interchangeable.
You could even 'archive' the QR code image in a safe place and add other virtual devices later (the QR code contains just the secret required to initialize the TOTP application). It does not expire.
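To make the "same secret, same codes" point concrete, here is a minimal C# sketch of the TOTP algorithm (RFC 6238: 30-second time step, 6-digit output). The hard-coded secret is just a placeholder for whatever the QR code delivers; this is an illustration of the mechanism, not AWS's code.

using System;
using System.Security.Cryptography;

// Two "devices" sharing the same secret always produce the same 6-digit code
// for the same 30-second time window.
class TotpDemo
{
    static string Totp(byte[] secret, DateTimeOffset time)
    {
        // Number of 30-second steps since the Unix epoch, as a big-endian 8-byte counter
        long counter = time.ToUnixTimeSeconds() / 30;
        byte[] counterBytes = BitConverter.GetBytes(counter);
        if (BitConverter.IsLittleEndian) Array.Reverse(counterBytes);

        using var hmac = new HMACSHA1(secret);
        byte[] hash = hmac.ComputeHash(counterBytes);

        // Dynamic truncation (RFC 4226) -> 6 decimal digits
        int offset = hash[hash.Length - 1] & 0x0F;
        int binary = ((hash[offset] & 0x7F) << 24)
                   | (hash[offset + 1] << 16)
                   | (hash[offset + 2] << 8)
                   | hash[offset + 3];
        return (binary % 1_000_000).ToString("D6");
    }

    static void Main()
    {
        // Placeholder secret -- in reality this is what the QR code carries (base32-encoded)
        byte[] secret = System.Text.Encoding.ASCII.GetBytes("12345678901234567890");
        var now = DateTimeOffset.UtcNow;

        Console.WriteLine(Totp(secret, now)); // "phone"
        Console.WriteLine(Totp(secret, now)); // "tablet" -- identical output
    }
}

Any two apps (or devices) that start from the same secret and agree on the clock will print the same code, which is exactly why an archived QR code keeps working.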
From AWS Organizations documentation:
If you choose to use a virtual MFA application, then unlike our recommendation for the management account root user, for member accounts you can re-use a single MFA device for multiple member accounts. You can address geographic limitations by printing and securely storing the QR code used to configure the account in the virtual MFA application. Document the QR code's purpose, and seal and store it in accessible safes across the time zones you operate in, according to your information security policy. Then, when access is needed in a different geographic location, the local copy of the QR code can be retrieved and used to configure a virtual MFA app in the new location.
I actually tried using the same secret configuration key from AWS on an iPhone, an iPad, and an Android phone using Google Authenticator, and they all worked fine. This is the same as what @Jaap did.
In addition to the solutions above:
1) You cannot make a QR code reappear after attaching an MFA device to an AWS account. So if you need to add another virtual MFA device, delete the existing device, reattach it, take a screenshot of the QR code (or save the secret code), and then scan this QR code with the other device.
2) The QR code does not expire. I could still use mine weeks after initialization.
You can export your accounts from Google Authenticator to another device without losing access to them from your current device.
I discovered this when I was upgrading my mobile device and found that my new device would show the exact same MFA codes as my current device at the same time.
On your current MFA device, open Google Authenticator and tap "..." in the upper right corner
In the menu, select "Export accounts", then tap "Continue"
You will see a list of accounts, so select the ones you want to enable on the new device and then tap "Export"
You will be shown a QR code, which you then scan from the new device
I am new to Google Cloud and this is my first experience with this platform. (Before, I was using Azure.)
So I am working on a C# project, and the project has a requirement to save images online; for that, I set up Cloud Storage.
Now, to use the service, I found out that I have to download a service account credential file and set the path to that file in an environment variable.
Which is good and works fine:
RxStorageClient = StorageClient.Create();
But the problem is that my whole project is a collection of 27 different projects, all in the same solution, and there are multiple Cloud Storage accounts involved; I also want to use them with Docker.
So I was wondering: is there any alternative to this service account system, like an API key or a connection string such as Azure provides?
Because I saw that this initialization function has some other options to authenticate, but I didn't see any example:
RxStorageClient = StorageClient.Create();
Can anyone please provide a proper example of connecting to the Cloud Storage service without this service account file system?
Instead of relying on the environment variable, you can download a credential file for each project you need to access.
So for example, if you have three projects that you want to access storage on, then you'd need code paths that initialize the StorageClient with the appropriate service account key from each of those projects.
StorageClient.Create() can take an optional GoogleCredential object to authorize it (if you don't specify one, it grabs the Application Default Credentials, which can be set, among other ways, via the GOOGLE_APPLICATION_CREDENTIALS environment variable).
So on GoogleCredential, check out the FromFile(String) static call, where the String is the path to the service account JSON file.
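Putting those two pieces together, here is a minimal sketch (assuming the Google.Cloud.Storage.V1 and Google.Apis.Auth NuGet packages; the key-file paths are hypothetical placeholders):

using Google.Apis.Auth.OAuth2;
using Google.Cloud.Storage.V1;

class StorageClients
{
    static void Main()
    {
        // One client per project, each built from that project's service account key,
        // instead of relying on the GOOGLE_APPLICATION_CREDENTIALS environment variable.
        GoogleCredential credA = GoogleCredential.FromFile("/secrets/project-a-key.json");
        GoogleCredential credB = GoogleCredential.FromFile("/secrets/project-b-key.json");

        StorageClient clientA = StorageClient.Create(credA);
        StorageClient clientB = StorageClient.Create(credB);

        // Each client now authenticates as its own service account, so the different
        // projects in the solution can each use their own storage account; the paths
        // could just as well come from Docker secrets or configuration.
    }
}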
There are no examples. Service accounts are absolutely required, even if hidden from view, to deal with Google Cloud products. They're part of the IAM system for authenticating and authorizing various pieces of software for use with various products. I strongly suggest that you become familiar with the mechanisms of providing a service account to a given program. For code running outside of Google Cloud compute and serverless products, the current preferred solution involves using environment variables to point to files that contain credentials. For code running on Google Cloud (like Cloud Run, Compute Engine, Cloud Functions), it's possible to provide service accounts by configuration, so that the code doesn't need to do anything special.
My development team has been sparingly trying out Google Cloud Platform for about 10 months. Every member was using the same account to access GCP, say team@example.com. We created three projects under this account.
Starting in about July, we have not been able to see these projects in the GCP console anymore. Instead, there is one project named My First Project, which we never created.
However, our original GCP projects still seem to exist, as we can still access for example some of the Google Cloud Functions via HTTP.
Therefore, I have the impression that either the connection between our account and the projects has been lost, or a second account with the same name has been accidentally created.
Additional curiosities:
Yesterday I tried to create a Google Cloud Identity account, using team@example.com. It did not work; when entering that address, the input field showed an error like "Please use another email address. This is a private Google account." (It was actually in German, so I'm guessing at the translation.)
When I go to accounts.google.com, the account selection screen offers team@example.com twice. No matter which entry I choose, I always end up in the GCP console with My First Project.
How can I recover my team's GCP projects?
Which Google support site may I consult to check on the account(s)?
Usually, there is a 1:1 mapping between a certain email address and a Google Account. However, this can be broken under certain situations - for example when creating / deleting / migrating G Suite or Cloud Identity accounts under the domain the email address uses.
If you hit such an edge case, there's not much you can do yourself. Reach out to GCP Support who should be able to resolve the issue for you.
Keep in mind that orphaned resources have a timer on them before they are deleted - so act quickly and do not rely on apps still responding being a sign that they will continue indefinitely.
I'm using Amazon Directory Services with a Simple AD instance. I can join computers to the domain, but I can't figure out how to add users to the domain (and do not see in the documentation whether this is even possible).
How do I create a user in Amazon Simple AD?
You can manage users (and groups) via a bound instance's Active Directory Users and Computers tool. Details are here.
Note that, due to a bug, this must be done from a Windows Server 2008 R2 instance at the time of writing; Windows Server 2012 is not supported, per this post (registration required).
Q: Are "domain limited" Desire2Learn API keys 100% locked to the D2L domain they were issued for, or can they be used in a pinch for work on a different domain -- say, several weeks of testing an upgrade?
Details specific to our case:
My institution is preparing to upgrade our D2L Learning Environment. We have one Production LE and one Dev LE, and we're expecting to get a 2nd Dev LE specifically for upgrade testing (all 3 instances hosted by D2L, fyi).
We have 2 homegrown Valence client apps to test with the upgraded LE. I know that our Valence API keys were issued specifically for our existing (not upgraded) Dev domain. I also know our client app is hard-coded with that key.
But it's not clear to me whether we have to get a new API key and edit our client app accordingly, or whether we can use the existing key on a "wrong" domain for just a few weeks while we're testing the upgrade.
Could such an arrangement be used temporarily?
There are several possible approaches; the one you choose will depend upon your circumstances.
Use another test application's key already granted for the new domain. If you already have an App ID/Key granted for an application limited to your new DEV2 LE, then you can try using that application's credentials temporarily. This would require rebuilding, or reconfiguring, your client application with the new credentials. We do not recommend this approach because for effective testing you definitely want to have traceability as to which application is making which calls to the LE; however if you already have a set of app credentials for a narrowly deployed test application, for example, you can in a pinch switch to sharing those credentials.
Use the LMSID/Key credentials from DEV1 LE on DEV2 LE. The "domain limitation" applied to app keys corresponds to the LMSID/Key credentials assigned to an LE instance at deployment. If your DEV2 instance is only being floated to test integrations in an upgrade scenario, and these integrations are already (in their test form) all working against your DEV1 instance, then it may be possible to have your DEV2 LE use the same LMSID/Key credentials as your DEV1 LE. This would mean that when the DEV2 LE fetches its known-application credential list from D2L's Key Tool Service, it will get exactly the same list of credentials as is given to the DEV1 LE. This is the most radical suggestion, will require D2L's Support Desk to get involved, and will most definitely require shepherding by your DEV2 LE's Approved Support Contact -- this kind of deployment can make sense for certain very specific kinds of testing LMS instances, but it is a very big hammer to apply, so it may not be the right choice here.
Note that this solution is the only one that will work if you have no access to change the application's code/configuration itself (and the app credentials are baked into the app) -- if the app you want to test must work against an LE that acts as if it were the DEV1 instance, then this may be the only solution possible, and in this case you may have to wait until the upgraded LE gets deployed on DEV1 to test your application. I am not at all confident that a granted set of app credentials can be "repointed" to a new domain limitation.
Apply for a new application ID/Key pair, and work to expedite the request. The chief latency in granting application ID/Keys and deploying them lies in having the partner and/or account managers for the target LMS domain in question approve the request: if you brief your partner and/or account manager on your end about the situation and ask them to shepherd the request, this latency can be lowered. This would be the desired choice, because it uses the "proper channels" of the existing business relationship in the way they were intended to be used.
Getting a new set of application credentials for a test app in your new DEV2 domain should not take very long, especially if you already have an existing relationship that's been exercised to get app creds granted through a partner and/or account manager. This solution still requires you to change/re-configure your app.
If at all possible, you should take this last path.
These classes all reside in the root\RSOP\Computer namespace. The only class from which I got non-empty results is RSOP_RegistryPolicySetting, and that one only gave me settings for the Windows Update and System Restore configuration.
I do know there are password policies in our network (age, length, etc.), but queries on the following classes only gave empty results (a sketch of such a query is shown after the list):
RSOP_ScriptPolicySetting
RSOP_SecuritySettingNumeric
RSOP_SecuritySettingBoolean
RSOP_SecuritySettingString
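For concreteness, the kind of query meant here looks roughly like this in C# with System.Management (a sketch; the namespace and class name are the ones listed above, and the query may need to run elevated):

using System;
using System.Management; // System.Management assembly / NuGet package

class RsopQuery
{
    static void Main()
    {
        // Enumerate one of the RSOP security-setting classes; as described above,
        // this kind of query comes back with no instances.
        var scope = new ManagementScope(@"\\.\root\RSOP\Computer");
        var query = new ObjectQuery("SELECT * FROM RSOP_SecuritySettingNumeric");
        using (var searcher = new ManagementObjectSearcher(scope, query))
        {
            foreach (ManagementObject setting in searcher.Get())
            {
                Console.WriteLine("{0} = {1}", setting["KeyName"], setting["Setting"]);
            }
        }
    }
}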
Does it have to be via WMI?
If you're running a domain, Microsoft's Scripting Guys have an article How Long Until My Password Expires?... but it uses ADSI to read policies from Active Directory, rather than policies on the local machine.
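For the domain-wide policy specifically, the ADSI approach boils down to reading the maxPwdAge attribute from the domain object. Here is a minimal C# sketch of that idea (my own illustration, not the Scripting Guys' script; it assumes a domain-joined Windows machine and uses System.DirectoryServices):

using System;
using System.DirectoryServices;
using System.Reflection;

class DomainPasswordPolicy
{
    static void Main()
    {
        // Find the domain's naming context, e.g. "DC=example,DC=com"
        using (var rootDse = new DirectoryEntry("LDAP://RootDSE"))
        using (var domain = new DirectoryEntry("LDAP://" + rootDse.Properties["defaultNamingContext"].Value))
        {
            // maxPwdAge is stored on the domain object as a (negative) count of 100-ns intervals
            long maxPwdAge = ToInt64(domain.Properties["maxPwdAge"].Value);
            if (maxPwdAge == long.MinValue)
                Console.WriteLine("Passwords never expire");
            else
                Console.WriteLine("Maximum password age: {0} days",
                    TimeSpan.FromTicks(Math.Abs(maxPwdAge)).TotalDays);
        }
    }

    // ADSI hands back large integers as a COM IADsLargeInteger; recombine the two halves.
    static long ToInt64(object largeInteger)
    {
        Type t = largeInteger.GetType();
        int high = (int)t.InvokeMember("HighPart", BindingFlags.GetProperty, null, largeInteger, null);
        int low = (int)t.InvokeMember("LowPart", BindingFlags.GetProperty, null, largeInteger, null);
        return ((long)high << 32) | (uint)low;
    }
}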