Separating user pools for JavaScript AWS Amplify SDK / AWS Cognito based authentication in React app

I have a React Native web application whose authentication is based entirely on a user pool managed by AWS Cognito. All the authentication-related screens (login, registration, change password) are handled by the AWS Amplify JavaScript SDK; there is almost no custom client code.
We face the following requirement: we need to separate the user pools between development and production, i.e. two flavors of the same application (same code) need to use different pools managed in AWS Cognito.
What is needed to achieve such a configuration?
Specifically, we have an aws-exports.js file (auto-generated a long time ago) which looks like:
// WARNING: DO NOT EDIT. This file is automatically generated by AWS Amplify. It will be overwritten.
const awsmobile = {
    "aws_project_region": "us-east-1",
    "aws_cognito_identity_pool_id": "us-east-1:[id]",
    "aws_cognito_region": "us-east-1",
    "aws_user_pools_id": "us-east-1_[id]",
    "aws_user_pools_web_client_id": "[id]",
    "oauth": {}
};
export default awsmobile;
The ID set for "aws_user_pools_id" is indeed the ID of the pool we manage in Cognito.
The source code loading this data looks like:
import Amplify, { Auth } from 'aws-amplify';
import awsconfig from '../aws-exports';
Amplify.configure(awsconfig);
Auth.configure(awsconfig);
What needs to be done? It seems like overriding the auto-generated file is not a good idea.
Should we just patch the "aws_user_pools_id" value before the calls to .configure()?
Do we need multiple "aws_cognito_identity_pool_id" and/or "aws_user_pools_web_client_id" values under this scenario, or can we share those values between the configurations?
If there is good documentation on how to do this, I would appreciate a reference.
Thanks in advance.
Max.

Assuming you are using the Amplify CLI, you should use environments as shown here. The aws-exports.js file should not be committed to git; instead, it should be generated and kept up to date using amplify pull.
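If you prefer not to adopt Amplify environments, another option (touching on your question about patching values before .configure()) is to leave aws-exports.js untouched and merge per-environment overrides into the config at startup. A rough sketch, assuming the production IDs are injected through build-time environment variables (the variable names below are placeholders):
import Amplify, { Auth } from 'aws-amplify';
import awsconfig from '../aws-exports';

// Hypothetical build-time switches; adjust to your build tooling.
const prodOverrides = {
  aws_user_pools_id: process.env.REACT_APP_USER_POOL_ID,
  aws_user_pools_web_client_id: process.env.REACT_APP_USER_POOL_CLIENT_ID,
  aws_cognito_identity_pool_id: process.env.REACT_APP_IDENTITY_POOL_ID,
};

const config =
  process.env.REACT_APP_ENV === 'production'
    ? { ...awsconfig, ...prodOverrides }
    : awsconfig;

Amplify.configure(config);
Auth.configure(config);
Note that an app client ID belongs to a specific user pool, so a second pool needs its own aws_user_pools_web_client_id; the identity pool is likewise a separate resource, so each environment normally has its own aws_cognito_identity_pool_id as well.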

Related

AWS Making a User Store based on Userpools

I am creating an Android and iOS app and a lot of network features that I'm implementing utilize AWS Amplify. I have an authentication system that sets up and verifies user info through AWS Cognito and AWS User Pools. However, I'm also trying to store more attributes for each user beyond just usernames, emails, phone numbers and passwords. I also want to store things such as scores, number of days logged in, etc. Some AWS services such as S3 appear to allow me to create a database with these attributes, but it appears that everyone can access them, and I need an extremely secure system where the only people who can access these user attributes are the user themselves and me as the admin.
What is the best way to implement this feature within AWS while integrating it with an Android/iOS app? Thank you
I would recommend adding either a REST API or a GraphQL API to your Amplify backend. This will create a secure API that uses a JWT token generated by Cognito for authentication. Your data will be stored in DynamoDB tables generated via the @model directive in your GraphQL schema.
Create a GraphQL API
Navigate into the root of a JavaScript, iOS, or Android project and run:
amplify add api
Select the following options:
Select GraphQL
When asked if you have a schema, say No
Select one of the default samples; you can change this later
Choose to edit the schema and it will open the new schema.graphql in your editor
A simple model for tracking user scores and days since last log in might look like:
type UserData @model {
  id: ID!
  cognitoUserId: String!
  score: Float!
  lastLoggedInAt: AWSDate!
}
More details on building an Amplify GraphQL API here https://docs.amplify.aws/cli/graphql-transformer/overview/
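Once the API is pushed (amplify push), the app can read and write UserData records through the Amplify API category. A minimal sketch, assuming the CLI generated the usual src/graphql/queries.js and mutations.js files for the schema above (the exact operation names may differ in your project):
import { API, graphqlOperation } from 'aws-amplify';
import { listUserData } from './graphql/queries';      // generated; name may differ
import { createUserData } from './graphql/mutations';  // generated; name may differ

// Create a record for the signed-in user (AWSDate expects YYYY-MM-DD).
async function saveScore(cognitoUserId, score) {
  return API.graphql(graphqlOperation(createUserData, {
    input: { cognitoUserId, score, lastLoggedInAt: new Date().toISOString().slice(0, 10) },
  }));
}

// Read the records back.
async function fetchScores() {
  const result = await API.graphql(graphqlOperation(listUserData));
  return result.data.listUserData.items;
}
With Cognito User Pools as the API's auth mode, you can also add an owner-based @auth rule to the model so each user can only read and write their own records, which addresses the access-control requirement in the question.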
Create a REST API
Follow the wizard to create a new app. After finishing the wizard run:
amplify add api
Select the following options:
Please select from one of the below-mentioned services: REST
Provide a friendly name for your resource to be used as a label for this category in the project: itemsApi
Provide a path (e.g., /book/{isbn}): /items
This will be the configuration for /items path in API Gateway:
/
|_ /items          Main resource. Eg: /items
     ANY             Includes methods: DELETE, GET, HEAD, OPTIONS, PATCH, POST, PUT
     OPTIONS         Allow pre-flight requests in CORS by browser
|_ /{proxy+}       Proxy resource. Eg: /items/, /items/id, /items/object/{id}
     ANY             Includes methods: DELETE, GET, HEAD, OPTIONS, PATCH, POST, PUT
     OPTIONS         Allow pre-flight requests in CORS by browser
More on creating an Amplify REST API here
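After amplify push, the new endpoint can be called from the app through the same API category. A small sketch against the itemsApi resource created above (the request body fields are placeholders):
import { API } from 'aws-amplify';

// POST a new item to /items.
async function saveItem(score) {
  return API.post('itemsApi', '/items', { body: { score } });
}

// GET the items back from /items.
async function listItems() {
  return API.get('itemsApi', '/items', {});
}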

Two GOOGLE_APPLICATION_CREDENTIALS files

I've had to download a key for Google's Firebase service and yet another key for Pub/Sub. How am I supposed to reference both keys with the GOOGLE_APPLICATION_CREDENTIALS environment variable?
Normally you only use one service account that has the required permissions.
Application Default Credentials (ADC) support one and only one service account JSON key file specified by the environment variable GOOGLE_APPLICATION_CREDENTIALS.
When writing code for Google Cloud, the SDK clients support specifying a service account as a parameter. In your example, you will need to create SDK clients using the appropriate credentials (service account JSON key file). The Firebase admin client can use one credential and the Pub/Sub client can use the other credential.
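For example, in Node.js each client can be handed its own key file instead of relying on GOOGLE_APPLICATION_CREDENTIALS (a sketch; the key file names are placeholders):
const admin = require('firebase-admin');
const { PubSub } = require('@google-cloud/pubsub');

// Firebase Admin SDK with its own service account key.
admin.initializeApp({
  credential: admin.credential.cert(require('./firebase-key.json')),
});

// Pub/Sub client with a different service account key.
const pubsub = new PubSub({ keyFilename: './pubsub-key.json' });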
I'm having exactly the same issue. I'm trying to run two different Firestore services on one machine. Each service uses a different Firestore project. As far as I can see, to explicitly authenticate by directly accessing my JSON key file, I need to do something like this:
FirestoreClient client = Use ClientBuilderBase and its property: CredentialsPath
FirestoreDb = FirestoreDb.Create(firestoreProjectId, client);
But as ClientBuilderBase is an abstract class, I'm stumped. Anyone who's got some sample code that does this for real would be a real help.
Cheers
Keith

How do you connect Cognito to Pinpoint for analytics?

I'm trying to get some basic analytics for a Cognito user pool. It seemed simple enough to do: I created a project in Pinpoint, then went to my user pool, went to General settings > Analytics, clicked the Add Analytics button, chose the Amazon Cognito app client that my app uses, pointed it to the Pinpoint project I just created, checked the "Share user profile data" checkbox (though I assume that's not absolutely necessary), and it showed the message telling me it would use the IAM role and so on. I clicked Save, got no error, and assumed at this point I would start seeing analytics in Pinpoint, but there's absolutely nothing showing up.
I do have a message saying I haven't enabled any features yet, but I don't see any features I'd need to enable. I don't care about the campaigns as of now, and under Application analytics it seems geared toward manually updating your mobile or web app to send something, but I thought that was only if you need to customize something. Am I mistaken? Will this integration only work if you change your web app to explicitly send things to Pinpoint? I just assumed that if I connected Cognito and Pinpoint the analytics would show up, since Cognito obviously knows people are logging in without you needing to manually make some extra request.
From my research, I found out that since you are using a web application without the AWS Amplify framework, you need to add additional lines of code to your application in order to send Cognito authentication analytics data to your Amazon Pinpoint project.
If you are using the JavaScript SDK, you may add the InitiateAuth code snippet below to your front-end application:
var cognitoidentityserviceprovider = new AWS.CognitoIdentityServiceProvider();
var params = {
  AuthFlow: "USER_PASSWORD_AUTH",
  ClientId: 'STRING_VALUE', /* the client ID attached to the Pinpoint project */
  AuthParameters: {
    'USERNAME': 'STRING_VALUE',
    'PASSWORD': 'STRING_VALUE'
  },
  AnalyticsMetadata: {
    AnalyticsEndpointId: 'STRING_VALUE' /* the Pinpoint project ID */
  },
};
cognitoidentityserviceprovider.initiateAuth(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(data);               // successful response
});
In the above code snippet, the Pinpoint project/application ID is added as part of the "AnalyticsMetadata" parameter when the client makes the API calls (e.g. sign-in, sign-up, etc.) to the Cognito user pool. Therefore, the API calls will have the pinpoint project ID attached to them and Cognito can use that information to send the data to your Pinpoint project. Without this crucial step, analytics data will not be sent to Pinpoint and will result in the behavior you have described.
If using the CLI (for verification/testing purposes), you may execute the following AWS CLI initiate-auth command:
$ aws cognito-idp initiate-auth --auth-flow USER_PASSWORD_AUTH --auth-parameters USERNAME=STRING_VALUE,PASSWORD=STRING_VALUE --client-id STRING_VALUE --analytics-metadata AnalyticsEndpointId=STRING_VALUE
The takeaway point is:
Amazon Cognito integration with Amazon Pinpoint only works for apps (clients) that are using the Cognito SDK for Mobile Apps (AWS Mobile SDKs or the AWS Amplify JavaScript library).
Just a note to syumaK's response: yes, you need to update your code's initiateAuth call to include the AnalyticsMetadata property. However, I found out that the AnalyticsEndpointId should NOT be the Pinpoint project ID.
I believe that since you've already configured Cognito to integrate with the Pinpoint project, it knows which Pinpoint project you are sending metrics to. The AnalyticsEndpointId should be a unique identifier for the login.
The documentation is definitely flaky here; the CLI example in syumaK's answer seems to describe it differently than the Cognito API Reference.
What happened to me was that I used the pinpoint project id, and when I log in with multiple users, it thinks it's the same one. The daily active users remains at one. AWS Support clarified that it should be a unique identifier for each user. So after changing the id to be the user's email, I am starting to get accurate numbers for the daily active users.
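In other words, only the AnalyticsMetadata block of the earlier snippet needs to change; a sketch using the user's email as the per-user identifier:
var userEmail = 'user@example.com'; /* the signed-in user's email, or any value unique per user */
var params = {
  AuthFlow: "USER_PASSWORD_AUTH",
  ClientId: 'STRING_VALUE',
  AuthParameters: { 'USERNAME': userEmail, 'PASSWORD': 'STRING_VALUE' },
  AnalyticsMetadata: {
    AnalyticsEndpointId: userEmail /* unique per user, NOT the Pinpoint project ID */
  }
};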
It also seems like this is about as useful as it gets without using the Cognito SDK for Mobile Apps or Amplify: you can only get information like daily/monthly active users and authentication metrics, with no ability to break things down further by segments, etc.

PermissionDenied: 403 IAM permission 'dialogflow.intents.list'

I'm trying to get the list of the intents in my Dialogflow agent using Dialogflow's V2 APIs but have been getting the following error:
PermissionDenied: 403 IAM permission 'dialogflow.intents.list' on 'projects/xxxx/agent' denied.
I adopted the following steps:
I created a new agent (with V2 APIs enabled) and a new service account for it.
I downloaded the JSON key and set my GOOGLE_APPLICATION_CREDENTIALS variable to its path.
Following is my code:
import os
import dialogflow_v2 as dialogflow

os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/home/user/folder/service-account-key.json"

client = dialogflow.IntentsClient()
parent = client.project_agent_path('[PROJECT_ID]')
for element in client.list_intents(parent):
    pass
I have made various agents and service accounts and even changed the role from Admin to Client, but can't figure out any solution. I tried the following solution but it didn't work:
Tried Solution: DialogFlow PermissionDenied: 403 IAM permission 'dialogflow.sessions.detectIntent'
There is no need for creating a new Agent. You can edit the existing agents IAM.
In Dialogflow's console, go to settings ⚙ > under the general tab, you'll see the project ID section with a Google Cloud link to open the Google Cloud console > Open Google Cloud.
In Google Cloud, go to IAM & Admin > IAM, under the Members tab. Find the service account used by your agent and click edit.
Give that service account admin permissions so it is allowed to list intents.
The problem lies in the IAM section of GCP. Probably you are making the request with a role that does not have the necessary authorizations.
Look in your key.json file for the "client_email" field, then go to the IAM page and give the member with that email a role that has the required capabilities (e.g. Admin).
This solved my problem.
In Dialogflow's console, go to settings ⚙ > under the general tab, you'll see the project ID section with a Google Cloud link to open the Google Cloud console > Open Google Cloud.
(Optional) In the Cloud console, go to the menu icon > APIs & Services > Library. Select any APIs (if any) > Enable.
In Cloud Console > under the menu icon ☰ > APIs & Services > Credentials > Create Credentials > Service Account Key.
Under Create service account key, select New Service Account from the dropdown and enter a project name and for role choose Owner > Create.
A JSON private key file will be downloaded to your local machine; you will need it.
For Javascript:
In the index.js file you can do service account auth with JWT:
const { google } = require('googleapis'); // the JWT helper comes from the googleapis package
const serviceAccount = {}; // Starts with {"type": "service_account",...

// Set up Google Calendar service account credentials
const serviceAccountAuth = new google.auth.JWT({
  email: serviceAccount.client_email,
  key: serviceAccount.private_key,
  scopes: 'https://www.googleapis.com/auth/xxxxxxx'
});
For Python:
There's a Google Auth Python Library available via pip install google-auth and you can check out more here.
When you create the IntentsClient, you can point it at the key file directly (e.g. via the client's from_service_account_file helper):
key_file_path = "/home/user/folder/service-account-key.json"
client = dialogflow.IntentsClient.from_service_account_file(key_file_path)
Intents list
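For completeness, a Node.js sketch of listing the intents with an explicit key file (assuming the @google-cloud/dialogflow package; the project ID and key path are placeholders):
const dialogflow = require('@google-cloud/dialogflow');

// Client created with an explicit service account key file.
const client = new dialogflow.v2.IntentsClient({
  keyFilename: '/home/user/folder/service-account-key.json',
});

async function listIntents(projectId) {
  const parent = client.projectAgentPath(projectId);
  const [intents] = await client.listIntents({ parent });
  intents.forEach((intent) => console.log(intent.displayName));
}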
This error message is usually thrown when the application is not being authenticated correctly due to several reasons such as missing files, invalid credential paths, incorrect environment variables assignations, among other causes. Keep in mind that when you set an environment variable value in a session, it is reset every time the session is dropped.
Based on this, I recommend you to validate that the credential file and file path are being correctly assigned, as well as follow the Obtaining and providing service account credentials manually guide, in order to explicitly specify your service account file directly into your code; In this way, you will be able to set it permanently and verify if you are passing the service credentials correctly.
Passing the path to the service account key in code example:
def explicit():
    from google.cloud import storage

    # Explicitly use service account credentials by specifying the private key file.
    storage_client = storage.Client.from_service_account_json('service_account.json')

    # Make an authenticated API request
    buckets = list(storage_client.list_buckets())
    print(buckets)
Also try to create the project in the Dialogflow Console:
https://dialogflow.cloud.google.com/
You need to set the following as environment variables:
googleProjectID: "",
dialogFlowSessionID: "anything",
dialogFlowSessionLanguageCode: "en-US",
googleClientEmail: "",
googlePrivateKey:
I think you might have missed the Enable the API section in the documentation setup.
Here is that link:
https://cloud.google.com/dialogflow/cx/docs/quick/setup#api
After clicking the link, select the chatbot project you created and follow the instructions given there.
The permissions that I have given for that project are Owner and Editor.
After this, try the code in this link:
https://cloud.google.com/dialogflow/es/docs/quick/api#detect_intent
You should get a response from your chatbot
Hope this helps!

Manually set current user credentials in AWS Appsync

I'm using Nativescript with AWS Amplify to login to AWS AppSync.
The session stored by Amplify is cleared after an app refresh: I get no user from Auth.currentSession() after the app restarts.
However, I store the JWT from login and can access endpoints manually by passing it in a header.
I would like to use this to make Amplify believe that a user is logged in, so I can use it in other parts of the application.
Is there a way to manually set credentials to use all the functionality of Amplify?
And how is successful login handled? How can I emulate the behaviour after successful login to set current session credentials manually?
There is no way to do this as of now; however, according to this and this issue on GitHub, they have acknowledged it as a feature request and are probably working on it.
The solution to my problem was to use the nativescript-localstorage module and assign it to global["storage"]:
import * as storage from "nativescript-localstorage"
global["storage"] = storage;