I am trying to implement app-to-app account linking for Alexa skills with my app.
I have followed the guide found here https://developer.amazon.com/en-US/docs/alexa/account-linking/app-to-app-account-linking-starting-from-your-app.html and have reached Step 6: Enable the skill and complete account linking. At this point, I am creating the final POST request within an AWS Lambda function using axios. The request has the following form:
const header = {
    "headers": {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + event.amazonAccessToken
    }
};
const body = {
    "stage": event.skillStage,
    "accountLinkRequest": {
        "redirectUri": event.redirectURI,
        "authCode": event.userAuthorizationCode,
        "type": "AUTH_CODE"
    }
};
I am sending the POST request to each of the possible regional endpoints and using the one call that succeeds, as shown in the guide's sample code:
const alexaServicePromises = [];
endpoints.forEach((endpoint) => {
    alexaServicePromises.push(axios.post(endpoint, body, header).catch(function (error) {
        if (error.response) {
            console.log(error.response.data);
            console.log(error.response.status);
            console.log(error.response.headers);
        }
    }));
});
return new Promise((resolve, reject) => {
    var failures = 0;
    alexaServicePromises.forEach((promise) => {
        promise.then((res) => {
            // res is undefined when the axios call failed and was handled by the catch above
            if (res && (res.status == 201 || res.status == 200)) {
                resolve(res.data);
            } else {
                if (++failures == alexaServicePromises.length) {
                    reject(res && res.data);
                }
            }
        }).catch((err) => {
            if (++failures == alexaServicePromises.length) {
                reject(err.data);
            }
        });
    });
});
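For reference, the endpoints array is built roughly like this, following the guide's sample code (the regional hostnames and the Skill Activation API path are taken from the guide as I understand it, so treat this as a sketch rather than the exact code):

// Sketch: build one enablement URL per regional Alexa API host
// (skillId is passed in on the event, like the other values above).
const hosts = [
    'https://api.amazonalexa.com',    // North America
    'https://api.eu.amazonalexa.com', // Europe
    'https://api.fe.amazonalexa.com'  // Far East
];
const endpoints = hosts.map(
    (host) => `${host}/v1/users/~current/skills/${event.skillId}/enablement`
);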
However, the issue is that each of the three calls (one per endpoint) returns error code 400 with the message 'Invalid account linking credentials', and I am unable to solve this. Each of the previous steps runs perfectly: I am sending the Amazon access token from step 5; the skill stage is 'development' (the skill is not published); the redirectUri is the URI used in step 4 when I obtained an Amazon authorization code to redirect the user back into the app; the user authCode I am sending was returned after directing the user to sign in to our authentication service (Cognito); and I am including the skill ID in the URL used in the axios POST request. The account I am testing with is my Amazon developer account, which has access to the skill (I did not create the skill, though), and I am using the Alexa client ID and secret found in the Account Linking and Permissions tab of the skill. Finally, each time I test, it runs through the whole process: getting a new authorization code, exchanging it for a new token, signing in for a new user auth code, and then sending everything needed to this Lambda function.
I have also seen the post here: Alexa Account Linking - "Invalid account linking credentials", and from what I wrote above, I don't think I'm making any of the four mistakes described there.
How can I fix this?
My Express server has a credentials.json file containing credentials for a Google service account. These credentials are used to get a JWT from Google, and that JWT is used by my server to update Google Sheets owned by the service account.
const fs = require('fs');
const { google } = require('googleapis');

var jwt_client = null;

// load credentials from a local file
fs.readFile('./private/credentials.json', (err, content) => {
    if (err) return console.log('Error loading client secret file:', err);
    // Authorize a client with credentials, then call the Google Sheets API.
    authorize(JSON.parse(content));
});

// get JWT (SCOPES is the list of Google API scopes defined elsewhere in the file)
function authorize(credentials) {
    const {client_email, private_key} = credentials;
    jwt_client = new google.auth.JWT(client_email, null, private_key, SCOPES);
}

var sheets = google.sheets({version: 'v4', auth: jwt_client });
// at this point I can call the Google API and make authorized requests
The issue is that I'm trying to move from Node/Express to the Serverless Framework on AWS. I'm using the same code but getting 403 - Forbidden.
errors:
[ { message: 'The request is missing a valid API key.',
domain: 'global',
reason: 'forbidden' } ] }
Research has pointed me to many things, including AWS Cognito, storing credentials in environment variables, and custom authorizers in API Gateway. All of these seem viable to me, but I am new to AWS, so any advice on which direction to take would be greatly appreciated.
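For example, the environment-variable option I'm considering would look something like this (just a sketch; the variable names SERVICE_ACCOUNT_EMAIL and SERVICE_ACCOUNT_KEY are my own, and SCOPES would be the same scopes I already use):

// Sketch: build the JWT client from Lambda environment variables instead of
// reading credentials.json from disk.
const { google } = require('googleapis');

const SCOPES = ['https://www.googleapis.com/auth/spreadsheets']; // example scope

const jwt_client = new google.auth.JWT(
    process.env.SERVICE_ACCOUNT_EMAIL,
    null,
    // env vars usually store the key with literal "\n" sequences, so restore real newlines
    (process.env.SERVICE_ACCOUNT_KEY || '').replace(/\\n/g, '\n'),
    SCOPES
);

const sheets = google.sheets({ version: 'v4', auth: jwt_client });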
It is late, but this may help someone else. Here is my working code.
const {google} = require('googleapis');
const KEY = require('./keys');
const _ = require('lodash');

const sheets = google.sheets('v4');

const jwtClient = new google.auth.JWT(
    KEY.client_email,
    null,
    KEY.private_key,
    [
        'https://www.googleapis.com/auth/drive',
        'https://www.googleapis.com/auth/drive.file',
        'https://www.googleapis.com/auth/spreadsheets'
    ],
    null
);

async function getGoogleSheetData() {
    await jwtClient.authorize();

    const request = {
        // The ID of the spreadsheet to retrieve data from.
        spreadsheetId: 'put your id here',
        // The A1 notation of the values to retrieve.
        range: 'put your range here', // TODO: Update placeholder value.
        auth: jwtClient,
    };

    return await sheets.spreadsheets.values.get(request);
}
And then call it in the Lambda handler. One thing I don't like is storing key.json as a file in the project root; I will try to find a better place to keep it.
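For completeness, calling it from the handler can be as simple as the sketch below (the response shape is just what the Sheets client returns; adjust the error handling to taste):

// Sketch: Lambda handler that calls getGoogleSheetData() from above.
exports.handler = async (event) => {
    try {
        const result = await getGoogleSheetData();
        return {
            statusCode: 200,
            body: JSON.stringify(result.data.values)
        };
    } catch (err) {
        console.error('Sheets request failed:', err);
        return { statusCode: 500, body: 'Error reading sheet' };
    }
};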
I am looking to add Basic User Authentication to a static site I will have up on AWS, so that only those with the proper username and password (which I will supply to those users) have access to see the site. I found s3auth and it seems to be exactly what I am looking for; however, I am wondering if I will need to somehow set the authorization for pages besides index.html. For example, I have 3 pages: index, about, and contact.html. Without authentication set up for about.html, what is stopping an individual from directly accessing the site via www.mywebsite.com/about.html? I am mostly looking for clarification or any resources anyone can provide to explain this!
Thank you for your help!
This is the perfect use for Lambda@Edge.
Because you're hosting your static site on S3, you can easily and very economically (pennies) add some really great features to your site by using CloudFront, AWS's content distribution network, to serve your site to your users. You can learn how to host your site on S3 with CloudFront (including 100% free SSL) here.
While your CloudFront distribution is deploying, you'll have some time to set up the Lambda you'll use for basic user auth. If this is your first time creating a Lambda, or creating a Lambda for use @Edge, the process is going to feel complex, but if you follow my step-by-step instructions below you'll be doing serverless basic auth that is infinitely scalable in less than 10 minutes. I'm going to use us-east-1 for this, and it's important to know that Lambda@Edge functions must be authored in us-east-1; when they're associated with your CloudFront distribution, they'll automagically be replicated globally. Let's begin...
Head over to Lambda in the AWS console, and click on "Create Function"
Create your Lambda from scratch and give it a name
Set your runtime as Node.js 8.10
Give your Lambda some permissions by selecting "Choose or create an execution role"
Give the role a name
From Policy Templates select "Basic Lambda@Edge permissions (for CloudFront trigger)"
Click "Create function"
Once your Lambda is created, take the following code and paste it into the index.js file of the Function Code section. You can update the username and password you want to use by changing the authUser and authPass variables:
'use strict';

exports.handler = (event, context, callback) => {
    // Get request and request headers
    const request = event.Records[0].cf.request;
    const headers = request.headers;

    // Configure authentication
    const authUser = 'user';
    const authPass = 'pass';

    // Construct the Basic Auth string
    const authString = 'Basic ' + Buffer.from(authUser + ':' + authPass).toString('base64');

    // Require Basic authentication
    if (typeof headers.authorization == 'undefined' || headers.authorization[0].value != authString) {
        const body = 'Unauthorized';
        const response = {
            status: '401',
            statusDescription: 'Unauthorized',
            body: body,
            headers: {
                'www-authenticate': [{key: 'WWW-Authenticate', value: 'Basic'}]
            },
        };
        // Return the 401 response and stop here so the original request is not forwarded
        return callback(null, response);
    }

    // Continue request processing if authentication passed
    callback(null, request);
};
Click "Save" in the upper right hand corner.
Now that your Lambda is saved, it's ready to attach to your CloudFront distribution. In the upper menu, select Actions -> Deploy to Lambda@Edge.
In the modal that appears, select the CloudFront distribution you created earlier from the drop-down menu, leave the Cache Behavior as *, change the CloudFront Event to "Viewer Request", and finally select/tick "Include Body". Select/tick "Confirm deploy to Lambda@Edge" and click "Deploy".
And now you wait. It takes a few minutes (15-20) to replicate your Lambda@Edge function across all regions and edge locations. Go to CloudFront to monitor the deployment of your function. When your CloudFront Distribution Status says "Deployed", your Lambda@Edge function is ready to use.
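If you want to sanity-check the handler logic locally before or after deploying, a tiny harness like this works (assuming you saved the code above as index.js; the filename and credentials are just placeholders):

// test-auth.js - local sanity check for the Lambda@Edge handler above.
const { handler } = require('./index'); // hypothetical filename

// Fake CloudFront viewer-request event carrying a Basic Auth header.
const authValue = 'Basic ' + Buffer.from('user:pass').toString('base64');
const event = {
    Records: [{
        cf: {
            request: {
                uri: '/',
                headers: {
                    authorization: [{ key: 'Authorization', value: authValue }]
                }
            }
        }
    }]
};

handler(event, {}, (err, result) => {
    // With matching credentials the original request is passed through;
    // with wrong credentials you get the 401 response object instead.
    console.log(JSON.stringify(result, null, 2));
});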
Deploying Lambda@Edge is quite difficult to replicate via the console, so I have created a CDK stack; just add your own credentials and domain name and deploy.
https://github.com/apoorvmote/cdk-examples/tree/master/password-protect-s3-static-site
I have tested the following function with Node.js 12.x:
exports.handler = async (event, context, callback) => {
    const request = event.Records[0].cf.request
    const headers = request.headers

    const user = 'my-username'
    const password = 'my-password'

    const authString = 'Basic ' + Buffer.from(user + ':' + password).toString('base64')

    if (typeof headers.authorization === 'undefined' || headers.authorization[0].value !== authString) {
        const response = {
            status: '401',
            statusDescription: 'Unauthorized',
            body: 'Unauthorized',
            headers: {
                'www-authenticate': [{key: 'WWW-Authenticate', value: 'Basic'}]
            }
        }
        // Return the 401 response and stop here
        return callback(null, response)
    }

    callback(null, request)
}
By now, this is also possible with CloudFront Functions, which I like more because it reduces the complexity even further (from what is already not too complex with Lambda). Here's my writeup of what I just did...
It's basically 3 things that need to be done:
Create a CloudFront function to add Basic Auth into the request.
Configure the Origin of the CloudFront distribution correctly in a few places.
Activate the CloudFront function.
That's it, no particular bells & whistles otherwise. Here's what I've done:
First, go to CloudFront, then click on Functions on the left, create a new function with a name of your choice (no region etc. necessary) and then add the following as the code of the function:
function handler(event) {
    var user = "myuser";
    var pass = "mypassword";

    // manual base64 encoder for the "user:pass" string
    function encodeToBase64(str) {
        var chars =
            "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/=";
        for (
            // initialize result and counter
            var block, charCode, idx = 0, map = chars, output = "";
            // if the next str index does not exist:
            //   change the mapping table to "="
            //   check if idx has no fractional digits
            str.charAt(idx | 0) || ((map = "="), idx % 1);
            // "8 - idx % 1 * 8" generates the sequence 2, 4, 6, 8
            output += map.charAt(63 & (block >> (8 - (idx % 1) * 8)))
        ) {
            charCode = str.charCodeAt((idx += 3 / 4));
            if (charCode > 0xff) {
                throw new Error(
                    "'btoa' failed: The string to be encoded contains characters outside of the Latin1 range."
                );
            }
            block = (block << 8) | charCode;
        }
        return output;
    }

    var requiredBasicAuth = "Basic " + encodeToBase64(`${user}:${pass}`);

    var match = false;
    if (event.request.headers.authorization) {
        if (event.request.headers.authorization.value === requiredBasicAuth) {
            match = true;
        }
    }

    if (!match) {
        return {
            statusCode: 401,
            statusDescription: "Unauthorized",
            headers: {
                "www-authenticate": { value: "Basic" },
            },
        };
    }

    return event.request;
}
Then you can test directly in the UI and, assuming it works and you have customized the username and password, publish the function.
Please note that I found the individual pieces of the function above on the Internet, so this is not my own code (other than piecing it together). I wish I could still find the sources so I could quote them here, but I can't find them anymore. Credits to the creators though! :-)
Next, open your CloudFront distribution and do the following:
Make sure your S3 bucket in the origin is configured as a REST endpoint and not a website endpoint, i.e. it must end on .s3.amazonaws.com and not have the word website in the hostname.
Also in the Origin settings, under "S3 bucket access", select "Yes use OAI (bucket can restrict access to only CloudFront)". In the setting below click on "Create OAI" to create a new OAI (unless you have an existing one and know what you're doing). And select "Yes, update the bucket policy" to allow AWS to add the necessary permissions to your OAI.
Finally, open your Behavior of the CloudFront distribution and scroll to the bottom. Under "Function associations", for "Viewer request" select "CloudFront Function" and select your newly created CloudFront function. Save your changes.
And that should be it. With a bit of luck it's a matter of a couple of minutes (realistically more, I know), and especially no additional complexity once this is all set up.
Thanks for the useful post. An alternative to listing the plain text username and password in the code, and to having base64 encoding logic, is to pre-generate the base64 encoded string. One such encoder: https://www.debugbear.com/basic-auth-header-generator
From there the script becomes simpler. The following is for 'user' / 'password':
function handler(event) {
    // base64 of "user:password", pre-generated so no encoding logic is needed here
    var base64UserPassword = "dXNlcjpwYXNzd29yZA==";

    if (event.request.headers.authorization &&
        event.request.headers.authorization.value === ("Basic " + base64UserPassword)) {
        return event.request;
    }

    return {
        statusCode: 401,
        statusDescription: "Unauthorized",
        headers: {
            "www-authenticate": { value: "Basic" },
        },
    }
}
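If you prefer not to paste real credentials into a third-party website, the same value can be generated locally with Node (a one-line sketch):

// Generate the base64 "user:password" value locally instead of using an online encoder.
console.log(Buffer.from('user:password').toString('base64')); // dXNlcjpwYXNzd29yZA==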
An answer describing how to use CloudFront Functions already exists here, but I want to add an improved version of the function:
The hardcoded credentials are stored as a SHA-256 hash instead of plain text (or base64, which is effectively the same as plain text), which is more secure.
It is also possible to allow access from whitelisted global IP addresses:
function handler(event) {
    var crypto = require('crypto');
    var headers = event.request.headers;

    var wlist_ips = [
        "1.1.1.1",
        "2.2.2.2"
    ];

    var authString = "9c06d532edf0813659ab41d26ab8ba9ca53b985296ee4584a79f34fe9cd743a4";

    if (
        typeof headers.authorization === "undefined" ||
        crypto.createHash('sha256').update(headers.authorization.value).digest('hex') !== authString
    ) {
        if (!wlist_ips.includes(event.viewer.ip)) {
            return {
                statusCode: 401,
                statusDescription: "Unauthorized",
                headers: {
                    "www-authenticate": { value: "Basic" },
                    "x-source-ip": { value: event.viewer.ip }
                }
            };
        }
    }

    return event.request;
}
The command below can be used to get the correct authString hash value for username user and password password:
printf "Basic $(printf 'user:password' | base64 -w 0)" | sha256sum | awk '{print$1}'
I added website authentication for an S3 bucket using a Lambda function and then connected the Lambda function to CloudFront using the behavior settings in the distribution settings. It worked fine and added authentication (like htaccess authentication on a simple server). Now I want to change the password for my website authentication. For that, I updated the password and published a new version of the Lambda function, and then in the distribution settings I created a new invalidation to clear the cache. But it didn't work, and the website authentication password didn't change. Below is my Lambda function code that adds the authentication.
'use strict';

exports.handler = (event, context, callback) => {
    // Get request and request headers
    const request = event.Records[0].cf.request;
    const headers = request.headers;

    // Configure authentication
    const authUser = 'user';
    const authPass = 'pass';

    // Construct the Basic Auth string
    const authString = 'Basic ' + new Buffer(authUser + ':' + authPass).toString('base64');

    // Require Basic authentication
    if (typeof headers.authorization == 'undefined' || headers.authorization[0].value != authString) {
        const body = 'Unauthorized';
        const response = {
            status: '401',
            statusDescription: 'Unauthorized',
            body: body,
            headers: {
                'www-authenticate': [{key: 'WWW-Authenticate', value: 'Basic'}]
            },
        };
        callback(null, response);
    }

    // Continue request processing if authentication passed
    callback(null, request);
};
Can anyone please help me solve this problem? Thanks in advance.
In the Lambda function view, after you save your changes (using Firefox could be the safer option, see below if you wonder why), you will see a menu item under Configuration -> Designer -> CloudFront.
After you deploy, you can publish your change to the CloudFront distribution. Once you publish it, the CloudFront distribution automatically starts deploying, which you can view in the CloudFront console.
Also, I would prefer using "Viewer Request" as the CloudFront trigger event (not sure which one you are using), as this should avoid CloudFront caching. On top of this, Chrome sometimes fails to save changes on Lambda; there seems to be a bug in the AWS console, so try Firefox just to be safe when you are editing Lambda functions.
I created a bot in AWS Lex and I am trying to integrate it with Slack. I created a Slack app and followed the documentation as mentioned in-
https://docs.aws.amazon.com/lex/latest/dg/slack-bot-association.html
However, while trying to integrate with the Lex Postback URL I get an error saying
Your URL didn't respond with the value of the challenge parameter.
Our Request:
POST
"body": {
"type": "url_verification",
"token": "VbODUleNdk2hieCvDwlScrQF",
"challenge": "HRUXnK6YYLpx5U1s9AiADZgA0BAhWuTzfjAAzLEJIw1zz4GfuMAb"
}
Your Response:
"code": 200
"error": "challenge_failed"
"body": {
}
To my knowledge, Lex should provide this response by default. Am I doing something wrong here? Any leads will help.
Thanks in advance.
I encountered this issue this morning and thought I'd add my own experience. Slack appears to be pushing a 'Verification Token' as a replacement for the 'Signing Key', and claims they're interchangeable but that the token is more secure. I wasn't able to get the challenge response when using the token, but it worked fine when using the key.
I came across the same issue. The POST request that Slack was sending my endpoint was not what my function was designed for. I followed the tutorial at https://api.slack.com/tutorials/events-api-using-aws-lambda and had to add one line:
exports.handler = (data, context, callback) => {
    data = JSON.parse(data.body); // added this line
    switch (data.type) {
        case "url_verification": verify(data, callback); break;
        case "event_callback": process(data.event, callback); break;
        default: callback(null);
    }
};
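For context, the verify helper in that tutorial just echoes the challenge value back to Slack; a minimal sketch (assuming an API Gateway proxy integration and skipping the token check) looks like this:

// Minimal sketch of url_verification: Slack only needs the challenge echoed back with a 200.
function verify(data, callback) {
    callback(null, {
        statusCode: 200,
        headers: { 'Content-Type': 'text/plain' },
        body: data.challenge
    });
}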
Identity server is implemented and working well. Google login is working and is returning several claims including email.
Facebook login is working, and my app is live and requests email permissions when a new user logs in.
The problem is that I can't get the email back from the OAuth endpoint, and I can't seem to find the access_token to manually request user information. All I have is a "code" returned from the Facebook login endpoint.
Here's the IdentityServer setup.
var fb = new FacebookAuthenticationOptions
{
    AuthenticationType = "Facebook",
    SignInAsAuthenticationType = signInAsType,
    AppId = ConfigurationManager.AppSettings["Facebook:AppId"],
    AppSecret = ConfigurationManager.AppSettings["Facebook:AppSecret"]
};
fb.Scope.Add("email");
app.UseFacebookAuthentication(fb);
Then of course I've customized the AuthenticateLocalAsync method, but the claims I'm receiving only include name. No email claim.
Digging through the source code for identity server, I realized that there are some claims things happening to transform facebook claims, so I extended that class to debug into it and see if it was stripping out any claims, which it's not.
I also watched the HTTP calls with Fiddler, and I only see the following (apologies, as code formatting doesn't work very well on URLs; I tried to format the querystring params on their own lines but it didn't take):
(facebook.com)
/dialog/oauth
?response_type=code
&client_id=xxx
&redirect_uri=https%3A%2F%2Fidentity.[site].com%2Fid%2Fsignin-facebook
&scope=email
&state=xxx
(facebook.com)
/login.php
?skip_api_login=1
&api_key=xxx
&signed_next=1
&next=https%3A%2F%2Fwww.facebook.com%2Fv2.7%2Fdialog%2Foauth%3Fredirect_uri%3Dhttps%253A%252F%252Fidentity.[site].com%252Fid%252Fsignin-facebook%26state%3Dxxx%26scope%3Demail%26response_type%3Dcode%26client_id%3Dxxx%26ret%3Dlogin%26logger_id%3Dxxx&cancel_url=https%3A%2F%2Fidentity.[site].com%2Fid%2Fsignin-facebook%3Ferror%3Daccess_denied%26error_code%3D200%26error_description%3DPermissions%2Berror%26error_reason%3Duser_denied%26state%3Dxxx%23_%3D_
&display=page
&locale=en_US
&logger_id=xxx
(facebook.com)
POST /cookie/consent/?pv=1&dpr=1 HTTP/1.1
(facebook.com)
/login.php
?login_attempt=1
&next=https%3A%2F%2Fwww.facebook.com%2Fv2.7%2Fdialog%2Foauth%3Fredirect_uri%3Dhttps%253A%252F%252Fidentity.[site].com%252Fid%252Fsignin-facebook%26state%3Dxxx%26scope%3Demail%26response_type%3Dcode%26client_id%3Dxxx%26ret%3Dlogin%26logger_id%3Dxxx
&lwv=100
(facebook.com)
/v2.7/dialog/oauth
?redirect_uri=https%3A%2F%2Fidentity.[site].com%2Fid%2Fsignin-facebook
&state=xxx
&scope=email
&response_type=code
&client_id=xxx
&ret=login
&logger_id=xxx
&hash=xxx
(identity server)
/id/signin-facebook
?code=xxx
&state=xxx
I saw the code parameter on that last call and thought that maybe I could use the code there to get the access_token from the facebook API https://developers.facebook.com/docs/facebook-login/manually-build-a-login-flow
However when I tried that I get a message from the API telling me the code has already been used.
I also tried changing the UserInformationEndpoint on the FacebookAuthenticationOptions to force it to ask for the email by appending ?fields=email to the end of the default endpoint location, but that causes IdentityServer to spit out the error "There was an error logging into the external provider. The error message is: access_denied".
I might be able to fix this all if I can change the middleware to send the request with response_type=id_token but I can't figure out how to do that or how to extract that access token when it gets returned in the first place to be able to use the Facebook C# sdk.
So I guess any help or direction at all would be awesome. I've spent countless hours researching and trying to solve the problem. All I need to do is get the email address of the logged-in user via IdentityServer3. Doesn't sound so hard and yet I'm stuck.
I finally figured this out. The answer has something to do with Mitra's comments, although neither of those answers quite seemed to fit the bill, so I'm putting another one here. First, you need to request the access_token, not the code (authorization code), from Facebook's authentication endpoint. To do that, set it up like this:
var fb = new FacebookAuthenticationOptions
{
    AuthenticationType = "Facebook",
    SignInAsAuthenticationType = signInAsType,
    AppId = ConfigurationManager.AppSettings["Facebook:AppId"],
    AppSecret = ConfigurationManager.AppSettings["Facebook:AppSecret"],
    Provider = new FacebookAuthenticationProvider()
    {
        OnAuthenticated = (context) =>
        {
            context.Identity.AddClaim(new System.Security.Claims.Claim("urn:facebook:access_token", context.AccessToken, ClaimValueTypes.String, "Facebook"));
            return Task.FromResult(0);
        }
    }
};
fb.Scope.Add("email");
app.UseFacebookAuthentication(fb);
Then, you need to catch the response once the user is logged in. I'm using the following file from the IdentityServer3 Samples repository, which overrides (that is, provides the functionality for) the methods necessary to log a user in from external sites. From this response, I'm using the C# Facebook SDK with the newly returned access_token claim in the ExternalAuthenticationContext to request the fields I need and add them to the list of claims. Then I can use that information to create/log in the user.
public override async Task AuthenticateExternalAsync(ExternalAuthenticationContext ctx)
{
    var externalUser = ctx.ExternalIdentity;

    // Check for null before using externalUser
    if (externalUser == null)
    {
        throw new ArgumentNullException("externalUser");
    }

    var claimsList = ctx.ExternalIdentity.Claims.ToList();

    if (externalUser.Provider == "Facebook")
    {
        var extraClaims = GetAdditionalFacebookClaims(externalUser.Claims.First(claim => claim.Type == "urn:facebook:access_token"));
        claimsList.Add(new Claim("email", extraClaims.First(k => k.Key == "email").Value.ToString()));
        claimsList.Add(new Claim("given_name", extraClaims.First(k => k.Key == "first_name").Value.ToString()));
        claimsList.Add(new Claim("family_name", extraClaims.First(k => k.Key == "last_name").Value.ToString()));
    }

    var user = await userManager.FindAsync(new Microsoft.AspNet.Identity.UserLoginInfo(externalUser.Provider, externalUser.ProviderId));
    if (user == null)
    {
        ctx.AuthenticateResult = await ProcessNewExternalAccountAsync(externalUser.Provider, externalUser.ProviderId, claimsList);
    }
    else
    {
        ctx.AuthenticateResult = await ProcessExistingExternalAccountAsync(user.Id, externalUser.Provider, externalUser.ProviderId, claimsList);
    }
}
And that's it! If you have any suggestions for simplifying this process, please let me know. I was going to modify this code to perform the call to the API from FacebookAuthenticationOptions, but the Events property apparently no longer exists.
Edit: the GetAdditionalFacebookClaims method is simply a method that creates a new FacebookClient given the access token that was pulled out and queries the Facebook API for the other user claims you need. For example, my method looks like this:
protected static JsonObject GetAdditionalFacebookClaims(Claim accessToken)
{
    var fb = new FacebookClient(accessToken.Value);
    return fb.Get("me", new {fields = new[] {"email", "first_name", "last_name"}}) as JsonObject;
}