I have a Lex Bot setup and working fine with Android. But when I try to access the same from react native, I get the following error:
NotFoundException: There is no alias named test for the bot named test_bot_name. Choose another alias.
I am using the aws-sdk-react-native package. My call to Lex looks like this:
sendToLex(message) {
  let params = {
    botAlias: 'test',
    botName: 'test_bot_name',
    inputText: message,
    userId: lexUserId,
  }
  lexRunTime.postText(params, (err, data) => {
    if (err) {
      // TODO SHOW ERROR ON MESSAGES
      console.log("[ERROR] Error: " + err);
      console.log("[ERROR] Data: " + data);
    }
    if (data) {
      console.log("Data " + data);
      this.showResponse(data)
    }
  })
}
The same botAlias and botName parameters work fine on Android but do not work in React Native. The AWS config credentials look fine, as I am not getting any errors related to the identity pool or region.
If it does indeed work on Android as you say, then I would suggest the following:
Ensure that the credentials are for the correct account.
Ensure that you're pointing to the correct region.
Try creating another alias and testing that, to discount any funnies in the system.
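If you want to rule the region and credentials in or out in code rather than in the console, a minimal sketch along these lines (using the plain aws-sdk for JavaScript, which also works under React Native; the identity pool ID and region below are placeholders) pins both explicitly before calling postText:
// Sketch only: explicit region + Cognito credentials for the Lex runtime client.
// The identity pool ID and region are placeholders -- use the ones your bot lives in.
import AWS from 'aws-sdk';

AWS.config.update({
  region: 'us-east-1', // must be the region where the bot and alias were built
  credentials: new AWS.CognitoIdentityCredentials({
    IdentityPoolId: 'us-east-1:00000000-0000-0000-0000-000000000000', // placeholder
  }),
});

const lexRunTime = new AWS.LexRuntime({ region: 'us-east-1' });
// then lexRunTime.postText(params, callback) exactly as in the question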
Problem
I am trying to access a Google Cloud service (Cloud Translate API) from my AWS Lambda function using Node.js and the Serverless Framework. The system already works perfectly when I use a Google Service Account Key, so that validates that the two cloud services are operational and functional.
However, I'm trying to follow best practice and use Google's Workload Identity Federation instead of a Service Account Key (docs).
However, I'm getting an error:
FetchError: request to http://169.254.169.254/latest/meta-data/iam/security-credentials failed, reason: connect ETIMEDOUT 169.254.169.254:80
I've followed the directions in the docs several times, including creating the workload identity pool and downloading the client config file, and I have the environment variable set to point to the config file:
GOOGLE_APPLICATION_CREDENTIALS: ./clientLibraryConfig-fq-aws-apis.json
The Google auth library picks up the credentials file (I can see this by logging the client constant), and auth.getProjectId() returns my project ID correctly.
But when it comes to instantiating the TranslationServiceClient, I get this:
Error
"errorMessage": "request to http://169.254.169.254/latest/meta-data/iam/security-credentials failed, reason: connect ETIMEDOUT 169.254.169.254:80",
Code
"use strict";
const { GoogleAuth } = require("google-auth-library");
const { TranslationServiceClient } = require("#google-cloud/translate");
//////////
// This function gets translation from Google
//////////
const getTranslations = async (originalClipArray, translateTo) => {
// G Translate params
// const projectId = "rw-frequency";
const location = "global";
const auth = new GoogleAuth({
scopes: 'https://www.googleapis.com/auth/cloud-platform'
});
const client = await auth.getClient();
const projectId = await auth.getProjectId();
const translationClient = new TranslationServiceClient()
console.log("past translationserviceclient constructor");
// Build the params for the translate request
const request = {
parent: `projects/${projectId}/locations/${location}`,
contents: originalClipArray,
mimeType: "text/plain", // mime types: text/plain, text/html
targetLanguageCode: translateTo,
};
// Call Google client
// try {
const response = await translationClient.translateText(request);
console.log(`response`);
console.dir(response);
return response;
// } catch (error) {
// console.log(`Google translate error raised:`);
// console.log(error);
// }
};
module.exports.getTranslations = getTranslations;
The request that times out is a call to the EC2 instance metadata service (169.254.169.254), which is how workload identity federation retrieves temporary AWS credentials when it runs on EC2. That endpoint is not reachable from inside Lambda, so the client library configuration, which assumes an EC2 environment, cannot obtain AWS credentials. Hope this helps!
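If it helps to confirm this, here is a small check (sketch only) you can drop into the Lambda: it shows that Lambda hands you AWS credentials through environment variables rather than through the EC2 metadata endpoint the library is trying to reach:
// Sketch: log which AWS credential sources exist inside the Lambda sandbox.
exports.credCheck = async () => {
  console.log("Has AWS_ACCESS_KEY_ID:     ", Boolean(process.env.AWS_ACCESS_KEY_ID));
  console.log("Has AWS_SECRET_ACCESS_KEY: ", Boolean(process.env.AWS_SECRET_ACCESS_KEY));
  console.log("Has AWS_SESSION_TOKEN:     ", Boolean(process.env.AWS_SESSION_TOKEN));
  console.log("AWS_REGION:                ", process.env.AWS_REGION);
  // The EC2 instance metadata service (169.254.169.254) is not reachable here,
  // which is why a federation config that assumes EC2 times out.
};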
I am trying to implement app-to-app account linking for alexa skills with my app.
I have followed the guide found here https://developer.amazon.com/en-US/docs/alexa/account-linking/app-to-app-account-linking-starting-from-your-app.html and have reached Step 6: Enable the skill and complete account linking. At this point, I am creating the final POST request within an AWS Lambda function using axios. The request has the following form:
const header = {
  "headers": {
    "Content-Type": "application/json",
    "Authorization": "Bearer " + event.amazonAccessToken
  }
};

const body = {
  "stage": event.skillStage,
  "accountLinkRequest": {
    "redirectUri": event.redirectURI,
    "authCode": event.userAuthorizationCode,
    "type": "AUTH_CODE"
  }
};
and I am sending the post request to each of the possible regional endpoints and using the one call that succeeds, as shown in the guide's sample code.
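For context, the endpoints array used below is built the way I remember the guide's sample building it: one URL per Alexa API region with the skill ID in the path (the hosts and path here are paraphrased from memory, and event.skillId is assumed, so double-check both against the current guide):
// Regional Alexa API hosts plus the skill enablement path, as I recall them
// from the guide -- verify against the docs. event.skillId is an assumption.
const endpoints = [
  "https://api.amazonalexa.com",
  "https://api.eu.amazonalexa.com",
  "https://api.fe.amazonalexa.com"
].map((host) => host + "/v1/users/~current/skills/" + event.skillId + "/enablement");
const alexaServicePromises = [];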
endpoints.forEach((endpoint) => {
  alexaServicePromises.push(axios.post(endpoint, body, header).catch(function (error) {
    if (error.response) {
      console.log(error.response.data);
      console.log(error.response.status);
      console.log(error.response.headers);
    }
  }));
});

return new Promise((resolve, reject) => {
  var failures = 0;
  alexaServicePromises.forEach((promise) => {
    promise.then((res) => {
      if (res.status == 201 || res.status == 200) {
        resolve(res.data);
      } else {
        if (++failures == alexaServicePromises.length) {
          reject(res.data);
        }
      }
    }).catch((err) => {
      if (++failures == alexaServicePromises.length) {
        reject(err.data);
      }
    })
  })
});
However, the issue is that each of the three calls (one per endpoint) returns error code 400 with the message 'Invalid account linking credentials'. I am completely unable to solve this problem. Each of the previous steps runs perfectly: I am sending the Amazon access token from step 5; the skill stage is 'development' (the skill is not published); the redirectUri is the URI used in step 4 when I obtained an Amazon authorization code to redirect the user back into the app; the user authCode I am sending was returned after directing the user to sign in to our authentication service (Cognito); and I am sending the skill ID in the URL used in the axios POST request. The account I am testing with is my Amazon developer account with access to the skill (I did not create the skill, though), and I am using the Alexa client ID and secret found in the Account Linking and Permissions tab of the skill. Finally, each time I test, it runs through the whole process: getting a new authorization code, exchanging it for a new token, signing in for a new user auth code, and then sending everything needed to this Lambda function.
I have also seen the post here: Alexa Account Linking - "Invalid account linking credentials", and from what I wrote above, I don't think I'm making any of the four mistakes it describes.
How can I fix this?
Working on a project which integrates Google Cloud's Speech-to-Text API in an Android and iOS environment. I ran through the example code provided (https://cloud.google.com/speech-to-text/docs/samples) and was able to get it to run, and used it as a template to add voice to my app. However, there is a serious danger in the samples, specifically in how the AccessToken is generated (Android snippet below):
// ***** WARNING *****
// In this sample, we load the credential from a JSON file stored in a raw resource
// folder of this client app. You should never do this in your app. Instead, store
// the file in your server and obtain an access token from there.
// *******************
final InputStream stream = getResources().openRawResource(R.raw.credential);
try {
final GoogleCredentials credentials = GoogleCredentials.fromStream(stream)
.createScoped(SCOPE);
final AccessToken token = credentials.refreshAccessToken();
This was fine for developing and testing locally, but as the comment indicates, it isn't safe to ship the credential file in a production app build. So what I need to do is replace this code with a request to a server endpoint, and additionally I need to write the endpoint that will take the request and pass back a token. Although I found some very interesting tutorials related to the Firebase Admin libraries generating tokens, I couldn't find anything about doing a similar operation for the GCP APIs.
Any suggestions/documentation/examples that could point me in the right direction are appreciated!
Note: The server endpoint will be a Node.js environment.
Sorry for the delay; I was able to get it all working together and am only now circling back to post an extremely simplified how-to. To start, I installed the following library in the server endpoint project: https://www.npmjs.com/package/google-auth-library
The server endpoint in this case lacks any authentication/authorization etc. for simplicity's sake; I'll leave that part up to you. We are also going to pretend this endpoint is reachable at https://www.example.com/token.
The expectation is that calling https://www.example.com/token will result in a response with a string token, a number for expires, and some extra info about how the token was generated, i.e.:
{"token":"sometoken", "expires":1234567, "info": {... additional stuff}}
Also, for this example I used a ServiceAccountKey file, which is stored on the server.
The suggested route is to set up a server environment variable and use https://cloud.google.com/docs/authentication/production#finding_credentials_automatically, but this is for the example's sake and is easy enough for a quick test. These files look something like the following (honor system: don't steal my private key):
ServiceAccountKey.json
{
  "type": "service_account",
  "project_id": "project-id",
  "private_key_id": "378329234klnfgdjknfdgh9fgd98fgduiph",
  "private_key": "-----BEGIN PRIVATE KEY-----\nThisIsTotallyARealPrivateKeyPleaseDontStealIt=\n-----END PRIVATE KEY-----\n",
  "client_email": "project-id@appspot.gserviceaccount.com",
  "client_id": "12345678901234567890",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/project-id%40appspot.gserviceaccount.com"
}
So here it is: a simple endpoint that spits out an AccessToken and a number indicating when the token expires (so you can call for a new one later).
endpoint.js
const express = require("express");
const { GoogleAuth } = require("google-auth-library");
const serviceAccount = require("./ServiceAccountKey.json");

const googleauthoptions = {
  scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  credentials: serviceAccount
};

const app = express();
const port = 3000;

const auth = new GoogleAuth(googleauthoptions);
auth.getClient().then(client => {
  app.get('/token', (req, res) => {
    client
      .getAccessToken()
      .then((clientresponse) => {
        if (clientresponse.token) {
          return clientresponse.token;
        }
        return Promise.reject('unable to generate an access token.');
      })
      .then((token) => {
        return client.getTokenInfo(token).then(info => {
          const expires = info.expiry_date;
          return res.status(200).send({ token, expires, info });
        });
      })
      .catch((reason) => {
        console.log('error: ' + reason);
        res.status(500).send({ error: reason });
      });
  });

  app.listen(port, () => {
    console.log(`Server is listening on https://www.example.com:${port}`);
  });
});
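As a quick sanity check (assuming Node 18+ for the global fetch, and that the server above is running locally on port 3000), something like this should print the same shape as the example response earlier:
// Sketch: call the local endpoint and print the token payload.
fetch("http://localhost:3000/token")
  .then((res) => res.json())
  .then(({ token, expires }) => console.log("token:", token, "expires:", expires))
  .catch((err) => console.error("token request failed:", err));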
Almost done now; I will use Android as the example. This first clip shows how it was originally pulling the credential from a file on the device:
public static final List<String> SCOPE = Collections.singletonList("https://www.googleapis.com/auth/cloud-platform");

final GoogleCredentials credentials = GoogleCredentials.fromStream(
        this.mContext.getResources().openRawResource(R.raw.credential)).createScoped(SCOPE);
final AccessToken accessToken = credentials.refreshAccessToken();
final String value = accessToken.getTokenValue();
final long expires = accessToken.getExpirationTime().getTime();

final SharedPreferences prefs = getSharedPreferences(PREFS, Context.MODE_PRIVATE);
prefs.edit().putString(PREF_ACCESS_TOKEN_VALUE, value).putLong(PREF_ACCESS_TOKEN_EXPIRATION_TIME, expires).apply();
fetchAccessToken();
Now we get our token from the endpoint over the internet instead (request not shown). With the token and expires information in hand, we handle it in the same manner as if it had been generated on the device:
// Let's pretend `endpoint` contains the result of our internet request against www.example.com/token
final String value = endpoint.token;
final long expires = endpoint.expires;

final SharedPreferences prefs = getSharedPreferences(PREFS, Context.MODE_PRIVATE);
prefs.edit().putString(PREF_ACCESS_TOKEN_VALUE, value).putLong(PREF_ACCESS_TOKEN_EXPIRATION_TIME, expires).apply();
fetchAccessToken();
Anyway hopefully that is helpful if anyone has a similar need.
===== re: AlwaysLearning comment section =====
Compared to the original file credential based solution:
https://github.com/GoogleCloudPlatform/android-docs-samples/blob/master/speech/Speech/app/src/main/java/com/google/cloud/android/speech/SpeechService.java
In my specific case, I am interacting with a secured API endpoint (unrelated to Google) via the React Native environment, which sits on top of Android and uses JavaScript.
I already have a mechanism to securely communicate with the API endpoint I created.
So conceptually, in React Native I call
MyApiEndpoint()
which gives me a token and an expires value, i.e.
token = "some token from the api" // token info returned from the api
expires = 3892389329237 // expiration time returned from the api
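For what it's worth, the React Native side of that is just an ordinary fetch plus a call down into the native layer; a minimal sketch (MyApiEndpoint, the SpeechService module name, and the bridge wiring are placeholders for your own setup) looks like:
// React Native side -- names here are placeholders, not part of any SDK.
import { NativeModules } from 'react-native';

async function MyApiEndpoint() {
  // call your own secured endpoint, e.g. https://www.example.com/token
  const res = await fetch('https://www.example.com/token');
  const { token, expires } = await res.json();
  // hand the token down to Android via a hypothetical native module that
  // wraps the setToken(value, expires) method shown below
  NativeModules.SpeechService.setToken(token, expires);
  return { token, expires };
}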
I then pass that information from React Native down to Java, and update the Android preference with the stored information via this function (which I added to the SpeechService.java file):
public void setToken(String value, long expires) {
    final SharedPreferences prefs = getSharedPreferences(PREFS, Context.MODE_PRIVATE);
    prefs.edit().putString(PREF_ACCESS_TOKEN_VALUE, value).putLong(PREF_ACCESS_TOKEN_EXPIRATION_TIME, expires).apply();
    fetchAccessToken();
}
This function adds the token and expires content to the well-known shared preference location and kicks off the AccessTokenTask().
The AccessTokenTask was modified to simply pull from the preferences:
private class AccessTokenTask extends AsyncTask<Void, Void, AccessToken> {
    protected AccessToken doInBackground(Void... voids) {
        final SharedPreferences prefs = getSharedPreferences(PREFS, Context.MODE_PRIVATE);
        String tokenValue = prefs.getString(PREF_ACCESS_TOKEN_VALUE, null);
        long expirationTime = prefs.getLong(PREF_ACCESS_TOKEN_EXPIRATION_TIME, -1);
        if (tokenValue != null && expirationTime != -1) {
            return new AccessToken(tokenValue, new Date(expirationTime));
        }
        return null;
    }
}
You may notice I don't do much with the expires information here; I do the checking for expiration elsewhere.
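Wherever you do that check, it is nothing more than a timestamp comparison before asking the API for a fresh token; a hypothetical helper:
// Hypothetical helper: treat the token as stale a minute before it expires.
function tokenIsStale(expires, skewMs = 60 * 1000) {
  return Date.now() >= expires - skewMs;
}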
Here you have a couple of useful links:
Importing the Google Cloud Storage Client library in Node.js
Cloud Storage authentication
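In case those links move, the gist of the first one is just the standard client-library pattern; a minimal sketch (the key file path and bucket name below are placeholders):
// Minimal sketch of the Cloud Storage client in Node.js; the key file path
// and bucket name are placeholders.
const { Storage } = require('@google-cloud/storage');

const storage = new Storage({ keyFilename: './ServiceAccountKey.json' });

storage.bucket('my-example-bucket').getFiles()
  .then(([files]) => files.forEach((f) => console.log(f.name)))
  .catch((err) => console.error(err));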
I'm using my own Gmail user to read a public calendar. I got the program working locally, and displayed the credentials/token with console.log (values altered to protect my token):
Got Token
OAuth2Client {
transporter: DefaultTransporter {},
_certificateCache: null,
_certificateExpiry: null,
_clientId: 'xxxxxxxxxxxxxx',
_clientSecret: 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx',
_redirectUri: 'urn:ietf:wg:oauth:2.0:oob',
_opts: {},
credentials:
{ access_token: 'xxxxxxx',
refresh_token: 'xxxxxxxxxxxxxxxxxx',
token_type: 'Bearer',
expiry_date: 1512151860704 } }
I also did what this Stack Overflow post suggested: How to oAuth Google API from Lambda AWS? and it gave me the same access_token as displayed above.
So, if I understand, I need to take the access token and put it in my program or a file, and I'm not sure how to do that. My code came from the Google example here: https://developers.google.com/google-apps/calendar/quickstart/nodejs
Do I put this token somewhere in my client_secret.json file or what? I tried just passing it straight to the listEvents method as the value of TOKEN but got "400 Bad Request".
Update 1:
I tried storing the file to disk and then reading it as follows:
exports.getCalendarJSONEventsNew =
  function getCalendarJSONEvents(callback) {
    console.log("started getCalendarJSONEventsNew");
    fs.readFile('OAuth2Client.json', 'utf8',
      function processedFile(err, content) {
        if (err) {
          console.log('Error loading OAuth2Client.json: ' + err);
          return;
        }
        console.log("content=");
        console.log(content);
        var tokenFromFile = JSON.parse(content);
        listEvents(tokenFromFile, function (jsonResult) {
          console.log("Json Callback Events=");
          console.log(jsonResult);
          callback(jsonResult);
        });
      });
  }
Error: it doesn't seem to be valid JSON, so I'm not sure how to deserialize it back into an object:
OAuth2Client {
^
SyntaxError: Unexpected token O in JSON at position 0
Update 2: Then I had another idea. I saved the following
credentials: {
access_token: 'xxxxx',
refresh_token: 'xxxxxx',
token_type: 'Bearer',
expiry_date: 1512151860704
}
as .credentials/calendar-nodejs-quickstart.json.
Then when I ran on the server, I got this response back:
Authorize this app by visiting this url: https://accounts.google.com/o/oauth2/auth?etc...
Here's how I got it to work so far.
1) Created a file called calendar-nodejs-quickstart.json in the root directory.
I kept getting errors when trying to read .credentials/calendar-nodejs-quickstart.json. I tried setting the environment variables, but ended up changing the sample code as follows:
var TOKEN_DIR = '.credentials/';
var TOKEN_PATH = 'calendar-nodejs-quickstart.json';
2) Had to remove "credentials:" from the beginning of the file and add double quotes around the keys (and also change the single quotes to double quotes). This was to get past various JSON parsing errors.
{
"access_token": "xxxxx",
"refresh_token": "xxxx",
"token_type": "Bearer",
"expiry_date": 1512151860704
}
3) I also added the 'utf8' below, and added some debug code to see what was going on:
// Check if we have previously stored a token.
console.log("TOKEN_PATH=" + TOKEN_PATH);
fs.readFile(TOKEN_PATH, 'utf8', function (err, token) {
  if (err) {
    console.log("err=" + err);
    getNewToken(oauth2Client, callback);
  } else {
    console.log("Use stored tokens from " + TOKEN_PATH);
    console.log(token);
    oauth2Client.credentials = JSON.parse(token);
    callback(oauth2Client);
  }
});
It seems critical to me to log the value of the "err" variable.
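To pull the pieces together, here is a condensed sketch of the flow above, assuming the same quickstart-style code (old callback-style googleapis client), an oauth2Client built exactly as in the quickstart's authorize() function, and the cleaned-up token file from step 2:
// Condensed sketch -- assumes oauth2Client was built as in the quickstart's
// authorize() function, and TOKEN_PATH points at the cleaned-up JSON from
// step 2 ({ "access_token": ..., "refresh_token": ..., ... }).
var fs = require('fs');
var google = require('googleapis');

var TOKEN_PATH = 'calendar-nodejs-quickstart.json';

function listEventsWithStoredToken(oauth2Client, callback) {
  fs.readFile(TOKEN_PATH, 'utf8', function (err, token) {
    if (err) return callback(err);
    oauth2Client.credentials = JSON.parse(token); // same assignment as the quickstart
    google.calendar('v3').events.list({
      auth: oauth2Client,
      calendarId: 'primary',
      timeMin: new Date().toISOString(),
      maxResults: 10,
      singleEvents: true,
      orderBy: 'startTime'
    }, function (err, response) {
      callback(err, response && response.items);
    });
  });
}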
Trying to get an AWS Lambda function to run Selenium on .NET Core. Here is the code:
public string FunctionHandler(ILambdaContext context)
{
    context.Logger.LogLine("Entering function");
    try
    {
        var driver = new InternetExplorerDriver();
        context.Logger.LogLine("Navigating to URL");
        driver.Navigate().GoToUrl("http://www.google.com/");
        context.Logger.LogLine("Returning Done");
        return "Done";
    }
    catch (Exception e)
    {
        context.Logger.LogLine("Oops: " + e);
        return "Failed";
    }
}
The error I get in the AWS console is:
OpenQA.Selenium.WebDriverException: Cannot start the driver service on http://localhost:41663/
at OpenQA.Selenium.DriverService.Start()
at OpenQA.Selenium.Remote.DriverServiceCommandExecutor.Execute(Command commandToExecute)
at OpenQA.Selenium.Remote.RemoteWebDriver.Execute(String driverCommandToExecute, Dictionary`2 parameters)
at OpenQA.Selenium.Remote.RemoteWebDriver.StartSession(ICapabilities desiredCapabilities)
at OpenQA.Selenium.Remote.RemoteWebDriver..ctor(ICommandExecutor commandExecutor, ICapabilities desiredCapabilities)
at OpenQA.Selenium.IE.InternetExplorerDriver..ctor(InternetExplorerDriverService service, InternetExplorerOptions options, TimeSpan commandTimeout)
at OpenQA.Selenium.IE.InternetExplorerDriver..ctor(InternetExplorerDriverService service, InternetExplorerOptions options)
at OpenQA.Selenium.IE.InternetExplorerDriver..ctor(InternetExplorerOptions options)
at OpenQA.Selenium.IE.InternetExplorerDriver..ctor()
at InstagramMagic.Function.FunctionHandler(ILambdaContext context)
It is possible, but so far I've only had luck getting it to work with Chrome. AWS Lambda runs a bare-bones version of Amazon Linux. If you want to run something on it beyond the basics, you must package a zip file and deploy it with all of the binaries required. Unfortunately, I doubt IE will run on AWS Lambda. However, there is hope it could run on Azure's equivalent service, which uses what they call a 'Windows Container'.
You have to specify where the Chrome binary is located within the Lambda runtime file system; the package that contains your function is unpacked to /var/task/. This is a Node.js example of what you are attempting to do, but using chromedriver:
'use strict';

exports.handler = (event, context, callback) => {
  var webdriver = require('selenium-webdriver');
  var chrome = require('selenium-webdriver/chrome');

  var builder = new webdriver.Builder().forBrowser('chrome');
  var chromeOptions = new chrome.Options();

  const defaultChromeFlags = [
    '--headless',
    '--disable-gpu',
    '--window-size=1280x1696', // Letter size
    '--no-sandbox',
    '--user-data-dir=/tmp/user-data',
    '--hide-scrollbars',
    '--enable-logging',
    '--log-level=0',
    '--v=99',
    '--single-process',
    '--data-path=/tmp/data-path',
    '--ignore-certificate-errors',
    '--homedir=/tmp',
    '--disk-cache-dir=/tmp/cache-dir'
  ];
  chromeOptions.setChromeBinaryPath("/var/task/lib/chrome");
  chromeOptions.addArguments(defaultChromeFlags);
  builder.setChromeOptions(chromeOptions);

  var driver = builder.build();

  driver.get(event.url);

  driver.getTitle().then(function (title) {
    console.log("Page title for " + event.url + " is " + title)
    callback(null, 'Page title for ' + event.url + ' is ' + title);
  });

  driver.quit();
};
I actually have a runnable packaged zip of this on GitHub, with a video tutorial and a more detailed explanation. Peek inside the zip file to get an idea of how the package should be laid out. https://blackboard.github.io/lambda-selenium/
In addition, I've submitted an issue on your behalf for a runnable .NET Core example:
https://github.com/blackboard/lambda-selenium/issues/22