Is there a way to get a string (data) from a text file stored in S3 in an Alexa localisation.js file? - amazon-web-services

Problem:
I am trying to get the data from a text file stored in S3. I can read it correctly in an intent handler using async/await, but I want to get the string in the localisation file, because I am trying to implement the solution in 2 languages.
I am getting an error saying the skill does not respond correctly.
This is file.js
const AWS = require('aws-sdk');
//========================
// This step is not required if you are running your code inside lambda or in
// the local environment that has AWS set up
//========================
const s3 = new AWS.S3();

async function getS3Object (bucket, objectKey) {
  try {
    const params = {
      Bucket: 'my-bucket',
      Key: 'file.txt',
    };
    const data = await s3.getObject(params).promise();
    let dat = data.Body.toString('utf-8');
    return dat;
  } catch (e) {
    throw new Error(`Could not retrieve file from S3: ${e.message}`);
  }
}

module.exports = getS3Object;
This is the localisation.js file code:
const dataText = require('file.js');
async let textTitle = await dataText().then(); // this does not work

module.exports = {
  en: {
    translation: {
      WELCOME_BACK_MSG: textTitle,
    }
  },
  it: {
    translation: {
      WELCOME_MSG: textTitle,
    }
  }
}

The problem is that in your localisation.js file you are trying to export something that is obtained via an asynchronous function call, but you cannot do that directly: module.exports is assigned and returned synchronously. Please see, for instance, this SO question and answer for more in-depth background.
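For illustration, one common workaround (a minimal sketch, not taken from your project; the helper name loadLocalisation is hypothetical) is to export an async loader instead of the finished object, and await it where the strings are needed, e.g. in a request interceptor:

// localisation.js
const getS3Object = require('./file.js');

let cachedStrings; // fetch from S3 only once per container

async function loadLocalisation() {
  if (!cachedStrings) {
    const textTitle = await getS3Object('my-bucket', 'file.txt');
    cachedStrings = {
      en: { translation: { WELCOME_BACK_MSG: textTitle } },
      it: { translation: { WELCOME_MSG: textTitle } },
    };
  }
  return cachedStrings;
}

module.exports = loadLocalisation; // callers must await loadLocalisation()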
As you are mentioning an Alexa skill, and given the name of the file, localisation.js, I assume you are trying something similar to the solution proposed in this GitHub repository.
Analyzing the content of the index.js file they provide, it seems the library is using i18next for localisation.
The library provides the concept of a backend if you need to load your localisation information from an external resource.
You can implement a custom backend, although the library offers one that could fit your needs, i18next-http-backend.
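As an illustration of the custom-backend route (a minimal sketch, not from the linked repository; it assumes the getS3Object helper above and per-locale objects stored under keys such as locales/en/translation.json), a backend plugin only needs a read method:

// s3-backend.js - hypothetical custom i18next backend
const getS3Object = require('./file.js');

class S3Backend {
  constructor(services, options) {
    this.init(services, options);
  }
  init(services, options = {}) {
    this.options = options; // e.g. { bucket: 'my-bucket' }
  }
  read(language, namespace, callback) {
    // e.g. locales/en/translation.json
    getS3Object(this.options.bucket, `locales/${language}/${namespace}.json`)
      .then((body) => callback(null, JSON.parse(body)))
      .catch((err) => callback(err, false));
  }
}
S3Backend.type = 'backend';

module.exports = S3Backend;

It would then be registered with i18n.use(S3Backend).init({ backend: { bucket: 'my-bucket' } }), the same way the http backend is registered below.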
As indicated in the documentation, you can configure the library to fetch your localization resources with this backend with something like the following:
import i18next from 'i18next';
import Backend from 'i18next-http-backend';

i18next
  .use(Backend)
  .init({
    backend: {
      // for all available options read the backend's repository readme file
      loadPath: '/locales/{{lng}}/{{ns}}.json'
    }
  });
Here in SO you can find a more complete example.
You need to provide a similar configuration to the localisation interceptor provided in the Alexa skill example project, perhaps something like:
import HttpApi from 'i18next-http-backend';

/**
 * This request interceptor will bind a translation function 't' to the handlerInput
 */
const LocalizationInterceptor = {
  process(handlerInput) {
    const localisationClient = i18n
      .use(HttpApi)
      .init({
        lng: Alexa.getLocale(handlerInput.requestEnvelope),
        // resources: languageStrings,
        backend: {
          loadPath: 'https://your-bucket.amazonaws.com/locales/{{lng}}/translations.json',
          crossDomain: true,
        },
        returnObjects: true
      });
    localisationClient.localise = function localise() {
      const args = arguments;
      const value = i18n.t(...args);
      if (Array.isArray(value)) {
        return value[Math.floor(Math.random() * value.length)];
      }
      return value;
    };
    handlerInput.t = function translate(...args) {
      return localisationClient.localise(...args);
    }
  }
};
Please be aware that instead of a text file you need to return a valid JSON file with the appropriate translations:
{
  "WELCOME_MSG": "Welcome!!",
  "WELCOME_BACK_MSG": "Welcome back!!"
}

Related

Cloud Functions / Cloud Tasks UNAUTHENTICATED error

I am trying to get a Cloud Function to create a Cloud Task that will invoke a Cloud Function. Easy.
The flow and use case are very close to the official tutorial here.
I also looked at this article by Doug Stevenson and in particular its security section.
No luck, I am consistently getting a 16 (UNAUTHENTICATED) error in Cloud Task.
If I can trust what I see in the console it seems that Cloud Task is not attaching the OIDC token to the request:
Yet, in my code I do have the oidcToken object:
const { v2beta3, protos } = require("#google-cloud/tasks");
import {
PROJECT_ID,
EMAIL_QUEUE,
LOCATION,
EMAIL_SERVICE_ACCOUNT,
EMAIL_HANDLER,
} from "./../config/cloudFunctions";
export const createHttpTaskWithToken = async function (
payload: {
to_email: string;
templateId: string;
uid: string;
dynamicData?: Record<string, any>;
},
{
project = PROJECT_ID,
queue = EMAIL_QUEUE,
location = LOCATION,
url = EMAIL_HANDLER,
email = EMAIL_SERVICE_ACCOUNT,
} = {}
) {
const client = new v2beta3.CloudTasksClient();
const parent = client.queuePath(project, location, queue);
// Convert message to buffer.
const convertedPayload = JSON.stringify(payload);
const body = Buffer.from(convertedPayload).toString("base64");
const task = {
httpRequest: {
httpMethod: protos.google.cloud.tasks.v2.HttpMethod.POST,
url,
oidcToken: {
serviceAccountEmail: email,
audience: new URL(url).origin,
},
headers: {
"Content-Type": "application/json",
},
body,
},
};
try {
// Send create task request.
const request = { parent: parent, task: task };
const [response] = await client.createTask(request);
console.log(`Created task ${response.name}`);
return response.name;
} catch (error) {
if (error instanceof Error) console.error(Error(error.message));
return;
}
};
When logging the task object from the code above in Cloud Logging I can see that the service account is the one that I created for the purpose of this and that the Cloud Tasks are successfully created.
IAM:
And the function that the Cloud Task needs to invoke:
Everything seems to be there, in theory.
Any advice as to what I would be missing?
Thanks,
Your audience is incorrect. It must end with the function name. Here, you only have the region and the project: https://<region>-<projectID>.cloudfunctions.net/. Use the full Cloud Functions URL.
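Applied to the code above, that means passing the full handler URL as the audience instead of new URL(url).origin, for example (a minimal sketch; the function name is illustrative):

const task = {
  httpRequest: {
    httpMethod: protos.google.cloud.tasks.v2.HttpMethod.POST,
    // e.g. https://<region>-<projectID>.cloudfunctions.net/emailHandler
    url,
    oidcToken: {
      serviceAccountEmail: email,
      audience: url, // full Cloud Functions URL, not just the origin
    },
    headers: { "Content-Type": "application/json" },
    body,
  },
};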

Forward uploaded file from a service to another service

I'm trying to create a REST API with ExpressJS that accepts an image and passes it to another service (with a POST request), which is in charge of performing some operations (resize, etc.) and storing it in AWS S3. I know the same thing can easily be done with a Lambda Function directly, but I have a K8s cluster and I want to make it worth it.
All components are already working, with the exception of the service that forwards the image to the second service.
The idea I found on the internet is to use a stream, but I get the exception Error: Expected a stream at Object.getStream [as default]
How can I solve that? Is this the right practice, or is there a better solution to achieve the same result?
const headers = req.headers;
const files: any = req.files
const filename = files[0].originalname;
const buffer = await getStream(files[0].stream)
const formFile = new FormData();
formFile.append('image', buffer, filename);

headers['Content-Type'] = 'multipart/form-data';

axios.post("http://localhost:1401/content/image/test/upload/", formFile, {
  headers: headers,
})
  .catch((error) => {
    const { status, data } = error.response;
    res.status(status).send(data);
  })
I've found a solution that I post here for those who have the same problem.
Install form-data in Node:
yarn add form-data
Then in your controller:
const headers = req.headers;
const files: any = req.files
const formFile = new FormData();

files.forEach((file: any) => {
  const filename = file.originalname;
  const buffer = file.buffer
  formFile.append('image', buffer, filename);
})

// set the correct header otherwise it won't work
headers["content-type"] = `multipart/form-data; boundary=${formFile.getBoundary()}`

// now you can send the image to the second service
axios.post("http://localhost:1401/content/image/test/upload/", formFile, {
  headers: headers,
})
  .then((r: any) => {
    res.sendStatus(r.status).end()
  })
  .catch((error) => {
    const { status, data } = error.response;
    res.status(status).send(data);
  })

How to set credentials in AWS SDK v3 JavaScript?

I am scouring the documentation, and it only provides pseudo-code of the credentials for v3 (e.g. const client = new S3Client(clientParams)).
How do I initialize an S3Client with the bucket and credentials to perform a getSignedUrl request? Any resources pointing me in the right direction would be most helpful. I've even searched YouTube, SO, etc. and I can't find any specific info on v3. Even the documentation and examples don't provide the actual code to use credentials. Thanks!
As an aside, do I have to include the fake folder structure in the filename, or can I just use the actual filename? For example: bucket/folder1/folder2/uniqueFilename.zip or uniqueFilename.zip
Here's the code I have so far. (Keep in mind I was returning the wasabiObjKey to ensure I was getting the correct file name; I am. It's the client, GetObjectCommand, and getSignedUrl that I'm having issues with.)
exports.getPresignedUrl = functions.https.onCall(async (data, ctx) => {
  const wasabiObjKey = `${data.bucket_prefix ? `${data.bucket_prefix}/` : ''}${data.uid.replace(/-/g, '_').toLowerCase()}${data.variation ? `_${data.variation.replace(/\./g, '').toLowerCase()}` : ''}.zip`

  const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3')

  const s3 = new S3Client({
    bucketEndpoint: functions.config().s3_bucket.name,
    region: functions.config().s3_bucket.region,
    credentials: {
      secretAccessKey: functions.config().s3.secret,
      accessKeyId: functions.config().s3.access_key
    }
  })

  const command = new GetObjectCommand({
    Bucket: functions.config().s3_bucket.name,
    Key: wasabiObjKey,
  })

  const { getSignedUrl } = require("@aws-sdk/s3-request-presigner")

  const url = getSignedUrl(s3, command, { expiresIn: 60 })

  return wasabiObjKey
})
There is a credential chain that provides credentials to your API calls from the SDK; a minimal example relying on this default chain follows the list below.
https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/setting-credentials-node.html
Loaded from AWS Identity and Access Management (IAM) roles for Amazon EC2
Loaded from the shared credentials file (~/.aws/credentials)
Loaded from environment variables
Loaded from a JSON file on disk
Other credential-provider classes provided by the JavaScript SDK
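As an illustration of relying on that chain (a minimal sketch; the region, bucket and key are placeholders), you can construct the client without explicit credentials and let the SDK resolve them from environment variables, the shared credentials file, or an attached IAM role:

const { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");

// No credentials passed here: the SDK walks the default provider chain
// (environment variables, ~/.aws/credentials, IAM role, ...)
const client = new S3Client({ region: "ap-southeast-1" });

(async () => {
  const response = await client.send(
    new GetObjectCommand({ Bucket: "BucketNameHere", Key: "ObjectNameHere" })
  );
  console.log(response.$metadata);
})();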
You can embed the credentials inside your source code, but it's not the preferred way:
new S3Client(configuration: S3ClientConfig): S3Client
where S3ClientConfig contains a credentials property
https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-s3/modules/credentials.html
const { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");

let client = new S3Client({
  region: 'ap-southeast-1',
  credentials: {
    accessKeyId: '',
    secretAccessKey: ''
  }
});

(async () => {
  const response = await client.send(new GetObjectCommand({ Bucket: "BucketNameHere", Key: "ObjectNameHere" }));
  console.log(response);
})();
Sample answer:
'$metadata': {
  httpStatusCode: 200,
  requestId: undefined,
  extendedRequestId: '7kwrFkEp3lEnLU+OtxjrgdmS6gQmvPdbnqqR7I8P/rdFrUPBkdKYPYykWivuHPXCF1IHgjCIbe8=',
  cfId: undefined,
  attempts: 1,
  totalRetryDelay: 0
},
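As a side note on the getSignedUrl call in the question: in v3 it returns a promise, so it has to be awaited. A minimal sketch reusing the client and GetObjectCommand from the snippet above (bucket, key and expiry are placeholders):

const { getSignedUrl } = require("@aws-sdk/s3-request-presigner");

(async () => {
  const command = new GetObjectCommand({ Bucket: "BucketNameHere", Key: "ObjectNameHere" });
  // getSignedUrl resolves to a presigned URL string
  const url = await getSignedUrl(client, command, { expiresIn: 60 });
  console.log(url);
})();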
Here's a simple approach I use (in Deno) for testing (in case you don't want to go the signedUrl approach and just let the SDK do the heavy lifting for you):
import { config as env } from 'https://deno.land/x/dotenv/mod.ts' // https://github.com/pietvanzoen/deno-dotenv
import { S3Client, ListObjectsV2Command } from 'https://cdn.skypack.dev/@aws-sdk/client-s3' // https://github.com/aws/aws-sdk-js-v3

const { AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY } = env()

// https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-s3/modules/credentials.html
const credentials = {
  accessKeyId: AWS_ACCESS_KEY_ID,
  secretAccessKey: AWS_SECRET_ACCESS_KEY,
}

// https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-s3/interfaces/s3clientconfig.html
const config = {
  region: 'ap-southeast-1',
  credentials,
}

// https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-s3/classes/s3client.html
const client = new S3Client(config)

export async function list() {
  // https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-s3/interfaces/listobjectsv2commandinput.html
  const input = {
    Bucket: 'BucketNameHere'
  }
  // https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-s3/classes/command.html
  const cmd = new ListObjectsV2Command(input)
  // https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-s3/classes/listobjectsv2command.html
  return await client.send(cmd)
}

Stripe Error: No signatures found matching the expected signature for payload

I have a Stripe webhook that calls a Firebase function. In this function I need to verify that the request comes from Stripe's servers. Here is the code:
const functions = require('firebase-functions');
const bodyParser = require('body-parser');
const stripe = require("stripe")("sk_test_****");
const endpointSecret = 'whsec_****';
const app = require('express')();

app.use(bodyParser.json({
  verify: function (req, res, buf) {
    var url = req.originalUrl;
    if (url.startsWith('/webhook')) {
      req.rawBody = buf.toString()
    }
  }
}));

app.post('/webhook/example', (req, res) => {
  let sig = req.headers["stripe-signature"];

  try {
    console.log(req.bodyRaw)
    let event = stripe.webhooks.constructEvent(req.body, sig, endpointSecret);
    console.log(event);
    res.status(200).end()
    // Do something with event
  }
  catch (err) {
    console.log(err);
    res.status(400).end()
  }
});

exports.app = functions.https.onRequest(app);
As mentioned in Stripe Documentation, I have to use raw body to perform this security check.
I have tried with my current code and with :
app.use(require('body-parser').raw({type: '*/*'}));
But I always get this error :
Error: No signatures found matching the expected signature for payload. Are you passing the raw request body you received from Stripe? https://github.com/stripe/stripe-node#webhook-signing
Cloud Functions automatically parses body content of known types. If you're getting JSON, then it's already parsed and available to you in req.body. You shouldn't need to add other body parsing middleware.
If you need to process the raw data, you should use req.rawBody, but I don't think you'll need to do that here.
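In other words, with a Cloud Functions Express app the verification can use the already populated req.rawBody directly (a minimal sketch based on the question's code, with no extra body-parsing middleware):

app.post('/webhook/example', (req, res) => {
  const sig = req.headers['stripe-signature'];
  try {
    // Cloud Functions exposes the unparsed payload as req.rawBody
    const event = stripe.webhooks.constructEvent(req.rawBody, sig, endpointSecret);
    console.log(event.type);
    res.status(200).end();
  } catch (err) {
    console.log(err);
    res.status(400).end();
  }
});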
Here is what is working for me:
add this line:
app.use('/api/subs/stripe-webhook', bodyParser.raw({type: "*/*"}))
(The first argument specifies which route we should use the raw body parser on. See the app.use() reference doc.)
just before this line:
app.use(bodyParser.json());
(it doesn't affect all your operations, just this one: '/api/subs/stripe-webhook')
Note: If you are using Express 4.16+ you can replace bodyParser by express:
app.use('/api/subs/stripe-webhook', express.raw({type: "*/*"}));
app.use(express.json());
Then:
const endpointSecret = 'whsec_........'
const stripeWebhook = async (req, res) => {
  const sig = req.headers['stripe-signature'];
  let eventSecure = {}

  try {
    eventSecure = stripe.webhooks.constructEvent(req.body, sig, endpointSecret);
    //console.log('eventSecure :', eventSecure);
  }
  catch (err) {
    console.log('err.message :', err.message);
    res.status(400).send(`Webhook Secure Error: ${err.message}`)
    return
  }
  res.status(200).send({ received: true });
}
Here is code which is working for me:
app.use(bodyParser.json({
  verify: function (req, res, buf) {
    var url = req.originalUrl;
    if (url.startsWith('/stripe')) {
      req.rawBody = buf.toString();
    }
  }
}));
And then pass the req.rawBody for verification
stripe.checkWebHook(req.rawBody, signature);
Reference: https://github.com/stripe/stripe-node/issues/341
2 things to note:
pass req.rawBody instead of req.body to constructEvent
const event = stripe.webhooks.constructEvent(
  req.rawBody,
  sig,
  STRIPE_WEBHOOK_SECRET
);
Make sure you're using the correct webhook secret. It's unique per webhook url!
2021 - Solution
I faced this error, and after a lot of research I could not figure out the problem easily, but finally I solved it based on my architecture below:
//App.js
this.server.use((req, res, next) => {
  if (req.originalUrl.startsWith('/webhook')) {
    next();
  } else {
    express.json()(req, res, next);
  }
});

//routes.js
routes.post(
  '/webhook-payment-intent-update',
  bodyParser.raw({ type: 'application/json' }),
  // your stripe logic (I'm using a controller, but wherever)
  (req, res) => {
    stripe.webhooks.constructEvent(...)
  }
)
Two big warnings to pay attention to:
Make sure to send the req.headers['stripe-signature']
Make sure that your endpointSecret is right, otherwise it will keep giving the same error
Tips:
Test it locally by installing the Stripe CLI: https://stripe.com/docs/webhooks/test
Verify your key on the Stripe dashboard, or make sure you have the right key by checking your Stripe logs.
I hope it helps you. :)
// Use JSON parser for all non-webhook routes
app.use(
  bodyParser.json({
    verify: (req, res, buf) => {
      const url = req.originalUrl;
      if (url.startsWith('/api/stripe/webhook')) {
        req.rawBody = buf.toString();
      }
    }
  })
);
The above code looks the same as in the earlier answers, but even so I made one mistake: after adding the same thing I still got the same error. I finally figured out that if you configure body-parser below the rawBody code, then it works.
Like this:
// Use JSON parser for all non-webhook routes
app.use(
  bodyParser.json({
    verify: (req, res, buf) => {
      const url = req.originalUrl;
      if (url.startsWith('/api/stripe/webhook')) {
        req.rawBody = buf.toString();
      }
    }
  })
);

// Setup express response and body parser configurations
app.use(express.json());
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));
Hopefully, it'll help someone.
This is late but may help others.
From a GitHub answer:
const payload = req.body
const sig = req.headers['stripe-signature']
const payloadString = JSON.stringify(payload, null, 2);
const secret = 'webhook_secret';

const header = stripe.webhooks.generateTestHeaderString({
  payload: payloadString,
  secret,
});

let event;
try {
  event = stripe.webhooks.constructEvent(payloadString, header, secret);
} catch (err) {
  console.log(`Webhook Error: ${err.message}`)
  return res.status(400).send(`Webhook Error: ${err.message}`);
}

switch (event.type) {
  case 'checkout.session.completed': {
    // ......
  }
}
If you are trying to add a stripe webhook into your NextJS API Route, here's how to do so (ref):
import initStripe from "stripe";
import { buffer } from "micro";
import { NextApiRequest, NextApiResponse } from "next";
export const config = { api: { bodyParser: false } };
const handler = async (req: NextApiRequest, res: NextApiResponse) => {
const stripe = initStripe(process.env.STRIPE_SECRET_KEY||'');
const signature = req.headers["stripe-signature"];
const signingSecret = process.env.STRIPE_WEBHOOK_SECRET || '';
const reqBuffer = await buffer(req);
let event;
try {
event = stripe.webhooks.constructEvent(reqBuffer, signature, signingSecret);
} catch (error: any) {
console.log(error);
return res.status(400).send(`Webhook error: ${error?.message}`);
}
console.log({ event });
res.send({ received: true });
};
export default handler;
This uses buffer from the micro library, in combination with modifying the default API request to use the request's raw body. In some frameworks (like NextJS), rawBody doesn't come out of the box, hence the workaround of retrieving the raw body via reqBuffer, which is what stripe.webhooks.constructEvent needs.
I was able to obtain data from one webhook but not from a second one: the problem was that the secret key I used was the same as the one used for the first webhook. It turns out that every webhook has a different signing secret, and that's why I kept getting that same message.
AWS API Gateway + Lambda (Express.js CRUD): I'm using this for a Stripe webhook endpoint and it works for me:
app.use(require('body-parser').text({ type: "*/*" }));
This happened to me when sending a test webhook from the Stripe dashboard after I had renamed a firebase cloud function. All my other functions were working fine. Solved by re-setting in the terminal
firebase functions:config:set stripe.webhook_signature="Your webhook signing secret"
(if you're using that) and redeploying the functions firebase deploy --only functions
On a second occasion I solved the problem by rolling the stripe signature in the stripe dashboard.
Please use this script
app.use(
  bodyParser.json({
    verify: (req, res, buf) => {
      req.rawBody = buf;
    },
  })
);
My favourite was combining two of the above great answers.
Then you can use req.rawBody when you construct the event.
Replace "webhook" with whatever route you wish to have a raw body for.
app.use(
  "/webhook",
  express.json({
    verify: (req, res, buf) => {
      req.rawBody = buf.toString();
    },
  })
);
BEFORE
app.use(express.json());
Works well if you are using routes and controllers.
To use the raw body in Express for a specific endpoint in a separate middleware, my solution is just enabling the router to use express.raw for the webhook endpoint.
- node.js v12
- express.js v4.17.1
export const handleBodyRequestParsing = (router: Router): void => {
  router.use('/your_webhook_endpoint', express.raw({ type: '*/*' }))
  router.use(express.json({ limit: '100mb' }))
  router.use(express.urlencoded({ extended: true }))
}
Here is a quick tip which may save you hours!
If you are adding Stripe payments to your existing Express app, you may already be parsing every request as JSON at the beginning of the application, using the express middleware app.use(json()); or any other middleware (body-parser for example).
If you are doing that, change it to omit your webhook url.
Example:
Assume your payment webhook url is /payments/webhook
app.use((req, res, next) => {
  if (req.originalUrl.includes("/payments/webhook")) {
    next();
  } else {
    express.json()(req, res, next);
  }
});
When using Stripe in Express, if you have the following line in your code:
app.use(express.json());
it is going to prevent you from providing the raw body to Stripe, even when you explicitly set "bodyParser.raw", which will throw an error. This was the reason my code failed. Finally sorted it out.
I tried all the solutions above and none worked; I figured out that the only solution was not to use Express at all for this endpoint. You just have to create another HTTP function:
export const webhook = functions.https.onRequest(async (req, res) => {
  try {
    const sig = req.headers['stripe-signature']
    const endpointSecret = 'web_secret'

    const event = stripe.webhooks.constructEvent(
      req.rawBody,
      sig,
      endpointSecret
    )

    console.log(event.data.object)
    res.status(200).send(event.data.object)
  } catch (err) {
    console.error('an error occurred', err)
    res.status(400).send(`Webhook Error: ${err.message}`)
  }
})

AWS S3 Bucket Upload using CollectionFS and cfs-s3 meteor package

I am using Meteor.js with an Amazon S3 Bucket for uploading and storing photos. I am using the meteorite packages collectionFS and aws-s3. I have set up my aws-s3 connection correctly and the images collection is working fine.
Client side event handler:
'click .submit': function(evt, templ) {
  var user = Meteor.user();
  var photoFile = $('#photoInput').get(0).files[0];
  if(photoFile){
    var readPhoto = new FileReader();
    readPhoto.onload = function(event) {
      photodata = event.target.result;
      console.log("calling method");
      Meteor.call('uploadPhoto', photodata, user);
    };
  }
}
And my server side method:
'uploadPhoto': function uploadPhoto(photodata, user) {
  var tag = Random.id([10] + "jpg");
  var photoObj = new FS.File({name: tag});
  photoObj.attachData(photodata);
  console.log("s3 method called");
  Images.insert(photoObj, function (err, fileObj) {
    if(err){
      console.log(err, err.stack)
    }else{
      console.log(fileObj._id);
    }
  });
}
The file that is selected is a .jpg image file but upon upload I get this error on the server method:
Exception while invoking method 'uploadPhoto' Error: DataMan constructor received data that it doesn't support
And no matter whether I directly pass the image file, attach it as data, or use the FileReader to read it as text/binary/string, I still get that error. Please advise.
Ok, maybe some thoughts. I did things with collectionFS some months ago, so pay attention to the docs, because my examples may not be 100% correct.
Credentials should be set via environment variables, so your key and secret are available on the server only. Check this link for further reading.
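For illustration, a store configuration along those lines might look like this (a minimal sketch, assuming the cfs:s3 store options shown; the bucket name, region, and environment variable names are placeholders):

// server only - keys are read from environment variables, never shipped to the client
var imageStore = new FS.Store.S3("images", {
  region: "eu-west-1",
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  bucket: "my-photo-bucket"
});

Images = new FS.Collection("images", {
  stores: [imageStore]
});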
Ok first, here is some example code which is working for me. Check yours for differences.
Template helper:
'dropped #dropzone': function(event, template) {
  addImage(event);
}
Function addImage (defined below as addImagePreview):
function addImagePreview(event) {
  // Go through each file
  FS.Utility.eachFile(event, function(file) {
    // Some validation checks
    var reader = new FileReader();

    reader.onload = (function(theFile) {
      return function(e) {
        var fsFile = new FS.File(image.src);
        // setMetadata, which is validated in the collection
        // so only the owning user can update/remove fsFile
        fsFile.metadata = {owner: Meteor.userId()};

        PostImages.insert(fsFile, function (err, fileObj) {
          if(err) {
            console.log(err);
          }
        });
      };
    })(file);

    // Read in the image file as a data URL.
    reader.readAsDataURL(file);
  });
}
Ok, your next point is the validation. The validation can be done with allow/deny rules and with a filter on the FS.Collection. This way you can do all your validation AND insert via client.
Example:
PostImages = new FS.Collection('profileImages', {
  stores: [profileImagesStore],
  filter: {
    maxSize: 3145728,
    allow: {
      contentTypes: ['image/*'],
      extensions: ['png', 'PNG', 'jpg', 'JPG', 'jpeg', 'JPEG']
    }
  },
  onInvalid: function(message) {
    console.log(message);
  }
});

PostImages.allow({
  insert: function(userId, doc) {
    return (userId && doc.metadata.owner === userId);
  },
  update: function(userId, doc, fieldNames, modifier) {
    return (userId === doc.metadata.owner);
  },
  remove: function(userId, doc) {
    return false;
  },
  download: function(userId) {
    return true;
  },
  fetch: []
});
Here you will find another example: click
Another point of error is maybe your AWS configuration. Have you done everything as it is written here?
Based on this post (click) it seems that this error occurs when FS.File() is not constructed correctly, so maybe that is the first thing to check.
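For reference, collectionFS also accepts the browser File object directly on the client, which sidesteps building the FS.File from raw data yourself (a minimal sketch adapted to the question's event handler, not a guaranteed fix):

'click .submit': function(evt, templ) {
  var photoFile = $('#photoInput').get(0).files[0];
  if (photoFile) {
    // Insert the File object directly; collectionFS wraps it in an FS.File
    Images.insert(photoFile, function (err, fileObj) {
      if (err) {
        console.log(err);
      } else {
        console.log(fileObj._id);
      }
    });
  }
}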
A lot to read, so I hope this helps you :)