I have an app with an Amplify backend and I am getting an error when calling my Lambda:
"Error: Cannot find module 'stripe'\nRequire stack:\n- /var/task/index.js\n- /var/runtime/UserFunction.js\n- /var/runtime/index.js"
I have installed stripe and it shows up in the dependencies of my project's root package.json, but not in the dependencies for my function. I tried installing stripe in my function's folder with no change. I am using React 17.0.2 and stripe 9.6.0.
I found a post with a similar problem but they were already able to install stripe: AWS Lambda Error: Cannot find module 'stripe' Require stack
My Lambda:
const aws = require('aws-sdk');
const ddb = new aws.DynamoDB({apiVersion: '2012-10-08'});
const stripe = require('stripe')('secret key');

/**
 * @type {import('@types/aws-lambda').APIGatewayProxyHandler}
 */
exports.handler = async (event) => {
    try {
        const tableName = process.env.tableName
        const {username, email} = event.arguments.input
        const account = await stripe.accounts.create({
            type: 'express',
            email: `${email}`,
            metadata: {user: `${username}`}
        });
        console.log("Account creation response: ", account)
        console.log("Account id: ", account.id)
        // store the Stripe account id in DynamoDB
        let ddbParams = {
            Item: {
                // the low-level DynamoDB client expects typed attribute values
                'stripe_id': {S: `${account.id}`}
            },
            TableName: tableName
        }
        try {
            await ddb.putItem(ddbParams).promise()
            console.log("Successfully updated stripe_id field")
        } catch (err) {
            console.log("Storing to DB error: ", err)
        }
        const accountId = account.id
        const accountLink = await stripe.accountLinks.create({
            account: accountId,
            // Swap for live website
            refresh_url: 'http://localhost:3000/profile',
            return_url: 'http://localhost:3000/listingform',
            type: 'account_onboarding',
        });
        console.log('Account link response :', accountLink)
        return accountLink
    } catch (err) {
        throw new Error(err)
    }
};
My package.json in my function's folder:
{
    "name": "createStripeConnectAccount",
    "version": "2.0.0",
    "description": "Lambda function generated by Amplify",
    "main": "index.js",
    "license": "Apache-2.0",
    "devDependencies": {
        "@types/aws-lambda": "^8.10.92"
    }
}
UPDATE:
I managed to install stripe and have it as a dependency but I still get the same error.
My package.json now looks like this:
{
    "name": "createStripeConnectAccount",
    "version": "2.0.0",
    "description": "Lambda function generated by Amplify",
    "main": "index.js",
    "license": "Apache-2.0",
    "devDependencies": {
        "@types/aws-lambda": "^8.10.92"
    },
    "dependencies": {
        "save": "^2.5.0",
        "stripe": "^9.6.0"
    }
}
The solution is to first install Stripe in the function's src folder (amplify/backend/function/createStripeConnectAccount/src in the default Amplify layout), not just in the function's root folder. That folder didn't show up in the external terminal I had open, so I had to right-click the src folder, open a terminal there, and then run npm install.
I'm keeping this question up for other new developers or people new to AWS.
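If you want to sanity-check the packaging before pushing, here is a minimal sketch (the path assumes the default Amplify layout; adjust to your project) that confirms the dependency resolves from the folder that actually gets zipped into the Lambda:

// run with `node check-stripe.js` from amplify/backend/function/createStripeConnectAccount/src
// a quick check that the dependency resolves locally, i.e. node_modules exists in this folder
const stripePath = require.resolve('stripe'); // throws "Cannot find module 'stripe'" if it is missing here
console.log('stripe resolves from:', stripePath);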
Related
I am trying to update a thing's shadow on AWS IoT Core by calling the function 'UpdateThingShadowCommand' from my vueJS web app.
I am following instructions from the documentation here:
https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-iot-data-plane/classes/updatethingshadowcommand.html
However, when I execute the method 'UpdateThingShadowCommand', I keep running into the following error message:
net::ERR_CERT_AUTHORITY_INVALID
And another log message:
TypeError: Failed to fetch
My code is as follows:
import { IoTDataPlaneClient } from "@aws-sdk/client-iot-data-plane";
import { UpdateThingShadowCommand } from "@aws-sdk/client-iot-data-plane";

async myMethod () {
    const configIotDataPlaneClient = {
        apiVersion: 'XXXXXXXX',
        region: 'XXXXXXX',
        credentials: {
            accessKeyId: 'XXXXXXXXXXXXXXXX',
            secretAccessKey: 'XXXXXXXXXXXXXXXXX'
        }
    };

    // Initializing the client
    const clientShadow = new IoTDataPlaneClient(configIotDataPlaneClient);
    console.log(clientShadow)

    const inputShadow = {
        payload: new Uint8Array(
            Buffer.from(
                JSON.stringify({
                    "state": {
                        "reported": {
                            "item1": "val1",
                            "item2": "val2"
                        }
                    }
                }),
            ),
        ),
        //shadowName: "",
        thingName: "thing-name"
    }

    // Updating a thing's shadow
    try {
        const commandShadow = new UpdateThingShadowCommand(inputShadow);
        console.log(commandShadow)
        const responseShadow = await clientShadow.send(commandShadow);
        console.log("Update Shadow response", responseShadow)
    }
    catch (error) {
        console.log("Update Shadow error: ", error)
    }
    finally {
        console.log("Update Shadow: finally method")
    }
}
Can anyone suggest why I may be getting these errors? Any help is much appreciated!
I have created this stack:
export class InfrastructureStack extends cdk.Stack {
    constructor(scope: cdk.Construct, id: string, props?: cdk.StackProps) {
        super(scope, id, props);

        const bucket = new s3.Bucket(this, "My Hello Website", {
            websiteIndexDocument: 'index.html',
            websiteErrorDocument: 'error.html',
            publicReadAccess: true,
            removalPolicy: cdk.RemovalPolicy.DESTROY
        });

        const api = new apigateway.RestApi(this, "My Endpoint", {
            restApiName: "My rest API name",
            description: "Some cool description"
        });

        const myLambda = new lambda.Function(this, 'My Backend', {
            runtime: lambda.Runtime.NODEJS_8_10,
            handler: 'index.handler',
            code: lambda.Code.fromAsset(path.join(__dirname, 'code'))
        });

        const apiToLambda = new apigateway.LambdaIntegration(myLambda)

        api.root.addMethod('GET', apiToLambda);

        updateWebsiteUrl.newUrl(api.url);
    }
}
The last line of code is my function that updates an asset which will be deployed to S3 as a website, injecting the API URL created during deployment. It is just a plain Node.js script that replaces the file's placeholder with api.url.
Of course, at synth (compile) time the CDK does not know what the final address of the REST endpoint will be, because that is only resolved at deploy time, so it updates my URL with something like:
'https://${Token[TOKEN.26]}.execute-api.${Token[AWS::Region.4]}.${Token[AWS::URLSuffix.1]}/${Token[TOKEN.32]}/;'
Is there any way that I can update this after the Lambda has been integrated with the API endpoint, i.e. after those have been deployed?
I would like to use the @aws-cdk/aws-s3-deployment module to deploy the code to the newly created bucket, all in the same stack, so that one cdk deploy updates everything I need.
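For reference, a minimal sketch of how that module is typically wired up inside the same stack (CDK v1 style; the './front' asset path is an assumption, and bucket is the bucket created earlier in the stack):

import * as s3deploy from '@aws-cdk/aws-s3-deployment';

new s3deploy.BucketDeployment(this, 'DeployFrontend', {
    sources: [s3deploy.Source.asset('./front')], // assumed local folder containing the static site
    destinationBucket: bucket                    // the website bucket created earlier in this stack
});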
To avoid confusion, my updateWebsiteUrl is:
import * as fs from 'fs';
import * as path from 'path';

export function newUrl(newUrl: string): void {
    const scriptPath = path.join(__dirname, '/../../front/');
    const scriptName = 'script.js';
    fs.readFile(scriptPath + scriptName, (err, buf) => {
        let scriptContent: string = buf.toString();
        let newScript = scriptContent.replace('URL_PLACEHOLDER', newUrl);
        fs.writeFile(scriptPath + 'newScript.js', newScript, () => {
            console.log('done writing');
        });
    });
}
And my script is simple:
const url = 'URL_PLACEHOLDER';

function foo() {
    let req = new XMLHttpRequest();
    req.open('GET', url, false);
    req.send(null);
    if (req.status == 200) {
        replaceContent(req.response);
    }
}

function replaceContent(content) {
    document.getElementById('content').innerHTML = content;
}
I ran into the same issue today and managed to find a solution for it.
The C# code I am using in my CDK program is the following:
// This will at runtime be just a token which refers to the actual JSON in the format {'api':{'baseUrl':'https://your-url'}}
var configJson = stack.ToJsonString(new Dictionary<string, object>
{
    ["api"] = new Dictionary<string, object>
    {
        ["baseUrl"] = api.Url
    }
});

var configFile = new AwsCustomResource(this, "config-file", new AwsCustomResourceProps
{
    OnUpdate = new AwsSdkCall
    {
        Service = "S3",
        Action = "putObject",
        Parameters = new Dictionary<string, string>
        {
            ["Bucket"] = bucket.BucketName,
            ["Key"] = "config.json",
            ["Body"] = configJson,
            ["ContentType"] = "application/json",
            ["CacheControl"] = "max-age=0, no-cache, no-store, must-revalidate"
        },
        PhysicalResourceId = PhysicalResourceId.Of("config"),
    },
    Policy = AwsCustomResourcePolicy.FromStatements(
        new[]
        {
            new PolicyStatement(new PolicyStatementProps
            {
                Actions = new[] { "s3:PutObject" },
                Resources = new[] { bucket.ArnForObjects("config.json") }
            })
        })
});
You will need to install the CDK custom resources module to have these types available: https://docs.aws.amazon.com/cdk/api/latest/docs/custom-resources-readme.html
It is basically part of the solution you can find as an answer to the question "AWS CDK passing API Gateway URL to static site in same Stack", or in this GitHub repository: https://github.com/jogold/cloudstructs/blob/master/src/static-website/index.ts#L134
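For anyone working in the question's TypeScript/JavaScript stack rather than C#, here is a rough equivalent sketch using the custom resources module (CDK v1; cdk, bucket, and api refer to the imports and constructs already defined in the stack above):

import * as cr from '@aws-cdk/custom-resources';
import * as iam from '@aws-cdk/aws-iam';

// Writes a config.json containing the deploy-time API URL into the website bucket.
new cr.AwsCustomResource(this, 'ConfigFile', {
    onUpdate: {
        service: 'S3',
        action: 'putObject',
        parameters: {
            Bucket: bucket.bucketName,
            Key: 'config.json',
            // resolved at deploy time, so the real URL ends up in the file
            Body: cdk.Stack.of(this).toJsonString({ api: { baseUrl: api.url } }),
            ContentType: 'application/json',
            CacheControl: 'max-age=0, no-cache, no-store, must-revalidate'
        },
        physicalResourceId: cr.PhysicalResourceId.of('config')
    },
    policy: cr.AwsCustomResourcePolicy.fromStatements([
        new iam.PolicyStatement({
            actions: ['s3:PutObject'],
            resources: [bucket.arnForObjects('config.json')]
        })
    ])
});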
I followed these steps while trying to run an Android app test via AWS Lambda (Node.js):
Created a project
Created an upload
Uploaded the APK to the signed URL
Once the upload was done, I created a device pool using the following params:
var createDevicePoolParams = {
    name: "DAP_Device_Pool",
    description: "DAP_Android_Devices",
    projectArn: projectARN,
    rules: [{
        attribute: "PLATFORM",
        operator: "EQUALS",
        value: "\"ANDROID\""
    }]
};
Then I called scheduleRun with the following params:
var scheduleRunParams = {
    appArn: uploadARN,
    name: "tarunRun",
    devicePoolArn: devicePoolARN,
    projectArn: projectARN,
    test: {
        type: "BUILTIN_FUZZ",
    }
};
But I am getting an error about missing or unprocessed resources.
I am not able to understand what I am missing. My understanding is that if I am using the built-in fuzz test type then I don't need to upload any custom test cases.
Can somebody please help point out which step is missing?
The key step is:
After your uploads have been processed by Device Farm, call aws devicefarm schedule-run.
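In the JavaScript SDK that waiting step looks roughly like this (a sketch using aws-sdk v2; the full script below does the same thing with its getStatus helper):

const AWS = require('aws-sdk');
const devicefarm = new AWS.DeviceFarm({ region: 'us-west-2' });

// Poll the upload until Device Farm has finished processing it; only then is it safe to schedule the run.
async function waitForUpload(uploadArn) {
    let status;
    do {
        const { upload } = await devicefarm.getUpload({ arn: uploadArn }).promise();
        status = upload.status; // INITIALIZED -> PROCESSING -> SUCCEEDED (or FAILED)
        if (status === 'FAILED') throw new Error('Upload processing failed: ' + upload.message);
        if (status !== 'SUCCEEDED') await new Promise(resolve => setTimeout(resolve, 5000));
    } while (status !== 'SUCCEEDED');
}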
[update]
I put this code in an AWS Lambda function and it worked there as well. Here is a gist of it:
https://gist.github.com/jamesknowsbest/3ea0e385988b0098e5f9d38bf5a932b6
Here is the code I just authored, and it seems to work with the built-in Fuzz/Explorer tests:
// assume we already executed `npm install aws-sdk`
var AWS = require('aws-sdk');
// assumes `npm install request`
const request = require("request");
// fs is built in to Node.js, no install needed
const fs = require('fs');
// https://stackoverflow.com/a/41641607/8016330
const sleep = (waitTimeInMs) => new Promise(resolve => setTimeout(resolve, waitTimeInMs));

// Device Farm is only available in the us-west-2 region
var devicefarm = new AWS.DeviceFarm({ region: 'us-west-2' });

(async function() {
    let project_params = {
        name: "test of fuzz tests"
    };
    let PROJECT_ARN = await devicefarm.createProject(project_params).promise().then(
        function(data) {
            return data.project.arn;
        },
        function(error) {
            console.error("Error creating project", "Error: ", error);
        }
    );
    console.log("Project created ", "Project arn: ", PROJECT_ARN);

    // create the upload and upload files to the project
    let params = {
        name: "app-debug.apk",
        type: "ANDROID_APP",
        projectArn: PROJECT_ARN
    };
    let UPLOAD = await devicefarm.createUpload(params).promise().then(
        function(data) {
            return data.upload;
        },
        function(error) {
            console.error("Creating upload failed with error: ", error);
        }
    );
    let UPLOAD_ARN = UPLOAD.arn;
    let UPLOAD_URL = UPLOAD.url;
    console.log("upload created with arn: ", UPLOAD_ARN);
    console.log("uploading file...");

    let options = {
        method: 'PUT',
        url: UPLOAD_URL,
        headers: {},
        body: fs.readFileSync("/path/to/your/apk/file")
    };
    // wait for upload to finish
    await new Promise(function(resolve, reject) {
        request(options, function (error, response, body) {
            if (error) {
                console.error("uploading file failed with error: ", error);
                reject(error);
            }
            resolve(body);
        });
    });

    // get the status of the upload and make sure it finished processing before scheduling
    let STATUS = await getStatus(UPLOAD_ARN);
    console.log("upload status is: ", STATUS);
    while (STATUS !== "SUCCEEDED") {
        await sleep(5000);
        STATUS = await getStatus(UPLOAD_ARN);
        console.log("upload status is: ", STATUS);
    }

    // create device pool
    let device_pool_params = {
        projectArn: PROJECT_ARN,
        name: "Google Pixel 2",
        rules: [{"attribute": "ARN", "operator": "IN", "value": "[\"arn:aws:devicefarm:us-west-2::device:5F20BBED05F74D6288D51236B0FB9895\"]"}]
    };
    let DEVICE_POOL_ARN = await devicefarm.createDevicePool(device_pool_params).promise().then(
        function(data) {
            return data.devicePool.arn;
        },
        function(error) {
            console.error("device pool failed to create with error: ", error);
        }
    );
    console.log("Device pool created successfully with arn: ", DEVICE_POOL_ARN);

    // schedule the run
    let schedule_run_params = {
        name: "MyRun",
        devicePoolArn: DEVICE_POOL_ARN, // You can get the Amazon Resource Name (ARN) of the device pool by using the list-pools CLI command.
        projectArn: PROJECT_ARN, // You can get the Amazon Resource Name (ARN) of the project by using the list-projects CLI command.
        test: {
            type: "BUILTIN_FUZZ"
        },
        appArn: UPLOAD_ARN
    };
    let schedule_run_result = await devicefarm.scheduleRun(schedule_run_params).promise().then(
        function(data) {
            return data.run;
        },
        function(error) {
            console.error("Schedule run command failed with error: ", error);
        }
    );
    console.log("run finished successfully with result: ", schedule_run_result);
})();

async function getStatus(UPLOAD_ARN) {
    return await devicefarm.getUpload({ arn: UPLOAD_ARN }).promise().then(
        function(data) {
            return data.upload.status;
        },
        function(error) {
            console.error("getting upload failed with error: ", error);
        }
    );
}
Output is:
Project created Project arn: arn:aws:devicefarm:us-west-2:111122223333:project:b9233b49-967e-4b09-a51a-b5c4101340a1
upload created with arn: arn:aws:devicefarm:us-west-2:111122223333:upload:b9233b49-967e-4b09-a51a-b5c4101340a1/48ffd115-f7d7-4df5-ae96-4a44911bff65
uploading file...
upload status is: INITIALIZED
upload status is: SUCCEEDED
Device pool created successfully with arn: arn:aws:devicefarm:us-west-2:111122223333:devicepool:b9233b49-967e-4b09-a51a-b5c4101340a1/c0ce1bbc-7b40-4a0f-a419-ab024a6b1000
run finished successfully with result: { arn:
'arn:aws:devicefarm:us-west-2:111122223333:run:b9233b49-967e-4b09-a51a-b5c4101340a1/39369894-3829-4e14-81c9-bdfa02c7e032',
name: 'MyRun',
type: 'BUILTIN_FUZZ',
platform: 'ANDROID_APP',
created: 2019-06-06T23:51:13.529Z,
status: 'SCHEDULING',
result: 'PENDING',
started: 2019-06-06T23:51:13.529Z,
counters:
{ total: 0,
passed: 0,
failed: 0,
warned: 0,
errored: 0,
stopped: 0,
skipped: 0 },
totalJobs: 1,
completedJobs: 0,
billingMethod: 'METERED',
seed: 982045377,
appUpload:
'arn:aws:devicefarm:us-west-2:111122223333:upload:b9233b49-967e-4b09-a51a-b5c4101340a1/48ffd115-f7d7-4df5-ae96-4a44911bff65',
eventCount: 6000,
jobTimeoutMinutes: 150,
devicePoolArn:
'arn:aws:devicefarm:us-west-2:111122223333:devicepool:b9233b49-967e-4b09-a51a-b5c4101340a1/c0ce1bbc-7b40-4a0f-a419-ab024a6b1000',
radios: { wifi: true, bluetooth: false, nfc: true, gps: true } }
HTH
-James
I want to trigger a Cloud Function whenever a new file is uploaded to a Cloud Storage bucket. This function should call a Dataproc job written in PySpark to read the file and load it into BigQuery.
I want to know how to call a Google Dataproc job from a Cloud Function. Please suggest.
I was able to create a simple Cloud Function that triggers a Dataproc job on a GCS file-create event. In this example, the file in GCS contains a Pig query to execute. However, you can follow the Dataproc API documentation to create a PySpark version (a sketch of that variant follows the code below).
index.js:
exports.submitJob = (event, callback) => {
    const google = require('googleapis');
    const projectId = 'my-project';
    const clusterName = 'my-cluster';
    const file = event.data;

    if (file.name) {
        google.auth.getApplicationDefault(function (err, authClient, projectId) {
            if (err) {
                throw err;
            }

            const queryFileUri = "gs://" + file.bucket + "/" + file.name;
            console.log("Using queryFileUri: ", queryFileUri);

            if (authClient.createScopedRequired && authClient.createScopedRequired()) {
                authClient = authClient.createScoped([
                    'https://www.googleapis.com/auth/cloud-platform',
                    'https://www.googleapis.com/auth/userinfo.email'
                ]);
            }

            const dataproc = google.dataproc({ version: 'v1beta2', auth: authClient });
            dataproc.projects.regions.jobs.submit({
                projectId: projectId,
                region: "global",
                resource: {
                    "job": {
                        "placement": {"clusterName": clusterName},
                        "pigJob": {
                            "queryFileUri": queryFileUri,
                        }
                    }
                }
            }, function(err, response) {
                if (err) {
                    console.error("Error submitting job: ", err);
                }
                console.log("Dataproc response: ", response);
                // signal completion only after the job has been submitted
                callback();
            });
        });
    } else {
        throw "Skipped processing file!";
    }
};
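If you need the PySpark version mentioned above, only the job payload changes. Here is a sketch of the same jobs.submit call, reusing the dataproc, projectId, clusterName and queryFileUri variables from the function above; the GCS path of the PySpark script is a hypothetical example:

dataproc.projects.regions.jobs.submit({
    projectId: projectId,
    region: "global",
    resource: {
        "job": {
            "placement": {"clusterName": clusterName},
            "pysparkJob": {
                // hypothetical script that reads the new file and loads it into BigQuery
                "mainPythonFileUri": "gs://my-bucket/load_to_bigquery.py",
                // pass the uploaded file's GCS URI to the job as an argument
                "args": [queryFileUri]
            }
        }
    }
}, function(err, response) {
    if (err) {
        console.error("Error submitting job: ", err);
    }
    console.log("Dataproc response: ", response);
    callback();
});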
Make sure to set Function to execute to submitJob.
package.json:
{
    "name": "sample-cloud-storage",
    "version": "0.0.1",
    "dependencies": { "googleapis": "^21.3.0" }
}
The following blog post gave me many ideas on how to get started:
https://cloud.google.com/blog/big-data/2016/04/scheduling-dataflow-pipelines-using-app-engine-cron-service-or-cloud-functions
I am relatively new to AWS and Alexa skills. I am building a simple custom skill that gives you dressing advice depending on the weather.
I have 2 custom intents: dressingTodayIntent & dressingTomorrowIntent. In the Service Simulator of the developer portal, my two intents don't work; I do get a Lambda response, but with an undefined outputSpeech, like this:
{
    "version": "1.0",
    "response": {
        "outputSpeech": {
            "type": "SSML",
            "ssml": "<speak> undefined </speak>"
        },
        "card": null,
        "reprompt": null,
        "speechletResponse": {
            "outputSpeech": {
                "id": null,
                "ssml": "<speak> undefined </speak>"
            },
            "card": null,
            "directives": null,
            "reprompt": null,
            "shouldEndSession": true
        }
    },
    "sessionAttributes": {}
}
Could it be a scope issue in my intent code?
'DressingTodayIntent': function() {
    var dressingAdvice;
    var speechOutput = getJSON('https://api.darksky.net/forecast/9e0495a835ed823a705a9a567eee982a/48.861317,2.348764?units=si&exclude=currently,minutely,hourly,alerts,flags',
        function(err, forecast) {
            if (err) {
                console.log('Error occurred while trying to retrieve weather data', err);
            } else {
                dressingAdvice = getDressingAdvice(forecast, true);
                console.log("one " + dressingAdvice);
            }
            console.log("two " + dressingAdvice);
            return dressingAdvice;
        });
    console.log("three " + speechOutput);
    this.response.cardRenderer("Your dressing advice for today:", speechOutput);
    this.response.speak(speechOutput);
    this.emit(':responseReady');
},
In AWS Lambda, I see a correct output for the first 2 logs, and an error for the 3rd one:
first log: "one " + dressingAdvice, as expected
second log: "two " + dressingAdvice, as expected
third log: "three " + undefined
Thank you for your help!
When you say "tested from AWS Lambda", I assume that you mean using the AWS console to send a JSON test message to the Lambda, then looking at the response JSON to determine if it is correct?
If so, make sure that it matches the JSON sent to/from the Alexa test page in the dev portal. Sounds like they might be different.
Also, make sure that you are linked to the correct ARN in the Alexa skill.
The undefined is most likely because the response is built before the asynchronous callback that sets the value has run, i.e. a scope/timing issue in the code.
I noticed in your response that you don't have any sessionAttributes. Is your code setting or pulling the value for the response from a session value? If so, the values need to be sent back with the sessionAttributes.
I figured out what was wrong: I needed to move the response code into the callback function, like this:
'DressingTodayIntent': function() {
    var speechOutput;
    var self = this;
    var dressingAdvice = getJSON('https://api.darksky.net/forecast/9e0495a835ed823a705a9a567eee982a/48.861317,2.348764?units=si&exclude=currently,minutely,hourly,alerts,flags',
        function(err, forecast) {
            if (err) {
                console.log('Error occurred while trying to retrieve weather data', err);
            } else {
                speechOutput = getDressingAdvice(forecast, true);
            }
            self.response.cardRenderer("Your dressing advice for today:", speechOutput);
            self.response.speak(speechOutput);
            self.emit(':responseReady');
        });
},