I have written an HTTP-triggered function in GCP. It computes some data as expected, and after the computation I would like to publish the result to an MQTT topic.
I added the following snippet of code, but it triggers an error:
Error: Error: Cannot find module '@google-cloud/pubsub'
Below is the code I added:
// decoding worked
const PubSub = require('@google-cloud/pubsub');

// Your Google Cloud Platform project ID
const projectId = 'XXXXX';

// Instantiates a client
const pubsubClient = new PubSub({
  projectId: projectId
});

// The name for the new topic
const topicName = 'XXXX';

// Creates the new topic
pubsubClient
  .createTopic(topicName)
  .then(results => {
    const topic = results[0];
    console.log(`Topic ${topic.name} created.`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });
If I remove the library import, I get:
Error: ReferenceError: PubSub is not defined
So: how can I publish to a topic from an HTTP-triggered function in GCP?
You need to install the @google-cloud/pubsub library as a dependency so that your Cloud Function can import it successfully. You can do so by running the following command locally:
npm install --save @google-cloud/pubsub
This will add the library to the package.json file you upload with your function code.
If you are writing your function directly in the Developer Console, you'll need to add the following to your package.json file:
"dependencies": {
  "@google-cloud/pubsub": "^0.19.0"
}
Related
I am trying to deploy a simple containerized Express app using either GCE or Cloud Run.
It simply calls getSignedUrl with action set to 'read':
router.get('/', async function (req, res, next) {
  const gs = new Storage();
  const credentials = await gs.authClient.getCredentials();
  log(`Client email: ${credentials.client_email}`);
  const [url] = await gs.bucket(GS_BUCKET).file('0.txt').getSignedUrl({
    version: 'v4',
    action: 'read',
    expires: Date.now() + 60_000,
    contentType: 'text/plain',
  });
  res.render('index', { title: 'Express', ...credentials, url });
})
I set up my local development environment using the default service account for the project, as explained here.
Now, when I run it on my local machine, either directly (using Node.js) or in a container (using Docker), it works fine and generates a signed URL every time. When I try to build and deploy the container in the cloud (using Cloud Build + Cloud Run or GCE), however, I get the following error:
Error: The caller does not have permission
    at Gaxios._request (/workspace/node_modules/gaxios/build/src/gaxios.js:130:23)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Compute.requestAsync (/workspace/node_modules/google-auth-library/build/src/auth/oauth2client.js:382:18)
    at async GoogleAuth.signBlob (/workspace/node_modules/google-auth-library/build/src/auth/googleauth.js:721:21)
    at async sign (/workspace/node_modules/@google-cloud/storage/build/src/signer.js:181:35) {
The client_email property is the same in both environments: 6***********-compute@developer.gserviceaccount.com, i.e. the project's default service account, which seems to have the required permissions (as shown by the operation succeeding on my local machine).
What could cause this error and how can I find out?
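One way to narrow this down (a diagnostic sketch of my own, not from the post): with metadata-based credentials, getSignedUrl cannot sign locally and goes through the GoogleAuth.signBlob frame visible in the stack trace above, so calling the signer directly isolates the permission check. GoogleAuth#sign is an existing google-auth-library method; the canSignBlob helper name is hypothetical.

```javascript
// Diagnostic sketch: exercise the blob-signing path that getSignedUrl
// relies on. If this fails with the same "caller does not have permission"
// error, the runtime service account likely lacks the
// iam.serviceAccounts.signBlob permission (e.g. granted via
// roles/iam.serviceAccountTokenCreator on itself).
async function canSignBlob() {
  // google-auth-library is already a transitive dependency of
  // @google-cloud/storage
  const { GoogleAuth } = require('google-auth-library');
  const auth = new GoogleAuth();
  try {
    await auth.sign('cHJvYmU='); // sign an arbitrary base64 payload
    return true;
  } catch (err) {
    console.error('signBlob failed:', err.message);
    return false;
  }
}
```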
I want to deploy AWS CDK stacks from within a Lambda function. The use case is to support similar functionality to that described in Trek10's Serverless CI/CD pipeline, where the relevant code is here.
In a nutshell, AWS CodePipeline only supports listening to changes within a single Git branch. I want to listen to GitHub events relating to the creation of repos and branches, and create CodePipeline instances in response to these events, so that there is a pipeline for each branch of each Git repository. I want a Lambda to listen to the GitHub events and create CDK stacks. The Trek10 example uses Python and calls CloudFormation directly; I'd like the Lambdas to be much simpler and use the CDK instead.
Here is a simple Lambda using TypeScript that is derived from this AWS CDK issue:
// index.ts
import {ScheduledEvent} from 'aws-lambda';
import {CloudFormationDeploymentTarget, DEFAULT_TOOLKIT_STACK_NAME} from 'aws-cdk/lib/api/deployment-target';
import {CdkToolkit} from 'aws-cdk/lib/cdk-toolkit';
import {AppStacks} from 'aws-cdk/lib/api/cxapp/stacks';
import {Configuration} from 'aws-cdk/lib/settings';
import {execProgram} from 'aws-cdk/lib/api/cxapp/exec';
import * as yargs from 'yargs';
import {SDK} from 'aws-cdk/lib/api/util/sdk';

export const handleCloudWatchEvent = async (event: ScheduledEvent): Promise<void> => {
  try {
    const aws = new SDK();
    const argv = await yargs.parse(['deploy', '--app', 'bin/pipeline.js', '--staging', '/tmp', '--verbose', '--require-approval', 'never']);
    const configuration = new Configuration(argv);
    await configuration.load();
    const appStacks = new AppStacks({
      configuration,
      aws,
      synthesizer: execProgram,
    });
    const provisioner = new CloudFormationDeploymentTarget({ aws });
    const cli = new CdkToolkit({ appStacks, provisioner });
    const toolkitStackName = configuration.settings.get(['toolkitStackName']) || DEFAULT_TOOLKIT_STACK_NAME;
    await cli.deploy({
      stackNames: [],
      exclusively: argv.exclusively as boolean,
      toolkitStackName,
      roleArn: argv.roleArn as string,
      requireApproval: configuration.settings.get(['requireApproval']),
      ci: true,
      reuseAssets: argv['build-exclude'] as string[],
      sdk: aws
    });
    return;
  } catch (e) {
    console.error(e);
    return;
  }
};
However, I get an error as described in the issue mentioned above:
ERROR { Error: Cannot find module '../package.json'
    at Function.Module._resolveFilename (internal/modules/cjs/loader.js:636:15)
    at Function.Module._load (internal/modules/cjs/loader.js:562:25)
    at Module.require (internal/modules/cjs/loader.js:692:17)
    at new SDK (/var/task/node_modules/aws-cdk/lib/api/util/sdk.ts:92:39)
    at Runtime.exports.handleCloudWatchEvent [as handler] (/resources/index.ts:204:21)
    at Runtime.handleOnce (/var/runtime/Runtime.js:66:25)
    at process._tickCallback (internal/process/next_tick.js:68:7) code: 'MODULE_NOT_FOUND' }
I don't want to patch sdk.ts (https://github.com/aws/aws-cdk/blob/master/packages/aws-cdk/lib/api/util/sdk.ts#L92), as that seems like a very dirty workaround, so I'm looking for another approach.
What is the simplest working example of calling the CDK from within a Lambda function?
Recently I had an ideal use case for a CDK deployer in Lambda, but couldn't find any good, complete examples of how to do this.
Using kadishmal's example from the CDK GitHub issue, I managed to put the CDK into a Lambda layer, delete the AWS SDK module from it (the Lambda runtime already includes it), attach the layer to a Lambda, and run CDK deploy/destroy from there. There is a 250 MB size restriction for a Lambda and all its layers, so I also had to use Webpack to optimize the build size.
Here is a GitHub repository with a basic example: a stack template that deploys an S3 bucket and destroys it.
I created this repository, which uses a five-minute approach with Gradle and Docker to install the CDK in a target location and produces the Lambda layer zip file on your local system, which you can upload manually in the AWS console or use, e.g., with the CDK.
An example Lambda handler function using the Node.js runtime with the layer attached could look something like this:
exports.handler = async (event) => {
  const spawnSync = require('child_process').spawnSync;
  // stdio: 'pipe' captures both stdout and stderr
  const result = spawnSync('cdk', ['--version'], {
    stdio: 'pipe'
  });
  console.log(result.status);
  console.log(result.stdout.toString());
};
I was previously using the all-in-one aws-sdk npm module (https://www.npmjs.com/package/aws-sdk) to invoke an AWS Lambda function, and for that the following code had been working well:
// Some code to get "credentials"
...
const AWS = require('aws-sdk');
const lambda = new AWS.Lambda({
  accessKeyId: credentials.accessKeyId,
  secretAccessKey: credentials.secretAccessKey,
  region: Config.REGION
});
lambda.invoke(pullParams, (err, data) => {
  // I would do something with data
});
...
Now, taking a cue from https://github.com/aws/aws-sdk-js-v3, I wish to use the modularised @aws-sdk/client-lambda-node package, since it is the only client that I need in my project. Thus, I have changed my code (as suggested here: https://github.com/aws/aws-sdk-js-v3/tree/master/packages/client-lambda-node#usage) like so:
import * as AWS from "@aws-sdk/client-lambda-node/Lambda";
/*
  I believe there is a typo in the form of
  "
  import * as AWS from "@aws-sdk/@aws-sdk/client-lambda-node/Lambda";
  "
  at the original page
*/
...
// Some code to get the same "credentials" as above
const lambda = new AWS.Lambda({
  accessKeyId: credentials.accessKeyId,
  secretAccessKey: credentials.secretAccessKey,
  region: Config.REGION
});
lambda.invokeAsync(pullParams, (err, data) => {
  // I want to do something with err / data
});
...
For what it's worth, this is inside a ReactJS app (though I'm sure that's not relevant). Trying the above code with version 0.1.0-preview.5 inside a browser (where it worked earlier) perpetually gives me:
http://169.254.169.254/latest/meta-data/iam/security-credentials/ net::ERR_CONNECTION_TIMED_OUT
Error: Unable to connect to instance metadata service (I guess related to (1))
Is the library unstable for use, or am I doing something wrong?
You have to pass your credentials under the credentials key.
Like:
const lambda = new AWS.Lambda({
  credentials: {
    accessKeyId: credentials.accessKeyId,
    secretAccessKey: credentials.secretAccessKey,
  },
  region: Config.REGION
});
Or:
const lambda = new AWS.Lambda({
  credentials,
  region: Config.REGION
});
Yes, version 3 of the SDK is still in beta preview and they've stated that breaking changes are to be expected:
While the SDK is in preview, you may encounter bugs.
To answer your question, yes, it is absolutely unstable for production at the present time. Your specific issue is quite common and I was unable to get to the bottom of it either. My production work all uses v2 still.
I have followed a tutorial to get this working.
My Alexa skill is built with invocation, intents and utterances set up.
My Lambda function is set up.
My endpoints default region is:
arn:aws:lambda:us-east-1:(myID found in AWS Support Center ):function:myLearn
The myLearn function in Lambda is configured with the Alexa Skills Kit trigger, which has the correct Skill ID copied from the skill.
My HelloIntent doesn't have a slot. I'm just trying to get a response from the invocation.
My code, running on Node.js 6.10 with a handler called index.handler, is as follows:
var Alexa = require("alexa-sdk");

var handlers = {
  "HelloIntent": function () {
    this.response.speak("Hello, It's Me.");
    this.emit(':responseReady');
  },
  "LaunchRequest": function () {
    this.response.speak("Welcome to my thing I got going on.");
    this.emit(':responseReady');
  }
};

exports.handler = function (event, context, callback) {
  var alexa = Alexa.handler(event, context);
  alexa.registerHandlers(handlers);
  alexa.execute();
};
I've read that there are issues with zips but I didn't upload anything - I just changed the default index.js file...and my handler isn't named anything different - it's index.handler.
When I run the test in the Alexa console I get the usual:
"There was a problem with the requested skill's response"
My json output is null.
And when I go to my logs in Cloud Watch:
Unable to import module 'index': Error at Function.Module._resolveFilename
I searched for this, and many of those errors came from how users uploaded their zips, with a conflict between the handler name and the .js file.
It looks like you might have created the Lambda function from the AWS Console and not included the alexa-sdk module. To fix this, you can start from one of the provided 'Alexa blueprints' that include the alexa-sdk and then overwrite the code in the Lambda with your own. Or you can package your code in a .zip file that includes the alexa-sdk module and upload the package through the web console. Here is a video I did a while back that explains the issue: https://youtu.be/cFzAIhsldbs - I'm pretty sure that's your problem. I hope this helps.
You can try using a "speechOutput" variable to store your response and then use the emit function.
I have deployed a gcloud background function (Pub/Sub-triggered) in the emulator.
It is successfully invoked from the command line:
functions call helloPubSub --data='{"message":"Hello World"}'
How can I invoke the local gcloud function from my local server code?
= = =
Below is my server-side code to publish to the topic:
pubsub
  .topic(topicName)
  .publisher()
  .publish(dataBuffer)
  .then(results => {
    const messageId = results[0];
    console.log(`Message ${messageId} published.`);
    res.status(200);
    res.send({hello: 'world'});
  })
  .catch(err => {
    console.error('ERROR:', err);
    res.status(200);
    res.send({err: err});
  });
I receive the following error message:
{"err":{"code":7,"metadata":{"_internal_repr":{}},"note":"Exception occurred in retry method that was not classified as transient"}}
In the official docs it states:
Note: Functions deployed in the Emulator with non-HTTP triggers like Cloud Storage or Cloud Pub/Sub will not be invoked by these services. The Emulator is intended as a development environment only, so you would need to call these functions manually to have them invoked.
So if you deployed a function locally with a Cloud Pub/Sub trigger, the only way to invoke it is using the command-line command:
functions call [your-function]