I need a way to get notified when the task finishes, whether it failed or the load completed.
const dms = new AWS.DMS();
const params = {
StartReplicationTaskType: "reload-target",
ReplicationTaskArn:
"MyArn",
};
dms.startReplicationTask(
params,
(err: AWS.AWSError, data: AWS.DMS.StartReplicationTaskResponse) => {
if (err) {
console.log("failed_with_error", err);
}
if (data) {
console.log("success ", data);
}
}
);
After the task started, I got a response on the data object with status: starting,
but it is never updated with the subsequent statuses.
I tried waitFor to track the status changes, but it's not working.
Create an event subscription under DMS → Event subscriptions,
then create a Lambda function triggered by that event subscription.
I'm using GCP Pub/Sub (a push subscription, to be precise) and Cloud Run to process the subscribed messages.
Recently I've noticed that Cloud Run executes the same message five times.
I see there is a retry policy for a subscription whose message goes unhandled (or is mishandled), but Cloud Run is clearly returning a 200 OK response.
So it seems that sending a 200 OK response is not enough, but I cannot find a way to send the ack properly.
Here's the code to publish message.
function publish(message: string) {
const pubsub = new PubSub({ projectId: 'my-project' });
const dataBuffer = Buffer.from(message);
const topicName = 'my-topic';
pubsub
.topic(topicName)
.publish(dataBuffer)
.then((messageId) => {
console.log(`Message ${messageId} published`);
})
.catch((err) => {
console.log(err);
});
}
publish(JSON.stringify({ foo : 'bar' }));
And here's the code of Cloud Run.
// express app
app.post('/run', async (req, res) => {
try {
const body = req.body.message ? Buffer.from(req.body.message.data, 'base64').toString() : req.body;
// req.body came out to be "{"foo":"bar"}"
const deliveredMessage = JSON.parse(body);
// do something
// Do I have to do something like message.ack() here?
return res.status(200).end();
} catch (e) {
return res.status(502).end();
}
});
These are example responses of Cloud Run retrying.
Here's a graph of unacked messages at that time.
Was your Cloud Run app running when you did the POST?
There is an acknowledgement deadline for messages waiting in the queue, so Pub/Sub may be resending the messages before you start consuming them.
Another thing you can try is sending the response on the first line, like:
app.post('/run', async (req, res) => {
try {
res.status(200)
...
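For a push subscription there is no separate message.ack() call: returning a 2xx before the acknowledgement deadline is the ack. The decoding can be factored into a helper so it can be tested outside Express — a sketch, with the envelope shape taken from the question's /run route:

```javascript
// Helper mirroring the decoding in the /run route, factored out of Express so
// it can be tested on its own. A push delivery wraps the payload in
// body.message.data as base64; anything else is passed through as-is.
function decodePushBody(body) {
  if (body && body.message && body.message.data) {
    return JSON.parse(Buffer.from(body.message.data, 'base64').toString());
  }
  return body;
}
```

In the route, the response is the acknowledgement as soon as it is sent, so anything slow should either finish before the ack deadline expires or be handed off elsewhere.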
We can use GCP Cloud Functions to start and stop GCP instances, but I need to implement scheduled suspend and resume of GCP instances using a Cloud Function and Cloud Scheduler.
From the GCP documentation, I learned that we can start and stop instances using the Cloud Functions sample below:
https://github.com/GoogleCloudPlatform/nodejs-docs-samples/tree/master/functions/scheduleinstance
Are similar Node.js (or other language) packages available to suspend and resume GCP instances?
If not, can we create our own for suspend/resume?
When I tried one, I got the error below:
"TypeError: compute.zone(...).vm(...).resume is not a function"
Edit: thanks Chris and Guillaume. After going through your links I have edited my code, and below is my index.js file now.
For some reason, when I do
gcloud functions deploy resumeInstancePubSub --trigger-topic resume-instance --runtime nodejs10 --allow-unauthenticated
I always get
Function 'resumeInstancePubSub1' is not defined in the provided module.
resumeInstancePubSub1 2020-09-04 10:57:00.333 Did you specify the correct target function to execute?
I have not worked with Node.js or JS before. I was expecting something similar to the start/stop documentation, which I could easily make work using the git repo below:
https://github.com/GoogleCloudPlatform/nodejs-docs-samples.git
My index.js file:
// BEFORE RUNNING:
// ---------------
// 1. If not already done, enable the Compute Engine API
// and check the quota for your project at
// https://console.developers.google.com/apis/api/compute
// 2. This sample uses Application Default Credentials for authentication.
// If not already done, install the gcloud CLI from
// https://cloud.google.com/sdk and run
// `gcloud beta auth application-default login`.
// For more information, see
// https://developers.google.com/identity/protocols/application-default-credentials
// 3. Install the Node.js client library by running
// `npm install googleapis --save`
const {google} = require('googleapis');
var compute = google.compute('beta');
authorize(function(authClient) {
var request = {
// Project ID for this request.
project: 'my-project', // TODO: Update placeholder value.
// The name of the zone for this request.
zone: 'my-zone', // TODO: Update placeholder value.
// Name of the instance resource to resume.
instance: 'my-instance', // TODO: Update placeholder value.
resource: {
// TODO: Add desired properties to the request body.
},
auth: authClient,
};
exports.resumeInstancePubSub = async (event, context, callback) => {
try {
const payload = _validatePayload(
JSON.parse(Buffer.from(event.data, 'base64').toString())
);
const options = {filter: `labels.${payload.label}`};
const [vms] = await compute.getVMs(options);
await Promise.all(
vms.map(async (instance) => {
if (payload.zone === instance.zone.id) {
const [operation] = await compute
.zone(payload.zone)
.vm(instance.name)
.resume();
// Operation pending
return operation.promise();
}
})
);
// Operation complete. Instance successfully started.
const message = `Successfully started instance(s)`;
console.log(message);
callback(null, message);
} catch (err) {
console.log(err);
callback(err);
}
};
compute.instances.resume(request, function(err, response) {
if (err) {
console.error(err);
return;
}
// TODO: Change code below to process the `response` object:
console.log(JSON.stringify(response, null, 2));
});
});
function authorize(callback) {
google.auth.getClient({
scopes: ['https://www.googleapis.com/auth/cloud-platform']
}).then(client => {
callback(client);
}).catch(err => {
console.error('authentication failed: ', err);
});
}
Here and here is the documentation for the new beta version of the API. You can see that you can suspend an instance like:
compute.instances.suspend(request, function(err, response) {
  if (err) {
    console.error(err);
    return;
  }
  // process the `response` object
});
And you can resume an instance in a similar way:
compute.instances.resume(request, function(err, response) {
  if (err) {
    console.error(err);
    return;
  }
  // process the `response` object
});
GCP recently added a "create schedule" feature to start and stop VM instances based on a configured schedule.
More details can be found at
https://cloud.google.com/compute/docs/instances/schedule-instance-start-stop#managing_instance_schedules
I'm trying to connect to AWS DocumentDB from a Lambda function, but I'm not able to connect.
MongoClient.connect never calls the callback.
TLS is off on the DocumentDB cluster, and I'm able to connect via the mongo shell.
The Lambda and DocumentDB are in the same VPC and security group.
'use strict';
module.exports.search = async (event, context, callback) => {
const MongoClient = require('mongodb').MongoClient;
const url = "mongodb://xxx:xxxx@xxx-db.cluster-xxx.us-east-2.docdb.amazonaws.com:27017";
console.log("Starting");
MongoClient.connect(url,
{
useNewUrlParser: true
},
function(err, client) {
if(err)
throw err;
console.log("Connected");
const db = client.db('mydb');
const col = db.collection('mycollection');
col.find({}).toArray().then(result => {
console.log(result);
return { statusCode: 200, body: result };
}).catch(err => {
console.log('=> an error occurred: ', err);
return { statusCode: 500, body: 'error' };
});
});
};
The output only prints "Starting", which was logged before calling MongoClient.connect.
How can I identify or debug the issue?
Just from looking at the current code, I am pretty sure your function exits before the callback is able to complete, because MongoClient.connect runs asynchronously.
Take a look at some resources on async/await/promises and Lambda:
https://medium.com/tensult/async-await-on-aws-lambda-function-for-nodejs-2783febbccd9
How to wait for async actions inside AWS Lambda?
I have an AWS Lambda that runs every five minutes. The Lambda will create a connection to an RDS database inside my VPC and run some queries.
The Lambda runs successfully three times, so for about 15 minutes, but then I get a timeout error:
Task timed out after 600.10 seconds
After this timeout error, the next time the Lambda attempts to run, I can no longer connect to my RDS database. I get the following timeout error:
Error: connect ETIMEDOUT
I'm pretty stumped at this point and could use some more eyes on this:
'use strict';
const mysql = require('mysql');
const util = require('util');
const {
fetchQuery,
insertQuery,
updateQuery,
} = require('./queries');
const {
getInsertValues,
getUpdateValues,
} = require('./utils');
const connection = mysql.createConnection({
host: process.env.DB_HOST,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME,
});
// node native promisify
const query = util.promisify(connection.query).bind(connection);
connection.connect();
module.exports.scan = async (event, context, callback) => {
let results = await query(fetchQuery);
console.log(`found ${results.length} forms that are scheduled to be filed`);
if (results.length > 0) {
const insertValues = getInsertValues(results);
const updateValues = getUpdateValues(results);
try {
console.log(`creating user_tasks`);
await query(insertQuery, [insertValues]);
console.log(`updating next_scheduled dates`);
await query(updateQuery, [updateValues]);
callback(null, 'successfully updated form next_scheduled')
} catch (err) {
console.error('error creating user_tasks')
callback(err, 'error creating user_tasks')
}
}
}
Try closing the connection to the database once you are done using it. I previously had a similar error because the open connection was keeping the Lambda alive until it timed out.
We are using a Cloud Function to process images and send a notification to Pub/Sub once processing is complete. The function is built with the Node.js 8 runtime [beta].
Here's the snippet:
const PubSubMessage = require('@google-cloud/pubsub');
const pubsub = new PubSubMessage();
const htmlData = JSON.stringify({ url: url, html: Buffer.from(html).toString('base64') });
const htmlDataBuffer = Buffer.from(htmlData);
pubsub
.topic("projects/<project-id>/topics/<name>")
.publisher()
.publish(htmlDataBuffer)
.then(messageId => {
console.log(`Message ${messageId} published.`);
})
.catch(err => {
console.error('ERROR in sending to pubsub:', err);
console.log('ERROR in sending to pubsub:', err);
});
The message does get published to the Pub/Sub queue; however, it is pushed ~90 seconds after the image processing is complete.
Any idea why pushing the message to Pub/Sub might take so long?
-anurag
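One thing worth checking (an educated guess, not confirmed from the snippet alone): in a background Cloud Function, work still pending when the function returns isn't guaranteed to run; the instance can be frozen with the publish buffered, and the message only goes out when a later invocation thaws it, which would look exactly like a long delay. Returning or awaiting the publish promise keeps the function alive until the hand-off completes. A sketch with a stand-in publisher:

```javascript
// Return the publish chain so the caller (and the Cloud Functions runtime)
// waits for the hand-off to Pub/Sub instead of freezing the instance while
// the publish is still pending. `publisher` stands in for
// pubsub.topic(...).publisher() so the sketch runs anywhere.
function publishAndWait(publisher, dataBuffer) {
  return publisher
    .publish(dataBuffer)
    .then((messageId) => {
      console.log(`Message ${messageId} published.`);
      return messageId;
    });
}
```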