I have a Firestore database that I want to back up regularly using Google Cloud Scheduler alongside a Google Cloud Function. I am following this tutorial: https://firebase.google.com/docs/firestore/solutions/schedule-export#gcp-console.
I am using Node.js 16 with 2 GB of RAM, with the code below:
const firestore = require('@google-cloud/firestore');
const client = new firestore.v1.FirestoreAdminClient();

const bucket = 'gs://BUCKET_NAME';

exports.scheduledFirestoreExport = (event, context) => {
  const databaseName = client.databasePath(
    process.env.GCLOUD_PROJECT,
    '(default)'
  );

  return client
    .exportDocuments({
      name: databaseName,
      outputUriPrefix: bucket,
      // Leave collectionIds empty to export all collections
      collectionIds: [],
    })
    .then(responses => {
      const response = responses[0];
      return response;
    })
    .catch(err => {
      // Note: errors are silently swallowed here
    });
};
When I test the function it just returns an Internal Server Error 500. Any ideas?
I've also attached a picture of the errors I'm getting from the logs.
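In case it helps with debugging, here is a minimal sketch of the same export call with the error logged and rethrown instead of swallowed, so the actual cause of the 500 shows up in the Cloud Functions logs (bucket name and project resolution unchanged from the code above):
const firestore = require('@google-cloud/firestore');
const client = new firestore.v1.FirestoreAdminClient();
const bucket = 'gs://BUCKET_NAME';

exports.scheduledFirestoreExport = (event, context) => {
  const databaseName = client.databasePath(process.env.GCLOUD_PROJECT, '(default)');

  return client
    .exportDocuments({ name: databaseName, outputUriPrefix: bucket, collectionIds: [] })
    .then(responses => responses[0])
    .catch(err => {
      // Log and rethrow so the real failure is reported instead of hidden
      console.error('Export operation failed:', err);
      throw err;
    });
};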
I am using Next.js for my project, which is hosted on AWS Amplify. Everything was working fine until I added getServerSideProps to call a Lambda function to get the data, which I need for SEO purposes.
This is the code for getServerSideProps in the pages folder, at pages/categories/[categoriesId]/[nftId].tsx:
import dynamic from 'next/dynamic';

const NftDetails = dynamic(() => import('../../../components/NFTDetails'));

const CategoryNFT = (data: any) => {
  return <NftDetails props={data} />;
};

export default CategoryNFT;

export const getServerSideProps = async (context: any) => {
  const { nftId } = context.query;
  console.log('NFT ID ----> ', nftId);

  const requestOptions = {
    method: 'GET',
    headers: {
      'Content-Type': 'application/json',
    },
  };

  const res = await fetch(
    `https://******.amazonaws.com/test/asset/get-asset-by-id?asset_id=${nftId}`,
    requestOptions
  );
  const data = await res.json();
  console.log(data);

  return {
    props: { data },
  };
};
Locally everything works perfectly fine, but when I push the code to Amplify, this particular route gives an error.
I researched the problem and increased the Lambda function timeout to 1 minute; previously it was 30 seconds (although the AWS docs say the default should be 3 seconds).
I also checked the Amplify project logs; they only say the request timed out after 30.2 seconds, nothing more.
Does anyone know how to resolve this issue? Thanks in advance.
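For debugging, here is a rough sketch (assumptions: the endpoint is the one above, the 10-second budget is arbitrary, and the server-side fetch honors AbortSignal) that times the upstream call and aborts it early, so the logs show whether the Lambda call itself is what exceeds the ~30-second limit:
export const getServerSideProps = async (context: any) => {
  const { nftId } = context.query;

  // Abort the upstream call after 10 s (arbitrary budget) so the failure is
  // logged quickly instead of hitting the platform's ~30 s limit.
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), 10000);

  try {
    const started = Date.now();
    const res = await fetch(
      `https://******.amazonaws.com/test/asset/get-asset-by-id?asset_id=${nftId}`,
      {
        method: 'GET',
        headers: { 'Content-Type': 'application/json' },
        signal: controller.signal,
      }
    );
    console.log('Upstream responded in', Date.now() - started, 'ms');
    const data = await res.json();
    return { props: { data } };
  } catch (err) {
    console.error('Upstream call failed or was aborted:', err);
    return { notFound: true };
  } finally {
    clearTimeout(timer);
  }
};
If the timing log never appears in CloudWatch/Amplify, the request is stalling before the API Gateway endpoint responds, which points at networking or the Lambda behind it rather than the Next.js page.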
I have this app: https://github.com/ChristianOConnor/google-cloudfunction-callfromreactapp. It works by simply fetching and displaying some text on a button press. The text is delivered by a Netlify function. I set up the Netlify function by adding a netlify.toml file to the root directory:
netlify.toml:
[functions]
directory = "functions/"
and adding this file:
functions/hello-netlify.js:
exports.handler = async (event) => {
  return {
    statusCode: 200,
    body: process.env.GREETING_TEST,
  };
};
I added a GREETING_TEST environment variable in Netlify's deploy settings and set it to "this variable is now working".
The app works perfectly after deploying.
I have a default Python Google Cloud Function that simply prints "Hello World!"
The question is: if I replace the test Netlify function that spits out "this variable is now working" with this,
import { JWT } from "google-auth-library";

exports.handler = async (event) => {
  const client = new JWT({
    email: process.env.CLIENT_EMAIL,
    key: process.env.PRIVATE_KEY,
  });
  const url = process.env.RUN_APP_URL;
  const res = await client.request({ url });
  const resData = res.data;

  return {
    statusCode: 200,
    body: resData,
  };
};
set CLIENT_EMAIL and PRIVATE_KEY to those of my relevant Google Cloud Function service account, and set RUN_APP_URL to the Google Cloud Function's trigger URL, would that be safe? My secret environment variables like PRIVATE_KEY would never be visible, right?
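For reference, a hypothetical sketch of what the browser side would look like (the /.netlify/functions/hello-netlify path follows Netlify's default routing for the file above): the client only ever receives the HTTP response body the function returns, while CLIENT_EMAIL, PRIVATE_KEY and RUN_APP_URL stay in Netlify's server-side environment.
async function callBackend() {
  // Only the function's response body reaches the browser;
  // the environment variables are read on Netlify's servers.
  const res = await fetch('/.netlify/functions/hello-netlify');
  return res.text(); // whatever the Google Cloud Function returned
}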
I tried to edit the Lambda function using AWS Cloud9, but unfortunately the layers were not imported. How do I attach Lambda layers to it?
var config = require('./config.js');
var AWS = require("aws-sdk");
// How to import Axios in Cloud9, as I have added an Axios layer in Lambda?
const axios = require("axios");

exports.handler = async (event, context, callback) => {
  // Get DataList
  var get_data = await config.connectToDatabase()
    .then(db => config.get_items(db))
    .then(result => {
      console.log(result);
      return result;
    });

  callback(null, get_data);
};
The code works fine when I run this function in Lambda, but there is an error when I try the same function in Cloud9: it says "Module not found: axios", because Cloud9 does not support layers.
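A hedged workaround sketch (my assumption about the environments, not a confirmed fix): in the Lambda runtime the layer's packages resolve from /opt/nodejs/node_modules, while a plain Cloud9 workspace only resolves axios after a local npm install axios, so a guarded require lets the same file load in both places:
let axios;
try {
  // On Lambda this resolves via the Axios layer (mounted under /opt/nodejs/node_modules);
  // in Cloud9 it resolves only if axios was installed locally in the function folder.
  axios = require('axios');
} catch (err) {
  console.warn('axios is not on the module path here:', err.message);
  axios = null; // code paths that need axios should check for this
}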
I'm trying to connect to AWS DocumentDB from a Lambda function, but I am not able to connect.
MongoClient.connect never calls the connection callback (the one that logs "Connected").
TLS is off on the DocumentDB cluster, and I'm able to connect via the mongo shell.
The Lambda function and DocumentDB are in the same VPC and security group.
'use strict';

module.exports.search = async (event, context, callback) => {
  const MongoClient = require('mongodb').MongoClient;
  const url = "mongodb://xxx:xxxx@xxx-db.cluster-xxx.us-east-2.docdb.amazonaws.com:27017";

  console.log("Starting");

  MongoClient.connect(url,
    {
      useNewUrlParser: true
    },
    function(err, client) {
      if (err) throw err;
      console.log("Connected");

      const db = client.db('mydb');
      const col = db.collection('mycollection');

      col.find({}).toArray().then(result => {
        console.log(result);
        return { statusCode: 200, body: result };
      }).catch(err => {
        console.log('=> an error occurred: ', err);
        return { statusCode: 500, body: 'error' };
      });
    });
};
The output only prints "Starting", which is logged before calling MongoClient.connect.
How can I identify or debug the issue?
Just from looking at the current code, I am pretty sure your function exits before it is able to complete, so your callback is never executed, because MongoClient.connect runs asynchronously.
Take a look at some resources on async/await/Promises and Lambda:
https://medium.com/tensult/async-await-on-aws-lambda-function-for-nodejs-2783febbccd9
How to wait for async actions inside AWS Lambda?
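For illustration, here is a minimal sketch of the same handler rewritten to await the connection (assuming the mongodb 3.x driver, where connect returns a Promise when no callback is passed), so the Lambda does not return before the query finishes:
'use strict';
const { MongoClient } = require('mongodb');

module.exports.search = async (event) => {
  const url = "mongodb://xxx:xxxx@xxx-db.cluster-xxx.us-east-2.docdb.amazonaws.com:27017";
  let client;
  try {
    // Without a callback, connect() returns a Promise we can await.
    client = await MongoClient.connect(url, { useNewUrlParser: true });
    const result = await client.db('mydb').collection('mycollection').find({}).toArray();
    return { statusCode: 200, body: JSON.stringify(result) };
  } catch (err) {
    console.error('=> an error occurred: ', err);
    return { statusCode: 500, body: 'error' };
  } finally {
    if (client) {
      await client.close();
    }
  }
};
Awaiting (or returning) the Promise chain is what keeps the Lambda alive until the query completes.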
I am trying to use Node to run label detection with the Vision API:
'use strict';

// Imports the Google Cloud client library
const vision = require('@google-cloud/vision');

// Creates a client
const client = new ImageAnnotatorClient({
  projectId: 'my-project-xxx',
  keyFilename: 'Users/xxx/Downloads/xxx.json',
});

// Performs label detection on the image file
client
  .labelDetection('.//Users/xxx/Downloads/menu.jpg')
  .then(results => {
    const labels = results[0].labelAnnotations;
    console.log('Labels:');
    labels.forEach(label => console.log(label.description));
  })
  .catch(err => {
    console.error('ERROR:', err);
  });
I keep receiving the error: "ImageAnnotatorClient is not defined".
Any reason for this?
Can you try modifying the line:
const client = new ImageAnnotatorClient({
to:
const client = new vision.ImageAnnotatorClient({
The ImageAnnotatorClient class is exposed by the Cloud Vision client library, which you imported as the vision variable.
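For completeness, a minimal sketch of the corrected snippet (project ID and file paths are the placeholders from the question):
const vision = require('@google-cloud/vision');

// Instantiate the client from the imported module namespace
const client = new vision.ImageAnnotatorClient({
  projectId: 'my-project-xxx',
  keyFilename: 'Users/xxx/Downloads/xxx.json',
});

client
  .labelDetection('.//Users/xxx/Downloads/menu.jpg')
  .then(results => {
    results[0].labelAnnotations.forEach(label => console.log(label.description));
  })
  .catch(err => console.error('ERROR:', err));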