Next.js Amplify slow server response - amazon-web-services

I have an SSR Next.js 12 app deployed on AWS Amplify that's too slow.
Logging inside getServerSideProps(), this is the result:
It takes 9 seconds to load the page, but the code inside getServerSideProps takes less than 0.5 seconds.
This is the server log:
START RequestId: 94ced4e1-ec32-4409-8039-fdcd9b5f5894 Version: 300
2022-09-13T09:25:32.236Z 94ced4e1-ec32-4409-8039-fdcd9b5f5894 INFO 1 [09:25:32.236] -
2022-09-13T09:25:32.253Z 94ced4e1-ec32-4409-8039-fdcd9b5f5894 INFO 2 [09:25:32.253] -
2022-09-13T09:25:32.255Z 94ced4e1-ec32-4409-8039-fdcd9b5f5894 INFO 3 [09:25:32.255] -
2022-09-13T09:25:32.255Z 94ced4e1-ec32-4409-8039-fdcd9b5f5894 INFO 4 [09:25:32.255] -
2022-09-13T09:25:32.431Z 94ced4e1-ec32-4409-8039-fdcd9b5f5894 INFO 5 [09:25:32.431] -
2022-09-13T09:25:32.496Z 94ced4e1-ec32-4409-8039-fdcd9b5f5894 INFO 6 [09:25:32.496] -
END RequestId: 94ced4e1-ec32-4409-8039-fdcd9b5f5894
REPORT RequestId: 94ced4e1-ec32-4409-8039-fdcd9b5f5894 Duration: 9695.59 ms Billed Duration: 9696 ms Memory Size: 512 MB Max Memory Used: 206 MB
That's the code:
// Imports added for completeness for the helpers used below.
import { withSSRContext } from "aws-amplify";
import { serverSideTranslations } from "next-i18next/serverSideTranslations";

export async function getServerSideProps(context) {
  console.log("1 [" + new Date().toISOString().substring(11, 23) + "] -");
  let req = context.req;
  console.log("2 [" + new Date().toISOString().substring(11, 23) + "] -");
  const { Auth } = withSSRContext({ req });
  console.log("3 [" + new Date().toISOString().substring(11, 23) + "] -");
  try {
    console.log("4 [" + new Date().toISOString().substring(11, 23) + "] -");
    const user = await Auth.currentAuthenticatedUser();
    console.log("5 [" + new Date().toISOString().substring(11, 23) + "] -");
    const dict = await serverSideTranslations(context.locale, ["common", "dashboard", "footer", "hedgefund", "info", "etf", "fs"]);
    console.log("6 [" + new Date().toISOString().substring(11, 23) + "] -");
    return {
      props: {
        exchange: context.params.exchange,
        ticker: context.params.ticker,
        username: user.username,
        attributes: user.attributes,
        ...dict,
      },
    };
  } catch (err) {
    return {
      redirect: {
        permanent: false,
        destination: "/auth/signin",
      },
      props: {},
    };
  }
}

This is not the answer but rather an alternative.
I tried using Amplify for my implementation because getServerSideProps on the Vercel hobby account gives a function timeout error. However, I think the Next.js deployment to Amplify is not optimized yet.
Instead of using getServerSideProps, I used getStaticPaths and getStaticProps, and always limited the number of paths fetched from my API.
On the client side:
export const getStaticPaths: GetStaticPaths = async () => {
  // This route to my API only gets paths (IDs)
  const res = await getFetcher("/sentences-paths");
  let paths = [];
  if (res.success && res.resource) {
    paths = res.resource.map((sentence: any) => ({
      params: { sentenceSlug: sentence._id },
    }));
  }
  return { paths, fallback: "blocking" };
};
On the API side:
const getSentencePaths = async (req, res) => {
  const limit = 50;
  // `query` is assumed to be defined elsewhere in this controller.
  Sentence.find(query)
    .select("_id")
    .limit(limit)
    .exec()
    .then((resource) => res.json({ success: true, resource }))
    .catch((error) => res.json({ success: false, error }));
};
This means even if I have 100,000 sentences, only 50 are rendered at build time. The rest of the sentences are generated on demand because we have fallback: "blocking". See docs.
Here is what my getStaticProps looks like:
export const getStaticProps: GetStaticProps = async ({ params }) => {
  const sentenceSlug = params?.sentenceSlug;
  const response = await getFetcher(`/sentences/${sentenceSlug}`);
  let sentence = null;
  if (response.success && response.resource) sentence = response.resource;
  if (!sentence) {
    return {
      notFound: true,
    };
  }
  return {
    props: { sentence },
    revalidate: 60,
  };
};
As you can see above, I used revalidate: 60 (seconds, see docs), but since you wanted to use getServerSideProps, that's not the perfect solution.
The perfect solution is On-Demand Revalidation. With this, whenever you make a change to the data used in a page, for example changing the sentence content, you can trigger a webhook to regenerate the page created by getStaticProps. So your page will always be up to date.
Go through this YouTube tutorial to implement on-demand revalidation, it's really comprehensive: https://www.youtube.com/watch?v=Wh3P-sS1w0I&t=8s&ab_channel=TuomoKankaanp%C3%A4%C3%A4.
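For reference, a minimal sketch of what such a revalidation endpoint can look like (this assumes Next.js 12.2+, where res.revalidate is available; the secret name and page path are made up for the example):

// pages/api/revalidate.js - hypothetical endpoint your webhook calls
export default async function handler(req, res) {
  // Reject callers that don't know the shared secret (env var name is made up).
  if (req.query.secret !== process.env.REVALIDATE_SECRET) {
    return res.status(401).json({ message: "Invalid token" });
  }
  try {
    // Regenerate the static page for the sentence that changed.
    await res.revalidate(`/sentences/${req.query.sentenceSlug}`);
    return res.json({ revalidated: true });
  } catch (err) {
    // If revalidation throws, Next.js keeps serving the last good page.
    return res.status(500).send("Error revalidating");
  }
}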
Next.js on Vercel works way faster and more efficiently. Hope I helped.


Pass data from cloud task to a firebase cloud function - currently getting an error

My question is this: how do I call a Firebase Cloud Function from a Cloud task and pass a payload through?
I tried following the tutorial here. The only difference is that I'm using Cloud functions for Firebase instead of regular Cloud Functions.
Here is my cloud function.
const functions = require("firebase-functions");

exports.myFunction = functions.https.onRequest((req, res) => {
  console.log(req.query);
  res.send('success');
});
When I query the URL in the browser with parameters ?myparams=data, I can log 'data', so I know the cloud function is basically working.
But when I try to call it from my queue (below) I get:
SyntaxError: Unexpected token o in JSON at position 1
at JSON.parse (<anonymous>)
My guess is that req is undefined.
I've been looking at this SO question and I am wondering if it has something to do with needing to use bodyParser for onRequest functions.
HTTP Event Cloud Function: request body value is undefined
I'm also seeing that some people have CORS issues with their cloud functions, which seems like it might be related.
Here is the task queue code that should be sending the payload.
const seconds = 5;
const project = 'xxxxx-xxxxxxx';
const queue = 'xxxxx';
const location = 'us-west2';
const url = 'https://us-central1-xxxxx-xxxxx.cloudfunctions.net/writeDB';
const payload = 'My data';
const parent = client.queuePath(project, location, queue);

const task = {
  httpRequest: {
    httpMethod: "POST",
    url: url,
    body: Buffer.from(JSON.stringify(payload)).toString("base64"),
    headers: {
      "Content-Type": "application/json"
    },
    oidcToken: {
      serviceAccountEmail
    }
  }
};

task.scheduleTime = {
  seconds: seconds + Date.now() / 1000,
};

const request = {parent: parent, task: task};

await client.createTask(request)
  .then(response => {
    const task = response[0].name;
    console.log(`Created task ${task}`);
    return {'Response': String(response)}
  })
  .catch(err => {
    console.error(`Error in createTask: ${err.message || err}`);
    next()
  });
It calls the function, but for some reason it results in the error and the payload isn't logged.
Can anyone help?
As always, I'm happy to clarify the question if anything is unclear. Thanks!
I was able to replicate your error and managed to fix it by changing the content type header from "application/json" to "text/plain". I also removed the JSON.stringify() call on the body value because your payload variable is already a string. Below is my modified sample of your code:
const {CloudTasksClient} = require('@google-cloud/tasks');
// Instantiates a client.
const client = new CloudTasksClient();
// assumes `npm install request`
const request = require("request");
// fs is built into Node, no install needed
const fs = require('fs');

const seconds = 5;
const serviceAccountEmail = "xxxx-xxxxx-xxxxxx@appspot.gserviceaccount.com";
const project = 'xxxx-xxxxxx';
const queue = "xx-xxxxx";
const location = 'us-central1';
const url = "https://us-central1-xxxxx-xxxxx.cloudfunctions.net/myFunction";
const payload = 'My Data';
const parent = client.queuePath(project, location, queue);

async function quickstart() {
  const task = {
    httpRequest: {
      httpMethod: "POST",
      url: url,
      body: Buffer.from(payload).toString("base64"), // your previous code: body: Buffer.from(JSON.stringify(payload)).toString("base64"),
      headers: {
        "Content-Type": "text/plain"
      },
      oidcToken: {
        serviceAccountEmail
      }
    }
  };
  task.scheduleTime = {
    seconds: seconds + Date.now() / 1000,
  };
  const request = {parent: parent, task: task};
  await client.createTask(request)
    .then(response => {
      const task = response[0].name;
      console.log(`Created task ${task}`);
      return {'Response': String(response)}
    })
    .catch(err => {
      console.error(`Error in createTask: ${err.message || err}`);
      // (the original snippet called next() here, which only exists in an Express handler)
    });
}
quickstart();
In Cloud Functions, I changed req.query to req.body to get the payload from Cloud Tasks:
const functions = require("firebase-functions");

exports.myFunction = functions.https.onRequest((req, res) => {
  console.log(req.body);
  console.log('success')
  res.send('success');
});
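If you would rather keep Content-Type: application/json, an alternative sketch (the payload shape here is made up) is to send an actual JSON object rather than a bare string, since the Cloud Functions runtime parses a JSON body into req.body for you:

// Alternative sketch: keep application/json but make the payload a real object.
const payload = { message: 'My data' }; // hypothetical object payload

const task = {
  httpRequest: {
    httpMethod: 'POST',
    url: url,
    // JSON.stringify is appropriate here because the body really is JSON now.
    body: Buffer.from(JSON.stringify(payload)).toString('base64'),
    headers: {
      'Content-Type': 'application/json'
    },
    oidcToken: {
      serviceAccountEmail
    }
  }
};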

Dialogflow Google Calendar integration "Error: No handler for requested intent at WebhookClient.handleRequest"

I did the Google Calendar integration in Dialogflow using the fulfillment.
When I test the action, I get the following error within the Google Cloud Platform:
Error: No handler for requested intent
at WebhookClient.handleRequest (/workspace/node_modules/dialogflow-fulfillment/src/dialogflow-fulfillment.js:317:29)
at exports.dialogflowFirebaseFulfillment.functions.https.onRequest (/workspace/index.js:62:8)
at cloudFunction (/workspace/node_modules/firebase-functions/lib/providers/https.js:57:9)
at process.nextTick (/layers/google.nodejs.functions-framework/functions-framework/node_modules/@google-cloud/functions-framework/build/src/invoker.js:100:17)
at process._tickCallback (internal/process/next_tick.js:61:11)
This is my cloud function code - the name of the intent used is "book.appointment":
'use strict';

// Import the Dialogflow module from Google client libraries.
const functions = require('firebase-functions');
const {google} = require('googleapis');
const {WebhookClient} = require('dialogflow-fulfillment');

// Set up Google Calendar service account credentials.
// (`serviceAccount` and `calendarId` are assumed to be defined elsewhere,
// e.g. loaded from the service account's JSON key file.)
const serviceAccountAuth = new google.auth.JWT({
  email: serviceAccount.client_email,
  key: serviceAccount.private_key,
  scopes: 'https://www.googleapis.com/auth/calendar'
});

const calendar = google.calendar('v3');
process.env.DEBUG = 'dialogflow:*'; // enables lib debugging statements

const timeZone = 'Europe/Berlin';
const timeZoneOffset = '+02:00';

// Set the DialogflowApp object to handle the HTTPS POST request.
exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
  const agent = new WebhookClient({ request, response });
  console.log("Parameters", agent.parameters);
  const appointment_type = agent.parameters.AppointmentType;

  function makeAppointment (agent) {
    // Calculate appointment start and end datetimes (end = +1hr from start)
    const dateTimeStart = new Date(Date.parse(agent.parameters.date.split('T')[0] + 'T' + agent.parameters.time.split('T')[1].split('-')[0] + timeZoneOffset));
    const dateTimeEnd = new Date(new Date(dateTimeStart).setHours(dateTimeStart.getHours() + 1));
    const appointmentTimeString = dateTimeStart.toLocaleString(
      'en-US',
      { month: 'long', day: 'numeric', hour: 'numeric', timeZone: timeZone }
    );
    // Check the availability of the time, and make an appointment if there is time on the calendar
    return createCalendarEvent(dateTimeStart, dateTimeEnd, appointment_type).then(() => {
      agent.add(`Ok, let me see if we can fit you in. ${appointmentTimeString} is fine!.`);
    }).catch(() => {
      agent.add(`I'm sorry, there are no slots available for ${appointmentTimeString}.`);
    });
  }

  // Handle the Dialogflow intent named 'book.appointment'.
  let intentMap = new Map();
  intentMap.set('book.appointment', makeAppointment);
  agent.handleRequest(intentMap);
});
// Creates calendar event in Google Calendar
function createCalendarEvent (dateTimeStart, dateTimeEnd, appointment_type) {
  return new Promise((resolve, reject) => {
    calendar.events.list({
      auth: serviceAccountAuth, // List events for time period
      calendarId: calendarId,
      timeMin: dateTimeStart.toISOString(),
      timeMax: dateTimeEnd.toISOString()
    }, (err, calendarResponse) => {
      // Check if there is an event already on the calendar
      if (err || calendarResponse.data.items.length > 0) {
        reject(err || new Error('Requested time conflicts with another appointment'));
      } else {
        // Create event for the requested time period
        calendar.events.insert({
          auth: serviceAccountAuth,
          calendarId: calendarId,
          resource: {
            summary: appointment_type + ' Appointment',
            description: appointment_type,
            start: {dateTime: dateTimeStart},
            end: {dateTime: dateTimeEnd}
          }
        }, (err, event) => {
          err ? reject(err) : resolve(event);
        });
      }
    });
  });
}
Do you have any idea what I'm doing wrong?
Thanks in advance, Bianca
You have to split on '+' instead of '-' in:
const dateTimeStart = new Date(Date.parse(agent.parameters.date.split('T')[0] + 'T' + agent.parameters.time.split('T')[1].split('+')[0] + timeZoneOffset));
because your timeZoneOffset is '+02:00'.
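To see why, here is a quick sketch using a made-up time value of the shape Dialogflow typically returns:

const time = '2022-06-01T14:00:00+02:00'; // hypothetical agent.parameters.time value

console.log(time.split('T')[1]);               // '14:00:00+02:00'
console.log(time.split('T')[1].split('-')[0]); // '14:00:00+02:00' (no '-' to split on, the offset is positive)
console.log(time.split('T')[1].split('+')[0]); // '14:00:00' <- the time part the code needs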
Or try changing all of that to:
const dateTimeStart = new Date(Date.parse(agent.parameters.date.split('T')[0] + 'T' + agent.parameters.time.split('T')[1]));
const dateTimeEnd = new Date(new Date(dateTimeStart).setHours(dateTimeStart.getHours() + 1));
const appointmentTimeString = dateTimeStart.toLocaleString(
  'en-US',
  { weekday: 'short', day: 'numeric', month: 'short', hour: 'numeric', minute: 'numeric', timeZone: timeZone }
);
I hope this works, because it did for me.

AWS Device Farm - Missing or unprocessed resources

I followed these steps while trying to run an Android app test via AWS Lambda (Node.js):
Created a project
Created an upload
Uploaded the APK to the signed URL
Once the upload was done, I created a device pool using the following params:
var createDevicePoolParams = {
  name: "DAP_Device_Pool",
  description: "DAP_Android_Devices",
  projectArn: projectARN,
  rules: [{
    attribute: "PLATFORM",
    operator: "EQUALS",
    value: "\"ANDROID\""
  }]
};
Then I called scheduleRun with the following params:
var scheduleRunParams = {
  appArn: uploadARN,
  name: "tarunRun",
  devicePoolArn: devicePoolARN,
  projectArn: projectARN,
  test: {
    type: "BUILTIN_FUZZ",
  }
};
But I am getting an error about missing or unprocessed resources.
I am not able to understand what I am missing. My understanding is that if I am using the built-in fuzz test type, then I don't need to upload any custom test cases.
Can somebody please help point out which step is missing?
After your uploads have been processed by Device Farm, call aws devicefarm schedule-run.
[update]
I put this code in a AWS Lambda function and it worked there as well. Here is a gist of it:
https://gist.github.com/jamesknowsbest/3ea0e385988b0098e5f9d38bf5a932b6
Here is the code I just authored, and it seems to work with the built-in Fuzz/Explorer tests:
// assume we already executed `npm install aws-sdk`
var AWS = require('aws-sdk');
// assumes `npm install request`
const request = require("request");
// fs is built into Node, no install needed
const fs = require('fs');
// https://stackoverflow.com/a/41641607/8016330
const sleep = (waitTimeInMs) => new Promise(resolve => setTimeout(resolve, waitTimeInMs));

// Device Farm is only available in the us-west-2 region
var devicefarm = new AWS.DeviceFarm({ region: 'us-west-2' });
(async function() {
  let project_params = {
    name: "test of fuzz tests"
  };
  let PROJECT_ARN = await devicefarm.createProject(project_params).promise().then(
    function(data){
      return data.project.arn;
    },
    function (error) {
      console.error("Error creating project", "Error: ", error);
    }
  );
  console.log("Project created ", "Project arn: ", PROJECT_ARN);

  // create the upload and upload files to the project
  let params = {
    name: "app-debug.apk",
    type: "ANDROID_APP",
    projectArn: PROJECT_ARN
  };
  let UPLOAD = await devicefarm.createUpload(params).promise().then(
    function(data){
      return data.upload;
    },
    function(error){
      console.error("Creating upload failed with error: ", error);
    }
  );
  let UPLOAD_ARN = UPLOAD.arn;
  let UPLOAD_URL = UPLOAD.url;
  console.log("upload created with arn: ", UPLOAD_ARN);
  console.log("uploading file...");

  let options = {
    method: 'PUT',
    url: UPLOAD_URL,
    headers: {},
    body: fs.readFileSync("/path/to/your/apk/file")
  };
  // wait for upload to finish
  await new Promise(function(resolve, reject){
    request(options, function (error, response, body) {
      if (error) {
        console.error("uploading file failed with error: ", error);
        reject(error);
      }
      resolve(body);
    });
  });

  // get the status of the upload and make sure it finished processing before scheduling
  let STATUS = await getStatus(UPLOAD_ARN);
  console.log("upload status is: ", STATUS);
  while (STATUS !== "SUCCEEDED") {
    await sleep(5000);
    STATUS = await getStatus(UPLOAD_ARN);
    console.log("upload status is: ", STATUS);
  }

  // create device pool
  let device_pool_params = {
    projectArn: PROJECT_ARN,
    name: "Google Pixel 2",
    rules: [{"attribute": "ARN","operator":"IN","value":"[\"arn:aws:devicefarm:us-west-2::device:5F20BBED05F74D6288D51236B0FB9895\"]"}]
  };
  let DEVICE_POOL_ARN = await devicefarm.createDevicePool(device_pool_params).promise().then(
    function(data){
      return data.devicePool.arn;
    },
    function(error){
      console.error("device pool failed to create with error: ", error);
    }
  );
  console.log("Device pool created successfully with arn: ", DEVICE_POOL_ARN);

  // schedule the run
  let schedule_run_params = {
    name: "MyRun",
    devicePoolArn: DEVICE_POOL_ARN, // You can get the Amazon Resource Name (ARN) of the device pool by using the list-pools CLI command.
    projectArn: PROJECT_ARN, // You can get the Amazon Resource Name (ARN) of the project by using the list-projects CLI command.
    test: {
      type: "BUILTIN_FUZZ"
    },
    appArn: UPLOAD_ARN
  };
  let schedule_run_result = await devicefarm.scheduleRun(schedule_run_params).promise().then(
    function(data){
      return data.run;
    },
    function(error){
      console.error("Schedule run command failed with error: ", error);
    }
  );
  console.log("run finished successfully with result: ", schedule_run_result);
})();
async function getStatus(UPLOAD_ARN){
  return await devicefarm.getUpload({arn: UPLOAD_ARN}).promise().then(
    function(data){
      return data.upload.status;
    },
    function(error){
      console.error("getting upload failed with error: ", error);
    }
  );
}
Output is:
Project created Project arn: arn:aws:devicefarm:us-west-2:111122223333:project:b9233b49-967e-4b09-a51a-b5c4101340a1
upload created with arn: arn:aws:devicefarm:us-west-2:111122223333:upload:b9233b49-967e-4b09-a51a-b5c4101340a1/48ffd115-f7d7-4df5-ae96-4a44911bff65
uploading file...
upload status is: INITIALIZED
upload status is: SUCCEEDED
Device pool created successfully with arn: arn:aws:devicefarm:us-west-2:111122223333:devicepool:b9233b49-967e-4b09-a51a-b5c4101340a1/c0ce1bbc-7b40-4a0f-a419-ab024a6b1000
run finished successfully with result: { arn:
'arn:aws:devicefarm:us-west-2:111122223333:run:b9233b49-967e-4b09-a51a-b5c4101340a1/39369894-3829-4e14-81c9-bdfa02c7e032',
name: 'MyRun',
type: 'BUILTIN_FUZZ',
platform: 'ANDROID_APP',
created: 2019-06-06T23:51:13.529Z,
status: 'SCHEDULING',
result: 'PENDING',
started: 2019-06-06T23:51:13.529Z,
counters:
{ total: 0,
passed: 0,
failed: 0,
warned: 0,
errored: 0,
stopped: 0,
skipped: 0 },
totalJobs: 1,
completedJobs: 0,
billingMethod: 'METERED',
seed: 982045377,
appUpload:
'arn:aws:devicefarm:us-west-2:111122223333:upload:b9233b49-967e-4b09-a51a-b5c4101340a1/48ffd115-f7d7-4df5-ae96-4a44911bff65',
eventCount: 6000,
jobTimeoutMinutes: 150,
devicePoolArn:
'arn:aws:devicefarm:us-west-2:111122223333:devicepool:b9233b49-967e-4b09-a51a-b5c4101340a1/c0ce1bbc-7b40-4a0f-a419-ab024a6b1000',
radios: { wifi: true, bluetooth: false, nfc: true, gps: true } }
HTH
-James

Why is my jest async action creator test not working?

I am very new to unit testing and am trying to go through my react-redux project to write some tests.
Why is this test not working, and how could I make it pass?
Here is the test. I want to test my fetch posts action creator. This is for a small blog application:
import configureStore from 'redux-mock-store'; // ES6 modules
import { findSinglePost, sendEdit, changeRedirect, checkBoxChange } from '../client/redux/actions/postActions';
import thunk from 'redux-thunk';
import axios from 'axios';

const middlewares = [thunk];
const mockStore = configureStore(middlewares);

describe('asynchronous action creators', () => {
  it('should fetch posts', () => {
    let store = mockStore({})

    // my async action creator. It uses mock data that's in the same folder.
    const fetchPosts = () => function(dispatch) {
      dispatch({type: 'FETCH_POSTS'});
      return axios.get('./MOCK.json').then((response) => {
        dispatch({type: 'FETCH_POSTS_FUFILLED', payload: response.data});
      }).catch((err) => {
        dispatch({type: 'FETCH_POSTS_REJECTED', payload: err});
      });
    };

    // this doesn't equal FETCH_POSTS_FUFILLED, it ends up equaling just "FETCH_POSTS"
    return store.dispatch(fetchPosts()).then(() => {
      const actions = store.getActions();
      expect(actions[0]).toEqual({type: 'FETCH_POSTS_FUFILLED'});
    })
  })
});
Here is Jest's feedback. I want it to equal 'FETCH_POSTS_FUFILLED', but it's returning 'FETCH_POSTS':
FAIL _test_\actions.test.js
● asynchronous action creators › should fetch posts
expect(received).toEqual(expected)
Expected value to equal:
{"type": "FETCH_POSTS_FUFILLED"}
Received:
{"type": "FETCH_POSTS"}
Difference:
- Expected
+ Received
Object {
- "type": "FETCH_POSTS_FUFILLED",
+ "type": "FETCH_POSTS",
}
88 | return store.dispatch(fetchPosts()).then(() => {
89 | const actions = store.getActions();
> 90 | expect(actions[0]).toEqual({type: 'FETCH_POSTS_FUFILLED'});
91 | })
92 | })
93 | });
at _test_/actions.test.js:90:26
PASS client\views\LoginPage\LoginPage.test.jsx
Test Suites: 1 failed, 1 passed, 2 total
Tests: 1 failed, 5 passed, 6 total
Snapshots: 0 total
Time: 1.49s
Ran all test suites related to changed files.
Also, here is the project's github repo if you want to try to run it.
Also, if there's a more standard way that's well known in the industry for doing this, I'd love the advice.
Edit:
When I change actions[0] to actions[1] I get this error:
Expected value to equal:
{"type": "FETCH_POSTS_FUFILLED"}
Received:
{"payload": {Symbol(impl): {"message": "The string did not match the expected pattern.", "name": "SyntaxError", Symbol(wrapper): [Circular]}}, "type": "FETCH_POSTS_REJECTED"}
Difference:
- Expected
+ Received
Object {
- "type": "FETCH_POSTS_FUFILLED",
+ "payload": DOMException {
+ Symbol(impl): DOMExceptionImpl {
+ "message": "The string did not match the expected pattern.",
+ "name": "SyntaxError",
+ Symbol(wrapper): [Circular],
+ },
+ },
+ "type": "FETCH_POSTS_REJECTED",
The mocked store you are using stores all dispatched calls that have been made to it. In your case, two dispatch calls should be made: the first being FETCH_POSTS and the second being either FETCH_POSTS_FULFILLED or FETCH_POSTS_REJECTED.
Hence when you retrieve the dispatched actions from the mocked store, the first entry (which you are using in your expect) will be FETCH_POSTS. You should check the second value in the array, which will be either FETCH_POSTS_FULFILLED or FETCH_POSTS_REJECTED, based on how the promise resolves in the function you are testing.
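In code, that change is a sketch like this (keeping the test's own spelling 'FUFILLED', since that is the literal type string the action creator dispatches):

return store.dispatch(fetchPosts()).then(() => {
  const actions = store.getActions();
  // the first dispatched action is the plain request marker
  expect(actions[0]).toEqual({ type: 'FETCH_POSTS' });
  // the second dispatched action carries the outcome; match on its type
  expect(actions[1].type).toEqual('FETCH_POSTS_FUFILLED');
});

Given your edit shows actions[1] coming back as FETCH_POSTS_REJECTED, the relative axios.get('./MOCK.json') is failing under Jest's environment, so you would likely also need to mock axios (or serve the fixture) for the fulfilled path to be exercised.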

batch update contacts - google people api

Is there a way to batch CRUD contacts with the new Google People API (I see getBatchGet exists for reads)? My app is going to hit rate limits left and right if we upgrade from the old GData Contacts API.
Follow the Google People API docs to learn how to populate your objects; the most important part is the way of using the Google batch API in a do-while loop:
const { google } = require('googleapis')

// Pulls every JSON object out of the multipart batch response string.
function extractJSON (str) {
  const result = []
  let firstOpen = 0
  let firstClose = 0
  let candidate = 0
  firstOpen = str.indexOf('{', firstOpen + 1)
  do {
    firstClose = str.lastIndexOf('}')
    if ( firstClose <= firstOpen ) {
      return []
    }
    do {
      candidate = str.substring(firstOpen, firstClose + 1)
      try {
        result.push(JSON.parse(candidate))
        firstOpen = firstClose
      } catch (e) {
        firstClose = str.substr(0, firstClose).lastIndexOf('}')
      }
    } while ( firstClose > firstOpen )
    firstOpen = str.indexOf('{', firstOpen + 1)
  } while ( firstOpen !== -1 )
  return result
}
async function batchDeleteContacts (resourceIds) {
  /*
    resourceIds = [
      'c1504716451892127784',
      'c1504716451892127785',
      'c1504716451892127786',
      ....
    ]
    Set up your google-api client and extract the oauth header.
    (`request` below is a promise-returning HTTP client, e.g. request-promise, set up elsewhere.)
  */
  const authHeader = await google.oAuth2Client.getRequestHeaders()
  let counter = 0
  let confirmed = []
  try {
    do {
      // Send at most 25 sub-requests per batch call.
      const temp = resourceIds.splice(0, 25)
      const multipart = temp.map((resourceId, index) => ({
        'Content-Type': 'application/http',
        'Content-ID': (counter * 25) + index,
        'body': `DELETE /v1/people/${resourceId}:deleteContact HTTP/1.1\n`
      }))
      const responseString = await request.post({
        url: 'https://people.googleapis.com/batch',
        method: 'POST',
        multipart: multipart,
        headers: {
          'Authorization': authHeader.Authorization,
          'content-type': 'multipart/mixed'
        }
      })
      const result = extractJSON(responseString)
      confirmed = confirmed.concat(result)
      counter++
    } while ( resourceIds.length > 0 )
  } catch (ex) {
    // Handle exception here
  }
  return confirmed
}
async function batchInsertContacts (contacts) {
  /*
    Follow the Google People API documentation to learn how to generate contact objects;
    it's easy and depends on your needs.
    contacts = [{
      resource: {
        clientData,
        names,
        nicknames,
        birthdays,
        urls,
        addresses,
        emailAddresses,
        phoneNumbers,
        biographies,
        organizations
      }
    }]
    Set up your google-api client and extract the oauth header.
  */
  const authHeader = await google.oAuth2Client.getRequestHeaders()
  let counter = 0
  let confirmed = []
  try {
    do {
      const temp = contacts.splice(0, 25)
      const multipart = temp.map((contact, index) => ({
        'Content-Type': 'application/http',
        'Content-ID': (counter * 25) + index,
        'body': 'POST /v1/people:createContact HTTP/1.1\n'
          + 'Content-Type: application/json\n\n'
          + JSON.stringify(contact.resource)
      }))
      const responseString = await request.post({
        url: 'https://people.googleapis.com/batch',
        method: 'POST',
        multipart: multipart,
        headers: {
          'Authorization': authHeader.Authorization,
          'content-type': 'multipart/mixed'
        }
      })
      const result = extractJSON(responseString)
      confirmed = confirmed.concat(result)
      counter++
    } while ( contacts.length > 0 )
  } catch (ex) {
    // Handle exception here
  }
  return confirmed
}
async function batchUpdateContacts (contacts) {
  /*
    Follow the Google People API documentation to learn how to generate contact objects;
    it's easy and depends on your needs.
    contacts = [{
      resourceId: 'c1504616451882127785',
      resource: {
        clientData,
        names,
        nicknames,
        birthdays,
        urls,
        addresses,
        emailAddresses,
        phoneNumbers,
        biographies,
        organizations
      }
    }]
    Set up your google-api client and extract the oauth header.
  */
  const authHeader = await google.oAuth2Client.getRequestHeaders()
  const updatePersonFields = 'names,nicknames,birthdays,urls,addresses,emailAddresses,phoneNumbers,biographies,organizations'
  let counter = 0
  let confirmed = []
  try {
    do {
      const temp = contacts.splice(0, 25)
      const multipart = temp.map((contact, index) => ({
        'Content-Type': 'application/http',
        'Content-ID': (counter * 25) + index,
        'body': `PATCH /v1/people/${contact.resourceId}:updateContact?updatePersonFields=${updatePersonFields} HTTP/1.1\n`
          + 'Content-Type: application/json\n\n'
          + JSON.stringify(contact.resource)
      }))
      const responseString = await request.post({
        url: 'https://people.googleapis.com/batch',
        method: 'POST',
        multipart: multipart,
        headers: {
          'Authorization': authHeader.Authorization,
          'content-type': 'multipart/mixed'
        }
      })
      const result = extractJSON(responseString)
      confirmed = confirmed.concat(result)
      counter++
    } while ( contacts.length > 0 )
  } catch (ex) {
    // Handle exception here
  }
  return confirmed
}
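For completeness, a hypothetical usage of the helpers above (the resource IDs are made up):

(async () => {
  // Delete two contacts in one batch call and log what the API confirmed.
  const confirmed = await batchDeleteContacts([
    'c1504716451892127784',
    'c1504716451892127785',
  ])
  console.log(`parsed ${confirmed.length} JSON bodies from the batch response`)
})()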
Short answer: no.
However, you may be able to prevent rate limiting with the quotaUser query param on your requests.
Lets you enforce per-user quotas from a server-side application even in cases when the user's IP address is unknown. This can occur, for example, with applications that run cron jobs on App Engine on a user's behalf. You can choose any arbitrary string that uniquely identifies a user, but it is limited to 40 characters.
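A minimal sketch of passing quotaUser with the googleapis Node.js client (the identifier value is made up; generated API methods accept quotaUser as a standard parameter):

const { google } = require('googleapis')

// Hedged sketch: tag each request with the end user it is made for,
// so per-user quota accounting works even from a single server IP.
async function listConnections (auth) {
  const people = google.people({ version: 'v1', auth }) // `auth` set up elsewhere
  return people.people.connections.list({
    resourceName: 'people/me',
    personFields: 'names,emailAddresses',
    quotaUser: 'user-12345', // arbitrary stable string identifying the user (max 40 chars)
  })
}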
I have been using the People API on a new development. Seems pretty limited, e.g. contact search isn't even available yet. I'd keep the Contacts API/GData around for features that aren't available in the newer API yet.