Read data from Google Cloud Datastore from a Dialogflow agent

I am a newbie in the chatbot domain. I need to develop a Dialogflow chatbot which can store the data collected from users in Google Cloud Datastore entities (not the Firebase Realtime Database) and retrieve it when the user wants to search.
I am able to write the data collected from the user to Datastore, but I am struggling to retrieve it. I am writing the functions in the Dialogflow inline editor.
Write function:
// @google-cloud/datastore must be listed in the inline editor's package.json
const { Datastore } = require('@google-cloud/datastore');
const datastore = new Datastore();

function order_pizza(agent) {
  const pizza_size = agent.parameters.size;
  const pizza_topping = agent.parameters.pizza_topping;
  const date_time = agent.parameters.date_time; // was agent.parameters.size, presumably a copy-paste slip
  const taskKey = datastore.key('order_item');
  const entity = {
    key: taskKey,
    data: {
      item_name: 'pizza',
      topping: pizza_topping,
      date_time: date_time,
      order_time: new Date().toLocaleString(),
      size: pizza_size
    }
  };
  return datastore.save(entity).then(() => {
    // auto-generated keys expose a numeric id, not a name
    console.log(`Saved ${entity.key.id}: ${entity.data.item_name}`);
    agent.add(`Your order for ${pizza_topping} pizza has been placed!`);
  });
}
where "order_item" is the kind(table in datastore) the data is being stored. It is storing the data successfully.
Read data:(the function not working)
function search_pizza(agent) {
  const taskKey = datastore.key('order_item');
  var orderid = agent.parameters.id;
  const query = datastore.createQuery('taskKey').filter('ID', '=', orderid);
  return datastore.runQuery(query).then((result) => {
    agent.add(result[0]);
  });
}
This is what I have tried so far. Wherever I search, I can only find examples for the Firebase Realtime Database, not for Google Cloud Datastore.
I have followed many tutorials, but can't quite get it right. Kindly help!
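
For what it's worth, there appear to be two problems in the read function: createQuery must receive the kind name itself (the string 'taskKey' queries a kind literally called "taskKey"), and agent.add() needs a string rather than a raw query result. A possible corrected sketch, assuming the order id parameter holds the entity's auto-generated numeric key id:

function search_pizza(agent) {
  const orderid = agent.parameters.id;
  // Query the 'order_item' kind itself; passing the string 'taskKey'
  // queries a kind literally named "taskKey", which has no entities.
  const query = datastore
    .createQuery('order_item')
    // To look up by the auto-generated numeric key id, filter on __key__;
    // to look up by a stored property instead, use .filter('property', '=', value).
    .filter('__key__', '=', datastore.key(['order_item', datastore.int(orderid)]));
  return datastore.runQuery(query).then(([entities]) => {
    if (!entities.length) {
      agent.add(`No order found for id ${orderid}.`);
      return;
    }
    const order = entities[0];
    // agent.add() expects a string, not a raw entity object.
    agent.add(`Found your ${order.size} ${order.topping} pizza ordered at ${order.order_time}.`);
  });
}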

Related

Authenticate AWS lambda against Google Sheets API

I am trying to create an AWS Lambda function that reads rows from multiple Google Sheets documents using the Google Sheets API, merges them, and writes the result to another spreadsheet. To do so I did all the necessary steps according to several tutorials:
Create credentials for the AWS user to have the key pair.
Create a Google Service Account and download the credentials.json file.
Share each necessary spreadsheet with the Google Service Account's client_email.
When executing the program locally it works perfectly: it successfully logs in using the credentials.json file and reads and writes all the necessary documents.
However, when I upload it to AWS Lambda using the Serverless Framework and google-spreadsheet, the program fails silently at the authentication step. I've tried changing the permissions as recommended in this question, but it still fails. The file is read properly and I can print it to the console.
This is the simplified code:
async function getData(spreadsheet, psychologistName) {
  await spreadsheet.useServiceAccountAuth(clientSecret);
  // It never gets to this point, it fails silently
  await spreadsheet.loadInfo();
  ... etc ...
}

async function main() {
  const promises = Object.entries(psychologistSheetIDs).map(async (psychologistSheetIdPair) => {
    const [psychologistName, googleSheetId] = psychologistSheetIdPair;
    const sheet = new GoogleSpreadsheet(googleSheetId);
    const psychologistScheduleData = await getData(sheet, psychologistName);
    return psychologistScheduleData;
  });
  // When all sheets are available, merge their data and write back in the joint view.
  Promise.all(promises).then(async (psychologistSchedules) => {
    ... merge the data ...
  });
}

module.exports.main = async (event, context, callback) => {
  const result = await main();
  return {
    statusCode: 200,
    body: JSON.stringify(result, null, 2),
  };
};
I solved it. While locally having Promise.all(promises).then(result => ...) eventually returned the value and executed what was inside the then(), AWS Lambda returned before the promises were resolved.
This solved it:
const res = await Promise.all(promises);
mergeData(res);
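
In other words, main has to await the combined promise before the handler returns, so nothing is left pending when Lambda freezes the execution environment. A minimal sketch of the corrected main (mergeData stands in for the merging logic elided in the question):

async function main() {
  const promises = Object.entries(psychologistSheetIDs).map(async ([psychologistName, googleSheetId]) => {
    const sheet = new GoogleSpreadsheet(googleSheetId);
    return getData(sheet, psychologistName);
  });
  // Awaiting here keeps the handler alive until every sheet has been fetched.
  const psychologistSchedules = await Promise.all(promises);
  return mergeData(psychologistSchedules); // hypothetical merge helper from the question's elided code
}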

Flutter, complex SQLite DBs with many tables, is this best practice?

Flutter newbie here, I'm afraid.
I have a small Django app (Python) that I am porting over to a standalone Flutter app with no web back-end. I directly exported the SQL (DDL; about 300 lines) that specifies the tables used in my Django app and use that in my Flutter app (see below). I end up with ~8 tables, and I can query them by copy/pasting the SQL queries Django creates for me via its ORM.
My question: is it best practice to have complex tables in mobile app development? I worry that SQLite is not best suited to such complexity, but I feel reusing the already generated model structure and range of SQL queries saves me time.
Many thanks,
Andy.
import 'package:flutter/services.dart' show rootBundle;
import 'package:path/path.dart';
import 'package:sqflite/sqflite.dart';

initDb() async {
  // Get a location using path_provider
  var databasesPath = await getDatabasesPath();
  String path = join(databasesPath, "gear_log.db");
  await deleteDatabase(path);
  var theDb = await openDatabase(path, version: 1,
      onCreate: (Database db, int version) async {
    // When creating the db, run the exported Django DDL statement by statement
    String sql = await rootBundle.loadString('assets/db/schema.txt');
    for (var s in sql.split(";")) { // there seems to be a max number of characters per db.execute
      if (s.length > 5) { // skip any hidden characters at the end of the schema
        await db.execute(s + ';');
      }
    }
  });
  return theDb;
}
Reusing the Django-generated SQL to retrieve data:
Future<List<Item>> getItems() async {
  var dbClient = await db;
  // Query copied verbatim from the SQL Django's ORM generates
  List<Map> list = await dbClient.rawQuery(
      'SELECT "shoe_actualpair"."id", "shoe_actualpair"."created", "shoe_actualpair"."modified", "shoe_actualpair"."name", "shoe_actualpair"."shoe_id", "shoe_actualpair"."expires", "shoe_actualpair"."runner_id" FROM "shoe_actualpair" WHERE "shoe_actualpair"."runner_id" = 1 ORDER BY "shoe_actualpair"."modified" DESC, "shoe_actualpair"."created" DESC');
  List<Item> items = [];
  for (int i = 0; i < list.length; i++) {
    items.add(Item.fromMap(list[i]));
  }
  return items;
}
You can use Jaguar ORM: https://github.com/Jaguar-dart/jaguar_orm
I am using it in an app with one-to-one, one-to-many and many-to-many relationships.
For SQLite (sqflite), you also need this adapter in your Flutter app:
https://github.com/Jaguar-dart/jaguar_orm/tree/master/sqflite

Passport Strategy to authenticate application users from local db

As per the Composer documentation, I am able to validate my application users using GitHub and then redirect to my blockchain application.
But I have to use my local DB, where application users are stored, and validate application users against the identities stored in it.
Which Passport strategy should I use, and what are the steps for it?
Thanks in advance
In case you are using composer-rest-server, you can follow the comments on this link to implement the local strategy.
However, in case you have your own REST server, you can follow these steps:
1- Allow participants to register, and add the registration info to your database along with a field pending = true, so that all participants are pending admin approval by default.
2- The admin reviews user requests and then runs the following method, which creates a new participant and issues an identity bound to that participant, using AdminCardName to sign the add and issue transactions.
const IdentityIssue = require('composer-cli/lib/cmds/identity').Issue;
const ParticipantAdd = require('composer-cli/lib/cmds/participant').Add;
const CardImport = require('composer-cli/lib/cmds/card').Import;
const NetworkPing = require('composer-cli/lib/cmds/network').Ping;

const createParticipantCard = async (participantDetails) => {
  const participantOptions = {
    card: AdminCardName,
    data: JSON.stringify({
      $class: 'Name Space and type for your participant',
      participantId: participantDetails.participantId,
      email: participantDetails.email,
      name: participantDetails.name,
    }),
  };
  const issueOptions = {
    card: AdminCardName,
    file: `cards/identities/${participantDetails.participantId}.card`,
    newUserId: participantDetails.participantId,
    participantId: `resource:org.finance.einvoice.participant.Company#${participantDetails.participantId}`,
  };
  const importOptions = {
    file: `cards/identities/${participantDetails.participantId}.card`,
    card: participantDetails.participantId,
  };
  const pingOptions = {
    card: participantDetails.participantId,
  };
  try {
    await ParticipantAdd.handler(participantOptions);
    await IdentityIssue.handler(issueOptions);
    await CardImport.handler(importOptions);
    await NetworkPing.handler(pingOptions);
    return participantDetails.participantId;
  } catch (err) {
    throw err;
  }
};
3- Call this method from any file, like the following:
const createdParticipantId = await createParticipantCard(participantDetails);
Then you can save createdParticipantId in your database and use it to query the network, to check whether a participant exists or its identity has been revoked, or to submit transactions.
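
As for the Passport side, the local strategy fits this flow. A minimal sketch, assuming an Express app; findUser and verifyPassword are hypothetical helpers backed by your local DB:

const passport = require('passport');
const LocalStrategy = require('passport-local').Strategy;

passport.use(new LocalStrategy((username, password, done) => {
  // findUser and verifyPassword are placeholders for your own DB lookup
  findUser(username, (err, user) => {
    if (err) return done(err);
    if (!user) return done(null, false, { message: 'Unknown user' });
    if (!verifyPassword(user, password)) {
      return done(null, false, { message: 'Incorrect password' });
    }
    return done(null, user); // authenticated against the local DB
  });
}));

// Protect the login route with the strategy
app.post('/login',
  passport.authenticate('local', { failureRedirect: '/login' }),
  (req, res) => res.redirect('/'));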

Parse-Server Cloud Code Query Doesn't Return All Columns

I have set up Parse Server on AWS Elastic Beanstalk by following this guide. I've then written a cloud-code function which fetches a single record from a specific class/collection. The collection contains about 20 columns, but the object fetched as a result of the query contains only about 8 of them. I've made sure the record does have data in the columns the query misses. Am I missing something here, or is it some limitation in Parse? Is there any way to force Parse to fetch these columns?
Parse.Cloud.define('confirmAppointment', function(request, response) {
  var staffId = request.params.staffId;
  var appointmentId = request.params.appointmentId;
  var appointmentRequest = Parse.Object.extend("AppointmentRequest");
  appointmentRequest.id = appointmentId;
  appointmentRequest.staffId = staffId;
  var query = new Parse.Query(appointmentRequest);
  query.first({
    useMasterKey: true,
    success: function(appointment) {
      if (appointment) {
        // these fields are not found in the fetched appointment object,
        // they do exist however in mongodb
        var requesterUserId = appointment.get("requesterUserId");
        var staffUserId = appointment.get("staffUserId");
        var staffName = appointment.get("staffNameEn");
        ...
      }
    }
    ...
  });
});
There might be some typos in your code (the construction of the query part). Try this instead:
Parse.Cloud.define('confirmAppointment', function(req, res) {
  var staffId = req.params.staffId;
  var appointmentId = req.params.appointmentId;
  var query = new Parse.Query("AppointmentRequest");
  query.equalTo('objectId', appointmentId);
  query.equalTo('staffId', staffId);
  query.first({
    useMasterKey: true,
    success: function(appointment) {
      res.success(appointment.get("requesterUserId"));
    },
    error: function(err) {
      res.error(err);
    }
  });
});
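
Note that the success/error callback style above is from the pre-3.0 Parse JavaScript SDK. On Parse Server 3.x and later, cloud functions are promise-based, so the same query would look roughly like this:

Parse.Cloud.define('confirmAppointment', async (request) => {
  const query = new Parse.Query('AppointmentRequest');
  query.equalTo('objectId', request.params.appointmentId);
  query.equalTo('staffId', request.params.staffId);
  // first() returns a promise; the resolved value is returned to the client
  const appointment = await query.first({ useMasterKey: true });
  return appointment ? appointment.get('requesterUserId') : null;
});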
The issue turned out to be that when I migrated my data from Parse to my mongolab-hosted MongoDB instance, I did not click the 'Finalize' button in the Parse migration wizard. That was intentional, as Parse was warning me that clicking Finalize would make the migration permanent and I would no longer be able to get back to the Parse-managed database. On the other hand, I could see that all the data had been successfully migrated to mongolab, and technically it should have been enough for my AWS-hosted Parse Server to work on this new database without any issue. But somehow, clicking the "Finalize" button in Parse did some magic (I still don't understand what it could be) and my queries started returning the expected results.
I was able to reproduce the same issue when migrating to Heroku as well, so I was sure it had nothing to do with AWS.
Hope this helps someone.

How to login into a third-party website using google app script and manage data on login?

I am interested in creating a Google Apps Script that, on run, would log into a specific (third-party) website and complete certain functions within the website (pressing buttons/copying text).
After browsing Stack Overflow and other forums, I have created a script that allows me to log into my website (source1, source2).
However, I am having difficulty staying logged in and managing the data.
// The current code just tests whether I can get data from within the website.
// The results are displayed in a Google app.
function doGet() {
  var app = UiApp.createApplication();
  app.add(app.createLabel(display_basic_data()));
  return app;
}

// Logs into the website and displays data
function display_basic_data() {
  var data;
  var url = "http://www.website.bla/users/sign_in";
  var payload = {"user[username]": "usr", "user[password]": "ps"};
  var opt = {"method": "post", "payload": payload, "followRedirects": false};
  var response = UrlFetchApp.fetch(url, opt);
  data = response;
  return data;
}
Currently, the data returned from display_basic_data() is
"<html><body>You are being redirected.</body></html>".
If I change the script so that followRedirects is true, the data is the HTML of the login page.
I understand I have to play around with cookies in order to 'stay' logged in, but I have no idea what to do, as the examples I found online proved fruitless for me.
Any help would be much appreciated!
You may want to do something like this:
var cookie = response.getAllHeaders()['Set-Cookie'];
// maybe parse the cookies here, depending on what cookie contains
var headers = {'Cookie': cookie};
var opt2 = {"headers": headers};
var pagedata = UrlFetchApp.fetch("http://www.website.bla/home", opt2);
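
Putting the two pieces together, a rough end-to-end sketch (the URL and form fields are the placeholders from the question; real sites may set several cookies or require CSRF tokens, which this does not handle):

function fetch_logged_in_page() {
  var payload = {"user[username]": "usr", "user[password]": "ps"};
  // Keep followRedirects false so the Set-Cookie header of the login
  // response is not lost while the redirect is followed automatically.
  var response = UrlFetchApp.fetch("http://www.website.bla/users/sign_in", {
    "method": "post",
    "payload": payload,
    "followRedirects": false
  });
  var cookie = response.getAllHeaders()['Set-Cookie'];
  // Replay the session cookie on every subsequent request to stay logged in.
  return UrlFetchApp.fetch("http://www.website.bla/home", {
    "headers": {"Cookie": cookie}
  }).getContentText();
}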