Flutter, complex SQLite DBs with many tables: is this best practice?

Flutter newbie here, I'm afraid.
I have a small Django (Python) app that I am porting to a standalone Flutter app with no web back-end. I exported the SQL DDL (about 300 lines) that defines the tables in my Django app and use it directly in my Flutter app (see below). I end up with ~8 tables, and I can query them by copy/pasting the SQL queries that Django generates via its ORM.
My question: is it best practice to use a table structure this complex in mobile app development? I worry that SQLite is not well suited to such complexity, but reusing the already generated model structure and range of SQL queries saves me time.
Many thanks,
Andy.
initDb() async {
  // Get a location using path_provider
  var databasesPath = await getDatabasesPath();
  String path = join(databasesPath, "gear_log.db");
  // Wipe any existing copy so onCreate runs on every launch (development only).
  await deleteDatabase(path);
  var theDb = await openDatabase(path, version: 1,
      onCreate: (Database db, int version) async {
    // When creating the db, create the tables from the bundled DDL.
    String sql = await rootBundle.loadString('assets/db/schema.txt');
    // db.execute runs a single statement at a time, so split the schema on ';'.
    for (var s in sql.split(";")) {
      if (s.length > 5) { // skip any hidden characters left after the final ';'
        await db.execute(s + ';');
      }
    }
  });
  return theDb;
}
Reusing the Django-generated SQL to retrieve data:
Future<List<Item>> getItems() async {
  var dbClient = await db;
  List<Map> list = await dbClient.rawQuery(
      'SELECT "shoe_actualpair"."id", "shoe_actualpair"."created", "shoe_actualpair"."modified", "shoe_actualpair"."name", "shoe_actualpair"."shoe_id", "shoe_actualpair"."expires", "shoe_actualpair"."runner_id" FROM "shoe_actualpair" WHERE "shoe_actualpair"."runner_id" = 1 ORDER BY "shoe_actualpair"."modified" DESC, "shoe_actualpair"."created" DESC');
  List<Item> items = [];
  for (int i = 0; i < list.length; i++) {
    items.add(Item.fromMap(list[i]));
  }
  return items;
}

You can use Jaguar ORM: https://github.com/Jaguar-dart/jaguar_orm
I am using it in an app with one-to-one, one-to-many, and many-to-many relationships.
For SQLite (sqflite), you also need this adapter in your Flutter app:
https://github.com/Jaguar-dart/jaguar_orm/tree/master/sqflite

Related

How to send a request and get a response from .NET Core to AWS Amplify GraphQL

Recently I have been working with AWS Amplify, which has Java- and JavaScript-related examples but no .NET example. I eventually solved it the way shown below. My question: is this the only way to send the request and get the response, or is there another way that I missed?
const string query = "query "; // the actual GraphQL query string is omitted here
var serializer = new NewtonsoftJsonSerializer();
using var graphQlClient = new GraphQLHttpClient("https://xx.xx.xx.amazonaws.com/graphql", serializer);
graphQlClient.HttpClient.DefaultRequestHeaders.Add("x-api-key", "<api key>");
var response = await graphQlClient.SendQueryAsync<dynamic>(query);
I am hoping for a better way than my current code.

Read data from Google Cloud Datastore from a Dialogflow agent

I am a newbie in the chatbot domain. I need to develop a Dialogflow chatbot which can store the data collected from the user in Google Cloud Datastore entities (not the Firebase Realtime Database) and retrieve it when the user wants to search.
I am able to write the data collected from the user to Datastore, but I am struggling to retrieve it. I am writing the function in the Dialogflow inline editor.
Write function:
function order_pizza(agent) {
  var pizza_size = agent.parameters.size;
  var pizza_topping = agent.parameters.pizza_topping;
  var date_time = agent.parameters.date_time; // was agent.parameters.size, an apparent copy-paste slip
  const taskKey = datastore.key('order_item');
  const entity = {
    key: taskKey,
    data: {
      item_name: 'pizza',
      topping: pizza_topping,
      date_time: date_time,
      order_time: new Date().toLocaleString(),
      size: pizza_size
    }
  };
  return datastore.save(entity).then(() => {
    console.log(`Saved ${entity.key.name}: ${entity.data.item_name}`);
    agent.add(`Your order for ${pizza_topping} pizza has been placed!`);
  });
}
where "order_item" is the kind (the Datastore equivalent of a table) in which the data is stored. This part stores the data successfully.
Read data (this function does not work):
function search_pizza(agent) {
  const taskKey = datastore.key('order_item');
  var orderid = agent.parameters.id;
  const query = datastore.createQuery('taskKey').filter('ID', '=', orderid);
  return datastore.runQuery(query).then((result) => {
    agent.add(result[0]);
  });
}
This is what I have tried so far. Wherever I search I can only find answers for the Firebase Realtime Database, not for Google Cloud Datastore. I have followed many tutorials but can't quite get it right. Kindly help!
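For what it's worth, here is a minimal sketch of how the read might work with the @google-cloud/datastore client. Note that createQuery takes the kind name (not the string 'taskKey'), and a filter only matches properties that were actually stored. The sketch below instead fetches the entity by its auto-generated numeric key; the agent parameter id and the reply wording are assumptions, not code from the question:
function search_pizza(agent) {
  const orderid = agent.parameters.id;
  // Build a complete key for the 'order_item' kind from the numeric id
  // that Datastore allocated when the order was saved.
  const key = datastore.key(['order_item', datastore.int(orderid)]);
  return datastore.get(key).then(([entity]) => {
    if (!entity) {
      agent.add('No order found for that id.');
      return;
    }
    agent.add(`You ordered a ${entity.size} ${entity.topping} pizza.`);
  });
}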

.NET Core DynamoDB unit testing with xUnit

Using C#, .NET Core 2.0, DynamoDB.
I have a web API that interacts with my DynamoDB database and has both GET and POST methods.
Example method:
[HttpGet("api/data")]
public async Task<List<string>> GetAllData(string userId, string type, string status)
{
    var creds = new BasicAWSCredentials(awsId, awsPassword);
    var dynamoClient = new AmazonDynamoDBClient(creds, dynamoRegion);
    var context = new DynamoDBContext(dynamoClient);
    List<ScanCondition> conditions = new List<ScanCondition>();
    conditions.Add(new ScanCondition("UserId", ScanOperator.Equal, userId));
    conditions.Add(new ScanCondition("Type", ScanOperator.Equal, type));
    conditions.Add(new ScanCondition("Status", ScanOperator.Equal, status));
    var results = await context.ScanAsync<Common.Job>(conditions, new DynamoDBOperationConfig() { OverrideTableName = MyDynamoTable }).GetRemainingAsync();
    return results.Select(x => x.UpdatedBy.ToLower()).ToList();
}
Now I want to write unit/integration tests for my API methods. Earlier I used NUnit, but with .NET Core 2.0 I believe we have to use xUnit: https://xunit.github.io/docs/getting-started-dotnet-core
Setting up xUnit in my project should not be an issue.
I want to know how to write tests that involve DynamoDB; this is the first time I am using any AWS service.
So basically I need to know how to mock an AWS connection and DynamoDB, and then use the various params shown in my method above.
I could not find much detail or any earlier helpful post on this topic, so I am posting one here.
If the AWS DynamoDB part is not testable, can anyone share an example of an xUnit test where we can at least test the params and check the expected result?
The AWS SDK works with interfaces, so you can mock IAmazonDynamoDB easily. But try to do it with dependency injection; that is much better.
Something like
private readonly IAmazonDynamoDB dynamodbClient;
private readonly IDynamoDBContext context;

public MyDynamodbHandler(IAmazonDynamoDB client)
{
    this.dynamodbClient = client;
    this.context = new DynamoDBContext(client);
}

[HttpGet("api/data")]
public async Task<List<string>> GetAllData(string userId, string type, string status)
{
    List<ScanCondition> conditions = new List<ScanCondition>();
    conditions.Add(new ScanCondition("UserId", ScanOperator.Equal, userId));
    conditions.Add(new ScanCondition("Type", ScanOperator.Equal, type));
    conditions.Add(new ScanCondition("Status", ScanOperator.Equal, status));
    var results = await this.context.ScanAsync<Common.Job>(conditions, new DynamoDBOperationConfig() { OverrideTableName = MyDynamoTable }).GetRemainingAsync();
    return results.Select(x => x.UpdatedBy.ToLower()).ToList();
}
Now every method uses the injected IAmazonDynamoDB, and all you have to do is mock this instance at the start of your test, such as:
var dynamodbClientMock = new Mock<IAmazonDynamoDB>();
Then use this mock to instantiate the MyDynamodbHandler class:
var dynamodbHandler = new MyDynamodbHandler(dynamodbClientMock.Object);
var results = await dynamodbHandler.GetAllData(userId, type, status);

Passport Strategy to authenticate application users from local db

As per the Composer documentation, I am able to validate my application users using GitHub and then redirect them to my blockchain application.
But I have to use my local db, where application users are stored, and validate application users against the identities stored in my local db.
Which Passport strategy should I use, and what are the steps for it?
Thanks in advance
In case you are using composer-rest-server, you can follow the comments on this link to implement the local strategy.
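For reference, a minimal passport-local sketch backed by your own user table might look like this; findUserByEmail and verifyPassword are hypothetical placeholders for your db lookup and password-hash check:
const passport = require('passport');
const LocalStrategy = require('passport-local').Strategy;

// Sketch only: findUserByEmail and verifyPassword stand in for your own
// database lookup and password comparison.
passport.use(new LocalStrategy(
  { usernameField: 'email' },
  async (email, password, done) => {
    try {
      const user = await findUserByEmail(email);
      if (!user || !(await verifyPassword(user, password))) {
        return done(null, false, { message: 'Invalid credentials' });
      }
      return done(null, user); // user is attached to the request on success
    } catch (err) {
      return done(err);
    }
  }
));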
However, in case you have your own REST server, you can follow these steps:
1- Allow participants to register, and store the registration info in your database along with a field pending = true, so that by default all participants are pending admin approval.
2- The admin reviews user requests and then runs the following method, which creates a new participant and issues an identity bound to that participant, using AdminCardName to sign the add and issue transactions.
const IdentityIssue = require('composer-cli/lib/cmds/identity').Issue;
const ParticipantAdd = require('composer-cli/lib/cmds/participant').Add;
const CardImport = require('composer-cli/lib/cmds/card').Import;
const NetworkPing = require('composer-cli/lib/cmds/network').Ping;

const createParticipantCard = async (participantDetails) => {
  const participantOptions = {
    card: AdminCardName,
    data: JSON.stringify({
      $class: 'Name Space and type for your participant',
      participantId: participantDetails.participantId,
      email: participantDetails.email,
      name: participantDetails.name,
    }),
  };
  const issueOptions = {
    card: AdminCardName,
    file: `cards/identities/${participantDetails.participantId}.card`,
    newUserId: participantDetails.participantId,
    participantId: `resource:org.finance.einvoice.participant.Company#${participantDetails.participantId}`,
  };
  const importOptions = {
    file: `cards/identities/${participantDetails.participantId}.card`,
    card: participantDetails.participantId,
  };
  const pingOptions = {
    card: participantDetails.participantId,
  };
  try {
    await ParticipantAdd.handler(participantOptions);
    await IdentityIssue.handler(issueOptions);
    await CardImport.handler(importOptions);
    await NetworkPing.handler(pingOptions);
    return participantDetails.participantId;
  } catch (err) {
    throw err;
  }
};
3- Call this method from any file like the following:
const createdParticipantId = await createParticipantCard(participantDetails);
Then you can save createdParticipantId in your database and use it to query the network, for example to check whether the participant exists or its identity has been revoked, or to submit transactions.
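For example, a rough sketch of such a check, reusing the NetworkPing handler from the method above (whether a failed ping means a revoked identity or a missing card is an assumption to verify against your setup):
// Sketch: ping the network with the participant's own card; the ping
// fails if the card is missing or the bound identity has been revoked.
const isParticipantActive = async (participantId) => {
  try {
    await NetworkPing.handler({ card: participantId });
    return true;
  } catch (err) {
    return false;
  }
};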

Parse-Server Cloud Code Query Doesn't Return All Columns

I have set up Parse-Server on AWS Elastic Beanstalk by following this guide. I have written a cloud-code function which fetches a single record from a specific class/collection. The collection contains about 20 columns, but the object fetched by the query contains only about 8 of them. I have made sure the record does have data in the columns the query misses. Am I missing something here, or is this some limitation in Parse? Is there any way to force Parse to fetch these columns?
Parse.Cloud.define('confirmAppointment', function(request, response) {
  var staffId = request.params.staffId;
  var appointmentId = request.params.appointmentId;
  var appointmentRequest = Parse.Object.extend("AppointmentRequest");
  appointmentRequest.id = appointmentId;
  appointmentRequest.staffId = staffId;
  var query = new Parse.Query(appointmentRequest);
  query.first({
    useMasterKey: true,
    success: function(appointment) {
      if (appointment) {
        // these fields are not found in the fetched appointment object
        // they do exist however in mongodb
        var requesterUserId = appointment.get("requesterUserId");
        var staffUserId = appointment.get("staffUserId");
        var staffName = appointment.get("staffNameEn");
        ...
      }
    }
    ...
  });
});
There might be some typos in your code (the construction of the query part). Try this instead:
Parse.Cloud.define('confirmAppointment', function(req, res) {
  var staffId = req.params.staffId;
  var appointmentId = req.params.appointmentId;
  var query = new Parse.Query("AppointmentRequest");
  query.equalTo('objectId', appointmentId);
  query.equalTo('staffId', staffId);
  query.first({
    useMasterKey: true,
    success: function(appointment) {
      res.success(appointment.get("requesterUserId"));
    },
    error: function(err) {
      res.error(err);
    }
  });
});
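As an aside, on Parse Server 3.x and later the success/error callbacks were removed from the SDK, so the same function would look roughly like this (a sketch, assuming the same class and parameters):
Parse.Cloud.define('confirmAppointment', async (request) => {
  const { staffId, appointmentId } = request.params;
  const query = new Parse.Query('AppointmentRequest');
  query.equalTo('objectId', appointmentId);
  query.equalTo('staffId', staffId);
  // first() resolves with the matching object or undefined.
  const appointment = await query.first({ useMasterKey: true });
  return appointment ? appointment.get('requesterUserId') : null;
});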
The issue turned out to be that when I migrated the data from Parse to my mongolab-hosted MongoDB instance, I did not click the 'Finalize' button in the Parse migration wizard. That was intentional, as Parse was warning me that clicking Finalize would make the migration permanent and I would no longer be able to get back to the Parse-managed database. On the other hand, I could see that all the data had been migrated to mongolab successfully, and technically that should have been enough for my AWS-hosted Parse Server to work with the new database without any issue. But somehow, clicking the 'Finalize' button in Parse did some magic (I still don't understand what it could be) and my queries started returning the expected results.
I was able to reproduce the same issue when migrating to Heroku, so I was sure it had nothing to do with AWS.
Hope this helps someone.