Dynamic Models in Loopback - loopbackjs

How do I create dynamic models in LoopBack, instead of using the command "lb model" for every model?
For example:
If I want to create 30 models with almost the same properties, I would be in trouble creating all 30 models and their corresponding properties again and again.
Is it possible to create the model once and reuse/iterate it for the other models using LoopBack? Kindly share your answers.

Well, I'm still new to this, but I think you can easily create any number of dynamic models programmatically. First, create a boot script inside your boot directory, e.g. server/boot/dynamic-models.js, and then create a dynamic model using the following code:
const app = require('../server');
const dbDataSource = app.datasources.db;

const schema = {
    "name": {
        "type": "string",
        "required": true
    },
    "email": {
        "type": "string",
        "required": true
    }
};

const MyDynamicModel = dbDataSource.createModel('MyDynamicModel', schema);
app.model(MyDynamicModel);
The app is exported from project-root/server/server.js, so you can require it in your script.
Also, the schema is optional (in the case of NoSQL/Mongo). Once you create the dynamic models, you can visit your API explorer and see the dynamically created models/endpoints.
If you have more models to create, all you need to do is loop over them and create the models, for example:
const models = ['ModelOne', 'ModelTwo'];
// or export from other files and import those here, i.e.:
// const schema = require('exported-from-another-file');
// const models = require('exported-from-another-file');
models.forEach(model => {
    app.model(dbDataSource.createModel(model, schema));
});
Update: another working example that registers multiple models dynamically:
// project-root/common/dynamic/index.js
module.exports.schema = {
    "name": {
        "type": "string",
        "required": true
    },
    "email": {
        "type": "string",
        "required": true
    }
};

module.exports.models = [
    'ModelOne',
    'ModelTwo'
];
// project-root/server/boot/dynamic-models.js
const app = require('../server');
const dbDataSource = app.datasources.db;
const {schema, models} = require('../../common/dynamic');
models.forEach(
    model => app.model(dbDataSource.createModel(model, schema))
);
From now on, to add another dynamic model using the same schema, all you need to do is add a model name to the models array. This is tested and works fine.

Related

Django differ JSONField values between lists and objects

I am using Django 3.2 with Postgres as the DB.
I have a model with a JSONField:
class MyModel(models.Model):
    data = models.JSONField(default=dict, blank=True)
There are a lot of records in this table, and some of the data values are JSON objects while others are lists:
{
    "0:00:00 A": "text",
    "0:01:00 B": "text",
    "0:02:00 C": "text"
}
[
    {"time": "0:00:00", "type": "A", "description": "text"},
    {"time": "0:01:00", "type": "B", "description": "text"},
    {"time": "0:02:00", "type": "C", "description": "text"}
]
I need to filter all records whose JSON values are objects.
What I tried is to use has_key with the time frame "0:00:00":
result = MyModel.objects.filter(data__has_key="0:00:00 A")
But I really can't use it, because I am not sure what the key with the time frame looks like exactly.
Any ideas how to filter JSONField values by their object structure?
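A minimal sketch of one possible approach, assuming PostgreSQL (where JSONField is stored as jsonb): annotate each row with the result of Postgres's jsonb_typeof() and filter on it. The alias data_type below is purely illustrative.
from django.db.models import CharField, F, Func

# Assumes the MyModel above, with its jsonb-backed "data" column on Postgres.
json_type = Func(F("data"), function="jsonb_typeof", output_field=CharField())

# Rows whose top-level JSON value is an object ({...})
objects_only = MyModel.objects.annotate(data_type=json_type).filter(data_type="object")

# Rows whose top-level JSON value is a list ([...])
lists_only = MyModel.objects.annotate(data_type=json_type).filter(data_type="array")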

Can I create a Django Rest Framework API with Geojson format without having a model

I have a Django app that requests data from an external API, and my goal is to convert that data, which is returned in list/dictionary format, into a new REST API in GeoJSON format.
I came across django-rest-framework-gis, but I don't know if I can use it without having a Model. If so, how?
I think the best way is to use the Python library geojson:
pip install geojson
If you do not have a Model like in GeoDjango, you have to explicitly describe the geometry from the data you have.
from geojson import Point, Feature, FeatureCollection

data = [
    {
        "id": 1,
        "address": "742 Evergreen Terrace",
        "city": "Springfield",
        "lon": -123.02,
        "lat": 44.04
    },
    {
        "id": 2,
        "address": "111 Spring Terrace",
        "city": "New Mexico",
        "lon": -124.02,
        "lat": 45.04
    }
]

def to_geojson(entries):
    features = []
    for entry in entries:
        point = Point((entry["lon"], entry["lat"]))
        del entry["lon"]
        del entry["lat"]
        feature = Feature(geometry=point, properties=entry)
        features.append(feature)
    return FeatureCollection(features)

if __name__ == '__main__':
    my_geojson = to_geojson(data)
    print(my_geojson)
Create the point geometry from lon, lat (it could also be another geometry type).
Create a feature with the created geometry and add the dictionary as properties. Note that I deleted the lon, lat entries from the dictionary so they do not show up as properties.
Create a feature collection from the multiple features.
Result:
{"features": [{"geometry": {"coordinates": [-123.02, 44.04], "type":
"Point"}, "properties": {"address": "742 Evergreen Terrace", "city":
"Springfield", "id": 1}, "type": "Feature"}, {"geometry":
{"coordinates": [-124.02, 45.04], "type": "Point"}, "properties":
{"address": "111 Spring Terrace", "city": "New Mexico", "id": 2},
"type": "Feature"}], "type": "FeatureCollection"}
More info here: the geojson library documentation.

How do I extract a string of numbers from random text in Power Automate?

I am setting up a flow to organize and save emails as PDFs in a Dropbox folder. The first email that arrives includes a 10-digit identification number, which I extract along with an address. My flow creates a folder in Dropbox named in this format: 2023568684 : 123 Main St. Over a few weeks, additional emails arrive that I need to put into that folder. The subject always has a 10-digit number in it. I was building around each email and using functions like split, first, last, etc. to isolate the 10-digit ID. The problem is that there is no consistency in the subjects or bodies of the messages, so I can't easily find the ID with that method. I ended up starting to build around each email format individually, but there are way too many, not to mention the possibility of new senders or format changes.
My idea is to use List files in folder when a new message arrives, which will create an array that I can filter to find the folder ID the message needs to be saved to. I know there is a limitation on this because of the 20-file limit, but that is a different topic and question.
For now, how do I find a random 10-digit number in a randomly formatted email subject line so I can use it with the filter function?
For this requirement you really need regex, and at present Power Automate doesn't support the use of regex expressions, but the good news is that it looks like it's coming ...
https://powerusers.microsoft.com/t5/Power-Automate-Ideas/Support-for-regex-either-in-conditions-or-as-an-action-with/idi-p/24768
There is a connector but it looks like it's not free ...
https://plumsail.com/actions/request-free-license
To get around it for now, my suggestion would be to create a function app in Azure and let it do the work. This may not be your cup of tea but it will work.
I created a .NET (C#) function with the following code (straight in the portal) ...
#r "Newtonsoft.Json"
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;
public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
dynamic data = JsonConvert.DeserializeObject(requestBody);
string strToSearch = System.Text.Encoding.UTF8.GetString(Convert.FromBase64String((string)data?.Text));
string regularExpression = data?.Pattern;
var matches = System.Text.RegularExpressions.Regex.Matches(strToSearch, regularExpression);
var responseString = JsonConvert.SerializeObject(matches, new JsonSerializerSettings()
{
ReferenceLoopHandling = ReferenceLoopHandling.Ignore
});
return new ContentResult()
{
ContentType = "application/json",
Content = responseString
};
}
Then, in Power Automate, call the HTTP action, passing in a base64-encoded string of the content you want to search ...
This is the expression in the JSON ... base64(variables('String to Search')) ... and this is the JSON you need to pass in ...
{
    "Text": "@{base64(variables('String to Search'))}",
    "Pattern": "[0-9]{10}"
}
This is an example of the response ...
[
    {
        "Groups": {},
        "Success": true,
        "Name": "0",
        "Captures": [],
        "Index": 33,
        "Length": 10,
        "Value": "2023568684"
    },
    {
        "Groups": {},
        "Success": true,
        "Name": "0",
        "Captures": [],
        "Index": 98,
        "Length": 10,
        "Value": "8384468684"
    }
]
Next, add a Parse JSON action and use this schema ...
{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "Groups": {
                "type": "object",
                "properties": {}
            },
            "Success": {
                "type": "boolean"
            },
            "Name": {
                "type": "string"
            },
            "Captures": {
                "type": "array"
            },
            "Index": {
                "type": "integer"
            },
            "Length": {
                "type": "integer"
            },
            "Value": {
                "type": "string"
            }
        },
        "required": [
            "Groups",
            "Success",
            "Name",
            "Captures",
            "Index",
            "Length",
            "Value"
        ]
    }
}
Finally, extract the first value you find that matches the regex pattern. It returns multiple results if found, so if you need to, you can do something with those.
This is the expression ... @{first(body('Parse_JSON'))?['value']}
From this string ...
We're going to search for string 2023568684 within this text and we're also going to try and find 8384468684, this should work.
... this is the result ...
Don't have a Premium PowerAutomate licence so can't use the HTTP action?
You can do this exact same thing using the LogicApps service in Azure. It's the same engine with some slight differences re: connectors and behaviour.
Instead of the HTTP action, use the Azure Functions action.
In relation to your trigger that fires when an email is received: in LogicApps, it will poll every x seconds/minutes/hours/etc. rather than fire on an event. I'm not 100% sure which email connector you're using, but it should exist.
Dropbox connectors exist, that's no problem.
You can export your PowerAutomate flow into a LogicApps format so you don't have to start from scratch.
https://learn.microsoft.com/en-us/azure/logic-apps/export-from-microsoft-flow-logic-app-template
If you're concerned about cost, don't be. Just make sure you use the consumption plan. Costs only really rack up for these services when the apps run for minutes at a time on a regular basis. Just keep an eye on it for your own mental health.
To get the function URL, you can find it in the function itself. You have to be in the function ...

django-rest-framework-datatables and Django Parler's translation field

I've got a model with translated fields.
class Device(TranslatableModel):
    translations = TranslatedFields(name=models.CharField(max_length=100))
I made a serializer like:
class DeviceSerializer(TranslatableModelSerializer):
    translations = TranslatedFieldsField(shared_model=Device)

    class Meta:
        model = Device
        fields = ('translations',)
It gives me nice JSON like it should.
{
    "count": 1,
    "next": null,
    "previous": null,
    "results": [
        {
            "device": {
                "translations": {
                    "en": {
                        "name": "Sample Device"
                    }
                }
            }
        }
    ]
}
Now I want to use it with django-rest-framework-datatables. In my template I've written a script like:
$('#devices').DataTable({
    'serverSide': true,
    'ajax': 'api/devices/?format=datatables',
    'columns': [
        {'data': 'device.translations.en'}
    ]
});
It refuses to work with me. I am getting django.core.exceptions.FieldError: Unsupported lookup 'en' for AutoField or join on the field not permitted.
If I do not append .en to 'data', it gives Object.object, of course.
The issue is in the template file.
Pass the name and data fields separately to the columns in the DataTable configuration.
Please replace field_name with your model's field name:
$('#devices').DataTable({
    'ajax': 'api/devices/?format=datatables',
    'columns': [
        {"data": "translations.en.field_name", "name": "translations.field_name"},
    ]
});
For more details, refer to django-rest-framework-datatables and django-parler-rest.
The actual problem is that, while making the GET request to the server, the DataTable will add the name value to the column parameter. So instead of writing
"name": "translations.en.field_name"
write:
"name": "translations.field_name"
i.e. remove the language code.

Using the Django API

First of all, any help with this would be amazing! I am currently trying to learn Django and am also fairly new to Python. I am trying to create a simple project management web service that could communicate using JSON with another application. All it needs to do is:
Give a list of all projects.
Give a list of all tasks for each of the projects.
Allow users to submit time for a project and task.
For models I currently have a 'Project' model and a 'Task' model. I have filled the database with some dummy information. My Project model has the fields id, name, and description. Task just has project and task_name.
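For reference, those two models might look roughly like this (a sketch inferred from the description above; the field types and max_length values are assumptions, only the field names come from the question):
from django.db import models

class Project(models.Model):
    id = models.AutoField(primary_key=True)
    name = models.CharField(max_length=200)       # field type assumed
    description = models.TextField()              # field type assumed

class Task(models.Model):
    project = models.ForeignKey(Project, on_delete=models.CASCADE)
    task_name = models.CharField(max_length=200)  # field type assumed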
My view for task 1 above is
def api(request):
    projects = Project.objects.all()  # .order_by('id')
    projects = serializers.serialize('json', projects, indent=2)
    return HttpResponse(projects, mimetype='application/json')
The associated URL is url(r'^api/project/$', 'project.views.api').
However, this outputs the text below when typing
http://127.0.0.1:8000/api/project/
into the browser. Is there a better way to test JSON output? I would rather it only output the 'id' and 'name'. For some reason the id doesn't even show, even though this is what I put for id in the model: id = models.AutoField(primary_key=True). It works in the interactive shell.
[
    {
        "pk": 1,
        "model": "project.project",
        "fields": {
            "name": "Project One",
            "description": "This is the description for project one"
        }
    },
    {
        "pk": 2,
        "model": "project.project",
        "fields": {
            "name": "Project Two",
            "description": "This is Project Two"
        }
    }
]
I learn by example so if someone could show me how this should be done I would be extremely grateful!
By the way, for outputting the list of tasks, I was thinking the URL would be something like:
http://127.0.0.1:8000/api/project/?project_name
but I'm not sure how to handle that in the view.
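A minimal sketch of one way to do both things, assuming the Project/Task models sketched earlier and plain Django views (the view names and the query-parameter name are illustrative, not from the question): values() restricts the output to just the wanted fields, and the task view filters by a project passed in the query string.
import json

from django.http import HttpResponse

def project_list(request):
    # Only id and name, as a list of dicts: [{"id": 1, "name": "Project One"}, ...]
    projects = list(Project.objects.values('id', 'name'))
    return HttpResponse(json.dumps(projects), content_type='application/json')

def task_list(request):
    # e.g. /api/task/?project=Project%20One  (parameter name assumed)
    project_name = request.GET.get('project')
    tasks = list(Task.objects.filter(project__name=project_name).values('id', 'task_name'))
    return HttpResponse(json.dumps(tasks), content_type='application/json')
Note that mimetype in the question's view is the older HttpResponse spelling; newer Django versions use content_type as above.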