Eager Loading disabled by Custom Extension - doctrine-orm

I have a database table called Accounts with a column isCustomer.
To make my /Entity/Customer.php return only accounts that are actual customers, I used a Custom Doctrine ORM Extension.
There is also a relation with the Contacts table. I'd like to add Contacts to Customer when the /customers URL is requested. This relation exists, works properly, and is eager loaded.
The moment I created /Doctrine/CustomerExtension.php, the result set consists only of customer accounts (with contacts), as I expected.
But at the same time the Eager Loading ORM Extension doesn't work anymore.
All relations are lazy loaded now, even with @ORM\ManyToOne(fetch="EAGER") on every relationship.
How do I add $queryBuilder->andWhere('isCustomer = 1'); by default AND keep eager loading?
My first attempt was to add normalizationContext={"groups": {"boost"}} to /Entity/Customer.php and put all the properties in that group. It didn't fix eager loading :(

Ember.js - How to not reload the model on the route loading

I am developing an app where there is a huge list of items, loaded from the server by ember-data. This list can be filtered by different fields, like date.
When you load this route for the first time, it is filtered by date: it only loads the current year items. However, the user can change those filters.
When you transition to another route and come back, the visible filters are the same as when you left; Ember seems to remember them. However, since the model is loaded by the route before the controller exists and the filters are available, it loads all the current-year items again.
Therefore, the result is that the user is seeing a list of all current-year items, and a set of filters that may not match.
What I would love is for the route to not reload the model if it is already available, so to save time and network; but any solution would be appreciated.
You could move the GET to the controller, which is not regenerated on route change; if the controller is already present, all your previous models and filters are still there.
Get the last selected filter options from the controller in the beforeModel or model hook to load your data via a query (if possible).
If you're sure all the data is already loaded, use peekAll/peekRecord to avoid a new network request.

How to know when the database has changed

I have a project that looks like a simple shopping site selling different kinds of products. For example, I have these models: Brand, Product, and Consignment. Consignment is linked to Product, and Product is linked to Brand. To reduce the number of queries to the database, I want to save the current state of these models (or at least some of them). I want to do this because I show a sidebar with brands and products, so every time a user opens a page, a query is executed against the database to get those brands and products.
But when an admin adds a new product or brand, I want to detect the database change and re-save that state. How can I implement this?
The answer is to use a cache. A cache stores your objects temporarily in memory or in another service, like Redis, so that you do not need to send queries to the database. You can read the full description here.
Or you can use this third-party library that helps you cache Django ORM models. Here is an example:
Brand.objects.filter(name='stackoverlow').cache()
After doing an update to the model, you need to clear or invalidate the cache.
invalidate_model(Brand)
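If you'd rather not add a dependency, a manual alternative is to cache the sidebar data with Django's built-in cache framework and invalidate it from post_save/post_delete signals whenever an admin changes a Brand or Product. This is only a sketch: the cache key, the timeout, and exactly what goes into the cache are assumptions, not something taken from the question.

# sidebar.py -- minimal caching sketch using Django's low-level cache API
from django.core.cache import cache
from django.db.models.signals import post_save, post_delete
from django.dispatch import receiver

from .models import Brand, Product  # models named in the question

SIDEBAR_CACHE_KEY = "sidebar_brands_products"  # hypothetical key name

def get_sidebar_data():
    # Return the cached sidebar data, rebuilding it on a cache miss.
    data = cache.get(SIDEBAR_CACHE_KEY)
    if data is None:
        data = {
            "brands": list(Brand.objects.all()),
            "products": list(Product.objects.all()),
        }
        cache.set(SIDEBAR_CACHE_KEY, data, timeout=60 * 60)  # keep for one hour
    return data

@receiver(post_save, sender=Brand)
@receiver(post_delete, sender=Brand)
@receiver(post_save, sender=Product)
@receiver(post_delete, sender=Product)
def invalidate_sidebar(sender, **kwargs):
    # Any insert, update, or delete (e.g. from the admin) drops the cached
    # sidebar, so the next request rebuilds it with fresh data.
    cache.delete(SIDEBAR_CACHE_KEY)

The sidebar view then calls get_sidebar_data() instead of querying Brand and Product directly. Remember that this module has to be imported somewhere (for example from your AppConfig.ready()) so the signal receivers actually get registered.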

Syncing Django models with existing tables in an existing schema, updating a few columns manually and the rest automatically

I am doing a POC in Django and was trying to create the admin console module for inserting, updating, and deleting records through the Django admin console via models, and that was going fine.
I have 2 questions.
1. I need model objects for existing tables which have to live in a particular schema, say schema1.table1.
So far I have been doing the POC in the public schema.
Can it be done in a fixed, defined schema, and if so, how? Any reference would be very helpful.
2. I also want to update only a few columns in the table through the console, while the rest of the columns (like the current timestamp, created date, etc.) are filled in automatically. Is this possible through the default Django admin console, and if so, could you share a reference?
Steps for 1
What I have done so far is create a class in models.py with the attributes author, title, body, and timeofpost.
Then I used sqlmigrate after running makemigrations for the app to create the table, and after migrating I have been using the Django admin console to insert and update records in that table. But this is for the POC only.
Now I need to do the same for existing tables, so that I can insert or update records in those existing tables through the admin console.
Also, the tables are being created in the public schema by default, but I am using Postgres and the existing tables live in different schemas, and I want to insert, update, and delete rows in those existing tables.
I am stuck here because I don't know how to configure a model against existing database tables, in schemas other than public, so that I can interact with them through the Django admin console.
Steps for 2:
I also want the user to give input for only a few columns; for example, the time of creation should not be entered by the user. Rather, it should be taken care of automatically when the row is created or updated.
Thanks
In order for Django to "interact" with an existing database you need to create a model for it, which can be done automatically as shown here. This assumes that your "external" database isn't going to change often, because you'll have to keep your models in sync, which is tricky; there are other approaches if you need that.
As for working with multiple database schemas: is there a reason you can't put your POC table in the same database as the others? Django supports multiple databases, but it will be harder to set up. See here.
Finally, it sounds like you are interested in setting the Django default field attribute. For an example with the current time, see here.
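Putting those pieces together, here is a minimal sketch of what this can look like, assuming PostgreSQL with psycopg2 and a schema named schema1 as in the question. The database name, credentials, and the updated_at column are illustrative assumptions, and the model body is the kind of thing python manage.py inspectdb would generate for you to clean up.

# settings.py -- point the connection's search_path at the schema that holds
# the existing tables (schema1 comes from the question; the rest is assumed).
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "mydb",
        "USER": "myuser",
        "PASSWORD": "secret",
        "OPTIONS": {"options": "-c search_path=schema1,public"},
    }
}

# models.py -- an unmanaged model over the existing schema1.table1
from django.db import models

class Table1(models.Model):
    # Django assumes an integer primary key column named "id" unless you declare one.
    author = models.CharField(max_length=200)
    title = models.CharField(max_length=200)
    body = models.TextField()
    # Filled in automatically; the admin user never types these.
    timeofpost = models.DateTimeField(auto_now_add=True)  # set once when the row is created
    updated_at = models.DateTimeField(auto_now=True)      # refreshed on every save (assumed column)

    class Meta:
        managed = False        # Django will not try to create or migrate this table
        db_table = "table1"    # resolved inside schema1 via the search_path above

Fields declared with auto_now_add or auto_now are non-editable, so they don't appear as inputs in the admin form, which covers the second question: their values are set by Django when the row is saved.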

Warehousing records from a flat item table: Django Signals or PostgreSQL Triggers?

I have a Django website with a PostgreSQL database. There is a Django app and model for a 'flat' item table with many records being inserted regularly, up to millions of inserts per month. I would like to use these records to automatically populate a star schema of fact and dimension tables (initially also modeled in the Django models.py), in order to efficiently do complex queries on the records, and present data from them on the Django site.
Two main options keep coming up:
1) PostgreSQL Triggers: Configure the database directly to insert the appropriate rows into the fact and dimension tables, based on creation or update of a record, possibly using PL/pgSQL or PL/Python and row-level AFTER triggers. Pros: works with inputs outside Django; might be expected to be more efficient. Cons: splits business logic into another location; the triggered inserts may not be expected by other input sources.
2) Django Signals: Use the Signals feature to do the inserts upon creation or update of a record, with the built-in signal django.db.models.signals.post_save. Pros: easier to build and maintain. Cons: Have to repeat some code or stay inside the Django site/app environment to support new input sources.
Am I correct in thinking that Django's built-in signals are the way to go for maintaining the fact table and the dimension tables? Or is there some other, significant option that is being missed?
I ended up using Django Signals. With a flat table "item_record" (model ItemRecord) containing fields "title" and "description", the code in models.py looks like this:
from django.db.models.signals import post_save

def create_item_record_history(instance, created, **kwargs):
    if created:
        ItemRecordHistory.objects.create(
            title=instance.title,
            description=instance.description,
            created_at=instance.created_at,
        )

post_save.connect(create_item_record_history, sender=ItemRecord)
It is running well for my purposes. Although it's just creating an annotated flat table (new field "created_at"), the same method could be used to build out a star schema.
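For completeness, the star-schema version of the same idea might look something like the sketch below, in the same models.py. The DimItem and FactItemRecord models and their fields are hypothetical; the point is only that the post_save receiver can upsert the dimension row with get_or_create and then insert the fact row.

def populate_star_schema(instance, created, **kwargs):
    if not created:
        return
    # Upsert the dimension row for this item (DimItem is a hypothetical dimension model).
    dim_item, _ = DimItem.objects.get_or_create(
        title=instance.title,
        defaults={"description": instance.description},
    )
    # Insert one fact row per flat record (FactItemRecord is a hypothetical fact model).
    FactItemRecord.objects.create(
        item=dim_item,
        created_at=instance.created_at,
    )

post_save.connect(populate_star_schema, sender=ItemRecord)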

Is it possible to include a database view in a JPA/EclipseLink CriteriaQuery?

I am working on a project that needs to be able to create dynamic queries against an H2 database. This also includes full-text search using H2's built-in logic, tables, and triggers.
I have been trying to figure out how to add that full-text search to my CriteriaQuery, but I keep running into the roadblock that the tables it uses aren't entities in my model. I could add them as entities, but I don't want EclipseLink to create them automatically when a new database file is created, since there is a function in H2 that creates those tables and does the other necessary housekeeping.
I have tried creating a view that queries the full-text tables to give me the information I need in the format I need, but I keep running into the same problem: the view is not an entity.
Has anyone encountered this situation before and/or figured out a way around it?
Thanks!