Where to put business logic in Django

For example, Account 1--> *User --> 1 Authentication
1 account has multiple users and each user will have 1 authentication
I come from a Java background, so what I usually do is:
define these classes as Java beans (i.e. just getters and setters, no logic attached)
create an AccountManager EJB class and define a create_account method (which takes one account and a list of users)
prepare the data in the web layer, then pass it into the AccountManager EJB, for instance:
accountManager.createAccount(account, userList)
But in Django, the framework advocates putting domain logic into model classes (row level) or their associated manager classes (table level), which makes things a bit awkward. It is fine if your logic only involves one table, but in a real application each step usually involves several tables or even databases, so what should I do in that case?
Put the logic into the view? I don't think that is good practice at all. Or override the save method in the model class, passing in extra data via **kwargs? Then the backend will break.
I hope this illustrates my confusion about where business logic should be placed in a Django application.

Not sure if you've read the section on Managers in the Django docs; it seems to address your situation. Assuming you have the following Account model defined (User is the built-in model):
# accounts/models.py
from django.db import models

class AccountManager(models.Manager):
    def create_account(self, account, user_list):
        ...

class Account(models.Model):
    objects = AccountManager()
Feel free to move your manager code into a separate file if it gets too big. In your views:
# views.py
from accounts.models import Account
Account.objects.create_account(account, user_list)
The business logic is still in the models.
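To make the multi-table case concrete, here is a minimal sketch of what create_account might do across the Account and User tables; the name field and the Membership through-model are my own assumptions, not part of the question:
# accounts/models.py -- a sketch; the name field and Membership model
# are assumptions, not part of the original question
from django.contrib.auth.models import User
from django.db import models, transaction


class AccountManager(models.Manager):
    def create_account(self, account_name, user_list):
        # one transaction, so a failure rolls back the Account row
        # together with all of its membership rows
        with transaction.atomic():
            account = self.create(name=account_name)
            for user in user_list:
                Membership.objects.create(account=account, user=user)
        return account


class Account(models.Model):
    name = models.CharField(max_length=100)
    objects = AccountManager()


class Membership(models.Model):
    account = models.ForeignKey(Account, on_delete=models.CASCADE)
    user = models.ForeignKey(User, on_delete=models.CASCADE)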
EDIT
The keyword here is override, not overwrite. If you override the model's save method, keep in mind that any create or update operation from within your web app and the admin will use this new behaviour. If you only want that business logic to happen in one specific view, it might be best to keep it out of save.
I guess you can put your business logic in its own regular class. You will have to instantiate that class each time you need to run your business logic. Alternatively, you can expose the business logic as a static method on this new class if you want to skip the OOP approach.
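A minimal sketch of that plain-class idea; the AccountService name and method are placeholders of mine, not from the answer:
# accounts/services.py -- plain Python, no Django base class involved
class AccountService:
    @staticmethod
    def create_account(account, user_list):
        # validate, save across several models, send emails, etc.
        account.save()
        for user in user_list:
            user.account = account  # assumes a ForeignKey from the user profile to Account
            user.save()
        return account

# in a view:
# AccountService.create_account(account, user_list)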

What I do is give most of my apps a service.py file with the business logic. The reason is that if I need a function that works with models from multiple apps, I simply cannot do this:
# shop/models.py
from auth.models import User, Team
This code would get stuck in a circular import loop.
But I will most likely move from service.py to a service app with a structure like this (a sketch of one such module follows the listing):
service/
    auth.py
    customers.py
    another_set_of_functions.py
    ...
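A minimal sketch of what one of those modules might contain; the Order model and the function are hypothetical, but only the service module imports from both apps, so neither models.py has to import the other:
# service/customers.py
from auth.models import Team, User
from shop.models import Order


def close_team_orders(team):
    # cross-app logic: touches auth and shop models in one place
    return Order.objects.filter(team=team, status='open').update(status='closed')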

Well, the example you illustrated above with accountManager.createAccount(account, userList) seems like something that could easily be done by adding a createAccount method to the account model. If you feel the need to take business logic outside of the model, though, you could always create a new module that hosts your business logic, then import and use it in your views.

Related

Good Django design practice to add a REST api later following DRY

I am starting a web application in pure Django. However, in the future there might be a requirement for a REST API. If that happens, the most obvious choice will be Django REST framework.
Both the "old-fashioned" and REST parts share the models; however, the views are slightly different (permission definitions, for example) and forms are replaced with serializers. Doing it the most obvious way would mean duplicating the application logic several times, failing the DRY principle and making the code unmaintainable.
One idea is to write all the logic into the models (since they are shared), but in that case permission mixins and generic views go unused, and the code would not be among the nicest.
Now I have run out of ideas. What is the best practice here?
I'd try to keep things simple, since you're not sure about the future requirements for the API, and guessing can introduce extra complexity that may turn out to be unnecessary once the requirements are clear.
Both Django forms and Rest Framework serializers already offer you a declarative approach that abstracts away the boilerplate code needed for basic stuff, which normally accounts for most of your code anyway.
For example, one of your Django forms could look like this:
class ArticleForm(ModelForm):
    class Meta:
        model = Article
        fields = ['title', 'content']
And in the future the DRF serializer would be:
class ArticleSerializer(ModelSerializer):
    class Meta:
        model = Article
        fields = ['title', 'content']
As you can see, if you try and stick to ModelForm and ModelSerializer, there won't be much duplication anyway. You can also simply store the fields list in a variable and just reuse that.
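For example, a small sketch of reusing the same field list in both places:
from django.forms import ModelForm
from rest_framework.serializers import ModelSerializer

from .models import Article

# one shared definition of the exposed fields
ARTICLE_FIELDS = ['title', 'content']

class ArticleForm(ModelForm):
    class Meta:
        model = Article
        fields = ARTICLE_FIELDS

class ArticleSerializer(ModelSerializer):
    class Meta:
        model = Article
        fields = ARTICLE_FIELDS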
For more custom things, you can start by extracting shared logic into plain functions, for example:
def save_article_with_author(article_data, author_data):
    # custom data manipulation before saving; note that article_data will be
    # a dictionary whether it comes from deserialized JSON (API) or POST data
    # send email, whatever
    ...
This function can be shared between your form and serializer.
For everything related to data fetching, I'd try to use Model Managers as much as possible, defining custom querysets that can be reused, e.g. for options, by forms and serializers.
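A sketch of that last point; the published flag and the visible_to rule below are assumptions of mine:
# articles/models.py
from django.db import models


class ArticleQuerySet(models.QuerySet):
    def visible_to(self, user):
        # the one place that knows which articles a user may see
        if user.is_staff:
            return self
        return self.filter(published=True)


class Article(models.Model):
    title = models.CharField(max_length=200)
    content = models.TextField()
    published = models.BooleanField(default=False)

    objects = ArticleQuerySet.as_manager()

# both a classic view and a DRF view can then reuse:
# Article.objects.visible_to(request.user)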
I tend to avoid writing any logic that doesn't directly read or write data into the model classes. I think that couples too much the business logic with the data layer. As an example, I never want to write any auth/permission checks into a save() method of a model, because that couples different layers too tightly.
As a rule of thumb, imagine this scenario: you add, say, permission checks or the logic to send an email on creation by overriding the save() method of your Article model.
Then, later on, you're asked to write a simple management command that batch-imports articles from a spreadsheet. At this point, what you did in your save() method really gets in the way: you just want to write your data through the model freely, without having to bother with permissions, emails and all of that.
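To make that concrete, a sketch of such a command, assuming the Article model from the form example above and a CSV with title and content columns; because save() stays free of emails and permission checks, the import is just plain model access:
# articles/management/commands/import_articles.py
import csv

from django.core.management.base import BaseCommand

from articles.models import Article


class Command(BaseCommand):
    help = 'Batch-import articles from a CSV file'

    def add_arguments(self, parser):
        parser.add_argument('csv_path')

    def handle(self, *args, **options):
        with open(options['csv_path']) as fh:
            for row in csv.DictReader(fh):
                # plain data access: no permission checks, no emails fired
                Article.objects.create(title=row['title'], content=row['content'])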
Regarding the view layer and assuming you need to implement some shared auth/permission checks and you don't want to have separate views, you can use this approach:
https://www.django-rest-framework.org/topics/html-and-forms/
REST framework is suitable for returning both API style responses, and regular HTML pages. Additionally, serializers can be used as HTML forms and rendered in templates.
Here are some guidelines on how you can dynamically switch from HTML to JSON based on the request content type:
https://www.django-rest-framework.org/api-guide/renderers/#advanced-renderer-usage
This seems like a good option in your situation; I'd just write a quick proof-of-concept before going all in, to make sure you're not too limited for what you need to do.

App Design in Django

I have a general algorithm design question. I am creating a Django app that will connect to an API, but I won't be storing these results (at least not at first). After I retrieve the data from the API I manipulate it accordingly and have already created a class with numerous methods to do this.
Should the programming logic for this live in the model or the view in the Django framework? Is one more sustainable than the other (e.g. if in a few months I decide to store the information)? Also, is it best to encapsulate my class in its own file and then import it into the model/view?
Thanks!
Rob
If you don't need to store data, don't use Django's built-in models.
Write views and import your own modules/classes.
Bonus: if your views share a lot of logic (probably related to request/response handling), use class-based views and write a common MyBaseView.
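A minimal sketch of that bonus suggestion; the API-client module and names are placeholders:
# views.py
from django.views.generic import TemplateView

from .api_client import ApiClient  # hypothetical module wrapping the external API


class MyBaseView(TemplateView):
    # shared request/response handling for every API-backed page
    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        context['results'] = ApiClient().fetch()  # hypothetical call
        return context


class DashboardView(MyBaseView):
    template_name = 'dashboard.html'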

Django design patterns for overriding models

I'm working on an e-commerce framework for Django. The chief design goal is to provide the bare minimum functionality in terms of models and views, instead allowing users of the library to extend or replace the components with their own.
The reasoning for this is that trying to develop a one-size-fits-all solution to e-commerce leads to overcomplicated code which is often far from optimal.
One approach to tackling this seems to be using inversion-of-control, either through Django's settings file or import hacks, but I've come up against a bit of a problem due to how Django registers its models.
The e-commerce framework provides a bunch of abstract models, as well as concrete versions in {app_label}/models.py. Views make use of Django's get_model(app_label, model) function to return the model class without having to hard-code the reference.
This approach has some problems:
Users have to mimic the structure of the framework's apps, i.e. the app_label, and effectively replace our version of the app with their own.
Because of the way the admin site works, by looking for admin.py in each installed app, they have to mimic or explicitly import the framework's admin classes in order to use them. But by importing them, the register method gets called, so they have to be unregistered if a user wants to customise them.
The user has to be extremely careful about how they import concrete models from the core framework. This is because Django's base model metaclass automatically registers a model with the app cache as soon as the class definition is read (ie upon __new__), and the first model registered with a specific label is the one you're stuck with. So you have to define all your override models BEFORE you import any of the core models. This means you end up with messy situations of having a bunch of imports at the bottom of your modules rather than the top.
My thinking is to go further down the inversion-of-control rabbit hole:
All references to core components (models, views, admin, etc) replaced with calls to an IoC container
For all the core (e-commerce framework) models, replace the use of Django's base model metaclass with one that doesn't automatically register the models, then have the container explicitly register them on startup.
My question:
Is there a better way to solve this problem? The goal is to make it easy to customise the framework and override functionality without having to learn lots of annoying tricks. The key seems to be with models and the admin site.
I appreciate that using an IoC container isn't a common pattern in the Django world, so I want to avoid it if possible, but it is seeming like the right solution.
Did you look at the code from other projects with a similar approach?
Not sure if this approach covers your needs, but IMO the code of django-shop is worth looking at.
This framework provides the basic logic, allowing you to provide custom logic where needed.
customize via models
e.g. see productmodel.py:
#==============================================================================
# Extensibility
#==============================================================================
PRODUCT_MODEL = getattr(settings, 'SHOP_PRODUCT_MODEL',
                        'shop.models.defaults.product.Product')
Product = load_class(PRODUCT_MODEL, 'SHOP_PRODUCT_MODEL')
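If you want the same trick in your own code, you can build it on Django's import_string helper; this is a sketch, not django-shop's actual load_class implementation:
from django.conf import settings
from django.utils.module_loading import import_string

# resolve the dotted path from settings, falling back to the default class
PRODUCT_MODEL_PATH = getattr(settings, 'SHOP_PRODUCT_MODEL',
                             'shop.models.defaults.product.Product')
Product = import_string(PRODUCT_MODEL_PATH)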
customize via logic/urls
e.g. see the shop's simplevariation plugin.
It extends the cart logic, so it hooks in via the URL patterns:
(r'^shop/cart/', include(simplevariations_urls)),
(r'^shop/', include(shop_urls)),
and extends the views:
...
from shop.views.cart import CartDetails
class SimplevariationCartDetails(CartDetails):
    """Cart view that answers GET and POST requests."""
    ...
The framework provides several hook-in points; the simplevariation plugin mentioned above additionally provides a cart modifier:
SHOP_CART_MODIFIERS = [
    ...
    'shop_simplevariations.cart_modifier.ProductOptionsModifier',
    ...
]
I worry that this explanation is not very clear; it is difficult to summarize the concept briefly. But take a look at the django-shop project and some of its extensions in the ecosystem.

How should multiple Django apps communicate with each other?

Other posters have previously said in this forum that when your Django app starts getting big and unmanageable, you should split it up into several apps. I'm at that point now. What are the best practices for allowing communication between these apps?
One of my apps (let's call it Processor) processes very large data sets. Once an hour it produces a small amount of new data for the other app. This other app (let's call it Presenter) displays the data to users.
How should Processor hand new data to Presenter? Should it simply import parts of Presenter's model, so it can create and save records in Presenter's database? That seems like tight coupling to me. Or should it pass the data by calling a function in Presenter? Or put the data in some sort of data store that both Processor and Presenter know about?
How do you all usually solve this problem?
/Martin
I would definitely go for importing the Processor's models in the Presenter app. That's how, for instance, you add extra user info: you have a UserPreferences model with a ForeignKey to django.contrib.auth.models.User. You might have less of a bad feeling doing that than coupling your own two apps, because django.contrib is the "standard library", but nevertheless, it is direct coupling.
If your applications are coupled, then your code should be coupled to reflect this. This follows the idea that explicit is better than implicit, right?
If, however, you're designing something a tad more generic (i.e. you're going to use multiple Presenter app instances for Processors of different kinds), you can store the specific models as a setting:
# settings.py
import processor_x.models
PRESENTER_PROCESSOR_MODELS = processor_x.models
Then, in your Presenter models:
from django.conf import settings
from django.db import models

class Presenter(models.Model):
    processor = models.ForeignKey(settings.PRESENTER_PROCESSOR_MODELS)
Caveat: I have never attempted this, but I don't recall a limitation on settings to be only strings, tuples or lists!
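One way around that caveat is to keep the setting as a plain "app_label.ModelName" string, which ForeignKey accepts directly (the same mechanism AUTH_USER_MODEL uses); a sketch, assuming a hypothetical processor_x.Processor model:
# settings.py
PRESENTER_PROCESSOR_MODEL = 'processor_x.Processor'

# presenter/models.py
from django.conf import settings
from django.db import models

class Presenter(models.Model):
    # ForeignKey resolves the "app_label.ModelName" string lazily
    processor = models.ForeignKey(settings.PRESENTER_PROCESSOR_MODEL,
                                  on_delete=models.CASCADE)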

Django: How to dynamically add tag field to third party apps without touching app's source code

Scenario: large project with many third party apps. Want to add tagging to those apps without having to modify the apps' source.
My first thought was to specify a list of models in settings.py (like ['appname.modelname',]) and call django-tagging's register function on each of them. The register function adds a TagField and a custom manager to the specified model. The problem with that approach is that the function needs to run BEFORE the DB schema is generated.
I tried running the register function directly in settings.py, but I need django.db.models.get_model to get the actual model reference from only a string, and I can't seem to import that from settings.py - no matter what I try I get an ImportError. The tagging.register function imports OK however.
So I changed tactics and wrote a custom management command in an otherwise empty app. The problem there is that the only signal which hooks into syncdb is post_syncdb, which is useless to me since it fires after the DB schema has been generated.
The only other approach I can think of at the moment is to generate and run a South-style database schema migration. This seems more like a hack than a solution.
This seems like it should be a pretty common need, but I haven't been able to find a clean solution.
So my question is: is it possible to dynamically add fields to a model BEFORE the schema is generated, and more specifically, is it possible to add tagging to a third-party model without editing its source?
To clarify, I know it is possible to create and store Tags without having a TagField on the model, but there is a major flaw in that approach: it is difficult to simultaneously create and tag a new model instance.
From the docs:
You don't have to register your models in order to use them with the tagging application - many of the features added by registration are just convenience wrappers around the tagging API provided by the Tag and TaggedItem models and their managers, as documented further below.
Take a look at the API documentation and the examples that follow for how you can add tags to any arbitrary object in the system.
http://api.rst2a.com/1.0/rst2/html?uri=http://django-tagging.googlecode.com/svn/trunk/docs/overview.txt#tags
Updated
# views.py
def tag_model_view(request, model_id):
    instance_to_tag = SomeModel.objects.get(pk=model_id)
    setattr(instance_to_tag, 'tags_for_instance', request.POST['tags'])
    ...
    instance_to_tag.save()
    # ... returns a response
# models.py
# this is the post_save signal receiver
def tagging_post_save_handler(sender, instance, created, **kwargs):
    if hasattr(instance, 'tags_for_instance'):
        Tag.objects.update_tags(instance, instance.tags_for_instance)
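For completeness, the receiver still needs to be connected to post_save somewhere that runs at startup; a minimal sketch, assuming SomeModel is the third-party model being tagged:
# e.g. in your own app's AppConfig.ready()
from django.db.models.signals import post_save

from thirdparty.models import SomeModel  # the untouched third-party model

post_save.connect(tagging_post_save_handler, sender=SomeModel)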