How to fix circular importing? - django

It seems I have a circular import error, and I'm currently struggling to fix it. Does anyone know what I should do?
In my models.py, containing ReservedItem & Order:
from .utils import unique_order_reference_generator

def reserveditem_pre_save_receiver(sender, instance, **kwargs):
    if not instance.order_reference:
        instance.order_reference = unique_order_reference_generator()
In my utils.py
from lumis.utils import get_random_string

from .models import Order, ReservedItem

def unique_order_reference_generator():
    new_id = get_random_string(length=10)
    reserved_item = ReservedItem.objects.filter(
        order_reference=new_id
    ).exists()
    order = Order.objects.filter(order_reference=new_id).exists()
    if reserved_item or order:
        return unique_order_reference_generator()
    else:
        return new_id

You can import modules locally in the body of the function, so:
from lumis.utils import get_random_string

def unique_order_reference_generator():
    from .models import Order, ReservedItem

    new_id = get_random_string(length=10)
    reserved_item = ReservedItem.objects.filter(
        order_reference=new_id
    ).exists()
    order = Order.objects.filter(order_reference=new_id).exists()
    if reserved_item or order:
        return unique_order_reference_generator()
    else:
        return new_id
This means that the module is not loaded when Python loads the file, but when the function is actually called. As a result, we can load the unique_order_reference_generator function without having to load the module that actually depends on this function.
Note that, as @Alasdair says, signals are typically defined in a dedicated file (signals.py, for example) which should be loaded in the ready() function of the app. But regardless of how you structure the code, local imports are frequently used to avoid circular imports. A sketch of that layout follows.
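For illustration, a minimal version of that structure might look like this (the app name your_app is a placeholder, and the receiver is wired up with the @receiver decorator rather than connected in models.py):

# your_app/signals.py
from django.db.models.signals import pre_save
from django.dispatch import receiver

from .models import ReservedItem
from .utils import unique_order_reference_generator

@receiver(pre_save, sender=ReservedItem)
def reserveditem_pre_save_receiver(sender, instance, **kwargs):
    if not instance.order_reference:
        instance.order_reference = unique_order_reference_generator()

# your_app/apps.py
from django.apps import AppConfig

class YourAppConfig(AppConfig):
    name = 'your_app'

    def ready(self):
        # Importing here runs after the app registry is populated,
        # so the models/utils dependency no longer bites.
        from . import signals  # noqa: F401

By the time ready() runs, both models.py and utils.py can be imported safely.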

All the current suggestions are good. Move your signal handlers out of models. Models are prone to circular imports because they are used everywhere, so it is a good idea to keep only model code in models.py.
Personally, I don't like imports in the middle of the code (pylint flags them as import-outside-toplevel). Instead, I use the Django application API to load models without importing them:
from django.apps import apps

def signal_handler(instance, *args, **kwargs):
    Order = apps.get_model('your_app', 'Order')
    ...

Get access to class attributes

import yaml

class Import_Yaml_Setting():
    def __init__(self, path):
        self.read_yaml(path)

    def read_yaml(self, path):
        stream = open(path, 'r')
        self.settings = yaml.load(stream)
        stream.close()

class MasterDef(Import_Yaml_Setting):
    def __init__(self, path):
        Import_Yaml_Setting.__init__(self, path)

def function_1():
    path = 'path_to_settings\\yaml_file.yaml'
    MasterDef(path)

def function_2():
    MasterDef.settings

if __name__ == '__main__':
    function_1()
    function_2()
My plan is to have a class Import_Yaml_Setting which imports settings from a YAML file. The class MasterDef inherits from Import_Yaml_Setting.
function_1 then calls MasterDef in order to import the settings. I want to do this once in my program; afterwards, I just want to access the imported settings
without importing them again. That is what function_2 should do.
My problem
I don't know how to call MasterDef in the first place. If I created an instance of MasterDef, I wouldn't have access to that instance in function_2.
Also, I get an error that says MasterDef has no attribute settings.
What would be the right way to do this?
There are a few things incorrect, so let's start with the most obvious.
If you have a class MasterDef, calling MasterDef() creates an instance
of that class. If you don't assign that instance to a variable, it will
immediately disappear.
Doing MasterDef.settings later on could work if the class had a
class attribute or method called settings, but in that case you are not accessing
the settings attribute on an instance.
Typically, such global settings are passed around, implemented as a function object that
does the loading only once, or made into a global variable (as
shown in the following example). Simplified, you would do:
from __future__ import print_function, absolute_import, division, unicode_literals

class MasterDef(object):
    def __init__(self):
        self.settings = dict(some='setting')

master_def = None

def function_1():
    global master_def
    if master_def is None:
        master_def = MasterDef()

def function_2():
    print('master_def:', master_def.settings)

if __name__ == '__main__':
    function_1()
    function_2()
which gives:
master_def: {'some': 'setting'}
A few notes on the above:
If, for whatever reason, you are doing anything new on Python 2.7,
make things more Python 3 compatible by including the from
__future__ import as indicated, even if you are only using the
print function (instead of the outdated print statement). It
will make transitioning easier (2.7 goes EOL in 2020).
Again, on 2.7, make your base classes a subclass of object; that
makes it possible, e.g., to have properties.
By testing that master_def is None, you can invoke function_1 multiple
times.
You should also be aware that PyYAML's load(), as its documentation
states, can be unsafe when you don't have full control over
your input. There is seldom a need to use load(), so use safe_load(),
or upgrade to my ruamel.yaml package, which implements the newer YAML
1.2 standard (released in 2009, so there is no excuse for PyYAML
still not supporting it).
As you also seem to be on Windows (assumed from your use of \\), consider using raw strings
so you don't need to escape the backslash, or use os.path.join(). I am leaving
the path part out of my full example, as I am not on Windows:
from __future__ import print_function, absolute_import, division, unicode_literals

import ruamel.yaml

class Import_Yaml_Setting(object):
    def __init__(self, path):
        self._path = path  # stored in case you want to write out the configuration
        self.settings = self.read_yaml(path)

    def read_yaml(self, path):
        yaml = ruamel.yaml.YAML(typ='safe')
        with open(path, 'r') as stream:
            return yaml.load(stream)

class MasterDef(Import_Yaml_Setting):
    def __init__(self, path):
        Import_Yaml_Setting.__init__(self, path)

master_def = None

def function_1():
    global master_def
    path = 'yaml_file.yaml'
    if master_def is None:
        master_def = MasterDef(path)

def function_2():
    print('master_def:', master_def.settings)

if __name__ == '__main__':
    function_1()
    function_2()
If your YAML file looks like:
example: file
very: simple
the output of the above program will be:
master_def: {'example': 'file', 'very': 'simple'}

Avoiding circular imports in Django Models (Config class)

I've created a Configuration model in Django so that the site admin can change some settings on the fly; however, some of the models rely on these configurations. I'm using Django 2.0.2 and Python 3.6.4.
I created a config.py file in the same directory as models.py.
Let me paracode (paraphrase the code? The real Enum has many more options):
# models.py
from django.db import models

from .config import *

class Configuration(models.Model):
    starting_money = models.IntegerField(default=1000)

class Person(models.Model):
    funds = models.IntegerField(default=getConfig(ConfigData.STARTING_MONEY))

# config.py
from enum import Enum

from .models import Configuration

class ConfigData(Enum):
    STARTING_MONEY = 1

def getConfig(data):
    if not isinstance(data, ConfigData):
        raise TypeError(f"{data} is not a valid configuration type")
    try:
        config, _ = Configuration.objects.get_or_create()
    except Configuration.MultipleObjectsReturned:
        # Cleans the database in case multiple configurations exist.
        Configuration.objects.exclude(pk=Configuration.objects.first().pk).delete()
        return getConfig(data)
    if data is ConfigData.MAXIMUM_STAKE:
        return config.max_stake
How can I do this without an import error? I've tried absolute imports.
You can postpone loading models.py by importing it inside the getConfig(data) function; as a result, we no longer need models.py at the time config.py is loaded:
# config.py (no model import at the top)
from enum import Enum

class ConfigData(Enum):
    STARTING_MONEY = 1

def getConfig(data):
    from .models import Configuration

    if not isinstance(data, ConfigData):
        raise TypeError(f"{data} is not a valid configuration type")
    try:
        config, _ = Configuration.objects.get_or_create()
    except Configuration.MultipleObjectsReturned:
        # Cleans the database in case multiple configurations exist.
        Configuration.objects.exclude(pk=Configuration.objects.first().pk).delete()
        return getConfig(data)
    if data is ConfigData.MAXIMUM_STAKE:
        return config.max_stake
We thus do not load models.py when config.py is loaded. We only check whether it is loaded (and load it if not) when we actually execute the getConfig function, which happens later in the process.
Willem Van Onsem's solution is a good one. I have a different approach, which I have used for circular model dependencies, using Django's application registry. I post it here as an alternate solution, in part because I'd like feedback from more experienced Python coders as to whether there are problems with this approach.
In a utility module, define the following method:
from django.apps import apps as django_apps

def model_by_name(app_name, model_name):
    return django_apps.get_app_config(app_name).get_model(model_name)
Then in your getConfig, omit the import and replace the line
config, _ = Configuration.objects.get_or_create()
with the following:
config_class = model_by_name(APP_NAME, 'Configuration')
config, _ = config_class.objects.get_or_create()

python problems with super

OK, so I'm having a bit of a problem with the code below. It works as is, but I run into trouble if I try to change the part with the comment (about me not being able to get super to work correctly) to
pipeline_class_call = super(Error_Popup, self)
broken_file_w_whats_wrong = pipeline_class_call.whats_wrong_with_file()
or to
broken_file_w_whats_wrong = super(Error_Popup, self).whats_wrong_with_file()
and change
class Error_Popup(QtGui.QDialog):
to
class Error_Popup(QtGui.QDialog, Pipeline_UI):
I get the following error:
# TypeError: object of type 'instancemethod' has no len() #
which normally means that I need to call the method. But doesn't super handle all of this for me, or am I goofing this up?
from PySide import QtCore, QtGui
from shiboken import wrapInstance
import pymel.core as pm
import maya.OpenMayaUI as omui
from UI.UI import Pipeline_UI

def something_bad_happened_window():
    sbh_pointer = omui.MQtUtil.mainWindow()
    return wrapInstance(long(sbh_pointer), QtGui.QWidget)

class Error_Popup(QtGui.QDialog):
    def __init__(self, parent=something_bad_happened_window()):
        super(Error_Popup, self).__init__(parent)
        self.setWindowTitle('Something Bad Happened!')
        self.setWindowFlags(QtCore.Qt.Tool)
        self.popup_layout()
        self.setAttribute(QtCore.Qt.WA_DeleteOnClose)
        self.connections()

    def popup_layout(self):
        self.file_description = QtGui.QListWidget()
        # can't seem to get super to work appropriately... booo
        pipeline_class_call = Pipeline_UI()
        broken_file_w_whats_wrong = pipeline_class_call.whats_wrong_with_file()
        for display in range(0, len(broken_file_w_whats_wrong)):
            broken_list = QtGui.QListWidgetItem()
            if display % 2 == 0:
                broken_list.setText(broken_file_w_whats_wrong[display][0])
                broken_list.asset = broken_file_w_whats_wrong[display][1]
            else:
                broken_list.setText(" " + broken_file_w_whats_wrong[display][0])
            self.file_description.addItem(broken_file_w_whats_wrong[display])
        self.import_button = QtGui.QPushButton('Import Replacement(s)')
        error_layout = QtGui.QVBoxLayout()
        error_layout.setContentsMargins(2, 2, 2, 2)
        error_layout.setSpacing(2)
        error_layout.addWidget(self.file_description)
        error_layout.addWidget(self.import_button)
        error_layout.addStretch()
        self.setLayout(error_layout)

    def connections(self):
        self.import_button.clicked.connect(Error_Popup.make_sphere)

    @classmethod
    def make_sphere(cls):
        pm.polySphere()

def show_window():
    ui = Error_Popup()
    if __name__ == '__main__':
        try:
            ui.close()
        except:
            pass
        ui.show()

show_window()
Thanks in advance, everyone.
Looks to me like it's a problem of using super with multiple inheritance: super picks one of the parents to use, following the class's method resolution order (MRO). For example, super(Error_Popup, self).__init__(parent) only calls one of the parents' __init__ methods; you have to manually call all of them.
When calling methods or accessing variables, you have to be specific about which parent you want to use, or super will pick for you. See this answer and this answer. A minimal illustration follows.
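To make the lookup order concrete, here is a minimal sketch (plain Python, independent of the Qt classes above):

class Base(object):
    def __init__(self):
        print('Base.__init__')

class A(Base):
    def __init__(self):
        print('A.__init__')
        super(A, self).__init__()

class B(Base):
    def __init__(self):
        print('B.__init__')
        super(B, self).__init__()

class C(A, B):
    def __init__(self):
        # super(C, self) resolves to A first; the chain only reaches B
        # and Base because A and B also call super (cooperative style).
        super(C, self).__init__()

print(C.__mro__)  # (C, A, B, Base, object)
C()               # prints A.__init__, B.__init__, Base.__init__

If A did not call super in its __init__, B.__init__ would never run, which is why mixing in classes that don't cooperate (as many Qt classes don't) forces you to call each parent's method explicitly.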

Load static data on django startup using AppConfig ready method in Django 1.7

I have some static location data to load so that it is available throughout the application, like an in-memory cache.
I tried to override ready() on AppConfig, but the data isn't loaded from the database; also, ready() is getting called twice.
from django.apps import AppConfig

class WebConfig(AppConfig):
    name = 'useraccount'
    verbose_name = 'User Accounts'
    locations = []

    def ready(self):
        print("Initialising...")
        location = self.get_model('Location')
        all_locations = location.objects.all()
        print(all_locations.count())
        self.locations = list(all_locations)
Any hints?
Well, the docs (https://docs.djangoproject.com/en/1.7/ref/applications/#django.apps.AppConfig.ready) tell you to avoid using database calls in the ready() function, and also that it may be called twice.
Avoiding the double-call is easy:
def ready(self):
    if hasattr(self, 'ready_run'):
        return
    self.ready_run = True
    ...
But I'm still trying to find the right way to do database-based initialization, too. I'll update if I find anything.
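In the meantime, one common workaround (a sketch only; the module name location_cache is a placeholder) is to defer the query until the data is first needed, well after the app registry and database are ready:

# useraccount/location_cache.py
_locations = None

def get_locations():
    # Load the Location rows on first access and cache them in memory.
    global _locations
    if _locations is None:
        from django.apps import apps
        Location = apps.get_model('useraccount', 'Location')
        _locations = list(Location.objects.all())
    return _locations

Views and other request-time code can then call get_locations() freely; only the first call touches the database.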
To load some static data in an app, create a separate file with a function that fetches the data:
# file /app_util.py
def get_country():
    if Student.objects.all().count() == 0:
        ...  # your code
    else:
        ...  # your code
Import app_util and call it from urls.py:
# file /urls.py
admin.autodiscover()

urlpatterns = patterns('equity_funds_investor_app',
    # Examples:
    url(r'^$', 'views.index'),
)

# make a call to the save/get method
app_util.get_country()
Note: you can follow the same process when you want to save or fetch some data at the start of your app.
The urls.py file is processed only once, when you make the first request after runserver,
and your custom function(s) get called then.

Get comments for object using one query

Is it possible to get an object together with the comments related to it? Right now the Django comments framework creates a query for every object that has related comments, and further queries for the comments' owners. Can I somehow avoid this? I use Django 1.4, so prefetch_related is allowed.
You could create a function that caches the count:
from django.conf import settings
from django.contrib.contenttypes.models import ContentType
from django.contrib import comments
from django.core.cache import cache

def get_comment_count_key(model):
    content_type = ContentType.objects.get_for_model(model)
    return 'comment_count_%s_%s' % (content_type.pk, model.pk)

def get_comment_count(model):
    key = get_comment_count_key(model)
    value = cache.get(key)
    if value is None:
        value = comments.get_model().objects.filter(
            content_type=ContentType.objects.get_for_model(model),
            object_pk=model.pk,
            site__pk=settings.SITE_ID
        ).count()
        cache.set(key, value)
    return value
You could extend the Comment model and add get_comment_count there, or expose get_comment_count as a template filter; it doesn't matter. A filter might look like the sketch below.
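(A sketch of the template-filter variant; the module name comment_tags and the import path for the helper are placeholders:)

# yourapp/templatetags/comment_tags.py
from django import template

from yourapp.comment_utils import get_comment_count  # wherever the helper above lives

register = template.Library()

@register.filter
def comment_count(obj):
    # Usage in a template: {{ object|comment_count }}
    return get_comment_count(obj)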
Of course, you would also need cache invalidation when a new comment is posted:
from django.db.models.signals import post_save, post_delete
from django.contrib import comments

def refresh_comment_count(sender, instance, **kwargs):
    cache.delete(get_comment_count_key(instance.content_object))
    get_comment_count(instance.content_object)

post_save.connect(refresh_comment_count, sender=comments.get_model())
post_delete.connect(refresh_comment_count, sender=comments.get_model())
You could improve this last snippet by using cache.incr() on comment_was_posted and cache.decr() on post_delete, but that's left as an exercise for you :)
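(For the curious, one possible take on that exercise; comment_was_posted is the comments framework's signal, the rest is a sketch:)

from django.contrib.comments.signals import comment_was_posted

def increment_comment_count(sender, comment, **kwargs):
    key = get_comment_count_key(comment.content_object)
    try:
        cache.incr(key)
    except ValueError:
        # Key not in the cache yet: fall back to a full recount.
        get_comment_count(comment.content_object)

comment_was_posted.connect(increment_comment_count)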