Questions about Heroku and Django settings - django

I have tried many, many times to get my apps running on Heroku without success. The fact that they run locally seems to be totally irrelevant. Clearly I don’t understand how this works.
So … here are a couple of questions, which I've decided to group together in a single question on SO:
1. Why does Heroku have so many different places that influence the settings (Procfile, wsgi.py, config vars, and settings.py)?
2. How do they relate to one another?
3. Which has precedence?
4. Do they all have to be exactly the same?
5. What am I supposed to do with / how am I supposed to configure the database settings given in the Heroku Django template?
# Parse database configuration from $DATABASE_URL
DATABASES['default'] = dj_database_url.config()

# Enable Connection Pooling (if desired)
DATABASES['default']['ENGINE'] = 'django_postgrespool'
a) Doesn't the DATABASES['default']['ENGINE'] line, coming after the DATABASES['default'] line, overwrite DATABASES['default']?
b) Why aren’t these two in the same format as the default Django settings, which is a simple dictionary, instead of all these extra and confusing brackets?
c) Are they supposed to be treated as two different settings, so that you have to use database routers if you want both?
d) Why does the devcenter article say to import postgrespool but the template says nothing about that?
e) Why is ‘default’ optional with dj_database_url but mandatory with Django?
f) When I tried commenting out line 82, I got an error about resetting queries, why?
g) I have the Postgres string from my config vars as the argument to dj_database_url, but I get a NameError, database undefined. Why?

You're making this much much much more difficult than it actually is. All dj_database_url does is use an environment variable to create a dictionary suitable for use in the DATABASES setting. It doesn't do anything else.
The devcenter article you link to mentions postgrespool as a way of increasing concurrency. It doesn't say or even hint that you need it when you're starting out. There is no reason for you to even be reading that article at this point.
The only article you need to read is the Getting started with Django one, which tells you exactly what to do.
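To make the answer concrete, here is a minimal sketch of how dj_database_url is typically used in a Heroku-style settings.py (the fallback URL is just an example for local development):
import dj_database_url

DATABASES = {
    # On Heroku, DATABASE_URL is set as a config var; locally, the default kwarg
    # (an example URL here) is used instead
    'default': dj_database_url.config(default='postgres://localhost/mydb'),
}
The connection-pooling line from the template then just replaces the ENGINE key inside that same dictionary, which is why it only matters if you actually install django-postgrespool.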

Related

Django doesn't read from database – no error

I just set up the environment for an existing Django project, on a new Mac. I know for certain there is nothing wrong with the code itself (just cloned the repo), but for some reason, Django can't seem to retrieve data from the database.
I know the correct tables and data are in the db.
I know the codebase is as it should be.
I can make queries using the Django shell.
Django doesn't throw any errors despite the data missing on the web page.
I realize that it's hard to debug this without further information, but I would really appreciate a finger pointing me in the right direction. I can't seem to find any useful logs.
EDIT:
I just realized the problem lies elsewhere. Unfortunately I can't delete this post with the bounty still open.
Without seeing any code, I can only suggest some general advice that might help you debug your problem. Please add a link to your repository if you can or some snippets of your database settings, the view which includes the database queries etc...
Debugging the view
The first thing I would recommend is using the Python debugger inside the view which queries the database. If you've not used pdb before, it's a life saver: it allows you to set breakpoints in your Python script and then interactively execute code at that point.
# inside the view, just after the database query:
import pdb; pdb.set_trace()
# when the breakpoint hits, inspect the results of your queries interactively
If you are using the Django ORM, the QuerySet returned from the query should have all the data you expect.
If it doesn't then you need to look into your database configuration in settings.py.
If it does, then you might not be passing that object to the template. Unlikely, as you said the code is the same, but double-check the context you pass when building your HttpResponse.
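For example (the app, model and template names here are hypothetical), a view that queries the database and hands the result to the template might look like this, and is the natural place to drop the pdb breakpoint:
from django.shortcuts import render
from myapp.models import Article   # hypothetical app and model

def article_list(request):
    articles = Article.objects.all()   # inspect this QuerySet under pdb
    return render(request, 'articles/list.html', {'articles': articles})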
Debugging the database settings
If you can query the database from the Django shell using the project settings in settings.py, it sounds unlikely that there is a problem here - but like everything, double check.
You said that you've set up the project on a new Mac. Was it on a different operating system before? Maybe there is a problem with the paths now - to keep your project platform independent, remember to use os.path.join() when building file paths.
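As a sketch, assuming settings.py sits one directory below the project root:
import os

# build paths relative to the project root rather than hard-coding them
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
STATICFILES_DIRS = [os.path.join(BASE_DIR, 'static')]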
And what about the username and password details....
Debugging the template
Maybe your template is referencing the wrong object variable name or object attribute. You mentioned that
Django doesn't throw any errors despite the data missing on the web page.
This doesn't really tell us much - to quote the Django docs -
If you use a variable that doesn’t exist, the template system will insert the value of the TEMPLATE_STRING_IF_INVALID setting, which is set to '' (the empty string) by default.
So to check all the variables available to your template, you could use the debug template tag:
{% debug %}
Probably even better, though, is to use django-debug-toolbar - this will also let you examine the SQL queries your view is making.
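If you go the toolbar route, the usual setup is roughly the following; treat it as a sketch and check the toolbar's docs for your Django version (newer releases also need an entry in urls.py):
# settings.py (assumes INSTALLED_APPS and MIDDLEWARE_CLASSES are tuples)
INSTALLED_APPS += ('debug_toolbar',)
MIDDLEWARE_CLASSES += ('debug_toolbar.middleware.DebugToolbarMiddleware',)
INTERNAL_IPS = ('127.0.0.1',)   # the toolbar only shows up for these addresses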
Missing Modules
I would expect this to raise an exception if this were the problem, but have you checked that you have the psycopg module on your new machine?
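A quick way to rule that out from the Django shell, assuming PostgreSQL with the psycopg2 driver:
>>> import psycopg2   # an ImportError here means the driver is missing on the new machine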

django - settings.py seems to load multiple times?

EDIT I miscounted - it is printed out twice, not four times.
I put this in my settings.py
print 'ola!'
and on startup "ola" is printed out twice! This seems like something is wrong with my PyCharm Django project ... any ideas why this would happen? It isn't in a loop or anything (that I'm aware of, anyway)
cheers!
YAY The user known only as "rohit", as per the comments, has determined that a solution can be found here: https://stackoverflow.com/a/2110584/1061426 ~ see the comment about disabling reloading.
CAUTION I don't have my Django code up and about now, so I don't know what the --noreload flag will do. Good luck, soldiers.
If you print out the thread ID within settings.py, you will see that settings.py is actually being loaded on two different threads.
See this stackoverflow response and this article for more info.
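For example, a throwaway line like this in settings.py (Python 2 print, matching the question) makes the double load visible:
import threading
print 'settings.py loaded in thread: %s' % threading.current_thread().name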
Actually, what Django does is put a wrapper around settings. It is basically an object (the settings object, if you like) which gives you attribute access like settings.WHATEVER, so it appears as if you were accessing the global variables in settings.py directly.
I really don't remember, though, why the import happens twice. I looked into it once when I worked on django-dynamic-settings, which uses a very similar approach to Django itself.
Anyway, if you are interested in the "magic" you can follow the flow starting from the execute_from_command_line call in manage.py.
Django does some strange things with settings.py, and it will execute more than once. I'm used to seeing it imported twice, not sure why in PyCharm you're getting four times. You have to be careful with statements with side-effects in settings.py.
A closely-related question has been asked at least twice since. I can add that a Django core developer rejected the idea that this is any sort of Django bug; it's normal behavior.
Also see this from Graham Dumpleton.

Ways of maintaining separate settings for different environments?

My Django project has two environments - development and test. Today I carelessly overwrote the settings.py in test with the one in development. It took me a while to correct the settings in test, which could have been avoided if I had a good way to maintain the two sets of settings separately.
I was thinking to keep two separate copies of settings.py and rename/move them when needed. This, however, is kinda caveman approach. Are there any smarter ways of dealing with this problem?
At the end of your settings.py file, add this:
try:
    from settings_dev import *
except ImportError:
    pass
Where settings_dev.py contains the dev settings. In your production env, do not push settings_dev.py (just ignore it in .gitignore or your source code versioning system).
So, whenever settings_dev.py is present, the values in settings.py will be overridden by those in settings_dev.py.
One more approach by setting the environment variable:
import os

if os.environ.get('DEVELOPMENT', None):
    from settings_dev import *
mentioned here: Django settings.py for development and production
I prefer the first one, it's simple and just works.
Split your settings as documented here:
https://code.djangoproject.com/wiki/SplitSettings#SimplePackageOrganizationforEnvironments
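As a sketch of that layout (project and file names here are just examples), you end up with a settings package holding a shared base module plus thin per-environment modules, selected via DJANGO_SETTINGS_MODULE:
# myproject/settings/dev.py
from myproject.settings.base import *   # base.py holds the settings shared by all environments

DEBUG = True
DATABASES['default']['NAME'] = 'myproject_dev'   # assumes base.py defines DATABASES

# start the server with, e.g.:
#   DJANGO_SETTINGS_MODULE=myproject.settings.dev python manage.py runserver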

Running multiple sites on the same python process

In our company we make news portals for a pretty big number of local newspapers (currently 13, going to 30 next month and more in the future), each with 2k to 100k page views/day. Since we are evolving from a situation where each site was heavily customized to one where each difference is a matter of configuration or custom template, our software is already pretty much the same for all sites.
Right now our deployment strategy is one gunicorn instance for each site (with 1-17 workers each, depending on the site traffic), on a 16-core server with 12GB RAM. The problem with this setup is that each worker (regular pre-forked gunicorn) takes 110MB, whether it's being used or not. Now with the new sites we would need to add more RAM to serve not that many more requests, so basically it doesn't scale. Also, since we are moving from this model where each site is independent, each site has its own database and I quite like it that way, especially since we are using relational databases (MySQL, but migrating to PostgreSQL), so it's much easier to shard this way.
I'm doing some research and experimenting with running all sites on one gunicorn instance, so I could use the servers fully and add more servers behind a load balancer when it came to it. The problem is that Django assumes in a lot of places that only one site is running per process, so from what I've thought of so far I'd have to implement:
A middleware that takes the HTTP_HOST from the request and places an identifier on a threadlocal variable.
A template loader that uses that variable to load custom templates accordingly.
Monkey patch django.db.models.Model, probably adding a metaclass (not even sure that's possible, but I think I would need it because of the custom managers we sometimes need to use) that would overwrite the managers with one that would first call db_manager(identifier) on the original manager and then call the intended method. I would also need to overwrite the save and delete methods to always include the using=identifier parameter.
I guess I would need to stop using inclusion_tag decorators, not a big problem, but I need to think of other cases like this.
Heavy and ugly patching of urlresolvers if I need custom or extra urls for each site. I don't need them now, but probably will at some point.
And this is just what I came up with without even implementing it and seeing where it breaks; I'm sure I'd need many more changes for it to work. So I really don't want to do it, especially with the extra maintenance effort I'll need, but I don't see any alternatives and would love to learn that someone already solved this in a better way. Of course I could also stop using Django altogether (I already have many reasons to do so), but that would mean a major rewrite and having to maintain two incompatible branches of the software until the new one reached feature parity with the Django version, so to me it seems even worse than all the ugly hacks.
I've recently developed an e-commerce system with similar requirements -- many instances running from the same project sharing almost everything. The previous version of the system was a bunch of independent installations (~30) so it was pretty unmaintainable. I'm sure the requirements still differ from yours (for example, all instances shared the same models in my case), but it still might be useful to share my experience.
You are right that Django doesn't help with scenarios like this out of the box, but it's actually surprisingly easy to work it around. Here is a brief description of what I did.
I could see a synergy between what I wanted to achieve and django.contrib.sites, also because many third-party Django apps out there know how to work with it and use it, for example, to generate absolute URLs to the current site. The major problem with sites is that it wants you to specify the current site id in settings.SITE_ID, which is a very naive approach to the multi-host problem. What one naturally wants, and what you also mention, is to determine the current site from the Host request header. To fix this problem, I borrowed the hook idea from django-multisite: https://github.com/shestera/django-multisite/blob/master/multisite/threadlocals.py#L19
Next I created an app encapsulating all the functionality related to the multi host aspect of my project. In my case the app was called stores and among other things it featured two important classes: stores.middleware.StoreMiddleware and stores.models.Store.
The model class is a subclass of django.contrib.sites.models.Site. The good thing about subclassing Site is that you can pass a Store to any function where a Site is expected. So you are effectively still just using the old, well documented and tested sites framework. To the Store class I added all the fields needed to configure all the different stores. So it's got fields like urlconf, theme, robots_txt and whatnot.
The middleware class's job was to match the Host header with the corresponding Store instance in the database. Once the matching Store was retrieved, it would patch the SITE_ID in a way similar to https://github.com/shestera/django-multisite/blob/master/multisite/middleware.py. It also looked at the store's urlconf and, if it was not None, set request.urlconf to apply its special URL requirements. After that, the current Store instance was stored in request.store. This has proven to be incredibly useful, because I was able to do things like this in my views:
def homepage(request):
    featured = Product.objects.filter(featured=True, store=request.store)
    ...
request.store became a natural additional dimension of the request object throughout the project for me.
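A rough sketch of what such a middleware could look like (simplified: it leaves out the SITE_ID threadlocal patching and the DOMAIN_SUFFIX stripping described elsewhere in this answer):
from django.http import Http404
from stores.models import Store

class StoreMiddleware(object):
    def process_request(self, request):
        host = request.get_host().split(':')[0]   # drop the port, if any
        try:
            store = Store.objects.get(domain=host)
        except Store.DoesNotExist:
            raise Http404('No store configured for host %r' % host)
        request.store = store
        if store.urlconf:
            request.urlconf = store.urlconf   # per-store URL overrides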
Another thing that was defined on the Store class was a function get_absolute_url whose implementation looked roughly like this:
def get_absolute_url(self, to='/'):
    """
    Return an absolute url to this `Store` or to `to` on this store.
    The URL includes http:// and the domain name of the store.
    `to` can be an object with `get_absolute_url()` or an absolute path as string.
    """
    if isinstance(to, basestring):
        path = to
    elif hasattr(to, 'get_absolute_url'):
        path = to.get_absolute_url()
    else:
        raise ValueError(
            'Invalid argument (need a string or an object with get_absolute_url): %s' % to
        )
    url = 'http://%s%s%s' % (
        self.domain,
        # This setting allowed for a sane development environment
        # where I just set it to ".dev:8000" and configured `dnsmasq`.
        # The same value was also removed from the `Host` value in the middleware
        # before looking up the `Store` in database.
        settings.DOMAIN_SUFFIX,
        path
    )
    return url
So I could easily generate URLs to objects on stores other than the current one, e.g.:
# Redirect to `product` on `store`.
redirect(store.get_absolute_url(product))
This was basically all I needed to be able to implement a system allowing users to create a new e-shop living on its own domain via the Django admin.

Storing important singular values in Django

So I'm working on a website where there are a couple important values that get used in various places throughout the site. For example, certain important dates, like the start and end dates for registration.
One way I can do this is making a model that stores these values, but that sounds like overkill (since I'd only have one instance). Another way is to store these values in the settings.py file, but if I wanted to change them, it seems like I would need to restart the webserver for them to take effect. I was wondering what would be the best practice in Django to handle this kind of stuff.
You can store them in settings.py. While there is nothing wrong with this (you can even organize your settings into multiple different files if you have too many custom settings), you're right that you cannot change these at runtime.
We were solving the same problem where I work and came up with a simple app called django-constance (you can get it from GitHub at https://github.com/comoga/django-constance). What this lets you do is keep your settings in settings.py, but once you need them to be configurable at runtime you can switch to a Redis data store with a Django admin frontend. You can even use the value from settings as your default. I suggest you try this app out.
The changes to your code are pretty minimal; as pasted from the docs, you initialize your dynamic settings like this:
CONSTANCE_CONFIG = {
    'MY_SETTINGS_KEY': (42, 'the answer to everything'),
}
And then instead of importing settings from django conf, you do this:
from constance import config

if config.MY_SETTINGS_KEY == 42:
    answer_the_question()
If you want a specific set of variables available to all of your templates, what you are looking for is context processors.
http://docs.djangoproject.com/en/dev/ref/templates/api/#writing-your-own-context-processors
More links
http://www.b-list.org/weblog/2006/jun/14/django-tips-template-context-processors/
http://blog.madpython.com/2010/04/07/django-context-processors-best-practice/
The code for your context processors can live anywhere in your project. You just have to register it in your settings.py under TEMPLATE_CONTEXT_PROCESSORS.
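A minimal sketch, with hypothetical module and setting names:
# myapp/context_processors.py
from django.conf import settings

def important_dates(request):
    # expose a few values from settings.py to every template
    return {
        'REGISTRATION_START': settings.REGISTRATION_START,
        'REGISTRATION_END': settings.REGISTRATION_END,
    }
Then add 'myapp.context_processors.important_dates' to TEMPLATE_CONTEXT_PROCESSORS and both values become available in every template rendered with a RequestContext.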
You could define your constants in your settings.py, or even in a constants.py, and just
from constants import *
However, as you mentioned, you would need to reload your server each time the settings are updated. I think you first need to figure out how often you will be changing these settings. Is it worth the extra effort to be able to reload the settings automatically?
If you wanted the settings to take effect automatically each time they are updated, you could do the following:
1. Store the settings in the DB (sketched below)
2. Upon save/change, write the output to a file
3. Have settings.py / constants.py read that file
4. Reload the server
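A bare-bones sketch of the first step (the model and field names are made up; the write-to-file and reload steps are left out):
from django.db import models

class SiteSetting(models.Model):
    key = models.CharField(max_length=100, unique=True)
    value = models.CharField(max_length=255)

    @classmethod
    def get(cls, key, default=None):
        try:
            return cls.objects.get(key=key).value
        except cls.DoesNotExist:
            return default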
In addition, have a look at the Mezzanine project, which allows you to update settings from the Django admin interface and will reload them as well.
See: http://mezzanine.jupo.org/docs/configuration.html
If the variables you need will be updated infrequently, I suggest just storing them in settings.py and adding a custom context processor.
If you are using source control such as Git, updating will be quite easy: you can just update the file and push to your server. For really simple reloading of the server you could also create a post-receive hook for Git that will automatically reload the server when new code is pushed.
I would only suggest the other option if you are updating settings fairly regularly.