On Heroku, it is possible to share a database among apps using the following command:
$ heroku addons:attach <databaseName> -a <appName>
where <databaseName> is the shared database (belonging to another app) and <appName> is the app (a Django application) it gets attached to.
I googled around for a long time but couldn't find anything describing how to access the attached database in the app. Do I need to add or modify something in Django's settings.py, and if so, what? How do I access the attached database in Django's views.py?
The following is my current setting for the Heroku database; database access is just via the ORM.
# Parse database configuration from $DATABASE_URL
import dj_database_url
DATABASES = {'default':dj_database_url.config()}
# Honor the 'X-Forwarded-Proto' header for request.is_secure()
SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')
Thanks.
Heroku always supplies configuration details as environment variables. When you do addons:attach it will print the name of the variable it has created for your attached add-on; alternatively you can specify it with the --as flag.
heroku addons:attach <databaseName> -a <appName> --as MY_ATTACHED_DB_URL
Now you can pass that variable name to the config call:
DATABASES = {
    'default': dj_database_url.config(),
    'course': dj_database_url.config('MY_ATTACHED_DB_URL'),
}
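For the second half of the question (views.py), the attached database can then be reached through the ORM with .using(). A minimal sketch, where Course is a hypothetical model whose data lives in the attached database:
# views.py -- sketch; Course is a hypothetical model stored in the attached DB
from django.http import JsonResponse
from .models import Course

def course_list(request):
    # .using('course') routes the query to the 'course' entry in DATABASES
    courses = Course.objects.using('course').values('id', 'name')
    return JsonResponse(list(courses), safe=False)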
Related
I'm currently working on a Django project built with django cookiecutter. I'm running both the Redis server and the Redis client, but anytime I run the Django server it keeps giving the error below:
raise ImproperlyConfigured(error_msg)
django.core.exceptions.ImproperlyConfigured: Set the CELERY_BROKER_URL environment variable
Below is the configuration of my .env file in my config folder:
DEBUG=True
SECRET_KEY=12345
EMAIL_USE_TLS=True
EMAIL_HOST=smtp.gmail.com
EMAIL_PORT=587
EMAIL_HOST_USER=darkolawrence@gmail.com
EMAIL_HOST_PASSWORD=**********
DEFAULT_FROM_EMAIL=noreply@gmail.com
BRAINTREE_MERCHANT_ID=5pbrjk4gmztd5m8k6dg
BRAINTREE_PUBLIC_KEY=by84t6rfx9nz3vs6kegw
BRAINTREE_PRIVATE_KEY=202056899b37713b1faeb093207160ff2e
BROKER_URL=amqp://
CELERY_RESULT_BACKEND=db+sqlite:///results.sqlite
I think you need to configure CELERY_BROKER_URL in settings.py, as follows:
CELERY_BROKER_URL = 'redis://hostname:6379/6'
The trailing 6 is the Redis database number; databases 0-15 are available.
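Alternatively, since your project already has a .env file, you can set the variable there instead, assuming your settings actually load that file (cookiecutter-django can be configured to read it):
CELERY_BROKER_URL=redis://localhost:6379/6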
I have a development app running locally and a production app running on my server. I would like to keep developing locally and push my changes. However, the dev version uses local Postgres, and the static and media files reside inside the project. In the server version, the static and media files are in a public_html directory served by Apache. Can I have local static and media files as well as different Postgres credentials on localhost than on the server? How do I accomplish that?
The python-dotenv package is designed specifically for the issue you are running into. Instead of using JSON files, it uses .env files, following the practice of 12-factor apps.
An example would be
settings.py
import os

from dotenv import load_dotenv
load_dotenv()
DEBUG = os.getenv('DEBUG', 1)  # with a default value
SECRET_KEY = os.getenv('DJANGO_SECRET_KEY')
DB_USERNAME = os.getenv('DB_USERNAME')
DB_PASSWORD = os.getenv('DB_PASSWORD')
LOGGING_FOLDER = os.getenv('LOGGING_FOLDER')
# Allowed hosts can be stored and split into a list
# Also the name in the separate settings file does not have to match the settings.py variable name
ALLOWED_HOSTS = os.getenv('HOSTS', '').split(' ')
# This will be ['localhost', 'localhost:8000']
# ...
.env
DJANGO_SECRET_KEY=somethingotherthanthis
DB_USERNAME=postgres
DB_PASSWORD=123456
LOGGING_FOLDER=/var/logging/app/
HOSTS=localhost localhost:8000
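To answer the Postgres part of the question, the same variables can feed DATABASES, so localhost and the server only differ in their .env files. A sketch, assuming the DB_* names above plus hypothetical DB_NAME, DB_HOST, and DB_PORT entries added to .env:
# settings.py (continued) -- sketch; DB_NAME, DB_HOST, DB_PORT are assumed additions
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.getenv('DB_NAME', 'app_dev'),
        'USER': os.getenv('DB_USERNAME'),
        'PASSWORD': os.getenv('DB_PASSWORD'),
        'HOST': os.getenv('DB_HOST', 'localhost'),
        'PORT': os.getenv('DB_PORT', '5432'),
    }
}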
Of course, you don't have to use the external package; you can always use a JSON file and only change the following code in your settings. In either case, be careful that the file holding your sensitive data and environment-specific information (media folders, logging folders, etc.) is not checked into version control.
settings.py
import json
with open('settings.json') as fh:
    file_settings = json.load(fh)
DEBUG = file_settings.get('DEBUG', 1)
# ...
I am trying to set up multiple websites with the same code base. While browsing, I came to know Django has the Sites framework, which I could use.
I didn't get how to set up multiple settings.py files with a site ID. Any ideas, anyone?
Thanks in advance :)
To serve multiple sites from the same Django instance you do not need multiple settings.py files.
A simple method is to omit the SITE_ID setting from the Sites framework. Then include the Sites framework middleware:
'django.contrib.sites.middleware.CurrentSiteMiddleware'
This automatically passes a request object to Site.objects.get_current() on every request. It also allows your Django application to detect the current site via request.site.
You would need to make sure to set up multiple virtual hosts in your NGINX or Apache instance to route traffic from each site to your server.
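A minimal sketch of that setup (the view is only illustrative):
# settings.py
INSTALLED_APPS = [
    # ...
    'django.contrib.sites',
]
MIDDLEWARE = [
    # ... the usual middleware ...
    'django.contrib.sites.middleware.CurrentSiteMiddleware',
]

# views.py -- illustrative view reading the current site
from django.http import HttpResponse

def home(request):
    # request.site is set by CurrentSiteMiddleware from the Host header
    return HttpResponse(f"You are visiting {request.site.domain}")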
You can have multiple settings files, for example develop.py and production.py.
Steps:
Create a settings folder inside the project.
Add all of the settings files to that folder.
While running the server:
./manage.py runserver --settings=project_name.settings.required_settingfile
for example:
./manage.py runserver --settings=myproject.settings.develop
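A sketch of how that folder might look, with each environment file importing the shared base settings (names are just examples):
myproject/settings/__init__.py
myproject/settings/base.py
myproject/settings/develop.py
myproject/settings/production.py

# develop.py -- example; pull in everything from base.py and override as needed
from .base import *

DEBUG = True
ALLOWED_HOSTS = ['localhost', '127.0.0.1']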
I have a Django-based Heroku app site. I set everything up a long time ago and am now unable to get things working with a local instance of PostgreSQL. In my settings file, I updated:
DATABASES['default'] = dj_database_url.config()
to work with a local database:
DATABASES['default'] = dj_database_url.config(default='postgres://localhost/appDB')
When running foreman, I can view the site, but the database is not currently populated (although I did create the empty DB). Running:
heroku run python manage.py dumpdata
Returns the contents of the remote (Heroku) database, while a syncdb command results in "Installed 0 object(s) from 0 fixture(s)". So it looks like I'm still contacting the remote database. I'm pretty sure the PostgreSQL DB is set up correctly locally; how can I force the app to use it?
I'm sure this is simple, but I haven't seen anything useful yet. I did try
export DATABASE_URL=postgres:///appDB
but that hasn't helped.
Cheers
I am trying to apply multiple settings files to my project. I am following two basic articles:
1 http://www.rdegges.com/the-perfect-django-settings-file/
2 Two scoops of Django: Best practices for Django 1.5
https://github.com/twoscoops/django-twoscoops-project
I have two questions:
1) I understand that it is important to keep things like the secret key and AWS keys out of settings. And in the pydanny example on GitHub I found this:
SECRET_KEY = r"{{ secret_key }}"
I used this locally and it passes even with no secret key in my environment variables (bashrc, profiles, or virtualenvs).
How does this SECRET_KEY = r"{{ secret_key }}" work?
2) After I created my settings folder, I also created the __init__.py and my base, local, staging, and production settings files. I noticed that some subcommands disappear, collectstatic for example. I have to change my manage.py to local or base to start seeing them again.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings.local")
Why does this work for the base and local settings, but not for production?
Django's SECRET_KEY has many uses revolving around cryptographic signing. For example, this is how cookies are protected and how Django makes sure hidden form fields are not tampered with.
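A tiny illustration (not from the question's code) using Django's signing module, whose signatures are derived from SECRET_KEY:
# requires a configured Django settings module with SECRET_KEY set
from django.core import signing

token = signing.dumps({'user_id': 42})  # signed using settings.SECRET_KEY
data = signing.loads(token)             # raises BadSignature if the token was tampered with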
manage.py runserver is the development server, which you should not use in production; there you are probably serving the app through WSGI, so modifying manage.py won't change anything.
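That is why the production settings module is normally selected in wsgi.py (or via the DJANGO_SETTINGS_MODULE environment variable on the server). A typical sketch, assuming the project is called myproject:
# wsgi.py -- typical layout; the settings path is an assumption
import os
from django.core.wsgi import get_wsgi_application

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings.production')
application = get_wsgi_application()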