We are starting a web project with Django (and are pretty new to this), and I was wondering what would be the best way to set up a productive development environment.
Here is some info:
We are 2 developers working on the project
We have a development/production server hosted on WebFaction (using an Apache server)
The DB (MySQL) is hosted on this server
We are both using Eclipse with Pydev
The project is hosted on a Github repository
So far, we haven't set up any local development server; we are syncing modifications to the server through Github, but it is not really convenient...
We were thinking of setting up Apache servers locally that use the remote database, and only syncing once in a while.
Do you think it would be better?
Do you have other ideas/additional tips?
Thanks!
Don't try and share servers and databases while you're developing. You'll only get confused.
Each developer should have a local copy of MySQL on their own machine, and run the development server as mipadi recommends. Manage DB schema changes via South, and keep data in fixtures or use South's data migrations. Each developer should commit to their local version of the git repository, but only push changes to Github when the particular task is complete and working (or, better, use a remote branch so that changes are synced, but only merge back to master when complete).
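For reference, a typical South workflow on each developer's machine looks roughly like this (a sketch only; "myapp" and the data migration name are placeholders for your own app):

./manage.py schemamigration myapp --auto                 # generate a schema migration after changing models
./manage.py migrate myapp                                # apply it to your local database
./manage.py datamigration myapp populate_defaults        # optional: create a data migration to move/seed data
./manage.py dumpdata myapp --indent=2 > myapp/fixtures/initial_data.json   # keep sample data in a fixture

The migration files generated this way are committed along with the code, so the other developer (and the server) can replay the same changes.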
A good idea is to have a continuous integration server like Hudson/Jenkins that, after each commit to master, runs the tests and - if they pass - builds an up-to-date version of the site, which you can use for functional/user testing.
Edit to add: There's no relationship between the development server and any particular database backend. As I recommend above, it is quite simple to install MySQL or PostgreSQL on your local machine and use the development server against that. I have been working this way for several years, after initially encountering some of the issues you were worried about when switching between sqlite3 and the production MySQL db.
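As a rough sketch, pointing your local setup at a local MySQL instance is just a matter of the DATABASES setting (the file name, database name, and credentials below are placeholders, not anything from your project):

# settings_local.py -- hypothetical local development settings
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'myproject_dev',     # local development database
        'USER': 'myproject',
        'PASSWORD': 'secret',
        'HOST': '127.0.0.1',
        'PORT': '3306',
    }
}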
Django has its own development server that you can use for local testing.
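For example, from the project directory:

./manage.py runserver                 # serves on http://127.0.0.1:8000/ by default
./manage.py runserver 0.0.0.0:8080    # or bind to another address/port if needed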
My web server depends on nginx, django, and a lot of python dependencies. I'm wondering if there is a way to create a portable image/script that I can run in a new server and quickly get it up and running.
Is Docker relevant to this?
You should always use git to manage your code. With git you can get your Django project onto the other server quickly. But just that.
Also you have to migrate your database. Every DB engine has dump options for doing this.
Do not forget to move your static assets. Probably you have all of them in one directory.
What about your nginx and database installation and configuration? This is where Docker is relevant.
With all of this, you should migrate successfully.
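Put together, a minimal move to a new server might look something like this (a sketch with placeholder repository URL, database and user names; it assumes MySQL and Django's staticfiles app):

git clone git@github.com:youruser/yourproject.git    # get the code onto the new server
mysqldump -u dbuser -p yourdb > yourdb.sql           # dump the database on the old server...
mysql -u dbuser -p yourdb < yourdb.sql               # ...and load it into the new one
./manage.py collectstatic                            # gather static assets into STATIC_ROOT for nginx to serve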
Why does Redmine not use the development and test environments?
In the official installation guide they only show one environment when setting up the databases, advise running Bundler while skipping the dev and test groups, and run the Rails server in production mode.
I think this guide describes the installation process only for a server (which runs in production mode). I think it is done this way so as not to confuse new users (who do not have a lot of knowledge of Rails).
You can easily use the guide to set up Redmine locally (I did it successfully several times ;). In order to install Redmine locally you only need to change a few points in the guide.
I am quite computer-illiterate, but I have managed to use the Django framework on my own machine. I have had an account on Amazon Web Services (AWS) for some time, but it appeared rather complex to set up and make use of, so I put it off for a while. Then I decided to give it a try, and it was not as hard as I first thought to load an AMI and connect to the server with PuTTY.
Since I was already using BitNami's Django Stack, I decided to take a look at their hosting offer (which builds on AWS). Since they appeared to offer "one-click deployment", I set up a new server through their interface. But it seems the "one-click deployment" promise only applies to the server itself; there does not seem to be any interface for deploying Django projects through their site. Having used PuTTY already, and adding WinSCP to my machine, I can access the server and load my Django code onto the server. But then I am lost. The documentation seems a bit thin (look here).
The crux of this is the following: can anyone make this part of the process more understandable, i.e., how to deploy a Django project on a Linux server with Apache/mod_wsgi?
The other question is: I want to use Postgres. Am I free to install this on the server? Should I opt for EBM (EMB?) for this, or what is the downside of not having EBM?
I hope I am not too unworthy of your attention, thanks!
Regarding "how to deploy a Django project on a Linux server with Apache/mod_WSGI": the Bitnami AMI already comes with all of this configured. Once installed, try going to the EC2 public URL on the default port 8000 and you will see the demo Django project set up there. You can add your own project once you have logged into the machine via PuTTY; check the /home/bitnami/ directory for the demo project. Copy your project over and configure your database.
Regarding "I want to use Postgres. Am I free to install this on the server": Postgres and MySQL are already installed, and you configure them the same way you would on your local machine. Then, in your project, do ./manage.py runserver 0.0.0.0:9000, since port 8000 is already running another application.
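For the Apache/mod_wsgi part specifically, the entry point is just a small Python file that Apache loads; a minimal sketch follows (the paths and project name are placeholders, and the Bitnami stack may already generate an equivalent file for you):

# myproject/wsgi.py -- WSGI entry point loaded by Apache/mod_wsgi
import os
import sys

sys.path.append('/home/bitnami/myproject')                # make the project importable
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

import django.core.handlers.wsgi
application = django.core.handlers.wsgi.WSGIHandler()     # the object mod_wsgi serves

# In the Apache configuration, mod_wsgi is pointed at this file with something like:
#   WSGIScriptAlias / /home/bitnami/myproject/myproject/wsgi.py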
I am using South for database migration in a Django project, and I was wondering if it is a good idea to commit the migration scripts that were generated on my dev server to the repository and then reuse them on the production server?
Yes. That is the point of migrations: to allow you to develop and test database schema changes and then deploy them to test and production in a reproducible way.
I wouldn't consider it reuse so much as developed and tested in development and then deployed to production.
Make sure you also develop and test backward migrations to ensure you can roll back.
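As a concrete sketch (the app name and migration number are placeholders), rolling back with South is just migrating to an earlier migration:

./manage.py migrate myapp 0002    # migrate backwards to migration 0002
./manage.py migrate myapp         # and forwards again to the latest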
I'm new to Django and my very first project is my blog. I wonder how Django developers who use PyDev normally synchronize with a remote hosting server when updating their sites?
I would also like to know how you combine the use of git with a Django project. Should I just make a repository for the entire project?
At my company we've got an entire git repository for each project, including the Django sources that are put on the PYTHONPATH for each project, making Django versions project-dependent. The folder structure is something like:
/.git
/projectname/app1
/projectname/app2
/projectname/manage.py
/django-lib/django/...
As django-lib is not a Python module, we include both / and /django-lib in the PYTHONPATH. If your project is becoming large, you might want to consider using git submodules for your apps.
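One way to wire that up without relying on shell or IDE configuration is to adjust sys.path at the top of manage.py; this is only a sketch of the idea (the actual setup may instead use Eclipse/PyDev project settings or an exported PYTHONPATH), with paths following the layout above:

# manage.py (top of file) -- make the repo root and the bundled Django importable
import os
import sys

PROJECT_DIR = os.path.dirname(os.path.abspath(__file__))   # .../projectname
REPO_ROOT = os.path.dirname(PROJECT_DIR)                    # repository root (/)
sys.path.insert(0, REPO_ROOT)                               # so project-level imports resolve
sys.path.insert(0, os.path.join(REPO_ROOT, 'django-lib'))   # so 'import django' finds the bundled copy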
We've also set up several servers to support the developers. There's a testing server running a central testing database and a setup including Apache with WSGI to make testing on a real server possible, which sometimes is a bit different from the local manage.py server the developers use before committing their changes.
The testing server is updated with the master branch of our git repository. We've made several scripts to allow all developers to do this without letting them log in to the server via SSH, but that is just during pre-release. After release, that server will become our staging server, and we'll remove all scripts from it to make it just like our production server.
Every developer has set up their local project to make sure that it communicates with the central testing database, which contains some test data. I myself push my changes from the command line, but you could also use EGit for this.
When we've got a release, we put it in a separate branch, called 'release' (obviously), and the production server will pull only from that branch. This is done via SSH, but I don't really know what your server setup looks like, so I guess that last step is entirely up to you.
I hope that this has helped you a bit. I won't say that this is the best workflow possible, but it works for us and you should figure out what works for you.
Most experienced Django developers use pip (or distribute) and virtualenv to deal with all the Python packages you might need for your Django projects (including Django itself).
Personally, all I keep in my project's git repository is a bunch of segregated requirements lists generated by pip:
. ~/Dev/environs/$PROJECT_NAME/bin/activate
pip freeze > ./docs/requirements/main.list
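The other direction, recreating the environment from those lists on another machine, is then just (same placeholder paths as above):

. ~/Dev/environs/$PROJECT_NAME/bin/activate
pip install -r ./docs/requirements/main.list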
I'm fairly sure most Django developers would be familiar with Fabric, which I use for (a minimal fabfile sketch follows this list):
streamlining local interaction with git and,
pushing to our central repository,
pulling from our production or test server
touching the wsgi on the relevant server
and pretty much any other kind of task you might find yourself using an SSH terminal session for.
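A rough sketch of what such a fabfile might look like, assuming Fabric 1.x (the host name, deployment path, and wsgi file below are made-up placeholders, not the actual setup):

# fabfile.py -- minimal Fabric 1.x sketch; hosts and paths are hypothetical
from fabric.api import env, local, run, cd

env.hosts = ['user@production.example.com']

def push():
    """Push the current branch to the central repository."""
    local('git push origin master')

def deploy():
    """Pull the latest code on the server and reload the WSGI app."""
    with cd('/var/www/myproject'):
        run('git pull origin master')
        run('touch myproject/wsgi.py')   # touching the wsgi file makes mod_wsgi reload the app

Each function becomes a task you can run from the shell, e.g. fab push deploy.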
For those cases where I need to make changes to someone else's Django application in order to make it work or to suit our purposes, I:
fork it on github,
clone from my forked repo
make the changes
push it up to my own repo
and provide merge requests to the original repo owner
This way, I have a repo that I can keep pulling from via pip requirements lists until the original application owner gets their own repo updated.
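In the requirements list, such a fork can be referenced directly by its git URL; for example (URL and egg name here are placeholders):

-e git+https://github.com/myuser/someones-django-app.git@master#egg=someones-django-app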