In my development app I'm using a combo of Haystack for search with Whoosh as the backend.
However, when I deploy to Heroku my search is no longer working, even after running python manage.py update_index.
Upon some research, I discovered that it's because of Heroku's read-only file system.
Is there any free solution to get around this on Heroku so that I can get search working? The addons I have looked at are ~$20/month and I'd prefer to get started with a free solution if possible.
It's not really practical to do this without a separate search server. Storage on Heroku's dynos is not read-only, but it is ephemeral and local to each dyno, and any production application will have at least two dynos. You might be able to set something up to run on a dyno, but it's certain to be complex and fragile, whereas a third-party service is turnkey. Most third-party search add-ons scale with usage and many are free at the cheapest level, and if none of them fit the bill you can always use non-Heroku search services, of which there are many.
Note that the dyno filesystem is writeable. Can you post the error you are getting?
You might want to take another look at Heroku add-ons. There are several Elasticsearch add-ons that are in free beta or have free plans, and Haystack supports Elasticsearch (a settings sketch follows the list of add-ons below):
https://addons.heroku.com/searchbox
https://addons.heroku.com/bonsai
https://addons.heroku.com/indexden
https://addons.heroku.com/foundelasticsearch
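For reference, wiring Haystack to one of these add-ons mostly comes down to pointing the connection at the URL the add-on exposes. Here's a minimal sketch assuming Haystack 2.x and an environment variable named SEARCHBOX_URL; the actual variable name depends on which add-on you provision, so check its docs.

    # settings.py -- minimal sketch for Haystack 2.x with a hosted Elasticsearch add-on.
    # SEARCHBOX_URL is an assumption; Bonsai, Found, etc. set their own env var names.
    import os

    HAYSTACK_CONNECTIONS = {
        'default': {
            'ENGINE': 'haystack.backends.elasticsearch_backend.ElasticsearchSearchEngine',
            'URL': os.environ.get('SEARCHBOX_URL', 'http://127.0.0.1:9200/'),
            'INDEX_NAME': 'haystack',
        },
    }

After that, python manage.py rebuild_index runs against the hosted index instead of the dyno's filesystem.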
I want to work locally on my Django (1.7) project and regularly deploy updates to a production server. How would you do this? I have not found anything about this in the docs. I am confused about that because it seems like many people would want to do it, and there should be some kind of standard solution. Or am I getting the whole workflow wrong?
I should note that I'm not expecting a step-by-step guide. I am just trying to understand the concept.
Assuming you already have your deployment server set up, and all you need to do is push code to it, then you can just use Git as a form of deployment.
DigitalOcean has a good tutorial on this: https://www.digitalocean.com/community/tutorials/how-to-set-up-automatic-deployment-with-git-with-a-vps
Push sources to a git repository from a dev machine.
Pull sources on the production server. Restart uwsgi/whatever.
There is no standard way of doing this, so no, it cannot be included with Django or be thoroughly described in the docs.
If you're using a PaaS, how you deploy depends on the PaaS. Ditto for a container like Docker: you must follow the rules of that particular container.
If you're old-school and can ssh into a server you can rsync a snapshot of the code to the correct place after everything else is taken care of: database, ports, webserver setup etc. That's what I do, and I control stuff with bash scripts utilizing a makefile.
REMOTEHOST=user@yourbox
REMOTEPATH=yourpath
REMOTE=$REMOTEHOST:$REMOTEPATH
make rsync REMOTE_URI=$REMOTE
ssh $REMOTEHOST make -C $REMOTEPATH deploy
My "deploy"-action is a monster but might be as easy as something that touches the wsgi-file used in order to reload the site. My medium complex ones cleans out stale files, run collectstatic and then reloads the site. The really complex ones creates a timestamped virtualenv, cloned database and remote code tree, a new server-setup that points to this, runs connection tests on the remote and if they succeed, switches the main site to point to the new versioned site, then emails me the version that is now in production, with the git hash and timestamp.
Lots of good solutions. Heroku has a good tutorial: https://devcenter.heroku.com/articles/getting-started-with-django
Check out a general guide for deploying to multiple PaaS providers here: http://www.paascheatsheet.com
Finally the time has come and I'm ready to deploy my first django project.
I'm a newbie in web development stuff and now the real fun begins.
This is a low scale site for computer jobs.
I want to start with a free tier and grow from there as need emerges.
I've read some guides regarding Django project deployment but could not find all the answers, so I hope some folks here can help me out:
I've been thinking of getting an Amazon EC2 free-tier VPS; is this a good option?
My local development machine runs Ubuntu, and I've read that I could install a 10GB Ubuntu image; do you recommend such an image?
Should I go with Apache or a lighter web server?
My project is hosted on Bitbucket; I just need to check out my project on my VPS, right?
What about data backups? I would like to back up my MySQL DB.
How do you recommend I serve the static files?
I'm looking for a good tutorial on how to set up AWS with Django and MySQL.
Thanks, guys!
I've been thinking of getting an Amazon EC2 free-tier VPS; is this a good option?
If it fulfills your technology requirements (RAM, CPU, memory), it is a good option.
My local development machine runs Ubuntu, and I've read that I could install a 10GB Ubuntu image; do you recommend such an image?
Might as well keep your environments the same if you can. If you can match up versions, that is another plus.
Should I go with Apache or a lighter web server?
Either. Apache would probably be easier to deploy at this point because you don't have to worry about running it as a service (using a program like Supervisor to manage it).
Whichever one you choose, there is an abundance of tutorials online describing how to set up Django.
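If you do go the Apache/mod_wsgi route, the Django-side piece is just the standard wsgi module that the Apache configuration points at. A minimal sketch, where "myproject" is a placeholder for your actual project package:

    # myproject/wsgi.py -- the module Apache's mod_wsgi configuration points at.
    # "myproject" is a placeholder; this layout works for Django 1.4 and later.
    import os

    from django.core.wsgi import get_wsgi_application

    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
    application = get_wsgi_application()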
My project is hosted on Bitbucket; I just need to check out my project on my VPS, right?
That is one way. There are lots of ways to deploy. I like syncing the actual files using Fabric; that way your production server doesn't need to know about your Bitbucket account. Once again, there are so many tutorials online describing deploying Django. Fabric is a great place to start.
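As a rough sketch of what that looks like with Fabric 1.x (the host, paths and the touch-reload step are placeholders for whatever your server actually uses):

    # fabfile.py -- sketch of a Fabric 1.x deploy that rsyncs the local working copy
    # to the server, so the server never needs your Bitbucket credentials.
    from fabric.api import cd, env, run
    from fabric.contrib.project import rsync_project

    env.hosts = ['deploy@example.com']      # placeholder VPS

    def deploy():
        # Push local files up, skipping things that shouldn't reach production.
        rsync_project(remote_dir='/srv/myproject/',
                      exclude=('.git', '*.pyc', 'db.sqlite3'))
        with cd('/srv/myproject'):
            run('venv/bin/python manage.py collectstatic --noinput')
            run('touch myproject/wsgi.py')  # touch-reload for mod_wsgi

Run it with fab deploy from your development machine.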
What about data backups? I would like to back up my MySQL DB.
There are lots of tools for this: plenty of premade tools and shell scripts. I have used automysqlbackup and it works great: http://sourceforge.net/projects/automysqlbackup/
How do you recommend I serve the static files?
Make sure the webserver serves them. If you deploy through Apache you can set up an Alias to serve static files very easily. You can come up with a collectstatic deployment scheme to put your static files on S3, but for a simple site Apache would be just fine.
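The Django side of that is just a couple of settings; the paths here are placeholders, and the matching Apache Alias directive would live in your vhost config:

    # settings.py -- where collectstatic gathers files for the webserver to serve.
    # Paths are placeholders; the Apache vhost would carry a matching directive like
    #   Alias /static/ /srv/myproject/static/
    STATIC_URL = '/static/'
    STATIC_ROOT = '/srv/myproject/static/'

Then run python manage.py collectstatic as part of each deploy.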
I'm looking for a good tutorial on how to set up AWS with Django and MySQL.
Perhaps you can find a tutorial that covers all of this; more likely you will find separate tutorials for:
How to set up AWS with Ubuntu
Installing Django and MySQL on Ubuntu
I have never actually worked for a company that is deploying a Django app (with a large user base), and am curious about the best way to do this.
Right now I am hosting a Django App on EC2. The code for the app is sitting in my github account. I have nginx serving static content, and behind it a single apache server running django + mod_wsgi.
I am trying to figure out what the best practice is for "continuous deployment". Right now, after I have added additional functionality I do the following on EC2:
1) git reset HEAD --hard
2) git pull
3) restart apache
4) restart nginx
I have custom logic in my settings.py file so that if I am running on EC2, DEBUG gets set to False, and my databases switch from SQLite (development) to MySQL (production).
This seems to be working for me now, but I am wondering what is wrong with this process and how could I improve it.
Thanks
I've worked with systems that use Fabric to deploy to multiple servers.
I'm the former lead developer at The Texas Tribune, which is 100% Django. We deployed to EC2 using RightScale. I didn't personally write the deployment scripts, but it allowed us to get new instances into the rotation very, very quickly and scales on demand. It's not cheap, but it was worth every penny in my opinion.
I'd agree with John and say that Fabric is the tool to do this sort of thing comfortably. You probably don't want to configure git to automatically deploy with a post-commit hook, but you might want to configure a Fabric command to run your test suite locally, and then push to production if it passes.
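A minimal sketch of that idea with Fabric 1.x, assuming a git-pull style deploy like the one in your question (the host, path and wsgi touch are placeholders):

    # fabfile.py -- run the test suite locally; only deploy if it passes.
    from fabric.api import cd, env, local, run

    env.hosts = ['deploy@example.com']     # placeholder EC2 host

    def test():
        # local() aborts the whole fab run if the command exits non-zero,
        # so a failing test suite stops the deploy below.
        local('python manage.py test')

    def deploy():
        with cd('/srv/myproject'):         # placeholder path
            run('git pull')
            run('touch myproject/wsgi.py') # reload mod_wsgi instead of restarting Apache

    def release():
        test()
        deploy()

Run it with fab release.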
Many people run separate dev and production settings files, rather than having custom logic in settings.py to detect whether it's a production environment. You can inherit from a common base file and then override the bits that differ between dev and production. Then you start the server using the production settings file, rather than relying on a single unified settings.py.
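A small sketch of that layout (module names and database values are placeholders):

    # settings/base.py -- shared defaults
    DEBUG = True
    DATABASES = {
        'default': {'ENGINE': 'django.db.backends.sqlite3', 'NAME': 'dev.db'},
    }

    # settings/production.py -- only the bits that differ on the server
    from .base import *

    DEBUG = False
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'myapp',
            'USER': 'myapp',
            'PASSWORD': 'change-me',
            'HOST': 'localhost',
        },
    }

On the server you point DJANGO_SETTINGS_MODULE at the production module (or pass --settings=...); locally you keep using the dev/base module.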
If you're just using apache to host the application, you might benefit from a lighter weight solution. Using fastcgi with nginx would allow you to do away with the overhead of apache entirely. There's also a wsgi module for nginx, but I don't know if it's production ready at this point.
There is one more good way to manage this. For Ubuntu/Debian AMIs it is good to manage versions and do deployments by packaging your application into a .deb.
I'm currently working with a team in my University to put together a new webapp. Nothing too fancy, just run-of-the-mill MySQL + Django. We are also hoping to use Git for source control. We were wondering what hosting options were available to us. We're all very competent with Unix, so an SSH connection would be preferable. We also looked into the Amazon cloud, but are not sure if that's right for us. What does Stack Overflow suggest for a provider to host both a Git repo and our webapp? The simpler, the better. It should also run a Linux environment.
I have had great success using the Rackspace Cloud servers. You get root SSH into the server, so you can set up your Git repo and your web app there. They have a lot of options for which flavor of Linux you want to use as well.
I'm doing Django/Postgres on an Ubuntu server and haven't had any problems at all. As a bonus, it includes very easy web and API integration with their CDN if you're interested in that.
I looked into a variety of cloud providers and RS had the best options for me, although CDN integration was a big deal for my site so that factor weighed heavier than it might for you.
I use the cheapo 256MB RAM/10GB HD install and pay around $12/month after bandwidth costs are figured in.
Here's the pricing: http://www.rackspace.com/cloud/cloud_hosting_products/servers/pricing/
Why not AWS? It has a free tier that is able to run basic Django apps well. You can run it using a Django AMI directly or a service like BitNami Cloud Hosting (disclaimer: I am a BitNami developer; I am actually in charge of many of the Python-based stacks). Both options allow you to run a micro instance of an Amazon machine for free (680MB RAM, 10GB disk).
On BitNami Cloud Hosting, we recently added support for Python and Django (Python 2.6.5 and Django 1.3) and we already included Git. When you select to create a new server you will have access to all those components on top of Ubuntu 10.04.
Also, if you are interested in using Redmine (as dgel suggests), you can choose to install it on the same machine when you create your server. Since it is a university project, you may also want to consider hosting the Git part on github.com for free.
I would highly recommend sourcerepo.com for git and redmine hosting. $6.95 per month for unlimited projects including redmine instances with git hooks. You don't need to worry about setting up or maintaining the git repos or redmine instances yourself.
Then for your project's public hosting you can't beat linode.com for $19.95 per month.
I have five nodes behind a load balancer and I'm trying to determine the optimal configuration for a Django based site.
Each node has access to Postgres, mod_wsgi, Apache, Lighttpd, memcached, pgpool2 (for database replication) and glusterfs (for media file replication), and is running Ubuntu 8.04 LTS.
So far, the setup is four nodes running Apache/Lighttpd/memcached/pgpool2 all reading/writing to one master node that is running the "master" Postgresql. Each of the four web nodes is also running Postgres for replication from the master via pgpool.
So, my question is: How would you configure this setup and/or what would you change so that there is no single point of failure, if possible?
This sounds like a good setup, although it's hard to know exactly what your setup looks like in terms of memory, etc., and what traffic you expect to handle.
You might want to consider using Django's multi-db support and having a read-only Postgres instance (use DB routing to direct reads to the read-only instance for certain apps). This can offer some quite nice speed improvements - and at the moment you could have a potential bottleneck at the single Postgres instance, depending on how heavy your database work is.
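For reference, the routing half of that is small. A minimal sketch, assuming your DATABASES setting defines a 'default' (primary) alias and a 'replica' (read-only) alias; the alias names and module path are placeholders:

    # routers.py -- send reads to the replica, writes to the primary.
    class PrimaryReplicaRouter(object):
        def db_for_read(self, model, **hints):
            return 'replica'     # the read-only Postgres instance

        def db_for_write(self, model, **hints):
            return 'default'     # the master/primary

        def allow_relation(self, obj1, obj2, **hints):
            return True          # both aliases hold the same data

    # settings.py
    # DATABASE_ROUTERS = ['myproject.routers.PrimaryReplicaRouter']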
As @ashwoods suggested, it might be worth looking into gunicorn and nginx. I guess at the moment you use Apache only to run mod_wsgi, and lighttpd for the static files? With nginx, you can use it with a number of wsgi servers, and it's great for static files too.
The setup looks pretty good to me. I would consider using gunicorn/uwsgi + nginx. I would also benchmark using pgbouncer, although pgpool2 offers more out of the box.
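If you try the gunicorn route, its config file is plain Python, so the per-node setup can live in the repo. A minimal sketch with placeholder values (nginx would proxy_pass to the bind address):

    # gunicorn.conf.py -- gunicorn reads this file as plain Python.
    # Values are placeholders; a common rule of thumb is workers = 2 * CPU cores + 1.
    bind = "127.0.0.1:8000"    # nginx proxies requests to this address
    workers = 3
    timeout = 30
    accesslog = "/var/log/gunicorn/access.log"
    errorlog = "/var/log/gunicorn/error.log"

It would be started with something like gunicorn -c gunicorn.conf.py myproject.wsgi:application behind an nginx proxy_pass.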