Installing a gem bundle in a python app on Heroku - django

I have a Python app on Heroku running with Django. The app launches and works perfectly. The first couple lines of a push look like this:
(venv)➜ djangoproject git:(development) ✗ git push
Counting objects: 33, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (21/21), done.
Writing objects: 100% (21/21), 1.96 KiB, done.
Total 21 (delta 15), reused 0 (delta 0)
-----> Heroku receiving push
-----> Python/Django app detected
...
I need to install a gem program on the dyno (specifically, Compass).
Heroku's instructions are to provide a Gemfile and Gemfile.lock in the root directory with the required gems. As soon as I provide this, however, Heroku thinks the app is a Ruby app:
(venv)➜ djangoproject git:(development) ✗ git push
Counting objects: 33, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (21/21), done.
Writing objects: 100% (21/21), 1.96 KiB, done.
Total 21 (delta 15), reused 0 (delta 0)
-----> Heroku receiving push
-----> Ruby app detected (NOTE: this is paraphrased)
...
Is there any way I can install a ruby gem while running the site as a Python/Django app?

Try explicitly selecting the Python buildpack with heroku config:add BUILDPACK_URL=https://github.com/heroku/heroku-buildpack-python.git
Heroku will still perform its detection process, but I think (?) it will run the buildpack you've explicitly selected before, or instead of, attempting any others, and since you still have a Python application, it should work.
Note that after you do the config:add you need to rebuild your slug on Heroku, which currently can ONLY be done by pushing an actual code change via git. If you don't have any real changes to push, you can make an empty commit with git commit --allow-empty -m "Empty commit"
You can also create a new project using the --buildpack command line option.
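Putting that together, the whole sequence looks something like this (the empty commit is only there to force a rebuild):
$ heroku config:add BUILDPACK_URL=https://github.com/heroku/heroku-buildpack-python.git
$ git commit --allow-empty -m "Empty commit"
$ git push heroku master
Or, for a brand-new app:
$ heroku create --buildpack https://github.com/heroku/heroku-buildpack-python.git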

I ran into the same problem and this worked for me:
https://github.com/ddollar/heroku-buildpack-multi
How it works:
You explicitly tell Heroku that you want to use this "multi" buildpack using the "heroku config:add BUILDPACK_URL=..." command
You create a .buildpacks file in your root directory that simply lists the git URLs for the various buildpacks you want to use; I used the Python and Ruby buildpacks (see the example after this list).
git push to Heroku and watch all buildpacks get used
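For example, a .buildpacks file combining the Ruby and Python buildpacks might look like this (the Python buildpack listed last, since, as I understand the multi buildpack, the last entry determines the default process types):
# .buildpacks
https://github.com/heroku/heroku-buildpack-ruby
https://github.com/heroku/heroku-buildpack-python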
It's also worth mentioning that the python buildpack has a couple of hooks that you can use to do additional custom work. If you create a bin/pre_compile file or a bin/post_compile file then those scripts will be called by the python buildpack just before/after the main compile step. So you could also use these hooks to get Ruby or other dependencies installed. But IMO it's easier to let Ruby's own buildpack install the Ruby dependencies.
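To show the shape of such a hook, here is a hypothetical bin/post_compile (the name and location are what the buildpack looks for; the contents are entirely up to you):
#!/usr/bin/env bash
# bin/post_compile -- run by the python buildpack after its main compile step
echo "-----> Running custom post_compile hook"
# e.g. compile assets here, assuming an earlier buildpack installed compass:
# compass compile --time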

You need to use a custom buildpack that builds both Ruby and Python dependencies.
heroku config:add BUILDPACK_URL=https://github.com/mandest/heroku-buildpack-rubypython
Add a Gemfile to your project
Run bundle install locally (to create the Gemfile.lock file)
Push your Gemfile and Gemfile.lock to Heroku
That should first install Ruby, then run bundle install, then install Python and all the dependencies in the requirements.txt file.
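For the Gemfile step, a minimal example that pulls in Compass might look like this (hypothetical; pin a version if you need one):
source 'https://rubygems.org'
gem 'compass'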
However, in my case I also wanted to run some commands using Ruby libraries, namely Sass/Compass. In order to do that, you have two options, I think. The first is to fork the above repository and run those commands as part of the build (this way they have all the needed privileges, rather than you running them via heroku run ...).
The second option is to add a Rakefile and specify those things in the rake assets:precompile task.
So, in my case with Compass the Rakefile looks like:
require 'yaml'
require 'pathname'
require 'rspec/core/rake_task'
include FileUtils
namespace 'assets' do
  desc 'Updates the stylesheets generated by Sass/Compass'
  task :precompile do
    print %x(compass compile --time)
  end
end

Related

create dockerfile to build new images

FROM denmarkcontrevida/base:15.05
MAINTAINER Denmark Contrevida<denmarkcontrevida#esutek.com>
# Config files
# Config pyenv
# Config Nginx
# Config PostgreSQL
# Create DB & Restore database
This image will install the newest versions of:
PostgreSQL
Nginx
Pyenv
Django
Python 3
If you are about to install that many different services, make sure to start from a base image made to manage them.
Use phusion/baseimage-docker (which always starts its my_init script, to take care of zombie processes).
In that image, you can define multiple program (daemon) to run:
You only have to write a small shell script which runs your daemon, and runit will keep it up and running for you, restarting it when it crashes, etc.
The shell script must be called run, must be executable, and is to be placed in the directory /etc/service/<NAME>.
If your base image has a /etc/service/helper/run script, then any image based on it would run helper, plus any other /etc/service/xxx/run scripts of your own: replace xxx with the running services, like nginx, django, postgresql.
You wouldn't need one for python3 (which is simply invoked, and doesn't run in the background).
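For example, a minimal /etc/service/nginx/run could look like this (flags are illustrative; the key point is that the process must stay in the foreground so runit can supervise it):
#!/bin/sh
# runit keeps this process up and restarts it if it crashes
exec /usr/sbin/nginx -g "daemon off;"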

heroku PostGIS syncdb error

I am having trouble getting a simple GeoDjango app running on heroku. I have created the postgis extension for my database but I am not able to run syncdb without getting the following error:
from django.contrib.gis.geometry.backend import Geometry
File "/app/.heroku/python/lib/python2.7/site-packages/django/contrib/gis/geometry/backend/__init__.py", line 14, in <module>
'"%s".' % geom_backend)
django.core.exceptions.ImproperlyConfigured: Could not import user-defined GEOMETRY_BACKEND "geos".
Any ideas what I am doing wrong? Also, does anyone know of a tutorial for getting a simple GeoDjango project running on Heroku? Thanks for your help.
I ran into this same issue and Joe is correct, you are missing a buildpack. What I did differently was include both the heroku-geo-buildpack and the heroku-buildpack-python. Both can be included by using heroku-buildpack-multi and adding a .buildpacks file to your project's root directory that lists the other buildpacks.
https://github.com/ddollar/heroku-buildpack-multi
So set buildpack-multi as your buildpack and add a .buildpacks file in your project base directory:
$ heroku config:set BUILDPACK_URL=https://github.com/ddollar/heroku-buildpack-multi.git
$ touch .buildpacks
# .buildpacks
https://github.com/cyberdelia/heroku-geo-buildpack.git#1.0
https://github.com/heroku/heroku-buildpack-python
When you push this, Heroku will install the software packages required to run python (python, pip, etc), along with the software packages required to run postgis (geos, proj and gdal).
I gave heroku-buildpack-geodjango a try but I believe it might be out of date (hasn't been updated in a year).
I just ran into the exact same error after using the multi buildpack method from ddollar https://github.com/ddollar/heroku-buildpack-multi with no problems up until this morning. As Jeff writes, you just point your buildpack at the multi and then add a .buildpacks file.
$ heroku config:set BUILDPACK_URL=https://github.com/ddollar/heroku-buildpack-multi.git
$ cat .buildpacks
# .buildpacks
https://github.com/cyberdelia/heroku-geo-buildpack.git
https://github.com/heroku/heroku-buildpack-python
Also don't forget to add django.contrib.gis to INSTALLED_APPS in your settings.
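That is, something like this in settings.py (a sketch; your app list will differ):
INSTALLED_APPS = (
    'django.contrib.contenttypes',
    'django.contrib.gis',
    # ... your other apps ...
)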
Everything should go well and the geos and gdal libs will be installed when you push to Heroku, but you will find that Django doesn't find them and you get the error. This is because Django wants the full path, as per the docs:
https://docs.djangoproject.com/en/dev/ref/contrib/gis/install/geolibs/
So add this to settings.py:
from os import environ
GEOS_LIBRARY_PATH = "{}/libgeos_c.so".format(environ.get('GEOS_LIBRARY_PATH'))
GDAL_LIBRARY_PATH = "{}/libgdal.so".format(environ.get('GDAL_LIBRARY_PATH'))
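To verify that the libraries now load, a quick check in a one-off dyno (heroku run python manage.py shell) should work:
from django.contrib.gis.geos import GEOSGeometry
print(GEOSGeometry('POINT(0 0)'))  # raises ImproperlyConfigured if GEOS is still missing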
It seems like you are missing some C libraries. Consider the GeoDjango Heroku buildpack:
https://github.com/cirlabs/heroku-buildpack-geodjango/
heroku create --stack cedar --buildpack http://github.com/cirlabs/heroku-buildpack-geodjango/
git push heroku master
The required libraries should be installed automatically using these commands.

heroku buildpack for python/geos

I am running my django app on heroku. I want to use their websolr plugin to add spatial search to the app via django haystack.
Spatial search in django-haystack depends on the GEOS C library, which is not available on Heroku by default.
So in order to use spatial search I followed https://devcenter.heroku.com/articles/buildpack-binaries to create a binary package of GEOS.
To deploy the binaries I forked the Heroku buildpack for Python and modified bin/compile to include:
AWESOME_VM_BINARY="http://vulcan-dtornow.herokuapp.com/output/05391114-f314-4aa7-9aab-bc09025d4898"
# fetch the prebuilt GEOS tarball and unpack it into the slug's vendor directory
mkdir -p /app/.heroku/vendor/geos
curl $AWESOME_VM_BINARY -o - | tar -xz -C /app/.heroku/vendor/geos -f -
I added the custom buildpack to my application and redeployed, but I still cannot access the library. When I run ls, the geos folder does not show up:
heroku run ls /app/.heroku/vendor
Any idea what I am missing? Thanks for your help!
Another option is using a buildpack that only contains the geospatial libraries and combine it with the python buildpack. This is a cleaner separation:
https://github.com/cyberdelia/heroku-geo-buildpack/
in combination with
https://github.com/heroku/heroku-buildpack-multi
To use it, add a .buildpacks file to your repo that looks something like this:
https://github.com/cyberdelia/heroku-geo-buildpack.git
https://github.com/heroku/heroku-buildpack-python.git
(the use of multi buildpacks is explained in the multi buildpack repo as well)
You should be able to use the GeoDjango buildpack that was already created here

django + virtualenv = atomic upgrade - is it possible?

I have a Django project running within a virtualenv with no site-packages. When it comes to pushing my new changes to the server, I would like to create a new virtualenv directory, install my project and all its dependencies, then do a quick renaming of the two virtualenv directories ONLY if the new code deployed successfully.
All is great on paper, until the point where you rename the virtualenv directory. The relocate option on virtualenv is not reliable, as per its own documentation.
How do you suggest upgrading my project ONLY if the new code is proven to be deployable?
Here are the steps:
# fab update_server to run the following:
cd /srv/myenv # existing instance
cd ../
virtualenv myenv-1
source myenv-1/bin/activate
git clone http://my.com/project
pip install -r project/req.txt
# all worked
mv myenv myenv-2; mv myenv-1 myenv
touch /path/proj.wsgi # have apache reload us
The above is perfect, but renaming or relocating a virtualenv is not reliable.
Upgrading the live site within myenv takes time and may break the site too.
How would you do it?
Buildout?
I do it with symlinks and completely separate release directories. That is, a deployment involves cloning the entire project into a new directory, building the virtualenv inside that, then switching the "production" symlink to point at that directory.
My layout is basically
/var/www/myapp/
    uploads/
    tmp/
    releases/
        001/myapp/
        002/myapp/
        003/myapp/
            ve/
            ...etc in each release directory...
    myapp # symlink to releases/003/myapp/
So, when I deploy to production, my deployment scripts rsync a completely fresh copy to /var/www/myapp/releases/004/myapp/, build a virtualenv in there, install all the packages into it, then
rm -f /var/www/myapp/myapp
ln -s /var/www/myapp/releases/004/myapp/ /var/www/myapp/myapp
My actual deployment script is a little more complicated, as I also make sure to keep the previous release around and marked, so that if I notice something is really broken, rolling back is just a matter of switching the symlink to point back at the previous one. (Some extra work is also necessary to clean up old, unused releases if you are worried about disk space.)
Every external reference (in apache config files or wsgi files or whatever) points at libraries and binaries in the virtualenv via /var/www/myapp/myapp/ve/. I also shun the use of source ve/bin/activate and instead point at the full path in config files, and I edit manage.py to use #!ve/bin/python so I can run commands with ./manage.py whatever and it will always work without me having to remember whether I've activated the right virtualenv.
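As a rough sketch, the whole deploy boils down to something like this (paths and the release number are illustrative):
#!/bin/sh
set -e
RELEASE=/var/www/myapp/releases/004/myapp
rsync -a --exclude='.git' ./ "$RELEASE/"
virtualenv "$RELEASE/ve"
"$RELEASE/ve/bin/pip" install -r "$RELEASE/requirements.txt"
# only switch the symlink once the new release has built successfully
ln -sfn "$RELEASE" /var/www/myapp/myapp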

how to push to a remote only some directories or files? in git

I'm using heroku to develop a Django app and they use git to push the code. My problem is that they need this file structure:
heroku_project/
    requirements.txt (a pip requirements file)
    Procfile (tells Heroku how to run your app)
    django_project (the project itself)
    bin
    build
    include
    lib (these 4 folders belong to my python virtual env)
So I have to initialise git in this folder, which means there are these additional files:
heroku_project/
    .gitignore
    .git
According to their instructions, .gitignore should contain these lines:
bin
build
include
lib
.Python
*.pyc
The problem is that I want to track those virtualenv folders, because sometimes I install Python packages only for testing and discard them later, or I make experimental changes to them that I wish I could undo using git. My question is: how can I track these folders? I need to remove them from the .gitignore, but the problem is when I do
git push heroku master
This will push those folders, and we don't want that. So how can I selectively push files and directories? Or what kind of workflow would you use to solve this problem?
Thanks
First, if you're doing active development in Heroku then you may be dead in the water. But if you're doing development on your local machine, branches may be able to help you out.
My advice to you would be to create a separate branch for deploying code to heroku. In this scenario you could use the master branch for active development, and keep those virtual environment folders in there - and have a separate branch (say, "production") for deploying the code to heroku.
Whenever you're ready to release a new version of your code, you should switch over to the production branch, merge in the changes from master, delete those virtual environment folders, then push to Heroku. In practice, that command sequence will look something like this.
$ git checkout production
$ git merge master
$ rm -Rf bin build include lib .Python *.pyc
$ git commit -a -m "Cleanup for production."
$ git push heroku production:master
(Note the production:master refspec: Heroku only builds the master branch of its git remote, so the local production branch needs to be pushed to the remote's master.)
That seems as though it will be the best working solution. Some vectors you may want to look into on your own:
Ways to automate the process of deleting the files via shell scripts and git hooks (see the sketch after this list).
To make sure that Heroku can use a branch other than "master" for running code (I would think that it should be able to).
To see if it may be possible to use different .gitignore files in different branches, and if so - whether or not that will remove the cleanup process of deleting those files manually.
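For the first point, a rough sketch of such a script, assuming the branch layout above:
#!/bin/sh
set -e
git checkout production
git merge master
rm -rf bin build include lib .Python
find . -name '*.pyc' -delete          # also catches .pyc files in subdirectories
git commit -a -m "Cleanup for production."
git push heroku production:master     # Heroku only builds its remote's master branch
git checkout master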
Hope this helps!
Why don't you try virtualenvwrapper? It separates the virtualenv from your development environment.
The typical scenario is that you work in one virtualenv, let's say "main_env".
mkvirtualenv main_env
And when you need another one for testing, you can do
mkvirtualenv test_env
and you can switch between them with one command: workon [name]. You really shouldn't keep those files in git; they're simply not related to the project. And thanks to virtualenvwrapper, you don't need git to switch between those virtual environments.
If you insist on keeping them, well, you can simply NOT stage them in git. If you don't add a file/folder with git add, it won't be sent to the server.