"github" and "git heroku" easy way to keep both - django

Until now I have used SVN for source control. I have now started a new project, and it is stored on GitHub.
The issue is that Heroku and GitHub both use git: the first to publish the app and the second for version control.
My app schema is:
base-dir                      <-- GitHub base
    some-text-files (Readme, ...)
    django-project-dir        <-- Heroku base
        manage.py
        main-app-dir
            settings.py
        other-app-dirs
            views.py
            models.py
When I push to GitHub, base-dir and all of its subfolders are pushed.
Only django-project-dir should be pushed to Heroku.
Note: I have tried creating a new git repository at the django-project-dir level, but git then treats it as a submodule and excludes it from the GitHub repository.
Because this is a new project, I can easily switch to a different directory layout.
My question:
What is the easiest way to have the Heroku and GitHub git configurations coexist?

Your best option is probably to push the full repository to Heroku but have Heroku ignore all files that are not required to run your application (see https://devcenter.heroku.com/articles/slug-compiler). Alternatively, consider creating two repositories: one for documentation and one for production code.
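For the first approach, the slug compiler honours a .slugignore file at the repository root; a minimal sketch (the entries are assumptions based on the layout in the question) might be:
# .slugignore - files excluded from the slug Heroku builds
README
*.md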

Your best bet is to move your readme and other files to your project root. Then just add GitHub as a separate remote (from within your project directory):
git remote add origin https://github.com/USERNAME/REPO
Then you can push to GitHub with git push origin master. You will have to do a forced push (the -f option) the first time assuming you're pushing what used to be the repo you used exclusively for Heroku.
You'll still be able to push to Heroku with git push heroku master.
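Put together (assuming the heroku remote already exists from heroku create), the first pushes look roughly like this:
git remote add origin https://github.com/USERNAME/REPO
git push -f origin master      # forced the first time, as noted above
git push heroku master         # deploys to Heroku as before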

You should have two remotes.
This is normal and even desirable.
You have GitHub, and that is your remote code repository of record.
Then you have your current deployment via Heroku, and that is the second remote.
Heroku is actually built around git as the mechanism for pushing changes to your site.
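With both remotes configured (the URLs and app name here are placeholders), git remote -v shows something like:
$ git remote -v
heroku  https://git.heroku.com/your-app-name.git (fetch)
heroku  https://git.heroku.com/your-app-name.git (push)
origin  https://github.com/USERNAME/REPO.git (fetch)
origin  https://github.com/USERNAME/REPO.git (push)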

Related

Local -> Github working | Local -> Heroku not working

I have a Django application on my local system.
Since I want to ignore the db.sqlite3 file when pushing the repository, I have put the following in .gitignore:
db.sqlite3
I push it to Github using:
git push origin master
When I do this, the updated db.sqlite3 from my local system is NOT transferred to GitHub.
As the next step, I need to transfer the files from my local system to Heroku using:
git push heroku master
However, it seems that the db.sqlite3 file from GitHub is copied to Heroku, which is weird.
Perhaps my understanding of git push heroku master is incorrect.
The deployment method is the Heroku CLI.
To verify this odd behaviour, I did the following:
1) I added a couple of entries to db.sqlite3 on my local system.
2) I made a small change to the code on my local system.
3) I made new entries in the Django application that is deployed to Heroku.
4) I pushed the application to GitHub using git push origin master and checked the timestamp on db.sqlite3 on GitHub: it had not changed. I downloaded db.sqlite3 from GitHub and checked; the new entries I made on the local system were not there. This is good.
5) I pushed the application to Heroku using git push heroku master and found that the entries I made in step 3 were gone, and the entries from step 1 were not reflected either.
6) I checked the GitHub db.sqlite3 file against the Heroku db.sqlite3 file, and they matched.
My requirements are as follows:
The changes to the data in the database that I make on my local system should not be reflected in the application deployed to Heroku (hence, I believe, .gitignore -> db.sqlite3).
Only the structural and application changes should go to production.
Any pointers in the right direction?
I figured this out myself, as with my last two questions.
I was misled by this command:
git update-index --assume-unchanged db.sqlite3
even though this link clearly says not to do so.
For the solution, git and .gitignore work perfectly fine (stating the obvious). It requires only one entry, db.sqlite3, and you need to ensure that you do not send db.sqlite3 to Heroku. Keep your .gitignore updated with db.sqlite3 and use PostgreSQL on Heroku (a settings sketch follows the steps below).
When I first did this, I received an error saying django-session was not set up. Basically, it meant that PostgreSQL was not ready for use. You need to ensure that you are ready to follow the steps below.
A few things to remember:
When experimenting with Django locally, you use db.sqlite3 and may be tempted to send the database file db.sqlite3 to Heroku without adding it to .gitignore. Don't do that.
Locally, use db.sqlite3; when deploying to Heroku, use PostgreSQL.
Create a virtual environment using pipenv
Use pipenv install psycopg2
Use heroku run bash -a <appname>
Go to the directory containing manage.py and run python manage.py migrate
Create your superuser: python manage.py createsuperuser
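For the database settings themselves, one common approach (dj-database-url is an assumption on my part; the steps above only mention psycopg2 and PostgreSQL) is to read Heroku's DATABASE_URL in settings.py and fall back to db.sqlite3 locally:
# settings.py sketch: sqlite3 locally, PostgreSQL via DATABASE_URL on Heroku
import os
import dj_database_url  # pipenv install dj-database-url

BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

DATABASES = {
    'default': dj_database_url.config(
        default='sqlite:///' + os.path.join(BASE_DIR, 'db.sqlite3'),
        conn_max_age=600,
    )
}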
This worked for me. I shall come back and update this a bit more. Three days of brain-wreck.
Finally, keep searching on GitHub; it is a goldmine of problems and solutions already provided. Sometimes we just need to connect the dots.

Force recreation of Heroku slug on git push

I have been hosting a Django app on Heroku for a while, and so far it has worked fine. I have now made some changes to the requirements.txt file, but it seems they are not correctly installed on Heroku when I git push. The log output shows the installation of some of the requirements, but not all of them. I guess the slug does not get recreated, but rather just updated. Is there a way to force recreation of the complete slug?
The https://github.com/lstoll/heroku-repo plugin will allow you to do that. Just install it and run heroku repo:rebuild -a appname.
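The plugin is installed with heroku plugins:install heroku-repo. If you would rather not add a plugin, an empty commit also forces Heroku to rebuild the slug on the next push:
git commit --allow-empty -m "force slug rebuild"
git push heroku master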

Multiple Django sites with shared codebase and DB

I have created a Django project with 20 sites (one domain per site) for 20 different countries. The sites share everything: codebase, database, URLs, templates, etc.
The only things they don't share are small customizations (logo, background color of the CSS theme, language code, etc.) that I set in each site's settings file (each site has its own settings file, and all of these files import a global settings file with the common stuff). Right now, in order to run the sites in development mode, I do:
django-admin.py runserver 8000 --settings=config.site_settings.site1
django-admin.py runserver 8001 --settings=config.site_settings.site2
...
django-admin.py runserver 8020 --settings=config.site_settings.site20
I have a couple of questions:
I've read that it is possible to create a virtual host for each site (domain) and pass it the site's settings.py file. However, I am afraid this would create one Django instance per site. Am I right?
Is there a more efficient way of doing the deployment? I've read about django-dynamicsites but I am not sure if it is the right way to go.
If I decide to deploy using Heroku, it seems that Heroku expects only one settings file per app, so I would need to have 20 apps. Is there a solution for that?
Thanks!
So, I recently did something similar, and found that the strategy below is the best option. I'm going to assume that you are familiar with git branching at this point, as well as Heroku remotes. If you aren't, you should read this first: https://devcenter.heroku.com/articles/git#multiple-remotes-and-environments
The main strategy I'm taking is to have a single codebase (a single Git repo) with:
A master branch that contains all your shared code: templates, views, URLs.
Many site branches, based on master, which contain all site-specific customizations: css, images, settings files (if they are vastly different).
The way this works is like so:
First, make sure you're on the master branch.
Second, create a new git branch for one of your domains, eg: git checkout -b somedomain.com.
Third, customize your somedomain.com branch so that it looks the way you want.
Next, deploy somedomain.com live to Heroku, by running heroku create somedomain.com --remote somedomain.com.
Now, push your somedomain.com branch code to your new Heroku application: git push somedomain.com somedomain.com:master. This will deploy your code on Heroku.
Now that you've got your somedomain.com branch deployed with its own Heroku application, you can do all normal Heroku stuff by adding --remote somedomain.com to your normal Heroku commands, eg:
heroku pg:info --remote somedomain.com
heroku addons:add memcache:5mb --remote somedomain.com
etc.
So, now you've basically got two branches: a master branch, and a somedomain.com branch.
Go back to your master branch, and make another new branch for your next domain: git checkout master; git checkout -b anotherdomain.com. Then customize it to your liking (css, site-specific stuff), and deploy the same way we did above.
Now I'm sure you can see where this is going. We've got one git branch for each of our custom domains, and each domain has its own Heroku app. The benefit (obviously) is that each of these per-site customizations is based on the master branch, which means that you can easily make updates to all sites at once.
Let's say you update one of your views in your master branch. How can you deploy it to all your custom sites at once? Easily!
Just run:
git checkout somedomain.com
git merge master
git push somedomain.com somedomain.com:master # deploy the changes
And repeat for each of your domains. In my environment, I wrote a script that does this, but it's easy enough to do manually if you'd like.
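A sketch of such a script (the domain list is a placeholder; it assumes, as above, that each site's branch and Heroku remote share the same name):
#!/bin/sh
# Merge master into every site branch and deploy it to that site's Heroku app.
for site in somedomain.com anotherdomain.com; do
    git checkout "$site"
    git merge master
    git push "$site" "$site":master
done
git checkout master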
Anyhow, hopefully that helps.

Google App Engine Development and Production Environment Setup

Here is my current setup:
GitHub repository, a branch for dev.
myappdev.appspot.com (not real url)
myapp.appspot.com (not real url)
App written on GAE Python 2.7, using django-nonrel
Development is performed on a local dev server. When I'm ready to release to dev, I increment the version, commit, and run "manage.py upload" to myappdev.appspot.com.
Once testing is satisfactory, I merge the changes from dev into the main branch. I then run "manage.py upload" to upload the main branch's code to the myapp.appspot.com domain.
Is this setup good? Here are a few issues I've run into.
1) I'm new to git, so sometimes I forget to add files, and the commit doesn't notify me. So I deploy code to dev that works, but does not match what is in the dev branch. (This is bad practice).
2) The datastore file in the git repo causes issues. Merging binary files? Is it ok to migrate this file between local machines, or will it get messed up?
3) Should I be using "manage.py upload" for each release to the dev or prod environment, or is there a better way to do this? Heroku looks like it can pull right from GitHub. The way I'm doing it now seems like there is too much room for human error.
Any overall suggestions on how to improve my setup?
Thanks!
I'm on a pretty similar setup, though I'm still running on py2.5, django-nonrel.
1) I usually use 'git status' or 'git gui' to see if I forgot to check in files.
2) I personally don't check in my datastore. Are you familiar with .gitignore? It's a text file in which you list files for git to ignore when you run 'git status' and other commands. I put .gaedata in it, as well as .pyc and backup files.
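In other words, a .gitignore along these lines (the *~ pattern is just an example of editor backup files):
.gaedata
*.pyc
*~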
To manage the database I use "python manage.py dumpdata > file" which dumps the database to a json encoded file. Then I can reload it using "python manage.py loaddata".
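For example (the fixture filename is arbitrary):
# dump the local database to a JSON fixture
python manage.py dumpdata > datastore_dump.json
# reload it later, or on another machine
python manage.py loaddata datastore_dump.json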
3) I don't know of any way to deploy directly from git. You could probably write a little Python script to check whether git is up to date before you deploy. Personally, though, I deploy to the test environment to make sure things work before I check them in.

How to push only some directories or files to a remote in git?

I'm using Heroku to develop a Django app, and they use git to push the code. My problem is that they need this file structure:
heroku_project/
    requirements.txt  (a pip requirements file)
    procfile          (this file tells Heroku how to run your app)
    django_project    (the project itself)
    bin
    build
    include
    lib               (these 4 folders belong to my Python virtual env)
So I have to have my git repository initialised in this folder, which means there are these additional files:
heroku_project/
    .gitignore
    .git
According to their instructions, the .gitignore should contain these lines:
bin
build
include
lib
.Python
*.pyc
The problem is that I want to track those virtualenv folders, because sometimes I install Python packages only for testing and discard them later, or I make experimental changes to them that I wish I could undo using git. To track these folders I would need to remove them from the .gitignore. The problem is that when I do
git push heroku master
those folders will be pushed too, and we don't want that. So how can I selectively push files and directories? Or what kind of workflow would you use to solve this problem?
Thanks
First, if you're doing active development on Heroku then you may be dead in the water. But if you're doing development on your local machine, branches may be able to help you out.
My advice would be to create a separate branch for deploying code to Heroku. In this scenario you use the master branch for active development, keeping those virtual environment folders in it, and you have a separate branch (say, "production") for deploying the code to Heroku.
Whenever you're ready to release a new version of your code, switch over to the production branch, merge in the changes from master, delete the virtual environment folders, then push to Heroku. In practice, that command sequence looks something like this:
$ git checkout production
$ git merge master
$ rm -Rf bin build include lib .Python *.pyc
$ git commit -a -m "Cleanup for production."
$ git push heroku production:master
That seems as though it will be the best working solution. Some avenues you may want to look into on your own:
Ways to automate the process of deleting the files via shell scripts and git hooks.
Whether Heroku can run code from a branch other than "master" (it builds only what is pushed to its master branch, hence the production:master refspec above).
To see if it may be possible to use different .gitignore files in different branches, and if so - whether or not that will remove the cleanup process of deleting those files manually.
Hope this helps!
Why don't you try virtualenvwrapper? It keeps your virtualenvs separate from your project directory.
The typical scenario is that you work in one virtualenv, let's say "main_env".
mkvirtualenv main_env
And when you need another one for testing, you can do
mkvirtualenv test_env
and you can switch between them with one command: workon [name]. You really shouldn't keep those files in git; they're simply not related to the project. And thanks to virtualenvwrapper, you don't need git to switch between those virtual environments.
If you insist on keeping them, well, you can simply NOT stage them in git. If you don't add a file/folder with git add, it won't be sent to the server.
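For example, with the layout from the question, you could stage only the application files and leave the virtualenv folders untracked:
git add requirements.txt procfile django_project
git commit -m "application changes only"
git push heroku master    # the untracked virtualenv folders are never pushed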