What is the best way to have two different repositories for the same Django project?
I started developing a project months ago and I keep the whole project folder in one repository. I want to reuse some of the apps in other projects, so I would like to create a separate repository for them since they will be spin-off projects. But I also want to keep them updated in the original project.
What is the best workflow or methodology to achieve this? Or is it a bad approach?
Thanks!
Xavi
You can wrap each app as a Python package with its own git repo, and store all your packages in a private (or public?) Python package repository (like Gemfury).
Then, in your projects, install the app the same way you install Django itself: pip install myapp
This way the apps are reusable and decoupled from any particular project.
(This works very well for me; perhaps there is a better way.)
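For example (just a sketch; the name myapp, the version, and the metadata are placeholders, not from the original post), the package's setup.py could look roughly like this:
# setup.py for a reusable Django app (illustrative values only)
from setuptools import setup, find_packages
setup(
    name='myapp',
    version='0.1.0',
    packages=find_packages(),      # picks up the myapp/ package
    include_package_data=True,     # ship templates/static listed in MANIFEST.in
    install_requires=['Django'],   # the app's own dependencies go here
)
Once it's published (or installable straight from its git URL), every project can depend on it with pip install myapp and add 'myapp' to INSTALLED_APPS.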
You can use a submodule:
$ git submodule add git://github.com/yourusername/project2.git project2
$ cat .gitmodules
.gitmodules output:
[submodule "project2"]
path = project2
url = git://github.com/yourusername/project2.git
If you want to clone a git project that contains submodules:
git clone git://github.com/yourusername/project2.git
cd project2
git submodule init
git submodule update
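If you later want to pull new commits from the submodule's own repository into the parent project (a sketch; the branch name master is an assumption), one way is:
$ cd project2
$ git pull origin master
$ cd ..
$ git add project2
$ git commit -m "update project2 submodule"
On newer git versions, git submodule update --remote project2 does the fetch-and-move step for you; either way the parent repo records the new submodule commit.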
Related
I have a Django project A which contains an app App1. I also have a Django project B in which I'd like to use the same App1. When I edit App1 in project A, I'd like to be able to pull those changes into project B.
A and B are in separate git repositories.
What should be my workflow?
Should I create a new git repo for the App1 I want to share and add it as a submodule? I've read a lot about people struggling with submodules and tricking git...
I don't want to break anything...
Cheers
Okay, my question was pretty stupid. I created a separate git repo for App1: I made a new directory on my computer, copied App1 into it, and pushed it to git. Then, for each of my Django projects, I added it as a git submodule, and that was it!
I want to clone (via git) an external app into my project directory. Unfortunately there is an extra folder at the top of the repository that stops Django from seeing the cloned folder as an app.
For example, see allauth. After cloning, the app itself lives in allauth/allauth, i.e. my_project/allauth/allauth from the project's point of view. If I just add allauth to INSTALLED_APPS, the app is not found by the server. I also tried adding allauth.allauth, which doesn't work either.
What is the recommended way to clone an external app into a Django project folder (and manage it as submodule for example)?
You can clone it into a vendor/ directory and then symlink its app folder into your project, but I'd recommend against that.
A better way would be to use a virtual environment, and install the application as an editable package.
$ pip install -e git+https://github.com.au/person/project.git@v0.1.1#egg=project
This will clone the repo into the src/ folder of your virtualenv and set up the paths correctly so that it can be loaded normally by Django.
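After such an editable install the package is importable under its inner name, so (using allauth from the question as the example, plus one of its optional sub-apps) the settings entry would look like:
INSTALLED_APPS = [
    # ...
    'allauth',          # not 'allauth.allauth'
    'allauth.account',  # optional sub-app, only if you use it
]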
I know this is perfectly possible and a lot of people are already doing it, but my problem is slightly different and I haven't been able to figure out a solution yet.
Suppose a 3rd-party Django app has the following structure:
django-module/
    module/
        __init__.py
        views.py
        models.py
    requirements.txt
    setup.py
I want to bundle only the module directory as a submodule, because then I can access the views.py file just by typing "module.views". If I added the django-module directory instead, I would have to write django-module.module.views to access the module's files, which is not feasible.
My purpose is to modify the module and make pull requests to the original repository occasionally. Is there a workflow that I can follow, or what are the best practices for this purpose?
Pip tips
Pip has support for editable packages and retrieving packages with git, so you could create a virtualenv, use pip to install the packages, and update them using pip when you want.
So you could add:
-e git://git.myproject.org/MyProject.git#da39a3ee5e6b4b0d3255bfef95601890afd80709#egg=MyProject
To your requirements.txt to retrieve that exact commit.
Suggested workflow
I think that the best way to solve your issue is the following:
Make a private fork of the package
Edit the package in a specific development branch in the forked repository
Use the package from your fork's development branch in your requirements file.
When you feel like it, update the forked package where you're using it using pip.
When you're ready to make a pull request, pull origin, rebase your working branch to origin/master, and make the pull request from the master branch of your fork.
This means you have three places where the code is present:
The original repo (where you don't have push access)
Your forked repo (where you work on your fork)
Wherever pip installed it (where you use your fork)
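Concretely (the URL, branch, and egg name below are hypothetical), the requirements entry for your fork's development branch could look like:
-e git+https://github.com/yourname/django-module.git@develop#egg=django-module
pip then checks that branch out as an editable package, and you can update it later using pip, as described above.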
The best practice is to keep the submodules exactly as they are.
Once you've added a 3rd-party module to your app as a submodule, the next step is to make sure that "django-module" directory is on your Python path. As long as django-module is on your Python path, you will be able to use the submodule as you normally would, by typing "module.views" as you wish.
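One straightforward way to do that (a sketch; it assumes the django-module submodule sits next to manage.py) is a couple of lines near the top of manage.py or settings.py:
import os, sys
# put the submodule's root on the path so that "module" is importable
sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)), 'django-module'))
With that in place you can list 'module' in INSTALLED_APPS and import module.views as before.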
I am finally going to start using virtualenv for my Django projects on my development machine. Before I start I want to know if there are any special considerations for dealing with my existing projects. My presumed workflow is something like:
1. make a new virtualenv
2. activate the new virtualenv
3. install Django in there
4. pip install all the packages I know I need for my existing project
5. copy my Django project files, app files, and git files into the project folder within the virtualenv
Edit
6. make requirements file for deployment
This is obviously very simplified, but are there any steps or considerations I am fundamentally missing? Is git going to be happy about the move? Also, is it best practice to have a separate virtualenv for each Django project?
I know this is not a typical code problem, but I hope those that know more than I do can point me in the right direction.
Many thanks.
I don't see any big issue in migrating your projects, and I think your 5-step plan is correct. In particular, for steps 3/4/5 (I'd merge them), you can handle project dependencies with pip, possibly using requirements files.
Requirements files are plain text files telling pip which packages have to be installed in your virtualenv, including your git-tracked projects, which can be deployed into your virtual environment as development eggs (they carry their version control info with them).
Once you have a req file, it's a matter of:
pip install -r file.req
to have all needed packages installed in your env.
As you can see from virtualenv docs, a typical req file would contain something like:
django==1.3.0
-e git://git.myproject.org/MyProject.git#egg=MyProject
I usually keep each project in its own virtualenv, so I can deploy it to the production server the same way I do for local development.
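In practice that per-project flow is just a few commands (a sketch; env and file.req are placeholder names):
$ virtualenv env
$ source env/bin/activate
$ pip install -r file.req    # Django plus your git-tracked development eggs
$ pip freeze                 # check exactly what ended up in the environment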
I have a Django project sitting in a Bitbucket repository. Is it possible to automate the following process using buildout:
1. Install Django
2. Clone the Django project from the hg repository
3. Install the dependency modules of the Django project
Update: I have achieved what I wanted with the help of the mr.developer extension, as suggested by Ross.
While doing that, another question popped up: what is the best place to specify the dependencies - in buildout.cfg or in the setup.py of the 'develop' modules? For now I have duplicated the specification.
Generally, you make your checkout the buildout itself, so you'd place buildout.cfg and bootstrap.py in your project root. That way, when someone checks out/clones your project, they just do the bootstrap/buildout dance and they're up and running.
If you have multiple checkouts, then look into mr.developer.
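As a sketch of what that can look like (the repository URL, egg names, and the zc.recipe.egg part are illustrative assumptions, not taken from the question), a buildout.cfg using mr.developer might be:
[buildout]
extensions = mr.developer
sources = sources
auto-checkout = *
parts = django
[sources]
# mr.developer clones this into src/ and registers it as a develop egg
myproject = hg https://bitbucket.org/yourname/myproject
[django]
recipe = zc.recipe.egg
eggs =
    Django
    myproject
Running bootstrap.py and then bin/buildout checks out the source and installs the eggs, pulling in whatever each develop egg declares in its setup.py.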