django: pip install 'app' installs to the venv?

I read that it is not proper to have the venv in the git repo, which should be taken care of by requirements.txt, but I have run into a problem...
If I am working in my venv and I do a pip install of some app, it installs it into the venv's site-packages. It still works and everything if I add it to INSTALLED_APPS, but what if I make changes within that directory? Then git doesn't track them and I am out of luck when I try to push.
What is the proper way to do this?
EDIT: I must be having a huge miscommunication here so let me explain with a concrete example...
I run...
pip install django-messages
This then installs django-messages into my venv. I know I can run...
local: pip freeze > requirements.txt
remote: pip install -r requirements.txt
My problem is that I want to make changes to django-messages/templates or django-messages/views, thus deviating my django-messages from the one which can be installed from requirements.txt.
I don't see how these are to remain in my venv without being completely uneditable/untrackable

This is exactly how it is supposed to work. You track what libraries you install via your requirements.txt, which is committed along with your code. You use that file to generate your venv, and the libraries are installed there. You don't include the venv itself in your repo.
Edit: The reason you are finding this hard is that you are not supposed to do it. Don't change third-party projects; you should never need to. They will be configurable.
If you really really find something you need to fix, do as suggested in the comments and fork the app. But this is definitely not something you need to do all the time, which points to the likelihood that you have not understood how to configure the apps from within your own project.
For example, in the case of customising templates, you can simply define the templates inside your own templates dir, rather than editing the ones provided with the app; Django does the right thing and uses yours first.
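For instance, here's a hedged sketch (the template path and settings layout are illustrative, not taken from the question, and this assumes a reasonably recent Django with the TEMPLATES setting and the standard BASE_DIR variable in settings.py):

# settings.py: list your project-level templates directory so it is searched before app templates
import os

TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': [os.path.join(BASE_DIR, 'templates')],  # project templates win over app templates
        'APP_DIRS': True,
        'OPTIONS': {},
    },
]

Then creating, say, templates/django_messages/inbox.html in your own project makes Django load your copy instead of the one shipped inside the installed app.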

From your edits it looks like what you want to do is fork the django-messages library. This means that installing it into site-packages is a bad idea in the first place, since site-packages is not supposed to be version controlled or edited; it is designated for third-party software. You have two options. You can grab the source from GitHub, put it somewhere your Django project can find it (maybe fiddle with your Python path), and add that location to git, perhaps even making your own fork on GitHub. The second option is to use pip's editable install (pip install -e) pointed at the repository URL, so pip installs an "editable" version. The advantage of the first way is better control over your changes; the advantage of the second way is having pip manage the source download and install.
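As a hedged sketch of that second option (the repository URL here is a hypothetical fork, not the project's real location):

pip install -e git+https://github.com/<your-username>/django-messages.git#egg=django-messages

The same line, starting with -e, can also go into requirements.txt so the fork gets reinstalled on other machines.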
That being said, you seem kinda new to the Python environment. Are you REALLY sure you want to make your own fork? Is there some functionality you are missing that you want to add to the messages library? You do know that you can override every single template without changing the actual library code?

Related

Using Sass with Django

I'm looking for a reasonably simple toolset and workflow for incorporating Sass into my Django projects. I'm predominantly backend-focused and have just started investigating Sass, so bear with me.
My initial idea was to keep things simple by just using node-sass without trying to incorporate Gulp, django-pipeline or anything else initially.
My Django apps are usually structured such that I create a static/app/css folder in each app. One option, I guess, would be to now create an additional folder per app for scss files, i.e. static/app/scss. The problem there would be that when running collectstatic in production, the scss files will be gathered as well. So should the scss files for each app be kept somewhere else? (I guess it doesn't really matter if the scss files are included when collectstatic runs?)
Next, outside of my Django project folders I would create a folder to install node-sass, since I wouldn't want to install it globally and I don't want the node_modules folder inside my Django project or inside source control.
I guess the node_modules folder can be thought of like a Python virtualenv, as opposed to installing packages globally?
Next, inside my Django project somewhere (not sure where?) I would have the package.json file containing a scripts section for every scss file I want compiled to css, e.g.:
"scripts": {
"compile:sass": "node-sass app1/static/app1/scss/style.scss app1/static/app1/css/style.css",
"compile:sass": "node-sass app2/static/app2/scss/style.scss app2/static/app2/css/style.css",
"compile:sass": "node-sass app3/static/app3/scss/style.scss app3/static/app3/css/style.css"
}
Lastly, I would just run the compile scripts with the watch flag to constantly compile any files I work on and put them in the correct folders.
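For reference, with per-app script names like the ones sketched above (my own naming, not from the original post), the watch invocation could look like:

npm run compile:sass:app1 -- --watch

npm passes everything after -- through to node-sass, and node-sass's --watch flag keeps recompiling on change.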
So my questions are: is the above setup a good approach (at least initially, if I'm not ready to add yet another tool like Gulp to the mix)?
Also, how will I run the compile scripts, considering my package.json file will be in the Django project somewhere and the node_modules folder containing the node-sass installation will be somewhere else?
I help maintain node-sass, so I won't say not to use it. There is an alternate libsass-python that you might want to look at if you're working with Python though.
Check out the django-sass-processor package. It's simple to configure and use. I've used it a few times and have had good experiences with it. The package removes the need for a separate build tool like Gulp and streamlines the whole process.
Here's a tutorial on how to integrate django-sass-processor into a Django project.
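For reference, a minimal configuration sketch based on how I've set it up; double-check the package's documentation for the current names:

# settings.py
INSTALLED_APPS = [
    # ...
    'sass_processor',
]

STATICFILES_FINDERS = [
    'django.contrib.staticfiles.finders.FileSystemFinder',
    'django.contrib.staticfiles.finders.AppDirectoriesFinder',
    'sass_processor.finders.CssFinder',
]

And in a template (the scss path is illustrative):

{% load sass_tags %}
<link href="{% sass_src 'app1/scss/style.scss' %}" rel="stylesheet" type="text/css">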

modifying 3rd party django projects

I'm to the point where I use many 3rd party apps, but I need to extend them. I'm trying to figure out the best way to modify an existing app. In the past I have just modified the code in the site-packages (knowing that it was bad), but that has its obvious downfalls.
Right now the closest I can get is to fork the github repo, and then I create an app in my project that I hook up to the forked github repo.
The problem is that what I'm forking is the project and not the app. So that means I have a structure like: project/app-project/app after I fork. Other things are also in the app-project directory such as LICENSE, AUTHORS etc.
Project
--App1
----Code
--App2
----Code
--ForkedAppProject
----LICENCE
----AUTHORS
----ForkedApp
------Code
If I just take the code inside the App directory, I can modify that, but then I'm losing all the source control for the Project (LICENSE, AUTHORS, etc) and that kind of defeats the purpose of forking the open-source app.
I want to set it up so that I am modifying the app code, but I am modifying so that my changes could contribute to the open-source project.
It's not necessary to place the forked app inside your project. After forking and modifying it, you can add the ForkedApp folder to your PYTHONPATH or build your own package like this
python setup.py sdist
Then install it in your system/virtualenv and update INSTALLED_APPS.
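A minimal setup.py sketch for that, with placeholder names:

from setuptools import setup, find_packages

setup(
    name='forkedapp',            # placeholder package name
    version='0.1.0',
    packages=find_packages(),
    include_package_data=True,   # include templates/static files listed in MANIFEST.in
)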
The simplest and most convenient way to do this is to use pip's --editable option. See “Editable” Installs in the pip documentation.
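For example (the path is illustrative):

pip install -e /path/to/ForkedAppProject

This installs the app into your virtualenv but keeps importing it from your working copy, so edits show up immediately and stay under your fork's version control.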

Best way to incorporate external django app into a project and safely make local changes

I am working on an ecommerce site, based on django-lfs, and I am finding that I need to make a number of changes to the django-lfs core files, i.e. adding additional properties to models, updating views, adding URL patterns etc. I started out placing django-lfs on my Python site-packages path with the view that if I needed to make any changes to the code I would either re-define the URL pattern (in the case where I needed to do something different in a view) or monkey-patch the code.
The thinking behind this was that I'd be able to keep the original django-lfs trunk clean and untouched, allowing me to update it to the most recent version and then independently update/test the local overrides, subclasses and monkey patches written against it.
As you might have guessed, this is quickly becoming a bit of a nightmare to manage, so I'm in desperate need of a cleaner and more stable solution.
The client project sits in a git repo, so I have been looking into submodule and subtree merging strategies... from everything I have read, though, I am finding it difficult to find any definitive answers that are simple to understand (I'm relatively new to git).
In short, I need to be able to:
1) Include an external git repository into the main projects repo
2) Either make changes directly to the external repo (but have it push to the projects git repo and not to the external remote origin) OR create a local copy of the external repo and then periodically merge the external repo with the copied folder.
I have no idea how to achieve this though. To be clear, I would like to end up with the following folder structure:
PROJECT_NAME
MEDIA
TRUNK
APPS
django-lfs
EXTERNAL-REPOS
django-lfs
The lfs app in the EXTERNAL-REPOS folder should be able to pull down updates from the official (external) django-lfs repository, and I should be able to freely make changes to the lfs folder stored in the APPS folder.
What I'm looking for, if at all possible, is a set of git commands/instructions to achieve the above, making use of the real-world folders outlined above rather than foo and bar references.
I hold my fingers tightly crossed and hope that someone out there can offer some advice :)
My quick take on this: either fork the project on bitbucket or github (depending on your preference of hg vs. git) and make a branch for your changes.
This will make it easier to keep your branch and the official master in sync.
Then, assuming you use pip+virtualenv, add the pointer to your repo/branch in your pip requirements file.
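Something like this in the requirements file, where the URL and branch name are placeholders for your own fork:

-e git+https://github.com/<your-username>/django-lfs.git@my-customisations#egg=django-lfs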
Unfortunately LFS uses buildout, so not quite sure what the equivalent of python setup.py develop would be (i.e. to install the package in your virtualenv site-packages but with links back to your repo so you can make changes without having to constantly run setup.py).

Is there a C++ dependency index somewhere?

When trying new software and compiling with the classic ./configure, make, make install process, I frequently see something like:
error: ____.h: No such file or directory
Sometimes, I get really lucky and apt-get install ____ installs the missing piece and all is well. However, that doesn't always happen and I end up googling to find the package that contains what I need. And sometimes the package is the wrong version or flavor and is already used by another package that I downloaded.
How do people know which packages contain which .h files or whatever resource the compiler needs? Is there a dependency resolver website or something that people use to decode failed builds to missing packages? Is there a more modern method of automatically downloading and installing transitive dependencies for a build (somewhat like Java's Maven)?
You can also use "auto-apt ./configure" (on Ubuntu, and probably also on Debian?) and it will attempt to download dependencies automatically.
If it's a package in Debian, you can use apt-get build-dep to get all deps.
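For example (the package name is illustrative, and this assumes deb-src lines are enabled in your sources.list):

sudo apt-get build-dep curl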
Otherwise, read the README that comes with the program -- hopefully, it lists all the deps for that program.
The required packages will hopefully be listed in the documentation for building the package. If it says you require foo, you'll probably want to look for foo and foo-devel, and perhaps libfoo-devel. If that doesn't help, in Fedora I'd do something like
yum install /usr/include/_____.h
(yum will look for the package containing said file). If none of the above works, look for the file name in Google, that should tell you the package where it comes from. But then the going will get rough...

The correct place to put a markdown extension file in a django project?

I've created a markdown extension file (called mdx_xxx.py) for a django project I'm working on but I can't really decide where to put it.
The documentation says that the file should reside on the PYTHONPATH and I've seen several blog posts inviting to just put the file in the root directory of the project.
However, that seems like an odd place to me as I would rather see it in the related application directory but then it's not on the PYTHONPATH anymore.
Could some experienced django programmer shed some light on this issue?
Thanks
Requiring extension files to live directly on the Python path, and not inside any package, is (IMO) an unfortunate limitation of the Python markdown implementation.
If your extension is very specific to your project, I think putting it in the project root is the best option available.
On the other hand, if your extension is reusable in other cases, I would package it up with a simple setup.py and install it into a virtualenv using pip, like I do with all my other dependencies.
Not everything should be in your project. This is a requirement, a dependency. You could still package them together, and you'll need to put this at the top level, I think. That basically means importable from the same location as the project itself. Personally, I push everything to a virtualenv, so it's nice and clean. If you do the same, your deployment process should include putting both your project and any dependencies safely into that virtualenv. Otherwise, put it in whatever location you have on your path.
If you are using the standard markdown library from pip (pip install markdown), now at version 2.3.1, the extension can be anywhere; you just have to provide a dotted path to it. The old style, having it directly on the PYTHONPATH in a module prefixed with mdx_, still works as well.
I have it in my app code: django_file_downloads.mdx_download.
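Outside of templates, the same dotted path can be passed straight to the markdown library; a small hedged sketch using the extension path above:

import markdown

text = "Some *markdown* source"
html = markdown.markdown(text, extensions=['django_file_downloads.mdx_download'])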
To use it from django templates:
{% load markup %}
{{ variable|markdown:'django_file_downloads.mdx_download' }}