I'm integrating two Django apps, but I'm finding that one requires django-mptt version 3.x and the other requires 5.x. I can't upgrade the 3.x app because I don't 'own' that particular app, and it might be needed by some old dependencies.
Is there any way short of forking and namespacing the django-mptt 5.x version so both can be used? I'd really hate to fork it and am wondering if there are better options out there.
Just wondering what others have done in similar situations. Thank you for reading.
This is not possible with the usual Python tools: a single Python process can only import one version of a package, and virtualenvs are designed to isolate whole environments, not to mix versions within one.
One possibility is to vendor each version of the dependency inside the corresponding app directory, so that each app imports its own copy first instead of the other version.
But if your goal is not to edit any of the apps' code, you had better pray that they don't play with the Python path, and that they don't share any imports related to the dependency.
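For illustration, a rough sketch of the vendoring idea using pip's --target option (the _vendor directory name and the exact version pin are assumptions, not anything dictated by either app):

# install the 5.x release privately, inside the app that needs it
pip install --target=newer_app/_vendor "django-mptt==5.0"
# newer_app must then put newer_app/_vendor at the front of sys.path
# (e.g. in its __init__.py) before anything imports mptt

The older app keeps resolving its copy from site-packages, so this only holds up as long as neither app leaks its version to the other.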
I read that it is not proper to keep the venv in the git repo, and that dependencies should instead be captured by requirements.txt, but I have run into a problem...
If I am working in my venv and I pip install some app, it goes into the venv's site-packages. Everything still works if I add it to INSTALLED_APPS, but what if I make some changes within that directory? Then git doesn't track them, and I am out of luck when I try to push.
What is the proper way to do this?
EDIT: I must be having a huge miscommunication here so let me explain with a concrete example...
I run...
pip install django-messages
This installs django-messages into my venv. I know I can run...
# locally
pip freeze > requirements.txt
# on the remote
pip install -r requirements.txt
My problem is that I want to make changes to django-messages/templates or django-messages/views, thus making my django-messages deviate from the one that can be installed from requirements.txt.
I don't see how these changes can live in my venv without being completely uneditable/untrackable.
This is exactly how it is supposed to work. You track what libraries you install via your requirements.txt, which is committed along with your code. You use that file to generate your venv, and the libraries are installed there. You don't include the venv itself in your repo.
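If the venv does live inside your project directory for convenience, the usual convention is simply to have git ignore it (assuming the directory is named venv; adjust to yours):

# .gitignore
venv/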
EDIT: The reason you are finding this hard is that you are not supposed to do it. Don't change third-party projects; you should never need to. They will be configurable.
If you really really find something you need to fix, do as suggested in the comments and fork the app. But this is definitely not something you need to do all the time, which points to the likelihood that you have not understood how to configure the apps from within your own project.
For example, in the case of customising templates, you can simply define the templates inside your own templates dir, rather than editing the ones provided with the app; Django does the right thing and uses yours first.
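For example (the template path below is just an illustration; django-messages' actual template names may differ), a project-level copy shadows the bundled one:

Project
--templates
----django_messages
------inbox.html

This works because Django's filesystem template loader is consulted before the app_directories loader by default, so your project's templates directory wins over the copies shipped inside each app.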
From your edits it looks like what you want to do is fork the django-messages library. This means that installing it into site-packages is a bad idea in the first place, since site-packages is not supposed to be version-controlled or edited; it is designated for third-party software. You have two options. You can grab the source from GitHub, put it somewhere your Django project can find it (maybe fiddle with your Python path), and add that location to git; maybe even make your own fork on GitHub. The second option is to use pip's editable mode (pip install -e with a VCS URL) to have pip install an "editable" version. The advantage of the first way is better control over your changes; the advantage of the second is that pip manages the source download and install.
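A sketch of the second option (the user and repository names are placeholders; point it at your own fork):

# install your fork in editable mode; pip keeps the clone in a local src/ directory
pip install -e git+https://github.com/<you>/django-messages.git#egg=django-messages

The #egg= part tells pip the project's name, so the editable install can be recorded in a requirements file.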
That being said, you seem kind of new to the Python ecosystem. Are you REALLY sure you want to make your own fork? Is there some functionality you are missing that you want to add to the messages library? You do know that you can override every single template without changing the actual library code?
I'm at the point where I use many third-party apps, but I need to extend them. I'm trying to figure out the best way to modify an existing app. In the past I have just modified the code in site-packages (knowing that it was bad), but that has its obvious downfalls.
Right now the closest I can get is to fork the github repo, and then I create an app in my project that I hook up to the forked github repo.
The problem is that what I'm forking is the project and not the app. So that means I have a structure like: project/app-project/app after I fork. Other things are also in the app-project directory such as LICENSE, AUTHORS etc.
Project
--App1
----Code
--App2
----Code
--ForkedAppProject
----LICENSE
----AUTHORS
----ForkedApp
------Code
If I just take the code inside the App directory, I can modify that, but then I'm losing all the source control for the Project (LICENSE, AUTHORS, etc) and that kind of defeats the purpose of forking the open-source app.
I want to set it up so that I am modifying the app code, but in a way that lets my changes be contributed back to the open-source project.
It's not necessary to place the forked app inside your project. After forking and modifying it, you can add the ForkedApp folder to PYTHONPATH, or build your own package like this:
python setup.py sdist
Then install it in your system/virtualenv and update INSTALLED_APPS.
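Spelled out, the whole flow looks roughly like this (the archive and app names are assumptions based on the example tree above):

# inside your ForkedAppProject checkout
python setup.py sdist
# install the built archive into your virtualenv
pip install dist/ForkedApp-1.0.tar.gz
# then add 'forkedapp' to INSTALLED_APPS in settings.py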
The simplest and most convenient way to do this is to use pip's --editable option. See "Editable" Installs in the pip documentation.
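For instance, assuming your fork is checked out next to your project (the path is illustrative):

# changes in the checkout take effect immediately, no reinstall needed
pip install -e ../ForkedAppProject

An editable install just drops a link to your checkout into site-packages, so git keeps tracking the fork while Django imports it like any installed package.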
I have a package I want to install. I would like the files to end up in a different directory than the one the installation wizard chooses for them.
For example, my Sitecore copy is running at C:\SiteCore\website
The module added files to C:\SiteCore\website\Console
I would like the files to ultimately live at C:\SiteCore\website\sitecore_modules\Console
I am using Sitecore 6.5 rev 111230, but we are planning to upgrade very soon, and I would like my installed packages to migrate seamlessly once we have upgraded. For reference, the package I want to install at the moment is the Sitecore PowerShell Extensions, although I would prefer to apply a similar method to any future packages that I install.
Is there a secret switch in the package installation process to allow me to do this? Can I do it from the package installation wizard? Is there another way to install packages?
I'm assuming I can't just change the package path and expect everything to keep working. Do I have to update a configuration somewhere (a file or inside the Sitecore CMS GUI) to make the package recognize the new file locations?
The module creator defines where files exist. If you move them you run the risk of something not working. The best idea is to ask the creator on the Marketplace page of the module.
There is no turn-key way to change this.
I guess you can take the code from the Marketplace and modify it.
I don't know exactly how licensing works for Marketplace modules, but I think people are allowed to modify others' code.
Check both the code and the items; some fields may contain values for the folder path.
I discovered a way to accomplish this, but it can be quite involved or even impossible, depending on the complexity and size of the package.
First of all, I did take the question to the module creator and had a very helpful and informative conversation with the creator. So thanks for that suggestion - they may even move the install location in a future release, based on my request.
The workaround is to first install the package on a system as normal. Then you figure out everything that comes with the package. For files, this is easy if your Sitecore root is under source control. For items, it is really complicated. You can search for the installed items by owner, if you had the foresight to create and use a unique user for the package installation. Or you can check the untyped files in the package, which are essentially XML-based item manifests.
Once you have a detailed list, you make the desired modifications to the locations. Then you recreate the package yourself using the Sitecore package designer.
This works for simple packages - I did it to one small package that I hope to get up on the Sitecore marketplace as shared source soon. And by small, I mean it was 2 files and 3 items. The package that prompted me to ask this question would not cooperate with this workaround. The included .dll had some assumptions about the file structure hard-coded into it.
The workaround I took for the more complex package was really quite basic: I just kept my own source code external to the required path. That let me wrap everything up neatly without getting medieval on the package files.
Thanks for both your answers, a very fine +1 to you.
I have received several recommendations to use virtualenv to clean up my Python modules. I am concerned because it seems too good to be true. Has anyone found downsides related to performance or memory issues when working with multicore settings, StarCluster, numpy, scikit-learn, pandas, or IPython notebook?
Virtualenv is the best and easiest way to keep some sort of order when it comes to dependencies. Python is really behind Ruby (bundler!) when it comes to dealing with installing and keeping track of modules. The best tool you have is virtualenv.
So I suggest you create a virtualenv for each of your applications, put together a file listing all the 'pip install' commands you need to build the environment, and make sure you have a clean, repeatable process for creating it.
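A minimal sketch of that workflow (the env directory name and package list are just examples):

# one virtualenv per application
virtualenv env
. env/bin/activate
pip install numpy pandas scikit-learn   # whatever the app needs
pip freeze > requirements.txt           # pin the exact versions
# later, on any machine:
pip install -r requirements.txt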
I think the nature of the application makes little difference. There should not be any performance issue, since all virtualenv does is load libraries from a specific path rather than from the default system-wide location.
In any case (this may be completely irrelevant), but if performance is an issue, then perhaps you ought to be looking at a compiled language. Most likely though, any performance bottlenecks could be improved with better coding.
There's no performance overhead to using virtualenv. All it's doing is using different locations in the filesystem.
The only "overhead" is the time it takes to set it up. You'd need to install each package in your virtualenv (numpy, pandas, etc.)
Virtualenvs do not deal with C dependencies, which may be an issue depending on how keen you are on reproducible builds and on capturing the whole machine setup in one process. You might end up needing to install C libraries through another package manager such as brew, apt, or rpm, and these dependencies can differ between machines or change over time. To avoid this, you might end up using Docker and friends, which adds another layer of complexity.
conda tries to address the non-Python dependencies. The issue is that it is bigger and slower.
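For example (the environment name and package list are illustrative), conda pulls in the compiled libraries alongside the Python packages:

# one environment with the scientific stack, C dependencies included
conda create -n analysis numpy scipy pandas scikit-learn
conda activate analysis   # older conda versions use: source activate analysis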
I have found a problem with the test environment in a C++ project.
We have a machine which downloads the code from the version control system, builds it, and executes the unit tests; nothing new.
The problem arises when we add a new dependency to our project. We are developing a lot of features at the same time, so this is relatively common. When this happens we have to notify the testers and give them an easy way to reproduce the compilation environment...
And I was wondering whether there is any other easy way to handle this... I don't know, some tool like virtualenv or buildout for Python...
I have been searching at google, but with no luck.
Any help will be appreciated.
You can always add all of the dependencies to the revision control system and provide automated scripts that will install the required subsystems. Where I work, if you just download the current version from the repository, you can build in one step an ISO image that can be installed by testers in any computer they want. The image contains everything from the OS up to the application.
Depending on your particular situation, you might want to start with smaller steps, like adding the dependencies to the repository and having the testers check there whether any new file appears or changes version.
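As a rough sketch of the bootstrap-script idea (every path and archive name here is made up):

#!/bin/sh
# unpack the vendored third-party libs that are checked into the repo
set -e
mkdir -p build/deps
tar xzf third_party/boost-1.55.tar.gz -C build/deps
tar xzf third_party/gtest-1.7.0.tar.gz -C build/deps

A tester who syncs the repository then only has to run this one script to reproduce the exact dependency set the build machine uses.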
No ready-made tool, AFAIK, except maybe CMake, which can control things like that for you.
For C++, it's fairly easy to manage "by hand" since you can set LIB, LIBPATH and PATH environment variables to carefully selected directories. No site.py, eggs, .pth files and the like as with Python.
We do this at our shop, setting up our build/development environment closely and have everything in revision control (mostly scripts that download huge zips of prebuilt libs and unpack them to the right places).
Small libs are copied to common dirs, larger get their own entry in the env-vars.
This works equally well for Python and Java. Haven't tried other languages...yet. :)
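For what it's worth, a sketch of what such an environment script might look like (all directory names are invented; since LIB and LIBPATH as named above are MSVC-style variables, this is written as a Windows batch file):

rem point the toolchain at the unpacked prebuilt libs
set DEPS=C:\ourproject\deps
set PATH=%DEPS%\tools\bin;%PATH%
set LIB=%DEPS%\bigframework\lib;%LIB%
set LIBPATH=%DEPS%\bigframework\lib;%LIBPATH%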