I'm at the point where I use many 3rd party apps, but I need to extend them. I'm trying to figure out the best way to modify an existing app. In the past I have just modified the code in site-packages (knowing that it was bad), but that has obvious drawbacks.
Right now the closest I can get is to fork the GitHub repo and then create an app in my project that I hook up to the fork.
The problem is that what I'm forking is the project, not the app, so after forking I end up with a structure like project/app-project/app. Other things also live in the app-project directory, such as LICENSE, AUTHORS, etc.:
Project
--App1
----Code
--App2
----Code
--ForkedAppProject
----LICENSE
----AUTHORS
----ForkedApp
------Code
If I just take the code inside the App directory, I can modify that, but then I'm losing all the source control for the Project (LICENSE, AUTHORS, etc) and that kind of defeats the purpose of forking the open-source app.
I want to set it up so that I am modifying the app code, but in such a way that my changes could be contributed back to the open-source project.
It's not necessary to place the forked app inside your project. After forking and modifying it, you can add the ForkedApp folder to your PYTHONPATH, or build your own package like this:
python setup.py sdist
Then install it in your system/virtualenv and update INSTALLED_APPS.
The simplest and most convenient way to do this is to use pip's --editable option. See “Editable Installs” in the pip documentation.
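As a rough sketch of that workflow (the repository URL and project name below are placeholders for your own fork), you would clone the fork next to, not inside, your project and install it as editable into your virtualenv:

git clone https://github.com/yourname/forked-app-project.git
cd forked-app-project
pip install -e .   # links site-packages back to this checkout

After that, edits you make in the checkout take effect immediately in your Django project, and because the checkout is a full clone of the fork, you can commit the changes and open a pull request against the upstream project.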
Scenario:
I wish to use Google OR-Tools, specifically its C++ version, as a dependency in my C++ project. I chose to install it from a binary: I download the binary archive and use the provided Makefile to install it.
Question:
In the future, when a new version of OR-Tools is released, how do I update the dependency in my local project?
If I were using Python, JavaScript or Ruby, I'd use pip, npm/yarn or gem (i.e. package managers) to update dependencies. But since C++ doesn't really have one, how do I systematically update C++ dependencies that I installed from binary?
how do I systematically update C++ dependencies that I installed from binary?
Write a script that fetches and installs the dependencies however you wish, or set up one of the many package managers or build systems with package support that C++ has (Conan and vcpkg, for example).
What I tend to do when a project relies on 3rd party code is create a Git repository for the project (if there isn't one already) and add the third party code as a Git submodule. That way, you can easily pull the latest changes that were committed to the third party repository.
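As a rough sketch (the repository URL, path and tag are only examples), the submodule workflow looks something like this:

git submodule add https://github.com/google/or-tools.git third_party/or-tools
git commit -m "Add or-tools as a submodule"
# later, to move to a newer upstream release:
cd third_party/or-tools
git fetch
git checkout <release-tag>
cd ../..
git add third_party/or-tools
git commit -m "Update or-tools submodule"

Anyone cloning your project then runs git submodule update --init to get the pinned version of the dependency.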
You can streamline your workflow even more by adding this third party project to your build and changing your project's build configuration to treat it as a dependency. That way, the dependency is recompiled every time a change is made to it.
I'm looking for a reasonably simple toolset and workflow for incorporating Sass into my Django projects. I'm predominantly backend focused and have just started investigating Sass, so bear with me.
My initial idea was to keep things simple by just using node-sass, without trying to incorporate Gulp, django-pipeline or anything else.
My Django apps are usually structured such that I create a static/app/css folder in each app. One option, I guess, would be to now create an additional folder per app for the scss files, i.e. static/app/scss. The problem there is that when running collectstatic in production, the scss files will be gathered as well. So should the scss files for each app be kept somewhere else? (I guess it doesn't really matter if the scss files are included when collectstatic runs?)
Next, outside of my Django project folders I would create a folder to install node-sass in, since I wouldn't want to install it globally and I don't want the node_modules folder inside my Django project or inside source control.
I guess the node_modules folder can be thought of as something like a Python virtualenv, as opposed to installing packages globally?
Next, inside my Django project somewhere (not sure where?) I would have the package.json file containing a scripts section for every scss file I want compiled to css, e.g.:
"scripts": {
"compile:sass": "node-sass app1/static/app1/scss/style.scss app1/static/app1/css/style.css",
"compile:sass": "node-sass app2/static/app2/scss/style.scss app2/static/app2/css/style.css",
"compile:sass": "node-sass app3/static/app3/scss/style.scss app3/static/app3/css/style.css"
}
Lastly, I would just run the compile:sass scripts with the watch flag to constantly compile any files I work on and have them placed in the correct folders.
So my questions are: is the above setup a good approach (at least initially, if I'm not ready to add yet another tool like Gulp to the mix)?
Also, how will I run the compile:sass scripts, considering my package.json file will be in the Django project somewhere and the node_modules folder containing the node-sass installation will be somewhere else?
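One thing worth noting on that last point, as a sketch rather than a recommendation (the tools directory below is just an example location): npm installs a runnable CLI at node_modules/.bin, so you can call node-sass directly from wherever it lives without involving npm run at all:

# from the Django project root, watching one app's stylesheet
~/frontend-tools/node_modules/.bin/node-sass --watch app1/static/app1/scss/style.scss app1/static/app1/css/style.css

If you do want npm run, the simpler compromise is to keep package.json and node_modules together next to manage.py and just add node_modules to .gitignore, since npm expects node_modules to sit beside the package.json it reads.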
I help maintain node-sass, so I won't say not to use it. There is an alternative, libsass-python, that you might want to look at if you're working with Python, though.
Check out the django-sass-processor package. It's simple to configure and use. I've used it a few times and have had good experiences with it. The package removes the need for a separate build tool like Gulp, so you don't have to worry about one, and it streamlines the whole process.
Here's a tutorial on how to integrate django-sass-processor into a Django project.
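For a rough idea of what the setup involves (this is from memory of the package's documentation, so verify the exact names there), the configuration amounts to a couple of settings plus a template tag:

# settings.py
INSTALLED_APPS += ['sass_processor']
STATICFILES_FINDERS = [
    'django.contrib.staticfiles.finders.FileSystemFinder',
    'django.contrib.staticfiles.finders.AppDirectoriesFinder',
    'sass_processor.finders.CssFinder',
]

# in a template
{% load sass_tags %}
<link rel="stylesheet" href="{% sass_src 'app1/scss/style.scss' %}">

The .scss files then live alongside your other static files; in development they are compiled on the fly, and for production the package provides a management command (compilescss, if memory serves) for offline compilation.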
I read that it is not proper to have the venv in the git repo, which should be taken care of by requirements.txt, but I have run into a problem...
If I am working in my venv and I do a pip install of some app, it installs it into the venv's site-packages. It still works and everything if I add it to INSTALLED_APPS, but what if I make some changes within that directory? Then git doesn't track them and I am out of luck when I try to push it.
What is the proper way to do this?
EDIT: I must be having a huge miscommunication here so let me explain with a concrete example...
I run...
pip install django-messages
This then installs django-messages into my venv. I know I can run...
local: pip freeze > requirements.txt
remote: pip install -r requirements.txt
My problem is that I want to make changes to django-messages/templates or django-messages/views, thus making my django-messages deviate from the one that can be installed from requirements.txt.
I don't see how these can remain in my venv without being completely uneditable/untrackable.
This is exactly how it is supposed to work. You track what libraries you install via your requirements.txt, which is committed along with your code. You use that file to generate your venv, and the libraries are installed there. You don't include the venv itself in your repo.
Edit: The reason you are finding this hard is that you are not supposed to do that. Don't change third-party projects; you should never need to. They will be configurable.
If you really really find something you need to fix, do as suggested in the comments and fork the app. But this is definitely not something you need to do all the time, which points to the likelihood that you have not understood how to configure the apps from within your own project.
For example, in the case of customising templates, you can simply define the templates inside your own templates dir, rather than editing the ones provided with the app; Django does the right thing and uses yours first.
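As a small illustration (the app and template names are just examples; mirror whatever paths the app itself uses), overriding a template only requires reproducing its relative path under your project's templates directory, assuming that directory is listed in the DIRS option of your TEMPLATES setting:

project/
    templates/
        django_messages/
            inbox.html    <-- your copy is found before the one shipped with the app

This works because the filesystem template loader is consulted before the app-directories loader, so your version wins without the library's code or templates being touched.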
From your edits it looks like what you want to do is fork the django-messages library. This means that installing it into site-packages is a bad idea in the first place, since site-packages is not supposed to be version controlled or edited; it is designated for 3rd party software. You have two options. You can grab the source from GitHub and put it somewhere your Django project can find it (maybe fiddle with your Python path) and add that location to git; maybe even make your own fork on GitHub. The second option is to use pip install -e git+https://github.com/<user>/<project>.git#egg=<project> to have pip install an "editable" version. The advantage of the first way is better control over your changes; the advantage of the second way is having pip manage the source download and install.
That being said, you seem kinda new to the Python ecosystem. Are you REALLY sure you want to make your own fork? Is there some functionality you are missing that you want to add to the messages library? You do know that you can override every single template without changing the actual library code, right?
I have a package I want to install. I would like the files to end up in a different directory than the one the installation wizard chooses for them.
For example, my Sitecore copy is running at C:\SiteCore\website
The module added files to C:\SiteCore\website\Console
I would like the files to ultimately live at C:\SiteCore\website\sitecore_modules\Console
I am using Sitecore 6.5 rev 111230, but we are planning to upgrade very soon. I would like for my installed packages to migrate seamlessly once we have upgraded. For reference, the package I want to install at the moment is the Sitecore Powershell Extensions. Although, I would prefer to apply a similar method to any future packages that I install.
Is there a secret switch in the package installation process to allow me to do this? Can I do it from the package installation wizard? Is there another way to install packages?
I'm assuming I can't just change the package path and expect everything to keep working. Do I have to update a configuration somewhere (a file or inside the Sitecore CMS GUI) to make the package recognize the new file locations?
The module creator defines where files exist. If you move them you run the risk of something not working. The best idea is to ask the creator on the Marketplace page of the module.
There is no turn-key way to change this.
I guess you can take the code from the Marketplace and modify it.
I don't know exactly how licensing works for Marketplace modules, but I think people are allowed to modify others' code.
Please check the code and also the items; maybe some fields contain values for the folder path.
I discovered a way to accomplish this, but it can be quite involved or even impossible, depending on the complexity and size of the package.
First of all, I did take the question to the module creator and had a very helpful and informative conversation with the creator. So thanks for that suggestion - they may even move the install location in a future release, based on my request.
The workaround is to first install the package on a system as normal. Then you figure out everything that comes with the package. For files, this is easy if your Sitecore root is under source control. For items, this is really complicated. You can search for the installed items by owner, if you had the foresight to create & use a unique user for the package installation. Or you can check the untyped files in the package that are essentially xml based item manifests.
Once you have a detailed list, you make the desired modifications to the locations. Then you recreate the package yourself using the Sitecore package designer.
This works for simple packages - I did it to one small package that I hope to get up on the Sitecore marketplace as shared source soon. And by small, I mean it was 2 files and 3 items. The package that prompted me to ask this question would not cooperate with this workaround. The included .dll had some assumptions about the file structure hard-coded into it.
The workaround I took for the more complex package was really quite basic: I just created my new source code external to the required path. That let me wrap everything up neatly without getting medieval on the package files.
Thanks for both your answers, a very fine +1 to you.
I am working on an ecommerce site, based on django-lfs, and I am finding that I need to make a number of changes to the django-lfs core files... e.g. adding additional properties to models, updating views, adding url patterns, etc. I started out placing django-lfs in my Python site-packages path, with the view that if I needed to make any changes to the code I would either re-define the url pattern (in the case where I needed to do something different in a view) or monkey patch the code.
The thinking behind this was that I'd be able to keep the original django-lfs trunk clean and untouched, allowing me to update it to the most recent version and then independently update/test the local overrides, subclasses and monkey patches I had written.
As you might have guessed, this is quickly becoming a bit of a nightmare to manage, so I'm in desperate need of a cleaner and more stable solution.
The client project sits in a git repo, and so I have been looking into submodule or subtree merging strategies... From everything I have read though, I am finding it difficult to find any definitive answers that are simple to understand (I'm relatively new to git).
In short, I need to be able to:
1) Include an external git repository in the main project's repo
2) Either make changes directly to the external repo (but have it push to the project's git repo and not to the external remote origin), OR create a local copy of the external repo and then periodically merge the external repo into the copied folder.
I have no idea how to achieve this though. To be clear, I would like to end up with the following folder structure:
PROJECT_NAME
MEDIA
TRUNK
APPS
django-lfs
EXTERNAL-REPOS
django-lfs
The lfs app in the EXTERNAL-REPOS folder should be able to pull down updates from the official (external) django-lfs repository, and I should be able to freely make changes to the lfs folder stored in the APPS folder.
What I'm looking for, if at all possible, is a set of git commands/instructions to achieve the above that make use of the real-world folders outlined above, rather than foo and bar references.
I hold my fingers tightly crossed and hope that someone out there can offer some advice :)
My quick take on this: either fork the project on Bitbucket or GitHub (depending on your preference of hg vs. git) and make a branch for your changes.
This will make it easier to keep your branch and the official master in sync.
Then, assuming you use pip+virtualenv, add the pointer to your repo/branch in your pip requirements file.
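As a rough example (the GitHub user and branch name are placeholders for your own fork), that requirements.txt entry would look something like:

-e git+https://github.com/yourname/django-lfs.git@my-customisations#egg=django-lfs

pip then clones that branch as an editable install, so pulling in upstream updates is a matter of merging the official repository into your branch and re-running pip install -r requirements.txt.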
Unfortunately LFS uses buildout, so I'm not quite sure what the equivalent of python setup.py develop would be (i.e. installing the package into your virtualenv's site-packages but with links back to your repo, so you can make changes without having to constantly re-run setup.py).