I'm coming from a .NET and Node background, where packages from the package manager (e.g. NuGet or npm) get installed into a specific /packages folder.
I'm learning Python, and when I do a pip install my packages either end up in a global store, or I can tell pip to put them into a specific place.
Is it good practice to put them into a separate folder, or does that upset the Zen of Python?
I want to do this so I can add that folder to .dockerignore and then pip install the packages inside the Docker container, based on my production operating system, from a requirements.txt file.
Is there a good reason I shouldn't do this? Why doesn't Python do this by default, since it seems to be a clean and portable approach to package management?
Usually you install them in the default location or within a Python virtual environment.
Use a virtual environment when you don't want to clutter up your default Python environment.
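For example, a minimal sketch using Python 3's built-in venv module (the name myenv is just a placeholder):

$ python3 -m venv myenv          # create the environment
$ source myenv/bin/activate      # activate it (on Windows: myenv\Scripts\activate)
(myenv) $ pip install requests   # installs into myenv only, not globally
(myenv) $ deactivate             # leave the environment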
I am deploying a Django app using Heroku.
When I run
git push heroku master
in my terminal I get the following error:
Could not find a version that satisfies the requirement command-not-found==0.3
When I run
sudo apt-get install command-not-found
I find that command-not-found is version 20.04.2. However, pip freeze tells me command-not-found is version 0.3.
command-not-found doesn't seem to exist on PyPI, but it is a package in Ubuntu and Debian repositories. It doesn't look like anything that your application should depend on, and it certainly doesn't belong on Heroku.
I suspect
you're trying to create your dependencies file after the fact, by simply doing pip freeze > requirements.txt, and
that you're either not working in a virtual environment or you created your virtual environment with system packages.
This is an antipattern that will cause several packages that your application doesn't actually need to be included in your requirements.txt. In this case it is even including Python packages that come from system packages and aren't meant to be installed from PyPI. Your requirements.txt should contain only your actual dependencies.
Instead of creating it with pip freeze after the fact, add things to that file first, and install them into your virtual environment with the same pip install -r requirements.txt command that you'll use in production. I also very strongly urge you to use a virtual environment.
In this case, I suggest you edit your requirements.txt and remove anything you don't actually need, commit, and redeploy.
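For example, a hand-maintained requirements.txt would contain only your direct dependencies, something like this (the packages and version pins below are purely illustrative):

Django==2.2.13
gunicorn==20.0.4
psycopg2==2.8.5

Then pip install -r requirements.txt gives you the same packages locally and on Heroku.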
I'm using Django and Python 3.7. I want to write a script to help me easily migrate my application from my local machine (a Mac running High Sierra) to a CentOS Linux instance. I'm using a virtual environment in both places. There are many things that need to be done here, but to keep the question specific: how do I determine, on the remote machine I'm deploying to, which dependencies are missing? I'm using rsync to copy the files (minus the virtual environment).
On the source system execute pip freeze > requirements.txt, then copy the requirements.txt to the target system, and on the target system install all the dependencies with pip install -r requirements.txt. Of course you will need to activate the virtual environments on both systems before executing the pip commands.
If you are using a source code management system like git it is a good idea to keep the requirements.txt up to date in your source code repository.
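A rough sketch of the whole round trip (the host name and paths are placeholders):

# on the source machine, inside the activated virtual environment
$ pip freeze > requirements.txt
$ rsync -av requirements.txt me@centos-host:/home/me/project/
# on the target machine, inside its activated virtual environment
$ pip install -r requirements.txt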
Good day.
I'm a newbie to Django and I have a slight confusion:
When deploying my Django app, do I need to deploy it with all the Python 'come-with' modules, or do the hosts already have them installed?
Also, I installed PIL for image manipulation. Would they also have it installed, or do I have to find a way to install it on their servers? Thanks in advance.
do I need to deploy it with all the Python 'come-with' modules
Never do that. It might conflict with the dependencies on the server. Instead, issue the following command to create a dependency file (requirements.txt).
pip freeze > requirements.txt (issue this command where manage.py is located)
On the server, create a new virtual environment. Now copy the Django project to the server (you can do this using git clone or just plain old FileZilla). Activate the virtual environment. Then change your current working directory to where manage.py is located. To install all the dependencies, issue the following command:
pip install -r requirements.txt
This will install the required dependencies on the server.
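Put together, the server-side steps might look like this (all paths are placeholders, and I'm assuming Python 3's venv module):

$ python3 -m venv /webapps/myproject/venv     # create a fresh virtual environment
$ source /webapps/myproject/venv/bin/activate
(venv) $ cd /webapps/myproject/src            # the directory containing manage.py
(venv) $ pip install -r requirements.txt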
This may be a silly question, but I would still like to ask:
I am developing a project using Django, CherryPy, and Nginx. I noticed that a file requirements.txt is usually created to list the packages and versions installed in the development environment. Suppose the directory of the virtual environment is /home/me/project/python2Venv.
When I deploy my Django project (tango) into production, the project is copied to the production directory:
sudo cp -r /home/me/project/tango /webapps/tango
For the virtual environment, may I just copy the whole directory using the following command, or should I install each of the packages into the production environment again according to requirements.txt?
sudo cp -r /home/me/project/python2Venv /webapps/tango/python2Venv
I think virtualenv uses absolute paths in some files, so recreating the env and installing the packages via requirements.txt would be safer.
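You can see this for yourself: in a typical virtualenv the scripts hard-code the environment's absolute path (the output below is what you would usually get):

$ head -n 1 /home/me/project/python2Venv/bin/pip
#!/home/me/project/python2Venv/bin/python
$ grep 'VIRTUAL_ENV=' /home/me/project/python2Venv/bin/activate
VIRTUAL_ENV="/home/me/project/python2Venv"

Copy the directory to /webapps/tango/python2Venv and those paths still point at the old location.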
In my opinion, it is recommended to install the packages with requirements.txt. Copying the directory can end up being a nightmare.
Say in update 1 you have 4 packages, each with a specific version (pkg1-ver1, pkg2-ver1, pkg3-ver1, pkg4-ver1).
In update 2 you upgrade one package to its new version (pkg1-ver2). With requirements.txt you would just upgrade that one package, instead of copying all the packages (although I am not sure how well copying the directory would work).
Hope this helps!
You should install packages with the file requirements.txt.
Or you can use virtualenvwrapper. It provides commands that make it easy to manage virtual environments locally, such as cpvirtualenv, rmvirtualenv, etc.
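For example (the location of virtualenvwrapper.sh varies by system, so adjust the path for your install):

$ pip install virtualenvwrapper
$ source /usr/local/bin/virtualenvwrapper.sh   # path may differ on your machine
$ mkvirtualenv python2Venv                     # create an environment
$ cpvirtualenv python2Venv python2VenvCopy     # clone it
$ rmvirtualenv python2VenvCopy                 # remove it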
So I was installing an app in order to bootstrap my Django admin interface, and I thought this app was going to be project-specific, but it appears to be installed at a global Django level.
http://riccardo.forina.me/bootstrap-your-django-admin-in-3-minutes/
https://github.com/riccardo-forina/django-admin-bootstrapped
My question is: how can I uninstall it if I need to do so at a later date? I wanted my project to be as independent as possible. I was also wondering if there is a way of doing the installation within the project so that people who download my repository will automatically get it.
Also, a minor question: after adding django_admin_bootstrapped into the INSTALLED_APPS list before django.contrib.admin, I was not required to run a syncdb command as we usually are when installing models. I guess this application doesn't create tables in my database, so that is probably why, but I just wanted to know your thoughts.
I know it is a lot to answer but any clarification is appreciated. Thanks.
If you have installed the Django app using pip, do:
pip uninstall app_name
Otherwise, you have to go to your site-packages directory manually and remove it.
After that,
remove the app from INSTALLED_APPS. django-admin-bootstrapped might have overridden your templates. After you are done, run ./manage.py collectstatic and ./manage.py syncdb.
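Put together, the cleanup would be:

$ pip uninstall django-admin-bootstrapped
# then remove django_admin_bootstrapped from INSTALLED_APPS in settings.py
$ ./manage.py collectstatic
$ ./manage.py syncdb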
If you're writing something that you want other people to use, and it relies on other packages (whether Django apps or more generic Python packages) it's standard to use pip. This makes it easy to install and uninstall packages, and specific versions of those packages. You can then create a requirements.txt file, which you include with your project. This lets other people know what packages are required, and they can easily install them using pip.
So, first off, install pip.
Then you would install django-admin-bootstrapped by doing:
$ pip install django-admin-bootstrapped
You can also install django using pip:
$ pip install django
If you then do this:
$ pip freeze > requirements.txt
you'll end up with a requirements.txt file that lists any packages you've installed with pip, and which version of each. Include that file with your project when you share it with others (on GitHub or wherever). Those people would then do this, to install the same packages:
$ pip install -r requirements.txt
It would also be worth installing and using virtualenv – this lets you have separate environments for your Python work, so that when you pip install something it's only available in that environment. Then you can have different versions of packages (e.g., different versions of Django) in each environment. Virtualenvwrapper also makes some of the common virtualenv tasks a little easier.
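As a sketch of why this matters, two projects can pin different Django versions side by side (the environment names and version numbers here are just examples):

$ virtualenv projectA_env
$ source projectA_env/bin/activate
(projectA_env) $ pip install Django==1.5.1
(projectA_env) $ deactivate
$ virtualenv projectB_env
$ source projectB_env/bin/activate
(projectB_env) $ pip install Django==1.4.5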
It's a lot to get to grips with at first, as I know from experience, but it's worth doing so and will make development easier in the long term.
As to your other question, it looks like django-admin-bootstrapped doesn't have any models, so it doesn't require any updating of the database after installation.