pipenv updates all dependencies bringing breaking changes - django

I'm having problems with an app that uses Django. Everything is in a Docker container, and there is a Pipfile and a Pipfile.lock. So far, so good.
The problem is when I want to install a new dependency. I open the Docker container's shell and install the dependency with pipenv install <package-name>.
After installing the package, pipenv runs a command to update the Pipfile.lock file, and in doing so it updates all packages to their latest versions, bringing a lot of breaking changes with them.
I don't understand why this is happening. I have all packages listed in my Pipfile with ~=, which is supposed to prevent updates to versions that can break your app.
I'll give you an example. I have this dependency in my Pipfile: dj-stripe = "~=2.4". But in the Pipfile.lock file, after pipenv runs the lock command, that dependency is updated to its latest version (2.5.1).
What am I doing wrong?

Are you sure you're installing it within Docker? A common cause of Pipfile.lock conflicts is installing a package locally instead of within Docker; when the local environment then syncs with Docker, it overrides your Pipfile.lock.
Assuming you're using docker-compose, this is how I'm installing my packages:
docker-compose exec web pipenv install <package-name>
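If in doubt, it can also help to check where the package actually ended up and what the lock update changed before committing (a quick sketch; web is the service name from the command above):
docker-compose exec web pipenv graph   # confirm the package is installed inside the container
git diff Pipfile.lock                  # review exactly what the lock update changed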

I discovered what my problem was.
I had been listing the dependencies like this: ~=2.4. I thought that meant not to update to 2.5 or greater, but that's not true; it only tells pipenv not to update to 3 or greater.
In order to stay on the 2.4 series, I must specify the last version number, for example ~=2.4.0.
That way, I'm telling pipenv not to update past 2.4.x.
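For reference, the two specifiers behave roughly like this (dj-stripe is just the package from the question; the comments paraphrase the PEP 440 compatible-release rules):
# ~=2.4    means >=2.4, <3.0      -- so 2.5.1 is still allowed
# ~=2.4.0  means >=2.4.0, <2.5.0  -- stays on the 2.4.x series
pipenv install "dj-stripe~=2.4.0"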

Related

Getting the following error: Could not find a version that satisfies the requirement command-not-found==0.3

I am deploying a Django app using Heroku.
When I run
git push heroku master
in my terminal I get the following error:
Could not find a version that satisfies the requirement command-not-found==0.3
When I run
sudo apt-get install command-not-found
I find that command-not-found is version 20.04.2. However, pip freeze tells me command-not-found is version 0.3.
command-not-found doesn't seem to exist on PyPI, but it is a package in Ubuntu and Debian repositories. It doesn't look like anything that your application should depend on, and it certainly doesn't belong on Heroku.
I suspect
you're trying to create your dependencies file after the fact, by simply doing pip freeze > requirements.txt, and
that you're either not working in a virtual environment or you created your virtual environment with system packages.
This is an antipattern that will cause several packages that your application doesn't actually need to be included in your requirements.txt. In this case it is even including Python packages that come from system packages and aren't meant to be installed from PyPI. Your requirements.txt should contain only your actual dependencies.
Instead of creating it with pip freeze after the fact, add dependencies to that file as you adopt them, and install them into your virtual environment with the same pip install -r requirements.txt command that you'll use in production. I also very strongly urge you to use a virtual environment.
In this case, I suggest you edit your requirements.txt and remove anything you don't actually need, commit, and redeploy.
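A rough sketch of that workflow (the package names in the comment are only illustrative; list whatever your app actually imports):
python3 -m venv venv                # a fresh environment, without system packages
. venv/bin/activate
# edit requirements.txt by hand so it lists only real dependencies (e.g. Django, gunicorn)
pip install -r requirements.txt     # the same command production will run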

Pipfile.lock (d67565) out of date, updating to (4f9dd2)

Can someone explain why I get this message, Pipfile.lock (d67565) out of date, updating to (4f9dd2), when I try to install packages using pipenv, even though I have already used pipenv install?
I noticed that it happens for the following reasons:
when you install a new package from the terminal using pipenv,
when you run the pipenv shell command again after closing your project,
when you make some changes to your Pipfile,
and lastly when you deploy your project to something like Heroku.
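In other words, the message usually just means pipenv noticed the Pipfile changed and re-locked it. If you want to control that explicitly, something like this works (both are standard pipenv commands, unrelated to the specific hashes in the message):
pipenv lock               # regenerate Pipfile.lock from the current Pipfile
pipenv install --deploy   # fail instead of silently re-locking if Pipfile.lock is out of date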

Keeping Pipfile Updated

I just started a new Django project and use PostgreSQL as my database, so I installed psycopg2 to make it work properly. When I first deployed the project, the app did not work because psycopg2 was not installed on the production server. As I quickly realized, this was because psycopg2 was missing from my Pipfile.
So my question is:
Do I have to update the Pipfile manually every time I install something for my project? I thought that the Pipfile would take care of that automatically every time I install something.
Isn't there anything similar to pip freeze > requirements.txt where I can do the update with one short command?
Do I have to update the Pipfile manually every time I install something for my project? I thought that the Pipfile would take care of that automatically every time I install something.
requirements.txt is just a file. There is no logic that updates it automatically (unless of course you have an IDE that does that). It is not per se the file the package manager uses: you can use any file, and you can use multiple files (for example, people sometimes make a requirements_test.txt file that contains extra packages that should be installed when testing the software).
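For example, pip accepts more than one requirements file, so the extra test file mentioned above can be installed alongside the main one:
pip install -r requirements.txt         # regular dependencies
pip install -r requirements_test.txt    # extra packages needed only for testing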
You do not necessarily need to update the requirements.txt file each time you install something; as long as the file is correct when you deploy the software (on another machine), it is fine.
You can, however, automate this to some extent. If you use git for version control, for example, you can add a pre-commit hook that runs each time you commit changes, for example by creating an executable file at .git/hooks/pre-commit in the repository. It might look like:
#!/bin/bash
# refresh requirements.txt from the project's virtualenv before every commit
. env/bin/activate
pip freeze > requirements.txt
git add requirements.txt   # stage the refreshed file so it is included in the commit
Each time you make a commit, requirements.txt will thus be "harmonized" with the packages installed in the virtual environment.
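One detail the hook sketch above assumes: git only runs the hook if the file is executable, so you may need to do
chmod +x .git/hooks/pre-commit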

confusion in deploying modules with Django

Good day.
I'm a newbie to Django and I have a slight confusion:
When deploying my Django app, do I need to deploy it with all the Python 'come-with' modules, or do the hosts already have them installed?
Also, I installed PIL for image manipulation. Would they also have it installed, or do I have to find a way to install it on their servers? Thanks in advance.
do I need to deploy it with all the Python 'come-with' modules
Never do that. It might conflict with the dependencies on the server. Instead, issue the following command to create a dependency file (requirements.txt).
pip freeze > requirements.txt (issue this command where manage.py is located)
On the server, create a new virtual environment. Now copy the Django project to the server (you can do this using git clone or just plain old FileZilla). Activate the virtual environment, then change your current working directory to where manage.py is located. To install all the dependencies, issue the following command:
pip install -r requirements.txt
This will install the required dependencies on the server.
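Put together, the server-side steps look roughly like this (the environment name env and the project path are just placeholders):
virtualenv env                      # or: python3 -m venv env
. env/bin/activate
cd /path/to/project                 # the directory that contains manage.py
pip install -r requirements.txt     # installs Django, PIL/Pillow, and everything else listed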

how to bring a Django project out of a virtualenv

I have a Django project in a virtualenv and now I am publishing it to a server, but the problem is that I cannot move the project out of the virtualenv. When I do, the related packages inside site-packages can't be read and errors occur. How can I move my project out of the virtualenv without any issues?
Create a new virtualenv on the server. It's easy:
Step 1 Get the list of modules in the current virtualenv
source /path/to/current/bin/activate
pip freeze > /tmp/requirements.txt
Step 2 Create a new virtualenv. Log in to your new server and copy the requirements file there. Then either change into a suitable directory before executing the virtualenv command, or give a full path.
deactivate
virtualenv -p python envname
Step 3 Install modules
source envname/bin/activate
pip install -r /tmp/requirements.txt
That's it.
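As an optional sanity check (assuming bash), you can confirm the new environment matches the list you froze from the old one:
diff <(pip freeze) /tmp/requirements.txt   # run with envname activated; no output means they match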
As @bruno has pointed out, you really should be using a virtualenv on the server. And you should be using one on your local dev machine as well. Then you can be really sure that the code will run at both ends without any surprises.