Some Confusion about easy_install without Root Access - python-2.7

Preface
I am so new to ssh/unix protocols that I hope I don't offend anybody.
Context
I am using the cores at my university and do not have root access. Thus, when I install python modules, I resort to the answers in these two related Stack Overflow posts:
1) How to install python modules without root access?
2) How to install python packages without root privileges?
In the second post, Col Panic highly recommends getting pip or easy_install on the cores, and if they are not already there, "you should politely ask the admins to add it, explaining the benefit to them (they won't be bothered anymore by requests for individual packages)."
Following that piece of advice, I requested that the admins put easy_install on all the cores. They did, and after some proverbial futzing around with export, PATH, and PYTHONPATH, I was able to get numpy and scipy onto the cores and import them into the iPython environment.
Unfortunately, there were some problems with matplotlib related to this question: ImportError: No module named backend_tkagg
I thought I could just sidestep this SUSE-related problem by pickling everything and then plotting it on my laptop.
My Problem
I really do need NetworkX. I wrote down notes on all the small intricacies I used to install the other packages on my last go, but those notes failed me this time around. Maybe I am forgetting something that I did last time?
nemo01.65$ easy_install --prefix=/u/walnut/h1/grad/cmarshak/xdrive/xpylocal networkx
TEST FAILED: /u/walnut/h1/grad/cmarshak/xdrive/xpylocal/lib/python3.3/site-packages does
NOT support .pth files
error: bad install directory or PYTHONPATH
You are attempting to install a package to a directory that is not
on PYTHONPATH and which Python does not read ".pth" files from. The
installation directory you specified (via --install-dir, --prefix, or
the distutils default setting) was:
/u/walnut/h1/grad/cmarshak/xdrive/xpylocal/lib/python3.3/site-packages
and your PYTHONPATH environment variable currently contains:
'/u/walnut/h1/grad/cmarshak/xdrive/xpylocal/lib/python2.7/site-packages'
Here are some of your options for correcting the problem:
* You can choose a different installation directory, i.e., one that is
on PYTHONPATH or supports .pth files
* You can add the installation directory to the PYTHONPATH environment
variable. (It must then also be on PYTHONPATH whenever you run
Python and want to use the package(s) you are installing.)
* You can set up the installation directory to support ".pth" files by
using one of the approaches described here:
https://pythonhosted.org/setuptools/easy_install.html#custom-installation-locations
Please make the appropriate changes for your system and try again.
My Attempts to Fix This
I really do need networkx; otherwise I have to adjust a bunch of the code that I want to put on the clusters.
1) I typed in:
export PYTHONPATH=/u/walnut/h1/grad/cmarshak/xdrive/xpylocal/lib/python3.3/site-packages
into the bash environment. No luck...
2) I asked another grad for some help. He suggested that I install pip via easy_install (which I did) and then use:
pip install --user networkx
When I type in:
find ./local/lib/python2.7/site-packages/ | grep net
I get a ton of files that are all from the networkx library. Unfortunately, there are still some problems with dependencies.
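Looking at the error message again, it complains about a python3.3 site-packages directory while my PYTHONPATH names a python2.7 one, so I suspect the easy_install on my PATH is bound to a different interpreter than the python I actually run. These are the sanity checks I know how to run (assuming easy_install is a script whose first line names its interpreter):
# which easy_install am I actually running?
which easy_install
# its first line (shebang) names the interpreter it installs for
head -1 "$(which easy_install)"
# and which python do I run, searching which directories?
which python
python --version
python -c "import sys; print('\n'.join(sys.path))"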
THANK YOU IN ADVANCE FOR YOUR HELP. Really enjoy learning new things from your answers.

It looks like there are multiple versions of pip floating around (cf. pip: dealing with multiple Python versions?). Try installing pip using a specific version of easy_install. For example, this gave me a pip2.7:
walnut.39$ easy_install-2.7 -U --user pip
Searching for pip
Reading https://pypi.python.org/simple/pip/
Best match: pip 1.5.6
Processing pip-1.5.6-py2.7.egg
pip 1.5.6 is already the active version in easy-install.pth
Installing pip script to /u/walnut/h1/grad/rcompton/.local/bin
Installing pip2.7 script to /u/walnut/h1/grad/rcompton/.local/bin
Installing pip2 script to /u/walnut/h1/grad/rcompton/.local/bin
Using /net/walnut/h1/grad/rcompton/.local/lib/python2.7/site-packages/pip-1.5.6-py2.7.egg
Processing dependencies for pip
Finished processing dependencies for pip
walnut.40$
Then use pip2.7
walnut.40$ pip2.7 install --user networkx
Also, for non-root package installations, I've got the following lines in my .bashrc:
export PYTHONPATH=$PYTHONPATH:$HOME/.local/lib/python2.7/site-packages
export PATH=$PATH:~/.local/bin
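After sourcing your .bashrc (or opening a new shell), a quick way to confirm that the --user install is visible to the right interpreter (assuming networkx installed cleanly):
# re-read the updated environment, then try the import
source ~/.bashrc
python2.7 -c "import networkx; print(networkx.__version__)"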

Related

Getting the following error: Could not find a version that satisfies the requirement command-not-found==0.3

I am deploying a Django app using Heroku.
When I run
git push heroku master
in my terminal I get the following error:
Could not find a version that satisfies the requirement command-not-found==0.3
When I run
sudo apt-get install command-not-found
I find that command-not-found is version 20.04.2. However, pip freeze tells me command-not-found is version 0.3.
command-not-found doesn't seem to exist on PyPI, but it is a package in Ubuntu and Debian repositories. It doesn't look like anything that your application should depend on, and it certainly doesn't belong on Heroku.
I suspect that:
1) you're trying to create your dependencies file after the fact, by simply doing pip freeze > requirements.txt, and
2) you're either not working in a virtual environment or you created your virtual environment with system packages.
This is an antipattern that will cause several packages that your application doesn't actually need to be included in your requirements.txt. In this case it is even including Python packages that come from system packages and aren't meant to be installed from PyPI. Your requirements.txt should contain only your actual dependencies.
Instead of creating it with pip freeze after the fact, add dependencies to that file up front, and install them into your virtual environment with the same pip install -r requirements.txt command that you'll use in production. I also very strongly urge you to use a virtual environment.
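A minimal sketch of that workflow, assuming Python 3 with the built-in venv module (the names here are placeholders):
# create and activate an isolated environment; no system packages leak in
python3 -m venv venv
source venv/bin/activate
# declare only your real dependencies in requirements.txt (e.g. a line like Django>=3.2),
# then install exactly what the file lists, same as in production
pip install -r requirements.txt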
In this case, I suggest you edit your requirements.txt and remove anything you don't actually need, commit, and redeploy.

Will I still able to install dependency package via Pip after Python 2 End of Life in 2020?

I'm using Python 2.7 and pip to install all the dependency packages listed in my requirements.txt file for running my project. Since Python 2 reaches end of life in 2020, I am concerned that I will no longer be able to install my dependencies with pip install -r requirements.txt.
One way I can think of is to download all the dependencies from requirements.txt and install them locally. Is that possible?
I have read a similar question, Will PIP work for python 2.7 after its End of Life on 1st Jan 2020, but it is still not clear to me whether or not pip will keep working after the 2020 EOL of Python 2.
How can I confirm that? What can I do to keep my dependencies installable via pip after the 2020 end of life of Python 2? Thanks
As seen in the question you linked, the pip maintainers are willing to support Python 2.7 for as long as reasonably feasible. Now you need to consider the other side: the index.
The default index for pip is PyPI, and I can't imagine that they would decide to delete existing artifacts or stop serving them just because the corresponding interpreter reached its end of life. I am pretty sure they already host and serve artifacts aimed at Python interpreters that have reached their end of life. See, for example, the artifacts for lxml version 3.4.4, which clearly show that it is still serving files aimed at Python 2.6 and 3.4, both of which have already reached EOL.
One way I can think of is to download all the dependencies from requirements.txt and install them locally. Is that possible?
This is definitely feasible as well. You could even go a step further and host your own index. See the following links for ideas:
https://packaging.python.org/guides/index-mirrors-and-caches/
https://packaging.python.org/guides/hosting-your-own-index/
https://pypi.org/project/devpi/
https://pypi.org/project/pypiserver/
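If you just want local copies rather than a full index, pip can also fetch everything a requirements file needs into a directory and install from it offline later. A sketch, assuming a pip recent enough to have the download command (pip 8+):
# fetch all dependencies (and their dependencies) into ./packages
pip download -r requirements.txt -d ./packages
# later, install using only that directory, never touching the network
pip install --no-index --find-links=./packages -r requirements.txt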

ModuleNotFoundError: No module named 'flask_openid' (for Python3.6)

I was trying to run a Flask project using Python 3.6.
I encountered an error:
...
from flask_openid import OpenID
ModuleNotFoundError: No module named 'flask_openid'
Flask-OpenID is available in my Python v3.5 dist-packages.
(When I run:
"sudo pip3 install Flask-OpenID", it shows
"Requirement already satisfied (use --upgrade to upgrade): Flask-OpenID in /usr/local/lib/python3.5/dist-packages" )
What should I do to install openid for Python 3.6?
The documentation states that you should import OpenID as follows:
from flask.ext.openid import OpenID
The package itself is installed correctly (in your Python3.5 environment), as shown by pip when you try to install it again:
Requirement already satisfied (use --upgrade to upgrade): Flask-OpenID in /usr/local/lib/python3.5/dist-packages
However, as you state in your question:
I was trying to run a Flask project using Python 3.6
You might want to make sure your python3 and pip3 are actually pointing to where you want them to, e.g. on your terminal:
$ ls -l $(which pip3)
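pip can also report which interpreter it is bound to, which makes a mismatch easy to spot: pip3 --version prints its own version along with the site-packages path and the Python version it runs under, which you can compare against python3 --version:
$ pip3 --version
$ python3 --version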
Or even better, you should really look into creating virtualenvs for your projects, it helps a lot avoiding these kinds of problems in the first place:
create a new Python 3.6 virtualenv
activate your new virtualenv
install your requirements with pip inside the virtualenv
Then run your script in this virtualenv, and you'll be sure you are using exactly the Python you want, and your dependencies are where you expect them to be (and only there, not somewhere else messing up other projects).
Now this might look like a lot of effort, but it takes no more than a couple minutes the first time you do it, will quickly become second nature, and save you a ton of headache down the road.
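Concretely, those steps might look like this (a sketch; the environment name is arbitrary, and run.py stands in for your project's entry point):
$ python3.6 -m venv myenv
$ source myenv/bin/activate
(myenv) $ pip install Flask-OpenID
(myenv) $ python run.py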
For me,
python3.6 -m pip install flask_openid
solved the issue.
The above command will install openid for python3.6.
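To confirm the module now resolves for that specific interpreter, a quick check (assuming python3.6 is on your PATH):
python3.6 -c "import flask_openid; print(flask_openid.__file__)"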

Is it possible to install a django package without pip?

I am trying to install django-dash to run one of the dashboard examples and see what it's like.
I am on Windows running Python 2.7 and Django 1.6.5. I know the usual approach is to download pip and then install the package using pip. However, I am on a work computer with no administrative rights, so I can't access my Internet Options settings to find my proxy URL to follow the instructions below:
Proxy problems
If you work in an office, you might be behind an HTTP proxy. If so, set the environment variables http_proxy and https_proxy. Most Python applications (and other free software) respect these. Example syntax:
http://proxy_url:port
http://username:password@proxy_url:port
I had the same issue when trying to install Django but was able to get it to work by moving the django directory under Python27/Lib/site-packages. Is there something similar I can do with django-dash?
I also tried downloading the sources and running python setup.py install. I received the following error:
File "setup.py", line 3, in <module> from setuptools import setup, find_packages ImportError: No module named setuptools
Link to django-dash: http://django-dash.readthedocs.org/en/latest/
Yes, you can probably get the sources from the Python Package Index.
Once you have them, uncompress the files and install them manually (this will depend on your OS).
On Linux systems:
python setup.py build
python setup.py install
Here's the full reference
EDIT: Note that when manually installing those packages, you must also install any missing dependencies, e.g. setuptools in your case.
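If the lack of administrative rights is also what blocks the install step, distutils accepts a --user flag (available since Python 2.6) that installs into your per-user site-packages instead of the system-wide one, and the same applies to any dependency you unpack by hand:
# no admin rights needed; installs under %APPDATA%\Python on Windows, ~/.local on Linux
python setup.py install --user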

How can I uninstall Django apps?

So I was installing an app in order to bootstrap my Django admin interface, and I thought this app was going to be project-specific, but it appears to be installed at the global Django level.
http://riccardo.forina.me/bootstrap-your-django-admin-in-3-minutes/
https://github.com/riccardo-forina/django-admin-bootstrapped
My question is how can I uninstall it if I need to do so at a later date? I wanted my project to be as independent as possible. I was also wondering if there was a way of doing the installation within the project so that people that download my repository will automatically get it.
Also, a minor question: after adding django_admin_bootstrapped into the INSTALLED_APPS list before django.contrib.admin, I was not required to run a syncdb command like we usually are when installing models. I guess this application doesn't create tables in my database, so that is probably why, but I just wanted to know your thoughts.
I know it is a lot to answer but any clarification is appreciated. Thanks.
If you have installed the django app using pip do:
pip uninstall app_name
Otherwise, you have to go to your site-packages directory and remove it manually.
After that,
Remove the app from INSTALLED_APPS. django-admin-boostrapped might have overridden your templates. After you are done, run ./manage.py collectstatic and ./manage.py syncdb
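Putting those steps together (a sketch; the package and app names are the ones from the question):
$ pip uninstall django-admin-bootstrapped
# then remove 'django_admin_bootstrapped' from INSTALLED_APPS in settings.py, and:
$ ./manage.py collectstatic
$ ./manage.py syncdb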
If you're writing something that you want other people to use, and it relies on other packages (whether Django apps or more generic Python packages) it's standard to use pip. This makes it easy to install and uninstall packages, and specific versions of those packages. You can then create a requirements.txt file, which you include with your project. This lets other people know what packages are required, and they can easily install them using pip.
So, first off, install pip.
Then you would install django-admin-bootstrapped by doing:
$ pip install django-admin-bootstrapped
You can also install django using pip:
$ pip install django
If you then do this:
$ pip freeze > requirements.txt
you'll end up with a requirements.txt file that lists any packages you've installed with pip, and which version of each. Include that file with your project when you share it with others (on GitHub or wherever). Those people would then do this, to install the same packages:
$ pip install -r requirements.txt
It would also be worth installing and using virtualenv – this lets you have separate environments for your python work, so that when you pip install something it's only available in that environment. Then you can have different versions of packages (e.g., different versions of Django) in each environment. Virtualenvwrapper also makes some of the common virtualenv tasks a little easier.
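With virtualenvwrapper set up, the day-to-day commands are short (a sketch using its standard commands; the project name is a placeholder):
$ mkvirtualenv myproject        # create and activate a new environment
(myproject) $ pip install -r requirements.txt
(myproject) $ deactivate        # leave it
$ workon myproject              # come back to it later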
It's a lot to get to grips with at first, as I know from experience, but it's worth doing so and will make development easier in the long term.
As to your other question, it looks like django-admin-bootstrapped doesn't have any models, so it doesn't require any updating of the database after installation.