How to install Graphlab Create on Ubuntu? - python-2.7

When I try to install GraphLab Create, pip just keeps retrying connections. I have a healthy broadband Wi-Fi connection at home (no proxy).
Error: (graphlab)ankit@ankit21:~$ pip install graphlab-create==0.9.1
Collecting graphlab-create==0.9.1
Retrying (Retry(total=4, connect=None, read=None, redirect=None)) after connection broken by 'ConnectTimeoutError(<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7fd6a16ae850>, 'Connection to 192.168.16.253 timed out. (connect timeout=15)')': /simple/graphlab-create/
The proxy 192.168.16.253 is my college's internet proxy, but I have changed the network settings to use automatic proxy detection. Other installs in the terminal work fine; only this one has the problem.
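For what it's worth, pip also honours the http_proxy/https_proxy environment variables, so a stale export can keep pointing it at the college proxy even after the desktop network setting is changed. A minimal check and workaround, with a placeholder proxy URL, might look like:
env | grep -i proxy                                   # is a stale proxy still exported?
unset http_proxy https_proxy HTTP_PROXY HTTPS_PROXY   # clear it for this shell
pip install graphlab-create==0.9.1                    # retry without the proxy
# or, if a proxy really is required, pass it explicitly (placeholder URL):
# pip install --proxy http://user:password@proxyhost:3128 graphlab-create==0.9.1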

You need your specific academic license key, but trying to install the older version automatically upgrades your install to 1.7.1. Even if you copy the command from the installation instructions, remove --upgrade, and change the version to 0.9.1, it still will not work.
pip install --no-cache-dir https://get.dato.com/GraphLab-Create/0.9.1/your_key/GraphLab-Create-License.tar.gz
You can obtain your key for the latest version when you register with GraphLab on the Dato website. The academic license is free for each user for one year.

On the official site (https://turi.com/download/install-graphlab-create.html?email=**YOU**%40gmail.com&key=**7C68-...-D3D7**) we have:
Registered email address: YOUR_ADDRESS
Product key: YOUR_PRODUCT_KEY
Installing with dependencies:
Install Anaconda
bash /path/to/downloaded/file/Anaconda2-4.0.0-Linux-x86_64.sh
Create conda environment
conda create -n gl-env python=2.7 anaconda=4.0.0
source activate gl-env
Ensure pip version >= 7
conda update pip
Install GraphLab Create
pip install --upgrade --no-cache-dir https://get.graphlab.com/GraphLab-Create/2.1/YOUR_ADDRESS/YOUR_PRODUCT_KEY/GraphLab-Create-License.tar.gz
Ensure installation of IPython and IPython Notebook
conda install ipython-notebook
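Once those steps finish, a quick smoke test can be run inside the new environment; this only assumes the package is importable as graphlab, with any license prompt happening on first import:
source activate gl-env
# if this import returns without an error, the install (and license) is in place
python -c "import graphlab"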


How to install pip on Python 2.7 in 2021

I have legacy production servers that are still running Python 2.7.6. We have a local environment built from the Docker image for Ubuntu 14.04, intended to replicate that environment (things still work there once everything is installed). The Packer build script that creates this environment recently stopped working, apparently due to PyPI dropping non-SNI support.
I tried using get-pip.py from the docs to download pip:
wget -c https://bootstrap.pypa.io/pip/2.7/get-pip.py
python2 get-pip.py
This gives me the following warning:
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. pip 21.0 will drop support for Python 2.7 in January 2021. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support pip 21.0 will remove support for this functionality.
/tmp/tmpBb3LJu/pip.zip/pip/_vendor/urllib3/util/ssl_.py:424: SNIMissingWarning: An HTTPS request has been made, but the SNI (Server Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
/tmp/tmpBb3LJu/pip.zip/pip/_vendor/urllib3/util/ssl_.py:164: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
ERROR: Could not find a version that satisfies the requirement pip<21.0 (from versions: none)
ERROR: No matching distribution found for pip<21.0
The proposed solution for that is to use pip to upgrade urllib3
https://serverfault.com/questions/866062/easy-install-and-pip-fail-with-ssl-warnings
I don't have pip, so I installed a legacy version using
apt-get install python-pip
This installs pip 1.5.4
When I try to pip install "urllib3[secure]" I get the following:
Requirement already satisfied (use --upgrade to upgrade): urllib3[secure] in /usr/lib/python2.7/dist-packages
Installing extra requirements: 'secure'
Cleaning up...
If I try pip install "urllib3[secure]" --upgrade or pip install --index-url https://pypi.python.org/simple/ --upgrade pip I get:
Downloading/unpacking urllib3[secure]
Cannot fetch index base URL https://pypi.python.org/simple/
Could not find any downloads that satisfy the requirement urllib3[secure] in /usr/lib/python2.7/dist-packages
Cleaning up...
No distributions at all found for urllib3[secure] in /usr/lib/python2.7/dist-packages
Storing debug log for failure in /root/.pip/pip.log
(for the pip upgrade command, the messages name pip rather than urllib3[secure])
When I try to use pip 1.5 to install uWSGI
pip install uWSGI
I get the following:
Downloading/unpacking uWSGI
Cannot fetch index base URL https://pypi.python.org/simple/
Could not find any downloads that satisfy the requirement uWSGI
Cleaning up...
No distributions at all found for uWSGI
Storing debug log for failure in /root/.pip/pip.log
Upgrading pip doesn't work here either:
Downloading/unpacking uWSGI==2.0.18 (from -r /root/requirements.txt (line 1))
Cannot fetch index base URL https://pypi.python.org/simple/
Could not find any downloads that satisfy the requirement uWSGI==2.0.18 (from -r /root/requirements.txt (line 1))
Cleaning up...
No distributions at all found for uWSGI==2.0.18 (from -r /root/requirements.txt (line 1))
Storing debug log for failure in /root/.pip/pip.log
Reinstalling pip doesn't work:
python -m pip install -U --force-reinstall pip
Gives me:
Downloading/unpacking pip
Cannot fetch index base URL https://pypi.python.org/simple/
Could not find any downloads that satisfy the requirement pip
Cleaning up...
No distributions at all found for pip
Storing debug log for failure in /root/.pip/pip.log
If I open /root/.pip/pip.log I see the following:
Downloading/unpacking pip
Getting page https://pypi.python.org/simple/pip/
Could not fetch URL https://pypi.python.org/simple/pip/: 403 Client Error: [[[!!! BREAKING CHANGE !!!]]] Support for clients that do not support Server Name Indication is temporarily disabled and will be permanently deprecated soon. See https://status.python.org/incidents/hzmjhqsdjqgb and https://github.com/pypa/pypi-support/issues/978 [[[!!! END BREAKING CHANGE !!!]]]
The link says that SNI support was dropped:
For users of Python 2.7.{0...8}
Upgrading to the last Python 2.7 release is an option.
However, note that Python 2.7 series itself is now End of Life and support in pip was dropped with version 21.0.
For users of Python 2.6.x and lower:
Neither the Python core developers, or pip maintainers support Python 2.6 and below.
If someone is aware of a work around for this issue (SNI support specifically) they are welcome to share it here for others.
There is no recommended solution from the PyPI team.
How can I get a local environment set up for new developers to work on our legacy application? I've created a new Python 3 dev server and local environment but it will be some time before I can roll out the staging and live environments, get everything moved over, and test it.
As the message says, PyPI has discontinued support for Python < 2.7.9 as of May 6th, 2021. If you're running a version < 2.7.9 and you cannot upgrade to a newer version of Python, then your only option is to manually download the wheels from PyPI.
These are the modifications I needed to make to my build script to make it work:
I needed to install software-properties-common and gcc
apt-get install -y software-properties-common gcc
Then I downloaded setuptools (https://pypi.org/project/setuptools/44.1.1/#files), unzipped it, and installed it:
python ./setuptools-44.1.1/setup.py install
Next, I downloaded pip and added it to a folder called wheels. Then I could run pip directly from the .whl file to install pip itself:
python ./wheels/pip-20.3.4-py2.py3-none-any.whl/pip install --no-index --find-links ./wheels/ pip --ignore-installed
It was suggested to build a Docker container using Ubuntu 16.04 with Python 2.7.17 and use that to download the packages.
pip download -r requirements.txt
But the versions of the packages were wrong, so I ended up going through requirements.txt and downloading each package manually from PyPI, adding it to the wheels folder. A running instance is useful so you can run pip freeze, or look at a requirements.txt file, to grab the version numbers of all the packages you need.
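If you do have an internet-connected machine with a modern pip, pip download can also be told to fetch wheels for a different interpreter and platform. A sketch, assuming the legacy servers are 64-bit CPython 2.7 on a manylinux1-compatible system:
# run on the machine with internet access; wheels land in ./wheels
pip download -r requirements.txt -d ./wheels \
    --only-binary=:all: \
    --python-version 27 \
    --implementation cp \
    --platform manylinux1_x86_64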
Now that I could use pip, I can install my other packages:
python -m pip install --no-index --find-links ./wheels/ -r /root/requirements.txt
This uncovered some dependencies that I hadn't downloaded packages for yet, so I had to go through, download those, and add them to the wheels folder. A few packages needed different versions than I had originally downloaded, a few relied on pbr, and many more wanted wheel:
pip install --no-index --find-links ./wheels/ pbr==5.5.1 wheel==0.36.2
I also needed to download CMake and add it to the wheels folder.
After that I could install my requirements.txt:
pip install --no-index --find-links ./wheels/ -r /root/requirements.txt --ignore-installed
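As an optional final step, pip can confirm the offline install is internally consistent and record exactly what ended up installed:
pip check                      # reports any broken or missing dependencies
pip freeze > installed.txt     # snapshot of the installed versions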
I may be late to the party, but something similar happened to me while trying to make an HTTPS request with Python 2.7.6 (lack of SNI support). This was causing a lot of issues on a remote web server I work on.
Looking for answers, I tried installing urllib3[secure] and ran into a chicken-and-egg problem, since pip itself was complaining about the lack of SNI support when installing this and other packages.
I found this Stack Overflow answer, which helped me install the required dependencies to make Python 2.7.6 (and pip itself) support SNI, as well as install urllib3[secure].
You need to create a folder containing the required wheels (download them from PyPI, using wget for instance): pip, asn1crypto, enum34, idna, six, ipaddress, pyOpenSSL, cffi, and cryptography wheels, plus pycparser (not a wheel; it will be a tar.gz).
Make sure the wheels you download support Python 2.7 and that you install pip before the rest of them.
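A rough sketch of that sequence, reusing the run-pip-from-its-own-wheel trick shown earlier (the pip wheel name matches the one used above; the other filenames are whatever versions you downloaded):
mkdir sni-wheels && cd sni-wheels
# wget each Python 2.7-compatible wheel from its pypi.org "Download files" page
# (pip, asn1crypto, enum34, idna, six, ipaddress, pyOpenSSL, cffi, cryptography),
# plus the pycparser tar.gz, into this folder
python ./pip-20.3.4-py2.py3-none-any.whl/pip install --no-index --find-links=. pip
pip install --no-index --find-links=. "urllib3[secure]"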
In the original answer, it's stated that you can use python -m OpenSSL.debug to verify everything worked correctly (an ImportError would mean the pyOpenSSL package was not installed). You can also use pip -V to check that the new pip version was installed correctly.
After updating pip and installing these dependencies, I was able to install urllib3[secure] and get SNI support from Python as well as pip.
Good luck!
I am sharing this answer as an update to Jonathan Rys's answer; it contains the steps required as of the date of writing. I tried to keep it concise.
For Ubuntu 20.04, I installed build-essential:
sudo apt-get install build-essential
I installed Python 2 from source: I downloaded the tar bundle, then built and installed it. Note that I removed the python command, to avoid the old ambiguity, so we have python2 and python2.7, and also python3 (for example).
tar xf Python-2.7.18.tgz
cd Python-2.7.18
./configure && make && sudo make install
(cd /usr/local/bin;sudo rm python python-config)
cd ..
Now to get pip installed. Download the setuptools zip archive, then install it:
unzip setuptools-44.1.1.zip
cd setuptools-44.1.1
python2 bootstrap.py
sudo python2 setup.py install
cd ..
Next, I downloaded the pip .tar.gz archive, then unpacked and installed it. Note that I took extra steps to preserve and restore the original Python 3 pip in /usr/local/bin; pip for Python 2 is still available as pip2 and pip2.7 in the same directory.
tar xf pip-20.3.4.tar.gz
cd pip-20.3.4
(cd /usr/local/bin;sudo mv pip pip-save)
sudo python2 setup.py install
(cd /usr/local/bin;sudo mv pip-save pip)
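At this point a quick check that each pip is bound to the right interpreter does no harm:
python2 -m pip --version   # should report pip 20.3.4 on Python 2.7.18
pip2.7 --version           # same pip, via the versioned script
pip --version              # should still be the restored Python 3 pip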
Now that I can use pip, I can install the most important package any Python user should install: ipython. Note that I do a user install for this, and I also keep my "ipython" command pointing at Python 3, with ipython2 to run Python 2:
pip2.7 install ipython
(cd ~/.local/bin;rm ipython;ln -s ipython3 ipython)
and it works!
$ ipython2
Python 2.7.18 (default, Jun 22 2022, 09:38:45)
Type "copyright", "credits" or "license" for more information.
IPython 5.10.0 -- An enhanced Interactive Python.

TightVNC connection from Python 2.7 error -- No connection could be made because the target machine actively refused it

I have installed vncdotool on my Python 2.7 setup, following these steps:
Download and install python-2.7.5.exe from the Python Downloads website
Open up PowerShell, and paste in the following:
Restart your Windows Machine
Upon restart, go to the Twisted downloads page and install the 32-bit Twisted, Twisted-13.1.0.win32-py2.7.exe
Download and install PIL-1.1.7.win32-py2.7.exe, from PIL Downloads.
Download ez_setup.py and get_pip.py and save them to your Python/Scripts folder, C:\Python27\Scripts
Open up PowerShell and type the following:
pip install pip --upgrade
pip install distribute
pip install setuptools --upgrade
pip install Twisted --upgrade
pip install vncdotool   <-- finally, install vncdotool
At a PowerShell prompt:
vncdo.exe --server 192.168.2.2 type "Hello World"
I followed the steps above, and now I get an error like this:
c:\Python27\Scripts>vncdo.exe --server 192.168.2.11 type "hello world"
CRITICAL:root:Connection was refused by other side: 10061: No connection could be made because the target machine actively refused it..
Please help me to solve this problem. Thanks.
Your command is not the problem. The issue is most likely a firewall or antivirus blocking the connection, or no VNC server listening at that address and port, so the target machine answers the connection attempt with a reset (RST) instead of an ACK. Check that the VNC server is actually running on 192.168.2.11 and reachable on its port (5900 by default) before retrying.

How to install tqdm on a linux server without internet access?

I am trying to install the python package tqdm on a linux server.
However, the said server has no internet access. Hence, I am unable to install it using pip. I am also unable to find the tqdm package in Debian's package index.
However, what I am able to do is scp files from my local machine to the server. My local machine has full internet and sudo access.
Any leads please?
Note: I have sudo access on the server.
You could install it with pip. Just use the available command-line options as follows:
pip install --no-index --find-links=/path/to/directory/with/egg tqdm
Documentation:
https://pip.pypa.io/en/stable/reference/pip_wheel/#no-index
https://pip.pypa.io/en/stable/reference/pip_wheel/#find-links
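Put together, the usual offline workflow looks roughly like this (folder names and the server address are placeholders; pip download needs a reasonably recent pip on the local machine):
# on the local machine (with internet): download tqdm and any dependencies
pip download tqdm -d ./tqdm-pkgs
# copy the folder to the server
scp -r ./tqdm-pkgs user@server:/tmp/tqdm-pkgs
# on the server (no internet): install strictly from the copied folder
pip install --no-index --find-links=/tmp/tqdm-pkgs tqdm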

'certificate verify failed' when using PyFCM to send a Push Notification

I'm trying to send push notifications via PyFCM (via Firebase Cloud Messaging).
When I tried to do this initially, I got the SNIMissingWarning telling me that the request to the Firebase server was insecure.
I upgraded packages to handle this, but now I'm stuck with a 'certificate_verify_failed' error.
I went through the PyFCM code and found that it was using the requests module to send a request to the server.
I know that this issue is related to not having the CA certificates for the Firebase server, but have no idea how to get these certificates and setup the requests module to use them.
Can someone help?
Found the problem; sharing it for the benefit of others.
Apart from installing requests[security], I needed to install libssl-dev via:
sudo apt-get install libssl-dev
So the overall setup for this is: first, install the development versions of the ffi and ssl libraries:
sudo apt-get install libffi-dev libssl-dev
Then, install requests[security]:
pip install "requests[security]"
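Before retrying PyFCM, it may be worth confirming the security stack is actually in place; OpenSSL.debug ships with pyOpenSSL, which requests[security] pulls in:
# confirm pyOpenSSL imports and see which OpenSSL the interpreter links against
python -c "import OpenSSL, ssl; print(ssl.OPENSSL_VERSION)"
python -m OpenSSL.debug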

Turn an Anaconda Python installation into a conda environment to be able to switch between different versions?

I need to use both Python 2.7 and 3.5. Ideally, I should use conda to create an alternative environment, but after using
bash Anaconda2-4.2.0-Linux-x86_64.sh
to install Python 2.7 (done within a few minutes of downloading the shell script), running
conda create -n py35 python=3.5 anaconda
always failed due to download timeouts, after more than a dozen minutes of downloading numerous packages. (I'm in China, behind the GFW.)
But I have no problem installing Python 3.5 within a few minutes of downloading the shell script, via
bash Anaconda3-4.2.0-Linux-x86_64.sh
so I wonder if it's possible to manually turn the Python 3.5 installation into a conda-managed environment so that I can switch between them?
Thanks a lot for your help!
Set up a shadowsocks client and proxychains4; this is how I just set up an Ubuntu server.
step 1:
buy or set up a shadowsocks server
step 2:
install the shadowsocks client
pip install shadowsocks
step 3:
install proxychains
sudo apt-get install proxychains
step 4:
set up and run both; see the shadowsocks client tutorial and the proxychains tutorial (a minimal configuration sketch follows after step 5)
step 5:
use proxychains to create the new environment
proxychains conda create -n py35 python=3.5
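A minimal sketch of the step 4 configuration (server address, password, cipher, and the local port 1080 are placeholders to replace with your own values):
# /etc/shadowsocks.json -- shadowsocks client config (placeholder values):
#   { "server": "your.server.ip", "server_port": 8388,
#     "local_address": "127.0.0.1", "local_port": 1080,
#     "password": "your-password", "method": "aes-256-cfb" }
sslocal -c /etc/shadowsocks.json -d start   # start the local SOCKS proxy
# in /etc/proxychains.conf, make the last [ProxyList] entry read:
#   socks5 127.0.0.1 1080
# then run the proxychains conda command above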