Installing kernel_sas for Jupyter on OpenShift

I'm trying to add kernel_sas to my Jupyter Notebook installation, so I've added it to my requirements.txt before building my image:
requirements.txt
kernel_sas==2.1.0
but it seems that the package is unknown in the conda registry.
I've been looking through the anaconda registry via their web interface, and there does seem to be one:
https://anaconda.org/danielfrg/sas_kernel
Is there a way to install it via requirements.txt, or directly from GitHub releases, into my OpenShift container? I've tried with both a dash and an underscore, with the same result.
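For what it's worth, the name on both PyPI and the anaconda.org page linked above is sas_kernel (the two words reversed), which would explain why kernel_sas is unknown under either spelling. A requirements.txt sketch, keeping the version pin quoted above (verify that release actually exists before pinning):

```
# requirements.txt -- the project is published as sas_kernel, not kernel_sas
sas_kernel==2.1.0
```

If the image is conda-based instead, `conda install -c danielfrg sas_kernel` (the channel from the anaconda.org URL above) should resolve it; pip can also install straight from a GitHub release archive URL if you would rather pin to a release tarball.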

Related

Linux python django site-packages not recognized in virtual environment

I have tried to create an environment on my Linux Pop!_OS system using Miniconda. When I activate it, I can install packages using pip, but when I run my Django instance it doesn't find the modules.
If I type which python, it shows the miniconda path correctly. I can look in the site-packages folder and see the packages installed.
I've tried installing django-anymail and corsheaders, and they are both not being found. It does find my locally installed apps.
If I use the command line, open python, and then import, it does not recognize the modules installed in the virtual environment either. I thought it was a problem with conda, so I also created an environment using Python's native method: python3 -m venv
I have the same problem with it finding the packages pip installed into site-packages.
Is there a command I can run to show all available packages?
I hadn't realized I had aliased my python. Now it is working.
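The alias problem above is easy to diagnose; a sketch of the checks, assuming a POSIX shell and a python3 on PATH:

```shell
# `which python` only searches PATH; a shell alias can silently redirect
# `python` somewhere outside the activated environment. `type` reports both.
type python3

# Ask the interpreter that actually runs where it lives:
python3 -c 'import sys; print(sys.executable)'   # should point inside the env
python3 -c 'import sys; print(sys.prefix)'       # the env's root directory

# "Show all available packages" for *this* interpreter:
python3 -m pip list
```

If sys.executable points outside the activated environment, the alias (or PATH order) is the culprit, and `python3 -m pip list` explains why Django can't see the modules: they were installed into a different interpreter's site-packages.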

How to Install Pillow on OpenShift

I'm trying to deploy my django web application on OpenShift. When I try to deploy it I get this error:
I have no idea how I can access the terminal in OpenShift to run pip.
Add pillow==6.2.1 to your requirements.txt and deploy again; OpenShift will read this file and install all the requirements.
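For reference, a minimal requirements.txt sketch (only the pillow line comes from the answer; any other pins are your app's own):

```
# requirements.txt at the repository root -- the OpenShift Python builder
# installs everything listed here during the build
pillow==6.2.1
```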

Moving django project to Github removes its dependency packages

So I'm working on a blog right now with Django and Python. I created a virtual environment and created my project there. I activated my virtual environment every time, so that whenever I do pip install Django or whatever, it installs those packages to my virtual environment. However, when I uploaded my project with the virtual env to GitHub and downloaded it on my laptop, pip list shows only 4 packages (pip, pytz, setuptools, virtualenv). They are all different versions from the original, too. My original virtual env has these packages below:
certifi 2018.11.29
chardet 3.0.4
Django 2.1.5
django-embed-video 1.2.0
idna 2.8
pip 18.1
pytz 2018.7
requests 2.21.0
setuptools 40.6.3
urllib3 1.24.1
wheel 0.32.3
Can anyone explain to me what is going on?
When you set up your Python project with virtualenv, it installs your third-party packages on your local machine. That means whenever you copy or clone your repo from GitHub to another machine, you have to install all the packages again to run it.
To know more about virtualenv, please refer to this awesome tutorial.
If you want to make your Django project machine-independent, then I guess Docker is your best bet.
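The usual workflow, sketched below, is to commit a pinned requirements file and keep the virtualenv directory itself out of the repository (the venv/ folder name is an assumption; use whatever yours is called):

```shell
# On the original machine, with the env activated, at the project root:
python3 -m pip freeze > requirements.txt   # records pins like Django==2.1.5
printf 'venv/\n' >> .gitignore             # the env itself never goes to GitHub

# After cloning on the laptop, recreate the env and reinstall (run there):
#   python3 -m venv venv
#   . venv/bin/activate
#   pip install -r requirements.txt
```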

confusion in deploying modules with django

Good day.
I'm a newbie to Django and I have a slight confusion:
When deploying my Django app, do I need to deploy it with all the Python 'come-with' modules, or do the hosts already have them installed?
Also, I installed PIL for image manipulation. Would they also have it installed, or do I have to find a way to install it on their servers? Thanks in advance.
do I need to deploy it with all the Python 'come-with' modules
Never do that. It might conflict with the dependencies on the server. Instead, issue the following command to create a dependency file (requirements.txt):
pip freeze > requirements.txt (issue this command where manage.py is located)
On the server, create a new virtual environment. Now copy the Django project to the server (you can do this using git clone or just plain old FileZilla). Activate the virtual environment, then change your current working directory to where manage.py is located. To install all the dependencies, issue the following command:
pip install -r requirements.txt
This will install the required dependencies on the server.
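The steps above can be sketched as a short shell session (the directory names are hypothetical):

```shell
# 1. Create a fresh virtual environment on the server:
python3 -m venv deployenv
# 2. Activate it (the activate script is sh-compatible):
. deployenv/bin/activate
# 3. Change to the directory that contains manage.py (path is hypothetical):
#      cd /srv/myproject
# 4. Install the pinned dependencies into the environment:
#      pip install -r requirements.txt
# Sanity check: the env has its own interpreter and it runs.
deployenv/bin/python -c 'import sys; print(sys.prefix)'
```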

offline install for py2neo in ubuntu

I downloaded the .zip from the py2neo GitHub and placed it in the site-packages folder and ran
pip install py2neo
Everything looks like it's in the right place (I compared to windows setup and they both contain the same files in the same places) but when I run a .py I get:
ImportError: No module named batch
It sounds like your paths aren't set up correctly. To install, I would recommend simply running the pip install py2neo line without downloading the zip first, allowing pip to pull py2neo from PyPI. Alternatively, if you are trying to avoid using a network connection from your server, run python setup.py install from within a copy of the GitHub repository.
Note: You will want to check out the latest release branch from the GitHub repository before installing. At the time of writing, this is named release/1.6.4.
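If the server really cannot reach PyPI, pip can still do the install from files you carry over; a sketch (the pkgs/ directory name is an assumption):

```shell
# Unzipping into site-packages by hand skips the package's setup step, which
# is why submodules such as py2neo.batch end up missing. Let pip install it.

# On a machine with internet access, collect the package (and dependencies):
#   pip download py2neo -d pkgs/
# Copy pkgs/ to the offline server, then install without touching the network:
#   pip install --no-index --find-links pkgs/ py2neo
# Or, from a checkout of the release branch, as the answer suggests:
#   python setup.py install

python3 -m pip --version   # sanity check that pip itself is available
```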