Every time my notebook instance shuts down and restarts, I lose the extensions and have to reinstall them from the terminal.
Is there a way to have JupyterLab extensions installed automatically when my SageMaker notebook instance starts?
The extension I'm trying to install is:
jupyter nbextension enable --py widgetsnbextension
jupyter labextension install @jupyter-widgets/jupyterlab-manager
Any insight would be appreciated
Lifecycle Configuration can be used to install an extension every time your Notebook Instance starts up.
There are sample Lifecycle Configuration scripts for installing a JupyterLab extension [1] as well as an NBExtension [2] that can be used for this purpose.
[1] https://github.com/aws-samples/amazon-sagemaker-notebook-instance-lifecycle-config-samples/blob/master/scripts/install-lab-extension/on-start.sh
[2] https://github.com/aws-samples/amazon-sagemaker-notebook-instance-lifecycle-config-samples/blob/master/scripts/install-nb-extension/on-start.sh
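Condensed, the on-start pattern from those samples looks roughly like the sketch below; it assumes the standard ec2-user/anaconda3 layout of a SageMaker notebook instance, with the Jupyter server running in the JupyterSystemEnv conda environment.
#!/bin/bash
set -e
# Run as ec2-user, the account the notebook server runs under
sudo -u ec2-user -i <<'EOF'
# Install into the environment that serves Jupyter, not a kernel environment
source /home/ec2-user/anaconda3/bin/activate JupyterSystemEnv
jupyter nbextension enable --py widgetsnbextension
jupyter labextension install @jupyter-widgets/jupyterlab-manager
source /home/ec2-user/anaconda3/bin/deactivate
EOF
Attach this as the on-start script of a Lifecycle Configuration and it will run on every restart of the instance.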
Related
On starting the SageMaker Studio server, I can only see a set of predefined kernels when I select a kernel for any notebook.
I create conda environments and persist them between sessions by pointing .condarc to a custom miniconda directory stored on EFS.
I want all notebooks to have access to the environments stored in the custom miniconda directory. I can activate them from the system terminal, but I can't seem to find a way to make the kernels available to notebooks.
I am aware of Lifecycle Configuration, but that seems to work only with notebook instances rather than with SageMaker Studio.
Desired outcomes
Ideally, custom kernels would be persistently available to notebooks; but if that isn't feasible, or requires a custom Docker image, I am happy to run a script manually every time I start the server.
What I have tried so far:
I ran the following, which is a tweaked version of the start.sh meant for Lifecycle Configuration.
#!/bin/bash
set -e
# Run the kernel registration as the Studio user
sudo -u sagemaker-user -i <<'EOF'
unset SUDO_UID
# The custom miniconda lives on the persistent EFS volume
WORKING_DIR=/home/sagemaker-user/.SageMaker/custom-miniconda/
source "$WORKING_DIR/miniconda/bin/activate"
# Register every conda environment as a Jupyter kernel
for env in "$WORKING_DIR"/miniconda/envs/*; do
    BASENAME=$(basename "$env")
    source activate "$BASENAME"
    python -m ipykernel install --user --name "$BASENAME" --display-name "$BASENAME"
done
EOF
That didn't work and I couldn't access the kernels from the notebooks.
If you need a persistent custom kernel in SageMaker Studio, you can create an ECR repository and build a Docker image with your custom environment configuration, then attach that image to your SageMaker Studio notebooks; see the AWS documentation on bringing your own SageMaker image.
SageMaker Studio now also supports lifecycle configurations; see the AWS documentation on Studio lifecycle configurations.
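For the lifecycle-configuration route, a sketch of registering the kernel-registration script from the question as a Studio lifecycle configuration might look like the following; it assumes the AWS CLI is configured, the script is saved locally as on-start.sh, and the configuration name is just an example.
# Base64-encode the script, as the API expects encoded content
LCC_CONTENT=$(openssl base64 -A -in on-start.sh)
# Register it as a Studio lifecycle configuration for the Jupyter server app
aws sagemaker create-studio-lifecycle-config \
    --studio-lifecycle-config-name register-conda-kernels \
    --studio-lifecycle-config-content "$LCC_CONTENT" \
    --studio-lifecycle-config-app-type JupyterServer
The configuration still has to be attached to the domain or user profile before it runs at app startup.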
I am using a SageMaker Notebook for some deep learning tasks. However, stock SageMaker does not provide many nbextension options, such as autocomplete via Hinter, and does not even provide a configurator the way the traditional Notebook does. So I installed jupyter_contrib_nbextensions and tried to enable Hinter with jupyter nbextension enable Hinter/main.
However, the functionality still does not show up.
When I run jupyter nbextension list, the extension appears in the output.
Can someone tell me how to successfully add and use an nbextension on a SageMaker Notebook?
Thanks in advance.
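As a sketch of the usual fix (not verified on SageMaker specifically): the contrib extensions need to be installed into the conda environment that actually serves Jupyter, and Hinterland's require path is hinterland/hinterland rather than Hinter/main.
# Activate the environment the Jupyter server runs in
source /home/ec2-user/anaconda3/bin/activate JupyterSystemEnv
# Install the contrib nbextensions and their javascript/css files
pip install jupyter_contrib_nbextensions
jupyter contrib nbextension install --user
# Enable Hinterland under its actual require path
jupyter nbextension enable hinterland/hinterland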
I want to install additional libraries on an AWS notebook (connected to an EMR cluster); however, I do not see any option to connect from the notebook to the internet. If I do "pip install", it always comes back saying that the network is not reachable. I am not sure which network needs to be changed to allow connections and library installation.
I logged in to the Jupyter terminal and pinged google.com, which just timed out. I do not see any network/security group configuration under the Notebook section for making any relevant changes.
Maybe I need to take some additional steps?
If you use the PySpark kernel, then you can install libraries from a notebook cell using
sc.install_pypi_package("celery")
Or by running
sudo pip-3.6 install boto3
The EMR documentation on installing kernels and libraries has more details.
If you use the Python 3 kernel, then only the preinstalled packages are available, and there is no direct way to install extra libraries except uploading the Python package to the notebook and then using the JupyterLab terminal to run
pip install package.tar.gz
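When the notebook itself has no internet access, one workaround along those lines (package and path names here are just examples) is to download the package on a machine that does have access, upload the files through the Jupyter file browser, and install from the terminal:
# On a machine with internet access: fetch the package and its dependencies
pip download boto3 -d ./pkgs
# On the notebook, after uploading the files, in the JupyterLab terminal:
pip install --user ./pkgs/*.whl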
I started using VM instances on Google Cloud Platform to train deep learning models. On a Linux machine, what is the best way to run the cells of a Jupyter notebook without opening a browser, just with a command in the terminal? I also want to see all the output in the terminal.
Yes, this is possible, and there are different ways of doing it.
One way is to use runipy, which will run all cells in a notebook.
The source code is in the runipy repository.
You can also save the output as an HTML report or as a new notebook.
You can install runipy using pip:
$ pip3 install runipy
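Once installed, typical invocations (notebook names here are examples) look like this:
# Run every cell, streaming progress to the terminal
runipy MyNotebook.ipynb
# Save the executed result as a new notebook, or render an HTML report
runipy MyNotebook.ipynb OutputNotebook.ipynb
runipy MyNotebook.ipynb --html report.html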
Another method is to use the nbconvert module, which can execute a notebook either from the command line or programmatically from a Python interactive shell.
See the nbconvert documentation: Executing notebooks from the command line.
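A minimal command-line sketch (the notebook name is an example):
# Execute all cells and write the result to a new notebook file
jupyter nbconvert --to notebook --execute MyNotebook.ipynb --output executed.ipynb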
I want to develop using SageMaker notebooks, but I want to use TF 2.0.
Is there a way to do it?
That's a great question. To enable TF 2.0 on SageMaker notebooks, go to the Conda tab (in Jupyter, not JupyterLab) and select the tensorflow_p36 environment. On the bottom left, you can search for "tensorflow" in the available Conda packages; there will be a TF 2.0 option. Check that box, then click the right arrow, which will install TF 2.0 into your tensorflow_p36 environment.
Restart any existing notebooks that are open.
Note: This works only with Jupyter and not JupyterLab.
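The same result can also be reached from a terminal (a sketch of the usual conda/pip route rather than the GUI flow above; the version pin is an example):
# Activate the prebuilt TensorFlow environment and upgrade it to TF 2.0
source activate tensorflow_p36
pip install --upgrade tensorflow==2.0.0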
Alex Chung
Senior Product Manager - AWS