I am using a SageMaker Notebook for some deep learning tasks. However, a stock SageMaker notebook does not offer many nbextension options, such as the Hinter autocomplete, and does not even provide the nbextensions configurator that a traditional Notebook install has. So I installed jupyter_contrib_nbextensions and tried to enable Hinter with jupyter nbextension enable Hinter/main.
However, the functionality still does not show up.
When I run jupyter nbextension list, the output looks like this:
Can someone tell me how to successfully add and use nbextension on SageMaker Notebook?
Thanks in advance
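For reference, the usual install-and-enable sequence for jupyter_contrib_nbextensions is sketched below. Note that the extension id is an assumption on my part: in the contrib bundle the autocomplete extension is registered as hinterland, not Hinter/main, so the exact name may need checking against the output of jupyter nbextension list.

```shell
# Sketch of a typical jupyter_contrib_nbextensions setup
# (the hinterland id is the contrib bundle's autocomplete extension;
# exact names can vary by version)
pip install jupyter_contrib_nbextensions
jupyter contrib nbextension install --user
jupyter nbextension enable hinterland/hinterland
jupyter nbextension list   # verify the extension shows as enabled
```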
I am working in JupyterLab within a Managed Notebook instance, accessed through the Vertex AI Workbench, as part of a Google Cloud Project. When the instance is created, a number of JupyterLab extensions are installed by default. In the web GUI, one can click the puzzle-piece icon and enable/disable all extensions with a single button click.

I currently run a post-startup bash script to manage environments and module installations, and I would like to add to this script whatever commands would turn on the existing extensions. My understanding is that I can do this with
# Status of extensions
jupyter labextension list
# Enable/disable some extension
jupyter labextension enable extensionIdentifierHere
However, when I test the enable/disable command in an instance Terminal window, I receive, for example
[Errno 13] Permission denied: '/opt/conda/etc/jupyter/labconfig/page_config.json'
If I try to run this with sudo, I am asked for a password, but I have no idea what that would be, given that I just built the environment and didn't set any password.
Any insights on how to set this up, what the command(s) may be, or how else to approach this, would be appreciated.
Potentially relevant:
Not able to install Jupyterlab extensions on GCP AI Platform Notebooks
Unable to sudo to Deep Learning Image
https://jupyterlab.readthedocs.io/en/stable/user/extensions.html#enabling-and-disabling-extensions
Edit 1:
Adding more detail in response to answers and comments (@gogasca, @kiranmathew). My goal is to use ipyleaflet-based mapping, through the geemap and earthengine-api Python modules, within the notebook.

If I create a Managed Notebook instance (service account, Networks shared with me, Enable terminal, all other defaults), launch JupyterLab, open the Terminal from the Launcher, and then run a bash script that creates a venv virtual environment, exposes a custom kernel, and performs the installations, I can use geemap and ipywidgets to visualize and modify (e.g., widget sliders that change map properties) Google Earth Engine assets in a Notebook.

If I try to replicate this using a Docker image, it seems to break the connection with ipyleaflet, such that when I start the instance and use a Notebook, I have access to the modules (they can be imported) but can't use ipyleaflet to do the visualization. I thought the issue was that I was not properly enabling the extensions, per the "Error displaying widget: model not found" error, addressed in this, this, this, this, etc. -- hence the title of my post. I tried using and modifying @TylerErickson's Dockerfile that modifies a Google deep learning container and should handle all of this (here), but both the original and my modifications break the ipyleaflet connection when booting the Managed Notebook instance from the Docker image.
Google Managed Notebooks do not support third-party JupyterLab extensions. Most of these extensions require a rebuild of the JupyterLab static assets bundle, which requires root access that Managed Notebooks do not provide.
Untangling this limitation would require a significant change to the permission and security model that Managed Notebooks provides. It would also have implications for the supportability of the product itself since a user could effectively break their Managed Notebook by installing something rogue.
I would suggest using User-Managed Notebooks instead.
Every time my notebook shuts down and restarts, I lose the plugins and have to reinstall them from the terminal.
Is there a way to have JupyterLab extensions installed automatically when my SageMaker notebook starts?
The plugin I'm trying to install is:
jupyter nbextension enable --py widgetsnbextension
jupyter labextension install @jupyter-widgets/jupyterlab-manager
Any insight would be appreciated
A Lifecycle Configuration can be used to install an extension every time your Notebook Instance starts up.
There are sample Lifecycle Configuration scripts for installing a JupyterLab extension [1] as well as an NBExtension [2] that can be used for this purpose.
[1] https://github.com/aws-samples/amazon-sagemaker-notebook-instance-lifecycle-config-samples/blob/master/scripts/install-lab-extension/on-start.sh
[2] https://github.com/aws-samples/amazon-sagemaker-notebook-instance-lifecycle-config-samples/blob/master/scripts/install-nb-extension/on-start.sh
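For illustration, an on-start script in the spirit of the linked AWS samples might look like the following. The environment path and the sudo-to-ec2-user pattern mirror those samples, but treat this as a sketch and defer to the linked scripts for the authoritative version.

```shell
#!/bin/bash
# Sketch of a SageMaker Lifecycle Configuration on-start script,
# modeled on the AWS samples linked above
set -e

sudo -u ec2-user -i <<'EOF'
# Install into the JupyterSystemEnv, where Jupyter itself runs
source /home/ec2-user/anaconda3/bin/activate JupyterSystemEnv

jupyter nbextension enable --py widgetsnbextension
jupyter labextension install @jupyter-widgets/jupyterlab-manager

source /home/ec2-user/anaconda3/bin/deactivate
EOF
```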
I started using VM instances on Google Cloud Platform to train deep learning models. On a Linux machine, what is the best way to run the cells of a Jupyter Notebook without opening a browser, just from a command in the terminal? I also want to see all the output in the terminal.
Yes, this is possible, and there are different ways of doing it.
One way is to use runipy, which will run all cells in a notebook.
The source code is available here: runipy.
You can also save the output as an HTML report or as an executed notebook.
You can install runipy using pip:
$ pip3 install runipy
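Typical runipy invocations are sketched below; these follow my reading of the runipy README, so check runipy --help for the exact flags in your installed version.

```shell
# Run every cell, streaming output to the terminal
runipy MyNotebook.ipynb

# Run and save the executed notebook under a new name
runipy MyNotebook.ipynb OutputNotebook.ipynb

# Run and also save an HTML report of the results
runipy MyNotebook.ipynb --html report.html
```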
Another method is to use nbconvert, which can execute a notebook directly from the command line.
See the nbconvert documentation here: Executing notebooks from the command line.
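A minimal sketch of the nbconvert route, assuming a notebook named MyNotebook.ipynb:

```shell
# Execute the notebook headlessly and write the executed copy to a new file
jupyter nbconvert --to notebook --execute MyNotebook.ipynb --output executed.ipynb

# Alternatively, convert the notebook to a plain script and run it,
# so that all cell output appears directly in the terminal
jupyter nbconvert --to script MyNotebook.ipynb
python3 MyNotebook.py
```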
I want to develop using SageMaker notebooks, but I want to use TF 2.0.
Is there a way to do it?
That's a great question. To enable TF 2.0 in SageMaker notebooks, go to the Conda tab (in Jupyter, not JupyterLab) and select the tensorflow_p36 environment. At the bottom left, search for "tensorflow" in the available Conda packages. There will be a TF 2.0 option. Check that box, then click the right arrow, which will install TF 2.0 into your tensorflow_p36 environment.
Restart any existing notebooks that are open.
Note: This works only with Jupyter and not JupyterLab.
Alex Chung
Senior Product Manager - AWS
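For what it's worth, the GUI steps above should roughly correspond to the following terminal commands. This is an untested sketch: the exact package specification and channel behavior may differ from what the Conda tab does.

```shell
# Rough CLI equivalent of the Conda-tab steps (run from the notebook terminal)
source activate tensorflow_p36
conda install tensorflow=2.0   # or: pip install --upgrade tensorflow==2.0.0
```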
Is there a way to do it right from a cell in the notebook, similar to pip install ... --upgrade?
I didn't know how to follow the instructions at https://docs.qubole.com/en/latest/faqs/general-questions/install-custom-python-libraries.html#pre-installed-python-libraries
The current Python version is 3.5.3, with Pandas 0.20.1. I need to upgrade Pandas and Matplotlib.
In Qubole there are two ways to upgrade or install a package for the Python environment. Currently there is no interface available inside the notebook to install new packages.
New and Recommended Way (via Package Management): Users can enable the Package Management functionality for an account and add new packages to a cluster via the UI. There are a lot of advantages to using package management over cluster bootstraps in terms of performance and usability. Refer to https://docs.qubole.com/en/latest/user-guide/package-management/index.html for further details.
Old Way (via bootstrap): Users can configure a bootstrap, which is basically a shell script executed on each node when the cluster starts and/or upscales (i.e., when more nodes are added to the cluster). This can be configured via the Clusters UI and needs a cluster restart for every change. This is what is instructed in the link you shared.
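As an illustration, a node bootstrap for the "old way" could contain something like the following. The pip invocation is a sketch: the actual interpreter path and pip binary depend on how Python is laid out on the cluster nodes.

```shell
# Hypothetical node-bootstrap snippet: upgrade Pandas and Matplotlib
# on every node when the cluster starts or upscales
pip3 install --upgrade pandas matplotlib
```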
You cannot download/upgrade packages directly from the cell in the notebook. This is because your notebook is associated to a cluster. Now, to ensure that all the nodes of the cluster have the package installed, you must either use the package management (https://docs.qubole.com/en/latest/user-guide/package-management/package-management-environment.html) or the cluster's node bootstrap (https://docs.qubole.com/en/latest/user-guide/clusters/run-scripts-cluster.html#examples-node-scripts).
Do let me know if you have any further questions.