Is it possible to access sagemaker jupyter notebook from intellij IDE? - amazon-web-services

I have deployed a model via a Jupyter notebook on a SageMaker instance.
Now I am wondering: is there any way to access the SageMaker Jupyter notebook from the IntelliJ IDE?
I am looking for a way to set up an environment for working with peers so that I can get code reviews.
I can see that I can control AWS Lambda functions from a terminal, but I am not sure about the Jupyter notebook on a SageMaker instance.
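A notebook instance runs a standard Jupyter server, so one option is to point the IDE's Jupyter integration (e.g. the "configured server" setting in IntelliJ-family IDEs) at a presigned URL for the instance. A minimal sketch using boto3's `create_presigned_notebook_instance_url`; the instance name and region here are placeholders:

```python
def presigned_notebook_url(instance_name: str, region: str = "us-east-1") -> str:
    """Return a short-lived login URL for a SageMaker notebook instance.

    The URL carries an auth token, so it can be pasted into an IDE's
    configured-Jupyter-server field while it remains valid.
    """
    import boto3  # imported lazily so the sketch loads without the AWS SDK

    sm = boto3.client("sagemaker", region_name=region)
    resp = sm.create_presigned_notebook_instance_url(
        NotebookInstanceName=instance_name,
        SessionExpirationDurationInSeconds=1800,  # 30 minutes
    )
    return resp["AuthorizedUrl"]
```

Note that the presigned URL expires, so this is more of a stopgap; for actual peer code review, pushing the notebook to a shared Git repository is usually the better workflow.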

Related

AWS glue interactive session in sagemaker notebooks via lifecycle configurations

I am trying to work with a Glue interactive session in a SageMaker notebook by configuring the glue-conda-pyspark kernel via AWS lifecycle configurations. It worked earlier when creating a notebook instance. Now the instance is running with the configuration, but I am no longer able to see the conda Glue PySpark kernel in the kernel list. Could anybody help with the create script and start script to run the notebook with glue-pyspark?
I am configuring using this aws doc: https://docs.aws.amazon.com/glue/latest/dg/interactive-sessions-sagemaker.html#is-sagemaker-existing
I also took help from this AWS GitHub sample script: https://github.com/aws-samples/amazon-sagemaker-notebook-instance-lifecycle-config-samples/blob/master/scripts/install-conda-package-single-environment/on-start.sh
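For reference, the lifecycle configuration in those docs boils down to registering an on-start script that installs the Glue kernels. A hedged sketch of doing that registration with boto3; the `aws-glue-sessions` package and `install-glue-kernels` command are taken from the linked doc, while the config name and exact script contents are assumptions:

```python
import base64

# On-start shell script, per the linked AWS doc: install the Glue
# interactive-sessions kernels into the ec2-user environment.
ON_START = """#!/bin/bash
set -e
sudo -u ec2-user -i <<'EOF'
pip install aws-glue-sessions
install-glue-kernels
EOF
"""


def encoded_content(script: str) -> str:
    # Lifecycle configuration content must be base64-encoded.
    return base64.b64encode(script.encode("utf-8")).decode("utf-8")


def register_lifecycle_config(name: str) -> None:
    """Create (or re-create) the lifecycle configuration in SageMaker."""
    import boto3  # lazy import: only needed when actually calling AWS

    sm = boto3.client("sagemaker")
    sm.create_notebook_instance_lifecycle_config(
        NotebookInstanceLifecycleConfigName=name,
        OnStart=[{"Content": encoded_content(ON_START)}],
    )
```

If the kernel disappeared after a stop/start cycle, re-running the install in the on-start script (not only on-create) is usually the fix, since on-create runs only once.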

GCP AI notebook instance permission

I have a GCP AI notebook instance. Anyone with admin access to notebooks in my project can open a notebook and read, modify, or delete any folder/file created by me or any other user on my team. Is there a way to create a private directory like /home/user, as we could have done with JupyterHub installed on a VM?
Implementing your requirement is not feasible with AI Notebooks. AI Notebooks is intended for a rapid prototyping and development environment that can be easily managed, and advanced multi-user scenarios fall outside its intended purpose.
The Python Kernel in AI Notebooks always runs under the Linux user "Jupyter" regardless of what GCP user accesses the notebook. Anyone who has editor permissions to your Google Cloud project can see each other's work through the Jupyter UI.
In order to isolate the user's work, the recommended option is to set up individual notebook instances for each user. Please find the 'Single User' option.
It’s not feasible to combine multiple instances into a master instance in AI Notebooks, so the recommended way is to give each user a notebook instance and share any source code via Git or another repository system. Please see the Save a Notebook to GitHub doc for more information.
You probably created a Notebook using Service Account mode. You can provide access to single users only via single-user mode
Example:
proxy-mode=mail,proxy-user-mail=user@domain.com

How can I schedule a .ipynb notebook in SageMaker using AWS Lambda and a Lifecycle Configuration?

I want to schedule my .ipynb file with AWS Lambda. I am following the steps of this publication: https://towardsdatascience.com/automating-aws-sagemaker-notebooks-2dec62bc2c84. Starting and stopping the notebook instance works very well, but my .ipynb file is not executing, even though I wrote the lifecycle configuration the same as in the publication above.
I only changed these lines to point at my notebook instance source:
NOTEBOOK_FILE="/home/ec2-user/SageMaker/Test Notebook.ipynb"
source /home/ec2-user/anaconda3/bin/activate "$ENVIRONMENT"
source /home/ec2-user/anaconda3/bin/deactivate
CloudWatch is working very well for the notebook instance, but the .ipynb file is not executed.
Can someone help me with this problem?
Check out this AWS sample of how to run a notebook in AWS SageMaker.
This document shows how to install and run the sagemaker-run-notebooks library that lets you run and schedule Jupyter notebook executions as SageMaker Processing Jobs.
This library provides three interfaces to the notebook execution functionality:
A command line interface (CLI)
A Python library
A JupyterLab extension that can be enabled for JupyterLab running locally, in SageMaker Studio, or on a SageMaker notebook instance
https://github.com/aws-samples/sagemaker-run-notebook
Also, check out this example of Scheduling Jupyter notebooks on SageMaker. You can write code in a Jupyter notebook and run it on an Amazon SageMaker ephemeral instance with the click of a button, either immediately or on a schedule. With the tools provided there, you can do this from anywhere: at a shell prompt, in JupyterLab on Amazon SageMaker, in another JupyterLab environment you have, or automated in a program you’ve written.
https://aws.amazon.com/blogs/machine-learning/scheduling-jupyter-notebooks-on-sagemaker-ephemeral-instances/
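The library's processing container executes notebooks with papermill, which you can also try directly to verify that the notebook itself runs headlessly before debugging the lifecycle configuration. A minimal sketch; the file names and parameters are placeholders:

```python
def execute_headless(input_nb: str, output_nb: str, params: dict) -> None:
    """Run a notebook top to bottom, writing an executed copy with outputs."""
    import papermill as pm  # imported lazily; install with `pip install papermill`

    pm.execute_notebook(
        input_nb,           # e.g. "Test Notebook.ipynb"
        output_nb,          # executed copy, with cell outputs saved
        parameters=params,  # injected into a cell tagged "parameters"
    )
```

If this raises an error locally, the notebook would fail the same way inside the scheduled job, so it is a quick sanity check.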

Can you start and execute a Jupyter Notebook in Sagemaker using Step Functions?

I am planning to start a Jupyter Notebook instance and execute each notebook file in AWS Sagemaker using AWS Step Functions. Can this be achieved?
The AWS Step Functions Data Science SDK is an open source library that allows data scientists to easily create workflows that process and publish machine learning models using SageMaker and Step Functions.
The following example notebooks are available in Jupyter notebook instances in the SageMaker console and in the related GitHub project:
hello_world_workflow.ipynb
machine_learning_workflow_abalone.ipynb
training_pipeline_pytorch_mnist.ipynb
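With the Data Science SDK, the notebook execution would be expressed as a workflow step. A hedged sketch using a `LambdaStep` that invokes whatever Lambda function kicks off the notebook run; the function name, role ARN, and payload are placeholders:

```python
def build_notebook_workflow(role_arn: str, lambda_name: str):
    """Define (but do not yet create) a one-step Step Functions workflow."""
    # Third-party SDK, installed with `pip install stepfunctions`.
    from stepfunctions.steps import Chain, LambdaStep
    from stepfunctions.workflow import Workflow

    run_step = LambdaStep(
        "RunNotebook",
        parameters={
            "FunctionName": lambda_name,  # Lambda that starts the notebook run
            "Payload": {"notebook": "Test Notebook.ipynb"},
        },
    )
    return Workflow(
        name="notebook-execution",
        definition=Chain([run_step]),
        role=role_arn,  # IAM role assumed by Step Functions
    )
```

Calling `.create()` and then `.execute()` on the returned workflow would register the state machine and start a run.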

Error in launching Jupyter notebook on Google Cloud Platform

Followed this blog post on creating a new VM and trying to launch Jupyter notebook on GCP.
https://medium.com/@kn.maragatham09/installing-jupyter-notebook-on-google-cloud-11979e40cd10
Getting this error message
Did you try the obvious checks?
Does the cert exist? (Maybe it was removed along the way.)
Does the cert have the correct ownership?
Does the user executing Jupyter have the right to access the cert?
Maybe you could try and run Jupyter in verbose mode. And post it here.
FYI, GCP now offers an easy-to-use, pre-configured Jupyter Notebook environment called AI Platform Notebooks. Have you tried using that instead? You won't need to worry about setting up any certs :)