Unable to delete notebooks from my project - data-science-experience

I have a project in DSX Local to which I have added some Jupyter notebooks. When I go to my assets and click delete (via the 3-dot menu at the right of the screen), it asks whether I want to delete the notebook. When I confirm, the screen shows "Deleting" for about a minute, then returns an error at the top: "There was an error deleting the notebook." The notebook is still in the list of assets, and I can even still open it afterwards.
How can I remove my notebooks?

The problem could be due to the kernel not running when you tried to delete the notebook. Can you try starting the kernel and then deleting the notebook again? This is a known limitation, documented here:
https://datascience.ibm.com/docs/content/local/limitations.html#kernel-must-be-running-for-the-notebook-to-be-deleted

Related

View notebook in SageMaker Studio without automatically starting a kernel or instance

Currently, when you double-click a notebook in SageMaker Studio that you have run before, the notebook opens with your previous kernel and starts an instance.
Sometimes I just want to view the notebook, not run a kernel session.
How can I disable this automatic behaviour and only start a kernel when I explicitly pick one from the dropdown?
I've tried looking for documentation but cannot find anything on this.
This is expected behavior in Studio; it picks up the instance type from the notebook's metadata. There is a lifecycle configuration (LCC) script you can use to avoid this behaviour: https://github.com/aws-samples/sagemaker-studio-lifecycle-config-examples/pull/26
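For reference, here is a rough sketch of how an LCC script can be registered with boto3. The script body below is only a placeholder (the actual script is in the linked pull request), and the config name is made up:

```python
import base64

import boto3

sagemaker = boto3.client("sagemaker")

# Placeholder body: substitute the script from the linked aws-samples PR,
# which changes Studio so that opening a notebook does not auto-start a kernel.
lcc_script = """#!/bin/bash
set -eux
# ... script contents from the PR go here ...
"""

response = sagemaker.create_studio_lifecycle_config(
    StudioLifecycleConfigName="no-auto-kernel",  # hypothetical name
    StudioLifecycleConfigContent=base64.b64encode(
        lcc_script.encode("utf-8")
    ).decode("utf-8"),
    StudioLifecycleConfigAppType="JupyterServer",  # runs when the Jupyter server app starts
)
print(response["StudioLifecycleConfigArn"])
```

Note that the config still has to be attached to your Studio domain or user profile (for example via update_user_profile) before Studio will run it.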

How can I change the security setting and enable terminal for a Vertex AI managed notebook?

I created a notebook using Vertex AI without enabling terminal first, but I want to enable terminal now so that I can run a Python file from a terminal. Is there any way I can change the setting retrospectively?
As of now, if you create a notebook instance with "Enable terminal" unchecked, you cannot re-enable the option once the instance has been created.
The only workaround is to recreate the notebook instance with the terminal enabled.
There is already a feature request for this. You can star it in the public issue tracker and add a 'Me too' comment in the thread; this brings more attention to the request as more users ask for it.

Run a SageMaker notebook instance and be able to close the tab

I'm currently using a SageMaker notebook instance (not SageMaker Studio), and I want to run a notebook that is expected to take around 8 hours to finish. I want to leave it running overnight and see the output of each cell; the output is a combination of print statements and plots.
However, after I start running the notebook and make sure the initial cells have run, I close the JupyterLab tab in my browser. When I open it again some minutes later to see how it is going, the notebook has stopped.
Is there any way I can keep using my notebook as it is, see the output of each cell (prints and plots), and not have to keep the JupyterLab tab open (so I can turn my laptop off, etc.)?
Jupyter stops your kernel when you close the tab. If you want your jobs to keep running after you close the Jupyter tab, I would recommend looking into SageMaker Processing or Training jobs for these workloads. Alternatively, this link provides some options for keeping the notebook running with the tab closed.
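As a rough sketch of the Processing route (the image URI, script name, bucket, and instance type below are illustrative, not from the question):

```python
import sagemaker
from sagemaker.processing import ProcessingOutput, ScriptProcessor

role = sagemaker.get_execution_role()

# A Processing job runs your code on a managed instance, detached from
# your browser session, so closing the tab (or your laptop) is fine.
processor = ScriptProcessor(
    image_uri="<your-container-image-uri>",  # e.g. a prebuilt scikit-learn image
    command=["python3"],
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",  # illustrative choice
)

processor.run(
    code="long_running_job.py",  # hypothetical script converted from the notebook
    outputs=[
        ProcessingOutput(
            source="/opt/ml/processing/output",  # have the script write plots/results here
            destination="s3://<your-bucket>/job-output",
        )
    ],
    wait=False,  # return immediately; the job keeps running in SageMaker
)
```

The job's logs go to CloudWatch, and anything the script writes under the output path is uploaded to S3, so nothing depends on your browser staying open.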
Answering my own question.
I ended up using SageMaker Processing jobs for this, as the other answer initially suggested. I found a library developed a few months ago, sagemaker-run-notebook, which let me keep my notebook structure and cells as they were, run the notebook on a bigger instance, and keep editing it on a smaller one.
The output of each cell, along with my plots, was saved to S3 as a Jupyter notebook.
The library does not seem to be actively maintained, but you can fork it and adapt it to your requirements, for example by building a Docker container that fits your needs.

Not able to run a cell in a JupyterLab notebook on Google Cloud AI Platform

I am running 2 instances under Google AI Platform, which basically launches 2 VM instances to run JupyterLab. I have been happily making notebooks on both VMs. I shut down both VMs for the day...
What's strange is that the next morning, a notebook from one VM launches, but when I run any cell containing something as simple as "import pandas", it never returns a result and hangs the whole thing (with a * where the cell number would normally appear). I created a whole new notebook and just did a simple print("hello"); it also never returns. I restarted the instance a few times and it still doesn't work. What I noticed is that the dot in the top right corner is filled black; I think it should be white when the kernel has restarted, so there could be a problem with the kernel.
Any idea what could be wrong? I don't even know where to start debugging this. The strange thing is that the other VM still works. I don't want to do anything drastic like re-creating the VM, since I would like to fix this for a known cause.
Has anyone out there experienced the same thing?
In case you haven't attempted this yet, I would try refreshing the notebook window after restarting the machine.
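If a refresh doesn't help, one way to see whether the kernel ever actually comes up is to query the Jupyter server's REST API from a terminal on the affected VM. A sketch; the base URL, port, and token are assumptions that depend on how your instance is set up:

```python
import requests

BASE_URL = "http://localhost:8080"  # assumed local port; check your Jupyter config
TOKEN = ""                          # fill in if your server uses token auth

headers = {"Authorization": f"token {TOKEN}"} if TOKEN else {}
resp = requests.get(f"{BASE_URL}/api/kernels", headers=headers)
resp.raise_for_status()

# A healthy, unoccupied kernel reports execution_state "idle";
# one stuck in "busy" or "starting" points at a kernel-side problem.
for kernel in resp.json():
    print(kernel["id"], kernel["name"], kernel["execution_state"])
```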

IPython notebook remote server on AWS

I'm running a remote IPython notebook server on an EC2 instance on AWS. The instance is running Ubuntu.
I followed this tutorial to set it up, and everything seems to work: I can access the notebook over HTTPS with a password and run code.
However, I can't seem to save changes to the notebook. It says "saving notebook" and then nothing happens (i.e., "unsaved changes" is still shown at the top).
Any ideas would be greatly appreciated.
Edit: It's not a permissions problem, since running with sudo doesn't help.
When I create a new notebook on the remote server, I am able to save it. The problem only occurs for notebooks pulled from my git repository. Also, when I open a problematic notebook and delete cells until it is absolutely empty, I can sometimes (!) save the empty notebook, and sometimes (!!) I still can't.
I've encountered an issue where notebooks wouldn't save on an nbserver on an AWS EC2 instance that I set up in a similar manner via a different tutorial. It turned out I had to refresh and log in again with the password, because my browser would automatically log out after a certain period. It might help to close the page, go back to the nbserver, and see if it asks you to log in again.
Here are a few other things you can try:
copy a problematic notebook to the server with scp and try to open and save it, as opposed to going through a repo pull, to see if anything changes
check whether the hanging "saving notebook" message appears for notebooks in certain directories
check the IPython console messages when you save a problematic notebook and see if anything there helps you pinpoint the issue (the small nbformat round-trip sketched below can also help isolate file-level problems)
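As a quick file-level check, you can also do the save round-trip outside the browser with nbformat. A sketch, with a hypothetical path:

```python
import nbformat

path = "problematic_notebook.ipynb"  # hypothetical: one of the git-pulled notebooks

# Reading applies the same schema validation the server uses,
# so a malformed file from the repo should fail loudly here.
nb = nbformat.read(path, as_version=4)
nbformat.validate(nb)  # raises nbformat.ValidationError on schema problems

# If writing succeeds but the browser save still hangs, the problem
# is more likely in the server/browser session than in the file.
nbformat.write(nb, path)
print("round-trip save OK")
```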