Cannot create a Python Jupyter notebook (New Notebook is deactivated) in DSX - data-science-experience

I cannot create a Python Jupyter notebook in DSX; the New Notebook button is deactivated.
It was possible previously. Could you check this?
P.S. When I created the project with Swift storage rather than Cloud Object Storage, it was also not possible to create a Python Jupyter notebook (New Notebook was deactivated), even the first time; at that time it was still possible in the Cloud Object Storage project.
Why are some function icons deactivated? E.g., + New flow in SPSS Modeler flows, + New Model in Models.
Thank you!

You need an instance of the "Data Science Experience" service before you can create notebooks in DSX. Please check whether you have one in your Apps and Services: go to Profile (top-right) --> Settings --> Apps and Services. Under Apps, there should be an app of type "Data Science Experience". If it's not there, create one from the IBM Cloud console: https://console.bluemix.net/ . Make sure that you are in the correct Account and Org (the same as your DSX account).

Related

How to add an environment variable to a Google AI Notebook?

I would like to set an environment variable on a Google VM that is running a JupyterLab notebook; the variable needs to be present before the notebook is started.
So setting it in the terminal or in the notebook does not work.
I have also tried modifying .bashrc, with no luck.
To have an environment variable set on your Compute Engine instance from boot, you might be interested in startup scripts.
Startup scripts are automated tasks that run when your instance boots. You can set one when creating the instance, under the Automation section; if the instance already exists, open your instance details in the Compute Engine console and, under Custom metadata, click Add item.
Steps to create startup scripts can be found here and here.
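As a rough sketch, a startup script can export the variable system-wide before JupyterLab comes up. The variable name, value, file path, and instance name below are all placeholders, and this assumes JupyterLab is launched from a login shell:

#!/bin/bash
# Hypothetical startup script: write the variable where login shells
# (including the one that later launches JupyterLab) will pick it up.
echo 'export BASE_DIR=/data' > /etc/profile.d/custom_env.sh

To attach it to an existing instance so it runs on the next boot:

gcloud compute instances add-metadata my-instance --metadata-from-file startup-script=startup.sh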
If you mean Google Colab, one solution is to set the variable from Python, for example:
import os

# Makes the variable visible to everything running in this notebook's process.
os.environ["BASE_DIR"] = "/content/drive/MyDrive/"

Connect Colab to paid TPU

I'd like to connect Colab to a PAID TPU (upgrading from the free TPU). I created a JSON key using this guide: https://cloud.google.com/docs/authentication/production#auth-cloud-explicit-python, then uploaded it to Colab. I'm able to connect to my storage but not to the TPU:
%tensorflow_version 2.x
import tensorflow as tf
import os
from google.cloud import storage  # needed for storage.Client below

os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = './gcp-permissions.json'
# Authenticated API request - works.
storage_client = storage.Client.from_service_account_json(
    'gcp-permissions.json')
print(list(storage_client.list_buckets()))
# Accessing the TPU - does not work. Request times out.
cluster_resolver = tf.distribute.cluster_resolver.TPUClusterResolver(
    tpu='My-TPU-Name',
    zone='us-central1-a',
    project='My-Project-Name'
)
I've also tried the TPUClusterResolver call with just the tpu name, and with 'credentials=gcp-permissions.json' - same result. I've double-checked that my TPU is up and running in the GCP console. It is not preemptible. What am I missing?
Thanks!
So it looks like you're trying to connect to a paid TPU in your own Google Cloud project from a Colab notebook, is that right? That won't work, as the Colab runtime is backed by a GCE VM that lives in a different project than your own My-Project-Name. Instead, create a GCE VM in the same project as the TPU and run your training script from that VM. Check out this tutorial: https://cloud.google.com/tpu/docs/quickstart.
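For reference, once the code runs on a VM in the same project and zone as the TPU, the usual TF 2.x connection sequence looks roughly like this (the TPU name is a placeholder):

import tensorflow as tf

# Run this from a GCE VM in the same project/zone as the TPU.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu='my-tpu-name')
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)
# Build and compile the Keras model inside strategy.scope().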

Don't know how to load data into a GCP notebook (AI Platform)

I am turning to GCP (Google Cloud Platform) to train a Keras model using Google's powerful GPUs; for that I created a VM instance on which I run a JupyterLab notebook.
I found myself unable to access my data, which is stored as a bucket on Google Storage.
I found this small doc; under Python, it defines two functions allowing you to create and fill a dataset. My problem is that I couldn't install the datalabeling_v1beta1 module.
I already tried the command below, with no result.
! gcloud components install datalab
I am new to GCP, so I don't know much about the terminology; my goal for the moment is to upload my data set so I can use it as if I were on Google Colab or on my local machine.
Please refer to installing dependencies
Create a new notebook, File -> New -> Notebook
%pip install google-cloud-datalabeling
For details, see the Data Labeling reference.
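Since the stated goal is just to get the bucket data into the notebook, a minimal sketch with the google-cloud-storage client may be all that's needed (bucket and object names are placeholders):

from google.cloud import storage

# Download one object from the bucket to the VM's local disk.
client = storage.Client()
bucket = client.bucket('my-training-data')
blob = bucket.blob('dataset.csv')
blob.download_to_filename('/tmp/dataset.csv')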

(GCP) How to keep a Jupyter session running after disconnecting from my local laptop?

I keep a Jupyter server running on a GCP VM instance via tmux.
The problem is that I want to keep fitting my model after leaving the Jupyter session on my local laptop
(e.g., I turn off my laptop but the Jupyter session stays alive, fitting the model, and I can reconnect later to check its status).
The only way I came up with is to move the code into a .py file and run $ python3 fitting.py, but I want to run and fit the model in a Jupyter notebook so I can monitor it without adding extra code.
If there is a possible way to do so, please kindly teach me.
Thanks!
Have you considered using the Fairing library? It comes pre-installed with GCP's new AI Platform Notebooks.
This library allows you to pack up your notebook and send it off for remote execution. A new notebook with the executed content will be saved to your GCP Storage bucket. No active internet connection is required once you kick off the notebook run.
You can learn how to use it by creating a new GCP AI Platform Notebook and looking at the tutorials folder inside it. You can also find additional tutorials for Fairing here.
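If Fairing is more than you need, the tmux setup from the question can also run the notebook headlessly so execution survives a disconnect; a minimal sketch (the notebook filename is a placeholder):

# Inside a tmux session on the VM; keeps running after the laptop disconnects.
jupyter nbconvert --to notebook --execute --inplace fitting.ipynb

The executed notebook, with all cell outputs, is written back in place when the run finishes, so you can open it later to check the results.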

Using Google Cloud Shell Editor on an instance

I am using the Beta version of Google's newer file browser along with the web based shell window to access my Google Cloud instance (https://cloud.google.com/shell/docs/features#code_editor).
I want to use the new file editor. When it initially loads, it shows the files on my dev shell instance; but when I boot up the actual instance I want to work on, the file browser still shows the files from my persistent storage.
Can I get this window to show the files on the instance, so I can edit them on the fly?
As you can see in the screenshot below, the files shown in the top-left window do not match those in the active directory on the instance. Can I tell the file browser to look at the instance?
No, unfortunately you cannot view/edit files on the remote instance to which you are connecting. Think of Google Cloud Shell as your workstation in the cloud, with the web editor running right on that workstation: when you connect to a remote machine, you cannot see its filesystem directly.
You could, however, install a web editor on your remote instance. Google Cloud Shell uses the open-source Orion editor, which comes pre-installed on the Cloud Shell VM.
You can also run VS Code in your browser, connected to a remote Google Cloud VM instance. You need to download code-server; the repo supplies a binary release. After downloading, you can install it on the GCP VM instance and run VS Code in your browser.
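As a rough sketch (the install script is the one the code-server project publishes; the port and bind address are assumptions, so reach it through an SSH tunnel or open the firewall accordingly):

# On the GCP VM: install code-server using the project's install script.
curl -fsSL https://code-server.dev/install.sh | sh
# Serve VS Code in the browser on port 8080 of the VM.
code-server --bind-addr 127.0.0.1:8080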
Hope this blog and video will also help.