Run SageMaker Jupyter notebook using AWS CLI - amazon-web-services

I have a "notebook instance" in AWS SageMaker, and this instance contains many notebooks.
Is there any way to use the AWS CLI (shell) to individually start the notebooks and check their current status (running, stopped, etc.)?
If not, how can I perform the above actions programmatically?

You can see the status of a SageMaker notebook instance by using the command
$ aws sagemaker describe-notebook-instance --notebook-instance-name your_instance_name
which will return details about the instance, including its NotebookInstanceStatus.
To launch a notebook from the AWS CLI, use the following command
$ aws sagemaker create-presigned-notebook-instance-url --notebook-instance-name your_instance_name
The above command will return the URL of the Jupyter notebook; copy and paste it into your browser.
For more help see
https://docs.aws.amazon.com/cli/latest/reference/sagemaker/create-presigned-notebook-instance-url.html
and
https://docs.aws.amazon.com/sagemaker/latest/dg/howitworks-access-ws.html
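Putting those together, a minimal shell sketch of checking the status, starting the instance if it is stopped, and then fetching the URL could look like this (your_instance_name is a placeholder, and the --query fields assume the usual describe-notebook-instance / create-presigned-notebook-instance-url output):
# print only the current status (InService, Stopped, Pending, ...)
aws sagemaker describe-notebook-instance --notebook-instance-name your_instance_name --query 'NotebookInstanceStatus' --output text
# start the instance if it is stopped
aws sagemaker start-notebook-instance --notebook-instance-name your_instance_name
# once it reports InService, fetch the presigned Jupyter URL
aws sagemaker create-presigned-notebook-instance-url --notebook-instance-name your_instance_name --query 'AuthorizedUrl' --output text
Note that the CLI operates on the whole notebook instance; the individual .ipynb files on it do not have a separate running/stopped state.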

Related

AWS - is it possible to pass parameters from an AWS Lambda function to an AWS SageMaker notebook

I am a beginner with the AWS console. I am facing issues while building a machine learning pipeline.
Currently, the Lambda function does the job of getting the uploaded filename and username from the front end and invoking the notebook instance.
Also, a lifecycle configuration on the instance invokes the notebook for training. So the question is how to pass the above variables to the SageMaker notebook for training the machine learning model. Is it possible to achieve this? Thank you.
# invoke command from lambda
import boto3
client = boto3.client('sagemaker')
client.start_notebook_instance(NotebookInstanceName='<sagemaker_instance_name>')
Lifecycle configuration on the SageMaker instance:
#!/bin/bash
set -e
ENVIRONMENT=JupyterSystemEnv
#JupyterSystemEnv
NOTEBOOK_FILE=/home/ec2-user/SageMaker/XGBoost_training.ipynb
source /home/ec2-user/anaconda3/bin/activate "$ENVIRONMENT"
nohup jupyter nbconvert --ExecutePreprocessor.timeout=-1 --ExecutePreprocessor.kernel_name=python3 --to notebook --execute "$NOTEBOOK_FILE" &
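The start_notebook_instance call itself does not accept custom parameters, so one common pattern is for the Lambda function to stash the values somewhere the instance can read before starting it, for example SSM Parameter Store. The lines below are only a sketch: the parameter names /training/filename and /training/username are made up, and they assume the Lambda function has written them with put_parameter (and that the notebook instance role is allowed to call ssm:GetParameter):
# added to the lifecycle configuration before the nbconvert line:
# read the values Lambda stored and expose them to the notebook as environment variables
export TRAINING_FILENAME=$(aws ssm get-parameter --name /training/filename --query 'Parameter.Value' --output text)
export TRAINING_USERNAME=$(aws ssm get-parameter --name /training/username --query 'Parameter.Value' --output text)
Because the exported variables are inherited by the nohup'd jupyter nbconvert process, the notebook can then read them with os.environ['TRAINING_FILENAME'] and os.environ['TRAINING_USERNAME'].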

How to run Jupyter notebook on AWS instance

How to run Jupyter notebook on AWS instance, chmod 400 error
I want to run my Jupyter notebooks in the cloud, on an AWS EC2 instance.
--
I'm following this tutorial:
https://www.codingforentrepreneurs.com/blog/jupyter-notebook-server-aws-ec2-aws-vpc
--
I have the EC2 instance all set up, as well as nginx.
--
The problem is:
chmod 400 JupyterKey.pem works on macOS but not in Windows PowerShell
cd path/to/my/dev/folder/
chmod 400 JupyterKey.pem
ssh ubuntu@34.235.154.196 -i JupyterKey.pem
Error: The term 'chmod' is not recognized as the name of a cmdlet, function, script, or operation
CategoryInfo: ObjectNotFound
FullyQualifiedErrorId: CommandNotFoundException
AWS has a managed Jupyter Notebook service as part of Amazon SageMaker.
SageMaker hosted notebook instances let you spin up a Jupyter Notebook with one click, with pay-per-hour pricing (similar to EC2 billing). You can also upload your existing notebooks directly onto the managed instance, all through the instance URL and the AWS console.
Check out this tutorial for a guide on getting started!
I had the same permission problem and fixed it by running the following command on the Amazon Machine Image (Linux):
sudo chown user:user ~/certs/mycert.pem
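For the Windows PowerShell side of the question: chmod simply does not exist there, so a rough equivalent (a sketch using the built-in icacls tool; run it in PowerShell from the folder containing the key) is to strip inherited permissions from the .pem file and grant read access only to your own user:
# reset the ACL, grant only the current user read access, then drop inherited permissions
icacls.exe JupyterKey.pem /reset
icacls.exe JupyterKey.pem /grant:r "$($env:USERNAME):(R)"
icacls.exe JupyterKey.pem /inheritance:r
After that, the ssh ... -i JupyterKey.pem command should no longer complain about the key's permissions.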

Cannot Transfer files from my mac to VM instance on GCP

I have managed to set up a VM instance on Google cloud platform using the following instructions:
https://towardsdatascience.com/running-jupyter-notebook-in-google-cloud-platform-in-15-min-61e16da34d52
I am then able to run a Jupyter notebook as per the instructions.
Now I want to be able to use my own data in the notebook... this is where I am really struggling. I downloaded the Cloud SDK onto my Mac and ran this from the terminal (as per https://cloud.google.com/compute/docs/instances/transfer-files)
My-MacBook-Air:~ me$ gcloud compute scp /Users/me/Desktop/my_data.csv aml-test:~/amlfolder
where aml-test is the name of my instance and amlfolder a folder I created on the VM instance. I don't get any error messages and it seems to work (the terminal displays the following after I run it >> 100% 66MB 1.0MB/s 01:03 )
However when I connect to my VM instance via the SSH button on the google console and type
cd amlfolder
ls
I cannot see any files! (nor can I see them from the jupyter notebook homepage)
I cannot figure out how to use my own data in a python jupyter notebook on a GCP VM instance. I have been trying/googling for an entire day. As you might have guessed I'm a complete newbie to GCP (and cd, ls and mkdir is the extent of my linux command knowledge!)
I also tried using Google Cloud Storage - I uploaded the data into a google storage bucket (as per https://cloud.google.com/compute/docs/instances/transfer-files) but don't know how to complete the last step '4. On your instance, download files from the bucket.'
If anyone can figure out what I am doing wrong, or suggest an easier method than the gcloud scp command to get my own data into a Python Jupyter notebook on GCP, please help!
Definitely try running
pwd
to verify you're in the path you think you are; there's a chance that your scp command and the console SSH command log in as different users.
To copy data from a bucket to the instance, do
gsutil cp gs://bucket-name/your-file .
As you can see in the gcloud compute docs, gcloud compute scp /Users/me/Desktop/my_data.csv aml-test:~/amlfolder uses your local environment's username, so the tilde in your command refers to the home directory of a user with the same name as your local one.
But when you SSH from the browser, as the docs explain, your Gmail username is used instead.
So you should check the home directory of the user used by the gcloud compute scp ... command.
The easiest way to check is to SSH to your VM and run
ls /home/ --recursive
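If the home directories do turn out to differ, one workaround (a sketch; gmail_user and your-bucket stand in for the username the browser SSH session shows and your actual bucket) is to name the remote user explicitly in the copy, or to pull the file from a bucket directly on the VM:
# copy into the home directory of the user the browser SSH session uses
gcloud compute scp /Users/me/Desktop/my_data.csv gmail_user@aml-test:~/amlfolder
# or, from inside the VM, download the file you uploaded to a bucket
gsutil cp gs://your-bucket/my_data.csv ~/amlfolder/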

Enable AWS Batch in AWS CLI

I am working in the US-East-1 (N.Virginia) and have even configured the Default Region Name to us-east-1 using the command aws configure.
But I am not able to access Batch using the CLI. Batch is not even listed as one of the available services in aws help.
Any ideas how to enable Batch in the AWS CLI? I have administrative access in the IAM console, so permissions don't seem to be the issue.
The batch service is relatively new, so its commands only exist in fairly new versions of the aws CLI.
Commands for batch in the latest cli documentation: http://docs.aws.amazon.com/cli/latest/reference/batch/index.html?highlight=batch
If you are running Windows, simply download the updated installer. https://aws.amazon.com/cli/
If you are using macOS or Linux, use pip: pip install --upgrade awscli
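After upgrading, a quick sanity check (a sketch; the output depends on your account, and an empty list is fine, it just proves the batch namespace exists):
# confirm the CLI version is new enough to include the batch commands
aws --version
# if the batch namespace is present, this lists your job queues
aws batch describe-job-queues --region us-east-1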

How to run a script on an EC2 instance remotely?

I have an EC2 instance and I need to download a file from its D drive through my program. Currently, it's a very annoying process because I can't access the instance directly from my local machine. What I am doing now is running a script on the instance; the instance uploads the file I need to S3, and my program accesses S3 to read the file.
Just wonder whether there is any simple way to access the drive on the instance instead of going through S3?
I have used AWS Data Pipeline and its Task Runner to execute scripts on a remote instance. The Task Runner waits for a pipeline event published to its worker group.
http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-using-task-runner.html
I use it to execute shell scripts and commands on a schedule. The script to run should be uploaded to S3, and the Data Pipeline template specifies the script's path. It works great for periodic tasks. You can do anything you want on the remote box via the script.
You cannot download the file directly from EC2; you have to go via S3 (or perhaps use the scp command) from your remote EC2 instance.
But to simplify this annoying process you can use AWS Systems Manager.
AWS Systems Manager Run Command allows you to remotely and securely run a set of commands on EC2 instances as well as on-premise servers. Below are the high-level steps to achieve this.
Attach Instance IAM role:
The EC2 instance must have an IAM role with the AmazonSSMFullAccess policy. This role enables the instance to communicate with the Systems Manager API.
Install SSM Agent:
The EC2 instance must have the SSM Agent installed on it. The SSM Agent processes the run command requests and configures the instance as per the command.
Execute command:
Example usage via the AWS CLI:
Execute the following command to retrieve the services running on the instance. Replace Instance-ID with your EC2 instance ID.
aws ssm send-command --document-name "AWS-RunShellScript" --comment "listing services" --instance-ids "Instance-ID" --parameters commands="service --status-all" --region us-west-2 --output text
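send-command only queues the command and returns a command ID; to see the actual output once it finishes, you can poll with get-command-invocation (a sketch; the Command-ID placeholder comes from the previous call's response):
# fetch the status and stdout of the command sent above
aws ssm get-command-invocation --command-id "Command-ID" --instance-id "Instance-ID" --region us-west-2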
More detailed information: https://www.justdocloud.com/2018/04/01/run-commands-remotely-ec2-instances/