How do I access-protect data on my Google Compute Engine VM? - google-cloud-platform

I want to work with sensitive data on a specific VM instance in a GCP project that my organization has contracted, and that data should not be seen by other members.
Normally, if I just set up a VM instance, other members of my organization are free to create a user, connect to the VM with SSH, and obtain sudo privileges.
So I'm wondering whether I should keep sensitive data inside the VM at all.
Q. Is there a way to prevent other users from accessing my VM instance data?
Q. Is OS Login suitable for the above purpose? If there is a simpler and more typical method, I would like to adopt it.
I currently have the role of "editor" on the GCP project.
Thanks.
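On the OS Login question: OS Login can be enabled per instance via metadata, after which SSH access is governed by IAM roles rather than project-wide SSH keys. A minimal sketch, assuming hypothetical instance, zone, project, and user names:

```shell
# Enable OS Login on a single instance (instance and zone names are placeholders).
gcloud compute instances add-metadata my-sensitive-vm \
    --zone=us-central1-a \
    --metadata=enable-oslogin=TRUE

# With OS Login on, SSH requires an IAM login role; grant it only to yourself
# (project ID and email are placeholders).
gcloud projects add-iam-policy-binding my-project \
    --member=user:me@example.com \
    --role=roles/compute.osAdminLogin
```

Note that project editors can still modify the instance itself, so this restricts SSH access rather than all administrative access.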

Related

GCP AI notebook instance permission

I have a GCP AI notebook instance. Anyone with admin access to notebook in my project can open notebook and read, modify or delete any folder/file which is created by me or any other user in my team. Is there a way to create a private repository like /home/user, like we could've done if we used JupyterHub installed on a VM?
Implementing your requirement is not feasible with AI Notebooks. AI Notebooks is intended for a rapid prototyping and development environment that can be easily managed, and advanced multi-user scenarios fall outside its intended purpose.
The Python Kernel in AI Notebooks always runs under the Linux user "Jupyter" regardless of what GCP user accesses the notebook. Anyone who has editor permissions to your Google Cloud project can see each other's work through the Jupyter UI.
In order to isolate the user's work, the recommended option is to set up individual notebook instances for each user. Please find the 'Single User' option.
It’s not feasible to combine multiple instances into a master instance in AI Notebooks. So the recommended way is to give each user a Notebook instance and share any source code via Git or another repository system. Please find the Save a Notebook to GitHub doc for more information.
You probably created a Notebook using Service Account mode. You can provide access to a single user only via single-user mode.
Example:
proxy-mode=mail,proxy-user-mail=user@domain.com
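For illustration, a per-user notebook instance with that metadata could be created from the command line. A sketch, assuming hypothetical instance name, zone, image family, and email:

```shell
# Create a notebook instance locked to one user via single-user (mail) proxy mode.
# All names and the email address are placeholders.
gcloud notebooks instances create my-single-user-notebook \
    --vm-image-project=deeplearning-platform-release \
    --vm-image-family=common-cpu \
    --machine-type=n1-standard-4 \
    --location=us-central1-a \
    --metadata=proxy-mode=mail,proxy-user-mail=user@example.com
```

Each teammate would get their own instance created this way, with their own email in `proxy-user-mail`.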

multiuser public jupyter notebook on AWS sagemaker

I know there is a good tutorial on how to create jupyter notebooks on AWS sagemaker "the easy way".
Do you know if it is possible to allow 10 students who do not have AWS accounts to create jupyter notebooks, and also allow them to edit jupyter notebooks?
Enabling multiple users to leverage the same notebook (in this case, without authentication) will involve managing your Security Groups to enable open access. You can filter, allowing access for a known IP address range, if your students are accessing it from a classroom or campus, for example.
Tips for this are available in this answer and this page from the documentation, diving into network configurations for SageMaker hosted notebook instances.
As for enabling students to spin up their own notebooks, I'm not sure if it's possible to enable completely unauthenticated AWS-level resource provisioning -- however once you've spun up a single managed notebook instance yourself, students can create their own notebooks directly from the browser in Jupyter, once they've navigated to the publicly available IP. You may need to attach a new SageMaker IAM role that enables notebook creation (amongst other things, depending on the workload requirements). Depending on the computational needs (number, duration, and types of concurrent workloads), there will be different optimal setups of number of managed instances and instance type to prevent computational bottlenecking.
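The IP-range filtering mentioned above can be done on the security group attached to the notebook's network interface. A sketch with the AWS CLI, where the security-group ID and CIDR range are placeholders:

```shell
# Allow inbound HTTPS to the notebook only from a known campus/classroom range.
# The group ID and CIDR block below are hypothetical examples.
aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp \
    --port 443 \
    --cidr 203.0.113.0/24
```

Keeping the rule scoped to a narrow CIDR is what stands in for authentication here, so avoid `0.0.0.0/0`.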

How do I make a VM where none of the project editors can ssh into the machine?

With the gcloud command, anyone with editor-level access can connect to any of the machines. While we want to make most machines accessible to anyone, there are some machines we don't want people touching. How can I set it up so that editors can't SSH into certain machines?
You will have to use some more fine grained roles than the primitive ones (which are owner, editor and viewer).
From the doc:
If you just want to give a user the ability to connect to a virtual machine instance as an SSH user, but don't want to grant them the ability to manage your Compute Engine resources, add a user's public key to the project, or add a user's public key to a specific instance.
Here is the whole document.
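As a sketch of the instance-level approach the doc describes, assuming hypothetical instance, zone, username, and key-file names:

```shell
# Add a user's public key to one specific instance only (not project-wide).
# Instance name, zone, username, and key file are placeholders.
gcloud compute instances add-metadata restricted-vm \
    --zone=us-central1-a \
    --metadata="ssh-keys=alice:$(cat alice_key.pub)"

# Optionally block project-wide SSH keys on that instance, so that only
# instance-level keys grant SSH access.
gcloud compute instances add-metadata restricted-vm \
    --zone=us-central1-a \
    --metadata=block-project-ssh-keys=TRUE
```

Combined with granting users fine-grained roles instead of editor, this keeps the sensitive machines off-limits while the rest stay open.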

Enable storage permission on Google Cloud VM instance

I have a Google Cloud VM instance, but I overlooked setting the Storage permission to read-write when creating it.
Now further down the line, I'm looking to experiment with cloud storage, but my instance is read-only.
How can this be changed? I understand it may be possible by relaxing the storage buckets policy, but I'd prefer my instance had write access to all future project buckets.
I presume there is an option in gcloud to change the devstorage parameter?
You can't change that option to grant the VM permission after creation, but I did find that you can just run
gcloud auth login
then you can sign in with your management account from the VM and you'll be able to run commands that way.
Unfortunately you can't change the scopes of the VM. You will have to create a new one to change them.
When you create a new one you can reuse the disk of the old VM, if that helps avoid the pain.
You can update the permissions of your VM instance now but only when it is shut down. Check the following documentation:
Changing the service account and access scopes for an instance
If you want to update the API permissions for a Kubernetes cluster's VM instances, then you cannot do that unless you create a new cluster and give API access to the nodes associated with it.
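The stop-then-change-scopes flow from the linked documentation can be sketched on the command line. Instance and zone names are placeholders:

```shell
# Access scopes can only be changed while the instance is stopped.
# The instance name and zone below are hypothetical.
gcloud compute instances stop my-vm --zone=us-central1-a

# Grant read-write access to Cloud Storage via the storage-rw scope.
gcloud compute instances set-service-account my-vm \
    --zone=us-central1-a \
    --scopes=storage-rw

gcloud compute instances start my-vm --zone=us-central1-a
```

Note that scopes only cap what the VM's service account may do; the service account still needs the corresponding IAM permissions on the buckets.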
I believe they have added the option to change it now, without creating another VM.
Once you have stopped the instance, click on the instance you want to change. At the top there is an Edit button; you can click it and change the permissions.
If you have changed the permission to read_write and it still says Access Denied, go to your instance's SSH browser window and enter 'gcloud auth login', follow the steps, and hopefully it works!
You need to stop your VM first, then click on Edit and change the Cloud API access scopes for Storage.
You may find more information here: https://ismailyenigul.medium.com/setting-access-scope-of-google-cloud-vm-instances-c8637718f453

How to find a user on Amazon AWS console

There is a user that can log in via FTP on a setup I’m working with. I can’t get hold of the people who set it up, but it is with Amazon. I wanted to find out how I could see what permissions this FTP user has, and how to set up another one for third-party access. I think it may be set up with EC2, but I’m not sure.
The FTP server, if it is running on EC2, has no relationship with your AWS console - it is specific to your instance, and whatever ftp software is running on the server.
You will need to get access to the instance to find out any more information. You can see the key associated with the instance from the console. If you don't have access to that instance, there are ways to get in, but it will involve stopping the instance, mounting the volume to another instance, adding a new key to the volume, and then restarting and using that key to access it.
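The recovery steps above can be sketched with the AWS CLI. All IDs and the device name are placeholders, and the key-editing step happens on the rescue instance itself:

```shell
# Rough outline of regaining access to an instance whose key you don't have.
# Instance IDs, volume ID, and device name below are hypothetical.
aws ec2 stop-instances --instance-ids i-0123456789abcdef0
aws ec2 detach-volume --volume-id vol-0123456789abcdef0

# Attach the volume to a rescue instance you control; then mount it there and
# append your public key to the user's ~/.ssh/authorized_keys on that volume.
aws ec2 attach-volume \
    --volume-id vol-0123456789abcdef0 \
    --instance-id i-0fedcba9876543210 \
    --device /dev/sdf
```

Afterwards, detach the volume, reattach it to the original instance as its root device, start the instance, and SSH in with your new key.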