How do I prevent local users from downloading data from an Azure virtual machine to the outside world? - azure-virtual-machine

I am building a system in Azure in which ordinary users should not be able to transfer data from an Azure virtual machine to the outside world, but the administrator should be able to.
I am aware that this can be controlled via an Azure NSG, but that solution would block ALL users (including the administrator account). The virtual machines are not domain-joined, so GPO is out of the question.
Is there any other way to achieve this?

I would suggest placing the data in a Storage Account, making the container private, and configuring RBAC so that only admins have access. That way, no one on the Internet will be able to download it.
When a user needs to download something, an admin can generate a SAS URL and the user can download via that.
This is a cost-effective and secure way of doing the task.
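To illustrate the idea behind a SAS URL: a SAS token is essentially an HMAC-SHA256 signature over a canonical "string to sign" containing the permissions, validity window, and resource path, computed with the storage account key. The sketch below is a simplified, standard-library-only illustration of that signing step (the real string-to-sign has more fields, and in practice you would use `generate_blob_sas` from the `azure-storage-blob` SDK; the account key and resource path here are made up):

```python
import base64
import hashlib
import hmac
import urllib.parse

def sign_string_to_sign(string_to_sign: str, account_key_b64: str) -> str:
    """Compute the HMAC-SHA256 signature a SAS token is built from."""
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")

# Hypothetical inputs: a demo account key and a simplified string-to-sign.
demo_key = base64.b64encode(b"not-a-real-storage-key").decode("utf-8")
string_to_sign = "\n".join([
    "r",                                       # signedPermissions: read-only
    "2024-01-01T00:00:00Z",                    # signedStart
    "2024-01-02T00:00:00Z",                    # signedExpiry (short-lived)
    "/blob/demoaccount/container/report.csv",  # canonicalizedResource
])
sig = sign_string_to_sign(string_to_sign, demo_key)
sas_query = urllib.parse.urlencode({"sp": "r", "sig": sig})
print(sas_query)
```

Because only the admin holds the account key, only the admin can mint these short-lived, read-only links; users receive a URL that expires on its own.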

Related

How to limit Service Account to only be able to access Google Drive?

I need to make an app (without UI) that uploads some files to a shared folder on Google Drive (I don't need to work with multiple accounts). From what I found, Service Account appears to be a good solution for accessing the Drive API. I want to limit my newly created Service Account to only be allowed to use Google Drive, but I can't figure out how to do that. I know that I can create a role and assign it to the account, but I have no idea which permissions (if any) relate to the Drive API.
To grant a Service Account permission to access a file or a Shared Drive, follow this guide:
https://developers.google.com/drive/api/v3/reference/permissions/create
The guide has an API explorer on the right that lets you build and try the calls against live data.
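As a sketch of what that `permissions.create` call looks like: you share the folder with the service account's email address (found as `client_email` in its JSON key file), after which the service account can see and write to that folder via the Drive API. The folder ID, email, and access token below are placeholders; the request is built but deliberately not sent:

```python
import json
import urllib.request

# Hypothetical values: replace with your real folder ID and the
# service account's email from its JSON key file ("client_email").
folder_id = "FOLDER_ID"
service_account_email = "my-app@my-project.iam.gserviceaccount.com"

# Body for POST https://www.googleapis.com/drive/v3/files/{fileId}/permissions
permission = {
    "type": "user",                          # grant to a single account
    "role": "writer",                        # allow uploads into the folder
    "emailAddress": service_account_email,
}

req = urllib.request.Request(
    url=f"https://www.googleapis.com/drive/v3/files/{folder_id}/permissions",
    data=json.dumps(permission).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # An OAuth access token for a user who is allowed to share the folder:
        "Authorization": "Bearer ACCESS_TOKEN",
    },
    method="POST",
)
print(req.full_url)
# Not sent here -- urllib.request.urlopen(req) would perform the call.
```

Note that IAM roles on the GCP project do not gate Drive access at all; sharing the folder with the service account's email is what grants it.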

Secure way to upload files to GCP Cloud Storage

We're making some machines in which there's a part which uploads the images captured by the camera to Google Cloud Storage. For this purpose what I've done is
Create a service account for each machine.
Create a custom role with
permissions:
storage.objects.create
storage.buckets.get
storage.objects.get
Apply this role to that service account.
Download the JSON credentials key file and use it with a Python script (in which I specify the bucket name) to upload images to GCP Storage.
Is this way of doing things efficient and secure, given that we only ship 2-3 machines each month?
Also, I will have to ship a JSON key file with each machine. If the above method is valid, is this fine, or is there any method to hide this key file?
Your case isn't so simple!
Firstly, if you want to put a service account on each machine, you will eventually hit a limit (you are limited to 100 service accounts per project). And reusing the same service account, or the same key, everywhere is too dangerous.
Secondly, your use case sounds like an IoT use case, where you have lots of devices at the edge communicating with the cloud. But Pub/Sub messages are limited to 10 MB, and the IoT Core solution doesn't fit your case either.
The two remaining solutions are based on the same principle:
Make an endpoint public (Cloud Run, Cloud Functions, App Engine, or whatever you want).
Call this endpoint from your machine, with its own token (i.e. a string, encrypted or not).
Check the token; if it is valid, you can (here are the two alternatives):
Create an access token (a short-lived token) on a service account with the minimal permissions for the machine's usage, and send it back to the machine. The machine will use it to call Google Cloud APIs, such as the Cloud Storage API. The advantage of this solution is that the access token can also reach other GCP APIs in the future, if your use case and your machine updates require them.
Create a signed URL and send it back to the machine. The machine then uploads the file to this URL. The advantage is the strict limitation to Cloud Storage; no other GCP service is reachable.
The main issue with both solutions is the required public endpoint: you are exposed to attacks on it. You can protect it behind a load balancer and mitigate attacks with Cloud Armor. Also think about limiting the scalability of your public endpoint, to prevent useless expense in case of an attack.
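The per-machine token check at the heart of both alternatives can be sketched with only the standard library. This is an illustrative scheme, not the endpoint itself: each machine is provisioned with its own secret at build time, sends an HMAC of a nonce, and the server recomputes and compares in constant time before minting any credential (the machine IDs and secrets here are invented):

```python
import hashlib
import hmac
import secrets

# Hypothetical shared secrets, provisioned one per machine at build time.
MACHINE_SECRETS = {"machine-001": b"per-machine-secret"}

def make_token(machine_id: str, nonce: str) -> str:
    """What the machine sends: an HMAC of a nonce under its own secret."""
    secret = MACHINE_SECRETS[machine_id]
    return hmac.new(secret, nonce.encode(), hashlib.sha256).hexdigest()

def verify_token(machine_id: str, nonce: str, token: str) -> bool:
    """Server side: recompute the HMAC and compare in constant time."""
    secret = MACHINE_SECRETS.get(machine_id)
    if secret is None:
        return False
    expected = hmac.new(secret, nonce.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token)

nonce = secrets.token_hex(8)
token = make_token("machine-001", nonce)
print(verify_token("machine-001", nonce, token))   # genuine machine
print(verify_token("machine-001", nonce, "bogus")) # forged token
# On success, the endpoint would mint a short-lived access token
# (e.g. via the IAM Credentials API) or a signed URL and return it.
```

Revoking a compromised machine is then just deleting its entry from the secret store, with no key file to chase down.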

In AWS EC2, is there a way to allow access only to a designated device?

I am new to AWS EC2. I want to set up a website only for my family members.
It will contain some content that is not necessarily private, but would be more appropriate if only family members can access.
IP address discrimination wouldn't work here, as we may be on the go and using other Wi-Fi networks.
I'm considering the MAC address as the screening basis.
Is such access restriction allowed in EC2? Thanks.
Restricting by MAC address won't work: the devices reach EC2 over a public network, and the MAC address changes at every hop. Would you be interested in setting up a remote VPN / L2TP VPN? EC2 can be used as a VPN server with access allowed only from certain clients; if not, try setting up a login-based page and creating accounts for your family members.
Here is a free open-source tool to achieve it:
https://www.digitalocean.com/community/tutorials/how-to-sync-and-share-your-files-with-seafile-on-ubuntu-18-04
Cognito is designed for such things; you can manage user accounts there. You can add an Application Load Balancer in front of your EC2 instance, which will forward to Cognito authentication, but this is a bit of an expensive solution for "family usage".
If there's no very sensitive data on this website, you can use just Basic Auth, which will prompt for a username and password on site entry, or you can add a standard login page to your website.
Last but not least is Lambda with API Gateway (the free tier allows free usage of this service for a LOT of requests). This is a more programmatic solution, but it's up to you which one to choose.
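The Basic Auth option is simple enough to sketch with the standard library alone. This is a minimal illustration, not a hardened server (the demo username and password are made up, and in real use you would serve it over HTTPS and store a password hash rather than plaintext):

```python
import base64
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical demo credentials; store a hash, not plaintext, in real use.
USERS = {"mom": "family-password"}

def check_basic_auth(header):
    """Validate an 'Authorization: Basic <base64(user:pass)>' header value."""
    if not header or not header.startswith("Basic "):
        return False
    try:
        user, _, pwd = base64.b64decode(header[6:]).decode().partition(":")
    except Exception:
        return False
    return USERS.get(user) == pwd

class FamilyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if not check_basic_auth(self.headers.get("Authorization")):
            # 401 + this header makes the browser show its login prompt.
            self.send_response(401)
            self.send_header("WWW-Authenticate", 'Basic realm="family"')
            self.end_headers()
            return
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"Welcome, family member!")

# To run on the EC2 instance (behind HTTPS!):
# HTTPServer(("", 8080), FamilyHandler).serve_forever()
good = "Basic " + base64.b64encode(b"mom:family-password").decode()
print(check_basic_auth(good), check_basic_auth("Basic bogus"))
```

For a family site this is usually plenty; the heavier Cognito or API Gateway setups only pay off when you need per-user accounts and self-service signup.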

Using Google Cloud Platform Storage to store user images

I was trying to understand the Google Cloud Platform storage but couldn't really comprehend the language used in the documentation. I wanted to ask if you could use the storage and the APIs to store photos users take within your application and also get the images back if provided with a URL? and even if you can, would it be a safe and reasonable method to do so?
Yes, you can use a storage bucket to store pretty much any kind of data.
In terms of transferring images from an application to storage buckets, the application must be authorised to write to the bucket.
One option is to use a service account key within the application. A service account is a special account that an application can use to authenticate to various Google APIs, including the Storage API.
There is some more information about service accounts here, and information here about using service account keys. These keys can be used within your application, and allow the application to inherit the permissions/scopes assigned to that service account.
In terms of retrieving images using a URL, one possible option would be to use signed URLs which would allow you to give users read or write access to an object (in your case images) in a bucket for a given amount of time.
Access to bucket objects can also be controlled with ACL (Access Control Lists). If you're happy for you images to be available publicly (i.e. accessible to everybody), it's possible to set an ACL with 'Reader' access for AllUsers.
More information on this can be found here.
Should you decide to make the images available publicly, the URL format to retrieve the object/image from the bucket would be:
https://storage.googleapis.com/[BUCKET_NAME]/[OBJECT_NAME]
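One detail worth noting when building that URL in code: object names may contain slashes and spaces, so everything except the path separators needs to be percent-encoded. A small sketch (the bucket and object names are invented):

```python
from urllib.parse import quote

def public_object_url(bucket: str, object_name: str) -> str:
    """Build the public URL for a bucket object; quote() percent-encodes
    the object name but leaves '/' intact, preserving the path structure."""
    return f"https://storage.googleapis.com/{bucket}/{quote(object_name)}"

# Hypothetical bucket and object names:
print(public_object_url("my-app-images", "users/42/profile pic.png"))
# -> https://storage.googleapis.com/my-app-images/users/42/profile%20pic.png
```

This URL only works if the object (or bucket) is publicly readable; otherwise use a signed URL as described above.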
EDIT:
In relation to using an interface to upload the files before they land in the bucket, one option would be to have an instance with an external IP address (or multiple instances behind a load balancer) where the images are initially uploaded. You could mount Cloud Storage on this instance using Cloud Storage FUSE, so that uploaded files are easily transferred to the bucket. In terms of databases, you have the option of manually installing your database on a Compute Engine instance, or using a fully managed database service such as Cloud SQL.

How can I "hide" the data in AWS from users?

I want to build an application using Amazon Web Services (AWS).
The way the application should work is this;
I make a program that lets the user import a large file in an external format and send it to AWS (S3?) in my own format.
Next many users can access the data from web and desktop applications.
I want to charge per user accessing the data.
The problem is that the data on AWS must be in an unintelligible format, or the users may copy the data over to another AWS account where I cannot charge them. In other words, the user needs to do some "decrypting" of the data before it can be used. On the web this must be done in JavaScript, which is plain text and would allow the users to figure out my unintelligible format.
How can I fix this problem?
Is there for instance a built in encryption/decryption mechanism?
Alternatively is there some easy way in AWS to make a server that decrypts the data using precompiled code that I upload to AWS?
In general when you don't want your users to access your application's raw data you just don't make that data public. You should build some sort of server-side process that reads the raw data and serves up what the user is requesting. You can store the data in a database or in files on S3 or wherever you want, just don't make it publicly accessible. Then you can require a user to login to your application in order to access the data.
You could host such a service on AWS using EC2 or Elastic Beanstalk or possibly Lambda. You could also possibly use API Gateway to manage access to the services you build.
Regarding your specific question about a service on AWS that will encrypt your public data and then decrypt it on the fly, there isn't anything that does that out of the box. You would have to build such a service and host it on Amazon, but I don't think that is the right way to go about this at all. Just don't make your data publicly accessible in the first place, and make all requests for data go through some service to verify that the user should be able to access the data. In your case that would mean verifying that the user has paid to access the data they are requesting.
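The "data goes through a service, never directly to the user" pattern described above can be sketched in a few lines. This is a toy in-memory illustration of the access check, not an AWS deployment (the user names, tokens, and dataset are all invented; in production the users would live in a real database and the data in a private S3 bucket):

```python
import secrets

# Hypothetical in-memory stores standing in for a user database
# and private S3 objects.
PAID_USERS = {"alice"}
SESSIONS = {}                                   # session token -> username
RAW_DATA = {"dataset-1": b"proprietary converted data"}

def login(username: str) -> str:
    """Issue an opaque session token after (hypothetical) authentication."""
    token = secrets.token_urlsafe(16)
    SESSIONS[token] = username
    return token

def fetch_dataset(token: str, dataset_id: str) -> bytes:
    """Serve data only to authenticated, paying users; the raw files
    themselves are never publicly reachable."""
    user = SESSIONS.get(token)
    if user is None or user not in PAID_USERS:
        raise PermissionError("not authorized")
    return RAW_DATA[dataset_id]

token = login("alice")
print(fetch_dataset(token, "dataset-1"))        # paying user: data served
try:
    fetch_dataset("forged-token", "dataset-1")  # anyone else: refused
except PermissionError as exc:
    print("denied:", exc)
```

Because the check happens server-side, there is nothing for the client to reverse-engineer; no amount of inspecting JavaScript reveals the raw data, which is the weakness of the client-side "decryption" approach.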