Upload SageMaker Model to S3 without encryption - amazon-web-services

I am training models in SageMaker with its PyTorch and HuggingFace modules.
When the training is done, it successfully uploads the model artifacts to S3.
But the problem is that the code automatically uploads the model with encryption, even though I did not provide any encryption key.
If I go to S3 and check the file details, I find:
Encryption key
AWS Key Management Service key (SSE-KMS)
But I do not want encryption, because when I copy this artifact to another S3 bucket on a different AWS account, it creates a problem and requires extra permissions.
Can anyone say how to save the model to S3 without encryption from SageMaker?
I am using these modules
from sagemaker.huggingface import HuggingFace
from sagemaker.pytorch import PyTorch
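For reference, a minimal sketch of the training setup in question is below. The entry point, role ARN, bucket, and framework versions are placeholders. These estimators inherit an output_kms_key parameter from the base Estimator, which defaults to None; note that even without a customer key, the bucket's own default encryption configuration can still cause server-side encryption to be applied.

from sagemaker.huggingface import HuggingFace

# Hypothetical setup; entry point, role ARN, bucket, and versions are placeholders.
estimator = HuggingFace(
    entry_point="train.py",
    role="arn:aws:iam::111122223333:role/SageMakerRole",
    instance_type="ml.p3.2xlarge",
    instance_count=1,
    transformers_version="4.6",
    pytorch_version="1.7",
    py_version="py36",
    output_path="s3://my-bucket/models",
    output_kms_key=None,  # no customer-managed key is passed here
)
estimator.fit()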

Related

Importing encryption key to Google Cloud Key Management

I want to import an encryption key into GCP "key-management"; currently the key is in my GCP storage. Are there any steps to import it into GCP key-management without affecting operations, if anyone has done this before? Thanks!
I found this video useful.
In the GCP documentation, they recommend that you create a new project to test this feature, to ease clean-up after testing and to ensure that you have adequate IAM permissions to import a key.
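As a rough sketch of what starting an import looks like, based on the google-cloud-kms client library samples; the project, location, and key ring IDs below are placeholders, and the wrapping of the key material is elided:

from google.cloud import kms

client = kms.KeyManagementServiceClient()
key_ring = client.key_ring_path("my-test-project", "us-east1", "my-key-ring")

# Create an import job; its public key is later used to wrap the key material.
import_job = client.create_import_job(
    request={
        "parent": key_ring,
        "import_job_id": "my-import-job",
        "import_job": {
            "import_method": kms.ImportJob.ImportMethod.RSA_OAEP_3072_SHA1_AES_256,
            "protection_level": kms.ProtectionLevel.SOFTWARE,
        },
    }
)
print("Created import job:", import_job.name)
# Next (not shown): wrap your key material with the import job's public key
# and call import_crypto_key_version to bring it into Cloud KMS.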

How to deploy a model using SageMaker which has no model artifacts

I want to deploy a model which has no artifacts (model zip file) using SageMaker and use its endpoint in my application to get the results. Can someone help me with this?
The steps to deploy a model in SageMaker are:
Create a SageMaker Model
Create an endpoint configuration
Create the endpoint
SageMaker needs access to a model.tar.gz file stored in S3 with the model and code in it. You can also deploy a model by training an Estimator and deploying that to an endpoint.
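A sketch of those three steps with boto3 is below; the model name, image URI, role ARN, and model artifact path are placeholders:

import boto3

sm = boto3.client("sagemaker")

# 1. Create a SageMaker Model pointing at the container image and model.tar.gz.
sm.create_model(
    ModelName="my-model",
    PrimaryContainer={
        "Image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-inference-image:latest",
        "ModelDataUrl": "s3://my-bucket/model/model.tar.gz",
    },
    ExecutionRoleArn="arn:aws:iam::123456789012:role/SageMakerRole",
)

# 2. Create an endpoint configuration for that model.
sm.create_endpoint_config(
    EndpointConfigName="my-endpoint-config",
    ProductionVariants=[{
        "VariantName": "AllTraffic",
        "ModelName": "my-model",
        "InstanceType": "ml.m5.large",
        "InitialInstanceCount": 1,
    }],
)

# 3. Create the endpoint itself.
sm.create_endpoint(
    EndpointName="my-endpoint",
    EndpointConfigName="my-endpoint-config",
)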
You cannot deploy a model that has no model artifacts to a SageMaker-hosted endpoint.
https://docs.aws.amazon.com/sagemaker/latest/dg/how-it-works-deployment.html
I work for AWS & my opinions are my own

How do I create object approval from the AWS Management Console or AWS CLI for an S3 object

I am using the boto3 module in Django to upload images and videos to AWS S3, and I am also using the CloudFront CDN.
Users create their accounts and upload images and videos to AWS S3, but I want to put a check in place and implement admin approval for videos and images.
Currently, the images and videos uploaded to AWS S3 via the Django app are public by default.
Is it possible via the AWS Management Console or the AWS CLI to implement admin approval for images and videos?
Please help.
Use some specific prefix (like "unapproved") when the user uploads files.
Create one application (an admin panel) on web/mobile; there you can show a list of the files prefixed with "unapproved".
Now check and approve (one button). After approval, copy the original file, rename it with an "approved" prefix (or simply without a prefix) in S3, and delete the old one, as sketched below.
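For illustration, the approve action could look like this server-side with boto3; the bucket and key names are hypothetical:

import boto3

s3 = boto3.client("s3")

def approve(bucket, key):
    # Move an object from the "unapproved/" prefix to "approved/".
    new_key = key.replace("unapproved/", "approved/", 1)
    s3.copy_object(
        Bucket=bucket,
        Key=new_key,
        CopySource={"Bucket": bucket, "Key": key},
    )
    s3.delete_object(Bucket=bucket, Key=key)
    return new_key

# Example: approve("my-media-bucket", "unapproved/user42/video.mp4")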

Handle encrypted JSON in AWS Glue Job

In our on-premise environment, JSON is generated for loan data and encrypted using a core crypto JAR. The encrypted JSON is saved into MySQL tables, and the same core crypto JAR is called from Java to decrypt the JSON value. Now we have decided to use the Glue service for ETL purposes. Can anyone help me call the core crypto JAR to decrypt the encrypted JSON during Glue execution?
How can we handle the above process in an AWS Glue ETL job?
You may need to use a custom script.
https://docs.aws.amazon.com/glue/latest/dg/console-custom-created.html
You can specify the jars that your script is dependent upon:
Dependent jars path: Comma-separated Amazon S3 paths to JAR files that are required by the script. Note: Currently, only pure Java or Scala (2.11) libraries can be used.
Then create a Glue job as described here:
https://docs.aws.amazon.com/glue/latest/dg/add-job.html
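As a sketch, creating such a job with boto3 could look like the following; the job name, role, script location, and JAR path are placeholders:

import boto3

glue = boto3.client("glue")

glue.create_job(
    Name="decrypt-loan-json",
    Role="arn:aws:iam::123456789012:role/GlueServiceRole",
    Command={
        "Name": "glueetl",
        "ScriptLocation": "s3://my-bucket/scripts/etl_job.py",
    },
    DefaultArguments={
        # Comma-separated S3 paths to the dependent JARs, per the docs above.
        "--extra-jars": "s3://my-bucket/jars/core-crypto.jar",
    },
)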
Your system is no more secure if, at the end of the day, you need to upload your secret key to AWS to decrypt this JSON. You may as well not encrypt the JSON when you save it to the database, and instead configure the database to be encrypted with a customer-managed KMS key.
You'll get much more functionality this way, as you can log KMS key usage as well as restrict which services are able to decrypt the data. If you keep the secret in your JAR file, you will need to have that JAR file wherever you read the data, and you will end up distributing the secret to different places, without the security controls or the auditing that KMS gives you.

How to upload files from React Native to S3 and store the filename via REST API

I have a dilemma about how best to architect my React Native app using Amazon AWS S3 as image/file storage and a Django backend with a REST API.
My React Native app has to be able to collect information from the user, together with a couple of images and signatures. I save all the information as props in Redux, and I can successfully transfer it to the database using the REST API in Django, which I use as the backend system.
I can also send images to an Amazon AWS S3 bucket, but that is a separate operation.
My dilemma is whether it is good practice to send the images to S3 first and then send the filenames in the REST API call together with the other info collected from the user in the app.
That way, I have the files in place on S3, and I can use them in the creation of a PDF file by the Django backend system.
You should use Amazon S3 as the storage backend for Django using S3Boto3Storage. That will make it a single operation as well as give Django access to S3.
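A sketch of the relevant Django settings, assuming the django-storages package is installed; the bucket name and region are placeholders:

# settings.py
INSTALLED_APPS = [
    # ...
    "storages",
]

DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
AWS_STORAGE_BUCKET_NAME = "my-media-bucket"
AWS_S3_REGION_NAME = "us-east-1"

# With this in place, FileField/ImageField uploads go to S3 as part of the
# normal model save, so storing the file and the record become one operation.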
The other option is to mount S3 as a file system on the machine running Django and set the MEDIA path to the mounted location. However, this adds the step of mounting S3 every time the machine starts up.
Check out this link for the first option.