How can I secure my AWS S3 bucket to be write-only - amazon-web-services

I have an S3 bucket where I will upload PDF files from my front end. The front end will have a 'careers' page where anybody can apply for a work position and upload their CV. I am using the AWS SDK for that in my Node.js API. But the CVs on the S3 bucket need to be private, of course. The problem is that the SDK upload code only works if the S3 bucket is made public.
For the file upload I am using multer.
const upload = multer({
  fileFilter: fileFilter,
  storage: multerS3({
    acl: 'public-read',
    s3,
    bucket: 'bucket_name',
    key: function(req, file, cb) {
      req.file = file.originalname;
      cb(null, file.originalname);
    }
  })
});
How can I make the uploaded CV secure on my S3 bucket? So everybody can upload files to it, but it should stay private and no one should get READ access to it.

Use the SDK to generate a pre-signed URL. Here is the JS example for this task.
https://github.com/awsdocs/aws-doc-sdk-examples/blob/master/javascriptv3/example_code/s3/src/s3_put_presignedURL.ts
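For the upload in the question, a minimal sketch with the AWS SDK for JavaScript v3 could look like this (the bucket name, region and key prefix are placeholders; the object is uploaded without any ACL, so it stays private):
const { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");
const { getSignedUrl } = require("@aws-sdk/s3-request-presigner");

const s3 = new S3Client({ region: "eu-central-1" }); // placeholder region

async function getUploadUrl(fileName) {
  const command = new PutObjectCommand({
    Bucket: "bucket_name",  // placeholder bucket
    Key: `cv/${fileName}`,  // no ACL, so the object stays private
  });
  // The front end PUTs the PDF directly to this URL; it expires after 5 minutes
  return getSignedUrl(s3, command, { expiresIn: 300 });
}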

Related

Clean way to sync public s3 bucket to my private s3 bucket

Binance made its data public through an s3 endpoint. The website is 'https://data.binance.vision/?prefix=data/'. Their bucket URL is 'https://s3-ap-northeast-1.amazonaws.com/data.binance.vision'. I want to download all the files in their bucket to my own s3 bucket. I can:
crawl this website and download the CSV files.
make a URL builder that builds all the URLs and downloads the CSV files using those URLs.
Since their data is stored on S3, I wonder if there is a cleaner way to sync their bucket to my bucket.
Is the third way really doable?
If you want to copy it to your own s3 bucket, you can do:
aws s3 sync s3://data.binance.vision s3://your-bucket-name --no-sign-request
If you want to copy it to your own computer into your current folder (.) you can do:
aws s3 sync s3://data.binance.vision . --no-sign-request

What bucket permissions are required to download zip file directly from URL?

I am following a tutorial here and if I take this s3 URL from the tutorial, https://s3.eu-central-1.amazonaws.com/deepset.ai-farm-qa/datasets/documents/wiki_gameofthrones_txt.zip, I am able to directly download the zip file to local.
When I substitute my own zip file URL, I get an error that BadZipFile: File is not a zip file, and if I try my zip file URL directly, I get permission denied instead of being able to download.
I also confirmed the zip files are formatted correctly using the terminal: unzip -t zipfile.zip
What permissions do I need to change in s3 or on the s3 object to allow download of zip file directly from URL?
Still very new to IAM and S3 permissions; the current permissions are the standard ones from when the bucket was created.
Objects in Amazon S3 are private by default. This means that they cannot be accessed by an anonymous URL (like you have shown).
If you want a specific object to be publicly available (meaning that anyone with the URL can access it), then use the Make Public option in the S3 management console. This can also be configured at the time that the object is uploaded by specifying ACL=public-read.
If you want a whole bucket, or particular paths within a bucket, to be public, then you can create a Bucket Policy that grants access to the bucket. This requires S3 Block Public Access to be disabled.
You can also generate an Amazon S3 pre-signed URL, which provides time-limited access to a private object. The pre-signed URL has additional information added that grants permission to access the private object. This is how web applications provide access to private objects to authorized users, such as photo websites.
If an object is accessed via an AWS API call or the AWS Command-Line Interface (CLI), then AWS credentials are used to identify the user. If the user has permission to access the object, then they can download it. This method uses an API call rather than a URL.
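As a rough sketch of that last approach with the AWS SDK for JavaScript v3 (bucket and key below are placeholders), the request is signed with the caller's credentials, so it works on a private object as long as the IAM user or role has s3:GetObject:
const { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");

const s3 = new S3Client({ region: "eu-central-1" }); // placeholder region

async function downloadPrivateObject() {
  const response = await s3.send(new GetObjectCommand({
    Bucket: "DOC-EXAMPLE-BUCKET",        // placeholder bucket
    Key: "datasets/documents/file.zip",  // placeholder key
  }));
  return response.Body; // a readable stream in Node.js
}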
Two solutions:
Make your bucket/file public with a bucket policy like the one below (not recommended):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicRead",
      "Effect": "Allow",
      "Principal": "*",
      "Action": ["s3:GetObject", "s3:GetObjectVersion"],
      "Resource": ["arn:aws:s3:::DOC-EXAMPLE-BUCKET/*"]
    }
  ]
}
Use pre-signed URLs with the SDK:
var AWS = require('aws-sdk'); // AWS SDK for JavaScript v2
var s3 = new AWS.S3();
var params = {Bucket: bucketname, Key: keyfile, Expires: 3600, ResponseContentDisposition: `attachment; filename="filename.ext"`};
var url = s3.getSignedUrl('getObject', params);

Amazon Transcribe and Golang SDK BadRequestException

I uploaded a .flac file to an Amazon S3 bucket, but when I try to transcribe the audio using the Amazon Transcribe Golang SDK I get the error below. I tried making the .flac file in the S3 bucket public but still get the same error, so I don't think it's a permission issue. Is there anything that prevents the Transcribe service from accessing the file in the S3 bucket that I'm missing? The API user that is uploading and transcribing has full access to the S3 and Transcribe services.
example Go code:
jobInput := transcribe.StartTranscriptionJobInput{
    JobExecutionSettings: &transcribe.JobExecutionSettings{
        AllowDeferredExecution: aws.Bool(true),
        DataAccessRoleArn:      aws.String("my-arn"),
    },
    LanguageCode: aws.String("en-US"),
    Media: &transcribe.Media{
        MediaFileUri: aws.String("https://s3.us-east-1.amazonaws.com/{MyBucket}/{MyObjectKey}"),
    },
    Settings: &transcribe.Settings{
        MaxAlternatives:   aws.Int64(2),
        MaxSpeakerLabels:  aws.Int64(2),
        ShowAlternatives:  aws.Bool(true),
        ShowSpeakerLabels: aws.Bool(true),
    },
    TranscriptionJobName: aws.String("jobName"),
}
Amazon Transcribe response:
BadRequestException: The S3 URI that you provided can't be accessed. Make sure that you have read permission and try your request again.
My issue was that the audio file being uploaded to S3 was specifying an ACL. I removed that from the S3 upload code and I no longer get the error. Also, per the docs, if you have "transcribe" in your S3 bucket name, the Transcribe service will have permission to access it. I made that change as well, but you still need to ensure you aren't setting an ACL.

Upload a folder to an S3 bucket using java aws sdk

I'm trying to upload an entire folder to an AWS S3 bucket. I'm successful at uploading individual files, but couldn't find a way to upload the entire folder.
The AWS CLI has a command called sync; is an equivalent method available in the Java SDK?
Sample code for uploading individual objects is below (only a part):
final PutObjectRequest dashboardUploadRequest = PutObjectRequest.builder()
    .bucket(aws.getApplihealthReportBucket())
    .key(projectName + "/" + executionConfig.getTestType() + "/" + aws.getS3Folder() + "/index.html")
    .build();
S3Client s3 = S3Client.builder().credentialsProvider(credentialsProvider).build();
s3.putObject(dashboardUploadRequest, Paths.get(resultsDirectory + "/extent.html"));
There is no way to upload a folder directly to S3. You can only iterate over the files and upload them one by one.
Or use the CLI:
https://docs.aws.amazon.com/cli/latest/reference/s3/sync.html
The TransferManager API in the Java SDK provides methods to upload a directory to S3.

AWS S3 file URL generation for user to download the file

How can we get the URL for Amazon S3 bucket files? I have tried to get the file with the format below:
http://s3-REGION-.amazonaws.com/BUCKET-NAME/KEY
This format works for downloading the file if it has public access and server-side encryption is disabled.
The purpose of the URL generation is to share it with internal teams in my organization. This file might contain exceptions from any of our applications.
I have to make the file or the bucket restricted to my organization (not public). Whatever bucket I use has server-side encryption enabled. How can we generate the file URL when server-side encryption is enabled?
You can generate a presigned URL for an S3 object: https://docs.aws.amazon.com/cli/latest/reference/s3/presign.html
Presigned URLs can be generated programmatically as well with all AWS SDKs.
For example in Java: https://docs.aws.amazon.com/AmazonS3/latest/dev/ShareObjectPreSignedURLJavaSDK.html
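As a rough JavaScript (SDK v3) equivalent, assuming placeholder bucket and key names; pre-signed URLs also work for objects encrypted with SSE-S3, because S3 decrypts them server-side on download:
const { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");
const { getSignedUrl } = require("@aws-sdk/s3-request-presigner");

const s3 = new S3Client({ region: "us-east-1" }); // placeholder region

async function getDownloadUrl() {
  const command = new GetObjectCommand({
    Bucket: "BUCKET-NAME", // placeholder
    Key: "KEY",            // placeholder
  });
  // Anyone with this URL can download the object until it expires (1 hour here)
  return getSignedUrl(s3, command, { expiresIn: 3600 });
}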
If you are using the .NET AWS SDK to generate a pre-signed URL:
var s3Client = new AmazonS3Client();
var request1 = new GetPreSignedUrlRequest
{
    BucketName = "bucketName",
    Key = "filename", // the original object key, not URL-encoded
    Expires = DateTime.Now.AddMinutes(5)
};
var urlString = s3Client.GetPreSignedURL(request1);