Athena on S3 using secret key - amazon-web-services

I have got an access key and secret key for an external S3 bucket full of .gz files.
I need to analyze this data using Athena.
Is there an option to connect Athena to the S3 bucket using a key and secret key? If so, how do I do it?
I know how to connect to S3 when I have console access, but not when I only have a key and secret key.

This is more complicated than a single answer can cover.
As mentioned already, you cannot access the console with only an access key and secret key. You can use those credentials for programmatic access, for example from your CLI.
What you might want to try is the AWS CLI tool: https://docs.aws.amazon.com/cli/latest/reference/athena/start-query-execution.html
aws athena start-query-execution --query-string "<query>"
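As a sketch, you could store the key pair under a named CLI profile and run the query with it. The profile name, database, table, and result bucket below are all placeholders, and the key values are AWS's documentation examples:

```shell
# Store the external credentials under a dedicated profile
aws configure set aws_access_key_id AKIAIOSFODNN7EXAMPLE --profile external-s3
aws configure set aws_secret_access_key wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY --profile external-s3
aws configure set region us-east-1 --profile external-s3

# Run an Athena query using that profile (placeholder database/table/output)
aws athena start-query-execution \
  --query-string "SELECT * FROM my_db.my_gz_table LIMIT 10" \
  --query-execution-context Database=my_db \
  --result-configuration OutputLocation=s3://my-query-results-bucket/ \
  --profile external-s3
```

One caveat: Athena runs in the account the credentials belong to, so keys scoped only to someone else's bucket typically won't be enough on their own to start a query.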

Related

Possible to access an AWS public dataset using Cyberduck?

Cyberduck version: 7.9.2
Cyberduck is designed to access non-public AWS buckets. It asks for:
Server
Port
Access Key ID
Secret Access Key
The Registry of Open Data on AWS provides this information for an open dataset (using the example at https://registry.opendata.aws/target/):
Resource type: S3 Bucket
Amazon Resource Name (ARN): arn:aws:s3:::gdc-target-phs000218-2-open
AWS Region: us-east-1
AWS CLI Access (No AWS account required): aws s3 ls s3://gdc-target-phs000218-2-open/ --no-sign-request
Is there a version of s3://gdc-target-phs000218-2-open that can be used in Cyberduck to connect to the data?
If the bucket is public, any AWS credentials will suffice. So as long as you can create an AWS account, you only need to create an IAM user for yourself with programmatic access, and you are all set.
No doubt, it's a pain because creating an AWS account needs your credit (or debit) card! But see https://stackoverflow.com/a/44825406/1094109
I tried this with s3://gdc-target-phs000218-2-open and it worked.
For RODA buckets that provide public access only to specific prefixes, you'd need to edit the path to suit, e.g. s3://cellpainting-gallery/cpg0000-jump-pilot/source_4/ (this is a RODA bucket maintained by us, yet to be released fully).
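If you go the IAM-user route described above, the user and its key pair can also be created from the CLI. The user name here is illustrative; the policy ARN is the AWS-managed read-only S3 policy:

```shell
# Create an IAM user with read-only S3 access
aws iam create-user --user-name cyberduck-reader
aws iam attach-user-policy \
  --user-name cyberduck-reader \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess

# Generate the Access Key ID / Secret Access Key pair to paste into Cyberduck
aws iam create-access-key --user-name cyberduck-reader
```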
No, it's explicitly stated in the documentation that
You must obtain the login credentials [in order to connect to Amazon S3 in Cyberduck]

How do I update my access key id from an old one on AWS CLI?

My goal is to access my s3 buckets from the command line using my AWS educate account.
I expected to get a list of my S3 buckets in the command prompt, so I typed this command: aws s3 ls
Instead I received an error message saying Invalid Access Key ID.
The keys shown do not match the key on the home page of my AWS Educate account.
How do I change the keys listed to match the ones on my AWS Educate home page? I think if I correct this, I will be able to access my S3 buckets with the AWS CLI.
Run:
aws configure
And follow the prompts to configure a new Access Key and Secret Access Key.
If this isn't working, there are two other things you can check:
Have you set any of the following environment variables? https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-envvars.html. These can override the ones set using aws configure.
If that fails, check that $HOME/.aws/credentials is not write-protected, or even try updating the credentials file manually.
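As a quick illustration of the override behaviour: credentials exported in the environment take precedence over anything in ~/.aws/credentials, so a stale export can make aws configure appear not to work. The key values below are AWS's documentation placeholders, not real credentials:

```shell
# Environment variables beat the credentials file; if these are set,
# whatever `aws configure` wrote is silently ignored
export AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
export AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

# Unset them to fall back to ~/.aws/credentials
unset AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY
```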

AWS EC2 userdata encryption

We have a use case of taking input, which includes a password, from the user and passing it to an EC2 instance. From within the EC2 instance we hit the URL http://169.254.169.254/latest/user-data/ to get the user data and set the appropriate passwords.
The issue is that the user data is visible via the AWS CLI:
aws ec2 describe-instance-attribute --instance-id <instance-id> --attribute userData --output text --query "UserData.Value" | base64 --decode
This poses a huge security risk.
What's the best way to send sensitive/secret data?
I tried creating a key pair, which creates the private key on the local machine and the public key on the EC2 instance. What would be the right way to encrypt/decrypt using PowerShell and fetch it back on EC2?
The suggested approach would be to store any secrets in an external source.
AWS has a service for storing secrets, Secrets Manager. By using this service you would create a secret containing the secrets that your instance will need to access in its user data. Then give your instance an IAM role with privileges to get the secret value, via the AWS CLI.
Alternatively you could make use of the AWS SSM Parameter Store service, storing the secrets as a SecureString type. This would work similarly to Secrets Manager, with you retrieving the secret via the AWS CLI and then using it in your script.
There are also third party solutions such as Hashicorp Vault that provide similar functionality if you do not want to store your secrets in an AWS solution.
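As a sketch of the user-data side, the boot script could pull the secret at launch instead of embedding it. The secret and parameter names are placeholders, and the instance's IAM role would need secretsmanager:GetSecretValue or ssm:GetParameter permission respectively:

```shell
# Fetch from Secrets Manager (returns the SecretString)
PASSWORD=$(aws secretsmanager get-secret-value \
  --secret-id my-app/db-password \
  --query SecretString --output text)

# Or fetch a SecureString from SSM Parameter Store, decrypted
PASSWORD=$(aws ssm get-parameter \
  --name /my-app/db-password \
  --with-decryption \
  --query Parameter.Value --output text)
```

Either way, nothing sensitive ever appears in the instance's user data, so describe-instance-attribute reveals nothing.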

upload file to s3 from local using AWS CLI without hard-coded credentials(access id and secret access key)

My requirement is to upload a file from local to S3 using the AWS CLI, but I don't want to use an access key ID and secret access key on the command line.
Any suggestions?
It is recommended that you never put AWS credentials in program code.
If the code is running on an Amazon EC2 instance, assign an IAM Role to the instance. The code will automatically detect and use these credentials.
If the code is running on your own computer, run the AWS Command-Line Interface (CLI) aws configure command and enter your IAM credentials (Access Key + Secret Key). They will be stored in the ~/.aws/credentials file and will be automatically accessed by your code.
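With either approach, the upload command itself carries no credentials; the CLI resolves them automatically from the instance role or the credentials file. Bucket and file names below are placeholders:

```shell
# No keys on the command line; the CLI's credential chain
# (instance role or ~/.aws/credentials) supplies them
aws s3 cp ./report.csv s3://my-example-bucket/reports/report.csv
```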

S3 and IAM settings update

We are in a strange situation at the moment. Our DevOps guy left the organization. When we disabled his keys in IAM, we saw this kind of error in production: "An error occurred (AccessDenied) when calling the PutObject operation: Access Denied" when trying to upload an object to our bucket XXXXX-prd-asset-images/. If I check the DevOps guy's IAM user, I can see S3 listed as the last-used service. I understand this is only partial information, but any help would be appreciated.
Can we look at the prod instances to see if AWS keys are stored there?
Can we check any policy?
Can we check bucket information?
The DevOps guy's AWS keys were being used by the AWS CLI.
To avoid this situation in the future, you should create a generic (service) account in AWS IAM that is not tied to any individual developer or system administrator.
For now, create a generic account with the same IAM policies as the DevOps guy's account. SSH to the server and open ~/.aws/credentials; there you will find the AWS key and secret. Replace them with the new key and secret of the account created above.
Or you can run the following and paste the Access Key ID and Secret Access Key when prompted, along with the proper region for your EC2 instance.
$ aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-west-2
Default output format [None]: json
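To answer the three questions directly, you can check the prod instances for stored keys and for an attached role before rotating anything. The metadata endpoint is standard on EC2, and "AKIA" is the prefix IAM access key IDs start with:

```shell
# 1. Look for access keys stored on the instance
grep -R "aws_access_key_id" ~/.aws/ 2>/dev/null

# 2. Check whether the instance has an IAM role attached instead
curl -s http://169.254.169.254/latest/meta-data/iam/security-credentials/

# 3. Check the bucket's policy for explicit principals
aws s3api get-bucket-policy --bucket XXXXX-prd-asset-images
```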