Not able to generate an RDS IAM auth token using the AWS C++ SDK

We need to generate an AWS RDS IAM auth token using the C++ SDK.
The Java SDK equivalent is documented here: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/UsingWithRDS.IAMDBAuth.Connecting.Java.html#UsingWithRDS.IAMDBAuth.Connecting.Java.AuthToken
Is there a way to do the same with the C++ SDK?
The following code just prints the connection string (replace the parameters with actual values):
Aws::RDS::RDSClient* p = new Aws::RDS::RDSClient();
std::cout << p->GenerateConnectAuthToken("host", "region", 3306, "user");
Not sure if these are related:
https://github.com/aws/aws-sdk-cpp/issues/868
https://github.com/aws/aws-sdk-cpp/issues/864
Any help will be appreciated.
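In case it helps, below is a minimal, untested sketch of the same call made after explicitly initializing the SDK and handing the client credentials; the guess (not confirmed anywhere above) is that the token portion is only appended when the client can actually resolve credentials. The host, region, port, user and key values are placeholders.
// Minimal sketch (untested): initialize the SDK and give the RDS client explicit
// credentials before calling GenerateConnectAuthToken. All values are placeholders.
#include <aws/core/Aws.h>
#include <aws/core/auth/AWSCredentials.h>
#include <aws/core/client/ClientConfiguration.h>
#include <aws/rds/RDSClient.h>
#include <iostream>

int main()
{
    Aws::SDKOptions options;
    Aws::InitAPI(options);                    // must be called before using any client
    {
        Aws::Client::ClientConfiguration config;
        config.region = "us-east-1";          // placeholder region

        // Placeholder credentials; in practice they would come from the default
        // provider chain, an instance profile, or an assumed role.
        Aws::Auth::AWSCredentials creds("ACCESS_KEY_ID", "SECRET_ACCESS_KEY");

        Aws::RDS::RDSClient client(creds, config);
        Aws::String token = client.GenerateConnectAuthToken(
            "mydb.example.us-east-1.rds.amazonaws.com",   // placeholder host
            "us-east-1",                                   // placeholder region
            3306,
            "db_user");                                    // placeholder IAM DB user

        std::cout << token << std::endl;
    }
    Aws::ShutdownAPI(options);
    return 0;
}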

Related

Assuming role in AWS is causing a credentials error

I am trying to use the Glue Schema Registry service in AWS with Scala (Java would also be useful) and I have tested two ways to assume a role, but both result in an error:
"Unable to load credentials from any of the providers in the chain AwsCredentialsProviderChain(credentialsProviders=[SystemPropertyCredentialsProvider(), EnvironmentVariableCredentialsProvider(), WebIdentityTokenCredentialsProvider(), ProfileCredentialsProvider(), ContainerCredentialsProvider(), InstanceProfileCredentialsProvider()])"
I don't want to use environment variables, so I tried using STS to assume a role with the following code:
val assumeRoleRequest = AssumeRoleRequest.builder.roleSessionName(UUID.randomUUID.toString).roleArn("roleArn").build
val stsClient = StsClient.builder.region(Region.EU_CENTRAL_1).build
val stsAssumeRoleCredentialsProvider = StsAssumeRoleCredentialsProvider.builder.stsClient(stsClient).refreshRequest(assumeRoleRequest).build
val glueClient = GlueClient
  .builder()
  .region(Region.EU_CENTRAL_1)
  .credentialsProvider(stsAssumeRoleCredentialsProvider)
  .build()
Based on https://stackoverflow.com/a/62930761/17221117
The second way I tried follows the official AWS documentation, but it also fails... I don't understand whether this is supposed to generate a token that I then use, or whether just executing this code should work on its own.
Can anyone help me with this?

AWS .NET API - The provided token has expired

I am facing a weird scenario. I generate my AWS AccessKeyId, SecretAccessKey and SessionToken by running the assume-role-with-saml command. After copying these values to the .aws\credentials file, I can run "aws s3 ls" and see all the S3 buckets. Similarly, I can run any other AWS CLI command to view objects and it works perfectly fine.
However, when I write a .NET Core application to list objects, it doesn't work on my computer. The same .NET application works fine on my colleagues' computers. We all have access to AWS through the same role, and there are no users in the IAM console.
Here is the sample code, though I doubt the code itself is the problem, because it works fine on other users' computers.
var _ssmClient = new AmazonSimpleSystemsManagementClient();
var r = _ssmClient.GetParameterAsync(new Amazon.SimpleSystemsManagement.Model.GetParameterRequest
{
    Name = "/KEY1/KEY2",
    WithDecryption = true
}).ConfigureAwait(false).GetAwaiter().GetResult();
Any idea why commands run through the CLI work but API calls don't? Don't they both look at the same %USERPROFILE%\.aws\credentials file?
I found it. Posting here since it can be useful for someone having the same issue.
Go to this folder: %USERPROFILE%\AppData\Local\AWSToolkit
Take a backup of all the files and folders there, then delete everything from that location.
This solution applies only if you can run commands like "aws s3 ls" and get results successfully, but you get the error "The provided token has expired" when doing the same through the .NET API libraries.

How to get the value of aws iam list-account-aliases as a variable?

I want to write a Python program to check which account I am in by using the account alias (I have multiple tenants set up on AWS). I think aws iam list-account-aliases returns exactly what I am looking for, but it is a command-line result and I am not sure what the best way is to capture it as a variable in a Python program.
I was also reading about aws iam list-account-aliases, and its output section mentions AccountAliases -> (list) (https://docs.aws.amazon.com/cli/latest/reference/iam/list-account-aliases.html).
I wonder what this AccountAliases is: an option? A command? A variable? I was a little confused here.
Thank you!
Use Boto3 to get the account alias in Python. Your link points to the AWS CLI.
Here is the link for equivalent Boto3 command for python:
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/iam.html#IAM.Client.list_account_aliases
Sample code:
import boto3
client = boto3.client('iam')
response = client.list_account_aliases()
Response:
{
    'AccountAliases': [
        'myawesomeaccount',
    ],
    'IsTruncated': True,
}
An account alias on AWS is a human-readable alias created for an AWS account ID. You can find more info here:
https://docs.aws.amazon.com/IAM/latest/UserGuide/console_account-alias.html
AWS provides a very well documented library for Python called Boto3, which can be used to connect to your AWS account as a client, resource, or session (more information on these in the SO answer here: Difference in boto3 between resource, client, and session?).
For your use case you can connect to AWS as a client using the 'iam' service name:
import boto3
client = boto3.client(
    'iam',
    region_name="region_name",
    aws_access_key_id="your_aws_access_key_id",
    aws_secret_access_key="your_aws_secret_access_key")
account_aliases = client.list_account_aliases()
The response is a dictionary which can be traversed to get the desired information on aliases.

Installing AWS key to use S3

I ran aws configure and entered my key and secret key. I have checked that my account exists by running:
aws iam list-account-aliases
and my alias appeared.
However, when I try to upload a file to AWS, I am receiving this error:
/Users/kchen/campaiyn-web/node_modules/skipper-s3/node_modules/knox/lib/client.js:197
if (!options.key) throw new Error('aws "key" required');
^
Error: aws "key" required
at new Client (/Users/kchen/campaiyn-web/node_modules/skipper-s3/node_modules/knox/lib/client.js:197:27)
at Function.exports.createClient (/Users/kchen/campaiyn-web/node_modules/skipper-s3/node_modules/knox/lib/client.js:925:10)
at Writable.onFile (/Users/kchen/campaiyn-web/node_modules/skipper-s3/index.js:248:22)
at doWrite (_stream_writable.js:292:12)
at writeOrBuffer (_stream_writable.js:278:5)
at Writable.write (_stream_writable.js:207:11)
at Transform.ondata (_stream_readable.js:528:20)
at emitOne (events.js:77:13)
at Transform.emit (events.js:169:7)
at readableAddChunk (_stream_readable.js:146:16)
at Transform.Readable.push (_stream_readable.js:110:10)
at Transform.push (_stream_transform.js:128:32)
at /Users/kchen/campaiyn-web/node_modules/sails/node_modules/skipper/standalone/Upstream/build-renamer-stream.js:49:19
at Object.opts.saveAs (/Users/kchen/campaiyn-web/node_modules/sails/node_modules/skipper/standalone/Upstream/prototype.upload.js:71:7)
at determineBasename (/Users/kchen/campaiyn-web/node_modules/sails/node_modules/skipper/standalone/Upstream/build-renamer-stream.js:32:17)
at Transform.__renamer__._transform (/Users/kchen/campaiyn-web/node_modules/sails/node_modules/skipper/standalone/Upstream/build-renamer-stream.js:40:7)
I feel that my key was not installed correctly, or am I looking at this the wrong way?
The library you are using (skipper-s3, via knox) does not know how to pick up the credentials that aws configure stores; it expects them to be passed in explicitly.
I would recommend either passing the credentials in explicitly or switching to the official Node.js AWS SDK, which definitely supports the shared credentials file.

Allow a 3rd-party app to upload a file to AWS S3

I need a way to allow a 3rd-party app to upload a .txt file (350 KB and slowly growing) to an S3 bucket in AWS. I'm hoping for a solution involving an endpoint they can PUT to, with some authorization key or the like in the header. The bucket can't be public to everyone.
I've read this: http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPUT.html
and this: http://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html
but can't quite seem to find the solution I'm seeking.
I'd suggest using a combination of AWS API Gateway, a Lambda function, and finally S3.
Your clients will call the API Gateway endpoint.
The endpoint will invoke an AWS Lambda function that will then write the file out to S3.
Only the Lambda function will need rights to the bucket, so the bucket can remain non-public and protected.
If you already have an EC2 instance running, you could replace the Lambda piece with custom code running on that instance, but using Lambda gives you a 'serverless' solution that scales automatically and has no minimum monthly cost.
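For illustration only, here is a rough sketch of what the Lambda piece could look like. To stay in the same language as the C++ SDK question at the top of this page it uses the aws-lambda-cpp custom runtime, although a managed runtime (Node.js, Python, etc.) is the more common choice; the bucket name, the object key, and writing the raw invocation payload straight to S3 are assumptions made for the sketch, not something this answer prescribes.
// Hypothetical Lambda handler for the API Gateway -> Lambda -> S3 flow, using the
// aws-lambda-cpp custom runtime and the AWS SDK for C++. Bucket and key are
// placeholders; a real handler would parse the API Gateway proxy event JSON in
// request.payload and extract its "body" field before writing to S3.
#include <aws/core/Aws.h>
#include <aws/core/utils/memory/stl/AWSStringStream.h>
#include <aws/s3/S3Client.h>
#include <aws/s3/model/PutObjectRequest.h>
#include <aws/lambda-runtime/runtime.h>

using namespace aws::lambda_runtime;

static invocation_response handler(invocation_request const& request)
{
    // The default client picks up the Lambda execution role's credentials and the
    // function's region, so only that role needs rights to the bucket.
    Aws::S3::S3Client s3;

    Aws::S3::Model::PutObjectRequest put;
    put.SetBucket("my-upload-bucket");      // placeholder bucket
    put.SetKey("uploads/incoming.txt");     // placeholder object key
    put.SetContentType("text/plain");

    // Write the raw invocation payload; parsing the proxy event is omitted here.
    auto body = Aws::MakeShared<Aws::StringStream>("upload");
    *body << request.payload;
    put.SetBody(body);

    auto outcome = s3.PutObject(put);
    if (!outcome.IsSuccess())
    {
        return invocation_response::failure(outcome.GetError().GetMessage().c_str(), "S3Error");
    }
    return invocation_response::success("{\"status\":\"uploaded\"}", "application/json");
}

int main()
{
    Aws::SDKOptions options;
    Aws::InitAPI(options);
    run_handler(handler);
    Aws::ShutdownAPI(options);
    return 0;
}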
I ended up using the AWS SDK. It's available for Java, .NET, PHP, and Ruby, so there's a very high probability the 3rd-party app is using one of those. See here: http://docs.aws.amazon.com/AmazonS3/latest/dev/UploadObjSingleOpNET.html
In that case, it's just a matter of them using the SDK to upload the file. I wrote a sample version in .NET running on my local machine. First, install the AWSSDK NuGet package. Then here is the code (taken from an AWS sample):
C#:
var bucketName = "my-bucket";
var keyName = "what-you-want-the-name-of-S3-object-to-be";
var filePath = "C:\\Users\\scott\\Desktop\\test_upload.txt";
var client = new AmazonS3Client(Amazon.RegionEndpoint.USWest2);
try
{
    PutObjectRequest putRequest2 = new PutObjectRequest
    {
        BucketName = bucketName,
        Key = keyName,
        FilePath = filePath,
        ContentType = "text/plain"
    };
    putRequest2.Metadata.Add("x-amz-meta-title", "someTitle");
    PutObjectResponse response2 = client.PutObject(putRequest2);
}
catch (AmazonS3Exception amazonS3Exception)
{
    if (amazonS3Exception.ErrorCode != null &&
        (amazonS3Exception.ErrorCode.Equals("InvalidAccessKeyId") ||
         amazonS3Exception.ErrorCode.Equals("InvalidSecurity")))
    {
        Console.WriteLine("Check the provided AWS Credentials.");
        Console.WriteLine("For service sign up go to http://aws.amazon.com/s3");
    }
    else
    {
        Console.WriteLine(
            "Error occurred. Message:'{0}' when writing an object",
            amazonS3Exception.Message);
    }
}
Web.config:
<add key="AWSAccessKey" value="your-access-key"/>
<add key="AWSSecretKey" value="your-secret-key"/>
You get the access key and secret key by creating a new IAM user in your AWS account. When you do so, AWS generates them for you and provides them for download. You can then attach the AmazonS3FullAccess policy to that user and the document will be uploaded to S3.
NOTE: this was a POC. In the actual 3rd-party app using this, they won't want to hardcode the credentials in the web.config for security reasons. See here: http://docs.aws.amazon.com/AWSSdkDocsNET/latest/V2/DeveloperGuide/net-dg-config-creds.html