Powershell writing to AWS S3 - amazon-web-services

I'm trying to get PowerShell to write results to AWS S3 and I can't figure out the syntax. Below is the line that is giving me trouble. If I run it without everything after the ">>", the results print on the screen.
Write-host "Thumbprint=" $i.Thumbprint " Expiration Date="$i.NotAfter " InstanceID ="$instanceID.Content" Subject="$i.Subject >> Write-S3Object -BucketName arn:aws:s3:::eotss-ssl-certificatemanagement

It looks like the issue is with >>: you can't pass the output of Write-Host into another command that way.
To do that, you need to assign the string you want to a variable and then pass it to the -Content parameter of Write-S3Object.
Take a look at the following code snippet:
Install-Module AWSPowerShell
Import-Module AWSPowerShell
#Set AWS Credential
Set-AWSCredential -AccessKey "AccessKey" -SecretKey "SecretKey"
#File upload
Write-S3Object -BucketName "BucketName" -Key "File upload test" -File "FilePath"
#Content upload
$content = "Thumbprint= $($i.Thumbprint) Expiration Date=$($i.NotAfter) InstanceID = $($instanceID.Content) Subject=$($i.Subject)"
Write-S3Object -BucketName "BucketName" -Key "Content upload test" -Content $content
How to create a new AccessKey and SecretKey: Managing Access Keys for Your AWS Account.
AWSPowerShell module installation.
AWS Tools for PowerShell - S3 documentation.

Related

How do I obtain temporary AWS credentials for an unauthenticated role in PowerShell using a Cognito IdentityPool?

I was writing a PowerShell script that needed to access an AWS S3 bucket using an unauthenticated role via Cognito and had trouble finding much documentation. All of the documentation I was able to find for the AWS PowerShell SDK discussed storing your AccessKey and SecretKey but never how to get those credentials using Cognito when you aren't using a user pool.
There may be other ways to do this with PowerShell (I haven't been able to find them yet), but you can obtain temporary credentials through Cognito using AWS's REST API.
The following PowerShell example shows how to:
Set your REST URL
Get an id from the Cognito Identity provider
Use the received id to request temporary credentials (AccessKey will begin with AS instead of AK)
Set the temporary credentials
For more information see:
AWS API Getting Credentials
AWS API GetCredentialsForIdentity
AWS API GetId
function Get-CognitoRestURL {
    param(
        [parameter(Mandatory)]$Region
    )
    return "https://cognito-identity.{0}.amazonaws.com/" -f $Region
}

function Get-AWSTempCredentials {
    param(
        [parameter(Mandatory)]$IdentityPoolId,
        [parameter(Mandatory)]$Region
    )
    try {
        $cognitoRestURL = Get-CognitoRestURL -Region $Region
        $requestTempId = Invoke-RestMethod -Uri $cognitoRestURL -Method "POST" `
            -Headers @{
                "authority"=$cognitoRestURL
                "x-amz-target"="AWSCognitoIdentityService.GetId"
                "x-amz-user-agent"="aws-powershell callback"
            } -ContentType "application/x-amz-json-1.1" -Body "{`"IdentityPoolId`":`"$($IdentityPoolId)`"}"
    } catch {
        Write-Error $_
        #Request failed, we don't have the data we need to continue
        break
    }
    try {
        $tempCredentials = Invoke-RestMethod -Uri $cognitoRestURL -Method "POST" `
            -Headers @{
                "x-amz-target"="AWSCognitoIdentityService.GetCredentialsForIdentity"
                "x-amz-user-agent"="aws-powershell callback"
            } -ContentType "application/x-amz-json-1.1" -Body "{`"IdentityId`":`"$($requestTempId.IdentityId)`"}"
    } catch {
        Write-Error $_
        #Request failed, we don't have the data we need to continue
        break
    }
    return $tempCredentials
}

function Set-AWSTempCredentials {
    param(
        [parameter(Mandatory)]$AccessKeyId,
        [parameter(Mandatory)]$SecretKey,
        [parameter(Mandatory)]$SessionToken,
        [parameter(Mandatory)]$ProfileName,
        [parameter(Mandatory)]$Region
    )
    Set-AWSCredential -AccessKey $AccessKeyId -SecretKey $SecretKey -SessionToken $SessionToken -StoreAs $ProfileName
    return Get-AWSCredential -ProfileName $ProfileName
}
$region = "us-west-1"
$IdentityPoolId = "us-west-1:12a01023-4567-123a-bcd1-12345a0b1abc"
$response = Get-AWSTempCredentials -IdentityPoolId $IdentityPoolId -Region $region
Set-AWSTempCredentials -AccessKeyId $response.Credentials.AccessKeyId `
-SecretKey $response.Credentials.SecretKey `
-SessionToken $response.Credentials.SessionToken `
-ProfileName MyTempCredentials `
-Region $region
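Once the temporary profile is stored, you can pass it to any AWS cmdlet with -ProfileName (a minimal sketch; the bucket name below is a hypothetical placeholder):
#List objects using the temporary Cognito credentials stored above
Get-S3Object -BucketName "my-example-bucket" -ProfileName MyTempCredentials -Region $region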

Creating a function to assume an AWS STS role in PowerShell $PROFILE

I have AWS credentials defined in my .aws/credentials like follows:
[profile source]
aws_access_key_id=...
aws_secret_access_key=...
[profile target]
role_arn = arn:aws:iam::123412341234:role/rolename
mfa_serial = arn:aws:iam::123412341234:mfa/mylogin
source_profile = source
...
and I would like to define functions in my $PROFILE that assume roles in the said accounts using AWS Tools for PowerShell, because of MFA and the credential lifetime of 1 hour.
The function looks like
function Use-SomeAWS {
    Clear-AWSCredential
    $Response = (Use-STSRole arn:aws:iam::123412341234:role/rolename -ProfileName target -RoleSessionName "my email").Credentials
    $Creds = (New-AWSCredentials -AccessKey $Response.AccessKeyId -SecretKey $Response.SecretAccessKey -SessionToken $Response.SessionToken)
    Set-AWSCredential -Credential $Creds
}
Copying & pasting the lines within the function work just fine, but sourcing the profile (. $PROFILE) and running the function (Use-SomeAWS) asks for the MFA code and seems to do its job, however, the credentials do not get correctly set for the session.
What am I doing wrong?
EDIT: With some further testing, this does work if I add -StoreAs someprofilename to the Set-AWSCredential and after that do Set-AWSCredential -ProfileName someprofilename but that kind of defeats the purpose.
Did you try the -Scope parameter of Set-AWSCredential? Like this:
Set-AWSCredential -Credential $Creds -Scope global
https://docs.aws.amazon.com/powershell/latest/reference/items/Set-AWSCredential.html
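Applied to the function from the question, that would look like the following (a sketch; same role ARN, profile, and session name as above, with only the -Scope argument added):
function Use-SomeAWS {
    Clear-AWSCredential
    $Response = (Use-STSRole arn:aws:iam::123412341234:role/rolename -ProfileName target -RoleSessionName "my email").Credentials
    $Creds = New-AWSCredentials -AccessKey $Response.AccessKeyId -SecretKey $Response.SecretAccessKey -SessionToken $Response.SessionToken
    #Global scope stores the credentials for the whole session, not just the function's own scope
    Set-AWSCredential -Credential $Creds -Scope Global
}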

Download Last 24 hour files from s3 using Powershell

I have an S3 bucket with different filenames. How can I download specific files (filenames that start with "impression") that were created or modified in the last 24 hours from the S3 bucket to a local folder using PowerShell?
$items = Get-S3Object -BucketName $sourceBucket -ProfileName $profile -Region 'us-east-1' | Sort-Object LastModified -Descending | Select-Object -First 1 | select Key
Write-Host "$($items.Length) objects to copy"
$index = 1
$items | % {
    Write-Host "$index/$($items.Length): $($_.Key)"
    $fileName = $Folder + ".\$($_.Key.Replace('/','\'))"
    Write-Host "$fileName"
    Read-S3Object -BucketName $sourceBucket -Key $_.Key -File $fileName -ProfileName $profile -Region 'us-east-1' > $null
    $index += 1
}
A workaround might be to turn on access logging; since the access logs contain timestamps, you can collect all access logs from the past 24 hours, de-duplicate the repeated S3 objects, and then download them all.
You can enable S3 access logging in the bucket settings; the logs will be stored in another bucket.
If you end up writing a script for this, just bear in mind that downloading the S3 objects will itself generate new access logs, making the operation irreversible.
If you want something fancier, you can even query the logs and deduplicate them using AWS Athena.
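Alternatively, if access logs are more than you need, a more direct route is to filter the Get-S3Object listing by key prefix and LastModified before downloading (a minimal sketch, assuming the AWSPowerShell module and the same $sourceBucket, $profile, and $Folder variables as the snippet above):
#LastModified is reported in UTC, so compare against a UTC cutoff
$cutoff = (Get-Date).ToUniversalTime().AddHours(-24)
Get-S3Object -BucketName $sourceBucket -KeyPrefix "impression" -ProfileName $profile -Region 'us-east-1' |
    Where-Object { $_.LastModified -gt $cutoff } |
    ForEach-Object {
        $fileName = Join-Path $Folder ($_.Key.Replace('/', '\'))
        Read-S3Object -BucketName $sourceBucket -Key $_.Key -File $fileName -ProfileName $profile -Region 'us-east-1' | Out-Null
    }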

aws s3 sync missing to create root folders

I am archiving some folders to S3
Example: C:\UserProfile\E21126\data ....
I expect to end up with a folder structure in S3 like UserProfiles\E21126.
The problem is that it creates the folders that are under \E21126, but misses creating the root folder \E21126 itself.
Folds1.txt contains these folders to sync:
G:\UserProfiles\E21126
G:\UserProfiles\E47341
G:\UserProfiles\C68115
G:\UserProfiles\C30654
G:\UserProfiles\C52860
G:\UserProfiles\E47341
G:\UserProfiles\C68115
G:\UserProfiles\C30654
G:\UserProfiles\C52860
my code below:
ForEach ($Folder in (Get-content "F:\scripts\Folds1.txt")) {
aws s3 sync $Folder s3://css-lvdae1cxfs003-archive/Archive-Profiles/ --acl bucket-owner-full-control --storage-class STANDARD
}
aws s3 sync uploads the contents of each folder, but the source folder name itself is excluded from the key. If you want the UserProfiles part to appear in the S3 bucket, then you need to include it in the destination, i.e. specify it in the key name when uploading:
aws s3 sync $Folder s3://css-lvdae1cxfs003-archive/Archive-Profiles/UserProfiles --acl bucket-owner-full-control --storage-class STANDARD
And if your folders live under a different name than the UserProfiles string, you can get the parent path and then fetch its leaf to extract that name from the string:
PS C:\> Split-Path -Path "G:\UserProfiles\E21126"
G:\UserProfiles
PS C:\> Split-Path -Path "G:\UserProfiles" -Leaf -Resolve
UserProfiles
If you were to modify the text file to contain:
E21126
E47341
C68115
Then you could use the command:
ForEach ($Folder in (Get-content "F:\scripts\Folds1.txt")) {
aws s3 sync G:\UserProfiles\$Folder s3://css-lvdae1cxfs003-archive/Archive-Profiles/$Folder/ --acl bucket-owner-full-control --storage-class STANDARD
}
Note that the folder name is included in the destination path.
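If you would rather keep the full paths in Folds1.txt, you can derive each folder's name with Split-Path -Leaf and append it to the destination key (a sketch building on the Split-Path examples above):
ForEach ($Folder in (Get-Content "F:\scripts\Folds1.txt")) {
    #e.g. "G:\UserProfiles\E21126" -> "E21126"
    $Name = Split-Path -Path $Folder -Leaf
    aws s3 sync $Folder s3://css-lvdae1cxfs003-archive/Archive-Profiles/$Name/ --acl bucket-owner-full-control --storage-class STANDARD
}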

Powershell - gzip large file and load to s3 using stream

I'm trying to compress some csv files using gzip and then upload them to S3. I need to use streams to compress and load because the files could be very large and I don't want to write the file back to disk before loading it to s3. I'm new to using streams in Powershell and I'm struggling to figure out the issue.
This is what I have so far but I can't get it to work. It loads a very small gzip file that shows my original file inside but I can't extract it - I get an "Unexpected end of data" error. I believe it's not finalizing the gzip stream or something like that. If I remove the "gzip" commands and just write out the inputFileStream to S3 then it works to load the uncompressed file, so I know the S3 load using a stream works.
Also, I'm using CopyTo, which I believe will bring the whole file into memory, which I don't want either (let me know if I'm not correct in that thinking).
$sourcePath = "c:\temp\myfile.csv"
$bucketName = "mybucket"
$s3Key = "staging/compress_test/"
$fileInfo = Get-Item -Path $sourcePath
$destPath = "$s3Key$($fileInfo.Name).gz"
$outputMemoryStream = New-Object System.IO.MemoryStream
$gzipStream = New-Object System.IO.Compression.GZipStream $outputMemoryStream, ([IO.Compression.CompressionMode]::Compress)
$inputFileStream = New-Object System.IO.FileStream $sourcePath, ([IO.FileMode]::Open), ([IO.FileAccess]::Read), ([IO.FileShare]::Read)
$inputFileStream.CopyTo($gzipStream)
Write-S3Object -BucketName $bucketName -Key $destPath -Stream $outputMemoryStream -ProfileName Dev -Region us-east-1
$inputFileStream.Close()
$outputMemoryStream.Close()
UPDATE: Thanks @FoxDeploy. I got it at least loading the file now. I needed to close the gzip stream before writing to S3 so that the gzip is finalized. But, as I suspected, CopyTo causes the full file to be compressed and stored in memory, and only then is it loaded to S3. I would like it to stream to S3 as it's compressing, to reduce the memory load, if that's possible.
Here's the current working code:
$sourcePath = "c:\temp\myfile.csv"
$bucketName = "mybucket"
$s3Key = "staging/compress_test/"
$fileInfo = Get-Item -Path $sourcePath
$destPath = "$s3Key$($fileInfo.Name).gz"
$outputMemoryStream = New-Object System.IO.MemoryStream
$gzipStream = New-Object System.IO.Compression.GZipStream $outputMemoryStream, ([IO.Compression.CompressionMode]::Compress), $true  #$true = leaveOpen, so the MemoryStream stays usable after the gzip stream is closed
$inputFileStream = New-Object System.IO.FileStream $sourcePath, ([IO.FileMode]::Open), ([IO.FileAccess]::Read), ([IO.FileShare]::Read)
$inputFileStream.CopyTo($gzipStream)
$gzipStream.Close()
Write-S3Object -BucketName $bucketName -Key $destPath -Stream $outputMemoryStream -ProfileName Dev -Region us-east-1
$inputFileStream.Close()
$outputMemoryStream.Close()
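If holding the entire compressed file in a MemoryStream ever becomes a problem, one fallback (at the cost of the temporary disk write this question was hoping to avoid) is to compress through a temp file and upload it with -File; memory use then stays at the size of the CopyTo buffer. A rough sketch:
$tempPath = [System.IO.Path]::GetTempFileName()
$outputFileStream = New-Object System.IO.FileStream $tempPath, ([IO.FileMode]::Create), ([IO.FileAccess]::Write)
$gzipStream = New-Object System.IO.Compression.GZipStream $outputFileStream, ([IO.Compression.CompressionMode]::Compress)
$inputFileStream = New-Object System.IO.FileStream $sourcePath, ([IO.FileMode]::Open), ([IO.FileAccess]::Read), ([IO.FileShare]::Read)
$inputFileStream.CopyTo($gzipStream)  #copies in buffered chunks rather than loading the whole file at once
$gzipStream.Close()                   #finalizes the gzip footer and closes the underlying file stream
$inputFileStream.Close()
Write-S3Object -BucketName $bucketName -Key $destPath -File $tempPath -ProfileName Dev -Region us-east-1
Remove-Item $tempPath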