I am trying to set up a PowerShell script to automate the transfer of a directory to an S3 bucket. I have been following the instructions at http://todhilton.com/technicalwriting/upload-backup-your-files-to-amazon-s3-with-powershell/ but when I run it I get the following error.
Unable to find type [Amazon.AWSClientFactory].
At line:18 char:9
+ $client=[Amazon.AWSClientFactory]::CreateAmazonS3Client($accessKeyID, ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (Amazon.AWSClientFactory:TypeName) [], RuntimeException
+ FullyQualifiedErrorId : TypeNotFound
The code I have is pasted below. If someone has some insight, that would be awesome. :)
# Constants
$sourceDrive = "C:\"
$sourceFolder = "Users\Administrator\AppData\Roaming\folder"
$sourcePath = $sourceDrive + $sourceFolder
$s3Bucket = "bucket"
$s3Folder = "Archive"
# Constants – Amazon S3 Credentials
$accessKeyID="KEY"
$secretAccessKey="Secret"
# Constants – Amazon S3 Configuration
$config=New-Object Amazon.S3.AmazonS3Config
$config.RegionEndpoint=[Amazon.RegionEndpoint]::"ap-southeast-2"
$config.ServiceURL = "https://s3-ap-southeast-2.amazonaws.com/"
# Instantiate the AmazonS3Client object
$client=[Amazon.AWSClientFactory]::CreateAmazonS3Client($accessKeyID,$secretAccessKey,$config)
# FUNCTION – Iterate through subfolders and upload files to S3
function RecurseFolders([string]$path) {
  $fc = New-Object -com Scripting.FileSystemObject
  $folder = $fc.GetFolder($path)
  foreach ($i in $folder.SubFolders) {
    $thisFolder = $i.Path
    # Transform the local directory path to notation compatible with S3 buckets and folders
    # 1. Trim the drive letter and colon from the start of the path
    $s3Path = $thisFolder.ToString()
    $s3Path = $s3Path.SubString(2)
    # 2. Replace back-slashes with forward-slashes
    # Escape the back-slash special character with a back-slash so that it reads it literally, like so: "\\"
    $s3Path = $s3Path -replace "\\", "/"
    $s3Path = "/" + $s3Folder + $s3Path
    # Upload directory to S3
    Write-S3Object -BucketName $s3Bucket -Folder $thisFolder -KeyPrefix $s3Path
  }
  # If subfolders exist in the current folder, iterate through them too
  foreach ($i in $folder.SubFolders) {
    RecurseFolders($i.Path)
  }
}
# Upload root directory files to S3
$s3Path = "/" + $s3Folder + "/" + $sourceFolder
Write-S3Object -BucketName $s3Bucket -Folder $sourcePath -KeyPrefix $s3Path
# Upload subdirectories to S3
RecurseFolders($sourcePath)
Please check below:
Amazon.AWSClientFactory does not exist any more: AWSClientFactory was removed in newer versions of the AWS SDK for .NET, which is why the type cannot be found.
I used the script below instead:
# Bucket region details
$RegionEndpoint = 'us-east-1'
#$ServiceURL = 'https://s3-us-east-1.amazonaws.com/'
#Credentials initialized
$credsCSV = Get-ChildItem "E:\myAPIUser_credentials.csv"
$credsContent = Import-Csv $credsCSV.FullName
$accessKeyID = $credsContent.'Access key ID'
$secretAccessKey = $credsContent.'Secret access key'
Initialize-AWSDefaults -Region $RegionEndpoint -AccessKey $accessKeyID -SecretKey $secretAccessKey
$sourceFolder = "E:\Code\powershell\PSmy\git\AWSPowerShell"
$targetFolder = Get-Date -Format "dd-MMM-yyyy"
$s3Bucket = Get-S3Bucket -BucketName 'bucket'   # target bucket; 'bucket' is a placeholder name
Write-S3Object -BucketName $s3Bucket.BucketName -Folder $sourceFolder -Recurse -KeyPrefix "$targetFolder/"
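For completeness, if you would rather build the client object yourself (as the original script attempted with AWSClientFactory), newer SDK versions let you construct the client directly. A minimal sketch, assuming the AWSPowerShell module is loaded so the SDK types are available:

```powershell
# Construct the S3 client without the removed AWSClientFactory
$config = New-Object Amazon.S3.AmazonS3Config
$config.RegionEndpoint = [Amazon.RegionEndpoint]::APSoutheast2   # ap-southeast-2
$client = New-Object Amazon.S3.AmazonS3Client($accessKeyID, $secretAccessKey, $config)
```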
I want to experiment with the Amazon Selling Partner API. From Postman everything works fine; from PowerShell it does not.
I tried to derive my PowerShell script from the documentation https://docs.aws.amazon.com/general/latest/gr/sigv4_signing.html and the Python example.
I have been trying all day, but I cannot find the solution.
What I have done:
Tested the functionality from Postman -> works
Read about signing AWS requests with Signature Version 4 -> OK
Verified the signing-key derivation with the test input from https://docs.aws.amazon.com/general/latest/gr/signature-v4-examples.html -> works
Tried to translate the Python example to PowerShell: https://docs.aws.amazon.com/general/latest/gr/sigv4-signed-request-examples.html -> does not work
Used Google to find a solution from examples and found this: http://www.laurierhodes.info/?q=node/114 -> does not work, maybe because it is written for AWS Lambda
Whatever I do, I receive the error message Sender SignatureDoesNotMatch.
I attached a screenshot from Postman; it shows the response from this endpoint. I know it says "Access denied", but the request is kept as close as possible to the Amazon example. Later on I will need to make an STS request to receive temporary credentials on each call, and I will also need to sign every call to the Amazon Selling Partner API.
Maybe someone can help me, please? I'm going nuts. :D
Here is my PowerShell code based on the Python example: https://docs.aws.amazon.com/general/latest/gr/sigv4-signed-request-examples.html
I tried both GET variants:
Using GET with an authorization header (Python)
Using GET with authentication information in the Query string (Python)
The code below is from the second variant, "Using GET with authentication information in the Query string".
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
[cultureinfo]::CurrentCulture = 'de-DE'
#powershell variant with example test input from #https://docs.aws.amazon.com/general/latest/gr/signature-v4-examples.html
<#
.Synopsis
HMACSHA256 signing function used in the construction of a Signature Version 4 request
#>
# translated function from Deriving a signing key using .NET (C#)
function hmacSHA256(
[string]$data,
[byte[]]$key)
{
$hmacsha = New-Object System.Security.Cryptography.HMACSHA256
$hmacsha.key = $key
$signature = $hmacsha.ComputeHash([Text.Encoding]::UTF8.GetBytes($data))
return $signature
}
<#
.Synopsis
The AWS Signature Version 4 signing-key derivation routine
#>
function getSignatureKey(
[String]$AWSAccessKey,
[String]$dateStamp,
[String]$regionName,
[String]$serviceName)
{
[Byte[]]$kSecret = [System.Text.Encoding]::UTF8.GetBytes("AWS4" + $AWSAccessKey)
$kDate = hmacSHA256 -data $dateStamp -key $kSecret
$kRegion = hmacSHA256 -data $regionName -key $kDate
$kService = hmacSHA256 -data $serviceName -key $kRegion
$kSigningKey = hmacSHA256 -data "aws4_request" -key $kService
return $kSigningKey
}
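As a quick sanity check, the derivation above can be exercised with the test input from the signature-v4-examples page cited earlier (the example secret key, date, region, and service are taken from that page, which also lists the expected signing key to compare against):

```powershell
# Test input from https://docs.aws.amazon.com/general/latest/gr/signature-v4-examples.html
$testKey = getSignatureKey -AWSAccessKey 'wJalrXUtnFEMI/K7MDENG+bPxRfiCYEXAMPLEKEY' `
                           -dateStamp '20120215' -regionName 'us-east-1' -serviceName 'iam'
# Print the derived key as lowercase hex for comparison with the documented value
[System.BitConverter]::ToString($testKey).Replace('-','').ToLower()
```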
<#
.Synopsis
Computes the SHA-256 hash of a string, as required by AWS Signature Version 4
#>
function hash($request) {
$sha256 = new-object -TypeName System.Security.Cryptography.SHA256Managed
$utf8 = new-object -TypeName System.Text.UTF8Encoding
$hash = [System.BitConverter]::ToString($sha256.ComputeHash($utf8.GetBytes($request)))
return $hash.replace('-','').toLower()
}
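For example, hashing the empty payload used later for GET requests should yield the well-known SHA-256 digest of the empty string:

```powershell
hash ''
# e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
```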
# ************* REQUEST VALUES *************
$method = 'GET'
$service = 'iam'
$host1 = 'iam.amazonaws.com'
$region = 'us-east-1'
$endpoint = 'https://iam.amazonaws.com'
$access_key = 'AKIA4KA7FVL7SN2EXAMPLE'
$secret_key = 'EXPLY59hS5KWKAnfOSnWLjNsiKaK/EXAMPLE'
$now = [DateTime]::UtcNow
$amz_date = $now.ToString('yyyyMMddTHHmmssZ')
$datestamp = $now.ToString('yyyyMMdd')
# ************* TASK 1: CREATE A CANONICAL REQUEST *************
# http://docs.aws.amazon.com/general/latest/gr/sigv4-create-canonical-request.html
# Step 2: Create canonical URI--the part of the URI from domain to query
# string (use '/' if no path)
$canonical_uri = '/'
# Step 3: Create the canonical headers and signed headers. Header names
# must be trimmed and lowercase, and sorted in code point order from
# low to high. Note trailing \n in canonical_headers.
# signed_headers is the list of headers that are being included
# as part of the signing process. For requests that use query strings,
# only "host" is included in the signed headers.
$canonical_headers = "host:" + $host1 + "`n"
$signed_headers = "host"
# Match the algorithm to the hashing algorithm you use, either SHA-1 or
# SHA-256 (recommended)
$algorithm = 'AWS4-HMAC-SHA256'
$credential_scope = $datestamp + '/' + $region + '/' + $service + '/' + 'aws4_request'
# Step 4: Create the canonical query string. In this example, request
# parameters are in the query string. Query string values must
# be URL-encoded (space=%20). The parameters must be sorted by name.
$canonical_querystring = "Action=CreateUser&UserName=NewUser&Version=2010-05-08"
# Note: the query string must not contain stray whitespace or newlines
$canonical_querystring += "&X-Amz-Algorithm=AWS4-HMAC-SHA256"
$canonical_querystring += "&X-Amz-Credential=" + [uri]::EscapeDataString(($access_key + "/" + $credential_scope))
$canonical_querystring += "&X-Amz-Date=" + $amz_date
$canonical_querystring += "&X-Amz-Expires=30"
$canonical_querystring += "&X-Amz-SignedHeaders=" + $signed_headers
# Step 5: Create payload hash. For GET requests, the payload is an
# empty string ("").
$payload_hash = hash ''
# Step 6: Combine elements to create canonical request
$canonical_request = $method + "`n" +
$canonical_uri + "`n" +
$canonical_querystring + "`n" +
$canonical_headers + "`n" +
$signed_headers + "`n" +
$payload_hash
$canonical_request_hash = hash -request $canonical_request
#write-host $canonical_request_hash
# *************************************************************#
# ************* TASK 2: CREATE THE STRING TO SIGN *************#
$string_to_sign = $algorithm + "`n" +
$amz_date + "`n" +
$credential_scope + "`n" +
$canonical_request_hash
# *************************************************************#
# ************* TASK 3: CALCULATE THE SIGNATURE ***************#
# Create the signing key
# Use named parameters: getSignatureKey($a,$b,$c,$d) would pass a single array as the first argument
$signing_key = getSignatureKey -AWSAccessKey $secret_key -dateStamp $datestamp -regionName $region -serviceName $service
#write-host "signing-key: $($signing_key)"
# Sign the string_to_sign using the signing_key
$signature = HmacSHA256 -data $string_to_sign -key $signing_key
$signature = [System.BitConverter]::ToString($signature).Replace('-','').ToLower()
# ************* TASK 4: ADD SIGNING INFORMATION TO THE REQUEST *************
# The auth information can be either in a query string
# value or in a header named Authorization. This code shows how to put
# everything into a query string.
$canonical_querystring += '&X-Amz-Signature=' + $signature
# ************* SEND THE REQUEST *************
# The 'host' header is added automatically by Invoke-RestMethod, but it
# must exist as a header in the request.
###
# I am not sure how to handle headers here, because I used the GET variant
# with the authentication information in the query string
####
# With query-string authentication the signature already travels in
# X-Amz-Signature, so no Authorization header should be sent.
$request_url = $endpoint + "/?" + $canonical_querystring
Invoke-RestMethod $request_url -Method 'GET'
Best regards and many many thanks
Patrick
Postman screenshot
I'm trying to put together a simple PowerShell script to delete files from an AWS S3 bucket. I'm unable to split the filename correctly from items in the loop.
How is the filename correctly selected?
Example record names:
2021-06-07 16:08:15 1876349 20210502210533.csv
2021-06-07 16:44:53 1858461 210502210533.csv
2021-06-07 16:18:39 276597534 20210424203918.csv
Example script:
$all_files=aws s3 ls s3://bucket.host.com/folder1/folder2/folder3/folder4/ --profile dev
foreach($file in $all_files)
{
aws s3 rm s3://bucket.host.com/folder1/folder2/folder3/folder4/$file.Split(' ')[6] --profile dev
}
Results:
delete: s3 ls s3://bucket.host.com/folder1/folder2/folder3/folder4/2021-06-07 16:44:53 1858461 20210502210533.csv.Split
delete: s3 ls s3://bucket.host.com/folder1/folder2/folder3/folder4/2021-06-07 16:18:39 276597534 20210424203918.csv.Split
delete: s3 ls s3://bucket.host.com/folder1/folder2/folder3/folder4/2021-06-07 15:50:41 276597534 20210424204122.csv.Split
Notice how the entire record is present (no split occurring). How do I reliably split this?
EDIT 1:
$all_files=aws s3 ls s3://bucket.host.com/folder1/folder2/folder3/folder4/ --profile dev
PS C:\Users\me> $all_files
2021-06-08 02:50:37 4637885036 20210425202931.csv
2021-06-08 02:53:23 4753217891 20210426204043P.csv
2021-06-08 02:59:10 4838159267 20210426204346.csv
2021-06-08 02:58:07 4871146830 20210426204407.csv
2021-06-08 03:00:24 4641073848 20210427203146.csv
2021-06-08 02:52:29 4633473584 20210427203836.csv
2021-06-08 02:57:55 4633473584 20210427204657.csv
2021-06-08 02:56:25 4633473584 20210428203618.csv
2021-06-08 02:53:30 4633473584 20210429204253.csv
PS C:\Users\me> ForEach($file in $all_files){$command = 'aws s3 rm s3://bucket.host.com/folder1/folder2/folder3/folder4/' + $file.Split(' ')[-1] + ' --profile dev' & $command}
At line:1 char:177
+ ... folder3/folder4/' + $file.Split(' ')[-1] + ' --profile vis_dev' & $comman ...
+ ~
Unexpected token '&' in expression or statement.
+ CategoryInfo : ParserError: (:) [], ParentContainsErrorRecordException
+ FullyQualifiedErrorId : UnexpectedToken
If I leave the & $command off the end, then run Invoke-Expression $command as a second expression, the last file is deleted.
How is the & $command run as part of the loop?
Try this:
$All_Files = @(
'2021-06-07 16:08:15 1876349 20210502210533.csv'
'2021-06-07 16:44:53 1858461 210502210533.csv'
'2021-06-07 16:18:39 276597534 20210424203918.csv'
)
ForEach($File in $All_Files)
{
'aws s3 rm s3://bucket.host.com/folder1/folder2/folder3/folder4/' + $file.Split(' ')[-1]
}
If you just use normal string concatenation, this should output what you're looking for.
If you are going to split using the string .Split() method, your reliance on index [6] is problematic; [-1] will simply give you the last element of the resulting array.
If you are looking to execute the resulting string as a command, assign it to a variable, include the other required arguments, and then run it with Invoke-Expression; the call operator (&) invokes a single command name, so it cannot execute a whole command line stored in one string.
ForEach($File in $All_Files)
{
    $command = 'aws s3 rm s3://bucket.host.com/folder1/folder2/folder3/folder4/' + $File.Split(' ')[-1] + ' --profile dev'
    Invoke-Expression $command
}
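Since aws is just an external command, you can also skip the string building entirely and call it directly. A sketch with the same hypothetical bucket path and profile as above, using -split '\s+' so that runs of spaces between the columns don't produce empty elements:

```powershell
ForEach ($file in $all_files) {
    # The last whitespace-separated column of an "aws s3 ls" line is the object key
    $key = ($file -split '\s+')[-1]
    aws s3 rm "s3://bucket.host.com/folder1/folder2/folder3/folder4/$key" --profile dev
}
```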
I have an S3 bucket with various file names. I need to download specific files (names that start with "impression") that were created or modified in the last 24 hours from the S3 bucket to a local folder using PowerShell.
$items = Get-S3Object -BucketName $sourceBucket -ProfileName $profile -Region 'us-east-1' |
    Sort-Object LastModified -Descending |
    Select-Object -First 1 |
    Select-Object Key
Write-Host "$($items.Length) objects to copy"
$index = 1
$items | % {
    Write-Host "$index/$($items.Length): $($_.Key)"
    $fileName = $Folder + ".\$($_.Key.Replace('/','\'))"
    Write-Host "$fileName"
    Read-S3Object -BucketName $sourceBucket -Key $_.Key -File $fileName -ProfileName $profile -Region 'us-east-1' > $null
    $index += 1
}
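The snippet above only grabs the single most recent object. To match the stated requirement (keys starting with "impression", modified in the last 24 hours), a sketch along these lines should work, assuming the same $sourceBucket, $profile, and $Folder variables as above; Get-S3Object supports a -KeyPrefix parameter and returns objects with a LastModified property:

```powershell
$cutoff = (Get-Date).ToUniversalTime().AddHours(-24)
Get-S3Object -BucketName $sourceBucket -KeyPrefix 'impression' -ProfileName $profile -Region 'us-east-1' |
    Where-Object { $_.LastModified -gt $cutoff } |
    ForEach-Object {
        # Mirror the S3 key into the local folder, converting '/' to '\'
        $fileName = Join-Path $Folder ($_.Key -replace '/', '\')
        Read-S3Object -BucketName $sourceBucket -Key $_.Key -File $fileName -ProfileName $profile -Region 'us-east-1' > $null
    }
```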
A workaround might be to turn on access logging: since the access log contains timestamps, you can collect all access logs from the past 24 hours, de-duplicate the repeated S3 objects, and then download them all.
You can enable S3 access logging in the bucket settings; the logs will be stored in another bucket.
If you end up writing a script for this, bear in mind that downloading the S3 objects will itself create new access log entries, making the operation irreversible.
If you want something fancier, you could even query the logs and de-duplicate them using AWS Athena.
We would like to hand development over to another company via Azure DevOps, and we are asking ourselves whether the pipeline can do more than push new releases: could data also be downloaded from the productive environment via an Azure DevOps or AWS DevOps pipeline?
I researched this myself but found nothing about it.
Does any of you have more information on this?
Thank you
Is it possible to download files/data during the build pipeline on Azure DevOps?
In Azure DevOps there isn't a built-in task to download files/data, but you can use the PowerShell task to connect to an FTP server and download files.
For detailed information, you can refer to this similar question.
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      #FTP Server Information - SET VARIABLES
      $ftp = "ftp://XXX.com/"
      $user = 'UserName'
      $pass = 'Password'
      $folder = 'FTP_Folder'
      $target = "C:\Folder\Folder1\"
      #SET CREDENTIALS
      $credentials = New-Object System.Net.NetworkCredential($user, $pass)
      function Get-FtpDir ($url, $credentials) {
          $request = [Net.WebRequest]::Create($url)
          $request.Method = [System.Net.WebRequestMethods+FTP]::ListDirectory
          if ($credentials) { $request.Credentials = $credentials }
          $response = $request.GetResponse()
          $reader = New-Object IO.StreamReader $response.GetResponseStream()
          while (-not $reader.EndOfStream) {
              $reader.ReadLine()
          }
          #$reader.ReadToEnd()
          $reader.Close()
          $response.Close()
      }
      #SET FOLDER PATH
      $folderPath = $ftp + "/" + $folder + "/"
      $files = Get-FtpDir -url $folderPath -credentials $credentials
      $files
      $webclient = New-Object System.Net.WebClient
      $webclient.Credentials = New-Object System.Net.NetworkCredential($user, $pass)
      $counter = 0
      foreach ($file in ($files | where {$_ -like "*.txt"})) {
          $source = $folderPath + $file
          $destination = $target + $file
          $webclient.DownloadFile($source, $destination)
          #PRINT FILE NAME AND COUNTER
          $counter++
          $counter
          $source
      }
The script above comes from: PowerShell Connect to FTP server and get files.
You should use artifacts when the data lives inside your own environment.
Otherwise you can use normal command-line tools like git, curl, or wget; which ones are available depends on your build agent.
I'm trying to get PowerShell to write results to AWS S3, and I can't figure out the syntax. Below is the line that is giving me trouble. If I run it without everything after the ">>", the results print on the screen.
Write-host "Thumbprint=" $i.Thumbprint " Expiration Date="$i.NotAfter " InstanceID ="$instanceID.Content" Subject="$i.Subject >> Write-S3Object -BucketName arn:aws:s3:::eotss-ssl-certificatemanagement
It looks like you have an issue with ">>": you can't pass the Write-Host output into another command that way, because Write-Host writes to the console rather than to the pipeline.
To do what you want, assign the string to a variable and then pass it to the -Content parameter.
Take a look at the following code snippet:
Install-Module AWSPowerShell
Import-Module AWSPowerShell
#Set AWS Credential
Set-AWSCredential -AccessKey "AccessKey" -SecretKey "SecretKey"
#File upload
Write-S3Object -BucketName "BucketName" -Key "File upload test" -File "FilePath"
#Content upload
$content = "Thumbprint= $($i.Thumbprint) Expiration Date=$($i.NotAfter) InstanceID = $($instanceID.Content) Subject=$($i.Subject)"
Write-S3Object -BucketName "BucketName" -Key "Content upload test" -Content $content
How to create new AccessKey and SecretKey - Managing Access Keys for Your AWS Account.
AWSPowerShell Module installation.
AWS Tools for PowerShell - S3 Documentation.