So, I've got the following PowerShell script to find inactive AD users and disable their accounts, creating a log file that lists the accounts that have been disabled:
Import-Module ActiveDirectory
# Set the number of days since last logon
$DaysInactive = 60
$InactiveDate = (Get-Date).Adddays(-($DaysInactive))
# Get AD Users that haven't logged on in xx days
$Users = Get-ADUser -Filter { LastLogonDate -lt $InactiveDate -and Enabled -eq $true } -Properties LastLogonDate |
    Select-Object @{ Name="Username"; Expression={$_.SamAccountName} }, Name, LastLogonDate, DistinguishedName
# Export results to CSV
$Users | Export-Csv C:\Temp\InactiveUsers.csv -NoTypeInformation
# Disable Inactive Users
ForEach ($Item in $Users){
$DistName = $Item.DistinguishedName
Disable-ADAccount -Identity $DistName
Get-ADUser -Filter { DistinguishedName -eq $DistName } | Select-Object @{ Name="Username"; Expression={$_.SamAccountName} }, Name, Enabled
}
The script works and is doing everything it should. What I am trying to figure out is how to automate this in an AWS environment.
I'm guessing I need to use a Lambda function in AWS to trigger this script to run on a schedule but don't know where to start.
Any help greatly appreciated.
I recommend creating a Lambda function with the .NET/PowerShell runtime: https://docs.aws.amazon.com/lambda/latest/dg/lambda-powershell.html
Use a scheduled CloudWatch Events rule to trigger the function:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/RunLambdaSchedule.html
Alternatively, if you'd like a more pipeline-style execution, you could use CodePipeline and CodeBuild to run the script. Again, use CloudWatch Events to trigger the CodePipeline on a schedule!
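For reference, here is a minimal sketch of the Lambda route using the AWSLambdaPSCore module (the DisableInactiveUsers name is a placeholder, and the function will need network access to a domain controller for the ActiveDirectory cmdlets to work):
Install-Module AWSLambdaPSCore -Scope CurrentUser
# Scaffold a script from the basic template, then paste the AD logic into it
New-AWSPowerShellLambda -Template Basic -ScriptName DisableInactiveUsers
# Package the script and deploy it as a Lambda function
Publish-AWSPowerShellLambda -ScriptPath .\DisableInactiveUsers\DisableInactiveUsers.ps1 -Name DisableInactiveUsers -Region us-east-1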
I have spent a long time on this getting nowhere and cannot find an answer on the web.
I am looking for a PowerShell script that will return EC2 instances that do not have a tag called 'Backup' associated with them.
Each Backup tag has a value, but right now I am just looking for instances that lack the tag entirely. Any help would be greatly appreciated.
Thanks
This returns the EC2 Instance objects without a tag named Backup in a specific region:
Import-Module AWSPowerShell

# All instances in the region
$instances = (Get-EC2Instance -Region $region -Credential $cred).Instances

# All 'Backup' tags that are attached to instances
$EC2Tags = Get-EC2Tag -Region $region -Credential $cred |
    Where-Object {$_.Key -eq 'Backup' -and $_.ResourceType -eq 'Instance'}

# Instances whose ID does not appear among the tagged resource IDs
$instances | Where-Object {$_.InstanceId -notin $EC2Tags.ResourceId}
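If you also want to act on the result, here is a hedged follow-up building on the variables above (the 'true' tag value is just an example, not from the original answer):
# Capture the untagged instances and report their IDs
$untagged = $instances | Where-Object {$_.InstanceId -notin $EC2Tags.ResourceId}
$untagged.InstanceId
# Optionally stamp a default Backup tag on them
$tag = New-Object Amazon.EC2.Model.Tag -Property @{ Key = 'Backup'; Value = 'true' }
$untagged | ForEach-Object { New-EC2Tag -Resource $_.InstanceId -Tag $tag -Region $region -Credential $cred }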
I have an S3 bucket with different filenames. I need to download the specific files (filenames that start with 'impression') that were created or modified in the last 24 hours from the S3 bucket to a local folder using PowerShell.
# List objects whose key starts with 'impression' (server-side prefix filter),
# keep only those modified in the last 24 hours, then download each one
$items = Get-S3Object -BucketName $sourceBucket -KeyPrefix 'impression' -ProfileName $profile -Region 'us-east-1' |
    Where-Object { $_.LastModified -gt (Get-Date).AddHours(-24) }

Write-Host "$($items.Count) objects to copy"
$index = 1
$items | ForEach-Object {
    Write-Host "$index/$($items.Count): $($_.Key)"
    $fileName = Join-Path $Folder $_.Key.Replace('/','\')
    Write-Host $fileName
    Read-S3Object -BucketName $sourceBucket -Key $_.Key -File $fileName -ProfileName $profile -Region 'us-east-1' > $null
    $index += 1
}
A workaround might be to turn on access logging: since each access log entry contains a timestamp, you can gather all access logs from the past 24 hours, de-duplicate the repeated S3 objects, then download them all.
You can enable S3 access log in the bucket settings, the logs will be stored in another bucket.
If you end up writing a script for this, just bear in mind that downloading the S3 objects will itself create new access log entries, making the operation irreversible.
If you want something fancier, you could even query the logs and do the de-duplication using AWS Athena.
Does anyone have a PowerShell script that starts and stops aws ec2 instances?
param
(
[string] $Filter = "xxxxx*"
)
$CurrentDate = (Get-Date -Format "yyyyMMdd.0.0")
$instances = Get-EC2Instance -Filter @( @{ Name = 'tag:Name'; Values = $Filter } )
Start-EC2Instance -InstanceId $instances.Instances.InstanceId
Check out this cmdlet by AWS: "Start-EC2Instance Cmdlet", https://docs.aws.amazon.com/powershell/latest/reference/items/Start-EC2Instance.html
And by extension, have a look at the PowerShell Gallery to understand the AWS Tools for PowerShell: https://www.powershellgallery.com/packages/AWSPowerShell.NetCore/4.1.11.0
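Putting the two together, a minimal sketch of a script that starts or stops every instance matching a Name-tag pattern (the $Action parameter is an addition of mine, not part of the snippet above):
param
(
    [string] $Filter = "xxxxx*",
    [ValidateSet('Start','Stop')]
    [string] $Action = 'Start'
)

# Resolve the instance IDs behind the Name-tag filter
$instanceIds = (Get-EC2Instance -Filter @( @{ Name = 'tag:Name'; Values = $Filter } )).Instances.InstanceId

if ($Action -eq 'Start') {
    Start-EC2Instance -InstanceId $instanceIds
}
else {
    Stop-EC2Instance -InstanceId $instanceIds
}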
When running a cmdlet like Get-WKSWorkspaces, it will return a set of properties about your workspaces (e.g. WorkspaceID, Username, SubnetID, BundleID), but not everything you see in the AWS GUI. I am specifically trying to pull things like Running Mode, Compute Type, and Creation Time as well, but can't seem to find where to pull them.
In my research, I got up to the point where I was using $AWSHistory to try and dig deeper into the data returned from my previous cmdlets, but have definitely hit a wall and can't seem to get around it.
I do have a partial command that is giving me most of the output I need:
$region = Get-DefaultAWSRegion
$lastuserconnect = Get-WKSWorkspacesConnectionStatus | Select LastKnownUserConnectionTimestamp
Get-WKSWorkspace -ProfileName ITSLayer1-053082227562-Profile | Select WorkspaceID, UserName, BundleID, DirectoryID,
    @{Name="Region"; Expression={$region.Region}},
    @{Name="LastKnownUserConnect"; Expression={$lastuserconnect.LastKnownUserConnectionTimestamp}}
Update for posterity: Actually got something decent to come out here. It's slow, but it renders in a table format pretty well and includes a bit at the start to select your AWS region.
Suggestions for improvement include:
- Automatically switching the Region select to get all workspaces from the main Regions we use
- Cleaning the lines up so it's easier to read
- Getting the region to automatically append to the filename so it doesn't overwrite your file every time (it's in there but broken at the moment...still pops out a file with 'workspace_properties.csv' as the name)
- Optimizing the script because it's pretty slow
$lastuserconnect = Get-WKSWorkspacesConnectionStatus -ProfileName $profile
$defaultregion = Get-DefaultAWSRegion
$showallregions = Get-AWSRegion
$exportpath = "" + $env:USERPROFILE + "\workspace_properties" + $defaultregion.Region + ".csv"
$showallregions | Format-Table
$setregion = Read-Host -Prompt 'AWS Region'
Clear-DefaultAWSRegion
Set-DefaultAWSRegion $setregion
Get-WKSWorkspace -ProfileName $profile |
    Select WorkspaceID, UserName, BundleID, DirectoryID,
        @{Name="ComputeType"; Expression={$_.WorkspaceProperties.ComputeTypeName}},
        @{Name="RunningMode"; Expression={$_.WorkspaceProperties.RunningMode}},
        @{Name="Region"; Expression={$defaultregion.Region}},
        @{Name="LastKnownUserConnect"; Expression={
            $lastuserconnect = Get-WKSWorkspacesConnectionStatus -ProfileName $profile -WorkspaceId $_.WorkspaceId
            $lastuserconnect.LastKnownUserConnectionTimestamp
        }} |
    Export-Csv $exportpath
Here is an example of fetching those properties you are looking for:
Get-WKSWorkspace | foreach {
$connectionStatus = Get-WKSWorkspacesConnectionStatus -WorkspaceId $_.WorkspaceId;
echo "";
echo "==> About $($_.WorkspaceId)";
echo "Last State Check: $($connectionStatus.ConnectionStateCheckTimestamp)";
echo "User Last Active: $($connectionStatus.LastKnownUserConnectionTimestamp)";
echo "Directory: $($_.DirectoryId)";
echo "Compute: $($_.WorkspaceProperties.ComputeTypeName)";
echo "Running mode $($_.WorkspaceProperties.RunningMode)";
echo "State $($_.State)"
}
I don't see a 'Creation Time' on workspace on the console either.
[edit]
I believe you are looking for a way to export this info; maybe the code below will help:
[System.Collections.ArrayList]$output=@()
Get-WKSWorkspace | foreach {
$connectionStatus = Get-WKSWorkspacesConnectionStatus -WorkspaceId $_.WorkspaceId;
$bunch = [pscustomobject]@{
WorkspaceId = $_.WorkspaceId
LastStateCheck=$connectionStatus.ConnectionStateCheckTimestamp
UserLastActive=$connectionStatus.LastKnownUserConnectionTimestamp
Directory= $_.DirectoryId
Compute=$_.WorkspaceProperties.ComputeTypeName
Runningmode= $_.WorkspaceProperties.RunningMode
State= $_.State
}
$output.Add($bunch)|Out-Null
}
$output | Export-Csv -NoType c:\dd.csv
From looking at the docs, it appears that what you are looking for is in the WorkspaceProperties property, which contains an Amazon.WorkSpaces.Model.WorkspaceProperties object with the following properties:
ComputeTypeName Amazon.WorkSpaces.Compute
RootVolumeSizeGib System.Int32
RunningMode Amazon.WorkSpaces.RunningMode
RunningModeAutoStopTimeoutInMinutes System.Int32
UserVolumeSizeGib System.Int32
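For instance, a quick way to eyeball those nested values on a single workspace (a minimal sketch, assuming default credentials and region are already configured):
$ws = Get-WKSWorkspace | Select-Object -First 1
$ws.WorkspaceProperties.ComputeTypeName
$ws.WorkspaceProperties.RunningMode
$ws.WorkspaceProperties.RootVolumeSizeGib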
Not sure about the CreationTime though...
I am building blueprints in vRealize Automation 7.2 and I need to be able to execute code from a remote location as part of the process. I know I can use encrypted properties to provide the credentials of a user and then execute scripts in a different user context, but is that my only option? I see in vRealize Orchestrator that I can change the credential of the user executing a workflow, but I'm not sure that's my best option either.
I found a way by mapping a drive on the deployed machine to the network location using a PowerShell script.
$driveLetter = "U"
$networkPath = "\\network\share"
$userName = "domain\username"
$password = "password"
$psDrive = Get-PSDrive -Name $driveLetter
if ($psDrive)
{
Remove-PSDrive $psDrive
}
$token = $password | ConvertTo-SecureString -AsPlainText -Force
$credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $userName, $token -ErrorAction Stop
$output = New-PSDrive -Name $driveLetter -Root $networkPath -PSProvider FileSystem -Credential $credentials -Persist -Scope Global
$driveLetter = $output.Name + ":"
I was then able to map the resulting drive letter to other steps' install locations using vRealize Automation software components, binding each one to this component's property by checking the property's Binding box and setting the Value to the ConnectionStep_1~driveLetter property.
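For completeness, a hypothetical example of a later step consuming the mapped drive (the installer path is a placeholder, not from the original setup):
# Run an installer from the freshly mapped share; $driveLetter is e.g. "U:" at this point
$installer = Join-Path $driveLetter "installers\setup.exe"
Start-Process -FilePath $installer -Wait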