Connecting multiple vCenter Servers and collecting information with PowerCLI - VMware

I have a list of vCenter servers. They are at different locations and belong to different customers. I have created a text file with all the vCenter servers and credentials, like below (I have more than 20 vCenter servers). I need to collect information about VMs, datastores, etc. (for which I already have scripts).
Connect-VIServer vcenter0001 -User vcenter0001\sysdep -Password "Password1"
Connect-VIServer vcenter0002 -User vcenter0002\sysdep -Password "Password2"
I want to connect to each vCenter server and execute my scripts. Please help me. Thanks in advance.

There are a couple of ways to accomplish this. First, you need to make sure that your configuration is set to allow multiple connections. This is done with the following:
Set-PowerCLIConfiguration -DefaultVIServerMode Multiple
Note: It may also be necessary to run the following to enforce the change against all session scopes:
Set-PowerCLIConfiguration -DefaultVIServerMode Multiple -Scope User
Set-PowerCLIConfiguration -DefaultVIServerMode Multiple -Scope Session
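You can confirm the change took effect by checking the current configuration, which lists DefaultVIServerMode per scope:
Get-PowerCLIConfiguration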
Afterwards, you can pass multiple vCenter server names, either as a comma-separated list of strings or as an array, to the Connect-VIServer cmdlet's 'Server' parameter.
Example using strings:
Connect-VIServer -Server vcenter0001,vcenter0002,vcenter0003 -User sysdep -Password "Password"
Example using an array:
$vCenterNames = @('vcenter0001','vcenter0002','vcenter0003')
Connect-VIServer -Server $vCenterNames -User sysdep -Password "Password"
Lastly, since it looks like you may be using local accounts instead of a single domain account, you could look at integrating the VICredentialStore. This saves your credentials in an XML file that will be referenced automatically at time of authentication.
Example Usage:
New-VICredentialStoreItem -Host vcenter0001 -User vcenter0001\sysdep -Password "Password"
New-VICredentialStoreItem -Host vcenter0002 -User vcenter0002\sysdep -Password "Password"
New-VICredentialStoreItem -Host vcenter0003 -User vcenter0003\sysdep -Password "Password"
Connect-VIServer -Server vcenter0001,vcenter0002,vcenter0003
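Once the items are stored, one possible pattern is to read the whole store back and connect to every host in it (this sketch assumes every entry in the store is a vCenter server you want to reach):
# Credentials are looked up automatically from the store for each host
$items = Get-VICredentialStoreItem
Connect-VIServer -Server $items.Host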

Suppose you have a top secret CSV file where you store the connection info (i.e., vCenter server FQDN, logon user name, and password) that looks like this:
viserver, username, password
myfav.cust1.org, cust1usr, cust1pw
my2fav.cust2.net, cust2usr, cust2pw
myleastfav.cust3.com, cust3usr, cust3pw
and it was saved in: c:\mysecretdocs\custviservers.csv
you could use Import-Csv and a foreach statement to do your inventory dirty work with a function that looks something like this:
function Get-VMInventory
{
    # Read the connection info (viserver, username, password) from the CSV
    $viCntInfo = Import-Csv c:\mysecretdocs\custviservers.csv
    foreach ($vi in $viCntInfo)
    {
        # Connect, collect, disconnect - one vCenter at a time
        $convi = Connect-VIServer -Server $vi.viserver -User $vi.username -Password $vi.password
        $vms = Get-VM -Server $convi
        $vms | Select-Object Name, MemoryGB, NumCpu,
            @{ n = "hostname"; e = { $_.guest.hostname } },
            @{ n = "ip"; e = { $_.guest.ipaddress -join ", " } },
            @{ n = "viserver"; e = { $convi.Name } }
        $discvi = Disconnect-VIServer -Server * -Force -Confirm:$false
    }
}
You can run any of the PowerCLI inventory or custom commands there and select whatever output you want; that's just an example using Get-VM. Either dot-source the function or just paste it into your shell, then execute it and put the output in a CSV like this:
Get-VMInventory | Export-Csv c:\mycustomerdata\vminfo.csv
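Since the question also mentions datastores, the same loop pattern covers those as well; a minimal sketch along the same lines (CapacityGB and FreeSpaceGB are standard Get-Datastore properties):
function Get-DatastoreInventory
{
    $viCntInfo = Import-Csv c:\mysecretdocs\custviservers.csv
    foreach ($vi in $viCntInfo)
    {
        $convi = Connect-VIServer -Server $vi.viserver -User $vi.username -Password $vi.password
        Get-Datastore -Server $convi |
            Select-Object Name, CapacityGB, FreeSpaceGB,
                @{ n = "viserver"; e = { $convi.Name } }
        Disconnect-VIServer -Server * -Force -Confirm:$false
    }
}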


PowerCLI Site Recovery Manager: determine which protection group an unprotected VM will be in (SRM uses array-based replication)

I use automation to deploy VMs to various vCenter clusters.
I then configure SRM network mapping to create a network map between the cluster that the VM is in and the cluster used for DR purposes, in the protection group for those two clusters.
SRM is set up for array-based replication, so as long as the VM is placed on replicated storage in the right cluster it will appear in SRM under the protection group; if a network mapping is in place, the VM will be auto-protected by SRM or via my SRM config script.
I currently have the primary cluster, DR cluster, and protection group hard-coded, but I would like to determine the protection group a VM is in and the names of the two clusters the protection group is set up for. That way, any change to the cluster configuration is picked up automatically and doesn't require manual edits to the SRM config script.
I've looked in the SRM API docs, but it's not something I have worked out yet!
I have solved the issue:
$credential = Get-Credential
$server_name = "test-server"
# $primaryDC and $secondaryDC hold the vCenter server names for each site
Connect-VIServer -Server $primaryDC -Credential $credential
$srmConnection = Connect-SrmServer -Credential $credential -RemoteCredential $credential
Connect-VIServer -Server $secondaryDC -Credential $credential
$srmApi = $srmConnection.ExtensionData
$protectionGroups = $srmApi.Protection.ListProtectionGroups()
foreach ($protectionGroup in $protectionGroups) {
    # Find the VM in this group's protected datastores, excluding SRM placeholder VMs
    $associatedVms = $protectionGroup.ListProtectedDatastores() |
        Get-VIObjectByVIView | Get-VM |
        Where-Object { ($_.Name -eq $server_name) -and ($_.ExtensionData.Config.ManagedBy.ExtensionKey -ne 'com.vmware.vcDr') }
    foreach ($vm in $associatedVms) {
        if ($vm.Name -eq $server_name) {
            $protection_group_name = $protectionGroup.GetInfo().Name
            $primary_cluster = Get-VM -Name $server_name | Get-Cluster
            $primary_cluster_res_group = $primary_cluster.ExtensionData.ResourcePool
            # Map the primary cluster's resource pool to its DR counterpart
            $srm_resource_groups = $srmApi.InventoryMapping.GetResourcePoolMappings()
            foreach ($resource_group in $srm_resource_groups) {
                if ($resource_group.PrimaryObject -eq $primary_cluster_res_group) {
                    $secondary_res_group = $resource_group.SecondaryObject
                }
            }
        }
    }
}
$secondary_cluster = Get-Cluster | Where-Object { $_.ExtensionData.ResourcePool -eq $secondary_res_group }
Write-Host "VM: $vm - Protection Group: $protection_group_name - Primary cluster: $primary_cluster - Secondary cluster: $secondary_cluster - Primary ResGrp: $primary_cluster_res_group - Secondary ResGrp: $secondary_res_group"

Is it possible to download files/data during the build pipeline on Azure DevOps?

We would like to hand the development over to another company via Azure DevOps, and we are wondering whether this pipeline can do more than push new releases: can data also be downloaded from the productive environment via an Azure DevOps or AWS DevOps pipeline?
I researched this myself but found nothing about it.
Does any of you have more information on this?
Thank you
Is it possible to download files/data during the build pipeline on Azure DevOps?
In Azure DevOps, there isn't a built-in task to download files/data, but you can use the PowerShell task to connect to an FTP server and download files.
For detailed information, you can refer to this similar question.
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # FTP server information - set variables
      $ftp = "ftp://XXX.com/"
      $user = 'UserName'
      $pass = 'Password'
      $folder = 'FTP_Folder'
      $target = "C:\Folder\Folder1\"

      # Set credentials
      $credentials = New-Object System.Net.NetworkCredential($user, $pass)

      # List the file names in an FTP directory
      function Get-FtpDir ($url, $credentials) {
          $request = [Net.WebRequest]::Create($url)
          $request.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
          if ($credentials) { $request.Credentials = $credentials }
          $response = $request.GetResponse()
          $reader = New-Object IO.StreamReader $response.GetResponseStream()
          while (-not $reader.EndOfStream) {
              $reader.ReadLine()
          }
          $reader.Close()
          $response.Close()
      }

      # Set folder path and list the files
      $folderPath = $ftp + $folder + "/"
      $files = Get-FtpDir -url $folderPath -credentials $credentials
      $files

      # Download each .txt file to the target folder
      $webclient = New-Object System.Net.WebClient
      $webclient.Credentials = New-Object System.Net.NetworkCredential($user, $pass)
      $counter = 0
      foreach ($file in ($files | Where-Object { $_ -like "*.txt" })) {
          $source = $folderPath + $file
          $destination = $target + $file
          $webclient.DownloadFile($source, $destination)
          # Print the counter and file name
          $counter++
          $counter
          $source
      }
Credit: the script comes from PowerShell Connect to FTP server and get files.
You should use artifacts when the data lives inside your own pipeline "environment".
Otherwise, you can use normal command-line tools such as git, curl, or wget; which of these are available depends on your build agent. A minimal sketch of that approach follows.
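For example, a pipeline step that downloads a single file over HTTPS (the URL and output path below are placeholders, not from the question):
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Hypothetical example: fetch a file into the artifact staging directory
      Invoke-WebRequest -Uri "https://example.com/data/export.zip" -OutFile "$(Build.ArtifactStagingDirectory)\export.zip"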

Scheduling a PowerShell script to run weekly in AWS

So, I've got the following PowerShell script to find inactive AD users and disable their accounts, creating a log file containing a list of the accounts that have been disabled:
Import-Module ActiveDirectory

# Set the number of days since last logon
$DaysInactive = 60
$InactiveDate = (Get-Date).AddDays(-($DaysInactive))

# Get AD users that haven't logged on in xx days
$Users = Get-ADUser -Filter { LastLogonDate -lt $InactiveDate -and Enabled -eq $true } -Properties LastLogonDate |
    Select-Object @{ Name="Username"; Expression={$_.SamAccountName} }, Name, LastLogonDate, DistinguishedName

# Export results to CSV
$Users | Export-Csv C:\Temp\InactiveUsers.csv -NoTypeInformation

# Disable inactive users
ForEach ($Item in $Users){
    $DistName = $Item.DistinguishedName
    Disable-ADAccount -Identity $DistName
    Get-ADUser -Filter { DistinguishedName -eq $DistName } | Select-Object @{ Name="Username"; Expression={$_.SamAccountName} }, Name, Enabled
}
The script works and is doing everything it should. What I am trying to figure out is how to automate this in an AWS environment.
I'm guessing I need to use a Lambda function in AWS to trigger this script to run on a schedule but don't know where to start.
Any help greatly appreciated.
I recommend creating a Lambda function with the PowerShell (dotnet) runtime: https://docs.aws.amazon.com/lambda/latest/dg/lambda-powershell.html
Then use a CloudWatch Event on a scheduled basis to trigger the function:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/RunLambdaSchedule.html
As an alternative, if you would like a more pipeline-style execution, you could use CodePipeline and CodeBuild to run the script. Again, use CloudWatch to trigger the CodePipeline on a scheduled basis!
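As a rough sketch of the Lambda route (the function name and region below are assumptions, and making the ActiveDirectory module available to the function is a separate problem not covered here):
# Requires the AWSLambdaPSCore module: Install-Module AWSLambdaPSCore -Scope CurrentUser
# Create a script skeleton, then paste the AD cleanup script into it
New-AWSPowerShellLambda -Template Basic -ScriptName DisableInactiveUsers
# Publish the script as a Lambda function
Publish-AWSPowerShellLambda -ScriptPath .\DisableInactiveUsers\DisableInactiveUsers.ps1 -Name DisableInactiveUsers -Region us-east-1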

Pull server names and IP addresses and see if they're live

I am looking to run a PowerShell script that will pull all servers from AD, show me the FQDN and IP address, and tell me whether each server is pinging.
I have a couple of scripts that each do part of this, but I would like to get something all-in-one and am having a hard time doing it.
Ping a list of servers:
$ServerName = Get-Content "c:\temp\servers.txt"
foreach ($Server in $ServerName) {
    if (Test-Connection -ComputerName $Server -Count 2 -Quiet) {
        "$Server is pinging"
    } else {
        "$Server not pinging"
    }
}
I also have a script to pull all servers from AD which shows server name, FQDN, and OS version:
Import-Module ActiveDirectory
Get-ADComputer -Filter {OperatingSystem -like '*Windows Server*'} -Properties * |
select Name, DNSHostName, OperatingSystem
Any help in getting a script that shows me all servers in my environment with FQDN and IP address, and whether they're live, would be appreciated.
You should only choose the desired properties with the -Properties argument and not pull everything through the network with *. Also, quotes should be used with -Filter, not braces.
You can add a calculated property to Select-Object and get the value from Test-NetConnection:
Get-ADComputer -Filter "OperatingSystem -like '*Windows Server*'" -Properties dnshostname, operatingsystem |
    Select-Object name, dnshostname, operatingsystem,
        @{n="PingSucceeded";e={(Test-NetConnection $_.name).PingSucceeded}}

Execute scripts under a different user context in vRealize Automation 7.2

I am building blueprints in vRealize Automation 7.2 and I need to be able to execute code from a remote location as part of the process. I know I can use encrypted properties to provide the credentials of a user and then execute scripts in a different user context, but is that my only option? I see in vRealize Orchestrator that I can change the credential of the user executing a workflow, but I'm not sure that's my best option either.
I found a way: mapping a drive on the deployed machine to the network location using a PowerShell script.
$driveLetter = "U"
$networkPath = "\\network\share"
$userName = "domain\username"
$password = "password"
$psDrive = Get-PSDrive -Name $driveLetter
if ($psDrive)
{
Remove-PSDrive $psDrive
}
$token = $password | ConvertTo-SecureString -AsPlainText -Force
$credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $userName, $token -ErrorAction Stop
$output = New-PSDrive -Name $driveLetter -Root $networkPath -PSProvider FileSystem -Credential $credentials -Persist -Scope Global
$driveLetter = $output.Name + ":"
I was then able to map the resulting drive letter to the install locations of other steps using vRealize Automation software components, binding each property to this component's property by checking the property's Binding box and setting the Value to the ConnectionStep_1~driveLetter property.
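A minimal sketch of what a later step could then do with the mapped drive (the script path is a made-up example):
# $driveLetter now holds e.g. "U:", so paths on the share resolve directly
& "$driveLetter\scripts\install.ps1"
# Clean up the mapping when the step is done
Remove-PSDrive -Name "U" -Force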