Copy-Item : The process cannot access the file - web-services

I have a Jenkins job which does the following:
Stop WebService
Delete WebService
Copy items from Jenkins workspace to server path
Create WebService
Start WebService
Below is my PowerShell script:
Get-ChildItem "C:\Location\*"
$service = Get-Service -Name value -Computername $env:SERVER -ErrorAction SilentlyContinue
sc.exe \\$env:SERVER stop value
Write-Host "value STOPPED"
sc.exe \\$env:SERVER delete val
Write-Host "val DELETED"
Copy-Item "C:\Location\*" "\\$env:SERVER\d$\Location" -Force -Recurse
sc.exe \\$env:SERVER create val start= auto DisplayName= "val" binPath= "D:\Location.exe"
sc.exe \\$env:SERVER description value "value"
sc.exe \\$env:SERVER start value
Write-Host "value STARTED"
if ($error) { exit 1 }
Error logs:
Copy-Item : The process cannot access the file '\\Location' because it is being used by another process.
At C:\Users\Administrator\AppData\Local\Temp\hudson2059984936352103941.ps1:18 char:5
+ Copy-Item "C:\Location\*" " ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Copy-Item], IOException
+ FullyQualifiedErrorId : System.IO.IOException,Microsoft.PowerShell.Commands.CopyItemCommand
[SC] CreateService FAILED 1072:
The specified service has been marked for deletion.
[SC] ChangeServiceConfig2 FAILED 1072:
The specified service has been marked for deletion.
[SC] StartService FAILED 1058:
The service cannot be started, either because it is disabled or because it has no
enabled devices associated with it.
Can you please help me out with this error? Do I need to restart the deploy server so that my process gets killed? If so, I feel this is not a real solution and cannot be done on prod servers.

The problem is that another process is using the DLL. Since it is in use, you cannot remove it. You need to make sure that no process is using your DLL before you can remove it. To do this, find out which process is using it and why, and make sure it gets closed. If the problem occurs again next time, you will need to add closing that process to your script.
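A minimal sketch of how you could locate the offending process from the deploy script (the D:\Location path is a placeholder taken from your script, and this assumes PowerShell remoting is enabled on the server):

Invoke-Command -ComputerName $env:SERVER -ScriptBlock {
    # List processes that have loaded a module from the deployment folder
    Get-Process | Where-Object {
        try { $_.Modules.FileName -like 'D:\Location\*' } catch { $false }
    } | Select-Object Id, ProcessName
}

Once you know which process it is, you can decide whether it is safe to stop it before the Copy-Item step, or whether the service is simply not fully stopped yet.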


How to download aws cli and use in the same powershell script

I have the following PowerShell script that runs via the Packer utility while creating an AWS AMI image.
This script downloads and installs the AWS CLI and then immediately tries to use it. The installation process updates the Windows PATH environment variable, but I think the change is not available immediately in the same script, so I set the location to where the CLI is installed before using it.
$SETUP_DIR = "C:\Setup"
New-Item -Path $SETUP_DIR -ItemType Directory -Force -ErrorAction SilentlyContinue
Clear-Host
Set-Location -Path $SETUP_DIR -PassThru
# install AWS CLI
msiexec.exe /i https://awscli.amazonaws.com/AWSCLIV2.msi /quiet
Write-Host "AWS CLI installation completed."
# Set location to AWS CLI
Set-Location -Path "C:\Program Files\Amazon\AWSCLIV2" -PassThru
# Download utility from AWS S3
aws s3 cp s3://tools/utility.exe
Write-Host "Utility download completed."
# switch the location back to c:\setup
Set-Location -Path $SETUP_DIR -PassThru
However, when the script is executed, it throws the error The term 'aws' is not recognized:
aws s3 cp s3://tools/utility.exe ...
==> amazon-ebs.windows_server: + ~~~
==> amazon-ebs.windows_server: + CategoryInfo : ObjectNotFound: (aws:String) [], CommandNotFoundException
==> amazon-ebs.windows_server: + FullyQualifiedErrorId : CommandNotFoundException
==> amazon-ebs.windows_server:
amazon-ebs.windows_server:
==> amazon-ebs.windows_server: aws : The term 'aws' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the
==> amazon-ebs.windows_server: spelling of the name, or if a path was included, verify that the path is correct and try again.
When you run aws s3 ..., the system tries to locate aws in the PATH it knows about. The fact that you changed the directory to the location of the aws binary does not do anything for you.
As you probably know, even when you are in the same directory as a binary, simply invoking it by name won't work if it is not in the path; that's why you would use .\binary_name.exe. That's a safety feature in PowerShell.
You could try updating the path as suggested in the comments by adding that location:
$env:path="$env:path;C:\Program Files\Amazon\AWSCLIV2"
or point to the binary directly. You could also create an alias like:
set-alias -name aws -value "c:\program files\Amazon\awscliv2\aws.cmd"
Then use aws and it should run aws.cmd (check that folder to make sure you are calling the correct binary; it used to be aws.cmd for AWS CLI v1 but may have changed).
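Another option is to wait for the MSI to finish and extend the PATH of the current session before calling aws; a minimal sketch, assuming the default AWS CLI v2 install location:

# Run the installer synchronously so aws.exe exists before we try to use it
Start-Process msiexec.exe -ArgumentList '/i', 'https://awscli.amazonaws.com/AWSCLIV2.msi', '/quiet' -Wait
# Make the new install location visible to this session only
$env:Path += ';C:\Program Files\Amazon\AWSCLIV2'
aws --version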

AWS AppStream 2.0 home folder "device not ready"

We are using on-demand instances to serve our applications in desktop view on AppStream 2.0. When we launch our application, the first thing its script does is try to ensure that a directory exists under the "D:\PhotonUser\My Files\Home Folder" folder.
We are experiencing an issue with the "device not ready" exception and occasionally "access denied". We have found that everything works if we add a 30-second delay at the start of our script (before it checks and creates the folder if missing).
Does anyone know if the delay in the home folder readiness is to be expected, or does anyone know of any nice ways to poll for readiness in PowerShell?
Thanks for taking the time to look
There is a registry key you can check to see if the Home Folder has mounted.
$regHive = "HKLM:\SOFTWARE\Amazon\AppStream\Storage\$Env:AppStream_UserName"
function Get-HomeFolderMountStatus {
    Get-ChildItem -Path $regHive |
        Where-Object { $_.Name.EndsWith('HomeFolder') } |
        Get-ItemPropertyValue -Name MountStatus
}
Write-Output "Mount status: $(Get-HomeFolderMountStatus)"
# status values can be found at
# https://docs.aws.amazon.com/appstream2/latest/developerguide/use-session-scripts.html#use-storage-connectors-with-session-scripts
while ("$(Get-HomeFolderMountStatus)" -ne '2') {
Start-Sleep -Seconds 3
Write-Output "Mount status: $(Get-HomeFolderMountStatus)"
}
# now that the folder has mounted, continue with your script
It does take too long to mount the Home Folder. I'm seeing it take over 30 seconds, which eats up far too much of the 60-second script allotment in my opinion.
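Because the session-script window is so short, it may be worth bounding the poll; a small sketch (the 45-second cap is an arbitrary choice, not an AWS recommendation):

$deadline = (Get-Date).AddSeconds(45)
while ("$(Get-HomeFolderMountStatus)" -ne '2' -and (Get-Date) -lt $deadline) {
    Start-Sleep -Seconds 3
}
if ("$(Get-HomeFolderMountStatus)" -ne '2') {
    Write-Output "Home Folder did not mount in time; continuing without it."
}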

aws command not getting recognized after MSI install

I am new to PowerShell. I can't figure out why, after successfully installing the AWS CLI, I intermittently get an "aws command not recognized" error. I put in a sleep thinking some environment variables might be getting set in the background. I need help figuring out what I need to do here to be able to successfully execute the $putItem command.
I followed the instructions here https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2-windows.html
PLEASE NOTE: The whole thing has to be automated, so I can't manually log in to a host and fix something, as this same script has to run on 100+ hosts.
Write-Output "Checking if AWS CLI support exists..."
cmd.exe /c "aws --version"
if ($LASTEXITCODE -eq 0) {
    Write-Output "AWS CLI installed already"
} else {
    Write-Output "Installing AWS CLI V2"
    cmd.exe /c "msiexec.exe /i https://awscli.amazonaws.com/AWSCLIV2.msi /qn"
    if ($LASTEXITCODE -eq 0) {
        Write-Output "AWS CLI installed successfully"
        Start-Sleep -s 5
    } else {
        Write-Output "Could not install AWS CLI"
        exit 1
    }
}
$putItem = 'aws dynamodb put-item --table-name ' + $instanceStatusDDBTable + ' --item "{\"HostName\" : {\"S\" : \"' + $instanceName + '\"}, \"Modules\" : {\"M\" : {}}, \"DAGName\" : {\"S\" : \"' + $dagName +'\"}}"'
Write-Output "Executing DB put item query $putItem"
cmd.exe /c $putItem
if ($LASTEXITCODE -eq 0) {
    Write-Output "Created entry for $instanceName in $instanceStatusDDBTable DDB table"
} else {
    Write-Output "Could not complete put Item operation for $instanceName"
    exit 1
}
Here is the output
Checking if AWS CLI support exists...
Installing AWS CLI V2
AWS CLI installed successfully
Executing DB put item query aws dynamodb put-item --table-name Ex2019-HostStatusTable --item "{\"HostName\" : {\"S\" : \"Host1\"}, \"Modules\" : {\"M\" : {}}, \"DAGName\" : {\"S\" : \"USW-D01\"}}"
Could not complete put Item operation for Host1
Error output -
'aws' is not recognized as an internal or external command,
operable program or batch file.
Try adding the below code to refresh your environment variables after you check your $LASTEXITCODE variable. The shell session has to regather the updated environment variables your installer just added. See this response for more info.
$env:Path = [System.Environment]::GetEnvironmentVariable("Path","Machine") + ";" + [System.Environment]::GetEnvironmentVariable("Path","User")
You may also want to consider using Start-Process with the -Wait and -PassThru parameters to invoke your installer, as cmd may not wait long enough for the app to finish installing. You can read up on it here. I agree with David: you could just check whether it's installed by running aws --version and then reading the version number or catching the error in a try/catch block.
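A combined sketch of both suggestions (the DynamoDB command from your script stays the same; only the install and PATH-refresh steps change):

# Install synchronously and read the exit code from the process object instead of cmd.exe
$msi = Start-Process msiexec.exe -ArgumentList '/i', 'https://awscli.amazonaws.com/AWSCLIV2.msi', '/qn' -Wait -PassThru
if ($msi.ExitCode -ne 0) {
    Write-Output "Could not install AWS CLI"
    exit 1
}
# Re-read the machine and user PATH so the current session can find aws.exe
$env:Path = [System.Environment]::GetEnvironmentVariable("Path","Machine") + ";" + [System.Environment]::GetEnvironmentVariable("Path","User")
aws --version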

PowerShell can't remove file I just wrote

I wanted to install Git on an AWS EC2 Windows Server, so I used Invoke-WebRequest to download the portable Git exe:
Invoke-WebRequest -Uri https://github.com/git-for-windows/git/releases/download/v2.23.0.windows.1/PortableGit-2.23.0-64-bit.7z.exe -UseBasicParsing -OutFile git.exe
But the download got stuck and I terminated the session. Now I want to remove git.exe, but for some reason I'm not allowed to.
I tried removing the file with:
Remove-Item .\git.exe
But I got an error message telling me I'm not allowed to:
Remove-Item : Cannot remove item C:\git.exe: Access to the path 'C:\git.exe' is denied.
At line:1 char:1
+ Remove-Item .\git.exe
+ ~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : PermissionDenied: (C:\git.exe:FileInfo) [Remove-Item], UnauthorizedAccessException
+ FullyQualifiedErrorId : RemoveFileSystemItemUnAuthorizedAccess,Microsoft.PowerShell.Commands.RemoveItemCommand
I just found out that the Invoke-WebRequest was not properly terminated and the stuck process still kept the file locked. After restarting the server I was able to delete the file. Thanks for the suggestions and comments.
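If a reboot is not an option, a lighter-weight sketch, assuming the lock is held by an orphaned powershell.exe session (which appears to have been the case here):

# Stop every PowerShell process except the current one, then retry the delete
Get-Process powershell -ErrorAction SilentlyContinue |
    Where-Object { $_.Id -ne $PID } |
    Stop-Process -Force
Remove-Item C:\git.exe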

How to make windows EC2 user data script run again on startup?

A user data script will run the first time an EC2 is started.
How can I restore/reactivate this ability on a windows EC2?
Note
I have tried the script suggested here but it fails immediately as there is no file C:\Program Files\Amazon\Ec2ConfigService\Settings\Config.xml and nothing similarly named (not that I found; not even an Ec2ConfigService directory)
Also note, my question is identical to this question but for Windows EC2, not Linux.
I understand that the point is about just running user-data, and not all the other stuff ...
To run (only) the user-data script, you can do so with:
Import-Module (Join-Path (Join-Path $env:ProgramData -ChildPath "Amazon\EC2-Windows\Launch") -ChildPath "Module\Ec2Launch.psd1")
Invoke-Userdata -OnlyExecute
Let's say you save this as 'C:\ProgramData\Amazon\EC2-Windows\Launch\Config\run-user-data.ps1'; then you can use PowerShell to schedule a new task to run at startup:
$Action = New-ScheduledTaskAction -Execute 'Powershell.exe' -Argument '-ExecutionPolicy Bypass C:\ProgramData\Amazon\EC2-Windows\Launch\Config\run-user-data.ps1'
$Trigger = New-ScheduledTaskTrigger -AtStartup
$Settings = New-ScheduledTaskSettingsSet
$Task = New-ScheduledTask -Action $Action -Trigger $Trigger -Settings $Settings
Register-ScheduledTask -TaskName 'Execute user-data' -InputObject $Task -User 'NT AUTHORITY\SYSTEM' -Force
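If you want to sanity-check the task before rebooting, you can list and trigger it manually:

Get-ScheduledTask -TaskName 'Execute user-data'
Start-ScheduledTask -TaskName 'Execute user-data'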
I use this sort of solution by creating the mentioned file and command in 'AWS::CloudFormation::Init' sections.
Hope it helps!