AWS AppStream 2.0 home folder "device not ready"

We are using on-demand instances to serve our applications in desktop view on AppStream 2.0. When a user launches our application script, the first thing it does is ensure that a directory exists under "D:\PhotonUser\My Files\Home Folder".
We are intermittently hitting a "device not ready" exception and occasionally "access denied". We have found that everything works if we add a 30-second delay at the start of the script (before it checks for the folder and creates it if missing).
Is a delay in home folder readiness to be expected, and does anyone know of a clean way to poll for readiness in PowerShell?
Thanks for taking the time to look

There is a registry key you can check to see if the Home Folder has mounted.
$regHive = "HKLM:\SOFTWARE\Amazon\AppStream\Storage\$Env:AppStream_UserName"

function Get-HomeFolderMountStatus {
    Get-ChildItem -Path $regHive |
        Where-Object { $_.Name.EndsWith('HomeFolder') } |
        Get-ItemPropertyValue -Name MountStatus
}

Write-Output "Mount status: $(Get-HomeFolderMountStatus)"

# Status values are documented at
# https://docs.aws.amazon.com/appstream2/latest/developerguide/use-session-scripts.html#use-storage-connectors-with-session-scripts
# A value of 2 means the home folder has mounted successfully.
while ("$(Get-HomeFolderMountStatus)" -ne '2') {
    Start-Sleep -Seconds 3
    Write-Output "Mount status: $(Get-HomeFolderMountStatus)"
}

# Now that the folder has mounted, continue with your script.
The Home Folder does take too long to mount. I'm seeing it take over 30 seconds, which eats up far too much of the 60-second session-script allotment in my opinion.
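Given that tight allotment, you may prefer the loop to fail fast rather than poll forever. A sketch adding a deadline to the same check (the 50-second budget is an assumption to tune, and the Get-HomeFolderMountStatus helper from the answer above is assumed to be defined):

```powershell
# Poll the mount status, but give up after a deadline so the session
# script exits with a clear error instead of being killed at 60 seconds.
$deadline = (Get-Date).AddSeconds(50)

while ("$(Get-HomeFolderMountStatus)" -ne '2') {
    if ((Get-Date) -gt $deadline) {
        Write-Error "Home Folder did not mount within 50 seconds."
        exit 1
    }
    Start-Sleep -Seconds 3
}
```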

Related

How to move backup files from Windows Server to an AWS S3 bucket, keeping the last 30 days of backups locally?

I have some MS SQL backup files in different directories on a Windows server. I want to move the files from the different DB backup folders to an S3 bucket.
Can you please suggest the best approach?
My thought is to sync the folder to the S3 bucket, then delete old local files, keeping the last 30 days:
aws s3 sync "D:\Program data\..." s3://bucket_path
Get-ChildItem "D:\Program data\..." | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } | Remove-Item
If I delete files from the source path, does the sync command also remove them from the bucket?
Or is there a way I can control this with a flag in the aws command only?
This is also my first time scripting on Windows. Can I schedule the above commands from Task Scheduler in a single bat file? I'm asking because one is a PowerShell command and one is an AWS CLI command.
Any input will be very valuable.
Thanks.
If you are trying to accomplish everything in PowerShell, and you already have the AWS CLI configured with the necessary permissions, then this would do the work (I have added comments for better understanding):
## Capture the files older than 30 days
$files = Get-ChildItem -Path C:\folderpath -Recurse |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) }

## Iterate over the files one by one and upload each to the bucket
foreach ($file in $files) {
    aws s3 cp $file.FullName "s3://mybucket/$($file.Name)"
}

## Up to you if you want to remove the files in one shot after they are uploaded:
## $files | Remove-Item -Force
Note: it is always recommended to use try/catch in these scenarios, because there can be multiple issues with file paths, files already in use, and so on. So please ensure that you have proper error handling in place before executing this in production.
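To answer the deletion part of the question: `aws s3 sync` does not remove objects from the bucket when a source file disappears unless you pass the `--delete` flag, so the local 30-day cleanup will not touch the copies already in S3. A sketch (the bucket name and local path are placeholders):

```powershell
# Sync new/changed files to S3; objects whose local copies were
# deleted stay in the bucket by default.
aws s3 sync "D:\Backups" "s3://my-backup-bucket/backups"

# Only with --delete would the sync mirror local deletions into the bucket:
# aws s3 sync "D:\Backups" "s3://my-backup-bucket/backups" --delete
```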

AWS Serverless Application Model init error on Pycharm

I am trying to create a new AWS Serverless Application in PyCharm, but I am getting this error:
Could not execute `sam init`!: [Cloning from https://github.com/aws/aws-sam-cli-app-templates (process may take a moment),
Error: Unstable state when updating repo. Check that you have permissions to create/delete files
How can I solve this problem?
Details:
OS: Windows 10, x64
Versions: Python 3.9, SAM CLI 1.53.0
IDE: PyCharm 2022.1.3 Pro Edition
Git version 2.37
Open PowerShell in admin mode and execute:
New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Control\FileSystem" `
-Name "LongPathsEnabled" -Value 1 -PropertyType DWORD -Force
Worked for me. (This also fixes the WebStorm AWS Toolkit error in init.)
Okay, here is why the problem occurs, and the solution for Windows users:
The problem is with the "AWS SAM" path on Windows; the space in the path causes the failure.
By passing the --location argument with the full path in double quotes, it works:
sam init --location "C:\Users\[your_user_name]\AppData\Roaming\AWS SAM\aws-sam-cli-app-templates\python3.9\cookiecutter-aws-sam-hello-python"
Ref: https://github.com/aws/aws-sam-cli/issues/1891
Thanks to:
https://github.com/hawflau
https://github.com/john-zenden
On my Windows machine, I fixed it by setting LongPathsEnabled to 1 in the registry:
Open Registry Editor (regedit.exe).
Navigate to Computer\HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem
Set LongPathsEnabled to 1.
This allows Windows (and therefore Git) to handle long paths.
Neither of the solutions worked for me. Instead:
Open Registry Editor (regedit.exe).
Navigate to Computer\HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem
Set LongPathsEnabled to 1
Copied from: https://lightrun.com/answers/aws-aws-sam-cli-permissions-error-unstable-state-when-updating-repo
Run this in powershell (as admin)
New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Control\FileSystem" `
-Name "LongPathsEnabled" -Value 1 -PropertyType DWORD -Force
Works for me.
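After applying any of the registry fixes above, you can read the value back to confirm it took effect (a quick check using the same key the answers reference):

```powershell
# Read back the LongPathsEnabled value; it should print 1 after the fix.
Get-ItemPropertyValue -Path "HKLM:\SYSTEM\CurrentControlSet\Control\FileSystem" `
    -Name "LongPathsEnabled"
```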

How to make a Windows EC2 user data script run again on startup?

A user data script will run the first time an EC2 instance is started.
How can I restore/reactivate this ability on a Windows EC2 instance?
Note
I have tried the script suggested here, but it fails immediately, as there is no file C:\Program Files\Amazon\Ec2ConfigService\Settings\Config.xml and nothing similarly named (at least not that I found; there isn't even an Ec2ConfigService directory).
Also note: my question is identical to this question, but for a Windows EC2 instance, not Linux.
I understand that the point is to run just the user-data script, and not all the other first-boot tasks.
To run (only) the user-data script, you can run:
Import-Module (Join-Path (Join-Path $env:ProgramData -ChildPath "Amazon\EC2-Windows\Launch") -ChildPath "Module\Ec2Launch.psd1")
Invoke-Userdata -OnlyExecute
Let's say you save this as 'C:\ProgramData\Amazon\EC2-Windows\Launch\Config\run-user-data.ps1'; then you can use PowerShell to schedule a new task that runs it at startup:
$Action = New-ScheduledTaskAction -Execute 'Powershell.exe' -Argument '-ExecutionPolicy Bypass -File C:\ProgramData\Amazon\EC2-Windows\Launch\Config\run-user-data.ps1'
$Trigger = New-ScheduledTaskTrigger -AtStartup
$Settings = New-ScheduledTaskSettingsSet
$Task = New-ScheduledTask -Action $Action -Trigger $Trigger -Settings $Settings
Register-ScheduledTask -TaskName 'Execute user-data' -InputObject $Task -User 'NT AUTHORITY\SYSTEM' -Force
I use this sort of solution by creating the mentioned file and the scheduled-task command in the 'AWS::CloudFormation::Init' section of my templates.
Hope it helps!

Copy-Item : The process cannot access the file

I have a Jenkins job which does the following:
Stop WebService
Delete WebService
Copy items from Jenkins workspace to server path
Create WebService
Start WebService
Below is my PowerShell script:
Get-ChildItem "C:\Location\*"
$service = Get-Service -Name value -ComputerName $env:SERVER -ErrorAction SilentlyContinue
sc.exe \\$env:SERVER stop value
Write-Host "value STOPPED"
sc.exe \\$env:SERVER delete val
Write-Host "val DELETED"
Copy-Item "C:\Location\*" "\\$env:SERVER\d$\Location" -Force -Recurse
sc.exe \\$env:SERVER create val start= auto DisplayName= "val" binPath= "D:\Location.exe"
sc.exe \\$env:SERVER description value "value"
sc.exe \\$env:SERVER start value
Write-Host "value STARTED"
if ($error) { exit 1 }
Error logs:
Copy-Item : The process cannot access the file '\\Location' because it is being used by another process.
At C:\Users\Administrator\AppData\Local\Temp\hudson2059984936352103941.ps1:18 char:5
+ Copy-Item "C:\Location\*" " ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Copy-Item], IOException
+ FullyQualifiedErrorId : System.IO.IOException,Microsoft.PowerShell.Commands.CopyItemCommand
[SC] CreateService FAILED 1072:
The specified service has been marked for deletion.
[SC] ChangeServiceConfig2 FAILED 1072:
The specified service has been marked for deletion.
[SC] StartService FAILED 1058:
The service cannot be started, either because it is disabled or because it has no
enabled devices associated with it.
Can you please help me out with this error? Do I need to restart the deployment server so that my process gets killed? If so, I feel this is not practical and cannot be done on prod servers.
The problem is that another process is still using the file, and while it is in use you cannot overwrite it. You need to find out which process is holding it open, why, and make sure it is closed before the copy; if the problem recurs, add closing that process to your script. The "marked for deletion" errors are related: the old service had not finished being deleted when you tried to recreate it, and the deletion only completes once the service has fully stopped and every open handle to it is closed.
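One way to avoid both errors is to wait until the service has actually disappeared before copying the files and recreating it. A sketch, assuming the service names and paths from the question (note the script mixes the names value and val; adjust to your real service name):

```powershell
sc.exe \\$env:SERVER stop value
sc.exe \\$env:SERVER delete value

# Wait until the service no longer exists on the remote machine;
# "marked for deletion" means old handles are still open.
while (Get-Service -Name value -ComputerName $env:SERVER -ErrorAction SilentlyContinue) {
    Start-Sleep -Seconds 2
}

Copy-Item "C:\Location\*" "\\$env:SERVER\d$\Location" -Force -Recurse
```

(Get-Service -ComputerName is available in Windows PowerShell 5.1, which the Jenkins log suggests is in use; it was removed in PowerShell 7.)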

Run Powershell script as a service, watching log files and reporting via email

I have some log files that I am watching. I need to do this using PowerShell since it's a Windows server. The command I am using is:
Get-Content System.log -Wait | Where-Object { $_ -match "some regex" }
I am trying to run this in a script and make it email me whenever a change to the log file matches the regex I specify. This is easily doable using bash and cron, but I am not sure what the equivalent is in PowerShell, or what return code I need to look for to know if it found anything.
Have you tried:
Get-Content System.log -Wait | Where-Object { $_ -match "some regex" } | ForEach-Object { Send-MailMessage -Body $_ ... }
I don't know if the result of the Where-Object will be piped through before Get-Content -Wait exits, but give it a go.
As for running it as a service, you can call a PowerShell script from FireDaemon or Srvany, but you could also use a scheduled task to keep restarting the script.
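A fuller sketch of the watcher, using the built-in Send-MailMessage cmdlet (the SMTP server, addresses, and log path are all placeholders to replace; a reachable SMTP server is assumed):

```powershell
# Tail the log and mail each line that matches the pattern.
# -Wait keeps the pipeline alive, so matching lines are emailed as they appear.
$pattern = "some regex"

Get-Content "C:\Logs\System.log" -Wait |
    Where-Object { $_ -match $pattern } |
    ForEach-Object {
        Send-MailMessage -SmtpServer "smtp.example.com" `
            -From "alerts@example.com" -To "me@example.com" `
            -Subject "Log match" -Body $_
    }
```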