I need your help with a script that I am building. I need it to:
Find a VM and see whether it is turned off or on.
If it is off, turn it on, and then copy a folder with its contents to replace the old folder on the VM.
$vmName = 'Target'
$folderName = 'C:\Folder'
# Power on the VM only if it is not already running
Get-VM -Name $vmName | Where-Object {$_.PowerState -ne 'PoweredOn'} | Start-VM -Confirm:$false
Start-Sleep -Seconds 30
# Copy the local folder into the guest (Copy-VMGuestFile needs to know which VM)
Copy-VMGuestFile -LocalToGuest -VM $vmName -Source $folderName -Destination $folderName -Confirm:$false
My problem is that the folder and its contents are copied to the destination, but the copy does not replace the existing folder and its contents on the destination VM.
Regards,
Michel Vaillancourt
Looks like -Force clobbers existing files on the target, but will not mirror the source contents, at least with PowerCLI 5.8 R1. So files that exist on target but not source before the copy will remain on the target after the copy. Not sure if this is by design. Invoke-VMScript could delete the target folder before Copy-VMGuestFile runs.
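A minimal sketch of that idea, assuming a Windows guest and reusing $vmName and $folderName from the question (prompting with Get-Credential for guest credentials is my assumption; adjust to taste):
$cred = Get-Credential
# Clear the target folder inside the guest so the copy becomes a true replace
Invoke-VMScript -VM $vmName -GuestCredential $cred -ScriptType Powershell -ScriptText "Remove-Item -Path '$folderName\*' -Recurse -Force"
Copy-VMGuestFile -LocalToGuest -VM $vmName -GuestCredential $cred -Source $folderName -Destination $folderName -Force -Confirm:$false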
Why not use Robocopy or some other tool after the VM is up?
1- Start-VM, wait 1 or 2 minutes.
2- Loop until "Test-Path $folderName" is true.
3- Copy folder (see the sketch just below).
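A rough sketch of those three steps, reusing $vmName and $folderName from the question; Test-Path has to run inside the guest, so the loop goes through Invoke-VMScript (a Windows guest and the Get-Credential prompt are my assumptions):
$cred = Get-Credential
# 1- Start the VM and give it a minute or two.
Start-VM -VM $vmName -Confirm:$false
Start-Sleep -Seconds 120
# 2- Loop until Test-Path succeeds inside the guest.
do {
    Start-Sleep -Seconds 10
    $probe = Invoke-VMScript -VM $vmName -GuestCredential $cred -ScriptText "Test-Path '$folderName'" -ErrorAction SilentlyContinue
} until ($probe.ScriptOutput -match 'True')
# 3- Copy the folder, clobbering what is there.
Copy-VMGuestFile -LocalToGuest -VM $vmName -GuestCredential $cred -Source $folderName -Destination $folderName -Force -Confirm:$false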
I'm trying to rename my file toto.html by removing its extension, using gsutil. My folders/files are organized this way:
toto ----|
         |-- file1.html
         |-- file2.txt
         |-- ...
toto.html
So I used the command:
gsutil mv gs://my_bucket/toto.html gs://my_bucket/toto
The problem is that, by doing so, the file is moved inside the folder that has the same name (which is normal).
How can I rename my file without having this issue?
Thanks in advance.
Let me explain briefly how directories work on Google Cloud Storage, as detailed in the documentation:
Cloud Storage operates with a flat namespace, which means that folders don't actually exist within Cloud Storage. If you create an object named folder1/file.txt in the bucket your-bucket, the path to the object is your-bucket/folder1/file.txt, but there is no folder named folder1; instead, the string folder1 is part of the object's name.
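For example (bucket and names are hypothetical), the "folder" below is never created as an object of its own; it is only a shared name prefix:
gsutil cp file.txt gs://your-bucket/folder1/file.txt
gsutil ls gs://your-bucket/folder1/
The listing prints gs://your-bucket/folder1/file.txt: a single object whose name happens to contain a slash.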
So when you perform the move command gsutil mv gs://my_bucket/toto.html gs://my_bucket/toto, gsutil interprets it as a request to move the file inside the folder named toto, as there is no way to differentiate between the file and the folder.
What you could do to achieve the task is to move the file inside a folder first, to change its name, and then move it back to the root:
gsutil mv gs://my_bucket/toto.html gs://my_bucket/toto/toto
gsutil mv gs://my_bucket/toto/toto gs://my_bucket/
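If it worked, listing the bucket root should show the renamed object next to the (purely virtual) folder prefix, something like:
gsutil ls gs://my_bucket/
gs://my_bucket/toto
gs://my_bucket/toto/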
I have folder A and folder B
Folder A contains approximately 100 files, all text: js, php, bash, etc. They are stored in the root of the folder, in subfolders, and in further subfolders within folder A.
Folder B is a copy of Folder A, but some of the files have been updated.
Is there any way I can compare A to B and create a tar.gz file containing only the files that have changed in Folder B?
I would need to keep the folder structure intact when the tar.gz is created.
Currently I use WinMerge to check for differences, but I'm happy to look at any windows or Linux application/commands that will help with this.
Thanks
This one-liner excludes files that exist only in one folder or the other, but otherwise creates the tar.gz file that you want.
diff -rq folderA folderB | grep -v "^Only in" | sed "s/^.*and folderB/folderB/g" | sed "s/ differ//g" | tar czf output.tar.gz -T -
Broken down it goes:
diff -rq folderA folderB
Do a recursive diff between these folders, be quiet about it - only output the file names.
| grep -v "^Only in"
Exclude output lines that indicate one file is only in one of the folders. I'm assuming from your description this isn't an issue for you, but the two folders I was playing with were a bit dirty.
| sed "s/^.*and folderB/folderB/g"
Discard the first part of the output, up until it says " and " followed by the name of the second folder. This actually takes away the second folder name as well, but then substitutes it back in.
| sed "s/ differ//g"
Discard the end bit of the diff output.
| tar czf output.tar.gz -T -
Tell tar to do the thing. c means create a tar file, z means compress it (gzip), and f means the output filename comes next: output.tar.gz is your output file. -T means "get the filenames from the file I'm about to tell you", and the final - means "use stdin instead".
I suggest you build this up yourself in the individual steps so you can see how it is constructed, and what the output of each step is like.
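For instance, with a made-up pair of folders where only sub/app.js differs and old.txt exists only in folderA, the first three stages would print:
diff -rq folderA folderB
Files folderA/sub/app.js and folderB/sub/app.js differ
Only in folderA: old.txt
diff -rq folderA folderB | grep -v "^Only in"
Files folderA/sub/app.js and folderB/sub/app.js differ
diff -rq folderA folderB | grep -v "^Only in" | sed "s/^.*and folderB/folderB/g" | sed "s/ differ//g"
folderB/sub/app.js
That final list of paths, one per line, is exactly what tar's -T - reads from stdin.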
I want to replace folder A in P4 with another folder A.
The two folders have different files and sub folders.
I know we can do it by deleting the old folder A and then adding the new folder A.
But can I do it in only one step, in a pending changelist?
With the following result in that pending CL:
If this file is in old folder, but not in new folder, then it is marked by "delete".
If this file is in new folder, but not in old folder, then it is marked by "add".
If this file is in new folder and also in old folder, then it is marked by "modify".
Thank you
Are both these folders under source control?
That is, are you trying to make //depot/folder/A contain what //depot/other/A_prime contains?
If so, consider using 'p4 copy':
p4 copy //depot/other/A_prime/... //depot/folder/A/...
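p4 copy only opens the files in a pending changelist, which matches the one-step shape you asked for; you can then inspect and submit (the description string is a placeholder):
p4 opened
p4 submit -d "Replace folder A with A_prime"
In the pending changelist, new files show up opened for branch, changed files for integrate, and removed files for delete.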
If the other folder A is just something you have on your hard disk, then consider using 'reconcile':
p4 edit //depot/folder/A/...           # open the existing files for edit
rm -r /path/to/depot/folder/A/*        # wipe the workspace copy
cp -r /path/to/other/folder/A/* /path/to/depot/folder/A   # drop in the new contents
p4 reconcile -aed //depot/folder/A/... # open adds (-a), edits (-e), and deletes (-d)
I kind of like the 'p4 copy' approach, myself, so I'd be tempted to check that other folder into Perforce (in a different location in the repository, naturally), so that I could then run 'p4 copy'.
Stack,
We have many files in our library that were never used in subsequent projects. We are now at a development phase where we can do some good housekeeping and carefully remove unused library code. I am trying to optimize my grep command; its current implementation is quite slow.
grep --include=*.cpp --recursive --files-with-matches <library function name> <network path to subsequent projects>
The main reason is that the projects path is expansive, and the bulk of the time is spent just navigating the directory tree and applying the file mask. This grep command is called many times on the same set of project files.
Rather than navigating the directory tree on every call, I would like grep to reference a static file list stored on my local disk.
Something akin to this:
grep --from-filelist=c:\MyProjectFileList.txt
The MyProjectFileList.txt would be:
\\server1\myproject1\main.cpp
\\server1\myproject1\func1.cpp
\\server1\myproject2\main.cpp
\\server1\myproject2\method.cpp
Grep would apply the pattern expression to the contents of each of those files. Grep's output would be the fully qualified path of each project file that uses a specific library function.
Grep commands for specific library functions that return no project files are extraneous and can be deleted.
How do you force grep to scan files from an external filelist stored in a text file?
(Thereby removing directory scanning.)
Experiment a little with the 'xargs' command and pipes ("|").
Try the following:
xargs -d '\n' grep YOUR_GREP_ARGS_HERE < list_of_files.txt
(-d '\n' hands grep one filename per input line; an xargs -0 variant would expect NUL-delimited input, which a plain text file does not provide.)
or in a Windows environment with Powershell installed try...
Get-Content List_of_files.txt | Foreach-Object {grep GREP_ARGS_HERE $_}
(grep expects the pattern before the filename.)
I googled for windows args and found this:
FOR /F %k in (filelist.txt) DO grep yourgrepargs %k
(but I use linux, no idea if it works)
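If PowerShell is available, Select-String can stand in for grep --files-with-matches against the same list; a sketch, with 'libraryFunctionName' as a placeholder pattern:
Select-String -List -Pattern 'libraryFunctionName' -Path (Get-Content C:\MyProjectFileList.txt) | Select-Object -ExpandProperty Path
-List stops at the first match per file, so each matching project file prints once.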
When I try to run make from the cmd console on Windows, it runs Turbo Delphi's make.exe, but I need MSYS's make.exe. There is no mention of Turbo Delphi in the %path% variable; maybe I can change it to MSYS in the registry?
The path is in the registry, but usually you edit it through this interface:
Go to Control Panel -> System -> System settings -> Environment Variables.
Scroll down in system variables until you find PATH.
Click edit and change accordingly.
BE SURE to include a semicolon at the end of the previous entry, as that is the delimiter, i.e. c:\path;c:\path2
Launch a new console for the settings to take effect.
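To confirm which make.exe a new console will actually pick up, the where command (Vista and later) lists every match in PATH search order; the first line printed is the one cmd runs:
where make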
Here I'm providing a solution for setting up the Terraform environment variable on Windows, aimed at beginners.
Download the terraform ZIP file from Terraform site.
Extract the .exe from the ZIP file to a folder, e.g. C:\Apps\Terraform
Copy this path location, e.g. C:\Apps\Terraform\
Add the folder location to your PATH variable, eg: Control Panel -> System -> System settings -> Environment Variables
In System Variables, select Path > edit > new > Enter the location of the Terraform .exe, eg C:\Apps\Terraform then click OK
Open a new CMD/PowerShell and the Terraform command should work
Or you can just run this PowerShell command to append extra folder to the existing path:
$env:Path += ";C:\temp\terraform"
To add a PERSISTENT path (i.e. one that is permanent), you can use this one-liner in PowerShell (adjust the last c:\apps\terraform part):
Set-ItemProperty -Path 'Registry::HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment' -Name PATH -Value (((Get-ItemProperty -Path 'Registry::HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment' -Name PATH).path) + ";c:\apps\terraform" )
Alternatively, you can jump directly to the Environment Variables dialog by RUNning/CMD/PowerShell this:
rundll32.exe sysdm.cpl,EditEnvironmentVariables
I had issues for a while where Terraform commands would not run unless I was in the directory of the exe, even though I had set the path correctly.
For anyone else hitting this issue, I fixed it by moving the Terraform entry higher in the PATH list than the others! Windows searches PATH entries in order, so an earlier entry wins.
Why don't you create a bat file makedos.bat containing the following line?
c:\DOS\make.exe %*
(%* forwards all of the batch file's arguments to make.) Put it in C:\DOS (or C:\Windows, or make sure that it is in your %path%).
From cmd you can run SET, and it will display all environment variables, including PATH.
In registry you can find environment variables under:
HKEY_CURRENT_USER\Environment
HKEY_CURRENT_USER\Volatile Environment
HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\Session Manager\Environment
Just copy it to System32, and call it make1 or whatever if the name conflicts.