How to get a .txt file of all folder names inside a folder, but not the files/subfolders inside the child folders, using PowerShell?

Currently I'm using SHIFT + Right Click > Open PowerShell window here, then pasting in:
dir -recurse | select -ExpandProperty Name | foreach {$_.split('.',2)[0]} | out-file file.txt
The only problem is that when I SHIFT + Right-click a directory, I also get the names of all the files/folders inside the child folders, and it really ruins the organization I'm going for.
So for example I have a folder called "RootFolder".
Inside RootFolder is 10 other folders called "Folder1" through "Folder10".
I only want the names Folder1 - Folder10 to be inside the .txt file the shell command creates. I do not want subfolders/files inside Folder1-10 in the .txt file.

If you want to list only directories, you need to tell it so.
Dir -R | ? { $_.PSIsContainer }
or
Dir -R | ? { $_ -is [System.IO.DirectoryInfo] }
In more recent PowerShell versions you can do it directly.
Dir -Dir
Dir = Get-ChildItem
-R = -Recurse
? = Where-Object
-Dir = -Directory

If you only want directories within the named directory, you don't want to recurse. That will travel the entire directory hierarchy; just use the -Directory switch to filter the output to only include directories.
Then, you can just extract the Name property from the DirectoryInfo objects.
PS C:\> dir C:\Windows -Directory |% { $_.Name }
addins
appcompat
apppatch
AppReadiness
[...]
and output as desired. Even more concisely:
(dir C:\Windows -Directory).Name
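Putting the two answers together for the question as asked, i.e. writing just the top-level folder names to a text file (a sketch; `$root` stands in for the folder you Shift+Right-clicked):

```powershell
# List only the immediate child folders of $root and write their names to file.txt.
# Files and anything nested deeper are never touched because there is no -Recurse.
$root = '.'   # e.g. 'C:\RootFolder'
(Get-ChildItem $root -Directory).Name | Out-File file.txt
```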

Draw.io - how to export all tabs to images using command line

I have installed the draw.io app on my PC. I want to export all tabs with drawings to separate files. The only option I have found is:
"c:\Program Files\draw.io\draw.io.exe" --crop -x -f jpg c:\Users\user-name\Documents\_xxx_\my-file.drawio
Help for draw.io
Usage: draw.io [options] [input file/folder]
Options:
(...)
-x, --export export the input file/folder based on the
given options
-r, --recursive for a folder input, recursively convert
all files in sub-folders also
-o, --output <output file/folder> specify the output file/folder. If
omitted, the input file name is used for
output with the specified format as
extension
-f, --format <format> if output file name extension is
specified, this option is ignored (file
type is determined from output extension,
possible export formats are pdf, png, jpg,
svg, vsdx, and xml) (default: "pdf")
(default: 0)
-a, --all-pages export all pages (for PDF format only)
-p, --page-index <pageIndex> selects a specific page, if not specified
and the format is an image, the first page
is selected
-g, --page-range <from>..<to> selects a page range (for PDF format only)
(...)
Exporting all tabs to separate image files is not supported. I can use one of these:
-p, --page-index <pageIndex> selects a specific page, if not specified
and the format is an image, the first page
is selected
-g, --page-range <from>..<to> selects a page range (for PDF format only)
but how do I get the page range, or the number of pages, to select an index?
There is no easy way to find the number of pages out of the box with Draw.io's CLI options.
One solution would be to export the diagram as XML.
draw.io --export --format xml --uncompressed test-me.drawio
And then count how many diagram elements there are. It should equal the number of pages (I briefly tested this, but I'm not 100% sure the diagram element appears exactly once per page).
grep -o "<diagram" "test-me.xml" | wc -l
Here is an example of putting it all together in a bash script (I tried this on macOS 10.15):
#!/bin/bash
file=test-me # File name excluding extension
# Export diagram to plain XML
draw.io --export --format xml --uncompressed "$file.drawio"
# Count how many pages based on <diagram element
count=$(grep -o "<diagram" "$file.xml" | wc -l)
# Export each page as a PNG
# Page index is zero-based
for ((i = 0 ; i < count; i++)); do
    draw.io --export --page-index $i --output "$file-$i.png" "$file.drawio"
done
The OP did ask the question with reference to the Windows version, so here's a PowerShell solution inspired by eddiegroves:
$DIR_DRAWIO = "."
$DrawIoFiles = Get-ChildItem $DIR_DRAWIO *.drawio -File
foreach ($file in $DrawIoFiles) {
    "File: '$($file.FullName)'"
    $xml_file = "$($file.DirectoryName)/$($file.BaseName).xml"
    if ((Test-Path $xml_file)) {
        Remove-Item -Path $xml_file -Force
    }
    # export to XML
    & "C:/Program Files/draw.io/draw.io.exe" '--export' '--format' 'xml' $file.FullName
    # wait for XML file creation
    while ($true) {
        if (-not (Test-Path $xml_file)) {
            Start-Sleep -Milliseconds 200
        }
        else {
            break
        }
    }
    # load into an XML document (cast text array to object)
    $drawio_xml = [xml](Get-Content $xml_file)
    # export each page as PNG
    for ($i = 0; $i -lt $drawio_xml.mxfile.pages; $i++) {
        $file_out = "$($file.DirectoryName)/$($file.BaseName)$($i + 1).png"
        & "C:/Program Files/draw.io/draw.io.exe" '--export' '--border' '10' '--page-index' $i '--output' $file_out $file.FullName
    }
    # wait for the last PNG image file
    while ($true) {
        if (-not (Test-Path "$($file.DirectoryName)/$($file.BaseName)$($drawio_xml.mxfile.pages).png")) {
            Start-Sleep -Milliseconds 200
        }
        else {
            break
        }
    }
    # remove/delete XML file
    if ((Test-Path $xml_file)) {
        Remove-Item -Path $xml_file -Force
    }
    # export 'vsdx' & 'pdf'
    & "C:/Program Files/draw.io/draw.io.exe" '--export' '--format' 'vsdx' $file.FullName
    Start-Sleep -Milliseconds 1000
    & "C:/Program Files/draw.io/draw.io.exe" '--export' '--format' 'pdf' $file.FullName
}

Optimizing Script for Downloading Files from Library with Folders

Due to some extenuating circumstances we had to resort to migrating a team's content from one farm to another by locally downloading the files from their lists and shipping them. Unfortunately, this team used folders in their lists VERY heavily, and they wanted to maintain their folder structure for document organization purposes, so I had to come up with a script that could download the contents of a list, folders and all, while maintaining that folder structure. I have years of SP admin experience, but my PS scripting has always been basic to intermediate at best, so I am reaching out to you all to help me optimize this script.
add-pssnapin microsoft.sharepoint.powershell
$w = get-spweb "http://URL"
$tlfolder = $w.getfolder("Old_Surveys")
$subfold = $tlfolder.subfolders | select * | ? { $_.name -match "historical" }
$subfolders = $subfold.subfolders
foreach ($subfolder in $subfolders)
{
    new-item -path "D:\backups\Old_Surveys\Historical Company Surveys" -name $subfolder.name -itemtype "directory"
    # if 2nd-level subfolders exist
    if ($subfolder.subfolders -ne $null)
    {
        $subfoldername = $subfolder.name
        $subsubfolders = $subfolder.subfolders
        foreach ($subsubfolder in $subsubfolders)
        {
            new-item -path "D:\backups\Old_Surveys\Historical Company Surveys\$subfoldername" -name $subsubfolder.name -itemtype "directory"
            # if 3rd-level subfolders exist
            if ($subsubfolder.subfolders -ne $null)
            {
                $subsubfoldername = $subsubfolder.name
                $subsubsubfolders = $subsubfolder.subfolders
                foreach ($subsubsubfolder in $subsubsubfolders)
                {
                    new-item -path "D:\backups\Old_Surveys\Historical Company Surveys\$subfoldername\$subsubfoldername" -name $subsubsubfolder.name -itemtype "directory"
                    # if 3rd-level files exist
                    if ($subsubsubfolder.files -ne $null)
                    {
                        $subsubsubfiles = $subsubsubfolder.files
                        $subsubsubfoldername = $subsubsubfolder.name
                        foreach ($subsubsubfile in $subsubsubfiles)
                        {
                            write-host $subsubsubfile.name
                            $b = $subsubsubfile.openbinary()
                            $fs = new-object system.io.filestream(("d:\backups\Old_Surveys\Historical Company Surveys\$subfoldername\$subsubfoldername\$subsubsubfoldername\" + $subsubsubfile.name), [system.io.filemode]::create)
                            $bw = new-object system.io.binarywriter($fs)
                            $bw.write($b)
                            $bw.close()
                        }
                    }
                }
            }
            if ($subsubfolder.files -ne $null)
            {
                $subsubfiles = $subsubfolder.files
                $subsubfoldername = $subsubfolder.name
                foreach ($subsubfile in $subsubfiles)
                {
                    write-host $subsubfile.name
                    $b = $subsubfile.openbinary()
                    $fs = new-object system.io.filestream(("d:\backups\Old_Surveys\Historical Company Surveys\$subfoldername\$subsubfoldername\" + $subsubfile.name), [system.io.filemode]::create)
                    $bw = new-object system.io.binarywriter($fs)
                    $bw.write($b)
                    $bw.close()
                }
            }
        }
    }
    if ($subfolder.files -ne $null)
    {
        $subfiles = $subfolder.files
        $subfoldername = $subfolder.name
        foreach ($subfile in $subfiles)
        {
            write-host $subfile.name
            $b = $subfile.openbinary()
            $fs = new-object system.io.filestream(("d:\backups\Old_Surveys\Historical Company Surveys\$subfoldername\" + $subfile.name), [system.io.filemode]::create)
            $bw = new-object system.io.binarywriter($fs)
            $bw.write($b)
            $bw.close()
        }
    }
}
The objective of the code was to iterate down through any folders and subfolders, create them on the local drive, and download any files at each level. I did this manually for up to 3 sub-levels of folders, but if there were more levels below that I would have had to keep adding more and more of the same code. Surely there has to be a better way of doing this?
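There is: a single recursive function removes the need for one hand-written block per level. A sketch (the function name Save-SPFolder is mine; it assumes the same SPFolder members the script above already uses: .Name, .Files, .SubFolders, and .OpenBinary() on files):

```powershell
# Recursive replacement for the nested per-level blocks above (a sketch).
function Save-SPFolder {
    param (
        $Folder,              # an SPFolder (anything exposing Name, Files, SubFolders)
        [string]$LocalPath    # existing local directory to mirror the folder into
    )
    # Create the matching local directory for this folder
    $target = Join-Path $LocalPath $Folder.Name
    New-Item -Path $target -ItemType Directory -Force | Out-Null
    # Download every file at this level
    foreach ($file in $Folder.Files) {
        Write-Host $file.Name
        $bytes = $file.OpenBinary()
        [System.IO.File]::WriteAllBytes((Join-Path $target $file.Name), $bytes)
    }
    # Recurse into each subfolder; this handles any depth automatically
    foreach ($sub in $Folder.SubFolders) {
        Save-SPFolder -Folder $sub -LocalPath $target
    }
}
```

Called once on the "historical" folder, e.g. `Save-SPFolder -Folder $subfold -LocalPath "D:\backups\Old_Surveys"`, it mirrors the whole tree with no per-level code.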

How to list all iso files of a particular datastore using powercli in VMware vSphere

I am trying to list all ISO files in the ISO directory of a particular datastore in VMware vSphere using PowerCLI. I am able to list all the ISOs across all datastores using the command below, but I'm not able to do so for a particular datastore.
dir vmstores:\ -Recurse -Include *.iso | Select Name,FolderPath
I think you'll need to do something more like:
dir vmstore:\datacentername\datastorename -Recurse -Include *.iso | Select Name, FolderPath
I found that the following set of commands worked.
$datastoreName = 'enter_name_of_datastore'
$ds = Get-Datastore -Name $datastoreName
New-PSDrive -Location $ds -Name DS -PSProvider VimDatastore -Root '\' | Out-Null
Get-ChildItem -Path DS:\ISO -Include *.iso -Recurse | Select Name,FolderPath
Remove-PSDrive -Name DS -Confirm:$false

Powershell script to open another script as admin

I have a script that I can double click and it'll open other scripts as admin. Works with some things but not everything. For one script, it opens the next window and then immediately closes it. For another, I get this error:
At MYPATH\InstallClient.ps1:33 char:78
+ ... tall_x64.msi" -force -recurse -ErrorAction Stop #Cleans out the file ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The string is missing the terminator: ".
At MYPATH\InstallClient.ps1:27 char:31
+ ForEach ($entry in $computers){ #start of foreach loop
+ ~
Missing closing '}' in statement block or type definition.
+ CategoryInfo : ParserError: (:) [], ParentContainsErrorRecordException
+ FullyQualifiedErrorId : TerminatorExpectedAtEndOfString
Below is the script to open a script as an admin:
Function Get-FileName($initialDirectory)
{
[System.Reflection.Assembly]::LoadWithPartialName("System.windows.forms") | Out-Null
$OpenFileDialog = New-Object System.Windows.Forms.OpenFileDialog
$OpenFileDialog.initialDirectory = $initialDirectory
$OpenFileDialog.filter = "PS1 (*.ps1)| *.ps1"
$OpenFileDialog.ShowDialog() | Out-Null
$OpenFileDialog.filename
}
$inputfile = Get-FileName "MYPATH\Scripts"
powershell.exe -noprofile -command "&{start-process powershell -ArgumentList '-NoExit -noprofile -file $inputfile' -verb RunAs}"
This is the script that it gives the previous error for while trying to open:
Function Get-FileName($initialDirectory) #Function to choose a file
{
[System.Reflection.Assembly]::LoadWithPartialName("System.windows.forms") | Out-Null
$OpenFileDialog = New-Object System.Windows.Forms.OpenFileDialog
$OpenFileDialog.initialDirectory = $initialDirectory
$OpenFileDialog.filter = "MSI (*.msi)| *.msi" #type of files that will be available for selection
$OpenFileDialog.ShowDialog() | Out-Null
$OpenFileDialog.filename
}
$inputfile = Get-FileName "MyPath" #Directory that is going to open to select a file from
Function Get-FileName($initialDirectory) #Function to choose a file
{
[System.Reflection.Assembly]::LoadWithPartialName("System.windows.forms") | Out-Null
$OpenFileDialog = New-Object System.Windows.Forms.OpenFileDialog
$OpenFileDialog.initialDirectory = $initialDirectory
$OpenFileDialog.filter = "CSV (*.csv)| *.csv" #type of files that will be available for selection
$OpenFileDialog.ShowDialog() | Out-Null
$OpenFileDialog.filename
}
$inputfile1 = Get-FileName "MyPath\ServerLists"
$computers = import-csv $inputfile1
ForEach ($entry in $computers){ #start of foreach loop
$computername = $entry.computernames #this saves the single entry under computernames for each entry in csv file
Copy-item $inputfile -container -recurse \\$computername\C$\windows\temp #this copies the msi file that we selected to the computer entry called from the csv file's temp folder
Invoke-Command -Computername $computername –ScriptBlock {Start-process -Wait "C:\windows\temp\ShadowSuiteClientInstall_x64.msi"} | out-null #This starts the msi file that we just copied and waits for the installation to be completed before moving on
If($?){ #If the last command was successful
Echo "Installed ShadowSuiteClientInstall_x64 on $computername."
Remove-Item "\\$computername\C$\windows\temp\ShadowSuiteClientInstall_x64.msi" -force -recurse -ErrorAction Stop #Cleans out the file we copied into the temp folder
}
}
Does anyone have any ideas on why this will open some things fine but give this error for this script and immediately close other scripts without running them? Does anyone have a better way to navigate through scripts and select one to open as admin?
OK, I figured this out. I loaded the script into PowerShell ISE and saw that it was being parsed incorrectly: the dash in front of -ScriptBlock kept turning into an "æ" symbol instead of a plain hyphen. Weird AF IMO, but I fixed it in ISE, which I recommend to anyone struggling with strange parser errors like this.
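For anyone hitting the same thing, you can scan a script for these troublemakers before running it. A sketch (the pattern covers the common en/em dashes and curly quotes; the path reuses the InstallClient.ps1 name from the error above as an example):

```powershell
# Scan a script for non-ASCII dashes/quotes that break the parser (a sketch).
$scriptPath = '.\InstallClient.ps1'   # example path
if (Test-Path $scriptPath) {
    # \u2013/\u2014 are en/em dashes; \u2018-\u201D are curly quotes
    Select-String -Path $scriptPath -Pattern '[\u2013\u2014\u2018\u2019\u201C\u201D]' |
        ForEach-Object { "Line $($_.LineNumber): $($_.Line)" }
}
```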

Powershell script to list Zip files in a Directory with their Size

I have come across a way to list zip contents in a directory. I would like to include the file size information.
[Reflection.Assembly]::LoadWithPartialName('System.IO.Compression.FileSystem')
foreach($sourceFile in (Get-ChildItem -filter '*.zip'))
{
    [IO.Compression.ZipFile]::OpenRead($sourceFile.FullName).Entries.FullName |
        %{ "$sourcefile`:$_" }
}
Could someone help me add the size information? The current display format is:
zipName:zipContentName
I would like to keep formatting as:
zipName:zipContentName:zipContentSize
Rather than looping through the FullNames, loop through the entries themselves:
[Reflection.Assembly]::LoadWithPartialName('System.IO.Compression.FileSystem')
foreach($sourceFile in (Get-ChildItem -filter '*.zip'))
{
    [IO.Compression.ZipFile]::OpenRead($sourceFile.FullName).Entries |
        %{ "$sourcefile`:$($_.FullName):$($_.Length)"; }
}
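One caveat with OpenRead: each archive stays open until it is disposed, so the .zip files remain locked for the rest of the session. A variant of the answer above that releases each handle (a sketch):

```powershell
# Same listing, but disposing each archive so its file handle is released.
Add-Type -AssemblyName System.IO.Compression.FileSystem
foreach ($sourceFile in (Get-ChildItem -Filter '*.zip')) {
    $zip = [IO.Compression.ZipFile]::OpenRead($sourceFile.FullName)
    try {
        $zip.Entries | ForEach-Object {
            "$($sourceFile.Name):$($_.FullName):$($_.Length)"
        }
    }
    finally {
        $zip.Dispose()   # closes the underlying file stream
    }
}
```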