Draw.io - how to export all tabs to images using command line

I have the draw.io app installed on my PC. I want to export all tabs with drawings to separate files. The only option I have found is:
"c:\Program Files\draw.io\draw.io.exe" --crop -x -f jpg c:\Users\user-name\Documents\_xxx_\my-file.drawio
Help for draw.io
Usage: draw.io [options] [input file/folder]
Options:
(...)
  -x, --export                        export the input file/folder based on the given options
  -r, --recursive                     for a folder input, recursively convert all files in sub-folders also
  -o, --output <output file/folder>   specify the output file/folder. If omitted, the input file name is used for output with the specified format as extension
  -f, --format <format>               if output file name extension is specified, this option is ignored (file type is determined from output extension; possible export formats are pdf, png, jpg, svg, vsdx, and xml) (default: "pdf")
  -a, --all-pages                     export all pages (for PDF format only)
  -p, --page-index <pageIndex>        selects a specific page; if not specified and the format is an image, the first page is selected (default: 0)
  -g, --page-range <from>..<to>       selects a page range (for PDF format only)
(...)
so exporting all pages to separate images is not supported (--all-pages works for PDF only). I can use one of these:
  -p, --page-index <pageIndex>        selects a specific page; if not specified and the format is an image, the first page is selected
  -g, --page-range <from>..<to>       selects a page range (for PDF format only)
but how do I get the page range or the number of pages so I can select a page index?

There is no easy way to find the number of pages out of the box with Draw.io's CLI options.
One solution would be to export the diagram as XML.
draw.io --export --format xml --uncompressed test-me.drawio
Then count how many <diagram> elements there are. That count should equal the number of pages (I briefly tested this, but I'm not 100% sure the diagram element appears only once per page).
grep -o "<diagram" "test-me.xml" | wc -l
Here is an example of putting it all together in a bash script (I tried this on macOS 10.15):
#!/bin/bash
file=test-me # File name excluding extension
# Export diagram to plain XML
draw.io --export --format xml --uncompressed "$file.drawio"
# Count how many pages based on <diagram element
count=$(grep -o "<diagram" "$file.xml" | wc -l)
# Export each page as a PNG
# Page index is zero-based
for ((i = 0; i < count; i++)); do
  draw.io --export --page-index $i --output "$file-$i.png" "$file.drawio"
done
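On Windows, the same page count can be obtained without grep. A rough PowerShell equivalent (assuming the exported XML file sits next to the .drawio file, as above) would be:
# count the <diagram> elements in the exported XML (one per page)
$count = (Select-String -Path "test-me.xml" -Pattern "<diagram" -AllMatches).Matches.Count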

The OP asked the question with reference to the Windows version, so here's a PowerShell solution inspired by eddiegroves:
$DIR_DRAWIO = "."
$DrawIoFiles = Get-ChildItem $DIR_DRAWIO *.drawio -File
foreach ($file in $DrawIoFiles) {
    "File: '$($file.FullName)'"
    $xml_file = "$($file.DirectoryName)/$($file.BaseName).xml"
    if ((Test-Path $xml_file)) {
        Remove-Item -Path $xml_file -Force
    }
    # export to XML
    & "C:/Program Files/draw.io/draw.io.exe" '--export' '--format' 'xml' $file.FullName
    # wait for XML file creation
    while ($true) {
        if (-not (Test-Path $xml_file)) {
            Start-Sleep -Milliseconds 200
        }
        else {
            break
        }
    }
    # load into an XML document (cast text array to object)
    $drawio_xml = [xml](Get-Content $xml_file)
    # export each page as a PNG
    for ($i = 0; $i -lt $drawio_xml.mxfile.pages; $i++) {
        $file_out = "$($file.DirectoryName)/$($file.BaseName)$($i + 1).png"
        & "C:/Program Files/draw.io/draw.io.exe" '--export' '--border' '10' '--page-index' $i '--output' $file_out $file.FullName
    }
    # wait for the last PNG image file
    while ($true) {
        if (-not (Test-Path "$($file.DirectoryName)/$($file.BaseName)$($drawio_xml.mxfile.pages).png")) {
            Start-Sleep -Milliseconds 200
        }
        else {
            break
        }
    }
    # remove/delete the XML file
    if ((Test-Path $xml_file)) {
        Remove-Item -Path $xml_file -Force
    }
    # export 'vsdx' & 'pdf'
    & "C:/Program Files/draw.io/draw.io.exe" '--export' '--format' 'vsdx' $file.FullName
    Start-Sleep -Milliseconds 1000
    & "C:/Program Files/draw.io/draw.io.exe" '--export' '--format' 'pdf' $file.FullName
}
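To run the script, you could save it as, say, Export-DrawioPages.ps1 (the name is just an example) in the folder containing the .drawio files and start it from a PowerShell prompt:
# run from the folder with the .drawio files; relax the execution policy for this call only
powershell.exe -ExecutionPolicy Bypass -File .\Export-DrawioPages.ps1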

Related

Download podcast where the file name (date) changes weekly [duplicate]

I want to be able to download an .mp3 podcast that comes out weekly. I've got a working script which downloads the file; the problem is that the file name changes weekly (the date is in the file name):
news-2018-12-09.mp3,
news-2018-12-16.mp3,
news-2018-12-23.mp3.
This is the code I have:
# Start IE and navigate to your download file/location
$ie = New-Object -Com internetExplorer.Application
$ie.Navigate("<address>2018-12-09.mp3")
# Wait for Download Dialog box to pop up
Sleep 5
while ($ie.Busy) {Sleep 1}
# Hit "S" on the keyboard to hit the "Save" button on the download box
$obj = New-Object -Com WScript.Shell
$obj.AppActivate('Internet Explorer')
$obj.SendKeys('s')
# Hit "Enter" to save the file
$obj.SendKeys('{Enter}')
# Closes IE Downloads window
$obj.SendKeys('{TAB}')
$obj.SendKeys('{TAB}')
$obj.SendKeys('{TAB}')
$obj.SendKeys('{Enter}')
Is there a particular regex sequence that would check the file name, download the current one, and possibly, when saving the file, save it just as news-current.mp3?
If you know the format of the file name is always news- + current date + .mp3, it can't be hard to construct that.
Something like $fileName = 'news-{0}.mp3' -f (Get-Date -Format 'yyyy-MM-dd') would do it.
As for downloading the file, I think there are better ways of doing this than using the Internet Explorer COM object.
$address = '<PUT THE URL FOR THE DOWNLOAD IN HERE>'
$fileName = 'news-{0}.mp3' -f (Get-Date -Format 'yyyy-MM-dd')
$downloadUrl = '{0}/{1}' -f $address, $fileName
$outputFile = Join-Path -Path $PSScriptRoot -ChildPath 'news-current.mp3'
.NET WebClient.
(New-Object System.Net.WebClient).DownloadFile($downloadUrl, $outputFile)
Invoke-WebRequest.
Invoke-WebRequest -Uri $downloadUrl -OutFile $outputFile
Start-BitsTransfer.
Import-Module BitsTransfer
Start-BitsTransfer -Source $downloadUrl -Destination $outputFile

Parsing files to populate placeholders with values

Background Info
I have a JS library which consists of many constructor functions. I am using grunt-concat & -uglify to compile these into a single file.
Each constructor has a readme.md file.
The library is used to create advertising banners. It is used by around 10 developers who create their own advertising templates, which use the library, within a Templates folder. The templates are .xml files which also provide a CDATA section where the developers can insert their JavaScript code.
Question
I would like to populate the readme files with a counter, so that the developers can see how popular a particular constructor is directly in its documentation.
Number of occurrences (<% occurrences %>)
What I've already done
I can get the number of occurrences by executing
find . -name "*.xml" -exec grep -e "new\Foo\.Bar" {} \; | wc -l
It would be great if I could grab this value and insert it into the readme file.
grunt.registerTask('count_occurrences', '', function () {
    var exec = require('child_process').execSync;
    var result = exec('find . -name "*.xml" -exec grep -e "new\\Foo\\.Bar" {} \\; | wc -l', { encoding: 'utf8' });
    grunt.log.writeln(result);
    // Now write the result to your README file (note: grunt.file.write overwrites the file)
    grunt.file.write("README.md", result);
});
or
You can use the grunt plugin called exec (https://github.com/jharding/grunt-exec) to execute command-line programs, e.g. find.
You probably would want something like this in your GruntFile.js:
exec: {
    count_occurrences: {
        cmd: function() {
            return 'find . -name "*.xml" -exec grep -e "new\\Foo\\.Bar" {} \\; | wc -l';
        }
    }
}
Then call grunt exec:count_occurrences

How to get a .txt file of all folder names inside a folder, but not the files/subfolders inside the child folders, using PowerShell?

Currently I'm using SHIFT + Right Click > Open PowerShell window here, then pasting in
dir -recurse | select -ExpandProperty Name | foreach {$_.split('.',2)[0]} | out-file file.txt
The only problem is that I choose a directory to SHIFT + Right-click, but I also get all the names of the files/folders inside the child folders, and it really ruins the organization I'm going for.
So for example I have a folder called "RootFolder".
Inside RootFolder are 10 other folders called "Folder1" through "Folder10".
I only want the names Folder1 - Folder10 to be in the .txt file the shell command creates. I do not want subfolders/files inside Folder1-10 in the .txt file.
If you want to list only directories, you need to tell it that:
Dir -R | ? { $_.PSIsContainer }
or
Dir -R | ? { $_ -is [System.IO.DirectoryInfo] }
In more recent PowerShell versions you can do it directly.
Dir -Dir
Dir = Get-ChildItem
-R = -Recurse
? = Where-Object
-Dir = -Directory
If you only want directories within the named directory, you don't want to recurse. That will travel the entire directory hierarchy; just use the -Directory switch to filter the output to only include directories.
Then, you can just extract the Name property from the DirectoryInfo objects.
PS C:\> dir C:\Windows -Directory |% { $_.Name }
addins
appcompat
apppatch
AppReadiness
[...]
and output as desired. Even more concisely:
(dir C:\Windows -Directory).Name
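Tying that back to the original goal (only the names of the folders directly inside RootFolder, written to a text file; the paths here are just examples):
# write only the immediate sub-folder names of RootFolder to a text file
(dir 'C:\RootFolder' -Directory).Name | Out-File 'C:\RootFolder\folders.txt'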

Using powershell to download an embedded video

I need to download a monthly broadcast automatically (I will set up a scheduled task) using PowerShell.
Here is the embedded URL: https://www.jw.org/download/?fileformat=MP4&output=html&pub=jwb&issue=201601&option=TRGCHlZRQVNYVrXF&txtCMSLang=E
The only thing that changes each month is the issue number (201602, 201603, etc.). Once I am able to pull the 720p video file, I will work on programmatically building that part of the URL based on the current system clock (I can manage this).
I have tried these without success:
Attempt 1:
$source = "https://www.jw.org/download/?fileformat=MP4&output=html&pub=jwb&issue=201601&option=TRGCHlZRQVNYVrXF&txtCMSLang=E"
$destination = "c:\broadcasts\test.mp4"
Invoke-WebRequest $source -OutFile $destination
Attempt 2:
$source = "https://www.jw.org/download/?fileformat=MP4&output=html&pub=jwb&issue=201601&option=TRGCHlZRQVNYVrXF&txtCMSLang=E"
$dest = "c:\broadcasts\test.mp4"
$wc = New-Object System.Net.WebClient
$wc.DownloadFile($source, $dest)
Attempt 3:
Import-Module BitsTransfer
$url = "https://www.jw.org/download/?fileformat=MP4&output=html&pub=jwb&issue=201601&option=TRGCHlZRQVNYVrXF&txtCMSLang=E"
$output = "c:\broadcasts\test.mp4"
Start-BitsTransfer -Source $url -Destination $output
All of these end up with a test.mp4 that is basically just an empty file.
Then I found another page that holds the video (and the download links for the different qualities) and tried to pull these links using the following (I know I could have used $webpage.links):
Attempt 4:
$webpage = Invoke-WebRequest "http://tv.jw.org/#en/video/VODStudio/pub-jwb_201601_1_VIDEO"
$webpage.RawContent | Out-File "c:\scripts\webpage.txt" ASCII -Width 9999
I found that the raw content doesn't have the .mp4 link visible. My idea was to pull the raw content, parse it with a regex to grab the 720p URL, save it in a variable, and then feed that to a Start-BitsTransfer call.
Please help?
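For reference, here is a rough sketch of what I have in mind (untested; it assumes the raw HTML, or whatever JSON/API endpoint actually serves the video list, exposes a direct 720p .mp4 URL, and the regex pattern is just a guess at what that URL might look like):
# grab the first URL containing 720P and ending in .mp4, then hand it to BITS
$page = Invoke-WebRequest "http://tv.jw.org/#en/video/VODStudio/pub-jwb_201601_1_VIDEO"
if ($page.RawContent -match 'https?://[^"'' ]*720P[^"'' ]*\.mp4') {
    $videoUrl = $Matches[0]
    Import-Module BitsTransfer
    Start-BitsTransfer -Source $videoUrl -Destination "c:\broadcasts\test.mp4"
}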

VisualSVN post-commit hook with batch file

I'm running VisualSVN on a Windows server.
I'm trying to add a post-commit hook to update our staging project whenever a commit happens.
In VisualSVN, if I type the command in the hook/post-commit dialog, everything works great.
However, if I make a batch file with the exact same command, I get an error that says the post-commit hook has failed. There is no additional information.
My command uses absolute paths.
I've tried putting the batch file in the VisualSVN/bin directory; I get the same error there.
I've made sure VisualSVN has permissions for the directories where the batch file is.
The only thing I can think of is that I'm not calling it correctly from VisualSVN. I'm just replacing the svn update command in the hook/post-commit dialog with the batch file name ("c:\VisualSVN\bin\my-batch-file.bat"). I've tried it with and without the path (without the path it doesn't find the file at all).
Do I need to use a different syntax in the post-commit dialog to call the batch file? What about within the batch file? (It just has my svn update command, and it works if I run the batch file from the command line.)
Ultimately I want to use a batch file because I want to do a few more things after the commit.
In VisualSVN Server: select the repo > Properties > Hooks > Post-commit hook.
Here is the code I use for sending an email and then running a script, which has the commands I want to customize:
"%VISUALSVN_SERVER%\bin\VisualSVNServerHooks.exe" ^
commit-notification "%1" -r %2 ^
--from support@domainname.com --to "support@domainname.com" ^
--smtp-server mail.domainname.com ^
--no-diffs ^
--detailed-subject ^
--no-html
set PWSH=%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe
%PWSH% -command $input ^| C:\ServerScripts\SVNScripts\post-commit-wp.ps1 %1 %2
if errorlevel 1 exit %errorlevel%
The script file, post-commit-wp.ps1, is located in C:\ServerScripts\SVNScripts\, and I pass in two VisualSVN arguments as %1 and %2:
%1 = serverpathwithrep (the server path with the repository)
%2 = revision number
The script file is written in Windows PowerShell:
# PATH TO SVN.EXE
$svn = "C:\Program Files\VisualSVN Server\bin\svn.exe"
$pathtowebistesWP = "c:\websites-wp\"
# STORE HOOK ARGUMENTS INTO FRIENDLY NAMES
$serverpathwithrep = $args[0]
$revision = $args[1]
# GET DIR NAME ONLY FROM REPO-PATH STRING
# EXAMPLE: C:\REPOSITORIES\DEVHOOKTEST
# RETURNS 'DEVHOOKTEST'
$dirname = ($serverpathwithrep -split '\\')[-1]
# Combine ServerPath with Dir name
$exportpath = -join($pathtowebistesWP, $dirname);
# BUILD URL TO REPOSITORY
$urepos = $serverpathwithrep -replace "\\", "/"
$url = "file:///$urepos/"
# --------------------------------
# SOME TESTING SCRIPTS
# --------------------------------
# STRING BUILDER PATH + DIRNAME
$name = -join($pathtowebistesWP, "testscript.txt");
# CREATE FILE ON SERVER
New-Item $name -ItemType file
# APPEND TEXT TO FILE
Add-Content $name $pathtowebistesWP
Add-Content $name $exportpath
# --------------------------------
# DO EXPORT REPOSITORY REVISION $REVISION TO THE ExportPath
&"$svn" export -r $revision --force "$url" $exportpath
I added comments to explain each line and what it does. In a nutshell, the script:
Gets the hook arguments
Builds a local directory path
Runs svn export
Places the files in a website/publish directory.
It's a simple way of deploying your newly committed code to a website.
Did you try to execute the batch file using the 'call' command? I mean:
call C:\Script\myscript.bat
I was trying the same thing and found that you also must have the script in the hooks folder (the .bat file, that is).