Add publish directory to ccnet header.xml - build

I have a CruiseControl.net build that compiles all the binaries, creates an installation and publishes the install files and log files to a server location.
The final directory name is dynamic: it includes a YYYYMMDD_HH_MM_SS timestamp in the path.
Example: <server>\Path\2-Tuesday\MyBuild_2014_08_06_07_23_15
I include the publisher event to send emails to our development and QA teams. In this email I would like to include the publish path for the build to make it easier for users to find the build.
I believe I want to modify the header.xsl file in /server/xsl/.
However, I am not certain how to include the path.
My publishing script is a PowerShell script. Below is a code snippet:
$dOfWeek = (Get-Date).dayofweek.toString()
$date = Get-Date
$n = [int]$date.dayofweek
$dest = Join-Path -Path $publishDir.value -ChildPath "$n-$dOfWeek"
$day = Get-Date -Format yyyyMMdd
$time = Get-Date -Format HH_mm_ss
$pubFolder="Bld" + $day + "_" + $time
$publishPath=Join-Path -Path $dest -ChildPath $pubFolder
Note that $publishDir is a parameter passed to the function that formats this.
How do I set this up so that I notify CCNet of this path, and how do I incorporate the value into header.xsl?
Thank you.
Sincerely,
Daniel Lee

Use a file merge task to "notify" CCNet of your custom information. The merged file's contents will show up in the CCNet XML build log. See:
File Merge
Then edit header.xsl or compile.xsl to transform the new XML into HTML so it shows up in the build emails.
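A minimal sketch of how the pieces could fit together (the file name buildinfo.xml, the element names, and the temp path are assumptions, not taken from your project): have the PowerShell script write the path into a small XML file, merge that file into the build log, then pull the element out in header.xsl.

```powershell
# In the publish script, after $publishPath is computed, write a small
# XML fragment for CCNet to merge into the build log.
"<buildInfo><publishPath>$publishPath</publishPath></buildInfo>" |
    Out-File -FilePath 'C:\Temp\buildinfo.xml' -Encoding utf8
```

```xml
<!-- In ccnet.config, inside the project's <publishers> section and
     before the <xmllogger />, merge the fragment into the build log. -->
<merge>
  <files>
    <file>C:\Temp\buildinfo.xml</file>
  </files>
</merge>
```

```xml
<!-- In header.xsl, inside the existing header table, select the merged
     element. The // XPath avoids assuming exactly where CCNet nests
     merged content in the log. -->
<tr>
  <td>Publish path:</td>
  <td><xsl:value-of select="//buildInfo/publishPath"/></td>
</tr>
```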

Related

powershell WebClient parse links or regex for zip file

I have a script that nearly works, but I need to add either parsing or a wildcard to the URL, because the URL changes to characters I won't know each month. I have to use New-Object System.Net.WebClient because Invoke-WebRequest is blocked. So I was wondering if anyone knows how to download the link using the characters I will know and strip off the rest of the link.
Examples of the links below
Full-CSV-data-file-Jan19-ZIP-3633K-53821.zip
Full-CSV-data-file-Dec18-ZIP-3427K.zip
Full-CSV-data-file-Nov18-ZIP-3543K-21860.zip
From the above links I know the latest file will have Jan19 in it and that it's a zip file. The script I am using is:
$currentMonthNo = get-date -format "MM"
$currentMonthName = (get-date((get-date).addmonths(-2)) -format MMM)
$currentYearNo = get-date -format yy
$url = "http://www.website.com/Full-CSV-data-file-$currentMonthName$currentYearNo-ZIP-3633K-53821.zip"
$output = "C:\Folder\Full-CSV-data-file-Jan19-ZIP-3633K-53821.zip"
$start_time = Get-Date
$wc = New-Object System.Net.WebClient
$wc.DownloadFile($url, $output)
#OR
(New-Object System.Net.WebClient).DownloadFile($url, $output)
Write-Output "Time taken: $((Get-Date).Subtract($start_time).Seconds) second(s)"
Write-Output $url
Start-Sleep -s 6
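WebClient cannot expand a wildcard in a URL, but it can fetch the page that lists the links, which can then be matched with a regex built from the parts you do know. A rough sketch (the listing-page URL is a placeholder, and the pattern is an assumption about how the file names appear in the HTML):

```powershell
# Build the month tag the file name is known to contain, e.g. 'Jan19'.
$monthTag = (Get-Date).AddMonths(-2).ToString('MMMyy')

$wc   = New-Object System.Net.WebClient
# Placeholder URL: the page whose HTML contains the download link.
$html = $wc.DownloadString('http://www.website.com/downloads/')

# Match the full file name for the known month; the size/ID suffix varies.
$pattern = "Full-CSV-data-file-$monthTag-ZIP-[\w-]+\.zip"
$match   = [regex]::Match($html, $pattern)
if ($match.Success) {
    $file = $match.Value
    $wc.DownloadFile("http://www.website.com/$file", "C:\Folder\$file")
}
```

This keeps everything on WebClient (DownloadString plus DownloadFile), so it should work even where Invoke-WebRequest is blocked.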

Download podcast where the file name (date) changes weekly [duplicate]

This question already has an answer here:
WebClient downloadfile
(1 answer)
Closed 4 years ago.
I want to be able to download a .MP3 podcast that comes out weekly. I've got a working script which downloads the file; the problem is that the file name changes weekly (the date is in the file name):
news-2018-12-09.mp3,
news-2018-12-16.mp3,
news-2018-12-23.mp3.
This is the code I have:
# Start IE and navigate to your download file/location
$ie = New-Object -Com internetExplorer.Application
$ie.Navigate("<address>2018-12-09.mp3")
# Wait for Download Dialog box to pop up
Sleep 5
while ($ie.Busy) {Sleep 1}
# Hit "S" on the keyboard to hit the "Save" button on the download box
$obj = New-Object -Com WScript.Shell
$obj.AppActivate('Internet Explorer')
$obj.SendKeys('s')
# Hit "Enter" to save the file
$obj.SendKeys('{Enter}')
# Closes IE Downloads window
$obj.SendKeys('{TAB}')
$obj.SendKeys('{TAB}')
$obj.SendKeys('{TAB}')
$obj.SendKeys('{Enter}')
Is there a particular regex sequence that would check the file name, download the current one, and possibly, when saving the file, save it just as news-current.mp3?
If you know the format of the file name is always news- + current date + .mp3, it isn't hard to construct it.
Something like $fileName = 'news-{0}.mp3' -f (Get-Date -Format 'yyyy-MM-dd') would do it.
As for the way to download the file, I think there are better approaches than using the Internet Explorer COM object:
$address = '<PUT THE URL FOR THE DOWNLOAD IN HERE>'
$fileName = 'news-{0}.mp3' -f (Get-Date -Format 'yyyy-MM-dd')
$downloadUrl = '{0}/{1}' -f $address, $fileName
$outputFile = Join-Path -Path $PSScriptRoot -ChildPath 'news-current.mp3'
.NET WebClient:
(New-Object System.Net.WebClient).DownloadFile($downloadUrl, $outputFile)
Invoke-WebRequest:
Invoke-WebRequest -Uri $downloadUrl -OutFile $outputFile
Start-BitsTransfer:
Import-Module BitsTransfer
Start-BitsTransfer -Source $downloadUrl -Destination $outputFile

Using powershell to download an embedded video

I need to download a monthly broadcast automatically (will set a scheduled task) using powershell.
Here is the embedded URL: https://www.jw.org/download/?fileformat=MP4&output=html&pub=jwb&issue=201601&option=TRGCHlZRQVNYVrXF&txtCMSLang=E
The only thing that changes each month is the issue number: 201601, 201602, 201603, etc. Once I am able to pull the 720p video file, I will work on programmatically building that part of the URL based on the current system clock (I can manage this).
I have tried these without success:
Attempt 1:
$source = "https://www.jw.org/download/?fileformat=MP4&output=html&pub=jwb&issue=201601&option=TRGCHlZRQVNYVrXF&txtCMSLang=E"
$destination = "c:\broadcasts\test.mp4"
Invoke-WebRequest $source -OutFile $destination
Attempt 2:
$source = "https://www.jw.org/download/?fileformat=MP4&output=html&pub=jwb&issue=201601&option=TRGCHlZRQVNYVrXF&txtCMSLang=E"
$dest = "c:\broadcasts\test.mp4"
$wc = New-Object System.Net.WebClient
$wc.DownloadFile($source, $dest)
Attempt 3:
Import-Module BitsTransfer
$url = "https://www.jw.org/download/?fileformat=MP4&output=html&pub=jwb&issue=201601&option=TRGCHlZRQVNYVrXF&txtCMSLang=E"
$output = "c:\broadcasts\test.mp4"
Start-BitsTransfer -Source $url -Destination $output
All of these end up with a test.mp4 that is basically just an empty file.
Then I found another page that holds the video (and the download links for the different qualities) and tried to pull these links using the following (I know I could have used $webpage.links):
Attempt 4:
$webpage=Invoke-WebRequest "http://tv.jw.org/#en/video/VODStudio/pub-jwb_201601_1_VIDEO"
$webpage.RawContent | Out-File "c:\scripts\webpage.txt" ASCII -Width 9999
And found that the raw content doesn't have the mp4 visible. My idea was to pull the raw content, parse it with regex and grab the 720p URL, save it in a variable and then send that to a BitsTransfer bit of code.
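The plan described above (fetch the page, regex out the 720p URL, hand it to BITS) could be sketched like this; the regex pattern is an assumption about how the link appears in the markup and would need adjusting to the real page source:

```powershell
Import-Module BitsTransfer

# Fetch the page source and look for the first 720p .mp4 URL in it.
# The pattern is a guess at the markup, not taken from the actual site.
$page  = Invoke-WebRequest "http://tv.jw.org/#en/video/VODStudio/pub-jwb_201601_1_VIDEO"
$match = [regex]::Match($page.RawContent, 'https?://[^"'']+720P[^"'']*\.mp4')

if ($match.Success) {
    Start-BitsTransfer -Source $match.Value -Destination 'c:\broadcasts\test.mp4'
}
```

One caveat, consistent with what you observed: if the page builds its content with JavaScript, the raw HTML will not contain the .mp4 link at all, and you would need to find whatever JSON/media endpoint the page itself calls instead.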
Please help?

Backup List in Sharepoint 2010 in Shell

I want to backup a List of Sharepoint 2010 using the powershell.
I can backup the list using the central Administration and can also backup the whole Site using
Export-SPWeb -Identity http://siteurl:22222/en-us -Path \\public\backup.cmp
But when I try to export a specific List (with the path that is also shown using the Central Administration):
Export-SPWeb -Identity http://siteurl:22222/en-us/Lists/MyList -Path \\public\backup.cmp
I receive the error:
"The URL provided is invalid. Only valid URLs
that are site collections or sites are allowed to be exported using
stsadm.exe"
I also tried
Export-SPWeb -Identity http://siteurl:22222/en-us -Path \\public\backup.cmp -ItemURL http://siteurl:22222/en-us/Lists/MyList
getting the same error
Thanks in advance
Try to fiddle with the ItemUrl parameter value:
Export-SPWeb -Identity http://siteurl:22222/en-us -Path \\public\backup.cmp
-ItemUrl /Lists/MyList
or
Export-SPWeb -Identity http://siteurl:22222/en-us -Path \\public\backup.cmp
-ItemUrl /en-us/Lists/MyList
or
Export-SPWeb -Identity http://siteurl:22222/en-us -Path \\public\backup.cmp
-ItemUrl "/Lists/MyList"
Different sources show different syntax:
SharePoint 2010 Granular Backup-Restore Part 1
Failing to export with Export-SPWeb
If you have the following kind of setup:
Site Collection e.g. http://localhost:81
|
|-> Subsite 1 e.g. tools (http://localhost:81/tools)
|
|-> Subsite 2 e.g. admin (http://localhost:81/tools/admin)
I found the following worked for lists on the subsite:
Export-SPWeb -Identity http://<site>:<port>/<subsite1>/<subsite2> -ItemUrl /<subsite1>/<subsite2>/<listName> -Path <localpath>/<filename>.cmp -IncludeVersions All
e.g.
Export-SPWeb -Identity http://localhost:81/tools/admin/ -ItemUrl /tools/admin/RequestList -Path C:/Temp/Backup.cmp -IncludeVersions All
To ensure you've got the right url for your list, use the following command (thanks to HAZET here: http://social.technet.microsoft.com/Forums/en-US/sharepoint2010setup/thread/a1f48e70-9360-440f-b160-525fbf2b8412/):
$(Get-SPWeb -identity http://<site>:<port>/<subsite1>/<subsite2>).lists | ft title, @{Name="itemURL"; Expression = { $_.parentWebURL + "/" + $_.RootFolder}}
e.g.
$(Get-SPWeb -identity http://localhost:81/tools/admin/).lists | ft title, @{Name="itemURL"; Expression = { $_.parentWebURL + "/" + $_.RootFolder}}
Some examples of various errors I encountered whilst trying to get this to work:
The URL provided is invalid
Export-SPWeb : <nativehr>0x80070057</nativehr><nativestack></nativestack> At line:1 char:13
CategoryInfo : InvalidData: (Microsoft.Share...CmdletExportWeb: SPCmdletExportWeb) [Export-SPWeb], SPException FullyQualifiedErrorId : Microsoft.SharePoint.PowerShell.SPCmdletExportWeb
Some things to check:
Check that -Identity has the trailing slash i.e. http://localhost:81/
Check that you have the complete URL in Identity (if using subsites, include subsites)
Check that the path where you're trying to store your export file exists
Check that your ItemUrl is correct (i.e. starts with / and is a directory, not a specific file: e.g. /tools/admin/RequestsList, not /tools/admin/RequestsList/AllItems.aspx)
Check that you have permissions to perform the export
Further info that might be helpful:
Identity: The URL of your SharePoint site
ItemUrl: The relative URL of your list/document library
Path: The target filename and location for the exported list e.g. C:/Temp/backup.cmp
IncludeVersions: Which versions of the document you wish to export.
Export-SPWeb
http://technet.microsoft.com/en-us/library/ff607895.aspx
Export a site, list or document library in SharePoint 2010
http://technet.microsoft.com/en-us/library/ee428301.aspx
Import a list or document library in SharePoint 2010
http://technet.microsoft.com/en-us/library/ee428322.aspx

VisualSVN post-commit hook with batch file

I'm running VisualSVN on a Windows server.
I'm trying to add a post-commit hook to update our staging project whenever a commit happens.
In VisualSVN, if I type the command in the hook/post-commit dialog, everything works great.
However, if I make a batch file with the exact same command, I get an error that says the post-commit hook has failed. There is no additional information.
My command uses absolute paths.
I've tried putting the batch file in the VisualSVN/bin directory, I get the same error there.
I've made sure VisualSVN has permissions for the directories where the batch file is.
The only thing I can think of is I'm not calling it correctly from VisualSVN. I'm just replacing the svn update command in the hook/post-commit dialog with the batch file name ("c:\VisualSVN\bin\my-batch-file.bat") I've tried it with and without the path (without the path it doesn't find the file at all).
Do I need to use a different syntax in the SVNCommit dialog to call the batch file? What about within the batch file (It just has my svn update command. It works if I run the batch file from the command line.)
Ultimately I want to use a batch file because I want to do a few more things after the commit.
In VisualSVN Server, select the repo > Properties > Hooks > Post-commit hook.
Here is the code I use for sending an email and then running a script with the commands I want to customize:
"%VISUALSVN_SERVER%\bin\VisualSVNServerHooks.exe" ^
commit-notification "%1" -r %2 ^
--from "support@domainname.com" --to "support@domainname.com" ^
--smtp-server mail.domainname.com ^
--no-diffs ^
--detailed-subject ^
--no-html
set PWSH=%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe
%PWSH% -command $input ^| C:\ServerScripts\SVNScripts\post-commit-wp.ps1 %1 %2
if errorlevel 1 exit %errorlevel%
The script file is located at C:\ServerScripts\SVNScripts\post-commit-wp.ps1, and I pass in two VisualSVN arguments as %1 and %2:
%1 = server path to the repository
%2 = revision number
The script file is written in Windows PowerShell:
# PATH TO SVN.EXE
$svn = "C:\Program Files\VisualSVN Server\bin\svn.exe"
$pathtowebistesWP = "c:\websites-wp\"
# STORE HOOK ARGUMENTS INTO FRIENDLY NAMES
$serverpathwithrep = $args[0]
$revision = $args[1]
# GET DIR NAME ONLY FROM REPO-PATH STRING
# EXAMPLE: C:\REPOSITORIES\DEVHOOKTEST
# RETURNS 'DEVHOOKTEST'
$dirname = ($serverpathwithrep -split '\\')[-1]
# Combine ServerPath with Dir name
$exportpath = -join($pathtowebistesWP, $dirname);
# BUILD URL TO REPOSITORY
$urepos = $serverpathwithrep -replace "\\", "/"
$url = "file:///$urepos/"
# --------------------------------
# SOME TESTING SCRIPTS
# --------------------------------
# STRING BUILDER PATH + DIRNAME
$name = -join($pathtowebistesWP, "testscript.txt");
# CREATE FILE ON SERVER
New-Item $name -ItemType file
# APPEND TEXT TO FILE
Add-Content $name $pathtowebistesWP
Add-Content $name $exportpath
# --------------------------------
# DO EXPORT REPOSITORY REVISION $REVISION TO THE ExportPath
&"$svn" export -r $revision --force "$url" $exportpath
I added comments to explain what each line does. In a nutshell, the script:
Gets all the parameters
Builds a local dir path
Runs SVN export
Places the files in a website/publish directory
It's a simple way of deploying your newly committed code to a website.
Did you try executing the batch file using the 'call' command? I mean:
call C:\Script\myscript.bat
I was trying the same thing and found that you also must have the script in the hooks folder; the .bat file, that is.
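Putting the suggestions above together, the post-commit hook body could look something like this (the repository path is a placeholder; the batch file sits in the repository's hooks folder and receives the repo path and revision as %1 and %2):

```batch
rem Post-commit hook body in VisualSVN (Repo > Properties > Hooks).
rem VisualSVN passes the repository path as %1 and the revision as %2;
rem forward both to the batch file via 'call' so control returns here.
call "C:\Repositories\MyRepo\hooks\my-batch-file.bat" %1 %2

rem Propagate any failure so VisualSVN reports the hook error.
if errorlevel 1 exit /b %errorlevel%
```

Using call (rather than invoking the .bat directly) matters in batch: without it, control jumps to the child script and never comes back for the error check.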