I want to download an .mp3 podcast that comes out weekly. I've got a working script which downloads the file; the problem is that the file name changes weekly (the date is in the file name):
news-2018-12-09.mp3,
news-2018-12-16.mp3,
news-2018-12-23.mp3.
This is the code I have:
# Start IE and navigate to your download file/location
$ie = New-Object -Com internetExplorer.Application
$ie.Navigate("<address>2018-12-09.mp3")
# Wait for Download Dialog box to pop up
Sleep 5
while ($ie.Busy) {Sleep 1}
# Hit "S" on the keyboard to hit the "Save" button on the download box
$obj = New-Object -Com WScript.Shell
$obj.AppActivate('Internet Explorer')
$obj.SendKeys('s')
# Hit "Enter" to save the file
$obj.SendKeys('{Enter}')
# Closes IE Downloads window
$obj.SendKeys('{TAB}')
$obj.SendKeys('{TAB}')
$obj.SendKeys('{TAB}')
$obj.SendKeys('{Enter}')
Is there a particular regex sequence that would check the file name, download the current one, and, when saving the file, save it simply as news-current.mp3?
If you know the format of the file name is always news- + current date + .mp3, it can't be hard to construct that.
Something like $fileName = 'news-{0}.mp3' -f (Get-Date -Format 'yyyy-MM-dd') would do it.
As for the way to download the file, I think there are better options than using the Internet Explorer COM object.
$address = '<PUT THE URL FOR THE DOWNLOAD IN HERE>'
$fileName = 'news-{0}.mp3' -f (Get-Date -Format 'yyyy-MM-dd')
$downloadUrl = '{0}/{1}' -f $address, $fileName
$outputFile = Join-Path -Path $PSScriptRoot -ChildPath 'news-current.mp3'
.NET WebClient:
(New-Object System.Net.WebClient).DownloadFile($downloadUrl, $outputFile)
Invoke-WebRequest:
Invoke-WebRequest -Uri $downloadUrl -OutFile $outputFile
Start-BitsTransfer:
Import-Module BitsTransfer
Start-BitsTransfer -Source $downloadUrl -Destination $outputFile
Related
We would like to hand over development to another company via Azure DevOps, and we are wondering whether this pipeline can do more than push new releases: could data also be downloaded from the production environment via an Azure DevOps or AWS DevOps pipeline?
I researched this myself but found nothing about it.
Does any of you have more information on this?
Thank you
Is it possible to download files/data during the build pipeline on Azure DevOps?
In Azure DevOps there isn't a built-in task to download files/data, but you can use the PowerShell task to connect to an FTP server and download files.
For detailed information, you can refer to this similar question.
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      #FTP Server Information - SET VARIABLES
      $ftp = "ftp://XXX.com/"
      $user = 'UserName'
      $pass = 'Password'
      $folder = 'FTP_Folder'
      $target = "C:\Folder\Folder1\"

      #SET CREDENTIALS
      $credentials = New-Object System.Net.NetworkCredential($user, $pass)

      function Get-FtpDir ($url, $credentials) {
          $request = [Net.WebRequest]::Create($url)
          $request.Method = [System.Net.WebRequestMethods+FTP]::ListDirectory
          if ($credentials) { $request.Credentials = $credentials }
          $response = $request.GetResponse()
          $reader = New-Object IO.StreamReader $response.GetResponseStream()
          while (-not $reader.EndOfStream) {
              $reader.ReadLine()
          }
          #$reader.ReadToEnd()
          $reader.Close()
          $response.Close()
      }

      #SET FOLDER PATH
      $folderPath = $ftp + "/" + $folder + "/"

      $files = Get-FtpDir -url $folderPath -credentials $credentials
      $files

      $webclient = New-Object System.Net.WebClient
      $webclient.Credentials = New-Object System.Net.NetworkCredential($user, $pass)

      $counter = 0
      foreach ($file in ($files | Where-Object { $_ -like "*.txt" })) {
          $source = $folderPath + $file
          $destination = $target + $file
          $webclient.DownloadFile($source, $destination)

          #PRINT FILE NAME AND COUNTER
          $counter++
          $counter
          $source
      }
The script comes from: PowerShell Connect to FTP server and get files.
You should use artifacts when the data lives inside your "environment".
Otherwise you can use normal command-line tools like git, curl or wget; which ones are available depends on your build agent.
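For example, if curl is available on the agent, a single call is enough to pull a file down (the URL and output name below are placeholders):
# Placeholder URL and file name; curl.exe is present on recent Windows agents,
# and on Linux agents plain curl or wget works the same way.
curl.exe -L -o data.zip "https://example.com/path/to/data.zip"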
I have a script that nearly works, but I need to add either parsing or a wildcard to the URL, because the URL changes each month to characters I won't know. I have to use New-Object System.Net.WebClient because Invoke-WebRequest is blocked. So I was wondering if anyone knows how to download the link using the characters I do know and strip off the rest of the link.
Examples of the links below
Full-CSV-data-file-Jan19-ZIP-3633K-53821.zip
Full-CSV-data-file-Dec18-ZIP-3427K.zip
Full-CSV-data-file-Nov18-ZIP-3543K-21860.zip
So from the above links I know the latest file will have Jan19 in it and that it's a zip file. The script I am using is:
$currentMonthNo = get-date -format "MM"
$currentMonthName = (get-date((get-date).addmonths(-2)) -format MMM)
$currentYearNo = get-date -format yy
$url = "http://www.website.com/Full-CSV-data-file-$currentMonthName$currentYearNo-ZIP-3633K-53821.zip"
$output = "C:\Folder\Full-CSV-data-file-Jan19-ZIP-3633K-53821.zip"
$start_time = Get-Date
$wc = New-Object System.Net.WebClient
$wc.DownloadFile($url, $output)
#OR
(New-Object System.Net.WebClient).DownloadFile($url, $output)
Write-Output "Time taken: $((Get-Date).Subtract($start_time).Seconds) second(s)"
Write-Output $url
Start-Sleep -s 6
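Roughly what I have in mind is something like the sketch below (untested; the page URL is a placeholder): download the page that lists the files, match the part of the name I do know, and feed whatever matches to WebClient.
$monthTag = Get-Date -Format 'MMMyy'   # e.g. 'Jan19'
$pageHtml = (New-Object System.Net.WebClient).DownloadString('http://www.website.com/downloads')
# Match the part of the file name I do know and let the size/ID part be anything
$pattern = "Full-CSV-data-file-${monthTag}-ZIP-[^""']*\.zip"
$match = [regex]::Match($pageHtml, $pattern)
if ($match.Success) {
    $url    = 'http://www.website.com/' + $match.Value
    $output = Join-Path -Path 'C:\Folder' -ChildPath $match.Value
    (New-Object System.Net.WebClient).DownloadFile($url, $output)
}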
I have a script that I can double click and it'll open other scripts as admin. Works with some things but not everything. For one script, it opens the next window and then immediately closes it. For another, I get this error:
At MYPATH\InstallClient.ps1:33 char:78
+ ... tall_x64.msi" -force -recurse -ErrorAction Stop #Cleans out the file ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The string is missing the terminator: ".
At MYPATH\InstallClient.ps1:27 char:31
+ ForEach ($entry in $computers){ #start of foreach loop
+ ~
Missing closing '}' in statement block or type definition.
+ CategoryInfo : ParserError: (:) [], ParentContainsErrorRecordException
+ FullyQualifiedErrorId : TerminatorExpectedAtEndOfString
Below is the script to open a script as an admin:
Function Get-FileName($initialDirectory)
{
[System.Reflection.Assembly]::LoadWithPartialName("System.windows.forms") | Out-Null
$OpenFileDialog = New-Object System.Windows.Forms.OpenFileDialog
$OpenFileDialog.initialDirectory = $initialDirectory
$OpenFileDialog.filter = "PS1 (*.ps1)| *.ps1"
$OpenFileDialog.ShowDialog() | Out-Null
$OpenFileDialog.filename
}
$inputfile = Get-FileName "MYPATH\Scripts"
powershell.exe -noprofile -command "&{start-process powershell -ArgumentList '-NoExit -noprofile -file $inputfile' -verb RunAs}"
This is the script that produces the previous error when it tries to open it:
Function Get-FileName($initialDirectory) #Function to choose a file
{
[System.Reflection.Assembly]::LoadWithPartialName("System.windows.forms") | Out-Null
$OpenFileDialog = New-Object System.Windows.Forms.OpenFileDialog
$OpenFileDialog.initialDirectory = $initialDirectory
$OpenFileDialog.filter = "MSI (*.msi)| *.msi" #type of files that will be available for selection
$OpenFileDialog.ShowDialog() | Out-Null
$OpenFileDialog.filename
}
$inputfile = Get-FileName "MyPath" #Directory that is going to open to select a file from
Function Get-FileName($initialDirectory) #Function to choose a file
{
[System.Reflection.Assembly]::LoadWithPartialName("System.windows.forms") | Out-Null
$OpenFileDialog = New-Object System.Windows.Forms.OpenFileDialog
$OpenFileDialog.initialDirectory = $initialDirectory
$OpenFileDialog.filter = "CSV (*.csv)| *.csv" #type of files that will be available for selection
$OpenFileDialog.ShowDialog() | Out-Null
$OpenFileDialog.filename
}
$inputfile1 = Get-FileName "MyPath\ServerLists"
$computers = import-csv $inputfile1
ForEach ($entry in $computers){ #start of foreach loop
$computername = $entry.computernames #this saves the single entry under computernames for each entry in csv file
Copy-item $inputfile -container -recurse \\$computername\C$\windows\temp #this copies the msi file that we selected to the computer entry called from the csv file's temp folder
Invoke-Command -Computername $computername –ScriptBlock {Start-process -Wait "C:\windows\temp\ShadowSuiteClientInstall_x64.msi"} | out-null #This starts the msi file that we just copied and waits for the installation to be completed before moving on
If($?){ #If the last command was successful
Echo "Installed ShadowSuiteClientInstall_x64 on $computername."
Remove-Item "\\$computername\C$\windows\temp\ShadowSuiteClientInstall_x64.msi" -force -recurse -ErrorAction Stop #Cleans out the file we copied into the temp folder
}
}
Does anyone have any ideas on why this will open some things fine but give this error for this script and immediately close other scripts without running them? Does anyone have a better way to navigate through scripts and select one to open as admin?
OK, I figured this out. I loaded the script into PowerShell ISE and saw that it was being parsed incorrectly: the dash in front of -ScriptBlock kept turning into an 'æ' symbol instead of a regular hyphen. Weird, but OK. I fixed it in ISE, which I recommend to anyone struggling with strange parser errors like this.
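If you want to check a script for stray non-ASCII characters (smart dashes, curly quotes and the like) before running it, a rough sketch like this works; the path is just an example:
# Example path; point it at the script you want to check
$path = 'MYPATH\InstallClient.ps1'
$lineNumber = 0
foreach ($line in Get-Content -Path $path) {
    $lineNumber++
    foreach ($char in $line.ToCharArray()) {
        if ([int]$char -gt 127) {
            # Report the line and the offending character with its Unicode code point
            "Line {0}: non-ASCII character '{1}' (U+{2:X4})" -f $lineNumber, $char, [int]$char
        }
    }
}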
I need to download a monthly broadcast automatically (will set a scheduled task) using powershell.
Here is the embedded URL: https://www.jw.org/download/?fileformat=MP4&output=html&pub=jwb&issue=201601&option=TRGCHlZRQVNYVrXF&txtCMSLang=E
The only thing that changes each month is the 201602, 201603, etc. Once I am able to pull the 720p video file, I will work on programmatically building that part of the URL based on the current system clock (I can manage this).
I have tried these without success:
Attempt 1:
$source = "https://www.jw.org/download/?fileformat=MP4&output=html&pub=jwb&issue=201601&option=TRGCHlZRQVNYVrXF&txtCMSLang=E"
$destination = "c:\broadcasts\test.mp4"
Invoke-WebRequest $source -OutFile $destination
Attempt 2:
$source = "https://www.jw.org/download/?fileformat=MP4&output=html&pub=jwb&issue=201601&option=TRGCHlZRQVNYVrXF&txtCMSLang=E"
$dest = "c:\broadcasts\test.mp4"
$wc = New-Object System.Net.WebClient
$wc.DownloadFile($source, $dest)
Attempt 3:
Import-Module BitsTransfer
$url = "https://www.jw.org/download/?fileformat=MP4&output=html&pub=jwb&issue=201601&option=TRGCHlZRQVNYVrXF&txtCMSLang=E"
$output = "c:\broadcasts\test.mp4"
Start-BitsTransfer -Source $url -Destination $output
All of these end up with a test.mp4 that is basically just an empty file.
Then I found another page that holds the video (and the download links for the different qualities) and tried to pull those links using the following (I know I could have used $webpage.links):
Attempt 4:
$webpage = Invoke-WebRequest "http://tv.jw.org/#en/video/VODStudio/pub-jwb_201601_1_VIDEO"
$webpage.RawContent | Out-File "c:\scripts\webpage.txt" ASCII -Width 9999
I found that the raw content doesn't have the .mp4 visible. My idea was to pull the raw content, parse it with a regex to grab the 720p URL, save it in a variable and then send that to a BitsTransfer bit of code, roughly as sketched below.
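This is the pattern I was hoping to use once the link is actually visible in whatever I download (untested; the regex is only a guess at how a 720p link would look in the page source):
$pageUrl = "http://tv.jw.org/#en/video/VODStudio/pub-jwb_201601_1_VIDEO"
$raw = (Invoke-WebRequest -Uri $pageUrl).RawContent
# Guess: look for an absolute .mp4 URL that mentions 720P somewhere in it
$match = [regex]::Match($raw, 'https?://[^"''\s]+720P[^"''\s]*\.mp4')
if ($match.Success) {
    Import-Module BitsTransfer
    Start-BitsTransfer -Source $match.Value -Destination "c:\broadcasts\test.mp4"
}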
Please help?
I have a CruiseControl.net build that compiles all the binaries, creates an installation and publishes the install files and log files to a server location.
The actual final directory name is dynamic to include the YYYYMMDD_HH_MM_SS in the path name.
Example: <server>\Path\2-Tuesday\MyBuild_2014_08_06_07_23_15
I include the publisher event to send emails to our development and QA teams. In this email I would like to include the publish path for the build to make it easier for users to find the build.
I believe I want to modify the header.xsl file in /server/xsl/.
However, I am not certain how to include the path.
My publishing script is a powershell script. Below is a code snippet
$dOfWeek = (Get-Date).dayofweek.toString()
$date = Get-Date
$n = [int]$date.dayofweek
$dest = Join-Path -Path $publishDir.value -ChildPath "$n-$dOfWeek"
$day = Get-Date -Format yyyyMMdd
$time = Get-Date -Format HH_mm_ss
$pubFolder="Bld" + $day + "_" + $time
$publishPath=Join-Path -Path $dest -ChildPath $pubFolder
Note that $publishDir is a parameter passed to the function that formats this.
How do I set this up so that I notify CCNet of this path, and how do I incorporate the value in header.xsl?
Thank you.
Sincerely,
Daniel Lee
Use a file merge task to "notify" CC of your custom information. The information will show up in the CC xml build log. See:
File Merge
Then edit header.xsl or compile.xsl to transform the new XML into HTML so it shows up in the build emails.
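For example (a rough sketch; the file name and element names are placeholders), the publishing script could write the path out as a small XML fragment, and that file is then listed under the merge publisher's files in ccnet.config so its contents land in the build log:
# Sketch: write the publish path as a tiny XML fragment for CCNet's merge publisher to pick up.
# $dest and $publishPath come from the existing script above; the file and element names are made up.
$publishInfoFile = Join-Path -Path $dest -ChildPath 'publishPath.xml'
"<publishInfo><publishPath>$publishPath</publishPath></publishInfo>" |
    Out-File -FilePath $publishInfoFile -Encoding UTF8
Once the fragment is merged into the build log, header.xsl can pull it out with an XPath expression along the lines of //publishInfo/publishPath.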