Backup a List in SharePoint 2010 using PowerShell

I want to back up a SharePoint 2010 list using PowerShell.
I can back up the list using Central Administration, and I can also back up the whole site using
Export-SPWeb -Identity http://siteurl:22222/en-us -Path \\public\backup.cmp
But when I try to export a specific list (with the path that is also shown in Central Administration):
Export-SPWeb -Identity http://siteurl:22222/en-us/Lists/MyList -Path \\public\backup.cmp
I receive the error:
"The URL provided is invalid. Only valid URLs
that are site collections or sites are allowed to be exported using
stsadm.exe"
I also tried
Export-SPWeb -Identity http://siteurl:22222/en-us -Path \\public\backup.cmp -ItemURL http://siteurl:22222/en-us/Lists/MyList
but got the same error.
Thanks in advance

Try to fiddle with the ItemUrl parameter value:
Export-SPWeb -Identity http://siteurl:22222/en-us -Path \\public\backup.cmp -ItemUrl /Lists/MyList
or
Export-SPWeb -Identity http://siteurl:22222/en-us -Path \\public\backup.cmp -ItemUrl /en-us/Lists/MyList
or
Export-SPWeb -Identity http://siteurl:22222/en-us -Path \\public\backup.cmp -ItemUrl "/Lists/MyList"
Different sources show different syntax:
SharePoint 2010 Granular Backup-Restore Part 1
Failing to export with Export-SPWeb

If you have the following kind of setup:
Site Collection, e.g. http://localhost:81
|
|-> Subsite 1, e.g. tools (http://localhost:81/tools)
    |
    |-> Subsite 2, e.g. admin (http://localhost:81/tools/admin)
I found the following worked for lists on the subsite:
Export-SPWeb -Identity http://<site>:<port>/<subsite1>/<subsite2> -ItemUrl /<subsite1>/<subsite2>/<listName> -Path <localpath>/<filename>.cmp -IncludeVersions All
e.g.
Export-SPWeb -Identity http://localhost:81/tools/admin/ -ItemUrl /tools/admin/RequestList -Path C:/Temp/Backup.cmp -IncludeVersions All
To ensure you've got the right URL for your list, use the following command (thanks to HAZET here: http://social.technet.microsoft.com/Forums/en-US/sharepoint2010setup/thread/a1f48e70-9360-440f-b160-525fbf2b8412/):
$(Get-SPWeb -Identity http://<site>:<port>/<subsite1>/<subsite2>).Lists | ft Title, @{Name="ItemURL"; Expression={ $_.ParentWebUrl + "/" + $_.RootFolder }}
e.g.
$(Get-SPWeb -Identity http://localhost:81/tools/admin/).Lists | ft Title, @{Name="ItemURL"; Expression={ $_.ParentWebUrl + "/" + $_.RootFolder }}
Some examples of various errors I encountered whilst trying to get this to work:
The URL provided is invalid
Export-SPWeb : <nativehr>0x80070057</nativehr><nativestack></nativestack> At line:1 char:13
CategoryInfo : InvalidData: (Microsoft.Share...CmdletExportWeb: SPCmdletExportWeb) [Export-SPWeb], SPException FullyQualifiedErrorId : Microsoft.SharePoint.PowerShell.SPCmdletExportWeb
Some things to check:
Check that -Identity has the trailing slash i.e. http://localhost:81/
Check that you have the complete URL in Identity (if using subsites, include subsites)
Check that the path where you're trying to store your export file exists
Check that your ItemUrl is correct (i.e. starts with / and is a directory, not a specific file; e.g. it is /tools/admin/RequestsList, not /tools/admin/RequestsList/AllItems.aspx)
Check that you have permissions to perform the export
Further info that might be helpful:
Identity: The URL of your SharePoint site
ItemUrl: The relative URL of your list/document library
Path: The target filename and location for the exported list e.g. C:/Temp/backup.cmp
IncludeVersions: Which versions of the documents you wish to export (e.g. All)
Export-SPWeb
http://technet.microsoft.com/en-us/library/ff607895.aspx
Export a site, list or document library in SharePoint 2010
http://technet.microsoft.com/en-us/library/ee428301.aspx
Import a list or document library in SharePoint 2010
http://technet.microsoft.com/en-us/library/ee428322.aspx
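For the restore side, a minimal sketch using Import-SPWeb (the target URL and the -UpdateVersions value are illustrative; see the import link above for details):
# Sketch: import the exported list into a site; Overwrite replaces existing versions.
Import-SPWeb -Identity http://localhost:81/tools/admin -Path C:\Temp\Backup.cmp -UpdateVersions Overwrite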

Scheduling a Powershell script to run weekly in AWS

So, I've got the following PowerShell script to find inactive AD users and disable their accounts, creating a log file that lists which accounts have been disabled:
Import-Module ActiveDirectory
# Set the number of days since last logon
$DaysInactive = 60
$InactiveDate = (Get-Date).AddDays(-$DaysInactive)
# Get AD users that haven't logged on in xx days
$Users = Get-ADUser -Filter { LastLogonDate -lt $InactiveDate -and Enabled -eq $true } -Properties LastLogonDate |
    Select-Object @{ Name = "Username"; Expression = { $_.SamAccountName } }, Name, LastLogonDate, DistinguishedName
# Export results to CSV
$Users | Export-Csv C:\Temp\InactiveUsers.csv -NoTypeInformation
# Disable inactive users
ForEach ($Item in $Users) {
    $DistName = $Item.DistinguishedName
    Disable-ADAccount -Identity $DistName
    Get-ADUser -Filter { DistinguishedName -eq $DistName } |
        Select-Object @{ Name = "Username"; Expression = { $_.SamAccountName } }, Name, Enabled
}
The script works and is doing everything it should. What I am trying to figure out is how to automate this in an AWS environment.
I'm guessing I need to use a Lambda function in AWS to trigger this script to run on a schedule but don't know where to start.
Any help greatly appreciated.
I recommend creating a Lambda function with the PowerShell (dotnet) runtime: https://docs.aws.amazon.com/lambda/latest/dg/lambda-powershell.html
Use a scheduled CloudWatch Events rule to trigger the function:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/RunLambdaSchedule.html
Alternatively, if you'd like a more pipeline-style execution, you could use CodePipeline and CodeBuild to run the script. Again, use CloudWatch to trigger the pipeline on a schedule.
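If you want to script the schedule itself, here is a minimal sketch using the AWS Tools for PowerShell (the rule name, function ARN, and region are illustrative; Write-CWERule and Add-CWETarget wrap the CloudWatch Events PutRule and PutTargets APIs):
# Sketch: create a weekly CloudWatch Events rule and point it at the Lambda function.
# Assumes the AWSPowerShell module is installed and credentials are configured.
Import-Module AWSPowerShell
$lambdaArn = "arn:aws:lambda:us-east-1:123456789012:function:DisableInactiveUsers"   # illustrative ARN
Write-CWERule -Name "weekly-disable-inactive-users" -ScheduleExpression "rate(7 days)" -State ENABLED
$target = New-Object Amazon.CloudWatchEvents.Model.Target
$target.Id = "1"
$target.Arn = $lambdaArn
Add-CWETarget -Rule "weekly-disable-inactive-users" -Target $target
# The function also needs an invoke permission for events.amazonaws.com (see Add-LMPermission).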

Pull server names and IP addresses and see if they're live

I am looking to run a PowerShell script that will pull all servers from AD, show me the FQDN and IP address as well as tell me if the server is pinging.
I have a couple of scripts that do certain parts but would like to get something all in one and am having a hard time doing it.
Ping a list of servers:
$ServerName = Get-Content "c:\temp\servers.txt"
foreach ($Server in $ServerName) {
    if (Test-Connection -ComputerName $Server -Count 2 -Quiet) {
        "$Server is pinging"
    } else {
        "$Server is not pinging"
    }
}
I also have a script to pull all servers from AD which shows server name, FQDN, and OS version:
Import-Module ActiveDirectory
Get-ADComputer -Filter {OperatingSystem -like '*Windows Server*'} -Properties * |
select Name, DNSHostName, OperatingSystem
Any help in getting a script to show me all servers in my environment with FQDN, IP address, and whether they're live would be appreciated.
You should select only the desired properties with the -Properties parameter instead of pulling everything across the network with *. Also, quotes should be used with -Filter, not braces.
You can add a calculated property to Select-Object and get the value from Test-NetConnection:
Get-ADComputer -Filter "OperatingSystem -like '*Windows Server*'" -Properties DNSHostName, OperatingSystem |
    Select-Object Name, DNSHostName, OperatingSystem,
        @{n="PingSucceeded"; e={ (Test-NetConnection $_.Name).PingSucceeded }}

Using PowerShell to download an embedded video

I need to download a monthly broadcast automatically (I will set up a scheduled task) using PowerShell.
Here is the embedded URL: https://www.jw.org/download/?fileformat=MP4&output=html&pub=jwb&issue=201601&option=TRGCHlZRQVNYVrXF&txtCMSLang=E
The only thing that changes each month is the 201602, 201603, etc. Once I am able to pull the 720p video file, I will work on programmatically building that part of the URL based on the current system clock (I can manage this).
I have tried these without success:
Attempt 1:
$source = "https://www.jw.org/download/?fileformat=MP4&output=html&pub=jwb&issue=201601&option=TRGCHlZRQVNYVrXF&txtCMSLang=E"
$destination = "c:\broadcasts\test.mp4"
Invoke-WebRequest $source -OutFile $destination
Attempt 2:
$source = "https://www.jw.org/download/?fileformat=MP4&output=html&pub=jwb&issue=201601&option=TRGCHlZRQVNYVrXF&txtCMSLang=E"
$dest = "c:\broadcasts\test.mp4"
$wc = New-Object System.Net.WebClient
$wc.DownloadFile($source, $dest)
Attempt 3:
Import-Module BitsTransfer
$url = "https://www.jw.org/download/?fileformat=MP4&output=html&pub=jwb&issue=201601&option=TRGCHlZRQVNYVrXF&txtCMSLang=E"
$output = "c:\broadcasts\test.mp4"
Start-BitsTransfer -Source $url -Destination $output
All three of these attempts end up with a test.mp4 that is basically just an empty file.
Then I found another page that holds the video (and the download links for different qualities) and tried to pull these links using the following (I know I could have used $webpage.links):
Attempt 4:
$webpage = Invoke-WebRequest "http://tv.jw.org/#en/video/VODStudio/pub-jwb_201601_1_VIDEO"
$webpage.RawContent | Out-File "c:\scripts\webpage.txt" -Encoding ASCII -Width 9999
I found that the raw content doesn't have the mp4 visible. My idea was to pull the raw content, parse it with a regex to grab the 720p URL, save it in a variable, and then pass that to a BitsTransfer bit of code.
Please help?
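A minimal sketch of that regex idea, assuming the download links actually appear in the served HTML (the tv.jw.org player page builds its links with JavaScript, so this only works against a page or feed that exposes them in markup; the URL and pattern below are illustrative):
# Sketch: scrape a page for a 720p .mp4 link and hand it to BITS.
$page = Invoke-WebRequest "https://example.com/video-download-page"   # hypothetical URL
$mp4 = $page.Links.href | Where-Object { $_ -match '720P.*\.mp4$' } | Select-Object -First 1
if ($mp4) {
    Start-BitsTransfer -Source $mp4 -Destination "c:\broadcasts\test.mp4"
}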

Uninstall program on several servers

I have a bunch of servers I have to uninstall an app from. I am using:
$app = Get-WmiObject -Class Win32_Product | Where-Object {
    $_.Name -match "application name"
}
$app.Uninstall()
I have tested the above out and it works great. I want to now run this so it uninstalls the app on a bunch of servers. I know I can use the for each option but for some reason I am having an issue getting it to work.
I created a text file called servers and listed my servers in there but it errors out each time.
Does anyone have a good way to add a foreach part to my uninstall portion above so it works?
The -ComputerName parameter of Get-WmiObject accepts a list of computer names.
$servers = Get-Content 'C:\your\computerlist.txt'
Get-WmiObject -Computer $servers -Class Win32_Product |
? { $_.Name -like '*application name*' } |
% { $_.Uninstall() }
Later on the page that you got your original code from, there is an answer from David Setler that does what you need. Slightly modified to fit your scenario:
$computers = Get-Content C:\servers.txt
foreach ($server in $computers) {
    $app = Get-WmiObject -Class Win32_Product -ComputerName $server | Where-Object {
        $_.Name -match "application name"
    }
    $app.Uninstall()
}
Assuming $computers is the list of servers from your servers text file.
I would stay away from Win32_Product because this class doesn't work properly.
See https://support.microsoft.com/en-us/kb/974524.
Basically, what happens when you query this class is that every MSI triggers a repair of the software package (spamming the event log), which in some cases can break installed software (I've seen broken installations of some programs after querying this class; though it's rare, you don't want to risk this on production servers). A registry-based alternative is sketched below.
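A minimal sketch of that alternative, reading the registry uninstall keys instead of querying Win32_Product (the filter string is illustrative, the UninstallString still has to be executed separately, and PowerShell remoting must be enabled on the servers):
# Sketch: list matching software from the registry uninstall keys on each server.
# Avoids the Win32_Product repair side effect; remoting adds PSComputerName to the returned objects.
$servers = Get-Content 'C:\servers.txt'
Invoke-Command -ComputerName $servers -ScriptBlock {
    Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*',
                     'HKLM:\SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*' -ErrorAction SilentlyContinue |
        Where-Object { $_.DisplayName -like '*application name*' } |
        Select-Object DisplayName, UninstallString
}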

Add publish directory to ccnet header.xml

I have a CruiseControl.net build that compiles all the binaries, creates an installation and publishes the install files and log files to a server location.
The actual final directory name is dynamic to include the YYYYMMDD_HH_MM_SS in the path name.
Example: <server>\Path\2-Tuesday\MyBuild_2014_08_06_07_23_15
I include the publisher event to send emails to our development and QA teams. In this email I would like to include the publish path for the build to make it easier for users to find the build.
I believe I want to modify the header.xsl file in /server/xsl/.
However, I am not certain how to include the path.
My publishing script is a PowerShell script. Below is a code snippet:
$dOfWeek = (Get-Date).DayOfWeek.ToString()
$date = Get-Date
$n = [int]$date.DayOfWeek
$dest = Join-Path -Path $publishDir.Value -ChildPath "$n-$dOfWeek"
$day = Get-Date -Format yyyyMMdd
$time = Get-Date -Format HH_mm_ss
$pubFolder = "Bld" + $day + "_" + $time
$publishPath = Join-Path -Path $dest -ChildPath $pubFolder
Note that $publishDir is a parameter passed to the function that formats this.
How do I set this up so that I notify CCNet of this path, and how do I incorporate the value in header.xsl?
Thank you.
Sincerely,
Daniel Lee
Use a file merge task to "notify" CCNet of your custom information. The information will show up in the CCNet XML build log. See:
File Merge
Then edit header.xsl or compile.xsl to transform the new XML into HTML so it shows up in the build emails.
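A minimal sketch of that flow from the PowerShell side (the file name and element are illustrative; a <merge> publisher in ccnet.config must point at the same file):
# Sketch: emit the publish path as a small XML fragment for CCNet's file merge task.
# Reference the file from ccnet.config, e.g.:
#   <merge><files><file>C:\Builds\Artifacts\publishpath.xml</file></files></merge>
$fragment = "<publishPath>$publishPath</publishPath>"
Set-Content -Path 'C:\Builds\Artifacts\publishpath.xml' -Value $fragment
# Then add a template to header.xsl that selects //publishPath and renders it into the email HTML.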