About Sitecore Backup

I am trying to backup a whole Sitecore website.
I know that the package designer can do part of the job, but not entirely.
Having a backup is always a good safeguard in case the site is accidentally broken.
Is there a way or a tool to backup the whole Sitecore website?
I am new to Sitecore, so any advice is welcome.
Thank you!

We've got a SQL job running to back up the databases nightly.
Apart from that, when I deploy code and it's a small change, I usually end up backing up only the parts I'm going to replace. If it's a big code deploy, I just back up the whole website (code-wise, anyway) before deploying the code package.
Apart from that we also run scheduled backups of the code (although I don't know the intervals), and of course we've got source control if everything else fails.
If you've got an automated deployment tool you could also automate the above of course.
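For what it's worth, both of those backups are easy to script. Below is a minimal PowerShell sketch; the database names, server name and paths are placeholders, and it assumes the SqlServer module (for Invoke-Sqlcmd) is installed:

    # Nightly full backup of the Sitecore databases (run from a SQL Agent job or Task Scheduler).
    $databases = 'Sitecore_Core', 'Sitecore_Master', 'Sitecore_Web'   # placeholder database names
    $backupDir = 'D:\Backups\Sitecore'                                # placeholder path

    foreach ($db in $databases) {
        $file = Join-Path $backupDir ("{0}_{1:yyyyMMdd}.bak" -f $db, (Get-Date))
        Invoke-Sqlcmd -ServerInstance 'SQLSERVER01' `
            -Query "BACKUP DATABASE [$db] TO DISK = N'$file' WITH INIT"
    }

    # Quick pre-deploy backup of the website folder (code only).
    $webroot = 'C:\inetpub\wwwroot\MySite\Website'                    # placeholder path
    Compress-Archive -Path "$webroot\*" `
        -DestinationPath ("D:\Backups\Website_{0:yyyyMMddHHmm}.zip" -f (Get-Date))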

Before a major deploy of content or code, I typically back up the master database and zip everything in the website directory minus the App_Data and temp directories. That way, if the deploy goes wrong, I can restore the code and database fairly quickly and be back to the previous state.
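A minimal PowerShell sketch of that zip step, skipping the two folders (the paths are placeholders):

    $webroot = 'C:\inetpub\wwwroot\MySite\Website'    # placeholder path to the website root
    $exclude = 'App_Data', 'temp'

    # Zip everything in the webroot except the App_Data and temp directories.
    $items = Get-ChildItem -Path "$webroot\*" -Exclude $exclude
    Compress-Archive -Path $items.FullName -DestinationPath 'D:\Backups\PreDeploy_Website.zip'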

I have no knowledge of a tool that can do this for you, but there are a few ways you can handle this easily:
1) You can create a database backup of the master database, but this only contains content, not files stored on disk such as media files or your complete built solution. It is always a good idea to schedule your database backup every night and keep the backups for at least a week or more.
2) When you use the package designer, you can create dynamic packages that can contain all your content, media files and solution files on disk. This is an easy way to deploy the site onto a new Sitecore installation all at once, but it requires a manual backup every time.
3) Another way is to serialize your entire content tree to an XML format on disk from the Developer tab. Once serialized, you can revert the items back into the content tree.

I'd suggest thinking of this in two parts. The first part is backing up the application, which is as simple as making sure your application is in some SCM system.
For that you can use Team Development for Sitecore. One of its features allows you to connect a Visual Studio project to your Sitecore instance.
You can select the Sitecore items that you want to be stored in your solution and it will serialize them and place them into your solution.
You can then check them into your SCM system and sleep easier.
The thing to note is deciding which items to place in source control; generally you can think of Sitecore items as developer-owned or content-editor-owned. The items you place in your solution are the developer-owned ones; templates, sublayouts, layouts, and content items that you need for the site to function are good examples.
This way if something goes bad a base restoration is quick and easy.
The second part is the backup of the content in Sitecore that has been added since your deployment. For that, as Trayek said above, use a SQL job to do the backups at whatever interval you are comfortable with.
If you're bored I have a post on using TDS (Team Development for Sitecore) you can check out at Working with Sitecore, Part Nine: TDS

Expanding a bit more on what Trayek said, my suggestion would be to set up Continuous Integration (CI) with automated deploys using TeamCity.
A good answer is also given here on Stack Overflow.
Basically, in your case TeamCity would automatically:
1. Take a backup of the current website (i.e. the code) and deploy the new code on top of it.
2. Scripts can also be written to take a differential backup of the SQL databases, if need be (a rough sketch of such a build step is below).
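As an illustration, such a TeamCity build step could run PowerShell along these lines. All paths and names are placeholders, the exact packaging depends on how your build artifacts are produced, and the database step assumes the SqlServer module's Invoke-Sqlcmd is available:

    # 1. Back up the current website (code) before deploying the new build.
    $webroot   = '\\WEBSERVER01\c$\inetpub\wwwroot\MySite'             # placeholder path
    $backupZip = 'D:\Backups\MySite_{0:yyyyMMddHHmm}.zip' -f (Get-Date)
    Compress-Archive -Path "$webroot\*" -DestinationPath $backupZip

    # ...then unpack the new build package produced by TeamCity over the top.
    Expand-Archive -Path 'C:\TeamCity\buildAgent\work\artifacts\site.zip' -DestinationPath $webroot -Force

    # 2. Optionally take a differential backup of the master database as well.
    Invoke-Sqlcmd -ServerInstance 'SQLSERVER01' `
        -Query "BACKUP DATABASE [Sitecore_Master] TO DISK = N'D:\Backups\Sitecore_Master_diff.bak' WITH DIFFERENTIAL"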
Hope this helps.

Take a look at the Sitecore Instance Manager module. It works really well for packaging an entire Sitecore instance.

Related

TDS Auto Sync Content with TFS

We currently have a slight issue whereby the only way to keep Sitecore content in sync is by taking the staging data, syncing it with TDS manually and then committing it into source control so the content is not lost. As you can imagine, this is a very repetitive and lengthy process.
My question: is there a way to automate this content synchronization from the staging master database to the TDS project, or by some other means? The end result we are hoping for is to have all the content changes made in staging kept in sync with source control automatically (if possible).
Don't try to keep content in sync using TDS, you'll always face an uphill battle.
TDS should really only be used for developer controlled items - templates, rendering, layout, core db items etc, and maybe certain content items which are used as lookup/settings items. General content and media should not be kept in TDS unless it is for the purposes of setting up test content for developers. As a general rule of thumb, templates go up (local > dev > qa > prod) and content comes down (prod > qa > dev > local).
If you are trying to keep different server environments in sync then take a look at RAZL. It's possible to script the sync process to automate it as part of your continuous deployment process.
If it's just for the purpose of getting content onto local developer machines, then just create a one-off package and install it locally. As far as I know, it's not possible to automate the sync into TDS, added to which TFS is probably not the best choice for Sitecore for this kind of thing anyway. If you really want to go down this route then Git and Unicorn are a much better choice.
I echo jammykam on not source-controlling content edits. Look at automation of SQL backups instead so you can have regular backups of data over time, and use workflow so that you have content versions.
To your question though, I do not know of a way to automate the TDS sync process. If you truly do want to ship all changes into source control, you will want to have a Sitecore event handler or a regular scheduled agent that is serializing the content and then checking it into TFS.
Typically, TDS is meant for local developers to be able to make changes in their local database that need to be part of the solution and share/deploy those changes via source control and automated deployments.
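If you did go down that road, the check-in half could be a scheduled task along these lines. This is only a sketch: it assumes items are already being serialized to disk by a Sitecore handler or agent, that the TFVC command-line client (tf.exe) is on the PATH, and that the serialization folder is mapped to a TFS workspace; the paths are placeholders.

    # Sweep the Sitecore serialization folder into TFVC on a schedule.
    $serializationDir = 'C:\inetpub\wwwroot\MySite\Data\serialization'   # placeholder path
    $comment = 'Automated content serialization check-in'

    # Pend adds for newly serialized items, then check everything in.
    tf add $serializationDir /recursive /noprompt
    tf checkin $serializationDir /recursive /noprompt "/comment:$comment"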

SharePoint 2013 Dev/Test/Production environment - Best practice for moving content

I am working on a SP2013 project for a customer, and I need to set up a working environment for development, testing and production. Let's assume for the sake of simplicity that the work consists only of various customizations (lists, libraries, apps, themes etc.) and no code.
My setup is as follows:
The production environment is on some servers on the customer site
The test environment is set up in Azure
The development environment is on a virtual machine on my PC
Now, let's assume everything is set up correctly on each environment, and I want to be able to support the following tasks:
I do customizations on my dev environment and want to deploy them to test for others to test, preferably with existing data
After testing and QA, I want to deploy from test to production. This must of course only affect customizations, not existing data
Every now and then I would like to take a snapshot of the production environment and move it to test, so that the deployment of a new feature from development can be done as realistic as possible
I want to perform these tasks as smoothly and efficiently as possible, especially when deploying from dev to test which is done often. Deploying from test to production will not be done that often, and hence some more manual work will be tolerated.
I know of a few mechanisms that might be relevant:
Content deployment
Cross site publishing
Content database backup/restore
Save site as template, export wsp and import
(Last resort) Manually set up each customization by hand
Could some of you experienced SharePoint devs/admins make some recommendations as to which mechanism to use in which situation, when not to use it, etc.? Are there other methods that should be mentioned? Remember that the three environments reside in separate physical locations, which will probably make a fully automated solution difficult. Would it be easier if I set up the test environment on the customer site (i.e. as part of the same farm)?
Another option depending on your specific customisations might be a third party tool. There are a number of them out there. ShareGate is one I have personally been using for migration work and seems very simple and effective for moving content around quickly between environments. Attunity Repliweb for SharePoint is another that might be worth looking at for the sort of development specific release work that you require.
As for native options, I am still finding my way as well, but here are my suggestions:
Where possible I have used Visual Studio to create solution packages containing features to deploy pieces of functionality. A branding solution package for example might include several features that deploy your custom master pages, theme / look files, common JavaScript libraries and images.
Feature deployment makes it easy for you to deploy or remove functionality between environments and to reuse functionality between sites. Additionally you can add your Visual Studio solutions to a source control system such as VS Online or GitHub.
For one-off sites I have created a dev site, configured it, then used the built-in SharePoint backup and restore to deploy it to prod. Subsequent changes have been created in dev and then manually applied to test and prod. Depending on the customisations this has been quite time consuming. You might combine this with a tool such as ShareGate to automate the deployment of individual artifacts, such as a customized list, from one environment to another.
For moving content around I have been using a combination of ShareGate for things such as documents alongside Boost Solutions Excel Import for handling list data. This allows me to export large amounts of list data to excel and easily reimport it into a new list which might be a copy that I have added new functionality to in preparation for replacing the old prod list or perhaps dev / test lists that I am populating before doing a full site backup to restore to production.
Good luck, and hopefully some of these suggestions are useful to you! I will be following this question as I am also interested to hear of better methods / habits for managing the SharePoint development cycle.
I finished setting up a development environment for a SharePoint 2013 production environment that I maintain. The last step was to move my production content to my development environment. I had to dig around a bit to find the PowerShell etc. Rather than go through that again next time, I decided to write a blog about it, so that I’d have all the steps in one place.
The first step is to back up the content database that you want to restore to development. To do this open up SQL Server Management Studio, right-click on the database you want to back up, hover over tasks, and select backup. You will be presented with the Back Up Database window. Make sure that your backup type is set to full, give the backup a name or stick with the default, and note or change the destination.
You can skip these steps if you have scheduled backups running and are able to access the backup drive. In that case just go grab a copy of the most recent full backup and copy it to your development SQL Server.
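If you prefer to script the backup instead of clicking through SSMS, it is a single T-SQL statement; here it is run from PowerShell with the SqlServer module's Invoke-Sqlcmd (the server, database and path names are placeholders):

    # Full backup of the production content database.
    Invoke-Sqlcmd -ServerInstance 'PRODSQL01' -Query @"
    BACKUP DATABASE [WSS_Content_Prod]
    TO DISK = N'E:\Backups\WSS_Content_Prod.bak'
    WITH INIT, NAME = N'WSS_Content_Prod full backup'
    "@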
The next step is to restore the database to development. To do this open up SQL Server Management Studio in your development environment, right-click on the Databases folder, and select Restore Database. When presented with the Restore Database window, click on the Device Radio Button and click the ellipsis next to the text box. This will bring up the Select backup devices window. From there click Add, locate your backup file and click OK, click OK again to be returned to the Restore Database Window, and finally from there click OK. Now your database has been restored, and you are ready to add it to SharePoint.
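The restore can be scripted the same way (again, names and paths are placeholders; add WITH MOVE clauses if the data and log file locations differ on the development server):

    # Restore the copied backup onto the development SQL Server.
    Invoke-Sqlcmd -ServerInstance 'DEVSQL01' -Query @"
    RESTORE DATABASE [WSS_Content_Prod]
    FROM DISK = N'E:\Backups\WSS_Content_Prod.bak'
    WITH RECOVERY
    "@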
If you don’t already have a web application with content in it that you don’t care about, create a new Web Application…
https://sharepointv15.wordpress.com/2012/07/24/create-a-web-application-in-sharepoint-2013/
Don’t worry about creating a site collection.
Now go to Central Admin and click on Manage content databases under Application Management.
Make sure that the correct Web Application is selected. If it is not click on the drop down arrow next to the Web Application name, click change web application and select the correct Web Application in the window that you are presented with.
Next click on the Content Database name
On the Manage Content Database Settings screen scroll down, click on the Remove Content Database check box, click OK on the warning pop up and click OK at the bottom of the screen.
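The same removal can also be done from the SharePoint 2013 Management Shell with Dismount-SPContentDatabase, which detaches the content database from its web application without deleting it from SQL Server (the database name below is a placeholder for the web application's existing content database):

    # Detach the content database from the web application; the data stays in SQL Server.
    Dismount-SPContentDatabase "WSS_Content_Default" -Confirm:$false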
Now you’ll need to open up the SharePoint 2013 Management Shell as an administrator. To do this click on your start menu, click all programs, click on the Sharepoint 2013 folder, right-click the SharePoint 2013 Management Shell and select Run as Administrator.
From here you will run the Mount-SPContentDatabase cmdlet:
Mount-SPContentDatabase "MyDatabase" -DatabaseServer "MyServer" -WebApplication http://sitename
Click below for details on this cmdlet…
http://technet.microsoft.com/en-us/library/ff607581.aspx
At this point you should be able to navigate to the web application URL and see the Site Collection that lives in the database you just mounted.
Note: This will work in SharePoint 2010 or SharePoint 2013. However, the database must be from the same version of SharePoint as the farm you are mounting it to. If it is from a lower version, SharePoint will automatically try to upgrade it, so keep that in mind.
Follow the link below:
https://sharepointv15.wordpress.com/2013/03/21/moving-content-between-environments/

Sitecore development and demo servers

I'm attempting to get an understanding of what is a best practice / recommended setup for moving information between multiple Sitecore installations. I have a copy of Sitecore set up on my machine for development. We need a copy of the system set up for demonstration to the client and for people to enter content pre-launch. How should I set things up so that people can enter content / modify the demonstration version of the site while still allowing me to continue development on my local machine and publish my updates without overwriting changes between the systems? Or is this not the correct approach for me to be taking?
I believe that the 'publishing target' feature is what I need to use, but as this is my first project working with Sitecore, I am looking for practical experience on how to manage this workflow.
Nathan,
You didn't specify what version of Sitecore, but I will assume 6.01+
Leveraging publishing targets will allow you to 'publish' your development Sitecore tree (or sub-trees) from your development environment to the destination, such as your QA server. However, there is potential that you publish /sitecore/content/home/* and then you wipe out your production content!
Mark mentioned using "Sitecore Packages" to move your content (as well as templates, layout items, etc...) over, which is the traditional way of moving items between environments. Also, you didn't specify what version of Sitecore you are using, but the Staging Module is not needed for Sitecore 6.3+. The Staging Module was generally used to keep file systems in sync and to clear the cache of Content Delivery servers.
However, the one piece of the puzzle that is missing here is that you will still need to update your code (.jpg, .css, .js, .dll, etc.) on the QA box.
The optimal solution would be to have your Sitecore items (templates, layout item, rendering items, and developer owned content items) in Source control right alongside your ASP.NET Web Application and any class library projects you may have. At a basic level, you can do this using built in "Serialization" features of Sitecore. Lars Nielsen wrote an article touching on this.
To take this to the next level, you would use a tool such as Team Development for Sitecore. This tool will allow you to easily bring your Sitecore items into Visual Studio and treat them as code. At this point you could setup automated builds, or continuous integration, so that your code and Sitecore items, are automatically pushed to your QA environment. There are also configuration options to handle the scenario of keeping production content in place while still deploying developer owned items.
I recommend you look at the Staging Module if you need to publish to multiple targets from the same instance, i.e. publish content from one tree over a firewall to a development site, to a QA site, etc.
If you're just migrating content from one instance to another piecemeal, you can use Sitecore packages which are standard tools to move content. The packages serialize the content to XML and zip it up and allow you to install them in other instances.

Is there an ideal way to move from Staging to Production for Coldfusion code?

I am trying to work out a good way to run a staging server and a production server for hosting multiple Coldfusion sites. Each site is essentially a fork of a repo, with site specific changes made to each. I am looking for a good way to have this staging server move code (upon QA approval) to the production server.
One fanciful idea involved compiling each of the sites into EAR files to be run on the production server, but I cannot seem to wrap my head around ColdFusion archives, plus I cannot see any good way of automating this, especially the deployment part.
What I have done successfully before is use Subversion as a go-between for a site: once a site is QA'd, the code is committed, then the production server's working directory has an SVN update run, which triggers a code copy from the working directory to the actual live code. This worked fine, but it has many moving parts and still required some form of server access to each server to run the commits and updates. Plus, this worked for an individual site; I think it may be a nightmare to set up and maintain this architecture for multiple sites.
Ideally I would want a group of developers to have FTP access with the ability to log into some control panel to mark a site for QA, and then have a QA person check the site and mark it as stable/production worthy, and then have someone see that a site is pending and click a button to deploy the updated site. (Any of those roles could be filled by the same person mind you)
Sorry if that last part wasn't so much the question, just a framework to understand my current thought process.
I agree with @Nathan Strutz that Ant is a good tool for this purpose. Some more thoughts:
You want a repeatable build process that minimizes opportunities for deltas. With that in mind:
SVN export a build.
Tag the build in SVN.
Turn that export into a .zip, something with an installer, etc... idea being one unit to validate with a set of repeatable deployment steps.
Send the build to QA.
If QA approves, deploy that build into production (a rough sketch of steps 1-3 follows).
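A rough PowerShell sketch of those first three steps (the answer recommends Ant for this; the repository URL, tag name and paths below are placeholders):

    $repo    = 'https://svn.example.com/repo/mysite'   # placeholder repository URL
    $version = '1.4.0'                                  # placeholder build number / tag
    $workDir = "C:\builds\mysite-$version"

    # 1. Export a clean copy of trunk (no .svn metadata).
    svn export "$repo/trunk" $workDir

    # 2. Tag the build so it can be reproduced later.
    svn copy "$repo/trunk" "$repo/tags/$version" -m "Tagging build $version"

    # 3. Turn the export into a single deployable artifact.
    Compress-Archive -Path "$workDir\*" -DestinationPath "C:\builds\mysite-$version.zip"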
Move whole code bases over as a build, rather than just changed files. This way you know what's put into place in production is the same thing that was validated. Refactor code so that configuration data is not overwritten by a new build.
As for actual production deployment, I have not come across a tool to solve the multiple servers, different code bases challenge. So I think you're best served rolling your own.
As an aside, in your situation I would think through an approach that allows for a standardized codebase, with a mechanism (i.e. an API) that allows for the customization you're describing. Otherwise managing each site as a "custom" project is very painful.
Update
Learning Ant: Ant in Action [book].
On source control: for the situation you describe, I would maintain a core code base plus an overlay per site. Export the core first, then the site-specific files over it. This ensures any core updates make it in, except where site-specific changes deliberately override them.
Call this combination a "build". Do builds with Ant. Maintain an Ant script - or, perhaps more flexibly, an Ant configuration file - per core & site combination. Track the version numbers of core and site as part of a given build.
If your software is stuffed inside an installer (the Nullsoft installer NSIS, for instance) that should be part of the build. Otherwise you should generate a .zip file (.ear is a possibility as well, but I haven't seen anyone actually do this with CF). The point being one file that encompasses the whole build.
This build file is what QA should validate. So validation includes deployment, configuration and functionality testing. See my answer for deployment on how this can flow.
Deployment:
If you want to automate deployment, QA should be involved as well to validate it. Meaning QA would deploy / install builds using the same process on their servers before doing a staging-to-production deployment.
To do this I would create something that tracks what server receives what build file and whatever credentials and connection information is necessary to make that happen. Most likely via FTP. Once transferred, the tool would then extract the build file / run the installer. This last piece is an area I would have to research as to how it's possible to let one server run commands such as extraction or installation remotely.
You should look into Ant as a migration tool. It allows you to package your build process with a simple XML file that you can run from the command line or from within Eclipse. Creating an automated build process is great because it documents the process as well as executes it the same way, every time.
Ant can handle zipping and unzipping, copying around, making backups if needed, working with your Subversion repository, transferring via FTP, compressing JavaScript and even calling a web address if you need to do something like flush the application memory or server cache once it's installed. You may be surprised by the things you can do with Ant.
To get started, I would recommend the Ant manual as your main resource, but look into existing Ant builds as a good starting point to get you going. I have one on RIAForge, for example, that does some interesting stuff and calls a Groovy script to do some more processing on my files during the build. If you search RIAForge for build.xml files, you will come up with a great variety of them, many of which are directly for ColdFusion projects.

How do I run one version of a web app while developing the next version?

I just finished a Django app that I want to get some outside user feedback on. I'd like to launch one version and then fork a private version so I can incorporate feedback and add more features. I'm planning to do lots of small iterations of this process. I'm new to web development; how do websites typically do this? Is it simply a matter of copying my Django project folder to another directory, launching the server there, and continuing my dev work in the original directory? Or would I want to use a version control system instead? My intuition is that it's the latter, but if so, it seems like a huge topic with many uses (e.g. collaboration, which doesn't apply here) and I don't really know where to start.
1) Separate URLs: www.yoursite.com vs. test.yoursite.com. You can also do www.yoursite.com and www.yoursite.com/development, etc. You could also create a /beta or /staging.
2) Keep separate databases, one for production and one for development. Write a script that will copy your live database into a dev database (a sketch of such a script follows this list). Keep one database for each type of site you create (you may want to create a beta or staging database for your tester). Do your own work in the dev database. If you change the database structure, save the changes as a .sql file that can be loaded and run on the live site database when you turn those changes live.
3) Merge features into your different sites with version control. I am currently playing with a Subversion setup for web apps that has a stable branch (trunk), one for staging, and one for development. Development tags and branches get merged into staging, and then staging tags/branches get merged into stable. Version control will let you manage your source code in any way you want. You will have to find a methodology that works for you and use it.
4) Consider build automation. It will publish your site for you automatically. Take a look at http://ant.apache.org/. It can automate a lot of the work of checking out your code and uploading it to each specific site as needed.
5) Toy of the month: there is a utility called curl that you may find valuable. It does a lot from the command line. It might be enough on its own if you don't want to use some or all of Ant.
Good luck!
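As a sketch of the database-copy script from point 2, assuming a PostgreSQL database with the standard client tools on the PATH (the host, database and user names are placeholders), it can be as simple as a dump and restore:

    # Refresh the dev database from a fresh dump of production.
    $dumpFile = 'C:\backups\prod_site.dump'   # placeholder path

    pg_dump    --host prod-db.example.com --username deploy --format custom --file $dumpFile prod_site
    dropdb     --host localhost --username deploy --if-exists dev_site
    createdb   --host localhost --username deploy dev_site
    pg_restore --host localhost --username deploy --dbname dev_site $dumpFile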
You would typically use version control, and have two domains: your-site.com and test.your-site.com. Then your-site.com would always update to trunk which is the current latest, shipping version. You would do your development in a branch of trunk and test.your-site.com would update to that. Then you periodically merge changes from your development branch to trunk.
Jas Panesar has the best answer if you are asking this from a development standpoint, certainly. That is, if you're just asking how to easily keep your new developments separate from the site that is already running. However, if your question was actually asking how to run both versions simultaneously, then here's my two cents.
Your setup has a lot to do with this, but I always recommend running process-based web servers in the first place. That is, not to use threaded servers (less relevant to this question) and not embedding in the web server (that is, not using mod_python, which is the relevant part here). So, you have one or more processes getting HTTP requests from your web server (Apache, Nginx, Lighttpd, etc.). Now, when you want to try something out live, without affecting your normal running site, you can bring up a process serving requests that never gets the regular requests proxied to it like the others do. That is, normal users don't see it.
You can set up a subdomain that points to this one, and you can install middleware that redirects "special" users to the beta version. This allows you to roll out new features to some users, but not others.
Now, the biggest issues come with database changes. Schema migration is a big deal and something most of us never pay attention to. I think that running side-by-side is great, because it forces you to do schema migrations correctly. That is, you can't just shut everything down and run lengthy schema changes before bringing it back up. You'd never see any remotely important site doing that.
The key is those small steps. You need to always have two versions of your code able to access the same database, so changes you make for the new code need to not break the old code. This breaks down into a few steps you can always make:
You can add a column with a default value, or that is optional. The new code can use it, and the old code can ignore it.
You can update the live version with code that knows to use a new column, at which point you can make it required.
You can make the new version ignore a column, and when it becomes the main version, you can delete that column.
You can make these small steps to migrate between any schemas. You can iteratively add a new column that replaces an old one, roll out the new code, and remove the old column, all without interrupting service.
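As a concrete, made-up example of those steps run against a PostgreSQL database with psql (the table and column names are hypothetical):

    # Step 1: add the new column as nullable so the old code can keep ignoring it.
    psql --dbname prod_site --command "ALTER TABLE orders ADD COLUMN shipping_address_id integer NULL;"

    # Step 2: once the new code is live everywhere (and existing rows are backfilled), enforce it.
    psql --dbname prod_site --command "ALTER TABLE orders ALTER COLUMN shipping_address_id SET NOT NULL;"

    # Step 3: once nothing reads the old column any more, drop it.
    psql --dbname prod_site --command "ALTER TABLE orders DROP COLUMN shipping_address;"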
That said, it's your first web app? You can probably afford to break it. You probably have few users :-) But it is fantastic you're even asking this question. Many "professionals" fail to ever ask it, and fewer still answer it.
What I do is export a copy of my SVN repository and put the files on the live production server, then keep a virtual machine with a development working copy, and commit the changes to the repo when I'm done.