Running SonarQube 5.6.1 with the MSBuild scanner for C#/JavaScript across multiple projects, with a SQL Server database. The global Measures tab displays all the appropriate projects as expected; however, drilling down into an individual project, its Measures tab is blank.
All plugins are up to date and nothing custom or esoteric is installed (C#, SonarJS, Git, Timeline). The server has been restarted.
The scanner log has no errors and reports SUCCESS. All of the other tabs (Home, Issues, Code, Dashboards) are updated with the latest scan results as expected.
Is there some plugin required to enable the Measures tab, or some additional configuration needed to select what is displayed on this page? Installing SonarQube locally and running with the internal database and out-of-the-box settings, the Measures tab is populated as expected with Maintainability, Duplications, Size, Complexity, Documentation, and Issues.
Any troubleshooting ideas appreciated.
The Measures tab is built in; it doesn't require any plugin.
Reports of a blank Measures page have often involved the following situation:
a reverse proxy is used in front of the SonarQube server
the reverse proxy has a setting controlling the maximum URL length
when loading the Measures page, that setting caused an HTTP request to be rejected, which left the page empty
increasing the maximum allowed URL length resolved the issue (a sketch of that change follows below)
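As a concrete illustration: if the reverse proxy happens to be IIS (an assumption on my part; the reports don't name a specific proxy, and nginx, Apache, and others have equivalent settings), the URL length limits live under request filtering and could be raised with a PowerShell sketch like this. The site name "SonarQube" is hypothetical.

# Sketch: raise IIS request-filtering URL limits on the site fronting SonarQube.
# Run as administrator; uses the WebAdministration module that ships with IIS.
Import-Module WebAdministration
Set-WebConfigurationProperty -PSPath 'IIS:\Sites\SonarQube' `
    -Filter 'system.webServer/security/requestFiltering/requestLimits' `
    -Name 'maxUrl' -Value 8192
Set-WebConfigurationProperty -PSPath 'IIS:\Sites\SonarQube' `
    -Filter 'system.webServer/security/requestFiltering/requestLimits' `
    -Name 'maxQueryString' -Value 8192

The IIS defaults are maxUrl 4096 and maxQueryString 2048, which a long Measures request could plausibly exceed.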
I am using Sitecore 8.1, and whenever I build the solution, Sitecore starts rebuilding all the indexes when I load the website in the browser. I am using IIS (localhost) on the development machine. Even if I change some small thing in web.config and reload the website (in the browser), Sitecore starts rebuilding the indexes again.
Is this normal behavior with Sitecore 8.1? I have recently started working with it; I never had this kind of problem with Sitecore 7.2.
When you update web.config, Sitecore initiates a shutdown and restarts. Once Sitecore has restarted, it starts updating the indexes (you can see this in the crawling log).
The IntervalAsynchronous index update strategy is triggered and is responsible for this behaviour.
The strategy forces a full index rebuild when the number of entries in the history table exceeds the number you specify in the Indexing.FullRebuildItemCountThreshold setting. This normally means that a substantial publishing or deployment has taken place, and this should always trigger a full index rebuild.
Refer to this link for more information: https://doc.sitecore.net/sitecore_experience_platform/setting_up__maintaining/search_and_indexing/indexing/index_update_strategies
I am working on a SP2013 project for a customer, and I need to set up a working environment for development, testing and production. Let's assume for the sake of simplicity that the work consists only of various customizations (lists, libraries, apps, themes etc.) and no code.
My setup is as follows:
The production environment is on some servers on the customer site
The test environment is set up in Azure
The development environment is on a virtual machine on my PC
Now, let's assume everything is set up correctly on each environment, and I want to be able to support the following tasks:
I do customizations in my dev environment and want to deploy them to test for others to test, preferably with existing data
After testing and QA, I want to deploy from test to production. This must of course only affect customizations, not existing data
Every now and then I would like to take a snapshot of the production environment and move it to test, so that the deployment of a new feature from development can be done as realistically as possible
I want to perform these tasks as smoothly and efficiently as possible, especially when deploying from dev to test which is done often. Deploying from test to production will not be done that often, and hence some more manual work will be tolerated.
I know of a few mechanisms that might be relevant:
Content deployment
Cross site publishing
Content database backup/restore
Save site as template, export wsp and import
(Last resort) Manually set up each customization by hand
Could some of you experienced SharePoint devs/admins make some recommendations as to which mechanism to use in which situation, when not to use it, etc.? Are there other methods that should be mentioned? Remember that the three environments reside in separate physical environments, which will probably make a fully automated solution difficult. Would it be easier if I set up the test environment on the customer site (i.e. as part of the same farm)?
Another option depending on your specific customisations might be a third party tool. There are a number of them out there. ShareGate is one I have personally been using for migration work and seems very simple and effective for moving content around quickly between environments. Attunity Repliweb for SharePoint is another that might be worth looking at for the sort of development specific release work that you require.
As for native options, I am still finding my way as well, but here are my suggestions:
Where possible I have used Visual Studio to create solution packages containing features to deploy pieces of functionality. A branding solution package for example might include several features that deploy your custom master pages, theme / look files, common JavaScript libraries and images.
Feature deployment makes it easy for you to deploy or remove functionality between environments and to reuse functionality between sites. Additionally you can add your Visual Studio solutions to a source control system such as VS Online or GitHub.
For one-off sites I have created a dev site, configured it, then used the built-in SharePoint backup and restore to deploy it to prod. Subsequent changes have been created in dev and then manually applied to test and prod. Depending on the customisations, this has been quite time-consuming. You might combine this with a tool such as ShareGate to automate the deployment of individual artifacts, such as a customized list, from one environment to another.
For moving content around I have been using a combination of ShareGate for things such as documents, alongside Boost Solutions Excel Import for handling list data. This allows me to export large amounts of list data to Excel and easily reimport it into a new list, which might be a copy that I have added new functionality to in preparation for replacing the old prod list, or perhaps dev/test lists that I am populating before doing a full site backup to restore to production.
Good luck, and hopefully some of these suggestions are useful to you! I will be following this question as I am also interested to hear of better methods/habits for managing the SharePoint development cycle.
I finished setting up a development environment for a SharePoint 2013 production environment that I maintain. The last step was to move my production content to my development environment. I had to dig around a bit to find the PowerShell etc. Rather than go through that again next time, I decided to write a blog about it, so that I’d have all the steps in one place.
The first step is to back up the content database that you want to restore to development. To do this, open SQL Server Management Studio, right-click the database you want to back up, hover over Tasks, and select Back Up. You will be presented with the Back Up Database window. Make sure that the backup type is set to Full, give the backup a name or stick with the default, and note or change the destination.
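If you prefer to script this step, the same full backup can be taken with PowerShell. This is a minimal sketch assuming the SqlServer module is installed; the server and database names are hypothetical.

# Sketch: full backup of a production content database.
# Requires the SqlServer module: Install-Module SqlServer
Import-Module SqlServer
Backup-SqlDatabase -ServerInstance "PRODSQL01" `
    -Database "WSS_Content_Prod" `
    -BackupFile "E:\Backups\WSS_Content_Prod.bak" `
    -BackupAction Database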
You can skip these steps if you have scheduled backups running and are able to access the backup drive. In that case, just grab a copy of the most recent full backup and copy it to your development SQL Server.
The next step is to restore the database to development. To do this, open SQL Server Management Studio in your development environment, right-click the Databases folder, and select Restore Database. When presented with the Restore Database window, click the Device radio button and click the ellipsis next to the text box. This will bring up the Select backup devices window. From there click Add, locate your backup file, and click OK; click OK again to be returned to the Restore Database window, and finally click OK there as well. Now your database has been restored, and you are ready to add it to SharePoint.
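The scripted equivalent, again a sketch with hypothetical names (add -RelocateFile arguments if the data and log file paths differ between the two servers):

# Sketch: restore the backup on the development SQL Server.
Import-Module SqlServer
Restore-SqlDatabase -ServerInstance "DEVSQL01" `
    -Database "WSS_Content_Prod" `
    -BackupFile "E:\Backups\WSS_Content_Prod.bak" `
    -ReplaceDatabase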
If you don’t already have a web application with content in it that you don’t care about, create a new Web Application…
https://sharepointv15.wordpress.com/2012/07/24/create-a-web-application-in-sharepoint-2013/
Don’t worry about creating a site collection.
Now go to Central Admin and click on Manage content databases under Application Management.
Make sure that the correct web application is selected. If it is not, click the drop-down arrow next to the web application name, click Change Web Application, and select the correct web application in the window you are presented with.
Next, click on the content database name.
On the Manage Content Database Settings screen, scroll down, check the Remove Content Database check box, click OK on the warning pop-up, and click OK at the bottom of the screen.
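If you would rather script the removal than click through Central Admin, the equivalent cmdlet is Dismount-SPContentDatabase; the database name below is hypothetical. Dismounting detaches the database from the web application without deleting it from SQL Server.

# Sketch: detach the unwanted content database from the web application.
# Run in the SharePoint 2013 Management Shell as an administrator.
Dismount-SPContentDatabase "WSS_Content_OldDev"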
Now you’ll need to open the SharePoint 2013 Management Shell as an administrator. To do this, click your start menu, click All Programs, click the SharePoint 2013 folder, right-click the SharePoint 2013 Management Shell, and select Run as Administrator.
From here you will run the Mount-SPContentDatabase cmdlet:
Mount-SPContentDatabase "MyDatabase" -DatabaseServer "MyServer" -WebApplication http://sitename
Click below for details on this cmdlet…
http://technet.microsoft.com/en-us/library/ff607581.aspx
At this point you should be able to navigate to the web application URL and see the Site Collection that lives in the database you just mounted.
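As a quick sanity check after mounting, you can list the content databases attached to the web application (same placeholder URL as above):

# Optional check: confirm the database is attached to the web application.
Get-SPContentDatabase -WebApplication http://sitename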
Note: This will work in SharePoint 2010 or SharePoint 2013. However, the database must be from the same SharePoint version as the farm you are mounting it to. If it is from a lower version, SharePoint will automatically try to upgrade it, so keep that in mind.
Follow the link below:
https://sharepointv15.wordpress.com/2013/03/21/moving-content-between-environments/
Please help me understand how to copy a page from an existing APEX application to an application in another APEX workspace.
You can't do this out of the box.
Besides workspace IDs, the application ID also matters. If you have two different workspaces with the same application in each, but with different IDs, this further complicates things.
What you could always do is export the complete application, import it using a different ID so you don't overwrite the existing one, and then create a new page as a copy of the newly imported application's page.
Another way would be to edit the exported page SQL file, but, let me stress this, this is not recommended. As has been so graciously stated on the OTN forums now and again, if you required support with an application/APEX issue and they found you had been messing around in the SQL files, you would not get support there. Only do this when you UNDERSTAND and KNOW what you're about to do! If you alter the code without understanding what you are doing, you could end up in a far worse situation than the one you started in. In any other case, follow the application export/import/copy route.
Anyway, I was in a position where the workspace IDs differed but the application IDs did not. In this case altering the exported file is quite trivial: it requires editing only one line, and concerns this piece of code:
begin
-- Assumes you are running the script connected to SQL*Plus as the Oracle user APEX_040200 or as the owner (parsing schema) of the application.
wwv_flow_api.set_security_group_id(p_security_group_id=>nvl(wwv_flow_application_install.get_workspace_id,27000294100083787867));
end;
/
This is one of the first pieces of code in the exported page file. As you can see, the workspace ID is set here. If the file is imported into another workspace (even if the application ID matches the one you're trying to import into), you'd get an error. Change the ID to the one matching the target workspace, however, and it will work. Of course, you need to know the workspace IDs, and you can find them by running this select on your APEX environment(s):
select workspace, workspace_display_name, workspace_id from apex_workspaces
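If you deploy often, that one-line edit is easy to script. Here is a minimal PowerShell sketch; the export file name and target workspace ID are hypothetical, and the real ID should come from the query above run against the target instance.

# Sketch: swap the hard-coded workspace ID in an exported APEX page file.
$exportFile = "f100_page_10.sql"
$targetId   = "29000312345678901234"
(Get-Content $exportFile) `
    -replace 'get_workspace_id,\d+\)', "get_workspace_id,$targetId)" |
    Set-Content $exportFile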
Some good advice:
If you're still in the start-up phase of your APEX installation, you might want to make sure that your workspace IDs are identical across environments. For example, with a test and a production environment, having identical workspace and application IDs is very handy. You'd have two instances (two database installations on two different servers) but want the IDs to be the same.
To make sure of this, you can EXPORT the workspace from one environment and then IMPORT it into the other one. You can do that from the instance administration in APEX, i.e. the internal workspace.
This is now supported in APEX version 4.2; per the Oracle documentation...
7.3.4 Copying a Database Application Page
You can copy a page from the current application or from another application. During the copy process, you can also copy shared components or change mappings to shared components in the target application.
To copy a page:
1. Navigate to the application you want to copy to:
   - Navigate to the Workspace home page.
   - Click the Application Builder icon.
   - Select an application.
   - Select a page.
   The Page Definition appears.
2. In Tree view: under Page Rendering, select the page name, then right-click and select Copy.
   In Component view: under Page, click the Copy icon.
3. For Copy Page Option, select one of the following:
   - Page in this application
   - Page in another application
4. Follow the on-screen instructions.
In APEX 4.0, to copy a page from any application:
Edit any page in your Application
Hit the Create▼ button
Choose New page as a copy
I'm attempting to get an understanding of the best practice / recommended setup for moving information between multiple Sitecore installations. I have a copy of Sitecore set up on my machine for development. We need a copy of the system set up for demonstration to the client and for people to enter content before launch. How should I set things up so that people can enter content and modify the demonstration version of the site while still allowing me to continue development on my local machine and publish my updates without the two systems overwriting each other's changes? Or is this not the correct approach for me to be taking?
I believe that the 'publishing target' feature is what I need to use, but as this is my first project working with Sitecore, I am looking for practical experience on how to manage this workflow.
Nathan,
You didn't specify what version of Sitecore, but I will assume 6.01+
Leveraging publishing targets will allow you to 'publish' your development Sitecore tree (or sub-trees) from your development environment to the destination, such as your QA server. However, there is the potential that you publish /sitecore/content/home/* and wipe out your production content!
Mark mentioned using "Sitecore Packages" to move your content (as well as templates, layout items, etc...) over, which is the traditional way of moving items between environments. Also, you didn't specify what version of Sitecore you are using, but the Staging Module is not needed for Sitecore 6.3+. The Staging Module was generally used to keep file systems in sync and to clear the cache of Content Delivery servers.
However, the one piece of the puzzle that is missing here is that you will still need to update your code (.jpg, .css, .js, .dll, etc.) on the QA box.
The optimal solution would be to have your Sitecore items (templates, layout items, rendering items, and developer-owned content items) in source control right alongside your ASP.NET web application and any class library projects you may have. At a basic level, you can do this using the built-in "Serialization" features of Sitecore. Lars Nielsen wrote an article touching on this.
To take this to the next level, you would use a tool such as Team Development for Sitecore. This tool allows you to easily bring your Sitecore items into Visual Studio and treat them as code. At that point you could set up automated builds, or continuous integration, so that your code and Sitecore items are automatically pushed to your QA environment. There are also configuration options to handle the scenario of keeping production content in place while still deploying developer-owned items.
I recommend you look at the Staging module if you need to publish to multiple targets from the same instance, i.e. publish content from one tree over a firewall to a development site, to a QA site, etc.
If you're just migrating content from one instance to another piecemeal, you can use Sitecore packages, which are the standard tools for moving content. Packages serialize the content to XML, zip it up, and allow you to install it in other instances.
ColdFusion Report Builder is great.
One small issue. We use ANT+CFANT to deploy.
We create the report, say, against a datasource called MyApp_Dev on a dev box.
Our other server is the production server. It also contains a staging build to ensure everything is going smoothly before we publish to live. (Thanks to Al Everett for bringing this clarification to my attention.)
Everything works great when the report is created.
We deploy the report to our staging server, which has a datasource of MyApp_Staging. That server may or may not also have the live app running under MyApp_Live. Ant pushes the update to staging just fine.
Run the report: it crashes and burns. Why?
It seems the report is still looking for the MyApp_Dev datasource, even though the application is using the MyApp_Staging datasource.
In digging around I found a few approaches. I would like to do this one final, ideal way from the beginning, instead of having to go back and redo dozens of reports when I have a new aha! moment.
1) Obvious: pass the datasource into the cfreport tag. This doesn't work for ColdFusion Report Builder reports as of v8, or v9 as tested on Linux.
2) Most realistic option (but painful) so far: pass the query as an object into the Report Builder report. Let's think about this:
Create the report with Report Builder to my heart's content using RDS, etc., on my local box.
When I'm done, copy the query into a snippet of code, or into a database column, to be dynamically injected at runtime with the correct datasource.
Modify my "run report" event to pull the query from the database column, insert it into another dynamic cfquery, and potentially... evaluate (!?!) it? The fun side is that I can set the cfquery datasource to whatever I need for each environment.
When I modify the report's columns in CF Report Builder, I always have to update the query in the database. Is there a snippet of code that can extract this for me? Hmm.
3) Less than ideal: suck it up and let all the reports in staging run off the live server. Maybe copy the live data into staging (sans structural changes) so that it seems similar.
Are there any eloquent ways to accomplish the above?
Thanks in advance!
If you have different dev/staging/production boxes, why not just use the same datasource name on each? That'll save you from having the code figure out where it is.
Because security concerns at my current assignment preclude me from using RDS, I use option 2 as a matter of course. I also like it, as it makes debugging easier.