Rebuilding Reporting Database Sitecore 8

Whenever I try to run the reporting database rebuild I get the error "Exceeded the cut off date before clearing storage". Has anyone experienced this or know what it means?

First, to work through this, make sure you attach a clean reporting database. Next, make sure that the SQL user account has the db_owner role in both reporting databases. Lastly, and while not necessarily required, I prefer to purge everything from the sitecore_analytics_index folder (in your data\indexes folder).
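If you prefer to grant that role in T-SQL rather than through SQL Server Management Studio, here is a minimal sketch (the database and login names are assumptions; substitute your own):

-- Sketch: grant db_owner to the Sitecore SQL login on both reporting databases.
-- Database and login names are assumptions; adjust to your installation.
USE Sitecore_Reporting;
EXEC sp_addrolemember 'db_owner', 'sitecore_user';

USE Sitecore_Reporting_Secondary;
EXEC sp_addrolemember 'db_owner', 'sitecore_user';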

The other option is to change the TimeToClearStorage setting as per this Sitecore article: https://doc.sitecore.net/sitecore_experience_platform/80/xdb_configuration/walkthrough_rebuilding_the_reporting_database
Specifically:
In the latest version of Sitecore xDB, the primary SQL Server reporting database contains some additional marketing definition tables that you need to copy to the secondary reporting database. When you run the rebuild reporting database page, ensure that you allow more time for the clear storage process while these tables are being copied.
The default time to clear storage setting is 1 minute. Change this setting to a time interval appropriate for your Sitecore solution, for example, 10 minutes.
To change the TimeToClearStorage setting:
Open the Sitecore.Analytics.Processing.Aggregation.config file.
Change the TimeToClearStorage setting to an appropriate time. For example 10 minutes.
<reportingStorageManager type="Sitecore.Analytics.Aggregation.History.ReportingStorageManager" singleInstance="true">
  <TimeToClearStorage>0.00:01:00</TimeToClearStorage>
</reportingStorageManager>
I set mine to 20 minutes.

1) As Marco said, you can change TimeToClearStorage.
2) Make sure the db_owner role is assigned on the secondary reporting database (the primary usually has it already).
3) Then the rebuild should go through, but if it later pauses at WaitReadyToReceiveData you might need to make some manual updates, as mentioned here.
4) After some time you should see it in 'completed' status.

Scheduled refresh with Azure SQL-only data grayed out

Setup
I have around 7 GB of data in an Azure SQL DB, which will continue to grow. The PBIX report is just under 1 GB. Currently I'm using the import method: data is loaded into PBI Desktop and then the report is published to a workspace. All the data comes from the same Azure DB, and I've already checked the firewall option that allows internal Azure connections.
Problem
I am unable to set a scheduled refresh because I haven't filled out a data source credential, but that option is grayed out so I can't fill it in. All the data comes from the same Azure DB (some used to be CSV files, but I just created tables in the DB instead and replaced the data), which is online, so I should not need a gateway.
Thoughts
Maybe the capacity of the Office tenant (not sure if it's A1-3 or larger, not sure how to check) is full, as the report is just shy of 1 GB and the error shown is just badly handled?
Maybe because I had some of the data as files first, it's not recognizing that it's all now under the same DB connection? (I deleted the report with the dataset and re-uploaded)
Maybe I should change it to "Direct Query" (which I think makes me lose some of the things I've done in the report) and pay for more DB use instead, if scheduled refresh on import just isn't possible, although this seems like the approach that should work best since it's MS talking to MS.
Maybe PBI just hates me.
Error message:
Last refresh failed: Tue Apr 06 2021 22:39:08 GMT+0000 (Greenwich Mean Time) Scheduled refresh has been disabled. Data source error: Scheduled refresh is disabled because at least one data source is missing credentials. To start the refresh again, go to this dataset's settings page and enter credentials for all data sources. Then reactivate scheduled refresh.

Is there a way to force Sitecore to sync MongoDB data with its SQL database?

I am setting up Sitecore xDB and am trying to test exactly what info gets through the system for authenticated and non-authenticated users. I would like to be able to make a change and see the results quickly in Sitecore. I found the setting to lower the session lifetime to 1 minute rather than 20. I have not found a way to force Sitecore to sync with Mongo on demand, or at least within 1-5 minutes rather than what currently appears to be about 20 minutes. Does such a mechanism exist, or is "rebuilding" the database as explained here the only existing process?
See this blog post by Martina Welander for this and more good info about xDB sessions: https://mhwelander.net/2016/08/24/whats-in-a-session-what-exactly-happens-during-a-session-and-how-does-the-xdb-know-who-you-are/
You just need a utility page that calls System.Web.HttpContext.Current.Session.Abandon(). You may also want to redirect the user to a page that doesn't exist.
Update to address comment
My understanding is that once an xDB session has expired, processing should take place quickly. In the Sitecore.Analytics.Processing.Services.config file, the BackgroundService agent is set to run on an interval of 15 seconds by default.
You may just be seeing cached reporting data. Try clearing the cache using the /sitecore/admin/cache.aspx page. You could also decrease the defaultCacheExpiration setting for the reporting cacheProvider in the Sitecore.Analytics.Reporting.config file. The default is 10 minutes.

Reducing history in CiviCRM

I have a CiviCRM site with 30,000 contacts. I am noticing a number of places where history is logged, and the database is getting larger over time. Does anybody have any thoughts on removing history? Has anybody created scripts to clean up old history data?
I am not sure what history you want to delete, but here are a couple of things you can do.
All the logging and history data are important, so think twice before deleting them.
1) If you have "Logging" enabled under Misc., you will get a log table for every table in the CiviCRM database.
2) Every contact has a Changelog; I assume by history you mean this one.
3) Remove deleted records permanently; this will eliminate the possibility of checking revision records in some places.
4) At the extreme, you can even delete activities, but you will not want to do that.
At the end of the day, it is a CRM, and deleting any of these records is a loss of data.
If you are referring to the detailed logging option (as set up by @popcm), then you can set this detailed logging to write to a separate database - it's a setting in the civicrm.settings.php file.
Then you could occasionally dump all the data from this database and store it offline, emptying the online database on each occasion.
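If the detailed logging writes to its own database, the dump-and-empty step can be done in plain SQL, one table at a time. A sketch (MySQL; the database name, table name, and output path are all assumptions, and you would repeat this for each log_* table):

-- Sketch: archive one CiviCRM detail-log table to a file, then empty it.
-- Database name, table name, and output path are assumptions.
SELECT * INTO OUTFILE '/backups/log_civicrm_contact.csv'
FROM civicrm_logging.log_civicrm_contact;
TRUNCATE TABLE civicrm_logging.log_civicrm_contact;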
If you are referring simply to the changelog history or other aspects of the CiviCRM data, then as @popcm indicates, you really don't want to delete this, as you'll only regret it later.
If keeping lots of data online is a concern, look to strengthen your security.

Track all MySQL queries in WAMP

I want to track how much time my queries need to execute.
I referred to this post, but I get only the queries without the time.
Is it possible that, after using my web application for a while (running SELECT, UPDATE, and INSERT queries through real web-application execution, not from the console), I can get a summary like the output generated by the SHOW PROFILES; command?
I am working with WAMP, MySQL v5.5.24.
Many thanks
Edit: I used triggers to track the UPDATE and INSERT statements, following this method.
I still have the problem of how to track the SELECT queries.
Any ideas, please?
This no longer works.
As of July 2013, you need:
general-log=1
general-log-file = "C:\wamp\logs\mysql_general.log"
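You can also switch the general log on at runtime, without restarting MySQL. A sketch (the log file path is an assumption; use a location MySQL can write to):

SET GLOBAL general_log_file = 'C:/wamp/logs/mysql_general.log'; -- assumed path
SET GLOBAL general_log = 'ON'; -- takes effect immediately, no restart needed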
Are you sure you are not getting execution times in your slow query log?
If you are just looking to optimize your queries (rather than checking the execution time of every single one), you can look at the mysql server status in phpmyadmin (assuming you kept it in your wamp server) as covered here. The full tutorial is paid, but the preview will get you into the server status page where phpmyadmin will point out problem areas for you.
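Under the hood, that status page reads the server's status counters, which you can also query directly. A sketch (these are standard MySQL status variables):

SHOW GLOBAL STATUS LIKE 'Slow_queries'; -- statements exceeding long_query_time
SHOW GLOBAL STATUS LIKE 'Questions';    -- total statements executed
SHOW GLOBAL STATUS LIKE 'Com_select';   -- number of SELECT statements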
Finally, I used the general log by setting up the WAMP server like this:
[mysqld]
port=3306
long_query_time = 1
slow_query_log = 1
slow_query_log_file = "E:/wamp/logs/slowquery.log"
log = "E:/wamp/logs/genquery.log"
After that I used the trial version of this tool, dbForge Studio, which has a query profiler that gives me the complete execution time.
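For per-query timings without an external tool, the profiler built into MySQL 5.5 is another option. A sketch (profiling is per-session, and the table name is only an example):

SET profiling = 1;            -- turn on profiling for this session
SELECT COUNT(*) FROM mytable; -- run the queries you want to measure
SHOW PROFILES;                -- lists each profiled statement with its duration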

Coldfusion: Move data from one datasource to another

I need to move a series of tables from one datasource to another. Our hosting company doesn't give shared passwords amongst the databases so I can't write a SQL script to handle it.
The best option is just writing a little ColdFusion script that takes care of it.
Ordinarily I would do something like:
SELECT * INTO database.table FROM database.table
The only problem with this is that cfquery doesn't allow you to use two datasources in the same query.
I don't think I could use a QoQ either, because you can't tell it to use the second datasource; you can only give it a dbType of 'query'.
Can anyone think of any intelligent ways of getting this done? Or is the only option to just loop over each row in the first query, adding them individually to the second?
My problem with that is that it will take much longer. We have a lot of tables to move.
Ok, so you don't have a shared password between the databases, but you do seem to have the passwords for each individual database (since you have datasources set up). So, can you create a linked server definition from database 1 to database 2? User credentials can be saved against the linked server, so they don't have to be the same as the source DB. Once that's set up, you can definitely move data between the two DBs.
We use this all the time to sync data from our live database into our test environment. I can provide more specific SQL if this would work for you.
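A minimal sketch of what that looks like on SQL Server (the server, database, table, and credential names here are all assumptions):

-- Sketch: define a linked server, store remote credentials against it,
-- then copy a table across. All names are assumptions; substitute your own.
EXEC sp_addlinkedserver
    @server = 'REMOTEDB',
    @srvproduct = '',
    @provider = 'SQLNCLI',
    @datasrc = 'remote.host.example';
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = 'REMOTEDB',
    @useself = 'false',
    @rmtuser = 'remote_user',
    @rmtpassword = 'remote_password';
-- Linked servers are referenced with four-part naming.
SELECT * INTO dbo.MyTable
FROM REMOTEDB.SourceDatabase.dbo.MyTable;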
You CAN access two databases, but not two datasources in the same query.
I wrote something a few years ago called "DataSynch" for just this sort of thing.
http://www.bryantwebconsulting.com/blog/index.cfm/2006/9/20/database_synchronization
Everything you need for this to work is included in my free "com.sebtools" package:
http://sebtools.riaforge.org/
I haven't actually used this in a few years, but I can't think of any reason why it wouldn't still work.
Henry - why do any of this? Why not just use SQL Server Management Studio to move over the selected tables using the "import data" function? (Right-click on your DB and choose "import", then use the native client and the permissions for the "other" database to specify the tables.) Your Management Studio will need access to both DBs, but the DB servers themselves do not need access to each other; Management Studio serves as a conduit.