Does anyone know how to fix this error on Microsoft Sync Framework: SQL Server change tracking has cleaned up tracking information for table ... ?
If SQL Server change tracking has indeed cleaned up the tracking information, there's not much you can do about it; you will have to reinitialize your clients.
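Reinitialization usually becomes necessary when clients sync less often than the change tracking retention period allows. A quick check of the current retention settings, and an example of extending them, might look like this (MyDB is a placeholder; 14 days is just an illustrative value):

-- Check the current retention period and auto-cleanup setting per database.
SELECT DB_NAME(database_id) AS database_name,
       retention_period,
       retention_period_units_desc,
       is_auto_cleanup_on
FROM sys.change_tracking_databases;

-- Extend retention so it comfortably exceeds the longest sync interval.
ALTER DATABASE MyDB
SET CHANGE_TRACKING (CHANGE_RETENTION = 14 DAYS, AUTO_CLEANUP = ON);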
Recently there was a change in the BigQuery UI, and it seems it is no longer possible to schedule a stored procedure to execute automatically.
The UI just keeps asking for a destination table. If I put in a dummy table, the schedule is created, but when it tries to execute it throws an error saying that a destination table cannot be set when executing a stored procedure.
Is anyone else having this issue, and is there any kind of workaround?
Thanks in advance.
You can opt out of the preview features by clicking the Hide Preview Features button in the top bar of the UI.
This is a known issue that the BigQuery engineering team is already aware of; they are rolling out a fix soon. You can follow their progress on this Public Issue.
In Azure SQL Data Warehouse I used the T-SQL below to enable automatic statistics creation. The command ran successfully, but when I checked the database properties under the Options tab, Auto Create Statistics is still set to False.
ALTER DATABASE MyDB SET AUTO_CREATE_STATISTICS ON;
Please let me know if I'm missing something here. I also have db_owner access to the database.
I'm guessing that you are using SQL Server Management Studio.
I was able to reproduce the symptom by turning off and on auto_create_statistics.
The issue appears to be that the database metadata is cached in SSMS. Right-click the database name and select "Refresh" before selecting "Properties". Using this method I got the correct setting for auto_create_statistics showing up each time.
My tests were done using SSMS 17.7
(The need to refresh the database metadata can also occur when adding or removing tables, columns, etc)
You can also query sys.databases and check the is_auto_create_stats_on column.
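For example, using the MyDB database from the question:

-- Returns 1 when AUTO_CREATE_STATISTICS is enabled, bypassing
-- any stale metadata cached by the SSMS properties dialog.
SELECT name, is_auto_create_stats_on
FROM sys.databases
WHERE name = 'MyDB';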
I have a Tabular SSAS project that I have deployed to SQL Server Analysis Services 2012.
Then I created a BISM connection to that deployed project.
Then I created a PowerPivot Excel report from that BISM connection using its connection string, and it works correctly in Excel.
But when I upload it to SharePoint, it gives me the following error while refreshing the data from the BISM connection.
This error is observed only when I open the report in the browser, i.e. Excel Web Access; the same error does not occur when I refresh the report in Microsoft Excel 2013.
Why does it give an error only in SharePoint, i.e. Excel Services? Because of this I am not able to change slicers either.
One solution I found was to apply a SQL Server hotfix, but I haven't tried it. Please refer to the link below: Hot Fix for SQL Server Service Pack 1 Update
Please help!
Thanks in advance.
This refresh should be possible, assuming permissions are set up correctly. You should post ULS logs for the time period of the refresh.
Does this same thing happen if you do a scheduled refresh (and choose refresh now)?
Is it possible to view the contents of a Dynamics NAV 2013 database table while in a debugging session?
In the development environment I can normally hit Run on any table and explore its contents. However, while the debugger is running this is not possible, since the whole Dynamics NAV environment freezes when the debugger stops on a breakpoint.
One workaround I have found is to copy the relevant data to Excel before running the debugger, but that is not very convenient. Also, in the debugger's watch list I can only view single variables, not a whole database table.
You can simply open SQL Server Management Studio and have a look at the tables.
Of course, you will only see changes once they are committed, so either the code in NAV must have passed the trigger where the record is modified, or you must call COMMIT() explicitly.
If you have never used SQL Server Management Studio, note that table names are prefixed with the company name.
For example, the Item Ledger Entry table in the demo database is:
[CRONUS AG$Item Ledger Entry]
and a SELECT statement for reading all records in the table could be:
SELECT *
FROM [Demo Database NAV (7-0)].[dbo].[CRONUS AG$Item Ledger Entry]
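If you need to see rows that NAV has modified but not yet committed (for example while the debugger is holding a transaction open), one option is a dirty read with the NOLOCK hint; use it with care, since you may read data that is later rolled back:

-- Dirty read: also shows uncommitted changes from the suspended NAV session.
SELECT *
FROM [Demo Database NAV (7-0)].[dbo].[CRONUS AG$Item Ledger Entry] WITH (NOLOCK)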
Regards
Alex
The debugger does not have a "table view". You're either stuck with using SQL, without getting calculated fields shown, or you can use another session (in some cases that requires another service tier, since the debugger has the nasty tendency to block the entire service tier).
But another session will not display uncommitted data.
An alternative (not a great one) is to create a simple method that loops through all records and dumps FORMAT(rec) into a text file. That method can then be called wherever you need to inspect the table.
But unless calculated fields are necessary, I would also go with SQL.
I want to track how much time my queries need to execute.
I referred to this post, but I only get the queries without the times.
Is it possible that, after using my web application for a while (SELECT, UPDATE, and INSERT queries from real web-application execution, not from the console), I can get a summary like the output generated by the SHOW PROFILES; command?
I am working with WAMP, MySQL v5.5.24.
Many thanks
Edit: I used triggers to track the UPDATE and INSERT statements following this method; a sketch of such a trigger is shown below.
I still have the problem of how to track SELECT queries.
Any ideas, please?
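For reference, the trigger approach mentioned in the edit above could look like this; the orders table and the query_audit log table are made-up names for illustration:

-- Hypothetical audit table that the triggers write into.
CREATE TABLE query_audit (
    id INT AUTO_INCREMENT PRIMARY KEY,
    table_name VARCHAR(64),
    action VARCHAR(10),
    changed_at DATETIME
);

-- Record every INSERT on the (hypothetical) orders table.
CREATE TRIGGER orders_after_insert
AFTER INSERT ON orders
FOR EACH ROW
INSERT INTO query_audit (table_name, action, changed_at)
VALUES ('orders', 'INSERT', NOW());

Note that MySQL triggers cannot fire on SELECT statements, which is why the general log described in the answers below is the usual way to capture those.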
This no longer works.
As of July 2013, you need:
general-log=1
general-log-file = "C:\wamp\logs\mysql_general.log"
Are you sure you are not getting execution times in your slow query log?
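On MySQL 5.1 and later (including 5.5.24) the slow query log can be checked and enabled at runtime, without restarting WAMP; a minimal example:

-- See whether the slow query log is enabled and where it writes.
SHOW VARIABLES LIKE 'slow_query%';

-- Turn it on for the running server and log anything slower than 1 second
-- (SET GLOBAL long_query_time only affects new connections).
SET GLOBAL slow_query_log = 1;
SET GLOBAL long_query_time = 1;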
If you are just looking to optimize your queries (rather than checking the execution time of every single one), you can look at the MySQL server status in phpMyAdmin (assuming you kept it in your WAMP server), as covered here. The full tutorial is paid, but the preview will get you into the server status page, where phpMyAdmin will point out problem areas for you.
Finally, I used the general log by setting up the WAMP server like this:
[mysqld]
port=3306
long_query_time = 1
slow_query_log = 1
slow_query_log_file = "E:/wamp/logs/slowquery.log"
log = "E:/wamp/logs/genquery.log"
After that I used the trial version of dbForge Studio, where I can use a query profiler and get the complete execution time.
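For completeness, the SHOW PROFILES output mentioned in the question can also be produced with the built-in session profiler. It only times queries run on the current connection, so it will not capture the web application's traffic, but it is handy for testing individual queries:

-- Enable profiling for this session only.
SET profiling = 1;

-- Run the query (or queries) you want to time.
SELECT COUNT(*) FROM information_schema.tables;

-- List the recorded durations.
SHOW PROFILES;

-- Detailed per-stage breakdown of query 1 from the list above.
SHOW PROFILE FOR QUERY 1;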