I am currently working on getting the Cumulative Update 22 release for Italy (Dynamics NAV 2017).
My previous version was CU7.
I am now looking to update my objects, but when I import the .fob file of the update I get warnings, many warnings, since my database is modified.
After a lot of research I found out that I can use the Merge-NAVApplicationObject command to merge my objects and apply the updates to them.
BUT I cannot understand which parameters I should set. The documentation lists three parameters (Original, Target, Modified).
How am I supposed to know which is which? I have already updated my platform and am now looking to update my application objects.
Has anyone done this before?
Original is CU7 without your modifications (the standard objects you started from).
Target is CU22 (the standard objects you are upgrading to).
Modified is your solution (your customized CU7 objects).
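A minimal sketch of the call, assuming a default NAV 2017 install path for the model tools module and placeholder folders (adjust both to your environment). Note that Merge-NAVApplicationObject works on text exports of the objects, not on .fob files, so export each of the three versions to .txt first.

```powershell
# Ships with the NAV 2017 development environment; adjust the path to your install.
Import-Module 'C:\Program Files (x86)\Microsoft Dynamics NAV\100\RoleTailored Client\Microsoft.Dynamics.Nav.Model.Tools.psd1'

Merge-NAVApplicationObject `
    -OriginalPath 'C:\Upgrade\CU7_Standard\*.txt' `
    -ModifiedPath 'C:\Upgrade\CU7_Customized\*.txt' `
    -TargetPath   'C:\Upgrade\CU22_Standard\*.txt' `
    -ResultPath   'C:\Upgrade\Merged'
```

Objects that cannot be merged automatically are written out as conflict files (in the result folder by default); after resolving those, import the merged text files back into your database and compile.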
I have a query that imports two sheets from an .xlsx file, combines the data, unpivots it, and does other cleanup steps. For the resulting fact table I also defined relationships to 3 further dim tables.
Then I decided to refactor the query in the Power Query editor, including splitting the query up into 3 steps and disabling load of the first two. (To play it safe I also saved my .pbix without applying the steps.)
But then after a Close & Apply I got the following cryptic error message:
Something went wrong.
Each original name must belong to the set of all names.
PBIDesktop version is 2.96.701.0 64-bit (August 2021)
Searching the web I found people suggesting that the error might be related to specific PBIDesktop versions and that moving back to a previous version would solve the issue.
For me it was easier than that, though a bit mysterious:
In Power Query I enabled load for all sub-queries,
let it apply all my steps, and the error message was gone.
Then, back in Power Query, I could disable load for the sub-queries without triggering the error again!
I hope this helps someone someday.
I'm writing a procedure for other users to run in Enterprise Guide based on SAS 9.3. It logs various bits of information to a table. Is there any way to stop this table appearing in the process flow?
NB: This is almost all done using "User written code" steps. Unfortunately the setting in the menu (see vasja's answer below) does not seem to affect UWC steps.
(I've seen this: Tell SAS not to add newly generated tables on the Process Flow, but I'm using 9.3 so it doesn't work!)
A colleague (twitter.com/binarytrain) figured out a solution.
Tables are always added to EG projects in 9.3 if, at the end of the code step, the library in which they exist is still assigned (1). So, in the question above, the trick is to clear the libname at the end of the code step.
This can further be used to "discourage" - not stop - users from meddling with temporary tables (see the sketch after the steps below):
Create a folder in &sasworklocation called _work
Register it as a library
Save any temporary tables in this new library
Clear this library at the end of the code step
At this point the temporary table is inaccessible without running a libname statement
Re-register the library when the table is required again.
(1) Even if it's assigned using a different name, so this won't work for pre-assigned libraries.
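A minimal sketch of those steps, assuming the _work subfolder has already been created under &sasworklocation; the table name audit_log is hypothetical:

```sas
/* Register the _work folder as a library (adjust the path separator to your OS). */
libname _work "&sasworklocation./_work";

/* Save the logging table in the new library instead of WORK. */
data _work.audit_log;
    set work.audit_log;
run;

/* Clear the library at the end of the code step, so EG 9.3
   does not add the table to the process flow. */
libname _work clear;

/* In a later step, re-register the library when the table is needed again. */
libname _work "&sasworklocation./_work";
```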
In EG 5.1:
go to Tools > Options and select Results General;
deselect "Automatically add output to the project tree".
I've been following the advice in this question.
How to add a WiX custom action that happens only on uninstall (via MSI)?
I have an executable running as a custom action after InstallFinalize, which I intend to use to purge all my files and folders. I was just going to write some standard deletion logic, but I'm stuck on the point Rob Mensching made that the Windows Installer should handle this in case someone bails out midway through an uninstall.
"create a CustomAction that adds temporary rows to the RemoveFiles table"
I'm looking for some more information on this. I'm not really sure how to achieve this in C++, and my searching hasn't turned up a whole lot.
http://msdn.microsoft.com/en-us/library/windows/desktop/aa371201(v=vs.85).aspx
Thanks
Neil
EDIT: I've marked the answer because the question was specifically about how to add files to the RemoveFiles table in C++; however, I'm inclined to agree that the better solution is to use the RemoveFolderEx functionality in WiX, even though it is currently in beta (3.6, I think).
Roughly, you will have to use the following functions in this order:
MsiDatabaseOpenView - the (input) database handle is the one returned by MsiGetActiveDatabase, called with the install handle your custom action function receives
MsiCreateRecord - to create a record holding the values for the parameter markers in the SQL
MsiRecord* - set of functions to prepare the record
MsiViewExecute - to insert the new record into whatever table you please
MsiCloseHandle - for the view handle from the very first step and the record handle (from MsiCreateRecord)
Everything is explained in detail over at MSDN. However, pay special attention to the section "Functions Not for Use in Custom Actions".
The documentation of MsiViewExecute also explains how the SQL queries should look. To get a feel for them you may want to use one of the .vbs scripts that are part of the Windows Installer SDK.
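A hedged C++ sketch of that sequence, adding a temporary row so that *.log files in the folder held by INSTALLFOLDER are removed on uninstall. Note the actual MSI table name is RemoveFile (singular); "PurgeLogs", "MainComponent", "INSTALLFOLDER" and "*.log" are placeholders for your own keys and values, and the custom action has to run as an immediate action early in the InstallExecuteSequence so the row is in place before the standard RemoveFiles action runs.

```cpp
#include <windows.h>
#include <msi.h>
#include <msiquery.h>
#pragma comment(lib, "msi.lib")

extern "C" UINT __stdcall ScheduleLogPurge(MSIHANDLE hInstall)
{
    // Database handle for the running installation.
    PMSIHANDLE hDatabase = MsiGetActiveDatabase(hInstall);
    if (hDatabase == 0)
        return ERROR_INSTALL_FAILURE;

    // Parameter markers (?) are filled from the record passed to MsiViewExecute;
    // TEMPORARY keeps the row in memory only, so the .msi on disk is untouched.
    PMSIHANDLE hView;
    UINT er = MsiDatabaseOpenView(hDatabase,
        TEXT("INSERT INTO `RemoveFile` "
             "(`FileKey`, `Component_`, `FileName`, `DirProperty`, `InstallMode`) "
             "VALUES (?, ?, ?, ?, ?) TEMPORARY"),
        &hView);
    if (er != ERROR_SUCCESS)
        return er;

    PMSIHANDLE hRecord = MsiCreateRecord(5);
    MsiRecordSetString(hRecord, 1, TEXT("PurgeLogs"));      // FileKey: any unique key
    MsiRecordSetString(hRecord, 2, TEXT("MainComponent"));  // Component_: an existing component
    MsiRecordSetString(hRecord, 3, TEXT("*.log"));          // FileName: wildcards are allowed
    MsiRecordSetString(hRecord, 4, TEXT("INSTALLFOLDER"));  // DirProperty: property holding the folder
    MsiRecordSetInteger(hRecord, 5, 2);                     // InstallMode: 2 = remove on uninstall

    er = MsiViewExecute(hView, hRecord);

    // PMSIHANDLE closes the view and record automatically; with plain MSIHANDLEs
    // you would call MsiCloseHandle on both here.
    return er;
}
```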
If you use WiX to create your installation package, consider using the RemoveFolderEx element. It does what you want, and you don't have to write the code yourself.
Read Tactical directory nukes for an example of how to use it.
If you still want to implement it yourself, you can get your inspiration from this blog post, there's the code for doing this in VBScript.
I'm looking for a good, efficient method for scanning a directory structure for changed files on Windows XP+. Something like what git does is exactly what I'm after: running git status quickly lists all modified files, all new (untracked) files, and all deleted files, and that is exactly what I would like to do.
I have a basic model up and running which performs an initial scan and stores all filenames, sizes, dates, and attributes.
On a subsequent scan it checks whether the size, attributes, or date have changed and marks the file as changed.
My issue now comes in detecting moved and deleted files. Is there a tried and tested method for this sort of thing? I'm struggling to come up with a good method.
I should mention that it will eventually use ReadDirectoryChangesW to monitor files and alert the user when something changes so a full scan is really a last resort after the initial scan.
Thanks,
J
EDIT: I think I may have described the problem badly. The issue I'm facing is not so much detecting the changes - I have ReadDirectoryChangesW() using IOCP on multiple threads to detect when a change happens - the issue is more what to do with the information. For example, a moved file is reported as a delete followed by a create, and a rename comes in 2 parts: old name, followed by new name. So what I'm asking is how to differentiate between a delete that is part of a move and an actual delete. I'm guessing buffering the changes and processing them in batches would be an option, but it feels messy.
In native code FileSystemWatcher is replaced by ReadDirectoryChangesW. Using this properly is not simple; there is a good baseline to build on here.
I have used this code in a previous job and it worked pretty well. The Win32 API itself (and FileSystemWatcher) are prone to problems that are described in the docs and also discussed in various places online, but the impact of those will depend on your use case.
EDIT: the exact change is indicated in the FILE_NOTIFY_INFORMATION structure that you get back - adds, removals, rename data including old and new name.
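A minimal sketch of walking that structure; the buffer and byte count are assumed to come from a completed ReadDirectoryChangesW call (for example via your IOCP completion):

```cpp
#include <windows.h>
#include <iostream>
#include <string>

void ProcessChanges(const BYTE* buffer, DWORD bytesReturned)
{
    if (bytesReturned == 0)
        return; // buffer overflowed: too many changes at once, fall back to a rescan

    std::wstring pendingOldName; // holds the OLD_NAME half of a rename pair
    DWORD offset = 0;
    for (;;)
    {
        const FILE_NOTIFY_INFORMATION* info =
            reinterpret_cast<const FILE_NOTIFY_INFORMATION*>(buffer + offset);

        // FileName is not null-terminated; FileNameLength is in bytes.
        std::wstring name(info->FileName, info->FileNameLength / sizeof(WCHAR));

        switch (info->Action)
        {
        case FILE_ACTION_ADDED:            std::wcout << L"added    " << name << L'\n'; break;
        case FILE_ACTION_REMOVED:          std::wcout << L"removed  " << name << L'\n'; break;
        case FILE_ACTION_MODIFIED:         std::wcout << L"modified " << name << L'\n'; break;
        case FILE_ACTION_RENAMED_OLD_NAME: pendingOldName = name; break;
        case FILE_ACTION_RENAMED_NEW_NAME:
            std::wcout << L"renamed  " << pendingOldName << L" -> " << name << L'\n';
            pendingOldName.clear();
            break;
        }

        if (info->NextEntryOffset == 0)
            break;
        offset += info->NextEntryOffset;
    }
}
```

For the delete-versus-move question: one approach (an assumption on my part, not something the API does for you) is to hold REMOVED events for a short grace period and match them against subsequent ADDED events by file name and size before treating them as real deletions.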
I voted Liviu M. up. However, another option, if you don't want to use the .NET Framework for some reason, would be the basic Win32 API call FindFirstChangeNotification.
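For completeness, a minimal sketch of that call with a placeholder path; unlike ReadDirectoryChangesW it only signals that something changed under the directory, so you still have to rescan to find out what:

```cpp
#include <windows.h>
#include <iostream>

int main()
{
    HANDLE hChange = FindFirstChangeNotificationW(
        L"C:\\watched",                 // hypothetical directory to watch
        TRUE,                           // watch the whole subtree
        FILE_NOTIFY_CHANGE_FILE_NAME |
        FILE_NOTIFY_CHANGE_LAST_WRITE |
        FILE_NOTIFY_CHANGE_SIZE);
    if (hChange == INVALID_HANDLE_VALUE)
        return 1;

    for (int i = 0; i < 3; ++i)         // handle a few notifications, then stop
    {
        if (WaitForSingleObject(hChange, INFINITE) != WAIT_OBJECT_0)
            break;
        std::wcout << L"something changed - rescan the tree\n";
        if (!FindNextChangeNotification(hChange))
            break;
    }
    FindCloseChangeNotification(hChange);
    return 0;
}
```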
You can use USN journaling if you are up to it; that is pretty low-level (NTFS-level) stuff.
Here you can find detailed information, source code included. It is written in C#, but most of it is P/Invoking C/C++ functions.
I have a fairly long cfc file, about 1800 lines, that worked fine in ColdFusion 8, but after upgrading my development system to ColdFusion 9 and doing some testing I get a compile error for the cfc with the message "Branch target offset too large for short". I modified the file to eliminate some unused functions and consolidated another to make it shorter, and this resolved the problem. But still, why did it die on me now when I upgraded to CF9? Has anyone else run into this problem in a previous or the current version of ColdFusion? Are there any solutions other than modifying the cfc file, such as upgrading the JVM?
EDIT
If you have an answer to the questions I have, great! Post that, but don't waste time telling me something that I already know. If you are going to post a response, please read the question carefully and answer only if you know the answer. Don't do a google search and post crap that I already know and utilized to get the code to work. The question is, why did it work in CF8 and now not in CF9? Are there other solutions besides what I did?
This is a problem inherent to the JVM, as you already know. CF9 has likely added more innate functions to a component, and if the methods are all referenced via a giant switch statement with a short used as the offset, there is less offset space to work with in each successive version. People moving from CF7 to CF8 ran into the same problem.
So the short answer is no.
Most recommendations you find basically tell you to split a large method into a smaller method and several helper methods. The first time I ran into the issue this worked for a large cfc I had. But then, as it got bigger, no number of helper functions would fix it. Eventually it had to be split into multiple cfcs.
PS: This guy said removing a transaction helped (CF7); there are none wrapping my calls, though, so I guess it's not a guaranteed fix: http://www.coldfusionmuse.com/index.cfm/2007/9/28/Branch.Target.Offset
Edit
Looks like my previous issue was a different function being too large; splitting the CFC into multiple CFCs was unnecessary. I've since split that problem method into smaller methods and have been able to consolidate all the functions in one CFC. So that seems to be the solution.
If you haven't already, try running the Code Analyzer in the CFAdmin page, "Debugging & Logging > Code Analyzer". This is a useful tool to find some changes which were made in the language between CF8 and CF9.
We had to change several variable and function names because CF 9 added built-in functions with those names.
Also check here:
http://help.adobe.com/en_US/ColdFusion/9.0/CFMLRef/WSc3ff6d0ea77859461172e0811cbec22c24-7ff0.html
CF version: 10
OS: Linux CentOS 6.0
I faced a similar issue where I had 1300+ lines of code in my cfc and one fine day I got the "Branch target offset..." error. I tried:
Running the Code Analyzer to find any legacy issues - DID NOT FIX
Editing the cfc to trim down any last bit of redundant code or comments (removed around 20+ lines) - DID NOT FIX
Splitting the code into 2 cfcs and extending one from the other - DID NOT FIX
Removing any unwanted dumps of queries and arrays (left over from testing) - THIS WORKED
So I would suggest making sure you don't have any dumps of large data content. Hope this helps.