I'm trying to use JRules BRMS 7.1 for a project, and I have found that DVS has a limitation when testing rulesets: it cannot test the contents of collections of complex types in the Excel scenario file templates.
I understand this is to be expected, since that kind of content is too complex for an Excel table format.
So, does anyone have an idea of the best way to test a ruleset that needs tons of test cases with lots of complex-type inputs, without using DVS?
If developers are doing the testing, use JUnit with an embedded rule engine. If non-technical users need to perform the testing, it may be simplest to upgrade to WODM 7.5, which does not have this limitation. If that is not an option, it is possible to do this with JRules 7.1 DVS, but it is somewhat complex: it involves creating a separate wrapper rule project that takes the output collections as input and, in its XOM, performs the comparison with the actual results.
Raj Rao is correct: you can use arrays as expected results (input is easy), but you will have to use a hidden JRules API, and it is painful anyway.
JUnit or 7.5 is the answer, unless you want to pay IBM to do it, and even then they may say it is not possible because it is not detailed anywhere :(
Cheers
PS: BTW, arrays of complex types as input are definitely easy and, I think, well documented.
If you have deployed your rules as an HTDS service to RES, then you could use SoapUI to test the HTDS web service.
SoapUI allows you to set up test cases that can be used to test different scenarios.
To validate the rules using Decision Validation Services, you create an Excel scenario file template that you populate with scenarios to test.
Before generating the Excel scenario file template, you must check that your project does not contain any errors or warnings that could prevent the generation of the Excel file.
1. In the Rule Explorer, select your project, enable the DVS part for the rule project, check the project, and make sure that you don't have any errors.
2. Create the scenario file: click Next and give a name for the test, e.g. projectname.xls.
3. Enter the input values in the scenario columns and the expected results in the expected-results columns.
4. You can test multiple scenarios at a time.
5. Save and close the Excel file.
6. Open the run configurations, right-click DVS Excel File, and give the test configuration any name.
7. In the Excel File field, click Browse and select the .xls file.
8. In the Rule Project field, select your rule project.
9. In the HTML Report field, select your project and click OK.
10. Click Apply, then Run.
11. In Rule Studio, right-click your project and click Refresh.
12. The HTML report file is generated in the project.
13. Right-click it, open it with a web browser, and review the results of your scenarios.
14. You have now successfully run DVS.
I am from Slovakia; I wouldn't be surprised if most of you haven't heard of it.
However, that causes me trouble when it comes to reports. We need three (soon four) language versions of each report: Slovak is the main language, then Polish and English.
Since Pentaho supports neither Polish nor Slovak, it is a real pain for me to keep these localized.
What I do is:
Create the report in Slovak
Write down all the phrases from the report
Send the phrases to one of our partners to translate
Create a copy of the report in either the pl or en directory
Open it in Report Designer and edit every phrase accordingly
Save it as another language version
As you can imagine, the process is very time-consuming and error-prone. Plus, every time I add a new parameter to a report or change its data source (which is a BeanShell script), I need to do it in three separate files. As a result, the other language versions are usually out of date, way behind the main language version.
I have tried to automate it with OneSky and wrote a Python script that works in two stages:
Stage 1 (extract and upload):
Change the *.prpt file suffix to *.zip
Extract the phrases from the files ~/datadefinition.xml, ~/layout.xml, ~/styles.xml, ~/datasources/inline-ds.xml
Put those phrases into a *.po file
Export the *.po file to OneSky
Stage 2 (download and import):
Change the *.prpt file suffix to *.zip
Download the translated *.po file from OneSky
Go through the ~/datadefinition.xml, ~/layout.xml, ~/styles.xml, ~/datasources/inline-ds.xml files and replace the original phrases with the translated ones
While this approach works fine, it does not translate everything, and the process still has flaws. I need to go through it every time I make even the slightest change to a report's data source or fix a small mistake. Even if I just make a small fix in the SQL code, I need to do it in three files, which of course increases the chance of making a mistake.
So, I was wondering: how are you guys solving this issue of translating your reports?
I will share the very simple method we are following.
1) Create a properties file in key=value format for each language, containing the resource labels (the static values).
2) Put it into the resources folder (report-designer/resources/).
3) Based on a parameter you can specify which properties file to use, and you put keys into the value fields so the report knows which value to display in which language.
4) If you need to translate the data coming from the database, you have to design a data warehouse and specify all the mappings; the report can then fetch the data accordingly.
5) For converting dates, currency symbols, or number formats, you can use built-in functions that handle all of this. I am using MySQL, and MySQL has functions that can handle such conversions.
It is difficult to explain the entire process here, but if this gives you an idea, it may be useful to you.
I searched a lot regarding Data Matrix code generation from NAV 2015 but could not find a proper solution. I got some code from the link below, but some of the automation variables are not available in Navision, so I need your help: is there a codeunit, some other object, or any other way to do this in NAV?
http://www.barcode-soft.com/dynamics-nav-barcode.aspx
It depends on how much time you have to produce the barcode.
If it's a back-end job, like a report, you can call a command-line tool to create the barcode and import the generated image file into a BLOB field of a table variable. That table field can then be printed within the report.
Another way I use in production is to run a web service that creates the barcode and then have Navision create a web page that is opened in a browser window.
I suggest using a DLL (written in C# with ZXing.Net) to generate it and then importing it into NAV.
I am currently developing a Windows application that tests railroad equipment to find any defects.
Utility A => OK
Utility B => NOK
...
This application will check the given equipment and generate a report.
This report needs to be written once, and no further modifications are allowed, since this file can be used as working proof for the equipment.
My first idea was to use PDF files (the Haru library looks great), but PDFs can also be modified.
I told myself that I could obfuscate the report and implement a homemade reader inside my application, but whatever way I store it, the file could always be accessed and modified, right?
So I'm running out of ideas.
Sorry if my approach and my problem seem naive, but this is an internship.
Thanks for any help.
Edit: I could also add checksums for the files after I generate them, keep a "checksum record file", and implement a checksum comparison tool for verification? I just thought of this.
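For illustration, a minimal sketch of that checksum idea in C++ (assuming OpenSSL 1.1+ is available for hashing; function and path names are placeholders):

    // Compute a SHA-256 checksum of a generated report file using OpenSSL's
    // EVP digest API; the hex string would go into the checksum record file.
    #include <openssl/evp.h>
    #include <cstdio>
    #include <fstream>
    #include <string>
    #include <vector>

    std::string sha256_hex(const std::string& path)
    {
        std::ifstream in(path, std::ios::binary);
        EVP_MD_CTX* ctx = EVP_MD_CTX_new();
        EVP_DigestInit_ex(ctx, EVP_sha256(), nullptr);

        // Stream the file through the digest in chunks.
        std::vector<char> buf(64 * 1024);
        while (in.read(buf.data(), buf.size()), in.gcount() > 0)
            EVP_DigestUpdate(ctx, buf.data(), static_cast<size_t>(in.gcount()));

        unsigned char digest[EVP_MAX_MD_SIZE];
        unsigned int len = 0;
        EVP_DigestFinal_ex(ctx, digest, &len);
        EVP_MD_CTX_free(ctx);

        // Convert the raw digest to a lowercase hex string.
        std::string hex;
        char byteHex[3];
        for (unsigned int i = 0; i < len; ++i) {
            std::snprintf(byteHex, sizeof(byteHex), "%02x", digest[i]);
            hex += byteHex;
        }
        return hex;
    }

(Of course, anyone who can rewrite the report file can also recompute its checksum, so the checksum record itself would need to be protected or signed.)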
I believe the answer to your question is to use any format whatsoever and add a digital signature that anybody can verify: e.g., create a GnuPG key, get that key signed by the people who need to check your documents, upload it to one of the key servers, and use it to sign the documents. You can publish the documents and provide a link to your public key for verification; for critical cases, whoever verifies must trust your signature (i.e., trust somebody who signed your key).
People's lives depend on the state of train inspections. Therefore, I find it hard to believe that someone expects you to solve this problem only using free-as-in-beer components.
Adobe supports a strong digital signature model. If you buy into their technology base, you can create PDFs that are digitally signed and therefore tamper-evident, since the consumer can check the signature.
You can, as someone else pointed out, use GnuPG, or for that matter OpenSSL, to implement your own signature scheme, but railroad regulators are somewhat less likely to figure out how to work with it.
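If you do go the OpenSSL route, a minimal C++ sketch of signing a report could look like this (assuming OpenSSL 1.1+ and a PEM-encoded private key; file names are placeholders and error handling is simplified):

    // Produce a detached SHA-256 signature for a generated report using
    // OpenSSL's EVP API. The signature file is stored next to the report.
    #include <openssl/evp.h>
    #include <openssl/pem.h>
    #include <cstdio>
    #include <fstream>
    #include <string>
    #include <vector>

    bool sign_report(const std::string& reportPath,
                     const std::string& keyPath,   // PEM private key (placeholder)
                     const std::string& sigPath)   // output signature file
    {
        // Load the private key; in practice this key must be well protected.
        FILE* fp = std::fopen(keyPath.c_str(), "rb");
        if (!fp) return false;
        EVP_PKEY* key = PEM_read_PrivateKey(fp, nullptr, nullptr, nullptr);
        std::fclose(fp);
        if (!key) return false;

        EVP_MD_CTX* ctx = EVP_MD_CTX_new();
        bool ok = EVP_DigestSignInit(ctx, nullptr, EVP_sha256(), nullptr, key) == 1;

        // Stream the report contents through the signing digest.
        std::ifstream in(reportPath, std::ios::binary);
        std::vector<char> buf(64 * 1024);
        while (ok && (in.read(buf.data(), buf.size()), in.gcount() > 0))
            ok = EVP_DigestSignUpdate(ctx, buf.data(),
                                      static_cast<size_t>(in.gcount())) == 1;

        // First call queries the required size, second call writes the signature.
        size_t sigLen = 0;
        std::vector<unsigned char> sig;
        if (ok) ok = EVP_DigestSignFinal(ctx, nullptr, &sigLen) == 1;
        if (ok) {
            sig.resize(sigLen);
            ok = EVP_DigestSignFinal(ctx, sig.data(), &sigLen) == 1;
        }

        EVP_MD_CTX_free(ctx);
        EVP_PKEY_free(key);
        if (!ok) return false;

        std::ofstream out(sigPath, std::ios::binary);
        out.write(reinterpret_cast<const char*>(sig.data()),
                  static_cast<std::streamsize>(sigLen));
        return out.good();
    }

Verification is the mirror image with the public key and EVP_DigestVerifyInit/Update/Final, which is what an inspector or regulator would run.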
I would store reports in an encrypted/protected datastore.
When a user accesses a report (requests a copy; the original is of course always in the database and cannot be modified), it includes the text "Report #XXXXX". If you want to validate the report, retrieve a new copy from the system using the report ID.
Does anyone know if you can use CCTray (or an equivalent) with AnthillPro? I'm not finding a lot of documentation and am new to using AHP.
Thanks.
You should be able to use CCTray type tools with AnthillPro. You would need to create a custom report to generate the XML though.
Shoot me an email at eric#urbancode.com. I may be able to write this later in the week.
Otherwise, you could experiment with report writing.
You can find the cc xml format here: http://confluence.public.thoughtworks.org/display/CI/Multiple+Project+Summary+Reporting+Standard
Example AP report code that iterates over each build workflow and spits out data about the latest build is here: https://bugs.urbancode.com/browse/AHPSCRIPTS-13
The "Recent Build Life Activity (RSS)" report that I think ships with the product would give you an XML example.
I've been following the advice in this question:
How to add a WiX custom action that happens only on uninstall (via MSI)?
I have an executable running as a custom action after InstallFinalize, with which I intend to purge all my files and folders. I was just going to write some standard deletion logic, but I'm stuck on the point that Rob Mensching made: the Windows Installer should handle this, in case someone bails out midway through an uninstallation.
"create a CustomAction that adds temporary rows to the RemoveFiles table"
I'm looking for some more information on this. I'm not really sure how to achieve it in C++, and my searching hasn't turned up a whole lot.
http://msdn.microsoft.com/en-us/library/windows/desktop/aa371201(v=vs.85).aspx
Thanks
Neil
EDIT: I've accepted the answer because the question specifically asks how to add files to the RemoveFiles table in C++; however, I'm inclined to agree that the better solution is to use the RemoveFolderEx functionality in WiX, even though it is currently in beta (3.6, I think).
Roughly, you will have to use the following functions in this order:
MsiDatabaseOpenView - the (input) database handle is obtained from the session handle you get inside your custom action function, via MsiGetActiveDatabase
MsiCreateRecord - to create a record holding the values referenced by the SQL statement
MsiRecord* - the set of functions (MsiRecordSetString, MsiRecordSetInteger, ...) used to prepare the record
MsiViewExecute - to insert the new record into whatever table you please ...
MsiCloseHandle - on the view handle from the very first step and on the record handle (from MsiCreateRecord)
Everything is explained in detail over at MSDN. However, pay special attention to the section "Functions Not for Use in Custom Actions".
The documentation of MsiViewExecute also explains how the SQL queries should look. To get a feel for them you may want to use one of the .vbs scripts that are part of the Windows Installer SDK.
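To make that concrete, here is a rough sketch of how those calls could fit together in an immediate C++ custom action. The file key, component name, directory property, and file pattern below are placeholders for your own setup, and the action must be sequenced before the standard RemoveFiles action (so scheduling it after InstallFinalize, as in your current setup, would be too late for these rows to take effect):

    // Hypothetical immediate custom action that inserts a temporary row into
    // the RemoveFiles table so Windows Installer itself deletes leftover files
    // on uninstall. Export this entry point from the custom action DLL and
    // link against msi.lib.
    #include <windows.h>
    #include <msi.h>
    #include <msiquery.h>

    extern "C" UINT __stdcall ScheduleLeftoverCleanup(MSIHANDLE hInstall)
    {
        // The database handle comes from the session handle passed to the CA.
        MSIHANDLE hDatabase = MsiGetActiveDatabase(hInstall);
        if (hDatabase == 0)
            return ERROR_INSTALL_FAILURE;

        // TEMPORARY makes the row exist only for this installation session.
        MSIHANDLE hView = 0;
        UINT er = MsiDatabaseOpenView(hDatabase,
            TEXT("INSERT INTO `RemoveFiles` (`FileKey`, `Component_`, `FileName`, ")
            TEXT("`DirProperty`, `InstallMode`) VALUES (?, ?, ?, ?, ?) TEMPORARY"),
            &hView);

        MSIHANDLE hRecord = 0;
        if (er == ERROR_SUCCESS)
        {
            // The record fields fill the '?' markers in column order.
            hRecord = MsiCreateRecord(5);
            MsiRecordSetString(hRecord, 1, TEXT("PurgeLogFiles"));    // FileKey: any unique id
            MsiRecordSetString(hRecord, 2, TEXT("MainComponent"));    // Component_: must exist in the Component table
            MsiRecordSetString(hRecord, 3, TEXT("*.log"));            // FileName: wildcards allowed
            MsiRecordSetString(hRecord, 4, TEXT("INSTALLLOCATION"));  // DirProperty: property holding the folder path
            MsiRecordSetInteger(hRecord, 5, 2);                       // 2 = msidbRemoveFileInstallModeOnRemove

            er = MsiViewExecute(hView, hRecord);
        }

        if (hRecord) MsiCloseHandle(hRecord);
        if (hView)   MsiCloseHandle(hView);
        MsiCloseHandle(hDatabase);

        return (er == ERROR_SUCCESS) ? ERROR_SUCCESS : ERROR_INSTALL_FAILURE;
    }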
If you use WiX to create your installation package, consider using the RemoveFolderEx element. It does what you want, and you don't have to write the code yourself.
Read Tactical directory nukes for an example of how to use it.
If you still want to implement it yourself, you can get your inspiration from this blog post; it has the code for doing this in VBScript.