I am in the process of developing a fairly complex rules engine, so I decided to take the help of an existing GNU/open-source rules engine and integrate it with my application. I came across CLIPS as a good rules engine.
Now, my application is in C++, and I want a sample (a "Hello world" kind of program) from which I can learn how to integrate .clp rules into my C++ application.
Question
My application is developed on Linux/AIX/HP-UX and MinGW (for Windows). Can we develop the rules in CLIPS and integrate them with a C++ application on all these platforms? Can you please share a link on how to do the integration?
The fundamental reason for moving to a rules engine is that, in my experience, rules "built" directly into my C/C++ application take a huge amount of memory/CPU. I am under the impression that by using a rules engine I can achieve the same thing in a more optimized way (better resource utilization). Can CLIPS help me achieve that?
Update 1:
What kind of application are you developing?
To put it in a line, I am developing a filter-match-based counter. A user may increment counts, e.g. (NetworkID=XYZ, Increment count=7), (NetworkID=MNO, Increment count=934), etc. Now, if a query arrives for NetworkID=X*, I need to provide the counts for everything from XAA...XZZ. The data is updated by multiple processes and multiple threads across different nodes (a distributed environment).
Why do you have expert system rules inside, and what kind of rules?
My platform/application is in C++ (this is where the user does the increment/decrement/query), and I want to use a rules engine to help with these operations; writing the logic in C/C++ code seems to consume more resources than needed.
PS: The critical code related to increment/decrement/query is all optimized C code; some wrappers are in C++. So I am looking for a rules engine that can do this for me and that can be invoked from my platform/application (in C/C++ code).
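For reference, here is a rough, single-threaded C++ sketch of the kind of prefix-match counting described above (the std::map choice and the names are just for illustration, not my actual implementation):

#include <iostream>
#include <map>
#include <string>

// Illustration only; the real system is multi-process/multi-threaded and distributed.
std::map<std::string, long> counts;   // NetworkID -> count

void increment(const std::string &id, long delta)
{
    counts[id] += delta;
}

// Sum the counts of every NetworkID starting with the given prefix,
// e.g. "X" covers XAA...XZZ.
long query_prefix(const std::string &prefix)
{
    long total = 0;
    for (auto it = counts.lower_bound(prefix);
         it != counts.end() && it->first.compare(0, prefix.size(), prefix) == 0;
         ++it)
        total += it->second;
    return total;
}

int main()
{
    increment("XYZ", 7);
    increment("MNO", 934);
    std::cout << query_prefix("X") << "\n";   // prints 7
    return 0;
}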
The easiest way to integrate CLIPS with C++ is to use the compiler option (if available) to compile the C code as C++ code. From Section 1.2, C++ Compatibility, of the Advanced Programming Guide (http://clipsrules.sourceforge.net/documentation/v624/apg.htm):
The CLIPS source code can now be compiled using either an ANSI C or
C++ compiler. Minimally, non-ANSI C compilers must support full ANSI
style function prototypes and the void data type in order to compile
CLIPS. If you want to make CLIPS API calls from a C++ program, it is
usually easier to do the integration by compiling the CLIPS source
files as C++ files. This removes the need to make an extern "C"
declaration in your C++ program for the CLIPS APIs. Some programming
environments allow you to specify whether a file should be
compiled as C or C++ code based on the file extension. Other
environments allow you to explicitly specify which compiler to use
regardless of the extension (e.g. in gcc the option “-x c++” will
compile .c files as C++ files). In some environments, the same
compiler is used to compile both C and C++ programs and the compiler
uses the file extension to determine whether the file should be
compiled as a C or C++ program. In this situation, changing the .c
extension of the CLIPS source files to .cpp usually allows the source
to be compiled as a C++ program.
Optionally, you can try using something like clipsmm (http://sourceforge.net/projects/clipsmm/), which is a C++ interface to the CLIPS library.
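Either way, a minimal "Hello world"-style sketch of driving CLIPS directly from C++ might look like this, assuming the CLIPS 6.24 sources have been compiled as C++ as described above (so no extern "C" wrapper is needed); the rule file name hello.clp is just a placeholder:

#include "clips.h"

int main()
{
    void *env = CreateEnvironment();   // per-environment API from the Advanced Programming Guide
    char ruleFile[] = "hello.clp";     // placeholder .clp file name
    EnvLoad(env, ruleFile);            // load the rule definitions
    EnvReset(env);                     // assert the initial facts
    EnvRun(env, -1L);                  // fire rules until the agenda is empty
    DestroyEnvironment(env);           // clean up
    return 0;
}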
Related
This question has been bothering me so much for the past couple of days. I was wondering how the standard library works, in terms of functionality. I couldn't find an answer anywhere, even by checking the source code provided by the LLVM compiler, which is, for a beginner like me, a really complicated piece of code.
What I'm basically trying to understand here is how the C++ standard library works. For example, let's take the fstream header file, which consists of a bunch of functions that help to write to and read from files.
How does it work? Does it use the OS-specific API (since the library is cross-platform), or what? And, if the standard library can do it, aren't I supposed to be able to mess with some files as well without calling the standard fstream header (which, in my experience, I can't do)?
I apologize if my questions are unclear since I'm not a native English speaker: feel free to modify this text so as to make it clearer.
Does it use the OS-specific API (since the library is cross-platform), or what?
At some point, the OS-specific API is used. The fstream implementation does not necessarily call an OS function directly. It might use other classes, which call functions inherited from C, etc., but eventually the call chain will lead to an OS call. (Yes, the details are often too complicated for an intermediate programmer to follow. So, as a self-described beginner, your findings are not surprising.)
The library is cross-platform in the sense that on your end (the C++ programmer), the interface is the same regardless of platform. It is not, however, the same library on every platform. Each platform has its own library, exposing the same interface on the C++ side, but making use of different OS calls. (In fact, the same platform might have multiple standard libraries, as the library implementation is provided by your toolchain, not by the standards committee.)
And, if the standard library can do it, aren't I supposed to be able to mess with some files as well without calling the standard fstream header (which, in my experience, I can't do)?
Yes, you are allowed to. Apparently, you have not been able to yet, but with some practice and guidance you should be able to. Everything in the standard library can be recreated in your own code. The point of the standard library (and most libraries, for that matter) is to save you time, not to enable something that was otherwise unavailable. For example, you don't have to implement a file stream for every program you write; it's in the standard library so you can focus on more interesting aspects of your project.
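For instance, here is a minimal POSIX-only sketch (Linux/macOS) of "messing with a file" without <fstream>, using the kind of OS calls a typical standard-library implementation eventually reaches; on Windows the underlying calls would be CreateFile/WriteFile instead:

#include <fcntl.h>    // open
#include <unistd.h>   // write, close

int main()
{
    // "hello.txt" is an arbitrary example file name
    int fd = open("hello.txt", O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (fd == -1)
        return 1;                     // could not create the file
    const char msg[] = "written without fstream\n";
    write(fd, msg, sizeof msg - 1);   // raw byte write: no buffering, no formatting
    close(fd);
    return 0;
}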
A compiler is just a program which creates an executable file or a library. You can use the compiler's default libraries to save time, or write your own. The default libraries communicate with the OS for file operations or memory allocation, and provide simple standard classes that let the developer write a single piece of code which works on all target platforms supported by the compiler and the libraries. If you want to write your own, you have to write each function for every target OS.
The standard library is cross-platform in the sense that its interface does not change between platforms, but its implementation does. In practical terms: if you only use C++ and its standard library, you can write your code the same way for Linux / Windows / macOS / Android / whatever, and if you find a C++ compiler for one of those platforms that supports the language features you used, you will be able to compile your code for that platform without rewriting anything.
So while you can use std::vector or std::fstream or any other feature in the library independently of the platform you're writing for, and expect the function definitions, type names, etc. to look the same, you cannot expect the executable you compiled for a PC with Windows 10 to run on a phone with Android. You cannot even expect the same executable to run on the same PC under a different operating system; that is what I mean by "the implementation is different".
There are two main reasons for this difference:
Processors with different architectures (x86-64 and ARM, for example) use different instruction sets, and as such the C++ source would need to be compiled to completely different machine code to run properly
Computers with processors of the same architecture but with different operating systems have different ways of dynamically allocating memory, creating files, creating streams, writing to the console, creating and scheduling threads, etc., which is part of the system functionality that you use via the standard library
If you really wanted to, you could use HeapAlloc() instead of operator new(), or CreateThread() instead of the standard library's std::thread, but that would force you both to rewrite your program every time you wanted to compile it for something other than Windows and to recompile it with the target platform's compiler (and, by proxy, learn its API). The standard library saves you from that trouble by abstracting away those system calls.
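As a small illustration of that abstraction, this sketch compiles unchanged on Windows, Linux, macOS and so on, while the standard library maps std::thread onto CreateThread, pthread_create, or whatever the platform provides:

#include <iostream>
#include <thread>

int main()
{
    std::thread worker([] {
        std::cout << "running on whatever thread API the platform offers\n";
    });
    worker.join();   // wait for the worker before main returns
    return 0;
}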
As for the fstream in particular, here is what it uses internally on most PCs nowadays.
Basically, fstream, iostream and printf all work on top of a kernel function, write(). When your code calls printf (we'll use printf as the example), it eventually calls write() to let the kernel handle the I/O. After that, write() returns, printf returns, and your code continues.
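As a rough illustration (Linux-specific sketch), printf and a direct write() to file descriptor 1 (stdout) end up doing the same kernel-level work:

#include <cstdio>     // printf
#include <unistd.h>   // write (POSIX)

int main()
{
    std::printf("via printf\n");                // buffered, eventually calls write()
    const char msg[] = "via write() directly\n";
    write(STDOUT_FILENO, msg, sizeof msg - 1);  // goes straight to the kernel call
    return 0;
}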
So if you really want to know how printf works internally, you have to read the kernel's source code.
But you shouldn't do that for now.
As a beginner, do not try to go deeper before you have a basic understanding of the computer. A computer is a project, just like a building, so the right way to learn it is level by level. First learn how to use brick and cement to construct the building; that is what you should do for now. What you shouldn't do, when you're building for the first time and have just started using bricks, is get interested in how bricks are produced and start focusing on the brick; that is the wrong way to learn IT.
If you are learning C/C++, just learn it. Remember, learn it level by level. For now, knowing how to use printf is enough.
A long time ago I started working on a dynamic graph visualizer, editor and algorithm-testing platform (graphs with nodes and arcs, not the other kind).
For the algorithm-testing platform I need to let the user write a script, or call a script from a file, which will interact with the currently loaded graph. The visualizer would do things like light up nodes while they're being visited by the scripted algorithm, adding some artificial delay, in order to visualize the algorithm navigating and doing its work.
Secondly, scripts would also be used to add third-party features that I could either make available as pre-existing scripts in the program folder OR just integrate into the program in C++ once they're tested and working.
All my searches for an interpreter to embed in my program pointed me to Lua;
then I started hand-writing my own recursive-descent parser for my own scripting language with a C-like syntax (for which I planned to use a subset of the C++ grammar, so that any code written in my scripting language could be copy-pasted into C++ code).
It was an interesting, crazy idea which I don't regret at all: I have scopes, functions, loops, gotos, type-safe variables, and expressions.
But now that I'm approaching the addition of classes, class methods and inheritance (some default classes would be necessary to interface scripts with the program), I've realized it's going to take A LOT of time and effort; a bit too much for the personal project of an undergraduate student with exams to study for… but I still wish to complete this project.
The self-imposed requirement that the scripts be 100% compatible with C++ was anything but necessary; it would just have been a nice little extra, which I can do without.
Now the question is: is there an alternative to Lua with a C-like syntax that supports everything I've already done, plus classes and inheritance? (Being able to add custom "classes" that interface scripts with the program is mandatory.)
(I can't assume the user has a full C++ compiler installed, so I can't just compile their "script" at runtime as a DLL to load and call, although I wish I could.)
Just-in-time compilation of C++
Parsing C++ is hard. Heck, parsing C is hard. It's difficult to get it right, and there are a lot of edge cases. Thankfully, there are a few libraries out there which can take code and even compile it for you.
libclang
libclang provides a lot of facilities for parsing C++. It's a good, clean library, and it'll parse anything the Clang compiler itself will parse. This article here is a good starter.
libclang also provides a JIT compilation tool that allows you to write and compile C++ at runtime. See this blog post here for an overview of what it does and how to use it. It's very general, very powerful, and user-written code should be fast.
libgccjit
GCC also provides a library called libgccjit for just-in-time compilation during the runtime of a program. libgccjit is a C library, but there's also a C++ wrapper provided by the library maintainers. It can compile abstract syntax trees and link them at runtime, although it's still in alpha.
cppast
If you don't want to use libclang, there's also a library under development called cppast, which is a C++ parser that will give you an abstract syntax tree representation of your C++ code. Unfortunately, it won't parse function bodies.
Other tools
If anyone knows any other libraries for compiling or interpreting C++ at runtime, I encourage them to update this post, or comment them so I can update it!
Here is something that lets you embed a C-like scripting language in your application (and a bunch of other cool things):
http://chaiscript.com/
There is lots of documentation:
https://codedocs.xyz/ChaiScript/ChaiScript/
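For instance, a minimal embedding sketch based on ChaiScript's documented ChaiScript/fun/eval API (the highlight_node callback and the script text are made up for this illustration):

#include <chaiscript/chaiscript.hpp>
#include <iostream>

void highlight_node(int id)            // hypothetical host-application callback
{
    std::cout << "visiting node " << id << "\n";
}

int main()
{
    chaiscript::ChaiScript chai;                                  // the embedded interpreter
    chai.add(chaiscript::fun(&highlight_node), "highlight_node"); // expose a C++ function to scripts
    chai.eval(R"(
        for (var i = 0; i < 3; ++i) {
            highlight_node(i);   // C-like script calling back into C++
        }
    )");
    return 0;
}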
I'm trying to understand how a binding (port) to another language works in general, but to help clarify my question I will use the concrete example of a project called libsass (a C/C++ implementation of a Sass compiler).
There is another project, node-sass, which provides Node.js bindings to libsass.
I'm assuming this means node-sass is a JavaScript program which runs on Node.js, and Node.js acts as a proxy forwarding instructions to the libsass C++ system-level program.
My question is: how does the Node.js interpreter "talk" to the libsass C++ application? Is it using sockets?
Sub-question: if node-sass exposes an API in the Node environment by initialising objects, functions, etc. that are available to your own Node scripts, is this by definition the "binding"?
The C++ library part, given that it's really a library and not some server program, is not running by itself and is not listening on some socket. If a C++ lib is used in a C++ program, it's integrated into that program's process too, not running somewhere else.
Many languages have built-in possibilities to access native C-language APIs, including Node.js (C being the de-facto standard for language interoperability, e.g. because every somewhat important OS consists mainly of C too). As for C++ vs. C, it's not hard to write something in C++ and provide a C interface as well.
In such cases, a language binding is often nothing more than a wrapper around the complicated native-access part, making it easier to use from the target language.
To elaborate a bit further because of the comment:
The OS itself has functions (to be used in C programs) to load C libraries on the fly, get specific functions from them and call them, without the names of the lib and functions being known when the C program is compiled (e.g. you could make a C program which asks the user to enter a lib name which is then used...).
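A minimal sketch of that OS facility on Linux, using the POSIX dlopen/dlsym calls (Windows has LoadLibrary/GetProcAddress for the same purpose); the library name and the sass_compile symbol below are made-up placeholders, not the real libsass API:

#include <dlfcn.h>
#include <iostream>

int main()
{
    // "libexample.so" and "sass_compile" are placeholders chosen for illustration
    void *lib = dlopen("libexample.so", RTLD_LAZY);   // load the library at runtime
    if (!lib) {
        std::cerr << dlerror() << "\n";
        return 1;
    }
    // Look up a function by name; the caller must know its real signature.
    using compile_fn = int (*)(const char *);
    auto compile = reinterpret_cast<compile_fn>(dlsym(lib, "sass_compile"));
    if (compile)
        compile("a { color: red; }");                 // call into the C library
    dlclose(lib);
    return 0;
}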
Independent of that, every language is either made in such a way that programs are compiled to "real" programs containing CPU instructions etc., which can be executed directly (example: C), or the programs of the language are in some other format and a "real" program is needed at every start to help the OS/CPU understand what should be done (example: JavaScript, Java... you can't run such a program on its own without helper software installed, like a browser or the JRE).
For this second type, the helper software can make use of the lib-loading functions of the OS if the JS/Java program contains instructions to do so... (and for the first, "real" type, a certain level of compatibility with C libs is automatically given because they use the same binary format; yes, that's simplified).
I'm very new to Haxe, and specifically want to use it to produce C++ code (the actual flow would be AS3->Haxe, then Haxe->C++). My understanding is that Haxe compiles directly to a (C++) executable. But does it explicitly output the generated source?
Can/does Haxe supply the C++ code that it produces in this process? I could then take this source and use it with another C++ cross-compiler such as Marmalade (with modifications, of course).
I'm also wondering how involved the conversion is. If Haxe does produce/supply the C++ source, what does this source look like? Is, e.g., memory management all packaged up into native DLLs/SOs? In that case, it seems like Haxe wouldn't be an ideal option.
(Disclaimer: I'm just trying to get some preliminary information before I go down this road. In fact, more specifically, I want to port from AS3 to C++ for Marmalade. So I want to know if it is worth writing my own converter or if Haxe provides a viable alternative.)
If you're looking to go from AS3->C++ through Haxe, then you should check out NME. It allows you to use the Flash Player API to write applications that compile to native code (through the C++ backend), SWFs and HTML5 applications.
It also offers a whole workflow for assets and such, and it has pretty good integration with FlashDevelop (Windows only) and MonoDevelop, but you can of course use any IDE.
Yes, Haxe outputs the source for you. I haven't ever looked into it very deeply, but it's there. When you compile for a C++ target (e.g. Windows), the source can be found under bin\cpp\windows\obj.
Is it possible to compile a C++ program into some intermediate form (similar to bytecode in Java) where the output is platform-independent, and then later compile/link it at runtime to run as native (platform-dependent) code?
If the answer is no, why not?
It is indeed possible, see for example LLVM.
Of course. Keep in mind that the C++ standard only specifies behavior: What should happen when this program executes. It doesn't specify how it should be implemented.
C++ code can be compiled to an intermediate format and JIT'ed to machine code, or it can be interpreted or anything else you like.
This is trivial, and most compilers already do that. GCC compiles to RTL (register transfer language), which is then translated to code for the target CPU.
Similarly, Managed C++ and C++/CLI are compiled to .NET bytecode (CIL).
Finally, you can consider the Church-Turing thesis, which is a statement of the equivalence of programming languages, so C++ can be compiled/translated to your favorite platform-independent language (say, Perl, Lisp, C--, etc.).
C++ source code (with some restrictions) is a platform-independent bytecode.
Why is it not?
Indeed, "bytecode" compilation procedure is then mere copying. The virtual machine that runs the "bytecode" is C++ compiler and a wrapper script. Yeah, it does some stuff that resembles compilation to machine code--but that's an implementation detail.
Here's a Linux implementation of such a "C++ virtual machine":
#!/bin/sh
# compile the C++ "bytecode", then run it with the remaining arguments
tmp=`mktemp`
src=$1; shift
g++ "$src" -o "$tmp" && "$tmp" "$@"
Does it answer the question? I think it does, to the extent that the question is specific, because it clearly demonstrates the theoretical possibility of compiling C++ into bytecode. Practical implementations also exist, for example LLVM.
Yes, it is technically feasible. A bit of a plug for a former employer, but here's an implementation of exactly that: http://antixlabs.com/products/antixgamedevelopmentkit/. The packaging process is, roughly speaking, C/C++ -> (compiler) -> LLVM -> (backend) -> bespoke bytecode -> zip file. This is platform-independent. Once it's on the user's device, the "player" converts bespoke bytecode -> (translator for that device) -> native ELF file -> (loader/linker) -> fixed-up code.
If the real question is, "does there exist any such industry-standard intermediate format which is widely supported on multiple platforms and suitable for all-purpose use, like Java bytecode?" then the answer is "no".
As for why, I'd say it's because there is no one organisation which has enough influence over C++ programmers, and no true necessity for Java-style deployment of C++ applications. Sun invented Java and a GUI library in one go, presented it to programmers, and didn't introduce the big proliferation of profiles until later.
C++ doesn't even have a standard GUI, and C++ environments are far more fragmented than Java. How do you tell a Windows app developer, a mobile phone developer, a smartcard implementer and a stock exchange backend implementer that they need to ditch their existing toolchain in favour of a platform-independent deployment mechanism for C++? They don't. And that's even before you get to the folks writing OSes and device drivers in C or C++ mixed with assembly. It's simply impossible to come up with a standard environment to support all of them.
The Parrot project will have C++ bytecode compilation and execution. Visual Studio can also compile C++ as bytecode (Managed C++ / C++/CLI).