Do Policy Intersection in WSO2 ESB Class Mediator

I've created a Class Mediator in which I want to intersect two policies. I built the mediator with Carbon Studio for Eclipse, which automatically adds some predefined libs to my project's build path.
One of the libs is neethi-2.0.4.wso2v1.jar.
When I try to use the intersect method, I get an exception. Looking at the source, I can see that the intersect method simply throws an UnsupportedOperationException.
So the given Neethi lib is useless for intersection, and I therefore want to use the newest Neethi lib (neethi-3.0.2) for intersection inside my class mediator.
Could you tell me how I can include third-party libs, in particular ones that should override the versions used inside the ESB (neethi-2.0.4.wso2v1.jar)?
Thanks

Developer Studio adds these libs to your project classpath so that developers do not get build errors in the source code.
Those libraries are therefore only for development time and are available only with Dev Studio. They are not the libraries used in the runtime of the server.
Runtime libraries are provided by the WSO2 Carbon server runtime, so the UnsupportedOperationException is thrown because the Neethi version in the server does not support intersection. You need to upgrade the library in the server runtime.
But note that WSO2 has forked the Neethi codebase and has some custom implementations in the forked source, so IMO simply upgrading the version may not be enough. Still, you need to actually upgrade the library and see whether you face any issues with it.
To upgrade the version, you can either create a Java Library artifact for the newer Neethi library and include it in the CAR file you deploy, or copy the new Neethi library to the <CARBON_HOME>/repository/components/lib location while the server is running.
You can find the current Neethi library in the <CARBON_HOME>/repository/components/plugins location. If you hit an error or exception similar to "Linkage Error", try removing the older version of the library, but that may in turn cause other exceptions.
So IMO the bottom line is that you will face some issues with the version upgrade. But of course you can give it a try and see whether they can be overcome. Give it a try and post your observations here; we will do our best to assist you.
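Once a Neethi version that actually implements intersection is available in the server runtime, the mediator itself can stay small. The following is a minimal, untested sketch using the standard Synapse AbstractMediator pattern; the policy file paths and the "intersectedPolicy" property name are made-up placeholders, and it assumes Policy.intersect(Policy) is implemented as in Neethi 3.x:

    import java.io.FileInputStream;
    import java.io.InputStream;

    import org.apache.neethi.Policy;
    import org.apache.neethi.PolicyEngine;
    import org.apache.synapse.MessageContext;
    import org.apache.synapse.mediators.AbstractMediator;

    public class PolicyIntersectMediator extends AbstractMediator {

        // Hypothetical locations; load your policies from wherever they
        // actually live (registry, classpath, file system, ...).
        private String firstPolicyPath;
        private String secondPolicyPath;

        public boolean mediate(MessageContext context) {
            try (InputStream in1 = new FileInputStream(firstPolicyPath);
                 InputStream in2 = new FileInputStream(secondPolicyPath)) {
                Policy first = PolicyEngine.getPolicy(in1);
                Policy second = PolicyEngine.getPolicy(in2);
                // Implemented in Neethi 3.x; neethi-2.0.4.wso2v1 throws
                // UnsupportedOperationException here.
                Policy intersection = first.intersect(second);
                context.setProperty("intersectedPolicy", intersection);
            } catch (Exception e) {
                handleException("Policy intersection failed", e, context);
            }
            return true;
        }

        // Setters so the paths can be supplied from the mediator's XML config.
        public void setFirstPolicyPath(String path) { this.firstPolicyPath = path; }
        public void setSecondPolicyPath(String path) { this.secondPolicyPath = path; }
    }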

Related

Build Tools vs Package Manager

I am very confused between these two terms, build tools and package managers. According to my current knowledge, package managers are used to install the dependencies required for the code to execute, while build tools are used to package the code plus its dependencies into a single file, i.e. to build the code. Building our application makes it production-ready.
Am I right?
Short answer
Build systems/tools manage your compilation requirements.
Package managers/tools manage your library requirements.
A build tool may have integrated package management.
For example, in both C++ and Java you can directly call the compiler and provide all the include, source and library paths manually, or you can use a build system (make/cmake... for C++, maven/gradle/ant... for Java).
When you link external libraries with your build system, it will do its best to find them in its search path, and will link with the first version that meets its requirements or tell you that it couldn't find them. Adding libraries manually is fairly easy, but sometimes each library you add will require yet another library with it.
A package manager makes sure that your libraries are downloaded, are the right version, and that all the libraries they depend on are downloaded too. Some examples are Maven and Gradle, which have integrated package management for Java, and Conan, which is a fairly popular option to combine with CMake.
So ideally you would use both, but setting them up can be more work than you save by not doing things manually. It depends on your programming language, on whether you need multiple versions of something, and on your OS.

Keycloak conflict (core and adapter)

When trying to run my web application in both an embedded Jetty (locally) and a 'normal' Jetty instance (remotely), I think I'm encountering a class collision.
java.lang.ClassCastException:
org.keycloak.adapters.RefreshableKeycloakSecurityContext cannot be cast to org.keycloak.KeycloakSecurityContext
The class KeycloakSecurityContext is defined in both the keycloak-jetty-adapter dependency and the keycloak-core dependency in my project (same version, 3.4.3.Final).
I've tried using scope 'provided' on the adapter and excluding the keycloak-core dependency from the adapter, but to no effect.
Any suggestions how to handle this combination?
I had similar issues with Keycloak. You need to make sure that libraries are not duplicated:
1. Download the adapter jars from Keycloak and place them under the container's lib directory (e.g. tomcat/lib on Tomcat, or the equivalent location on WildFly).
2. In pom.xml, mark those dependencies with the provided scope. Per the Maven docs, provided is for "dependencies that are required for compiling the project code, but that will be provided at runtime by a container running the code".
3. (For IntelliJ) To make sure that your project can import Keycloak-specific classes, add them to your project: Ctrl+Shift+Alt+S -> under Libraries, add the same jars you placed in step 1.
In my case, I had to add the provided scope on the keycloak-core Maven dependency; that corrected the error.
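For context, a sketch of where this ClassCastException typically shows up. The adapter stores the security context as a request attribute, and application code casts it back; if keycloak-core is packaged inside WEB-INF/lib while the container also loads its own adapter jars, the two copies of KeycloakSecurityContext come from different classloaders and the cast fails even though the names match. The helper class below is illustrative only, but the attribute lookup itself is the standard Keycloak adapter pattern:

    import javax.servlet.http.HttpServletRequest;
    import org.keycloak.KeycloakSecurityContext;

    public class SecurityContextLookup {

        // The Jetty/Tomcat adapters store the security context in a request
        // attribute keyed by the class name.
        public static KeycloakSecurityContext getContext(HttpServletRequest request) {
            // This cast is what fails when KeycloakSecurityContext is loaded
            // twice: once from keycloak-core in WEB-INF/lib and once from the
            // container's adapter jars. Marking keycloak-core as 'provided'
            // keeps it out of the war, so only the container's copy is loaded.
            return (KeycloakSecurityContext) request.getAttribute(
                    KeycloakSecurityContext.class.getName());
        }
    }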

Do Applications written with MFC classes require external Frameworks to be installed

If I write an application using the MFC libraries in C++, do I need to install some sort of framework or similar at deployment time?
My intent is to have a standalone exe without complicated installation scripts.
If you're developing a local application for your own organisation, you could go for static linking, as suggested by Danny.
But static linking is not the method recommended by Microsoft: every time there's an MFC-related patch (example here) or a patch for another library, you'll have to recompile your code and redistribute or reinstall it in order to avoid PCs being exposed to security vulnerabilities.
This is why Microsoft recommends using dynamic libraries: these are easier to update and replace (often the latest versions are already installed, or they arrive via automatic Windows Update, or if necessary via manual download of the latest version).
If you go for the dynamic approach:
there are a couple of mfc*.dll files to distribute with your application, together with other standard libraries such as Msvcr100.dll. It's all explained in the article. Installing such files in your app's directory has the advantage of a leaner installation process, but you take responsibility for updating them when necessary.
or you choose to use Microsoft's redistributable packages. These can be downloaded directly from Microsoft and come as a self-installing file, vc_redist*.exe. Here are some explanations on how to use them in the installation process. It might install more DLLs than required, but vc_redist is an installed Microsoft product that is kept up to date by Windows Update.
If you link MFC statically, there is no need for external files.
Project Settings / General:
Use of MFC: Use MFC in a Static Library
But, as Christophe mentions, it is not recommended by Microsoft.

How should/could/must I handle the dll that my C++ projects depend on?

I'm lost here and I have no clue how to proceed. This is not a question about how to make my program work; it is a question about how to stop wasting my time.
My programming environment is Visual Studio 2013 on windows, in C++.
I use 3 libraries extensively, namely: boost (using dynamic linking), OpenCV, and Qt.
During development, I have configured VS to look at those 3 libraries by default for includes and .lib files. I have also added the 3 folders containing all the DLLs to my PATH environment variable.
It works, but it is sometimes painful; let me explain when.
First hassle: any time I get a LNK error telling me I'm missing a function, it is usually with OpenCV, since it has only one include file referencing all the functions. I have to look at OpenCV's source code to see which module the function belongs to, so I know what I must link my program against.
Second hassle: when the time comes to deploy my application, I have to ship it with all the relevant DLLs. To know which ones I need, I open Dependency Walker and try to forget nothing; I then have to test it on a different computer because 102% of the time I have missed a couple, and then I have to configure my installer generator to include all of those one by one.
Third hassle: to ease the process of configuring a new development machine a little, I have recently switched to NuGet. It is great; I can add boost to any project with a couple of clicks. But now my boost DLLs are everywhere: I have one folder per boost library, and since there are dozens of those I can't even add them all at once to my PATH, so I have to move them manually to the appropriate folder, and that is really not what I want to do with my not-so-precious-but-who-are-you-to-judge time.
I have looked around and couldn't find any good practice regarding this issue, maybe because they are too obvious, or too specific to a particular setup.
What do you do? What would you do if you were me?
We put all our external dependencies in version control along with the code. This ensures that all code can build "out of the box" on any of our development machines, and also ensures that for any given version of the code, we know exactly which dependencies it has.
The best way to check for missing dependencies is to have a good automated test suite: if you've got comprehensive coverage, then if your tests pass you must have deployed the required libraries.
In terms of linking to the appropriate libraries, unfortunately, that just sounds like an issue with the structure of OpenCV (I'm not familiar with OpenCV). I tend to use dumpbin under Windows and nm under Linux to easily grep for symbols when I get link errors with an unfamiliar library.

How to build using ant in IBM RAD

All,
we have multiple applications that we develop in IBM RAD 7.5.
Since RAD builds all the applications that are required, I was wondering how I can achieve the same using Ant files.
What I wanted to know is: apart from my application-specific libraries (which I know where to find and so on), which other jar files should my application point to?
I am talking about the IBM WAS runtime libraries etc., so that my application builds successfully.
Is there a standard guideline by IBM on this?
Thanks.
The JARs provided in WAS_HOME/dev/ in WebSphere Application Server 7.0 and later, such as was_public.jar or j2ee.jar, are intended specifically for this purpose.
@bkail's answer is probably what you want if you're on one of those more recent versions of WAS (I don't have a v7 or v8 installation to verify).
Another option is to expand the Server Runtime library that you have added to your RAD Java Build Path; there you'll see the jars you want to include in your Ant build.
However, if you're actually running within RAD, you'll be pointing to a full server runtime which contains more than you need to simply compile. In that case, you can add one of the "WebSphere Application Server vX stub" runtimes just to see which jars those include; they are the minimum for compiling.
Which specific jars are in that environment will depend on your version of WAS, any installed Feature Packs, and probably even Fixpack levels. In many cases - depending on which APIs you're using - all that's needed to compile is j2ee.jar.
(You probably already realize this, but remember that you only need those jars in your classpath to compile, you don't want to deploy them in your WAR since they're already part of WAS.)
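To illustrate that compile-time point, here is a small, hypothetical sketch that invokes the JDK compiler with only the WAS stub jar on the classpath; the WAS_HOME and source file paths are placeholders, and an Ant javac task with the same classpath achieves the same thing:

    import javax.tools.JavaCompiler;
    import javax.tools.ToolProvider;

    public class CompileAgainstWasStubs {
        public static void main(String[] args) {
            // Requires a JDK (not a JRE) so the system compiler is available.
            JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
            // Hypothetical paths: point at WAS_HOME/dev/was_public.jar (or
            // j2ee.jar) and at your own source file.
            int result = compiler.run(null, null, null,
                    "-classpath", "/opt/IBM/WebSphere/AppServer/dev/was_public.jar",
                    "src/com/example/MyServlet.java");
            System.out.println(result == 0 ? "compiled" : "compile failed");
        }
    }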