Compiling OpenCV 3.0.0 with mingw32-builds - c++

I've built OpenCV 2.4.8 in the past with the regularly distributed mingw32, but since I'm migrating to use newer C++11 standard functions and OpenCV3 formats and functionalities, I had to change my compiler to mingw32-builds.
It certainly supports the C++11 functions I want to use, but I can't compile OpenCV3 properly.
Some other questions suggested two approaches:
Disabling WITH_IPP during CMake
Commenting out add_extra_compiler_option(-Werror=non-virtual-dtor) in the CMake configuration file
Although these modifications let it compile, IPP is a performance-enhancing library, and that warning (a non-virtual destructor in a class meant to be used polymorphically) flags a real problem that may lead to future bugs.
Is there any way to compile OpenCV 3.0.0 without these quirks?
Additional information:
I've already compiled the exact same build on Ubuntu and everything (from C++11 to OpenCV 3) went fine: no errors, no high-level warnings, no quirks
I am now working under Windows 8.1 64-bit, but focused on compiling 32-bit binaries (for compatibility)

Related

Cannot run HElib properly on Windows 10 x64, even though it builds

I'd really appreciate some help getting HElib to work on Windows 10 x64 using the MSVC 2017 compiler. I successfully managed to compile its dependency, NTL, using the same compiler by following this tutorial and also ran its tests, so it seems to work well.
However, in the case of HElib I generated Visual Studio projects using CMake and then compiled it successfully (see NOTE below), but running it fails. For example, I ran the Test_binaryCompare.cpp_exe test (it has its own vcproj generated by CMake), but it fails because it reaches a part of the code it shouldn't: it attempts an operation called bootstrapping, which is disabled for that test. On Linux, however, it works.
LINUX: The reason I'd really like to run this on Windows is that I find it a lot easier to debug using Visual Studio. I'm also more used to Windows overall.
NOTE: Compiling HElib successfully required some modifications, like fixing broken tr1 includes (e.g. it was trying to include <tr1/memory> instead of just <memory>, although the latter was actually available), suppressing the 4146 error (I also had to do this for NTL), and fixing two instances of variable-length arrays, which Microsoft's compiler sadly does not support.
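For illustration, the variable-length-array change amounts to something like the following sketch (the function and names are made up, not HElib's actual code):

    #include <vector>

    // GCC/Clang accept C99-style VLAs in C++ as an extension, but MSVC rejects
    // them, so a stack array sized at run time has to become a heap-backed container.
    void process(long n) {
        // long buf[n];            // compiles with GCC/Clang, rejected by MSVC
        std::vector<long> buf(n);  // portable replacement with the same indexing
        // ... use buf[i] exactly as before ...
    }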
Without any error messages I can't really help you figure out your exact issues, but I ported HElib to Windows some time ago: https://github.com/AlexanderViand/HElib/tree/Windows
It's a bit out of date, but if the tests work in that version you can compare the changes I made against your changes.
It's also possible that you're simply running into this issue: https://github.com/shaih/HElib/issues/228
If your Linux version of NTL is slightly older, it might just be that your Windows version of NTL is the buggy one.
Finally, I'd very much recommend against running HElib on Windows because without GMP it seems painfully slow.
Instead, I ended up setting up a Docker container and SSH'd into it from Visual Studio: https://hub.docker.com/r/alexanderviand/visual-studio-linux-build-box-with-helib/
Currently I'm using WSL and CLion (which supports WSL quite well) when I'm working with HElib on Windows.

How to set up Apache Thrift with Eclipse and MinGW on Windows?

I need to configure Thrift for an Eclipse project with the MinGW compiler. I googled for it but couldn't find proper instructions. Can somebody suggest a suitable way or a proper link to do that?
There are several partial answers to this.
The Windows build is made by means of a MinGW cross-compiler on a Linux machine.
Since (at least) 0.9.2 it is no longer necessary to use MinGW to build the Thrift compiler on a Windows machine.1) Aside from the fact that you don't need to do that at all, because a precompiled EXE is available on the download pages, there is a nice Visual Studio project for building the Windows Thrift compiler EXE. The project has only two dependencies, Bison and Lex/Yacc, which are both available elsewhere as precompiled setups as well.
In either case MinGW is only used to build the compiler. If you want to build the libraries with MinGW, I'm not sure if that even works. This way of doing things is not implemented or supported, simply because nobody needs it.
Which brings us back to the question, why you think you "need" it this way.
1) To my knowledge, numerous severe problems exist with the autotools and all the other stuff needed to build Thrift under MinGW on a Windows machine. You will have to patch things, build some of them from source, spend a lot of time, and do some strange things with your file system to make it work. At least that was the case when I stopped using MinGW to build Thrift about two years ago. And even if you get it to work, you still only get the compiler (which you could easily download in a fraction of that time), not the libraries.

GCC runtime libraries vs Microsoft Visual C++ runtime redistributables

Could anyone shed some light on C++ library versioning and distribution of
GCC libraries (libgcc, libstdc++, ...)
Microsoft Visual C++ runtime libraries (6.0, 2005, 2008, 2010, 2012, 2013, 2015,....)
With my limited exposure to GCC programming, I have never seen C++ runtime libraries being distributed along with a program. That is often the case with MS Windows programs.
Can a relatively old Linux system run a newer C++14 program (one compiled on a newer system and then copied over to the old system)?
Do GCC programmers distribute runtime libraries along with their programs? If not, why do Windows programs distribute them? How do GCC-based distributions ensure that a C++ program always works when installed?
What about frameworks such as Qt? How does Qt handle versioning and distribution on Linux and Windows? Does Qt also distribute runtimes for different versions?
Maybe it has to do with the platform: how Linux is designed vs. how Windows is designed.
What is so fundamentally different in the approaches GCC and MS Windows take?
GCC's runtime libraries, like GNU's C library, provide a stable binary interface (small footnote: GCC 5.1 kind of blew this up, due to new C++ features that had to be implemented).
Microsoft's libraries don't, and each little version difference may and probably will break the ABI (application binary interface).
Additionally, Microsoft compilers change their ABI with version increments as well, making it a bad idea to combine code built by different versions of their tools. Also here, GCC maintains a strict ABI, which makes object code perfectly compatible (if no ABI breaking codegen options are given of course).
This ABI consists of object size and layout, which a compiler uses when generating code. Therefore, running code built against one version of a library while loading a different version at runtime may produce unexpected results, because the memory layout and usage are simply different.
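As a purely hypothetical sketch (the type and fields are made up), a layout change like the following is what an ABI break looks like in practice:

    // widget.h as shipped in library version 1:
    struct Widget {          // sizeof(Widget) == 8 on a typical 32-bit build
        int id;
        int flags;
    };

    // widget.h as shipped in library version 2 (renamed here only so both
    // definitions can sit in one file; in reality both are called Widget):
    struct WidgetV2 {        // sizeof == 12: the new member shifts 'flags'
        int id;
        int refcount;
        int flags;
    };

    // A program compiled against version 1 but loaded against version 2 at run
    // time allocates too little memory for the type and reads 'flags' at the
    // wrong offset -- exactly the "unexpected results" described above.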
GNU/Linux is quite strong in this regard, and is generally able to maintain strong backwards compatibility.
As long as a program was compiled against an older version of the library, it will run fine when loaded with a newer version that a user has installed.
The same story goes for Qt, which only breaks ABI between major version numbers (Qt 4 and Qt 5 cannot be loaded interchangeably at runtime).
There are some small exceptions, GCC 5's libstdc++ being a big problem there. I know of no big glibc ABI breakages though.
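For completeness, the GCC 5 libstdc++ case can usually be worked around with the real _GLIBCXX_USE_CXX11_ABI macro; the snippet below is just an illustrative sketch:

    // Normally set on the compiler command line as -D_GLIBCXX_USE_CXX11_ABI=0;
    // defining it before any standard header has the same effect for this file.
    #define _GLIBCXX_USE_CXX11_ABI 0
    #include <string>

    // This function now uses the old (pre-GCC-5) reference-counted std::string
    // layout, so it can link against binaries built with the old ABI.
    std::string greeting() { return "hello"; }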
The new Microsoft Universal CRT attempts to solve this issue by providing a stable C runtime interface and, as the story would have us believe, glibc-style ABI stability. The UCRT is available for Windows Vista and up, but applications need to be compiled specifically against it. The first version of Visual Studio with this capability is VS2015.

How to install a simple Intel C/C++ compiler on a 64-bit Ubuntu system?

I need to compile C/C++ code by running a build.sh file.
The instructions for the program (that I want to run) say it needs to be compiled with Intel's compiler1.
After searching on the net I came across information on what to do.
Some people said that we must first install 32-bit libraries:
https://help.ubuntu.com/community/InstallingCompilers
Others said that, before any installation, we must change some things:
http://software.intel.com/en-us/articles/using-intel-compilers-for-linux-with-ubuntu
On the other hand, Intel's page shows many suites:
http://software.intel.com/en-us/c-compilers
while the only thing that I want is simply Intel's C/C++ compiler.
Can somebody be so kind as to give me directions on how to install Intel's compiler on a 64-bit Ubuntu system?
Footnote 1 / Editor's note: other x86 compilers, including GCC and Clang (and MSVC on Windows), support Intel's SSE/AVX intrinsic functions, but Intel's compiler comes with some libraries, such as SVML (e.g. SIMD sin and exp) and MKL, which some code might need. Other compilers can be used with SVML if you have it installed separately.
In short, it's worth trying other compilers, especially if you understand why something says it needs to be compiled by ICC and getting ICC would be inconvenient. But you might (or might not) be missing out on performance for packages that detect what's available instead of just erroring out.
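As a small sketch of the point above: the SSE/AVX intrinsics themselves live in <immintrin.h> and compile with GCC, Clang, MSVC, and ICC alike; only Intel-specific library calls (SVML, MKL) actually require Intel's toolchain or libraries.

    #include <immintrin.h>

    // Adds four packed floats; this builds with any mainstream x86 compiler.
    void add4(const float* a, const float* b, float* out) {
        __m128 va = _mm_loadu_ps(a);             // unaligned load of 4 floats
        __m128 vb = _mm_loadu_ps(b);
        _mm_storeu_ps(out, _mm_add_ps(va, vb));  // out[i] = a[i] + b[i]
    }

    // By contrast, something like _mm_sin_ps() is an SVML function, so it only
    // links if Intel's SVML library is available.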
For non-commercial use you can download it from Intel.
EDIT:
Intel® System Studio 2016 includes a C++ compiler.
I've tried the 32-bit version of it, the non-commercial one; I don't think it differs from the 64-bit version in the basic installation steps. Open the link, go to the compilers and libraries section, and you will see the C/C++ compiler. After downloading it, read the files in the doc folder; they explain how to install and use the compiler, how to get a key for it, etc.
You need to install the gcc compiler through apt-get install gcc.
Look at the example here: Install GCC

GCC worth using on Windows to replace MSVC?

I currently develop in C++ on Windows, using Visual Studio 2010. After the official announcement of C++11, I have begun to use some of its features that are already available in MSVC. But, as expected, the great majority of the new changes are not supported.
I thought maybe the upcoming version of Visual Studio would add these new features. However, after reading this it looks like very little is going to change.
And so, I'm curious about the feasibility of using GCC on Windows rather than MSVC, as it appears to support the great majority of C++11 already. As far as I can tell, this would mean using MinGW (I haven't seen any other native Windows versions of GCC). But I have questions about whether this would be worth trying:
Can it be used as a drop-in replacement for cl.exe, or would it involve a lot of hacks and compatibility issues to get Visual Studio to use a different compiler?
The main selling point for Visual Studio, in my opinion, is its debugger. Is that still usable if you use a different compiler?
Since GCC comes from the *nix world, and isn't native to Windows, are there code quality issues with creating native Windows applications, versus using the native MSVC compiler? (If it matters: most of my projects are games.)
In other words, will the quality of my compiled exe's suffer from using a non-Windows-native compiler?
MSVC has the huge advantage of coming with an IDE that has no equal under Windows, including debugger support.
Probably the best alternative for MinGW would be Code::Blocks, but there are worlds between them, especially regarding code completion and the debugger.
Also, MSVC lets you use some proprietary Microsoft stuff (MFC, ATL, and possibly others) that MinGW has no support for, and makes using GDI+ and DirectX easier and more straightforward (though it is possible to do both with MinGW).
Cygwin, as mentioned in another post, will have extra dependencies and possible license issues (the dependency is GPL, so your programs must be, too). MinGW does not have any such dependency or issue.
MinGW also compiles significantly slower than MSVC (though precompiled headers help a little).
Despite all that, GCC/MinGW is an entirely reliable, quality compiler, which in my opinion outperforms any version of MSVC available to date in terms of the quality of generated code.
This is somewhat less pronounced with the most recent versions of MSVC, but still visible. Especially for anything related to SSE, intrinsics, and inline assembly, GCC has been totally annihilating MSVC ever since (though they're slowly catching up).
Standards compliance is a lot better in GCC too, which can be a double-edged sword (because it can mean that some of your code won't compile on the more conforming compiler!), as is C++11 support.
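A classic example of that double-edged sword (a sketch, not from any particular project): older MSVC versions accept binding a temporary to a non-const reference as a language extension, which conforming GCC rejects.

    #include <string>

    void log_name(std::string& name) { (void)name; }  // takes a non-const reference

    void demo() {
        log_name(std::string("temp"));   // MSVC historically accepts binding this
                                         // temporary to a non-const reference as a
                                         // language extension; GCC reports an error
                                         // and forces you to fix the call site.
    }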
MinGW optionally also supports DW2 exceptions, which are totally incompatible with the "normal" flavour and take more space in the executable, but on the positive side are "practically zero cost" at runtime.
I want to add some information because the field may have changed since the question was asked.
The main problem with switching away from MSVC was the lack of a good IDE that flawlessly integrates with MinGW. Visual Studio is a very powerful tool and was the only player on Windows for quite some time. However, JetBrains released a preview version of their new C++ IDE, CLion, some days ago.
The main benefit comes when working on cross-platform applications. In this case, a GCC-based toolchain can make life much easier. Moreover, CLion integrates closely with CMake, which is also a big plus compared to Visual Studio. Therefore, in my opinion, it is worth considering a switch to MinGW now.
GCC's C++11 support is quite phenomenal (and quite up to par with standards conformance, now that <regex> has been implemented).
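For instance (just an illustrative snippet), code like this now works out of the box, whereas earlier GCC 4.x releases shipped a non-functional <regex>:

    #include <regex>
    #include <string>

    // Matches version strings like "3.0.0"; works with GCC 4.9+, where
    // libstdc++'s <regex> is actually implemented.
    bool looks_like_version(const std::string& s) {
        static const std::regex pattern(R"(\d+\.\d+\.\d+)");
        return std::regex_search(s, pattern);
    }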
If you replace your compiler, you'll need to make sure every dependency can be built with that new compiler. They're not made to be substitutable plugins (although Clang is working on becoming that way).
GCC is a fine compiler, and can produce code that has pretty much the same performance, if not better, than MSVC. It is missing some low-level Windows-specific features though.
Apart from this, to answer your questions:
To get VS to use GCC as a compiler, you'd pretty much need to turn to makefiles or custom build steps all the way. You'd be much better off compiling from the commandline and using CMake or something similar.
You cannot use the VS debugger for GCC code. GCC outputs GDB compatible debug information, and the VS debug format is proprietary, so nothing will change in that area anytime soon.
Code quality is just as good as you'd want it. See above.
No, the quality of your code will actually increase, as GCC will point out several assumed standard extensions MSVC would hide from you. All self-respecting open source projects can be compiled with GCC.
In my humble opinion, it depends on how someone started to code in the first place. I've been using g++ and gcc for more than 20 years now, but the reason I keep using gcc is mainly licensing. I also like not having a bunch of runtime dependencies or DLLs to bundle with my stuff; since I came from the DOS era, I still like my stuff small and fast. gcc for Windows comes with the standard Win32 libraries and common controls, but I had to develop my own Win32 controls for things that would otherwise require MFC to work properly or just to look nicer.
Although gcc has strong support on the internet, when it comes to Win32 work many people rely on MFC and proprietary VC stuff, so again one may have to work around issues on one's own and be creative when difficulties arise.
I think it's all about needs and circumstances. If you are just a hobbyist coder with time for research and for creating your own libs, and you want a solid, free compiler that has been around since the late '80s, gcc sounds perfect for the job.
But in industry, Visual Studio is a must if you want to be competitive and stay in the race. Many hardware manufacturers would prefer bundling Visual Studio-compatible libraries for their hardware over some open-source GNU stuff.
That's my two cents.
To be honest, C++ should be handled with MS Visual Studio. If you want to make cross-platform or Unix apps, use GCC. GCC works with practically any IDE other than Visual Studio: even Visual Studio Code can use GCC, as can Code::Blocks, Eclipse IDE for C/C++ Developers, CLion, Notepad++, and even the good ol' tool we've always known, Notepad. Also, on a PC with low disk space, installing Visual Studio's "Desktop Development with C++" workload takes something like 5 GB if it is to be useful. And this is where GCC hits MSVC hard: it has native C support. MSVC can compile C, but only with a lot of fine-tuning; it takes a lot of time and effort before you can finally compile. The final verdict:
If MSVC works, it hella works! If MSVC doesn't work, it HELLA DON'T WORK.
If GCC installs, it works, and if it doesn't work, it's the IDE's problem.
GCC is for people who don't mind spending 4 hours at the computer making it work properly. MSVC is for those who don't care about C and want it to install without any pokin' around.
It can't be used as a direct drop-in replacement for the Microsoft compilers; for a start, it has a vastly different set of command-line arguments and compiler-specific options.
You can make use of MinGW or Cygwin to write software, but they introduce extra dependencies (especially in the case of Cygwin).
One not-often-touted advantage of gcc over cl is that gcc can be used with ccache to drastically speed up rebuilds, or with distcc to distribute the build across several other machines.
Consider the Intel compiler (or "Composer", as they seem to have taken to calling it) as another option. I'm not too sure where its C++11 support is at compared with MS (certainly it has lambdas), but it does integrate very nicely with Visual Studio (e.g. different projects within a solution can use the Intel or MS compilers), and there have also been some efforts made to match the MS compiler command-line options.
GCC and MSVC use different name-mangling conventions for C++, so C++ DLLs compiled by one compiler cannot be used in applications compiled with the other. I believe this is the main reason we don't see more widespread use of gcc on Windows.
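A common workaround, sketched below with made-up names, is to export a plain C interface from the DLL so the mangling scheme (and each compiler's C++ runtime) stops mattering at the boundary.

    // widget_api.h -- hypothetical DLL interface usable from both MSVC- and
    // MinGW-built applications, because only unmangled C symbols and C types
    // cross the DLL boundary (no std::string, no exceptions, no classes).
    #ifdef BUILDING_WIDGET_DLL
    #  define WIDGET_API extern "C" __declspec(dllexport)
    #else
    #  define WIDGET_API extern "C" __declspec(dllimport)
    #endif

    WIDGET_API void* widget_create(int id);
    WIDGET_API int   widget_get_flags(void* w);
    WIDGET_API void  widget_destroy(void* w);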