On many websites they talk about Armadillo+something else. What do they mean?
I use the Armadillo library in the form of
#include <armadillo>
in a Linux environment.
In this website
http://nghiaho.com/?p=1726
Armadillo+OpenBLAS is mentioned. What does that mean? How do I use Armadillo+OpenBLAS?
UPDATE
It is now more than a year later. I just want to add that Armadillo is a wrapper over implementations such as BLAS or OpenBLAS; it is not itself a matrix-operation implementation.
Instead of linking Armadillo-based code with BLAS, you link with OpenBLAS. This can be done manually, or the Armadillo installer can figure out that OpenBLAS is present. See the FAQ for details.
Basically you need to install OpenBLAS first, then install Armadillo (not from a Linux repository, but the downloaded version).
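For illustration, here is a minimal sketch (not part of the original answer) of what an Armadillo program looks like once OpenBLAS has been picked up at install time; the g++ command in the comment is an assumption about a typical Linux setup.
// Minimal sketch: a plain Armadillo program. Assuming Armadillo was installed
// after OpenBLAS and detected it, linking with -larmadillo is enough, e.g.:
//   g++ example.cpp -o example -larmadillo
#include <armadillo>
#include <iostream>

int main()
{
    arma::mat A = arma::randu<arma::mat>(500, 500);
    arma::mat B = arma::randu<arma::mat>(500, 500);
    arma::mat C = A * B;  // the multiplication is forwarded to whatever
                          // BLAS/OpenBLAS Armadillo was configured with
    std::cout << C(0, 0) << std::endl;
    return 0;
}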
Armadillo can do its own math or it can call 3rd-party libraries to do the math. ATLAS, BLAS, OpenBLAS, uBLAS, LAPACK and MKL are examples of such 3rd-party libraries. If Armadillo does its own math, it will be single-threaded. Some of these 3rd-party libraries can be multi-threaded, e.g. OpenBLAS. Some libraries can use the GPU, e.g. nvBLAS from Nvidia. Note that nvBLAS is only a partial BLAS implementation, and you still need another BLAS library for what nvBLAS does not cover.
You can control Armadillo by editing armadillo_bits/config.hpp or by using the -D compiler option to set the relevant preprocessor directives for your needs.
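As a rough sketch of the -D route (assuming ARMA_DONT_USE_WRAPPER is still the directive that tells Armadillo to skip its runtime wrapper and call BLAS/LAPACK symbols directly, so you link those libraries yourself):
// Sketch only: with ARMA_DONT_USE_WRAPPER, Armadillo calls BLAS/LAPACK symbols
// directly, so you pick the libraries at link time, e.g.:
//   g++ example.cpp -o example -DARMA_DONT_USE_WRAPPER -lopenblas -llapack
#include <armadillo>

int main()
{
    arma::mat A = arma::randu<arma::mat>(100, 100);
    arma::vec b = arma::randu<arma::vec>(100);
    arma::vec x = arma::solve(A, b);  // solve() ends up in LAPACK/OpenBLAS
    return x.n_elem == 100 ? 0 : 1;
}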
Something that might save you time: the order in which you link Armadillo and the 3rd-party libraries is important. Armadillo calls, say, LAPACK, and LAPACK calls BLAS, so the order should be:
-larmadillo -llapack -lblas
otherwise you will get link errors.
Be careful with the OpenBLAS version, i.e. you should install version 0.2.14.
Otherwise you will have problems if you want to use multithreading.
So:
1 - Remove everything that you have already installed (Armadillo or OpenBLAS).
2 - Install OpenBLAS version 0.2.14.
3 - Install Armadillo (if you use the repository, you will probably not have access to the latest version).
4 - Enjoy it!
In addition, you should use the flag -lopenblas instead of -lblas. Also, you must add the paths to the include and lib folders of the OpenBLAS package (previously downloaded and built). In my experience, the order and number of installed packages doesn't matter; I experimented with different versions of the OpenBLAS package without reinstalling Armadillo.
Related
It looks like more and more standard library implementations are relying on TBB for their parallel algorithms. This is a bit surprising to me, as I didn't think that standard libraries would have external dependencies (outside of stuff like pthread), but that's a different question I imagine.
My issue is that I need to bake this into my CMakeLists.txt files now.
First bad news: There is no official CMake support for TBB, and TBB itself does not provide any FindTBB.cmake file. You can find one here and there on the web, but if standard libraries start relying on it, it would be nice to have it officially supported by CMake. Is this coming further down the line?
Then, I need to have some slightly convoluted code in my CMakeLists.txt file to find_package(TBB REQUIRED) and link the corresponding targets when required (depending on the standard library, version, etc.). It looks like Conan is already offering a package that hides all that stuff from the user. You just get parallelstl and that's it. Will we have something similar in CMake in the future?
We can already use these parallel algorithms in CMake today, but it would be great to make it easier to create such projects.
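For context, this is the kind of code in question; a minimal sketch, assuming GCC's libstdc++, where the parallel overloads typically need TBB at link time (via PkgConfig::TBB or -ltbb as discussed in the answer below):
// Minimal sketch of a C++17 parallel algorithm; with libstdc++ this usually
// requires linking TBB, which is what the CMake question is about.
#include <algorithm>
#include <execution>
#include <random>
#include <vector>

int main()
{
    std::vector<double> v(1'000'000);
    std::mt19937 gen(42);
    std::uniform_real_distribution<double> dist(0.0, 1.0);
    for (auto& x : v) x = dist(gen);

    // std::execution::par asks the implementation to parallelise the sort;
    // libstdc++ dispatches this to TBB when available.
    std::sort(std::execution::par, v.begin(), v.end());
    return 0;
}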
I have Ubuntu 21.10 with GCC and TBB (libtbb-dev 2020.3-1) installed.
Since TBB uses pkg-config, what worked for me is this:
# file CMakeLists.txt
find_package(PkgConfig REQUIRED)
pkg_search_module(TBB REQUIRED IMPORTED_TARGET tbb) # IMPORTED_TARGET creates the PkgConfig::TBB target used below
link_libraries(PkgConfig::TBB)
Unfortunately, when I later installed intel-oneapi 2021.4.0 which includes its own TBB (in /opt/intel/oneapi) this stopped working.
This newer version of TBB apparently cannot be used as a backend for GCC's (parallel) STL (it generates compilation errors on my system), so I did this in order to pick the system's TBB and not the /opt/intel version:
link_libraries(-ltbb) #(PkgConfig::TBB)
This defeats the purpose of CMake, so I am also looking for a more robust solution.
My recommendation at the moment is: do not install oneAPI's TBB if what you want is to make it work with the system's GCC.
I am trying to compile some Fortran code using gfortran 9.x.x on my CentOS 7.xx machine. I have a particular version of the code that requires linking against LAPACK and BLAS (specifically, liblapack.a and librefblas.a). I have LAPACK (and all the -devel libs) and BLAS (same for the -devel libs) installed (both available in the CentOS base repo).
While I (and therefore the linker) can find liblapack.a (it's in /usr/lib64), there is no trace of librefblas.a (which causes the linker to complain bitterly, and the compilation to crash and burn).
In fact, I tried installing both BLAS and OpenBLAS on the same machine, but that didn't help -- librefblas.a is still nowhere to be found.
The first thing to try is to use the regular libblas. Either change your Makefile to use libblas instead of librefblas or make a symlink. Then check whether you have any unresolved references. Or do the same for OpenBLAS and point your Makefile to libopenblas. Note that OpenBLAS also includes LAPACK.
Background: BLAS and LAPACK are publicly available interfaces. There is a reference implementation available, but also many alternative optimized or machine-specific ones. It should not matter which one you use, so it seems unnecessary to specifically require the reference one. Normally, your Linux distribution's libblas is the reference one anyway. It is probably just a quirk of your Makefile.
no trace of librefblas.a
The CentOS 7 lapack-3.4.2-8.el7 package build doesn't create or install the file librefblas.a.
I.e. there is no available package providing /usr/lib64/librefblas.a.
The package blas-static provides one file only: /usr/lib64/libblas.a
Build librefblas.a yourself from the lapack-3.4.2-clean.tgz source tarball, available at
https://src.fedoraproject.org/repo/p....4.2-clean.tgz
tar xvf lapack-3.4.2-clean.tgz
cd lapack-3.4.2/
cp make.inc.example make.inc
make blaslib
... and librefblas.a will be created.
Do this in the package folder:
sudo ln -s $HOME/lapack-3.9.0/librefblas.a /usr/local/lib/librefblas.a
That solved it.
I have a file that needs to use the Boost numeric bindings library. How can I get that bindings library?
The following link seems not to work anymore. The zipped file is corrupted.
http://mathema.tician.de/dl/software/boost-numeric-bindings/
And I hope I can use it on Windows with Visual Studio.
I have a file that needs to use the Boost numeric bindings library. How can I get that bindings library?
You can grab the sources of the current version via
svn co http://svn.boost.org/svn/boost/sandbox/numeric_bindings
The version from http://mathema.tician.de/dl/software/boost-numeric-bindings/ is an older version with a different interface. You can grab the sources of that older version via
svn co http://svn.boost.org/svn/boost/sandbox/numeric_bindings-v1
And I hope I can use it on Windows with Visual Studio.
You need a BLAS/LAPACK library in addition to the bindings. For Windows, Intel MKL, AMD's ACML, CLAPACK and ATLAS used to work, the last time I checked. (You only need one of these, but note that ATLAS only implements BLAS and a small subset of LAPACK.) These libraries have widely different performance and license conditions, but they all implement the same interface (more or less).
In general, the people at http://lists.boost.org/mailman/listinfo.cgi/ublas seem to be helpful if you try to use the bindings (or uBLAS) and run into problems. But I'm not sure whether any of them check here on Stack Overflow for potential users that have run into problems.
I just downloaded Boost because I need the precise floating-point arithmetic found in cpp_dec_float.hpp; I looked around a lot for other options, and couldn't find a good alternative.
I spent a while figuring out how to install bcp, and now I've finally installed all of it. I ran bcp to copy the cpp_dec_float.hpp file into my project, and lo and behold! Now I have a 9.5 MB Boost folder sitting in my C++ application directory. This will not be acceptable for my purposes.
Is there a way I can install only the cpp_dec_float library without the rest of the multiprecision part? If not, does anyone know of a lightweight (VERY important!), fast, maintained and (at least relatively) recent library for arbitrary-precision numbers?
You can download Boost somewhere other than your application directory. Usually you install Boost for use with all projects by all users. Boost.Multiprecision is a header-only library; you only need its headers.
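A minimal sketch of that header-only usage (nothing is built or linked, only headers are included; cpp_dec_float_50 is the 50-decimal-digit typedef):
// Sketch: header-only use of Boost.Multiprecision, no Boost libraries linked.
#include <boost/multiprecision/cpp_dec_float.hpp>
#include <iomanip>
#include <iostream>
#include <limits>

int main()
{
    using boost::multiprecision::cpp_dec_float_50;  // 50 decimal digits

    cpp_dec_float_50 third = cpp_dec_float_50(1) / 3;
    std::cout << std::setprecision(std::numeric_limits<cpp_dec_float_50>::digits10)
              << third << std::endl;  // 0.3333... to 50 digits
    return 0;
}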
If you need to reduce your space that much, you can try to do the following (I did not test it).
Follow standard steps from Boost installation:
Download Boost sources, say, to folder my_boost
cd my_boost; mkdir build
Run ./bootstrap (will create a bjam executable for your platform)
Then ask Boost to build only a specific set of libraries. Since the multiprecision library depends on some others, you might need to specify them all:
bjam --build-dir=build --with-multiprecision --with-utility --with-type_traits install
You are guaranteed that unrelated libraries will not be built. I am not so sure that unrelated header files will not be copied to the Boost include directory.
See "bjam --help" for more options.
I've also noticed that bcp copies more files than it should. My strategy: get it compiling, then spend 10 minutes removing stuff from your Boost dir, checking to see if it still compiles each time. :)
I want to use the Boost library in my iPhone project, specifically only boost::numeric::ublas. I managed to build static libraries for Boost in order to link them into my iPhone project. However, when I look at those .a libraries I can't find one related to uBLAS (I tried ./bootstrap.sh --with-libraries=ublas in the terminal, but no luck). Does anyone know which static library to look for for uBLAS? Or how to use uBLAS in an iPhone project in general?
Thanks!
uBlas is header-only so there is no static library - see this view of the libraries:
http://www.boost.org/doc/libs/1_44_0/?view=filtered_header-only
If you are OK with running on iOS 4 only, use the Accelerate framework; it has BLAS and features hardware acceleration (when available, software otherwise).
Even if you need 3.x support, it would be worth figuring out how to toggle the use of Accelerate when possible just to get the hardware support.
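As a rough sketch of what going through Accelerate's BLAS looks like (assuming the Accelerate framework is added to the Xcode project; cblas_dgemm computes C = alpha*A*B + beta*C):
// Sketch: calling BLAS through the Accelerate framework instead of uBlas.
#include <Accelerate/Accelerate.h>

int main()
{
    const int n = 3;
    double A[9] = {1, 0, 0,  0, 1, 0,  0, 0, 1};  // 3x3 identity
    double B[9] = {1, 2, 3,  4, 5, 6,  7, 8, 9};
    double C[9] = {0};

    // C = 1.0 * A * B + 0.0 * C, row-major layout
    cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                n, n, n, 1.0, A, n, B, n, 0.0, C, n);
    return 0;
}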
I can't answer the iPhone-specific part but I can help at least with the Boost part...
Boost uBlas is a header-only library, so you don't need to build or link against any .a files. Just include the headers in your project if you want to use the library.
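For completeness, a minimal header-only uBlas sketch (nothing to link, just headers):
// Sketch: Boost uBlas is header-only, so this compiles with no extra libraries.
#include <boost/numeric/ublas/matrix.hpp>
#include <boost/numeric/ublas/io.hpp>
#include <iostream>

int main()
{
    namespace ublas = boost::numeric::ublas;

    ublas::matrix<double> A(2, 2), B(2, 2);
    A(0, 0) = 1; A(0, 1) = 2; A(1, 0) = 3; A(1, 1) = 4;
    B(0, 0) = 5; B(0, 1) = 6; B(1, 0) = 7; B(1, 1) = 8;

    ublas::matrix<double> C = ublas::prod(A, B);  // matrix product
    std::cout << C << std::endl;                  // [2,2]((19,22),(43,50))
    return 0;
}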