I've recently encountered a need for a library or set of libraries to handle operations on 2D polygons. I need to be able to perform boolean/clipping operations (difference and union) and triangulation.
So far the libraries I've found are poly2tri, CGAL, and GPC. Poly2tri looks good for triangulation but I'm still left with boolean operations, and I'm unsure about its maturity.
CGAL and GPC are only free if my own project is free. My particular project isn't commercial, so I'm hesitant to pay for or request any licenses. But I may want to use my code for a future commercial project, so I'm hesitant about CGAL's open source licenses and GPC's freeware-only restriction. There don't seem to be any polygon clipping libraries with nice BSD-style licenses.
Oh, and C/C++ is preferred.
Clipper is an open source freeware polygon clipping library (written in Delphi and C++)^ that does exactly what you're asking (except for triangulation) - http://sourceforge.net/projects/polyclipping/
In my testing, Clipper is both significantly faster and far less prone to error than GPC (see more detailed comparisons here - http://www.angusj.com/delphi/clipper.php#features).
Re: Anti-grain Geometry (AGG) graphics library - it doesn't do polygon clipping, but simply uses GPC (which isn't free for commercial applications). However, Clipper does have AGG units to make clipping in AGG just as easy as GPC.
^ Edit: Clipper is now written in C# too (together with Perl, Ruby, Haskell and Flash modules written by third-parties).
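For a concrete feel, here is a minimal sketch of a union with Clipper's C++ API. It targets the Clipper 6 names (Path/Paths, AddPath); older releases used Polygon/Polygons and AddPolygon instead, so adjust accordingly.

// Minimal sketch, assuming the Clipper 6 C++ API (Path/Paths, AddPath).
// Older releases expose Polygon/Polygons and AddPolygon instead.
#include "clipper.hpp"
#include <cstdio>

int main() {
    using namespace ClipperLib;

    Path square, triangle;
    square.push_back(IntPoint(0, 0));
    square.push_back(IntPoint(10, 0));
    square.push_back(IntPoint(10, 10));
    square.push_back(IntPoint(0, 10));

    triangle.push_back(IntPoint(5, 5));
    triangle.push_back(IntPoint(15, 5));
    triangle.push_back(IntPoint(10, 15));

    Clipper c;
    c.AddPath(square, ptSubject, true);    // true = closed path
    c.AddPath(triangle, ptClip, true);

    Paths solution;
    c.Execute(ctUnion, solution, pftNonZero, pftNonZero);  // use ctDifference for A - B

    std::printf("union produced %zu polygon(s)\n", solution.size());
    return 0;
}

Note that Clipper works on integer coordinates (IntPoint), so floating-point input has to be scaled before clipping and scaled back afterwards.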
PolygonLib is a new polygon clipping library written in C++ and already used in two projects. It is numerically robust, uses double coordinates, and is optimized for polygons with large numbers of vertices. See http://www.ulybin.de/products/polygonlib.php?lang=en for more details and comparison of performance and memory utilization with GPC and PolyBoolean.
The restricted evaluation version of the library is free for non-commercial use and supports the operations you need (except for triangulation).
How about boost? http://www.boost.org/doc/libs/1_47_0/libs/polygon/doc/index.htm
If you're fine with the heavy use of generics in the interface, I suspect this will serve your purposes well. I'm not sure if it contains triangulation, but you can implement one of the many available triangulation algorithms if it does not.
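If it helps, here is a rough sketch of Boost.Polygon's boolean-operation style on integer polygon sets; this follows the library's documented polygon-set usage, but check the current docs for the exact headers and concepts.

// Rough sketch of Boost.Polygon boolean operations on integer polygon sets.
// Check the library's polygon-set usage documentation for the authoritative API.
#include <boost/polygon/polygon.hpp>
#include <vector>
#include <cstdio>

namespace gtl = boost::polygon;
using namespace gtl::operators;  // brings in +, -, *, ^ for polygon sets

int main() {
    typedef std::vector<gtl::polygon_data<int> > PolygonSet;

    PolygonSet a, b, result;
    a += gtl::rectangle_data<int>(0, 0, 10, 10);   // square A
    b += gtl::rectangle_data<int>(5, 5, 15, 15);   // overlapping square B

    gtl::assign(result, a - b);   // difference; use a + b for union
    std::printf("difference produced %zu polygon(s)\n", result.size());
    return 0;
}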
Check out Liszt, a Scala DSL.
http://www.antigrain.com/license/index.html is the closest I can find. You may have to spend a buck if your project does go commercial, but you can use it for free for now and sort out the license later on.
I have the need to replicate GPU tessellation on the CPU (ie get the same uvw coordinates on the CPU side as I will get on the GPU from the tessellator).
The reason for this is rather complicated, but simply put I have an algorithm that stores data per tessellation point, and to calculate it in the first place I need the uvw coordinates on the CPU.
I have googled a lot for the exact details of the tessellation pattern, but I only find very vague texts speaking about it in a general nature, the best one being this one: http://fgiesen.wordpress.com/2011/09/06/a-trip-through-the-graphics-pipeline-2011-part-12/
Is the reason for the lack of texts on this that it's vendor dependent, or have I simply not found the right page?
I'm interested in texts both on the OpenGL and DX11 implementation, if they differ.
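To make the question concrete, here is a rough sketch of the kind of CPU-side domain-point generation meant. It only handles a triangle patch with a single integer tessellation factor and equal spacing, and it is not guaranteed to match any particular vendor's fixed-function tessellator.

// Rough sketch: barycentric (u, v, w) domain samples for a triangle patch
// with a single integer tessellation factor and "equal" spacing.
// This is a simplification and is NOT guaranteed to reproduce the exact
// point ordering or fractional-spacing behaviour of real hardware tessellators.
#include <cstdio>
#include <vector>

struct UVW { float u, v, w; };

// factor must be >= 1
std::vector<UVW> tessellateTriangleDomain(int factor) {
    std::vector<UVW> points;
    for (int i = 0; i <= factor; ++i) {
        for (int j = 0; j <= factor - i; ++j) {
            UVW p;
            p.u = static_cast<float>(i) / factor;
            p.v = static_cast<float>(j) / factor;
            p.w = 1.0f - p.u - p.v;   // barycentric coordinates sum to 1
            points.push_back(p);
        }
    }
    return points;
}

int main() {
    std::vector<UVW> pts = tessellateTriangleDomain(4);
    for (const UVW& p : pts)
        std::printf("u=%.3f v=%.3f w=%.3f\n", p.u, p.v, p.w);
    return 0;
}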
I was also very interested in tessellation, and specifically subdivision surfaces, some time ago. This is a very complicated topic; it has been researched since the early 1970s and is still an active research area.
It isn't clear if you want to re-implement the whole shader tessellation pipeline (which I think would take years for a single programmer) or just a single subdivision algorithm (or even an algorithm that isn't subdivision).
Anyway, here are some links about subdivision:
Theory
Typically we implement subdivision surfaces in tessellation shaders using the Catmull–Clark subdivision surface algorithm. You can find papers from the original authors via Google. The main one is:
"Recursively generated B-spline surfaces on arbitrary topological meshes" (year 1978, PDF)
Closer to code
Check papers of those cool guys from Microsoft Research:
Charles Loop - (BTW, author of another subdivision algorithm), purely mathematical stuff
Hugues Hoppe - look for "progressive meshes", much closer to software
Even closer to code
You can find some libraries on the web. When I searched a while ago, there were a dozen libraries that implemented subdivision on the CPU. I didn't look at them much, because I was interested in a GPU implementation. The search keyword is "subdivision lib" =)
The most interesting one is Pixar's OpenSubdiv. Take a look at their code.
Also look at the "NVIDIA Instanced Tessellation" sample (DirectX, OpenGL). They've implemented the tessellation pipeline in vertex and geometry shaders.
Hope it helps!
CGAL seems to do just about everything I need and a little more for my upcoming project. It can create polygons out of arc line segments and run boolean operations on them. It has spatial sorting packages already that would save me a lot of time regarding a few things and the whole library seems quite standardized and well planned.
There's just the issue with the license being QPL (GPL for the upcoming version 4.0) for most of the packages (except the very basic ones). I've got a meager budget and likely cannot gather the funds to buy commercial licenses for the specific packages in CGAL that require them.
My specific needs of such a library would be:
Exact precision 2D euclidean space
Complex polygons
Polygons able to have curved line (arc) segments
Boolean operations on those polygons
Polygon offsetting
Polygon partitioning or effective triangulation
Inscribed area and polygon fitting algorithms
Possibly some spatial sorting structures with circular range searches
All in all, I'm looking for a well rounded 2D geometry C++ library with exact precision.
Preferably with MIT, LGPL at a stretch, or a low cost one-time royalty-free license below $500.
Boost has some basic structures down, but from what I can tell it lacks a lot of the higher-level functionality. Are there any libraries that have expanded on this? I would consider doing it myself, but I lack the expertise to do it well and it'd prolong my project by quite a bit.
Just to be clear, I'm not looking for a 2D graphics library, just pure geometry structures.
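For context, the kind of CGAL functionality being weighed here is its 2D Boolean set operations with an exact kernel. A minimal sketch for linear polygons looks roughly like this (circular-arc polygons go through the General_polygon_2 / circle-segment traits variants of the same package):

// Minimal sketch of CGAL's 2D Boolean set operations on linear polygons
// with an exact-constructions kernel. Arc-segment polygons use the
// general-polygon variants of the same package.
#include <CGAL/Exact_predicates_exact_constructions_kernel.h>
#include <CGAL/Polygon_2.h>
#include <CGAL/Polygon_with_holes_2.h>
#include <CGAL/Boolean_set_operations_2.h>
#include <iterator>
#include <list>
#include <iostream>

typedef CGAL::Exact_predicates_exact_constructions_kernel K;
typedef K::Point_2                                         Point_2;
typedef CGAL::Polygon_2<K>                                 Polygon_2;
typedef CGAL::Polygon_with_holes_2<K>                      Polygon_with_holes_2;

int main() {
    Polygon_2 P, Q;
    P.push_back(Point_2(0, 0));  P.push_back(Point_2(4, 0));
    P.push_back(Point_2(4, 4));  P.push_back(Point_2(0, 4));

    Q.push_back(Point_2(2, 2));  Q.push_back(Point_2(6, 2));
    Q.push_back(Point_2(6, 6));  Q.push_back(Point_2(2, 6));

    Polygon_with_holes_2 unionPQ;
    if (CGAL::join(P, Q, unionPQ))   // union; returns true if P and Q overlap
        std::cout << "union has " << unionPQ.outer_boundary().size() << " vertices\n";

    std::list<Polygon_with_holes_2> diff;
    CGAL::difference(P, Q, std::back_inserter(diff));   // P \ Q
    std::cout << "difference has " << diff.size() << " component(s)\n";
    return 0;
}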
Have a look at Wykobi.
It is a templated library and you can template the dimension as 2D.
It is distributed under the MIT License.
Take a look at Geometric Tools for Computer Graphics.
Refined over a decade
Unbelievably good documentation, both in hard bound and extensively in PDF form
Boost license
It meets all your requirements:
Exact precision 2D euclidean space: Yes
Complex polygons : Yes
Polygons able to have curved line (arc) segments: Nonsensical. By definition, polygons are composed of line segments. If you are looking for splines and NURBS, the library has them.
Boolean operations on those polygons : Yes
Polygon offsetting : Unclear what you mean. The library certainly supports translation.
Polygon partitioning or effective triangulation: Yes, Delaunay triangulation and Voronoi regions
Inscribed area and polygon fitting algorithms: Yes
Possibly some spatial sorting structures with circular range searches : Yes, spatial sorting and a whole bushel of intersection functions.
All this comes from the book Geometric Tools for Computer Graphics by Schneider and Eberly. The book is outstanding, with clear presentation of how the algorithms work and what their limitations are. The authors have made the code available online under the Boost license and include most (all?) of the book online as a PDF to accompany each code module. They maintain a very useful website that is indexed in various ways.
I have no connection to the authors nor any monetary interest. I used their book in my thesis and was extremely pleased with it as an easy-to-use reference and a powerful library.
Have you looked at Boost.Geometry library? It's nowhere near CGAL in terms of functionality, but it might help you.
You could try GeoLib www.geolib.co.uk. Not as much functionality but does offer boolean operations and is very easy to use.
I am currently prototyping some algorithms in Matlab that rely on matrix, DSP, statistics and image analysis functionality.
Some examples of what I may need:
eigenvectors
convolution in 2D and 3D
FFT
Short Time Fourier Transform
Hilbert transform
Chebyshev polynomials
low pass filter
random multivariate gaussian numbers
kmeans
Later on I will need to implement these algorithms in C++.
I also have a license for Numerical Recipes in C++, which I like because it is well documented and has a wide variety of algorithms.
I also found a class that helps with wrapping NR functions in MEX: nr3matlab.h.
So using this class I should be able to generate wrappers that allow me to call NR functions from Matlab. This is very important to me, so that I can check each step when porting from Matlab to C++.
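For illustration, the bare MEX gateway that such wrappers are built on looks roughly like this (a toy example that just squares each element; nr3matlab.h itself is not shown here):

// Toy MEX gateway sketch: squares each element of the input matrix.
// Real NR wrappers (e.g. via nr3matlab.h) would convert mxArray data to the
// library's matrix/vector types, call the NR routine, and convert back.
#include "mex.h"

void mexFunction(int nlhs, mxArray* plhs[], int nrhs, const mxArray* prhs[]) {
    if (nrhs != 1 || !mxIsDouble(prhs[0]))
        mexErrMsgTxt("Expected one double matrix as input.");

    mwSize m = mxGetM(prhs[0]);
    mwSize n = mxGetN(prhs[0]);
    const double* in = mxGetPr(prhs[0]);

    plhs[0] = mxCreateDoubleMatrix(m, n, mxREAL);
    double* out = mxGetPr(plhs[0]);

    for (mwSize i = 0; i < m * n; ++i)
        out[i] = in[i] * in[i];
}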
However, Numerical Recipes in C++ has some important shortcomings:
algorithms implemented in a simple, and not necessarily very efficient, manner
not threaded
I am therefore considering using another numerical library.
The ideal library should:
be as broad in scope and functionality as possible
be well documented
(have commercial support)
have already made Matlab wrappers
very robust
very efficient
threaded
(have a GPU implementation that can be turned on instead of the CPU with a "switch")
Which numerical library (libraries) would you suggest?
Thanks in advance for any answers!
You have a pretty long list of requirements, and it may be challenging to cover them all with a single library.
For general Matlab-to-C++ transitions, I can highly recommend Armadillo, which is a templated C++ library with a focus on linear algebra and, in particular, on making it easy to write Matlab-like expressions. It has very good performance, is very well documented, and is actively maintained. You could start there and try to fill in the missing pieces for your task.
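To illustrate the Matlab-like feel, a small sketch covering the eigenvector item from the list above; double-check the function names against your Armadillo version's documentation.

// Small sketch of Armadillo's Matlab-like syntax: eigen-decomposition of a
// symmetric matrix. Check your Armadillo version's docs for exact signatures.
#include <armadillo>
#include <iostream>

int main() {
    arma::mat X = arma::randu<arma::mat>(5, 5);  // random 5x5 matrix
    arma::mat A = X.t() * X;                     // symmetric, almost surely positive definite

    arma::vec eigval;
    arma::mat eigvec;
    arma::eig_sym(eigval, eigvec, A);            // like [V, D] = eig(A) in Matlab

    std::cout << "eigenvalues:\n" << eigval << std::endl;
    return 0;
}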
Actually you should have a look at OpenCV.
Although its primary goal is computer vision/image processing, this library has a lot of linear algebra tools (almost all that you ask for). The library was originally implemented by Intel, with a lot of focus on performance; it can make use of multiple threads, IPP, etc.
The syntax is rather easier to use than the usual C++ library.
You should have a look at this cheat sheet. Since version 2.0 the syntax has been changed to mimic Matlab.
This library is widely used and actively maintained (last big update August 2011).
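To show the Matlab-like cv::Mat style mentioned above, here is a small sketch of 2D convolution (filtering) and k-means clustering; verify the headers against your OpenCV version.

// Small sketch of OpenCV's Matlab-like C++ API: 2D convolution (filtering)
// and k-means clustering. Verify the exact headers against your OpenCV version.
#include <opencv2/core/core.hpp>
#include <opencv2/imgproc/imgproc.hpp>
#include <iostream>

int main() {
    // 2D convolution / correlation via filter2D
    cv::Mat img = cv::Mat::ones(8, 8, CV_32F);
    cv::Mat kernel = (cv::Mat_<float>(3, 3) << 0,  1, 0,
                                               1, -4, 1,
                                               0,  1, 0);  // Laplacian-like kernel
    cv::Mat filtered;
    cv::filter2D(img, filtered, -1, kernel);

    // k-means on random 2D points
    cv::Mat points(100, 2, CV_32F);
    cv::randu(points, cv::Scalar(0), cv::Scalar(1));
    cv::Mat labels, centers;
    cv::kmeans(points, 3, labels,
               cv::TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::COUNT, 10, 1.0),
               3, cv::KMEANS_PP_CENTERS, centers);

    std::cout << "cluster centers:\n" << centers << std::endl;
    return 0;
}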
NAG could be a good option. Loads of financial institutions use it in their mathematical libraries. It didn't have a GPU implementation when I last used it, though.
There is also the Eigen library: http://eigen.tuxfamily.org. It is mostly used as part of a larger framework, and it offers basic (and a bit more complex) linear algebra.
Which library do you use for N-dimensional arrays?
I use Blitz++ at work and I really dislike some aspects of it. Some aspects are even dangerous: the need to resize before using operator=, the fact that A(Range::all(), Range::all()) throws for a (0,0) matrix, etc. And the linear algebra operations have to be done via CLAPACK.
I used and loved Eigen. I appreciate its "all-in-header" implementation, the C++ syntactic sugar, and the presence of all the linear algebra operations I need (matrix multiplication, linear system solving, Cholesky, ...).
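For instance, the operations just listed look like this in Eigen (a small sketch; names are from Eigen 3's dense module):

// Small sketch of the Eigen operations mentioned: matrix multiplication,
// solving a linear system, and a Cholesky factorization (Eigen 3 dense module).
#include <Eigen/Dense>
#include <iostream>

int main() {
    Eigen::MatrixXd A = Eigen::MatrixXd::Random(4, 4);
    Eigen::MatrixXd B = Eigen::MatrixXd::Random(4, 4);
    Eigen::VectorXd b = Eigen::VectorXd::Random(4);

    Eigen::MatrixXd C = A * B;                              // matrix multiplication
    Eigen::VectorXd x = A.colPivHouseholderQr().solve(b);   // solve A x = b

    Eigen::MatrixXd S = A.transpose() * A;                  // symmetric, almost surely positive definite
    Eigen::MatrixXd L = S.llt().matrixL();                  // Cholesky: S = L * L^T

    std::cout << "x =\n" << x << "\n";
    std::cout << "L =\n" << L << "\n";
    return 0;
}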
What are you using?
boost::array and also boost::MultiArray. There's also a pretty good linear algebra package in Boost called uBLAS.
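A tiny sketch of Boost.MultiArray for the N-dimensional part (uBLAS would handle the linear algebra side):

// Tiny sketch of Boost.MultiArray: a 3-dimensional double array.
#include <boost/multi_array.hpp>
#include <iostream>

int main() {
    typedef boost::multi_array<double, 3> Array3D;
    Array3D A(boost::extents[3][4][5]);   // 3 x 4 x 5 array

    A[1][2][3] = 42.0;                    // element access like nested arrays

    // Shape information is available at runtime
    std::cout << "dims: " << A.shape()[0] << " x "
              << A.shape()[1] << " x " << A.shape()[2] << "\n";
    std::cout << "A[1][2][3] = " << A[1][2][3] << "\n";
    return 0;
}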
There is also Armadillo, which I am using in some projects. From their website:
Armadillo is a C++ linear algebra library (matrix maths) aiming towards a good balance between speed and ease of use. Integer, floating point and complex numbers are supported, as well as a subset of trigonometric and statistics functions. Various matrix decompositions are provided through optional integration with LAPACK and ATLAS libraries.
A delayed evaluation approach is employed (during compile time) to combine several operations into one and reduce (or eliminate) the need for temporaries. This is accomplished through recursive templates and template meta-programming.
This library is useful if C++ has been decided as the language of choice (due to speed and/or integration capabilities), rather than another language like Matlab ® or Octave. It is distributed under a license that is useful in both open-source and commercial contexts.
Armadillo is primarily developed at NICTA (Australia), with contributions from around the world.
We've used TNT successfully for a number of years. There are sufficient issues, however, that we're moving toward an internally developed solution instead. The two biggest sticking points for us are that
The arrays are not thread safe, even for read access, because they use a non-thread safe reference count.
The arrays cause all sorts of problems when you write const-correct code.
If those aren't a problem then they're fairly convenient for a lot of common array tasks.
Basically, I'm looking for a library or SDK for handling large point clouds coming from LIDAR or scanners, typically running into many millions of points of X, Y, Z, colour. What I'm after is as follows:
Fast display, zooming, panning
Point cloud registration
Fast low level access to the data
Regression of surfaces and solids (not as important as the others)
While I don't mind paying for a reasonable commercial library, I'm not interested in a very expensive library (e.g. in excess of about $5k) or one with a per user run-time license cost. Open source would also be good. I found a few possibilities via google, but they all tend to be too expensive for my budget.
Check out the Point Cloud Library (PCL). It is quite a complete toolkit for processing and manipulating point clouds. It also provides tools for point cloud visualisation: pcl::visualization::CloudViewer, which makes use of the VTK library and wxWidgets.
Since 2011, a point cloud translation (read/write) and manipulation toolkit has also been developed: PDAL - Point Data Abstraction Library.
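A minimal sketch of the PCL viewer mentioned above, loading a PCD file and showing it with CloudViewer; this follows PCL's standard tutorial code, so check it against your PCL version (and note that "scan.pcd" is a placeholder file name).

// Minimal sketch of loading a point cloud from a PCD file and displaying it
// with pcl::visualization::CloudViewer. Based on PCL's standard tutorial code;
// check the exact headers/classes against your PCL version.
#include <pcl/point_types.h>
#include <pcl/io/pcd_io.h>
#include <pcl/visualization/cloud_viewer.h>

int main() {
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZRGB>);

    // "scan.pcd" is a placeholder file name
    if (pcl::io::loadPCDFile<pcl::PointXYZRGB>("scan.pcd", *cloud) == -1)
        return -1;

    pcl::visualization::CloudViewer viewer("Cloud viewer");
    viewer.showCloud(cloud);
    while (!viewer.wasStopped()) {
        // spin until the window is closed
    }
    return 0;
}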
I second the call for R which I interface with C++ all the time (using e.g. the Rcpp and RInside packages).
R prefers all data in memory, so you probably want to go with a 64bit OS and a decent amount of RAM for lots of data. The Task View on High-Performance Computing with R has some pointers on dealing with large data.
Lastly, for quick visualization, the hexbin package is excellent for visually summarizing large data sets. For the zooming etc. aspect, try the rgl package.
Why don't you have a look at the R programming language, which can link directly to C code, thereby forming a bridge? R was developed with statistical code in mind but can very easily help not only to handle large datasets but also to visualize them. There are quite a number of atmospheric scientists who are using R in their work. I know, I work with them on exactly the stuff you're trying to do. Think of R as a poor man's Matlab or IDL (though it soon won't be).
In the spirit of the R answers, ROOT also provides a good underlying framework for this kind of thing.
Possibly useful features:
C++ code base and the CINT C++ interpreter as the working shell. Python bindings.
Can display three-dimensional point clouds
A set of geometry classes (though I don't believe that they support all the operations that you need)
Developed by nuclear and particle physicists instead of by statisticians :p
Vortex by Pointools can go up to much higher numbers of points than the millions that you ask for:
http://www.pointools.com/vortex_intro.php
It can handle files of many gigabytes containing billions of points on modest hardware.