Moving from sourceCpp() to a package with Rcpp - C++

I currently have a .cpp file that I can compile using sourceCpp(). The corresponding R function is created and the code works as expected.
Here it is:
#include <Rcpp.h>
using namespace Rcpp;

// [[Rcpp::export]]
NumericVector exampleOne(NumericVector vectorOne, NumericVector vectorTwo) {
    NumericVector outputVector = vectorOne + vectorTwo;
    return outputVector;
}
I am now converting my project over to a package using Rcpp. So I created the skeleton with RStudio and started looking at how to convert things over.
In Hadley's excellent primer on C++, he says in the section "Using Rcpp in a Package":
If your package uses the Rcpp::export attribute then one additional step in the package build process is required. The compileAttributes function scans the source files within a package for Rcpp::export attributes and generates the code required to export the functions to R.
You should re-run compileAttributes whenever functions are added, removed, or have their signatures changed. Note that if you build your package using RStudio or devtools then this step occurs automatically.
So it looks like the code that compiled with sourceCpp() should work pretty much as is in a package.
I created the corresponding R file.
exampleOne <- function(vectorOne, vectorTwo){
    outToR <- .Call("exampleOne", vectorOne, vectorTwo, PACKAGE = "testPackage")
    outToR
}
Then I (re)built the package and I get this error:
Error in .Call("exampleOne", vectorOne, vectorTwo, PACKAGE = "testPackage") :
C symbol name "exampleOne" not in DLL for package "testPackage"
Does anyone have an idea as to what else I need to do when taking code that compiles with sourceCpp() and then using it in a package?
I should note that I have read "Writing a package that uses Rcpp" (http://cran.rstudio.com/web/packages/Rcpp/vignettes/Rcpp-package.pdf) and understand the basic structure presented there. However, after looking at the RcppExamples source code, it appears that the structure in the vignette is not exactly the same as that used in the example package. For example, there are no .h files used. Also, neither the vignette nor the example source code uses the [[Rcpp::export]] attribute. This all makes it difficult to track down exactly where my error is.

Here is my "walk through" of how to go from using sourceCpp() to a package that uses Rcpp. If there is an error please feel free to edit this or let me know and I will edit it.
[NOTE: I HIGHLY recommend using RStudio for this process.]
So you have the sourceCpp() thing down pat and now you need to build a package. This is not hard, but it can be a bit tricky, because the information out there about building packages with Rcpp ranges from the exhaustive, thorough documentation you want with any R package (but that is over your head as a newbie) to the newbie-friendly introductions (that may leave out a detail you happen to need).
Here I use oneCpp.cpp and twoCpp.cpp as the names of two .cpp files you will use in your package.
Here is what I suggest:
A. First I assume you have a version of theCppFile.cpp that compiles with sourceCpp() and works as you expect it to. This is not a must, but if you are new to Rcpp OR packages, it is nice to make sure your code works in this simple situation before you move to the more complicated case below.
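For instance, using the exampleOne() function from the question above, a quick sanity check could look like this (the file name here is just the placeholder used in this walkthrough):
library(Rcpp)
sourceCpp("theCppFile.cpp")          # compiles the file and exports the [[Rcpp::export]] functions
exampleOne(c(1, 2, 3), c(4, 5, 6))   # should return 5 7 9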
B. Now build your package using Rcpp.package.skeleton() or use the Project > Create Project > Package w/Rcpp wizard in RStudio (HIGHLY recommended). You can find details about using Rcpp.package.skeleton() in hadley/devtools or the Rcpp Attributes vignette. The full documentation for writing packages with Rcpp is in "Writing a package that uses Rcpp"; however, that vignette assumes you know your way around C++ fairly well and does not use the new "attributes" way of doing Rcpp. It will be invaluable, though, if you move toward making more complex packages.
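If you are not using the RStudio wizard, a minimal sketch of the skeleton call is (names are the ones used in this walkthrough):
library(Rcpp)
Rcpp.package.skeleton("yourPackageName")
# then copy oneCpp.cpp and twoCpp.cpp into yourPackageName/src/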
You should now have a directory structure for your package that looks something like this:
yourPackageName/
- DESCRIPTION
- NAMESPACE
- R/
  - RcppExports.R
- Read-and-delete-me
- man/
  - yourPackageName-package.Rd
- src/
  - Makevars
  - Makevars.win
  - oneCpp.cpp
  - twoCpp.cpp
  - RcppExports.cpp
Once everything is set up, do a "Build & Reload" if using RStudio, or run compileAttributes() if you are not.
C. You should now see in your R/ directory a file called RcppExports.R. Open it and check it out. In RcppExports.R you should see the R wrapper functions for all the .cpp files you have in your src/ directory. Pretty sweet, eh?
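For the exampleOne() function from the original question, the generated wrapper looks roughly like this (the exact .Call symbol name depends on your Rcpp version):
# Generated by Rcpp::compileAttributes() - do not edit by hand
exampleOne <- function(vectorOne, vectorTwo) {
    .Call('testPackage_exampleOne', PACKAGE = 'testPackage', vectorOne, vectorTwo)
}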
D. Try out the R function that corresponds to the function you wrote in theCppFile.cpp. Does it work? If so, move on.
E. You can now just add new .cpp files like otherCpp.cpp to the src/ directory as you create them. Then you just have to rebuild the package, and the R wrappers will be generated and added to RcppExports.R for you. In RStudio this is just "Build & Reload" in the Build menu. If you are not using RStudio, run compileAttributes() and then rebuild.
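Outside RStudio, a typical rebuild cycle looks something like this (using devtools here is just one option, not a requirement):
Rcpp::compileAttributes()   # regenerate RcppExports.R and RcppExports.cpp
devtools::document()        # only if you use roxygen2 for documentation
devtools::install()         # or: R CMD build . && R CMD INSTALL yourPackageName_*.tar.gz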

In short, the trick is to call compileAttributes() from within the root of the package. So, for instance, for package foo:
$ cd /path/to/foo
$ ls
DESCRIPTION man NAMESPACE R src
$ R
R> compileAttributes()
This command will generate the RcppExports.cpp and RcppExports.R that were missing.

You are missing the forest for the trees.
sourceCpp() is a recent function; it is part of what we call "Rcpp attributes", which has its own vignette (with the same title, in the package, on my website, and on CRAN) that you may want to read. Among other things, it details how to turn something you compiled and ran using sourceCpp() into a package, which is what you want.
Randomly jumping between documentation won't help you; in the end, the genuine source documentation by the package authors may be preferable. Or to put a different spin on it: you are using a new feature but old documentation that doesn't reflect it. Try to write a basic package with Rcpp, i.e. come at it from the other end as well.
Lastly, there is a mailing list...

Related

Guidelines for including TMB C++ code in an R package

I've recently discovered the wonders of TMB and I'm working on a package which would ideally include TMB C++ templates in it for rather computationally expensive models.
I'm assuming that there's a way of automatically compiling the TMB source code on package install, but I can't find any clear guidelines in the TMB documentation regarding this. As of now, my alternative is to write functions that compile the TMB code upon the first call of a function which uses an uncompiled class... but I have a feeling there are nicer ways to do this.
Has anyone successfully included TMB functions within another package and could point me in the direction of relevant documentation or examples?
With a bit more searching I finally found my answer in this thread. I guess I missed it because the resolutions it details were moved to the wiki page titled "development", where the content is specifically targeted at users wishing to contribute to the development of TMB, whereas I just want to distribute code which incorporates TMB.
To summarize, the thread suggests some changes which I adopted like this (myPkg should be the name of your package):
src/
Place your .cpp template in myPkg/src. It will then be automatically compiled by R when you build your package.
DESCRIPTION
Add these lines to your DESCRIPTION file so R has all the tools necessary to compile the model template.
Depends: TMB, RcppEigen
LinkingTo: TMB, RcppEigen
R/roxygentags.r
Now we need to add our TMB template to the namespace file. We can do this easily through roxygen by making a dummy file like so:
#' Roxygen commands
#'
#' @useDynLib myPkg
#'
dummy <- function(){
  return(NULL)
}
The dummy function is just an excuse to have the tag @useDynLib myPkg somewhere in my source code where I won't mess with it. This tag will populate your NAMESPACE with useDynLib(myPkg)... and as I understand it, this loads the shared library for you when the package is loaded.
Calling the function in your package:
Finally, when calling MakeADFun, set DLL="myPkg". With this setup, you can only compile a single TMB model into your package, because the content compiled in your ./src/ folder is automatically named after your package, so you cannot create uniquely named models.
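For example, a call inside one of your package's R functions might look like this (dat and pars are hypothetical placeholders for your data and parameter lists):
obj <- TMB::MakeADFun(data = dat, parameters = pars, DLL = "myPkg", silent = TRUE)
opt <- stats::nlminb(obj$par, obj$fn, obj$gr)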
EDIT: Solution for distributing multiple DLLs
After some more searching (same thread as referenced above)... I realized that the solution described in the official wiki (and detailed above) is only relevant for distributing a single DLL (i.e. a single TMB model).
If you want to distribute multiple TMB models in a package, you'll have to use your own makefile. I've given a more detailed description in my blog, so I'll only briefly describe the steps here with regard to how they differ from the previous steps I described.
src/Makefile
You'll have to define your own Makefile (or Makefile.win for windows users) and drop it in your src/ directory. Here's an example that works for me:
all: template1.so template2.so
# Comment here preserves the prior tab

template1.so: template1.cpp
	Rscript --vanilla -e "TMB::compile('template1.cpp','-O0 -g')"

template2.so: template2.cpp
	Rscript --vanilla -e "TMB::compile('template2.cpp','-O0 -g')"

clean:
	rm -rf *o
For Windows, replace .so with .dll, and use the relevant compiler flags (for debugging). See ?TMB::compile for info regarding compiler flags for debugging.
R/roxygentags.r
This is slightly different than above:
#' Roxygen commands
#'
#' This is a dummy function whose purpose is to hold the useDynLib roxygen tags.
#' These tags will populate the namespace with the compiled C++ functions upon package install.
#'
#' @useDynLib template1
#' @useDynLib template2
#'
dummy <- function(){
  return(NULL)
}
Using your models in the package
Finally, the above changes will compile multiple uniquely named TMB templates and load them into the namespace. To call these models in your package, here's an example:
obj <- MakeADFun(data = data,
                 parameters = params,
                 DLL = "template1",
                 inner.control = list(maxit = 10000),
                 silent = FALSE)
Tips...
I had issues when I tried compiling this on a Windows machine... it turned out to be related to not properly cleaning the src folder: I had old Linux-compiled files stuck in there. If you have compilation issues, it's worth manually cleaning out the residual files in your src/ directory from previous builds... or perhaps someone can give some good advice on writing a better makefile!
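One quick way to do that cleaning from R (a sketch assuming the default extensions; adjust the pattern to your platform):
# remove leftover object files and shared libraries from previous builds
unlink(list.files("src", pattern = "\\.(o|so|dll)$", full.names = TRUE))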
If you want access to the CppAD library with the additional code from TMB (which is quite substantial!) then you can use the WITH_LIBTMB macro variable as I do in this header here. This will allow you to have multiple .cpp files which you can compile separately. Importantly, you only need to compile the code from the TMB header once using a file like this which #includes the TMB.hpp header without defining WITH_LIBTMB.
This reduces the compilation time substantively as you can compile each .cpp on its own without all the code which is declared in TMB.hpp. Moreover, you can also use the code with Rcpp if you undefine and define a few macros as I do in the link.
You can also have one file which can be used by TMB::MakeADFun. It requires a bit of manual work, but it can be done while also using Rcpp: run Rcpp::compileAttributes, rename the created file RcppExports.cpp to init.cpp, and then include the additional lines in:
- the CallEntries array
- the R_init_survTMB function
Note on using RStudio
RStudio calls Rcpp::compileAttributes (or something similar) each time you build, so you cannot use the standard RStudio build with this setup. One way around this is to create a custom build script similar to the one here. It essentially calls R CMD INSTALL after having removed the RcppExports.cpp file created by Rcpp::compileAttributes. I also like to run the tests by calling devtools::test(), but you can remove this if you like.
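A sketch of such a build script, following the steps described above (the exact commands are assumptions about your setup; adjust paths as needed):
# custom build: drop the auto-generated exports file, then install and test
if (file.exists("src/RcppExports.cpp")) file.remove("src/RcppExports.cpp")
system("R CMD INSTALL .")
devtools::test()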

Force source rebuild in R package

This is similar to this question and answer, except specialized to R packages. Since R uses its own custom build process, what is the correct way to force a rebuild using Rcpp?
(For reasons that I won't go into here, all of my C++ code is located outside /pkg/src, and is called via a simple wrapper function that never changes. For this reason, when the important code changes, R thinks nothing has changed and declares the dreaded make: Nothing to be done for 'all'.)
The simplest solution is to add the flag --preclean to R CMD INSTALL. In RStudio, this flag can be added under Project Options -> Build Tools -> Build and Reload - R CMD INSTALL additional options.
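If you install from the console rather than through RStudio, you can pass the same flag along, for example via devtools (just one way of doing it):
devtools::install(args = "--preclean")   # forwards --preclean to R CMD INSTALL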
Regarding
what is the correct way to force a rebuild using Rcpp
the obvious answer is to rebuild from source:
R CMD INSTALL sourceTarballOfPackage_0.1.2.tar.gz
The question then becomes where to get the source: CRAN, GitHub, GitLab, BitBucket, ... but we have helpers for that.
If your code is internal, then you just need to rebuild the wrapper calling it, and that is still in src/ in the package. That is no different than another R(cpp) package linking against external resources.

Do I need to import RcppEigen in the DESCRIPTION file for an R package using it, or is "LinkingTo" enough?

I used the RcppEigen.package.skeleton() as a template for adding a small function to an existing R package, so that my DESCRIPTION file now has the lines:
Imports: Rcpp (>= 0.11.3), RcppEigen (>= 0.3.2.3.0)
LinkingTo: Rcpp, RcppEigen
However, doing R CMD check --as-cran <myPackageName_1.0.0>.tar.gz gives the following:
"Package in Depends/Imports which should probably only be in LinkingTo: 'RcppEigen'"
The Writing R Extensions page says: "Specifying a package in ‘LinkingTo’ suffices if these are C++ headers containing source code or static linking is done at installation: the packages do not need to be (and usually should not be) listed in the ‘Depends’ or ‘Imports’ fields. This includes CRAN packages BH and almost all users of RcppArmadillo and RcppEigen."
I don't know any C++, so I don't know what this means. My procedure for creating the package is here: RcppEigen - going from inline to a .cpp function in a package and "Map"
Is it okay to remove the RcppEigen from "Imports" and why/why not? (i.e. Can you please explain what the Writing R Extension page is saying, for my case, so that I can understand what I'm doing? Both the R and software experts in my lab said they don't understand the difference between "Imports" and "LinkingTo".)
Briefly:
There are currently 25 packages on CRAN which use RcppEigen. That makes 25 working case studies. You could look at one or two.
LinkingTo: is generally enough.
It may be a bug that the skeleton generator still adds Imports. We no longer do that in RcppArmadillo.
When I just ran the corresponding function for RcppArmadillo, I got
Imports: Rcpp (>= 0.11.3)
LinkingTo: Rcpp, RcppArmadillo
so I am leaning towards a bug. And I now opened an issue ticket for it.
More broadly, the differences between LinkingTo: and Imports: are
Imports: is the preferred, modern alternative to Depends; you also need to use NAMESPACE
LinkingTo: is chiefly used to point to header files as we do here.
See Writing R Extensions, or Hadley's online book, for details.

How to get a Python .pyd for Windows from C/C++ source code? (update: BRISK is now available in Python, in case that's what you want)

How do I get from C/C++ extension source code to a .pyd file for Windows (or another item that I could import into Python)?
edit: The specific library that I wanted to use (BRISK) was included in OpenCV 2.4.3, so my need for this skill went away for the time being. In case you came here looking for BRISK, here is a simple BRISK-in-Python demo that I posted.
I have the BRISK source code (download) that I would like to build and use in my Python application. I got as far as generating a brisk.pyd file... but it was 0 bytes. If there is a better or alternative way of getting to a brisk.pyd file, then of course I am open to that as well.
edit: Please ignore all the attempts in my original question below and see my answer, which was made possible by obmarg's detailed walkthrough.
Where am I going wrong?
Distutils without library path: First I tried to build the source as is with distutils and the following setup.py (I have just started learning distutils so this is a shot in the dark). The structure of the BRISK source code is at the bottom of this question for reference.
from distutils.core import setup, Extension

module1 = Extension('brisk',
                    include_dirs = ['include', 'C:/opencv2.4/build/include',
                                    'C:/brisk/thirdparty/agast/include'],
                    #libraries = ['agast_static', 'brisk_static'],
                    #library_dirs = ['win32/lib'],
                    sources = ['src/brisk.cpp'])

setup(name = 'BriskPackage',
      ext_modules = [module1])
That instantly gave me the following lines and a 0 byte brisk.pyd somewhere in the build folder. So close?
running build
running build_ext
Distutils with library path: Scratch that attempt. So I added the two library lines that are commented out in the above setup.py. That seemed to go ok until I got this linking error:
creating build\lib.win32-2.7
C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\BIN\link.exe /DLL /nologo /INCREMENTAL:NO /LIBPATH:win32/lib /LIB
PATH:C:\Python27_32bit\libs /LIBPATH:C:\Python27_32bit\PCbuild agast_static.lib brisk_static.lib /EXPORT:initbrisk build
\temp.win32-2.7\Release\src/brisk.obj /OUT:build\lib.win32-2.7\brisk.pyd /IMPLIB:build\temp.win32-2.7\Release\src\brisk.
lib /MANIFESTFILE:build\temp.win32-2.7\Release\src\brisk.pyd.manifest
LINK : error LNK2001: unresolved external symbol initbrisk
build\temp.win32-2.7\Release\src\brisk.lib : fatal error LNK1120: 1 unresolved externals
error: command '"C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\BIN\link.exe"' failed with exit status 1120
Uncontrolled flailing: I thought maybe the libraries needed to be built, so I did a crash course (lots of crashing) with CMake + MinGW, then CMake + VC++ Express 2010, as follows:
cmake gui: source: c:/brisk, build: c:/brisk/build
cmake gui: configure for Visual Studio 10
cmake gui: use default options and generate (CMAKE_BACKWARDS_COMPATIBILITY, CMAKE_INSTALL_PREFIX, EXECUTABLE_OUTPUT_PATH, LIBRARY_OUTPUT_PATH)
VC++ Express 10: Change to Release, build the solution generated by CMake, and get about 20 pages of what look like non-critical warnings followed by "all succeeded". Note - no DLLs are generated by this. It does generate the following libraries, of similar size to the ones included with the download:
win32/lib/Release/
agast_static.lib
brisk_static.lib
Further flailing.
Relevant BRISK source file structure for reference:
build/ (empty)
include/brisk/
brisk.h
hammingsse.hpp
src
brisk.cpp
demo.cpp
thirdparty/agast/
include/agast/
agast5_8.h ....
cvWrapper.h
src/
agast5_8.cc ...
CMakeLists.txt
win32/
bin/
brisk.mexw32
opencv_calib3d220.dll ...
lib/
agast_static.lib
brisk_static.lib
CMakeLists.txt
FindOpenCV.cmake
Makefile
Are you sure that this BRISK library even exports Python bindings? I can't see any reference to it in the source code - it doesn't even seem to include the Python header files. This would certainly explain why you've not had much success so far - you can't just compile plain C++ code and expect Python to interface with it.
I think your second distutils example is closest to correct - it's obviously compiling things and getting to the linker stage, but then you encounter this error. That error just means it can't find a function named initbrisk, which I'm guessing would be the top-level init function for the module. Again, this suggests that you're trying to compile a Python module from code that isn't meant for it.
If you want to wrap the C++ code in a Python wrapper yourself, you could have a look at the official documentation on writing C/C++ extensions. Alternatively, you could look into boost::python, SIP or Shiboken, which try to somewhat (or completely) automate the process of making Python extensions from C++ code.
EDIT: Since you seem to have made a decent amount of effort to solve the problem yourself and have posted a good question, I've decided to give a more detailed response on how to go about doing this.
Quick Tutorial On Wrapping C++ Libraries Using boost::python
Personally I've only ever used boost::python for stuff like this, so I'll try and give you a good summary of how to go about doing that. I'm going to assume that you're using Visual C++ 2010. I'm also going to assume that you've got a 32-bit version of Python installed, as I believe the BoostPro libraries only provide 32-bit binaries.
Installing boost
First you'll need to grab a copy of the boost library. The easiest way to do this is to download an installer from the BoostPro website. This should install all the header files and binaries required for using the boost C++ libraries on Windows. Take note of where you install these files, as you'll need them later on - it might be best to install to a path without a space in it. For simplicity I'm going to assume you put these files in C:\boost, but you can substitute that for the path you actually used.
Alternatively, you can follow these instructions to build boost from source. I'm not 100% sure, but it may be the case that you need to do this in order to get a version of boost::python that is compatible with the version of Python you have installed.
Setting up a visual studio project
Next, you'll want to setup a visual studio project for brisk.pyd. If you open visual studio, go to New -> Project then find the option for Win32 Project. Set up your location etc. and click ok. In the wizard that appears select a DLL project type, and then tick the empty project checkbox.
Now that you've created your project, you'll need to set up the include & library paths to allow you to use python, boost::python and the brisk.lib file.
In Visual Studio's Solution Explorer, right click on your project and select Properties from the menu that appears. This should open up the property pages for your project. Go to the Linker -> General section and look for the Additional Library Directories field. You'll need to fill this in with the paths to the .lib files for boost, Python and your brisk_static.lib. Generally these can be found in the lib (or libs) subdirectories of wherever you've installed the libraries. Paths are separated with semicolons. I've attached a screenshot of my settings below:
Next, you'll need to get Visual Studio to link to the .lib files. These settings can be found in the Additional Dependencies field of the Linker -> Input section of the properties. Again, it's a semicolon-delimited list. You should need to add in libraries for Python (in my case this is python27.lib, but this will vary by version) and brisk_static.lib. These do not require the full path, as you added that in the previous stage. Again, here's a screenshot:
You may also need to add the boost_python library file, but I think boost uses some header file magic to save you the trouble. If I'm incorrect, then have a look in your boost library path for a file named something like boost_python-vc100-mt.lib and add that in.
Finally, you'll need to set up the include paths to allow your project to include the relevant C++ header files. To get the relevant settings to appear in the project properties, you'll need to add a .cpp file to your project. Right click the Source Files folder in your Solution Explorer, then go to Add -> New Item. Select a C++ File (.cpp) and name it main.cpp (or whatever else you want).
Next, go back to your project properties and go to C/C++ -> General. Under Additional Include Directories you need to add the include paths for brisk, Python and boost. Again, semicolons for separators, and again here's a screenshot:
I suspect that you might need to update these settings to include the opencv2 & agast libraries as well but I'll leave that as a task for you to figure out - it should be much the same process.
Wrapping existing c++ classes with boost::python.
Now comes the slightly trickier bit - actually writing C++ to wrap your brisk library in boost::python. You can find a tutorial for this here, but I'll try and go over it a bit as well.
This will be taking place in the main.cpp file you created earlier. First, add the relevant include statements you'll need at the top of the file:
#include <brisk/brisk.h>
#include <Python.h>
#include <boost/python.hpp>
Next, you'll need to declare your python module. I'm assuming you'd want this to be called brisk, so you do something like this:
BOOST_PYTHON_MODULE(brisk)
{
}
This should tell boost::python to create a python module named brisk.
Next it's just a case of going through all the classes & structs that you want to wrap and declaring boost::python classes for them. The declarations of the classes should all be contained in brisk.h. You should only wrap the public members of a class, not any protected or private members. As a quick example, I've done a couple of the structs here:
BOOST_PYTHON_MODULE(brisk)
{
    using namespace boost::python;

    class_< cv::BriskPatternPoint >( "BriskPatternPoint" )
        .def_readwrite("x", &cv::BriskPatternPoint::x)
        .def_readwrite("y", &cv::BriskPatternPoint::y)
        .def_readwrite("sigma", &cv::BriskPatternPoint::sigma);

    class_< cv::BriskScaleSpace >( "BriskScaleSpace", init< uint8_t >() )
        .def( "constructPyramid", &cv::BriskScaleSpace::constructPyramid );
}
Here I have wrapped the cv::BriskPatternPoint structure and the cv::BriskScaleSpace class. Some quick explanations:
class_< cv::BriskPatternPoint >( "BriskPatternPoint" ) tells boost::python to declare a class, using the cv::BriskPatternPoint C++ class, and expose it as BriskPatternPoint in python.
.def_readwrite("y", &cv::BriskPatternPoint::y) adds a readable & writeable property to the BriskPatternPoint class. The property is named y, and will map to the BriskPatternPoint::y c++ field.
class_< cv::BriskScaleSpace >( "BriskScaleSpace", init< uint8_t >() ) declares another class, this time BriskScaleSpace, but also provides a constructor that accepts a uint8_t (an unsigned byte, which should just map to an integer in Python, but I'd be careful not to pass in one greater than 255 - I don't know what would happen in that situation)
The following .def line just declares a function - boost::python should (I think) be able to determine the argument types of functions automatically, so you don't need to provide them.
It's probably worth noting that I haven't actually compiled any of these examples - they might well not work at all.
Anyway, to get this fully working in python it should just be a case of doing similar for every structure, class, property & function that you want accessible from python - which is potentially quite a time consuming task!
If you want to see another example of this in action, I did this here to wrap up this class
Building & using the extension
Visual studio should take care of building the extension - then using it is just a case of taking the .DLL and renaming it to .pyd (you can get VS to do this for you, but I'll leave that up to you).
Then you just need to copy your .pyd file to somewhere on your Python path (site-packages for example), import it and use it!
import brisk
patternPoint = brisk.BriskPatternPoint()
....
Anyway, I have spent a good hour or so writing this out - so I'm going to stop here. Apologies if I've left anything out or if anything isn't clear, but I'm doing this mostly from memory. Hopefully it's been of some help to you. If you need anything clarified please just leave a comment, or ask another question.
In case someone needs it, this is what I have so far: basically a BriskFeatureDetector that can be created in Python and then have detect called. Most of this is just confirming/copying what obmarg showed me, but I have added the details that get all the way to the .pyd library.
The detect method is still incomplete for me though since it does not convert data types. Anyone who knows a good way to improve this, please do! I did find, for example, this library which seems to convert a numpy ndarray to a cv::Mat, but I don't have the time to figure out how to integrate it now. There are also other data types that need to be converted.
Install OpenCV 2.2
for the setup below, I installed to C:\opencv2.2
Something about the API or implementation has changed by version 2.4 that gave me problems (maybe the new Algorithm object?) so I stuck with 2.2 which BRISK was developed with.
Install Boost with Boost Python
for the setup below, I installed to C:\boost\boost_1_47
Create a Visual Studio 10 Project:
new project --> win32
for the setup below, I named it brisk
next --> DLL application type; empty project --> finished
at the top, change from Debug Win32 to Release Win32
Create main.cpp in Source Files
Do this before the project settings so the C++ options become available in the project settings
#include <boost/python.hpp>
#include <opencv2/opencv.hpp>
#include <brisk/brisk.h>

BOOST_PYTHON_MODULE(brisk)
{
    using namespace boost::python;

    // this long mess is the only way I could get the overloaded signatures to be accepted
    void (cv::BriskFeatureDetector::*detect_1)(const cv::Mat&,
            std::vector<cv::KeyPoint, std::allocator<cv::KeyPoint> >&,
            const cv::Mat&) const
        = &cv::BriskFeatureDetector::detect;

    void (cv::BriskFeatureDetector::*detect_vector)(const std::vector<cv::Mat, std::allocator<cv::Mat> >&,
            std::vector< std::vector< cv::KeyPoint, std::allocator<cv::KeyPoint> >, std::allocator< std::vector<cv::KeyPoint, std::allocator<cv::KeyPoint> > > >&,
            const std::vector<cv::Mat, std::allocator<cv::Mat> >&) const
        = &cv::BriskFeatureDetector::detect;

    class_< cv::BriskFeatureDetector >( "BriskFeatureDetector", init<int, int>() )
        .def( "detect", detect_1 )
        ;
}
Project Settings (right-click on the project --> properties):
Includes / Headers
Configuration Properties --> C/C++ --> General
add to Additional Include Directories (adjust to your own python / brisk / etc. base paths):
C:\opencv2.2\include;
C:\boost\boost_1_47;
C:\brisk\include;C:\brisk\thirdparty\agast\include;
C:\python27\include;
Libraries (linker)
Configuration Properties --> Linker --> General
add to Additional Library Directories (adjust to your own python / brisk / etc. base paths):
C:\opencv2.2\lib;
C:\boost\boost_1_47\lib;
C:\brisk\win32\lib;
C:\python27\Libs;
Configuration Properties --> Linker --> Input
add to Additional Dependencies (adjust to your own python / brisk / etc. base paths):
opencv_imgproc220.lib;opencv_core220.lib;opencv_features2d220.lib;
agast_static.lib; brisk_static.lib;
python27.lib;
.pyd output instead of .dll
Configuration Properties --> General
change Target Extension to .pyd
Build and rename if necessary
Right-click on the solution and build/rebuild
you may need to rename the output from "Brisk.pyd" to "brisk.pyd" or else python will give you errors about not being able to load the DLL
Make brisk.pyd available to python by putting it in site packages or by putting a .pth file that links to its path
Update Path environment variable
In windows settings, make sure the following are included in your path (again, adjust to your paths):
`C:\boost\boost_1_47\lib;C:\brisk\win32\bin`

use dyn.load or library.dynam to call a C++ function in an R package

I am very new to R. I would like to build an R package which will call a C++ function using .Call().
I have a NAMESPACE file, with
useDynLib(mypkg)
where mypkg is also the name of the C++ function in my code.
It works if I use this line at the beginning of the mypkg.R file:
dyn.load("src/mypkg.so")
but I want to use library.dynam instead, so in the zzz.R file, I put
.onLoad <- function(libname, pkgname)
{
    library.dynam("mypkg", pkgname, libname)
}
It gives the error when checking the package:
...
Error in .Call("mypkg", PACKAGE = "mypkg") :
C symbol name "mypkg" not in DLL for package "mypkg".
Error : unable to load R code in package 'mypkg'
...
It looks like the *.so file is generated in the wrong place? Why is there no /libs folder generated?
I would like to build the package to be OS independent; is there a way to do it with dyn.load?
And this may be a very silly question, where did pkgname and libname get their input from?
Thank you very much for your help.
You could look at one of the many existing packages (with compiled source code) on CRAN.
Smaller and simpler is easier to grok, so you could e.g. look at digest which uses a NAMESPACE to load the one shared library built from the handful of C source files, and uses .Call() to access the main entry point.
And then there is of course the manual...
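As a minimal sketch of that pattern (the routine and function names below are hypothetical, not taken from any particular package):
# NAMESPACE
useDynLib(mypkg, C_do_work)   # register a native routine, not the package name
export(doWork)
# R/doWork.R
doWork <- function(x) {
    .Call(C_do_work, x)       # call the registered symbol directly; no PACKAGE= needed
}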