I have been working my way through Dirk Eddelbuettel's Rcpp tutorial here:
http://www.rinfinance.com/agenda/
I have learned how to save a C++ file in a directory, then call and run it from within R. The C++ file I am running is called 'logabs2.cpp' and its contents are taken directly from one of Dirk's slides:
#include <Rcpp.h>
using namespace Rcpp;
inline double f(double x) { return ::log(::fabs(x)); }
// [[Rcpp::export]]
std::vector<double> logabs2(std::vector<double> x) {
std::transform(x.begin(), x.end(), x.begin(), f);
return x;
}
I run it with this R code:
library(Rcpp)
sourceCpp("c:/users/mmiller21/simple r programs/logabs2.cpp")
logabs2(seq(-5, 5, by=2))
# [1] 1.609438 1.098612 0.000000 0.000000 1.098612 1.609438
I am running the code on a Windows 7 machine from within the R GUI that seems to install by default. I also installed the most recent version of Rtools. The above R code seems to take a relatively long time to run. I suspect most of that time is devoted to compiling the C++ code and that once the C++ code is compiled it runs very quickly. Microbenchmark certainly suggests that Rcpp reduces computation time.
I have never used C++ until now, but I know that when I compile C code I get an *.exe file. I have searched my hard drive for a file called logabs2.exe but cannot find one. I am wondering whether the above C++ code might run even faster if a logabs2.exe file were created. Is it possible to create a logabs2.exe file, store it in a folder somewhere, and have Rcpp call that file whenever I want to use it? I do not know whether that makes sense. If I could store a C++ function in an *.exe file then perhaps I would not have to compile the function every time I wanted to use it with Rcpp, and perhaps the Rcpp code would be even faster.
Sorry if this question does not make sense or is a duplicate. If it is possible to store the C++ function as an *.exe file I am hoping someone will show me how to modify my R code above to run it. Thank you for any help with this or for setting me straight on why what I suggest is not possible or recommended.
I look forward to seeing Dirk's new book.
Thank you to user1981275, Dirk Eddelbuettel and Romain Francois for their responses. Below is how I compiled a C++ file and created a *.dll, then called and used that *.dll file inside R.
Step 1. I created a new folder called 'c:\users\mmiller21\myrpackages' and pasted the file 'logabs2.cpp' into that new folder. The file 'logabs2.cpp' was created as described in my original post.
Step 2. Inside the new folder I created a new R package called 'logabs2' using an R file I wrote called 'new package creation.r'. The contents of 'new package creation.r' are:
setwd('c:/users/mmiller21/myrpackages/')
library(Rcpp)
Rcpp.package.skeleton("logabs2", example_code = FALSE, cpp_files = c("logabs2.cpp"))
I found the above syntax for Rcpp.package.skeleton on one of Hadley Wickham's websites: https://github.com/hadley/devtools/wiki/Rcpp
Step 3. I installed the new R package "logabs2" in R using the following line in the DOS command window:
C:\Program Files\R\R-3.0.1\bin\x64>R CMD INSTALL -l c:\users\mmiller21\documents\r\win-library\3.0\ c:\users\mmiller21\myrpackages\logabs2
where:
the location of the rcmd.exe file is:
C:\Program Files\R\R-3.0.1\bin\x64>
the location of installed R packages on my computer is:
c:\users\mmiller21\documents\r\win-library\3.0\
and the location of my new R package prior to being installed is:
c:\users\mmiller21\myrpackages\
Syntax used in the DOS command window was found by trial and error and may not be ideal. At some point I pasted a copy of 'logabs2.cpp' in 'C:\Program Files\R\R-3.0.1\bin\x64>' but I do not think that mattered.
Step 4. After installing the new R package I ran it using an R file I named 'new package usage.r' in the 'c:/users/mmiller21/myrpackages/' folder (although I do not think the folder was important). The contents of 'new package usage.r' are:
library(logabs2)
logabs2(seq(-5, 5, by=2))
The output was:
# [1] 1.609438 1.098612 0.000000 0.000000 1.098612 1.609438
This file loaded the Rcpp package automatically, without my asking.
In this case base R was faster, assuming I did this correctly.
#> microbenchmark(logabs2(seq(-5, 5, by=2)), times = 100)
#Unit: microseconds
# expr min lq median uq max neval
# logabs2(seq(-5, 5, by = 2)) 43.086 44.453 50.6075 69.756 190.803 100
#> microbenchmark(log(abs(seq(-5, 5, by=2))), times=100)
#Unit: microseconds
# expr min lq median uq max neval
# log(abs(seq(-5, 5, by = 2))) 38.298 38.982 39.666 40.35 173.023 100
However, using the dll file was faster than calling the external cpp file:
system.time(
cppFunction("
NumericVector logabs(NumericVector x) {
return log(abs(x));
}
")
)
# user system elapsed
# 0.06 0.08 5.85
Although base R seems faster or as fast as the *.dll file in this case, I have no doubt that using the *.dll file with Rcpp will be faster than base R in most cases.
This was my first attempt creating an R package or using Rcpp and no doubt I did not use the most efficient methods. Also, I apologize for any typographic errors in this post.
EDIT
In a comment below I think Romain Francois suggested I modify the *.cpp file to the following:
#include <Rcpp.h>
using namespace Rcpp;
// [[Rcpp::export]]
NumericVector logabs(NumericVector x) {
return log(abs(x));
}
and recreate my R package, which I have now done. I then compared base R against my new package using the following code:
library(logabs)
logabs(seq(-5, 5, by=2))
log(abs(seq(-5, 5, by=2)))
library(microbenchmark)
microbenchmark(logabs(seq(-5, 5, by=2)), log(abs(seq(-5, 5, by=2))), times = 100000)
Base R is still a tiny bit faster or no different:
Unit: microseconds
expr min lq median uq max neval
logabs(seq(-5, 5, by = 2)) 42.401 45.137 46.505 69.073 39754.598 1e+05
log(abs(seq(-5, 5, by = 2))) 37.614 40.350 41.718 62.234 3422.133 1e+05
Perhaps this is because base R is already vectorized. I suspect with more complex functions base R will be much slower. Or perhaps I am still not using the most efficient approach, or perhaps I simply made an error somewhere.
You say
I have never used C++ until now, but I know that when I compile C code
I get an *.exe file
and that is true if and only if you build an executable. Here, we build dynamically loadable libraries, and those tend to have different extensions depending on the operating system: .dll on Windows, .so on Linux, .dylib on OS X.
So nothing is wrong here; you simply had the wrong assumption.
If you want to get some entity you can keep, what you are looking for is an R package. There are many resources online to learn how to make them (e.g. Hadley's slides).
In Rcpp we have Rcpp.package.skeleton(), which you might find useful.
So, the function is compiled once when the package is installed, and then you just use it.
Related
I have been struggling with installing an R package containing some clustering algorithms from Github using this command:
require("devtools")
install_github("bfatimah/OASW")
(https://github.com/bfatimah/OASW/)
In order for the package to work properly for what I am doing, I have to install some other packages, for example:
require("cluster")
require("mclust")
require("nnet")
I did install these packages and loaded them into R. However, when I tried to run this block of code:
n <- 100
K <- 2
dmat <- TwoGaussian(n)$data
dys <- dist(dmat)
initClustering <- init(dmat, K, distmethod = "euclidean")
osilClustering <- osilFix(dys, n, K, initClustering$lab_best)
plot(dmat, col = osilClustering$clus_lab, pch = 16, cex = 1.5)
It returned an error:
> osilClustering <- osilFix(dys, n, K, initClustering$lab_best)
Error in sil_lab_swap(K, n, clus_lab, alt_clus_lab, clus_size, disty, :
object '_oasw_sil_lab_swap' not found
Called from: sil_lab_swap(K, n, clus_lab, alt_clus_lab, clus_size, disty,
iter, dys_i, avg_dys_clus, silh, altsilh)
It turns out that there is a folder (/src) in the package containing C++ programs, and the R functions in the package only act as wrappers for the C++ functions in /src. None of the C++ programs seem to have been compiled.
I just do not know how to fix this problem as it is not my expertise at all. Is there any advice? Thanks a lot!
P.S.: It worked after restarting R and running the whole program again. It did not work previously because of a conflict.
I have looked extensively on the net, yet not found exactly what I want.
I have a big simulation program that outputs results in a MATLAB M-file (let's call it res.m) and I want to plot the results visually.
I want to start the simulation from C++ many times in a row and therefore want to automate the plotting of the results.
I have come up with two options:
Execute from C++ an Octave or MATLAB script that generates the graph.
-> Haven't found anyone who managed to do so
Use the Octave source files to read the res.m file and then plot the data with whatever C++ plotting tool.
-> Theoretically possible, but I get lost in those files
Is someone able to solve this? Or has a better, easier approach?
The answer is to execute the script through the terminal.
I didn't manage to run an Octave script from my C++ program directly, but there is a way around it, going through the terminal and an extra Octave file. I used this in my .cpp:
std::string str = "octave myProgr.m";
const char *command = str.c_str();
system(command);
myProgr.m is the script that plots the res.m file.
New to C++, I would like to make functions compiled in a DLL available in R.
So far, I managed to do that for a basic function taking integer as input and returning the same, by following these steps in VisualStudio, then using dyn.load from R to load the created dll.
However, my C++ functions will need to handle R data.frame objects, and I am not sure how to make that possible. I saw from the Rcpp gallery that Rcpp might include some kind of translations between R and c++ data types including data.frame objects, but I don't know if I can generate a dll using Rcpp that I can then include in R using dyn.load.
From this answer by Dirk Eddelbuettel, it seems possible to generate a "dynamic library" using Rcpp; however, I could not find any dll when I tried generating a package with a source .cpp file using Rcpp.package.skeleton(). The function I'd like to have a dll for is from the Rcpp gallery:
#include <Rcpp.h>
using namespace Rcpp;
// [[Rcpp::export]]
DataFrame modifyDataFrame(DataFrame df) {
// access the columns
IntegerVector a = df["a"];
CharacterVector b = df["b"];
// make some changes
a[2] = 42;
b[1] = "foo";
// return a new data frame
return DataFrame::create(_["a"]= a, _["b"]= b);
}
I tried to just paste that code into VisualStudio to try and generate that DLL, however I have the error "cannot find Rcpp.h" which I quite expected.
I then followed these steps in RStudio:
Create new project / Package
Include this cpp source file as a source file for this package
Include Rcpp and enter Rcpp.package.skeleton("mypackage"); so far, no DLL appears in the package folders.
Tried to build the package in RStudio by going to Build/Install and Restart, but then I get an error message "Building R Packages needs installation of additional build tools, do you want to install them?" However I already have RbuildTools 3.4 installed, and when I click "YES" in RStudio nothing happens.
PS: Happy to hear about other methods but here the DLL format should be used if possible. Any piece of info is greatly appreciated since I have basically no idea of how Rcpp or C++ work
You need to figure out why your setup is hosed. This is meant to be easy and it is easy. Watch:
R> Rcpp::cppFunction('DataFrame modDF(DataFrame df) { IntegerVector a = df["a"]; CharacterVector b = df["b"]; a[2] = 42; b[1] = "foo"; return DataFrame::create(Named("a")=a, Named("b")=b); } ')
R> df <- data.frame(a=1:3, b=letters[24:26])
R> df
a b
1 1 x
2 2 y
3 3 z
R> modDF(df)
a b
1 1 x
2 2 foo
3 42 z
R>
Now, I obviously don't recommend writing code this way, as one long one-liner. You are already on the right track setting up a package. But you need to sort out what is holding up your tools.
And as a PS the one-liner above with better indentation:
R> Rcpp::cppFunction('DataFrame modDF(DataFrame df) { \
IntegerVector a = df["a"]; \
CharacterVector b = df["b"]; \
a[2] = 42; \
b[1] = "foo"; \
return DataFrame::create(Named("a")=a, Named("b")=b); \
} ')
The following seems to work:
From R, I ran Rcpp.package.skeleton("dfcpp4", cpp_files="modifyDataFrame.cpp"). The second argument is required for the modifyDataFrame function to be available from the dll using dyn.load.
From the command line, I ran R CMD build dfcpp4.
Then I ran R CMD check dfcpp4 --no-manual from the command line.
The dll file is now present in the src-x64 folder.
I am now able to call this function using
dyn.load("dfcpp4/src-x64/dfcpp4.dll")
df <- data.frame(a = c(1, 2, 3),
b = c("x", "y", "z"))
.Call("_dfcpp4_modifyDataFrame", df)
a b
1 1 x
2 2 foo
3 42 z
What I don't get is why in this case .Call should be used instead of .C...
I created a MyMex.m and a MyMex.cpp. Inside the .m I compile the .cpp using mex. This should happen only if the .mex64 file does not exist. The .mex64 is compiled to a directory on the Matlab PATH. But Matlab will keep running the .m in an infinite loop unless I set the Matlab current working directory to the .mex64 directory. What am I missing?
MyMex.m:
function [dataOut] = MyMex(dataIn)
mexCppFile = 'MyMex.cpp';
mexCmd = 'mex MyMex.cpp;';
fprintf('\nFile %s not compiled yet, compiling it now...\n%s\n',mexCppFile,mexCmd);
fileFullPath = which(mexCppFile);
if size(fileFullPath,2) > 0 && exist(fileFullPath,'file')
[fileDir, fileName, ext] = fileparts(fileFullPath);
curDir = pwd;
cd(fileDir);
mex MyMex.cpp;
cd(curDir);
else
error('prog:input','Unable to find %s to compile it. Check if the file is in the current dir or in the Matlab PATH!',mexCppFile);
end
% Call C++ mex
[dataOut] = MyMex(dataIn)
end
Edit, to defend myself from the comments saying that I wrote an infinite loop:
Matlab was supposed to know that there is a compiled version of the function. I don't know how it does that, and my problem is related to it, since sometimes it finds the function and sometimes it doesn't.
Here is an established online mex sample that does the same "infinite" thing and works smoothly:
2D interpolation
His code in mirt2D_mexinterp.m:
% The function below compiles the mirt2D_mexinterp.cpp file if you haven't done it yet.
% It will be executed only once at the very first run.
function Output_images = mirt2D_mexinterp(Input_images, XI,YI)
pathtofile=which('mirt2D_mexinterp.cpp');
pathstr = fileparts(pathtofile);
mex(pathtofile,'-outdir',pathstr);
Output_images = mirt2D_mexinterp(Input_images, XI,YI);
end
Maybe the .m and the .mex64 need to be in the same folder.
It all comes down to Matlab's search path.
Mex-files are prioritized over m-files if they are on the same level in the path. And files in the current directory take precedence over files found elsewhere in the matlab search path.
So when you experience an infinite loop, it is clear that the m-file is located higher in the search path than the mex-file.
In essence, all is fine if the two files are in the same folder.
The following code of mine computes a confidence interval using the chi-squared quantile and probability functions from Boost.
I am trying to implement this function myself so as to avoid a dependency on Boost. Is there any resource where I can find such an implementation?
#include <boost/math/distributions/chi_squared.hpp>
#include <boost/cstdint.hpp>
using namespace std;
using boost::math::chi_squared;
using boost::math::quantile;
vector <double> ConfidenceInterval(double x) {
vector <double> ConfInts;
// x is an estimated value in which
// we want to derive the confidence interval.
chi_squared distl(2);
chi_squared distu((x+1)*2);
double alpha = 0.90;
double lower_limit = 0;
if (x != 0) {
chi_squared distl(x*2);
lower_limit = (quantile(distl,((1-alpha)/2)))/2;
}
double upper_limit = (quantile(distu,1-((1-alpha)/2)))/2;
ConfInts.push_back(lower_limit);
ConfInts.push_back(upper_limit);
return ConfInts;
}
If you're looking for source code you can copy/paste, here are some links:
AlgLib
Koders
YMMV...
Have a look at the Gnu Scientific library. Or look in Numerical Recipes. There's also a Java version in Apache Commons Math, which should be straightforward to translate.
I am trying to implement this function so as to avoid a dependency on Boost.
Another option is to reduce the Boost dependency rather than avoid it entirely. If you reduce the dependency, you might be able to use a Boost folder with, say, 200 or 300 source files rather than the entire 1+ GB of material. (Yes, 200 or 300 can be accurate; it's what I ended up with when copying out shared_ptr.)
To reduce the dependency, use bcp (boost copy) to copy out just the files needed for chi_squared.hpp. The bad news is that you usually have to build bcp from source because it's not distributed in the ZIPs or tarballs.
To find instructions on building bcp, see How to checkout latest stable Boost (not development or bleeding edge)?