Using closure library with jsTestDriver - unit-testing

I'm learning about Google Closure Tools by writing a simple JavaScript game. I'm having trouble figuring out how to set up jsTestDriver so that it works well with the Closure Library.
Specifically: I'd like to use the goog.require mechanism to include any additional JavaScript files rather than have to manually add them all to the config file.
Following meyertee's suggestion, I made a simple script to automatically write the dependencies to a config file:
#!/bin/bash
# Regenerate tests/jsTestDriver.conf: start from the template, then append the
# Closure dependency list (one " - ../<path>" entry per file) under the load: key.
cp tests/jsTestDriver.conf.proto tests/jsTestDriver.conf
libs/closure-library/closure/bin/build/closurebuilder.py --root="./libs/closure-library" --root="./js" --namespace="lds" | sed "s#^# - \.\./#" >> tests/jsTestDriver.conf
The tests/jsTestDriver.conf.proto file is a simple template:
test:
- "*.js"
load:
- ../libs/knockout-2.1.0.js
# Crucial, the load key needs to be last, and this comment must be followed by a newline.
It is a very fragile script, but hopefully someone (other than me) will find it useful.

You can do it semi-automatically by letting the Closure Compiler generate a manifest file, which lists all files in the correct dependency order. You can then transform that file to relative paths and paste them into the JsTestDriver config file. That's how I do it.
You could even write a script that does this transformation automatically.
This is the relevant compiler argument:
--output_manifest manifest.MF
There are some details on the Closure Compiler's Google Code wiki.
Edit:
There are also some Python scripts to help you calculate dependencies. You can use calcdeps.py or closurebuilder.py to generate a manifest file, which even includes files that haven't been 'required' by your code.

Since JsTestDriver does not follow the Closure Library convention of declaring dependencies with goog.provide() and goog.require(), your best option may be meyertee's solution.
However, the Closure Library includes its own testing framework. See:
Test Driven Development with the Closure Framework
Asserts API

Related

"Embedding" a folder into a C/C++ program

I have a script library stored in .../lib/ that I want to embed into my program. So far, that sounds simple: on Windows, I'd use Windows resource files; on macOS, I'd put them into a Resources folder and use the proper API to access the current bundle and its resources. On plain Linux, I am not too sure how to do it... but I want to be cross-platform anyway.
Now, I know that there are tools like IncBin (https://github.com/graphitemaster/incbin) and the like, but they are best suited to single files. What I have, however, might even require some kind of file system abstraction.
Here are the few guesses and estimates I have so far. I'd like to know if there is possibly a better solution - or others, in general.
Create a Zip file and use MiniZ to read its contents from a char array - basically, running the zip file through IncBin and passing it as a buffer to MiniZ to let me work on that (a rough sketch of this idea is shown below).
Use an abstracted FS layer like PhysicsFS or TTVFS and add the possibility to work off a Zip file or any other kind of archive.
Are there other solutions? Thanks!
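For illustration, a minimal sketch of the first option might look like the following (assuming IncBin's INCBIN macro and MiniZ's mz_zip_reader_* API; the archive name lib.zip and the script path lib/init.lua are placeholders):
// Sketch only: embed lib.zip with IncBin, then read it in memory with MiniZ.
#include <cstdio>
#include <cstring>
#include "incbin.h"   // https://github.com/graphitemaster/incbin
#include "miniz.h"

// Exposes gLibZipData / gLibZipSize for the embedded archive.
INCBIN(LibZip, "lib.zip");

int main() {
    mz_zip_archive zip;
    std::memset(&zip, 0, sizeof(zip));

    // Open the embedded archive directly from the in-memory buffer.
    if (!mz_zip_reader_init_mem(&zip, gLibZipData, gLibZipSize, 0)) {
        std::fprintf(stderr, "embedded archive is not a valid zip\n");
        return 1;
    }

    // Extract one script from the embedded "file system" (hypothetical path).
    size_t size = 0;
    void *script = mz_zip_reader_extract_file_to_heap(&zip, "lib/init.lua", &size, 0);
    if (script) {
        std::fwrite(script, 1, size, stdout);
        mz_free(script);
    }

    mz_zip_reader_end(&zip);
    return 0;
}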
I had this same issue, and I solved it by locating the library relative to argv[0]. But that only works if you invoke the program by its absolute path -- i.e., not via $PATH in the shell. So I invoke my program by a one-line script in ~/bin, or any other directory that's in your search path:
exec /wherever/bin/program "$@"
When the program is run, argv[0] is set to "/wherever/bin/program", and it knows to look in "/wherever/lib" for the related scripts.
Of course if you're installing directly into standard locations, you can rely on the standard directory structure, such as /usr/local/bin/program for the executable and /etc/program for related scripts & config files. The technique above is just when you want to be able to install a whole bundle in an arbitrary place.
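A minimal sketch of that argv[0] lookup (hypothetical directory layout; no symlink resolution, which a real program might add via realpath()):
#include <cstdio>
#include <string>

int main(int argc, char **argv) {
    (void)argc;
    std::string self = argv[0];                       // e.g. "/wherever/bin/program"
    std::string::size_type slash = self.rfind('/');
    std::string bindir = (slash == std::string::npos) ? "." : self.substr(0, slash);

    // Related scripts are expected to live in ../lib relative to the executable.
    std::string libdir = bindir + "/../lib";          // resolves to "/wherever/lib"
    std::printf("loading scripts from %s\n", libdir.c_str());
    return 0;
}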
EDIT: If you don't want the one-line shell script, you can also say:
alias program=/wherever/bin/program

Guidelines for including TMB c++ code in an R package

I've recently discovered the wonders of TMB and I'm working on a package which would ideally include TMB C++ templates for rather computationally expensive models.
I'm assuming that there's a possibility of:
Automatically compiling the TMB source code on package install
but I can't find any clear guidelines in the TMB documentation regarding this. As of now, my alternative is to write functions that compile the TMB code upon the first call of a function which uses an uncompiled class... but I have a feeling there are nicer ways to do this.
Has anyone successfully included TMB functions within another package and could point me in the direction of relevant documentation or examples?
With a bit more searching, I finally found my answer in this thread. I guess I missed it because the resolutions it details were moved to the wiki page titled Development, where the content is specifically targeted at users wishing to contribute to the development of TMB, whereas I just want to distribute code which incorporates TMB.
To summarize, the thread suggests some changes which I adopted like this (myPkg should be the name of your package):
src/
Place your .cpp template in myPkg/src. It will then be automatically compiled by R when you build your package.
DESCRIPTION
Add these lines to your DESCRIPTION file so R has all the tools necessary to compile the model template:
Depends: TMB, RcppEigen
LinkingTo: TMB, RcppEigen
R/roxygentags.r
Now we need to add our TMB template to the NAMESPACE file. We can do this easily through roxygen by making a dummy file like so:
#' Roxygen commands
#'
#' @useDynLib myPkg
#'
dummy <- function(){
  return(NULL)
}
The dummy function is just an excuse to have the tag @useDynLib myPkg somewhere in my source code where I won't mess with it. This tag will populate your NAMESPACE with useDynLib(myPkg)... and, as I understand it, this loads the shared library for you when the package is loaded.
Calling the function in your package:
Finally, when calling MakeADFun, set DLL="myPkg". With this setup, you can compile only a single TMB model into your package. This is because everything compiled in your ./src/ folder is automatically renamed according to your package name, so you cannot create uniquely named models.
EDIT: Solution for distributing multiple DLLs
After some more searching (same thread as referenced above)... I realized that the solution described in the official wiki (and detailed above) is only relevant for distributing a single DLL (i.e. a single TMB model).
If you want to distribute multiple TMB models in a package, you'll have to use your own makefile. I've given a more detailed description in my blog, so I'll only briefly describe the steps here with regard to how they differ from the previous steps I described.
src/Makefile
You'll have to define your own Makefile (or Makefile.win for Windows users) and drop it in your src/ directory. Here's an example that works for me:
all: template1.so template2.so
# Comment here preserves the prior tab
template1.so: template1.cpp
	Rscript --vanilla -e "TMB::compile('template1.cpp','-O0 -g')"
template2.so: template2.cpp
	Rscript --vanilla -e "TMB::compile('template2.cpp','-O0 -g')"
clean:
	rm -rf *o
For Windows, replace so with dll and use the relevant compiler flags (for debugging). See ?TMB::compile for info regarding compiler flags for debugging.
R/roxygentags.r
This is slightly different than above:
#' Roxygen commands
#'
#' This is a dummy function whose purpose is to hold the useDynLib roxygen tag.
#' This tag will populate the namespace with compiled c++ functions upon package install.
#'
#' @useDynLib template1
#' @useDynLib template2
#'
dummy <- function(){
  return(NULL)
}
Using your models in the package
Finally, the above changes will compile multiple uniquely named TMB templates and load them into the namespace. To call these models in your package, here's an example:
obj <- MakeADFun(data = data,
                 parameters = params,
                 DLL = "template1",
                 inner.control = list(maxit = 10000),
                 silent = FALSE)
Tips...
I had issues when I tried compiling this on a Windows machine... it turned out to be related to not properly cleaning the src folder: I had old Linux-compiled files stuck in there. If you have compilation issues, it's worth manually cleaning out the residual files in your src/ directory from previous builds... or perhaps someone can give some good advice on writing a better makefile!
If you want access to the CppAD library with the additional code from TMB (which is quite substantial!) then you can use the WITH_LIBTMB macro variable as I do in this header here. This will allow you to have multiple .cpp files which you can compile separately. Importantly, you only need to compile the code from the TMB header once using a file like this which #includes the TMB.hpp header without defining WITH_LIBTMB.
This reduces the compilation time substantially, as you can compile each .cpp on its own without all the code declared in TMB.hpp. Moreover, you can also use the code with Rcpp if you undefine and define a few macros as I do in the link.
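As a rough illustration of that layout (the file names here are hypothetical; the only point is which translation units define WITH_LIBTMB before including TMB.hpp):
// --- tmb_lib.cpp (hypothetical name): the single translation unit that
// --- compiles the full TMB/CppAD definitions; WITH_LIBTMB is NOT defined here.
#include <TMB.hpp>

// --- any_other_file.cpp: every other .cpp defines WITH_LIBTMB before the
// --- include, so TMB.hpp only provides declarations and compiles much faster.
#define WITH_LIBTMB
#include <TMB.hpp>
// ... code using the TMB / CppAD declarations ...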
You can also have one file which can be used by TMB::MakeADFun. It requires a bit of manual work, but it can be done whilst also using Rcpp: run Rcpp::compileAttributes, rename the created file RcppExports.cpp to init.cpp, and then include these additional lines in the CallEntries array and R_init_survTMB function:
CallEntries array.
R_init_survTMB function.
Note on using RStudio
RStudio calls Rcpp::compileAttributes (or something similar) each time you build, hence you cannot use this approach directly. One way around this is to create a custom build script similar to the one here. It essentially calls R CMD INSTALL after having removed the RcppExports.cpp file created by Rcpp::compileAttributes. I also like to run the tests by calling devtools::test(), but you can remove this if you like.

NMake Optional Dependencies

We're currently upgrading our archaic build system from a bunch of batch scripts to a makefile system using NMake. It's challenging, as we use a custom intermediate language that ends up getting translated to C++, where some of our translators can generate tens of files that have common parts in their names. The other challenging thing is that we use a bunch of CSV files to configure our interfaces, and these files get passed through to our configuration tools, which generate more source code files. Right now I am focusing on creating the simple rules for our configuration files, but I can't seem to figure out a way to associate a dependency with a rule only if the dependency exists. I tried to use $(wildcard xxx.csv) but found out that this function doesn't exist in NMake like it does in GNU Make.
So how can I create my rule so that it executes and runs my commands when I have two dependency CSV files that will always exist and a third CSV file that will exist only when my project calls for it?
[..] will exist only when my project calls for it?
This is a bit unclear. Assuming that there is a command that - depending on some external circumstances - might generate that third csv file, you could use a "stamp file" (I think they call it "pseudo target" in NMAKE):
stamp:
	command_that_might_generate_csv3
	touch stamp # updates timestamp of "stamp" (or creates it)

target: csv1 csv2 stamp
	command_using_all_of csv1 csv2 csv3

What's the purpose of _tags file with OCaml, and how to interpret the contents?

From the Building OCaml code that uses list comprehension post, I know I can use a _tags file to run ocamlbuild with fewer build options.
$ cat _tags
<**/*> : package(camlp4),syntax(camlp4o),package(pa_comprehension)
From the Batteries introduction, I also need to have a _tags file to use the Batteries package.
<*>: package(batteries)
Why does the first example use <**/*> while the second example uses <*>? In general, what's the purpose of the _tags file in ocamlbuild? Do you have some good tutorials on it?
A _tags file, in addition to myocamlbuild.ml, forms the root of the ocamlbuild compilation system.
ocamlbuild is a very generic tool that can compile anything. It is driven by a solver that, given a target set by the user, finds a solution that satisfies that target. The solution is a chain of rules that are applied to files. Files can be tagged, and tags can alter rules. For example, a tag can add flags, like enabling profiling or linking to a library.
A _tags file provides a mechanism to assign tags to files and it has a simple grammar:
pattern ":" tag {"," tag}
What is to the left of the : is actually a pattern or regular expression. Each file that matches the expression will be assigned all the tags that occur to the right of the :.
<**/*> means all files in this folder and all subfolders; there is a shortcut for this: true. <*> means all files in this folder (without descending into subfolders). Other examples are <*.ml>, <*.cmx> or <**/*.cma> (by the way, or can also be used to build a pattern).
OCamlbuild is documented in the OCaml Manual; there is also a dump of the old wiki, with lots of information.
But the fun part is that usually you do not need to know all this in order to use OCaml. There is the OASIS tool that will automate all these tasks and create the _tags file for you from a simple, high-level definition.

Using ocamldoc with packs

I have an ocamlbuild project which includes some files in a subdirectory with an .mlpack file listing them.
e.g. I have a file support/logging.ml which defines the module Support.Logging. The _tags file says "support": for-pack(Support).
This all builds and runs fine. But how can I generate docs for this using ocamldoc?
The most recent post I found was ocamldoc generation and packed files from 2011, which suggests using ocp-pack to generate one large .ml file and passing that to ocamldoc. However, that doesn't take the build order into account, so the generated module doesn't work due to forward references.
What's the best way to handle this?
The problem is described in the following bug report. Handling -pack inside ocamldoc requires an implementation effort that the maintainer is not motivated to perform, and so far nobody has stepped up to contribute a patch for this feature.
In the meantime, you can easily copy your foo.mlpack file into a foo.odocl, generating documentation for the separate submodules. That's only an imperfect workaround, as the doc will talk about X rather than Foo.X, but it's a least-effort solution.
Here's the solution I'm now using in my Makefile. It does work, and cross-references into the Support module work:
doc:
	ocp-pack -o support.ml.tmp support/logging.ml support/common.ml support/utils.ml support/basedir.ml support/qdom.ml support/system.ml
	echo '(** General support code; not 0install-specific *)' > support.ml
	cat support.ml.tmp >> support.ml
	rm support.ml.tmp
	$(OCAMLBUILD) 0install.docdir/index.html
	rm support.ml
It's hacky because:
You have to list the support.ml files in build order, by hand
The Makefile adds the doc comments for Support (otherwise, it takes the description of the first sub-module, which you don't want)