Establish gtest version - c++

How do I know which version of gtest is being used in the project I'm working with? I'm working on a Linux platform.

The source code of the libgtest or libgtest_main libraries doesn't contain any special function that would let you detect their version (something like GetGTestVersion() or similar).
The header files don't define any version identifier either (something like GTEST_VERSION or similar).
So you can't check the version of the Google C++ Testing Framework at runtime from user code.
But the maintainers provide, as part of the framework, the special script scripts/gtest-config, which:
...
provides access to the necessary compile and linking
flags to connect with Google C++ Testing Framework, both in a build prior to
installation, and on the system proper after installation.
...
Among other things, this script has several options related to the version:
...
Installation Queries:
...
--version the version of the Google Test installation
Version Queries:
--min-version=VERSION return 0 if the version is at least VERSION
--exact-version=VERSION return 0 if the version is exactly VERSION
--max-version=VERSION return 0 if the version is at most VERSION
...
The script also contains a usage example:
Examples:
gtest-config --min-version=1.0 || echo "Insufficient Google Test version."
...
This means that a user can test the version of the framework at build time using the gtest-config script.
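A sketch of how this can look in practice, assuming gtest-config is installed and on the PATH (the source file names are placeholders):
$ gtest-config --version                           # prints the installed version, e.g. 1.7.0
$ g++ $(gtest-config --cppflags --cxxflags) -c foo_test.cc
$ g++ $(gtest-config --ldflags --libs) -o foo_test foo_test.o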
Note:
The gtest-config script obtains the actual version of the framework during configuration, through variables declared in configure.ac.
...
AC_INIT([Google C++ Testing Framework],
[1.7.0],
[googletestframework@googlegroups.com],
[gtest])
...
After calling autoconf, the following identifiers inside the configure file are populated:
...
# Identity of this package.
PACKAGE_NAME='Google C++ Testing Framework'
PACKAGE_TARNAME='gtest'
PACKAGE_VERSION='1.7.0'
PACKAGE_STRING='Google C++ Testing Framework 1.7.0'
PACKAGE_BUGREPORT='googletestframework@googlegroups.com'
PACKAGE_URL=''
...
# Define the identity of the package.
PACKAGE='gtest'
VERSION='1.7.0'
...
As long as the framework is configured with the AC_CONFIG_HEADERS option, these identifiers are stored in the file build-aux/config.h and are available to the user at compile time.
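So, assuming the framework was configured in-tree with ./configure, a quick way to read those identifiers back is (a sketch; the exact header path depends on the build setup):
$ grep -E '^#define (PACKAGE_VERSION|VERSION)' build-aux/config.h
#define PACKAGE_VERSION "1.7.0"
#define VERSION "1.7.0"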

The file CHANGES, in the gtest home directory, contains a gtest version number.

If you have cloned the official repo, you can check the latest Git commit inside Google Test's directory (using, for example, git log -n 1 or git rev-parse HEAD) and compare it with the list of released versions.
In my case, the commit hash is ec44c6c1675c25b9827aacd08c02433cccde7780, which turns out to correspond to release-1.8.0.
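A sketch of that lookup from the shell, assuming the release tags are present in your clone (the path is a placeholder):
$ cd /path/to/googletest
$ git rev-parse HEAD
ec44c6c1675c25b9827aacd08c02433cccde7780
$ git describe --tags          # nearest release tag reachable from HEAD
release-1.8.0
$ git tag --points-at HEAD     # tags placed exactly on this commit, if any
release-1.8.0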


List of Jetty9 modules

Is there a list of available Jetty 9 modules somewhere?
Just a simple table "this is the name, this is what it does, and here are links" type.
I have searched the Eclipse site and used search engines for some time now, without any usable result. Is it really that much of a secret what jetty modules exist, and what they do?
Use the command line.
$ cd /path/to/mybase
$ java -jar /path/to/jetty-home/start.jar --list-modules
Some modules are dynamic/virtual (dependent on your environment).
Some are 3rd party (jsp, jolokia, gcloud, etc).
Of the remaining few, you have the module information itself.
E.g., rewrite covers the rewrite behaviors in the documentation, http is the HTTP server connector, etc.
Going from module to doc is a 1::n scenario, while going from doc to module is a 1::1 scenario.
If you want to know what they do, look at the module definition (i.e. ${jetty.home}/modules/${name}.mod; see the example at the end of this answer).
They might have properties (documented in module)
They might have libs (obvious in module)
They might have xml (see standard XML configuration behaviors in Jetty doc)
They might have a non-Eclipse license (documented in module)
They might have a dependent module (documented in module)
The result of enabling a module is simply a command line along the lines of --module=http.
The combination of enabled modules (via the combination of ini files) is a longer command line + server classpath + xml load order.
You can see this via ...
$ cd /path/to/mybase
$ java -jar /path/to/jetty-home/start.jar --list-config
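For example, a quick way to inspect what a single module declares (a sketch; the path and the module name are placeholders):
$ cd /path/to/jetty-home
$ cat modules/http.mod      # look at the [depend], [lib], [xml] and [ini-template] sections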

julia: rerun unittests upon changes to files

Are there Julia libraries that can run unit tests automatically when I make changes to the code?
In Python there is the pytest-xdist library, which can rerun unit tests when you make changes to the code. Does Julia have a similar library?
A simple solution could be made using the standard library module FileWatching; specifically FileWatching.watch_file. Despite the name, it can be used with directories as well. When something happens to the directory (e.g., you save a new version of a file in it), it returns an object with a field, changed, which is true if the directory has changed. You could of course combine this with Glob to instead watch a set of source files.
You could have a separate Julia process running, with the project's environment active, and use something like:
julia> import Pkg; import FileWatching: watch_file

julia> while true
           # block until something in src/ changes, then rerun the package tests
           event = watch_file("src")
           if event.changed
               try
                   Pkg.pkg"test"
               catch err
                   @warn("Error during testing:\n$err")
               end
           end
       end
More sophisticated implementations are possible; with the above you would need to interrupt the loop with Ctrl-C to break out. But this does work for me and happily reruns tests whenever I save a file.
If you use a GitHub repository, there are ways to set up Travis or AppVeyor to do this. This is the testing method used by many of the registered Julia packages. You will need to write the unit test suite (with using Test) and place it in a /test subdirectory of the GitHub repository. You can search for Julia and those web services for details.
Use a standard GNU Makefile and call it from various places depending on your use case:
Your .juliarc, if you want the tests run on startup.
Cron, if you want them checked regularly.
Your module's __init__ function, if you want a check every time the module is loaded.
Since GNU makefiles detect changes automatically, calls to make will be silently ignored in the absence of changes.
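A minimal sketch of such a Makefile, viewed from the shell (the layout is hypothetical: sources in src/, tests in test/; recipe lines must be indented with a tab):
$ cat Makefile
tests-passed: $(wildcard src/*.jl) $(wildcard test/*.jl)
	julia --project=. -e 'using Pkg; Pkg.test()'
	touch $@

$ make    # reruns Pkg.test() only when a .jl file changed; a failing run leaves the stamp file untouched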

API stable way to automatically link to either MySQL or MariaDB in Debian for backward compatibility

I upgraded from Debian Jessie to Debian Stretch, and now found out that MariaDB has replaced MySQL, which is fine.
Luckily, on C++, the MariaDB client is still accessible with
#include <mysql/mysql.h>
However, the linking is different. I used to link with -lmysqlclient, and now I have to link to -lmariadbclient.
My program has to work on both. So my question is: Is there a way to check whether MySQL is available, and if not, link to MariaDB?
I'm using qmake and cmake in the relevant projects. Please advise.
For CMake you could simply use:
find_library( MYSQL_LIBRARY
              NAMES "mysqlclient" "mysqlclient_r"
              PATHS "/lib/mysql"
                    "/lib64/mysql"
                    "/usr/lib/mysql"
                    "/usr/lib64/mysql"
                    "/usr/local/lib/mysql"
                    "/usr/local/lib64/mysql"
                    "/usr/mysql/lib/mysql"
                    "/usr/mysql/lib64/mysql" )
And then check it with:
if(MYSQL_LIBRARY)
    ...
endif()
Similar examples can be found on GitHub: FindMYSQL (RenatoUtsch) or FindMySQL (mloskot).
For qmake, the only thing I found is to check for typical locations, like this:
!exists("/foo/bar/baz.so"):!exists("/hello/world/baz.so"):...: message("...")
The CMake module FindMariaDB is included with MariaDB. FindMySQL seems to be external to MySQL; there are lots of search results in GitHub repos for it. Search for both (without REQUIRED, so configuration does not fail when one of them is missing). Then, based on MariaDB_FOUND, set a variable to the value of MariaDB_LIBRARIES or MySql_LIBRARIES, and use that variable in subsequent target_link_libraries() calls.
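Independently of the build system, a quick way to see from a shell which client library the distribution actually provides (a sketch; the *_config helpers are only present when the corresponding -dev package is installed):
$ ldconfig -p | grep -E 'libmysqlclient|libmariadb'
$ mysql_config --libs
$ mariadb_config --libs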

Compile/Build Gluon Charm-Down itself

I would like to add some features to Gluon Charm Down that I am currently missing.
Unfortunately, there is no documentation about how to do that.
All the steps I describe here are done on my development computer, where I also develop a test app using Gluon Mobile (incl. Charm Down). I have no problem compiling/deploying this app for iOS, Android, or Desktop/Windows Surface.
My development environment is:
Windows 10 x64, Intel i7, 32 GB RAM, Java 8u121 (some others too), installed Android SDK.
For the iPhone I also have a MacBook here (which I do not use for development, only for compiling/deployment/tests).
In order to be able to make my add-ons to Charm Down, I checked out the source from Bitbucket via Mercurial.
hg clone https://bitbucket.org/gluon-oss/charm-down
Then I changed the working directory to the checked-out root (containing build.gradle, gradle.properties, etc.) and called
gradlew clean install
After a short while I am informed that ANDROID_HOME is not set. Well, it is set, but as a Windows environment variable. To make it visible to Gradle, I added it to gradle.properties (ANDROID_HOME=C:/.....).
The directory I gave is the one containing the subdirectories add-ons, build-tools, etc.
No more complaints from Gradle about the missing ANDROID_HOME, but now I get compile errors for missing Android classes:
C:\projects\Gluon-Charm\charm-down\plugins\plugin-lifecycle\android\src\main\java\com\gluonhq\charm\down\plugins\android\AndroidLifecycleService.java:30: error: package android.app does not exist
import android.app.Activity;
^
C:\projects\Gluon-Charm\charm-down\plugins\plugin-lifecycle\android\src\main\java\com\gluonhq\charm\down\plugins\android\AndroidLifecycleService.java:31: error: package android.app does not exist
import android.app.Application;
^
C:\projects\Gluon-Charm\charm-down\plugins\plugin-lifecycle\android\src\main\java\com\gluonhq\charm\down\plugins\android\AndroidLifecycleService.java:32: error: package android.os does not exist
import android.os.Bundle;
.... many more
What am I missing?
If you have a look at the core/android module's build.gradle file, there is a dependency on the android.jar:
dependencies {
    compile project(":core")
    compile files("$ANDROID_HOME/platforms/android-$androidPlatformVersion/android.jar")
    compile "org.javafxports:jfxdvk:$javafxportsVersion"
}
You have already defined your ANDROID_HOME path, but there is another variable: $androidPlatformVersion.
This one is defined in the gradle.properties file that you will find in the root of the Charm Down project, with these two properties set:
androidPlatformVersion=24
javafxportsVersion=8.60.8
You'll need to install the Android SDK platform 24 so that the dependency is resolved and android.jar is found.
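A sketch of how to check for and install that platform with the SDK's command-line tools (assuming a recent SDK that ships sdkmanager; on Windows use sdkmanager.bat from tools\bin):
$ ls "$ANDROID_HOME/platforms"                                  # platforms already installed
$ "$ANDROID_HOME/tools/bin/sdkmanager" "platforms;android-24"
After that, gradlew clean install should get past the missing android.* imports.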

OpenLayers 3 Build from master

I've cloned the OpenLayers 3 repo and merged the latest from master. There exists a recently merged pull request that I'm interested in exploring, but I'm not sure how to create a regular old comprehensive, non-minified build.
Does anyone know how to create a non-minified, kitchen sink (everything included) build for OpenLayers?
(similar to ol-debug.js).
You can use the ol-debug.json config to concatenate all sources for the library without any minification.
node tasks/build.js config/ol-debug.json ol-debug.js
Where the ol-debug.json looks like this:
{
  "exports": ["*"],
  "umd": true
}
The build.js task generates builds of the library given a JSON config file. The custom build tutorial describes how this can be used to create minified profiles of the library. For a debug build, you can simply omit the compile member of the build config. This is described in the task readme:
If the compile object is not provided, the build task will generate a "debug" build of the library without any variable naming or other minification. This is suitable for development or debugging purposes, but should not be used in production.
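Putting it together, a typical session might look like this (a sketch; assuming Node.js is installed and your clone already contains the merged pull request):
$ cd /path/to/ol3          # your existing clone
$ npm install              # install the dev dependencies the build task needs
$ node tasks/build.js config/ol-debug.json ol-debug.js
$ ls -lh ol-debug.js       # full, non-minified build, similar to the released ol-debug.js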