Can OpenCV be compiled and used with WASI (WebAssembly System Interface)? - c++

WASI (WebAssembly System Interface) is intended to bring WebAssembly outside the browser.
I built a simple face recognition application based on the eigenfaces example in OpenCV 4.3.0 (see: https://docs.opencv.org/2.4/modules/contrib/doc/facerec/facerec_tutorial.html#eigenfaces-in-opencv) and got it working.
Recently I have wanted to build a WebAssembly (WASM)-based face recognition application with OpenCV. I searched for WASM + face recognition and found lots of Git repositories and examples that ship opencv_js.wasm and use it through a JavaScript binding.
My goal is to build a standalone *.wasm module rather than an html+js+wasm project, which is how I ran into WASI (WebAssembly System Interface). Several runtimes such as wasmtime and wasmer can run a standalone *.wasm module compiled from C/C++ with a WASI toolchain (wasicc, wasic++, etc., e.g. the wasienv project).
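For reference, the kind of workflow I have in mind looks roughly like this (a sketch, assuming the wasienv toolchain and the wasmtime runtime are installed; hello.cpp is just a placeholder to sanity-check the toolchain before attempting OpenCV):

    // hello.cpp - trivial program to sanity-check the WASI toolchain
    // build (wasienv):  wasic++ hello.cpp -O2 -o hello.wasm
    // run  (wasmtime):  wasmtime hello.wasm
    #include <cstdio>

    int main() {
        std::puts("hello from a standalone wasm module");
        return 0;
    }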
Does anyone have ideas or experience building a standalone face recognition/detection (or similar) project with WASI? I would really appreciate your reply!

Related

How to build and use the C++ API for TensorFlow on Windows

Has anyone been successful in building/using the C++ API for TensorFlow on Windows (within Visual Studio)? The tutorials I found online or on TensorFlow's website only show building from source for Python, or are outdated (3+ years old or for TensorFlow 1.x).
I see that there are docs for the functions of the built C++ API on TensorFlow's website, but there's no mention of where to get it or how to build it; the build-from-source section builds the Python package that can be installed using pip, not anything that can be used from C++.
I've looked at the C API, but that seems to be different from what is mentioned in the C++ API section of the docs, and I was unable to get the C API working within VS due to an error from mixing C++ and C.
If it helps, the end goal is to run inference with pre-trained models (built using the Python API) from C++.
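For reference, the sort of thing I am ultimately trying to write looks roughly like the sketch below, which loads a SavedModel through the C API from C++ (the model path and tag are placeholders, and I have not managed to build and link this successfully yet):

    // load_saved_model.cpp - sketch of loading a SavedModel via the TensorFlow C API from C++
    #include <cstdio>
    #include "tensorflow/c/c_api.h"

    int main() {
        TF_Status* status = TF_NewStatus();
        TF_Graph* graph = TF_NewGraph();
        TF_SessionOptions* opts = TF_NewSessionOptions();

        const char* tags[] = {"serve"};  // tag used when the model was exported
        TF_Session* session = TF_LoadSessionFromSavedModel(
            opts, nullptr, "path/to/saved_model", tags, 1, graph, nullptr, status);

        if (TF_GetCode(status) != TF_OK) {
            std::fprintf(stderr, "load failed: %s\n", TF_Message(status));
            return 1;
        }

        // ... set up TF_Output handles and call TF_SessionRun here ...

        TF_CloseSession(session, status);
        TF_DeleteSession(session, status);
        TF_DeleteSessionOptions(opts);
        TF_DeleteGraph(graph);
        TF_DeleteStatus(status);
        return 0;
    }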
Any help is appreciated.

How to build and use the Google TensorFlow C++ API on an ARM processor

This is a follow-on to "how-to-build-and-use-google-tensorflow-c-api": can anyone explain how to build a TensorFlow C++ program on an ARM processor? I'm thinking specifically of Nvidia's Jetson family of GPU devices. Nvidia has lots and lots of documentation for these, but it all seems to be for Python (like this), or for toy examples, and there is nothing for anyone who wants to write a C++ program using the full TensorFlow API (if one even exists) for their own machine learning models. I'd like to be able to build programs like this one, which does deep learning inference and is exactly what the Jetson is supposedly made for.
I've found websites that offer links to installers too, but they all seem to be for the x86 architecture instead of ARM.
I have the same question about Bazel. I gather from all the unsatisfactory documentation I've been looking at that Bazel is mandatory for anyone who wants to build TensorFlow programs using a GPU, but all of the installation instructions I can find are either incomplete or for a different architecture such as x86 (for example https://www.osetc.com/en/how-to-install-bazel-on-ubuntu-14-04-16-04-18-04-linux.html).
I'll add that any link or GitHub repository that dumps a load of code in my lap without making clear the prerequisites (since my little Jetson may not have the stuff installed that you assume) or the commands needed to actually build it (especially if it includes a project file for a compiler I've never heard of) isn't very much help.

How to program with the TensorFlow C++ API library on Windows using Bazel?

What I want to do
First of all, my goal is to use the TensorFlow C++ API as a library on Windows, as one part of my project, instead of building my project inside the TensorFlow source tree.
Background
I had previously achieved this by building TensorFlow with CMake. However, since TensorFlow 1.10, building with CMake has been deprecated and Bazel is recommended instead. But the official way to use the C++ API is to build your project inside the TensorFlow tree with Bazel, so that approach does not work for me.
What I have done
To use a newer version of TensorFlow, I have been trying to build TensorFlow with Bazel as a standalone library.
A maintainer noted that it is possible by substituting //tensorflow:libtensorflow_cc.so for //tensorflow/tools/pip_package:build_pip_package in the official tutorial. In practice I encountered some problems and solved them by reading this tutorial. Now I have successfully built libtensorflow_cc.so.
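For concreteness, the Bazel invocation I ended up using looks roughly like this (treat it as a sketch; the exact flags vary between TensorFlow versions):

    bazel build --config=opt //tensorflow:libtensorflow_cc.so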
What the problem is
However, I have no idea what should be done next to use the built result, and that is exactly my problem. There is no documentation, of course. I have only found some incomplete ideas on it, and I will list all of them here to give you more information:
Somebody has already successfully linked the built *.so and solved the problems he encountered.
There is a repo doing what I want to do on Ubuntu and Arch Linux. I contacted the maintainer and he told me that they have no plans to support Windows for now.
A related issue: Building a .dll on Windows.
A related issue: Packaged TensorFlow C++ library for bazel-independent use.
A related issue: Feature request: provide a means to configure, build, and install that includes cc.
A related question: How to build and use Google TensorFlow C++ API. The scope of that question is a little broader, without the 'using Bazel' and 'on Windows' restrictions.
A related pull request: C++ API
There must be others struggling with problems similar to mine. I hope this question can become a reservoir of ways to solve the problem.
It's over 2 years since this question was asked, and the news is not good: it seems there are not enough people with Windows skills in a position to provide the support needed to integrate TensorFlow into Windows applications using the familiar headers + library model. And TensorFlow advances week by week, meaning that the Windows support falls further behind.
In my assessment, the path to building on Windows is currently blocked due to inadequate documentation. It's not so much that "There is no documentation of course" as the OP asserts; it's that the sparse documentation is distributed throughout dozens of separate posts, each of which dates rapidly with the continuing development of TensorFlow along paths other than Windows C++.
I originally gave this answer to a similar question, but updated it with advice along the following lines yesterday:
Windows is a Microsoft product, so watch what Microsoft is doing
Hint: Microsoft is investing in the ONNX format
You can convert TensorFlow models to ONNX, or Keras models to ONNX
You can implement your (ONNX) model on Windows in C++ in at least 3 ways:
Windows ML (uses Onnx runtime)
Onnx runtime (supports DirectML as an execution provider)
DirectML (how Microsoft uses graphics cards to boost performance)
We don't have the latest or best hardware (e.g. we have Intel graphics cards), but we have been able to get a solution based on Onnx runtime that classifies 224 x 224 RGB images in about 20 milliseconds. We found the Windows ML path much more difficult to use with legacy code, and also slower to run.
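To give a flavour of the Onnx runtime path, a minimal C++ sketch looks roughly like the following; the model path, tensor names and input shape are placeholders, and error handling is omitted:

    // onnx_infer.cpp - sketch of running an ONNX model with the ONNX Runtime C++ API
    #include <array>
    #include <vector>
    #include <onnxruntime_cxx_api.h>

    int main() {
        Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "classifier");
        Ort::SessionOptions opts;
        Ort::Session session(env, L"model.onnx", opts);   // wide-char path on Windows

        // one 224 x 224 RGB image in NCHW layout
        std::vector<float> input(1 * 3 * 224 * 224, 0.0f);
        std::array<int64_t, 4> shape{1, 3, 224, 224};

        Ort::MemoryInfo mem = Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
        Ort::Value input_tensor = Ort::Value::CreateTensor<float>(
            mem, input.data(), input.size(), shape.data(), shape.size());

        const char* input_names[] = {"input"};    // placeholder tensor names
        const char* output_names[] = {"output"};

        auto outputs = session.Run(Ort::RunOptions{nullptr},
                                   input_names, &input_tensor, 1,
                                   output_names, 1);

        float* scores = outputs[0].GetTensorMutableData<float>();
        // ... take the argmax of scores to get the predicted class ...
        return 0;
    }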

How to deploy a TensorFlow-trained model for inference in a standalone Windows application

I would like to use a model trained with TensorFlow in a standalone Windows desktop application. I only need to perform predictions; I can train the model with the TensorFlow Python API. What is the recommended approach?
I know there is a C++ API, but it is really hard to compile, especially on Windows. Can I find any prebuilt C++ TensorFlow binaries for Windows?
Is there an easy way to distribute Python with TensorFlow as a prerequisite in a Windows installer?
Can I import the TensorFlow model into another technology and use it for inference? The OpenCV DNN module has a function that imports models from TensorFlow, but I understand it has many limitations, and I was not able to import and use my model with OpenCV.
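For reference, the OpenCV DNN route I tried looks roughly like this (a sketch assuming a frozen .pb graph; the file names and input size are placeholders):

    // opencv_dnn_infer.cpp - sketch of importing a frozen TensorFlow graph with the OpenCV DNN module
    #include <opencv2/dnn.hpp>
    #include <opencv2/imgcodecs.hpp>

    int main() {
        // frozen_graph.pb is a placeholder for the exported frozen model
        cv::dnn::Net net = cv::dnn::readNetFromTensorflow("frozen_graph.pb");

        cv::Mat img = cv::imread("test.jpg");
        cv::Mat blob = cv::dnn::blobFromImage(img, 1.0, cv::Size(224, 224));

        net.setInput(blob);
        cv::Mat out = net.forward();   // class scores
        return 0;
    }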
Thanks for help!
I was facing the same issues as you.
You should at least try to compile it (try CMake; it might be easier).
If you are still having trouble:
Compiler is out of Heap Space
Standalone Windows Lib
Basic Tensorflow Handling with C++
I asked a similar question and eventually found my own way to the answer. In the end, I found the TensorFlow instructions were actually pretty good (it was my reading of them that was bad!). I have not tried using Bazel for Windows, but building TensorFlow using CMake ended up working fine.
The main issue was the compiler heap-space issue. This always seems to occur in some random place if you are using the MS Visual Studio 32-bit compiler (the default). The key is to make sure you run vcvarsall.bat or vcvars64.bat or whatever it takes to invoke the 64-bit compiler (in Task Manager, it should show up as cl.exe, not cl.exe *32). I found it hard (read: impossible) to get Visual Studio to use the 64-bit compiler, but using the MSBuild tool to compile on the command line worked fine.
Once you can build the example program, you have an example of an application that links to a static TensorFlow library to do its work. You can just make your own application link to this library for what you want.
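As a rough sketch of what such an application can look like against the classic (1.x) C++ API, assuming a frozen graph.pb and placeholder tensor names:

    // tf_infer.cpp - sketch of a minimal inference program using the TensorFlow 1.x C++ API
    // (link against the static library built above; file and tensor names are placeholders)
    #include <iostream>
    #include <memory>
    #include <vector>
    #include "tensorflow/core/framework/graph.pb.h"
    #include "tensorflow/core/platform/env.h"
    #include "tensorflow/core/public/session.h"

    int main() {
        // load a frozen graph exported from Python
        tensorflow::GraphDef graph_def;
        tensorflow::Status s = tensorflow::ReadBinaryProto(
            tensorflow::Env::Default(), "frozen_graph.pb", &graph_def);
        if (!s.ok()) { std::cerr << s.ToString() << "\n"; return 1; }

        std::unique_ptr<tensorflow::Session> session(
            tensorflow::NewSession(tensorflow::SessionOptions()));
        s = session->Create(graph_def);
        if (!s.ok()) { std::cerr << s.ToString() << "\n"; return 1; }

        // placeholder input: one 224x224 RGB image
        tensorflow::Tensor input(tensorflow::DT_FLOAT,
                                 tensorflow::TensorShape({1, 224, 224, 3}));

        std::vector<tensorflow::Tensor> outputs;
        s = session->Run({{"input:0", input}}, {"output:0"}, {}, &outputs);
        if (!s.ok()) { std::cerr << s.ToString() << "\n"; return 1; }

        // outputs[0] now holds the prediction tensor
        return 0;
    }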

Creating a C++ program that runs on most PCs

I have a project that requires writing code for a small executable file. I used the Visual C++ 2010 Express IDE to create this file. After I finished writing the code, I tried to copy it to a couple of different PCs. It gives me an error message every time I click on the file to execute it. The message states that I have to install the .NET Framework. I watched a couple of videos on YouTube explaining how to overcome this problem by changing the runtime library from Multi-threaded Debug DLL (/MDd) to Multi-threaded Debug (/MTd). However, the IDE can't build the C++ code that way, because I created my project using the CLR template!
Is there any way to solve this problem? Can I create a similar program that does not require any further downloads when I use it on a different PC?
Would learning a different language like Java or C# help with creating small programs (like mine) that run on most Windows machines?
Just use Qt - it runs on Windows, Linux, MacOS, support for Android and iOS is scheduled for this year, plus it supports embedded platforms and some of the more obscure mobile platforms. Also, support for Windows RT was just kickstarted. A complete library with tons of functionality, good documentation and lots of educational resources. It provides tons of tools, from implicitly shared containers through threading, signals and slots, 2D and 3D graphics, widgets, multimedia, sensors... and whatnot...
You can even develop commercial applications under the LGPL license.
Also comes with a pretty good IDE - Qt Creator.
You can develop standard C++ applications or use QML, a JavaScript-like language for markup and scripting, which is used to build applications from components implemented in C++. You can also extend QML. It is much faster to develop with QML, and you still get the advantages of a platform-native binary under the hood.
Note that you will still need to ship a few DLLs with your application, unless of course you use a static build, which requires you to either make your application open source or purchase a commercial license... which doesn't come cheap...
But still, a few MB of DLLs are far better than the entire .NET Framework. A static build will produce executables of about 8-9 MB with no external dependencies.
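Just to show how little code a minimal Qt Widgets application needs (a sketch; build it with qmake or CMake against the Qt Widgets module):

    // main.cpp - minimal Qt Widgets application
    #include <QApplication>
    #include <QLabel>

    int main(int argc, char* argv[]) {
        QApplication app(argc, argv);

        QLabel label("Hello from Qt");  // a single widget as the whole UI
        label.show();

        return app.exec();              // enter the Qt event loop
    }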
Stick to standard C++, avoid Microsoft extensions (managed code), and call only POSIX functions of your OS; then you should be able to write portable programs.
You seem to have created a Managed C++ project. Instead, create an empty Win32 C++ project and then add in your .cpp/.h files. This will limit you to the default libraries available on all PCs with the C++ runtime. If you want to remove that dependency too, then statically compile in the runtime using the /MT option. Details at http://msdn.microsoft.com/en-us/library/2kzt1wy3(v=vs.71).aspx
As you move ahead, you will need to be conscious of which libraries you take dependencies on, which versions of the OS those libraries are available on, and whether you need to package them with your bits.
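For example, the command-line equivalent with a statically linked runtime is roughly the following, run from a Visual Studio command prompt (main.cpp is a placeholder):

    cl /EHsc /MT main.cpp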
Both Java and C# help with writing portable programs. However, people will usually have to install runtime environments for executables written in these two languages. These days, C++ is more portable than ever. You can easily run C++ executables in your browser:
https://github.com/kripken/emscripten
http://code.google.com/p/nativeclient/
This makes many of the reasons why Java and C# came about irrelevant.
Open standards like OpenGL also make portable GUI programming easier than ever. Try Qt, if you want to write a simple GUI in C++.
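For example, compiling a C++ source file for the browser with Emscripten (the first link above) is roughly this, assuming the emsdk toolchain is installed:

    em++ main.cpp -O2 -o main.html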
Note: it is possible to run a C++ program on any computer without installing anything, as long as you haven't used the .NET Framework. In your case, there are two possible reasons for the error on the target computer:
The new computer doesn't have the required run-time assemblies.
The new computer doesn't have the required .NET Framework installed.
..........................................
So what to do:
Before starting your program you have to decide whether you are going to use .NET Framework support or not. If you use the .NET Framework when you develop your program, then you must install the same or a higher .NET Framework version on the target computer.
If you don't need any .NET components, then your target computer only needs to contain the run-time assemblies.
How to get rid of the .NET Framework:
Right-click on the project in the solution -> Properties -> General -> Common Language Runtime support -> select "No Common Language Runtime support".
..........................................
Then all you need is for the relevant run-time assemblies to be on the target computer.
How do the run-time assemblies get onto the new computer?
There are two ways:
Install the suitable C++ redistributable package on the target computer (if you use VS2008 SP1, the C++ redistributable package should match it; please also consider the solution's build architecture (32-bit/64-bit) before downloading).
Deploy the run-time assemblies with your solution package. (I like this because the user doesn't need to install any third-party components.)
..........................................
How to deploy the assemblies with my project:
For this, all your DLLs, LIBs and EXEs should use the same run-time version (if not, you will have trouble redirecting assemblies via 'manifest' files).
How to check the run-time version:
Open the DLL/EXE in Visual Studio (File -> Open) -> expand RT_MANIFEST -> double-click the file under it -> the assembly dependency details will open -> copy the data from the right column and paste it into Notepad.
You will see a line like the following there; the version it names is the version of the run-time assemblies your specific DLL or EXE uses:

    assemblyIdentity type="win32" name="Microsoft.VC90.CRT" version="9.0.21022.8" processorArchitecture="x86" ...

After identifying the version of the run-time assemblies, follow this tutorial and try to run your program on a freshly installed computer.
Finally: if you think all of this is too complex and your program is very simple, then you can consider statically linking the run-time assemblies (try Google). But personally I don't like this method.
Good luck!