When programming a C++ Node.JS Addon, what is the equivalent of require('./someModule') so that a Module can be loaded for use within a compiled Addon?
I have found this method:
Handle<String> source =
    String::New("NameOfLibrary.register(require('./someModule'))");
Handle<Script> script = Script::Compile(source);
script->Run();
which, if used in conjunction with what I asked here, would work nicely, but I was wondering if there is a more native way.
You should be able to access the standard module require function in your initialization function. Generally I'd just call it right there, since lazy calls to require aren't a good idea: require is synchronous.
static void init (Handle<Object> target, Handle<Object> module) {
  HandleScope scope;
  // module.require is the same function your JavaScript would call as require()
  Local<Function> require = Local<Function>::Cast(
      module->Get(String::NewSymbol("require")));
  Local<Value> args[] = { String::New("./someModule") };
  Local<Value> someModule = require->Call(module, 1, args);
  // Do whatever with the module
}
NODE_MODULE(module_file_name, init);
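If you need the loaded module after init returns, keep it in a persistent handle. A minimal sketch against the same pre-Nan v8 API as above; the export name doSomething is purely illustrative:

static Persistent<Object> someModule;

static void init (Handle<Object> target, Handle<Object> module) {
  HandleScope scope;
  Local<Function> require = Local<Function>::Cast(
      module->Get(String::NewSymbol("require")));
  Local<Value> args[] = { String::New("./someModule") };
  // Promote the result to a Persistent handle so it outlives this scope
  someModule = Persistent<Object>::New(require->Call(module, 1, args)->ToObject());
}

static Handle<Value> callIntoModule (const Arguments& args) {
  HandleScope scope;
  // Look up and call one of the loaded module's exports (hypothetical name)
  Local<Function> fn = Local<Function>::Cast(
      someModule->Get(String::NewSymbol("doSomething")));
  Local<Value> argv[] = { String::New("hello") };
  return scope.Close(fn->Call(someModule, 1, argv));
}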
Creating an application in C++, I integrated CPython to facilitate the development of some of the top-level logic.
The application has a plugin subsystem which can load/unload plugins at runtime; this implies adding and removing Python definitions at runtime.
I found that I can add functions with PyModule_AddFunctions, and similarly, I can add constants, objects, etc...
But I found no equivalent PyModule_RemoveFunction.
How to remove a Python function from a module using C++?
Note: Ideally, I would like to avoid solutions like:
Removing the full module and reloading everything
Crafting Python code, that when executed would remove the function.
Let's see an example:
DISCLAIMER: I removed most of the error checks for simplicity.
#define PY_SSIZE_T_CLEAN
#include <Python.h> //before std includes
// Just an example function to add/remove from Python
static PyObject* log(PyObject*, PyObject* args)
{
    // Do something
    Py_INCREF(Py_None);
    return Py_None;
}
// Setup a "AppModule" module
PyMODINIT_FUNC initAppModule()
{
    static PyModuleDef AppModuleInfo{
        PyModuleDef_HEAD_INIT,
        "AppModule", // name of module
        nullptr,     // module documentation, may be NULL
        -1,          // size of per-interpreter state of the module, or -1 if the module keeps state in global variables
        nullptr      // no methods at creation time; plugins add functions later
    };
    const auto AppModule = PyModule_Create(&AppModuleInfo);
    return AppModule;
}
// Adding a function when a plugin is loaded
void PluginAddingAFunction()
{
    static PyMethodDef AppModuleMethods[]{
        {"log", log, METH_VARARGS,
         "Log a message in the standard output."},
        {NULL, NULL, 0, NULL} // Sentinel
    };
    PyObject *modName = PyUnicode_FromString("AppModule");
    PyObject *mod = PyImport_Import(modName);
    PyModule_AddFunctions(mod, AppModuleMethods);
}
// Removing the function when the plugin is unloaded
void PluginRemoveAFunction()
{
    PyObject *modName = PyUnicode_FromString("AppModule");
    PyObject *mod = PyImport_Import(modName);
    // How to do this?
    //PyModule_RemoveFunctions(mod, "log");
}
int main(int argc, const char* argv[])
{
    // Py_SetProgramName expects a wchar_t* in Python 3
    Py_SetProgramName(Py_DecodeLocale(argv[0], nullptr));
    PyImport_AppendInittab("AppModule", &initAppModule);
    Py_Initialize();
    PyObject *pmodule = PyImport_ImportModule("AppModule");
    PluginAddingAFunction(); // <<===== This is done at any time, when loading a plugin
    PyRun_SimpleString("import AppModule\n"
                       "AppModule.log('Hi World')\n"); // <== Example code
    PluginRemoveAFunction(); // <<===== This is done when unloading a plugin
    Py_FinalizeEx();
}
You can use the PyObject_DelAttr C API:
int PyObject_DelAttr(PyObject *o, PyObject *attr_name)
Delete attribute named attr_name, for object o. Returns -1 on failure. This is the equivalent of the Python statement del o.attr_name
(From the Python C API reference.)
So you could do something like this to remove the log function:
void PluginRemoveAFunction()
{
    PyObject *modName = PyUnicode_FromString("AppModule");
    PyObject *mod = PyImport_Import(modName);
    PyObject *funcName = PyUnicode_FromString("log");
    PyObject_DelAttr(mod, funcName);
    // Release the temporary references (error checks omitted, as in the question)
    Py_DECREF(funcName);
    Py_DECREF(mod);
    Py_DECREF(modName);
}
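If you want to double-check from C++ that the removal took effect, a quick sketch (assuming mod is still a live reference to the AppModule module object):

// PyObject_HasAttrString returns 1 if the attribute exists, 0 otherwise
if (!PyObject_HasAttrString(mod, "log"))
    puts("AppModule.log has been removed");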
This question already has answers here: Integrate Python And C++ (closed as a duplicate).
I'm looking for a way to run C++ code alongside Python in real time. I have a neural network program which takes in inputs and outputs a certain integer. Running the neural network is the main "meat" of my program and uses the most resources, so I'd like to code it in C++ to make the overall system more efficient.
What I'm looking to do now is run a Python main program that passes Python variables to an external C++ file. The C++ file does its math magic and spits out an integer back to Python. Is this possible? Is it advisable?
I have heard of Cython before, but would much rather code this from scratch to ensure it is optimized for my specific task.
Kindest regards!
I did something like what you want, but with C++ as the starting point:
The C++ code creates a Python extension module with C functions.
After that, the C++ code runs the target Python module, which invokes functions from that extension module.
As a result, the Python module is running and invoking C++ functions.
Here is a sample of the C++ code:
// Sample of a C++ function which the Python module invokes
static PyObject* OutDtmf(PyObject* self, PyObject* args)
{
    PyObject* a1;
    if (PyArg_UnpackTuple(args, "OutDtmf", 1, 1, &a1))
    {
        // Check the result
        PyObject* astr = PyUnicode_AsUTF8String(a1);
        const char* ustr = PyBytes_AsString(astr);
        OutDtmf(ustr); // presumably forwards to the application's native OutDtmf(const char*) overload
    }
    Py_RETURN_NONE;
}
// Pack of definitions
// --------------------
static PyMethodDef WarSysMethods[] = {
    { "Finish",         FinishScript,   METH_VARARGS, NULL },
    { "KpsoSetControl", KpsoSetControl, METH_VARARGS, NULL },
    { "KpsoSetLine",    KpsoSetLine,    METH_VARARGS, NULL },
    { "OutDtmf",        OutDtmf,        METH_VARARGS, NULL },
    { "PsoSetLine",     PsoSetLine,     METH_VARARGS, NULL },
    { NULL, NULL, 0, NULL } // Sentinel
};
static struct PyModuleDef WarSysModuleDef = {
    PyModuleDef_HEAD_INIT,
    "WarSys",
    NULL,
    -1,
    WarSysMethods
};

PyMODINIT_FUNC PyInit_WarSys(void)
{
    PyObject *module = PyModule_Create(&WarSysModuleDef);
    return module;
}
// Start point for creation of python-like module and loading target python module
void StartScript(bool serverMode, const char* testModuleName)
{
    // Initialization Python -> C++
    PyImport_AppendInittab("WarSys", PyInit_WarSys);
    // Initialization C++ -> Python
    Py_Initialize();
    PyObject* pDict; // borrowed
    TestModule = PyImport_ImportModule(testModuleName);
    if (!TestModule)
    {
        PyErr_Print();
        return;
    }
    pDict = PyModule_GetDict(TestModule);
    // Read function objects
    FuncInit = PyDict_GetItemString(pDict, "Init");
    ....................
// Invokes a Python function in the module (e.g. Init)
PyObject_CallObject(command.Function, command.Arguments);
In the Python code, use:
import WarSys
and invoke the functions, e.g. WarSys.Finish(False).
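For completeness, here is a rough sketch of how one of the stored function objects (e.g. FuncInit) might be invoked from the C++ side; the string argument is purely illustrative:

// Build a one-element argument tuple and call the stored Python function.
// FuncInit is the borrowed reference taken from the module dict above.
PyObject* callArgs = Py_BuildValue("(s)", "server");
PyObject* result = PyObject_CallObject(FuncInit, callArgs);
if (!result)
    PyErr_Print(); // the call raised a Python exception
Py_XDECREF(result);
Py_DECREF(callArgs);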
Here is a node.js addon module I've written in C++ and built using node-gyp.
In StoreFunction I am trying to store a pointer to the function so I can use it later.
When I try to invoke it later in InvokeFunction, I get a segmentation fault. What baffles me is that if I examine the pointer in both functions (using cout), it has the same value.
So I'm guessing either the invoking context changes between the two calls, or I don't understand what I'm pointing to.
All (ummmmmm) pointers gratefully received on my problem here.
#include <node.h>
#include <v8.h>
using namespace v8;
v8::Persistent<v8::Function> callbackFunction;
Handle<Value> StoreFunction(const Arguments& args) {
  HandleScope scope;
  callbackFunction = *Local<Function>::Cast(args[0]);
  return scope.Close(Undefined());
}

Handle<Value> InvokeFunction(const Arguments& args) {
  HandleScope scope;
  Local<Value> argv[1] = { String::New("Callback from InvokeFunction") };
  callbackFunction->Call(Context::GetCurrent()->Global(), 1, argv);
  return scope.Close(Undefined());
}

void init(Handle<Object> target) {
  NODE_SET_METHOD(target, "StoreFunction", StoreFunction);
  NODE_SET_METHOD(target, "InvokeFunction", InvokeFunction);
}
NODE_MODULE(someaddonmodule, init);
And of course some calling JS:
var myaddon = require('../build/Release/someaddonmodule');
myaddon.StoreFunction(function(data){
console.log("Called back: "+data);
});
myaddon.InvokeFunction(); //causes a segmentation fault
The answer is because we're not programming in Java any more Toto.
The pointer I created points at the Local handle, rather than at the function itself. Holding a 'reference' to this isn't enough to stop the V8 garbage collector from destroying it when the scope closes.
To deal with this, an explicit request needs to be made to V8 to set aside some memory to hold the function, which is done like this:
Persistent< Function > percy;
Local<Function> callbackFunction = Local<Function>::Cast(args[0]);
percy = Persistent<Function>::New(callbackFunction);
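Folded back into the addon from the question, the fix might look like the sketch below (same pre-Nan v8 API; the Persistent should also be Dispose()d once it is no longer needed):

v8::Persistent<v8::Function> callbackFunction;

Handle<Value> StoreFunction(const Arguments& args) {
  HandleScope scope;
  // Promote the Local handle to a Persistent one so it survives this scope
  callbackFunction = Persistent<Function>::New(Local<Function>::Cast(args[0]));
  return scope.Close(Undefined());
}

Handle<Value> InvokeFunction(const Arguments& args) {
  HandleScope scope;
  Local<Value> argv[1] = { String::New("Callback from InvokeFunction") };
  callbackFunction->Call(Context::GetCurrent()->Global(), 1, argv);
  return scope.Close(Undefined());
}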
If anyone with a better understanding of V8 internals knows more than this, I'd still really like to hear your explanation :)
I have a Visual Studio 2008 C++03 application that uses Lua 5.2.1. I would like to extend Lua with a module called "foo", but when I call require("foo") in my Lua script, I get the error:
foo_test.lua:1: module 'foo' not found:
no field package.preload['process']
no file '!\lua\process.lua'
no file '!\lua\process\init.lua'
no file '!\process.lua'
no file '!\process\
My Lua script:
foo.bar()
My lua_foo.h file:
#include <lua.h>
extern "C" int luaopen_foo( lua_State* L );
My lua_foo.cpp file:
#include "lua_foo.h"
#include <lua.hpp>
static int l_bar( lua_State *L )
{
    puts( "in bar()" );
    return 1;
}

int luaopen_foo( lua_State *L )
{
    static const luaL_Reg foo[] = {
        { "bar", l_bar },
        { NULL, NULL }
    };
    luaL_newlib( L, foo );
    return 1;
}
These are compiled into a static library "lua_foo.lib", which is statically linked to my main Lua executable.
Can anybody help me understand where I'm going wrong? Thanks. I would prefer to avoid C++ wrappers (for now), and I do not want to package this library as a separate DLL from the main Lua engine.
EDIT
The issue was in the Lua engine code. I added the luaL_requiref call per @NicolBolas's suggestion.
lua_State* L = luaL_newstate();
if( NULL != L )
{
    luaL_openlibs( L );
    luaL_requiref( L, "foo", luaopen_foo, 1 );
    luaL_dofile( L, "foo_test.lua" );
    lua_close( L );
}
It's important to understand how the require machinery works and therefore why your code doesn't.
require is designed to look for Lua scripts in the file system and DLLs. Static libraries are not DLLs; indeed, as far as C/C++ is concerned, once you've finished linking, static libraries are no different than compiling those .c/.cpp files into your application directly.
When require finds a DLL with the appropriate name, it loads it and attempts to find a function named luaopen_<modname>, where <modname> is the name of the module. When it does, it will execute this function and store the value it returns in an internal database of loaded modules.
Calling require for a module will return whatever this function returned; if the module has already been loaded, then the return value is pulled from the database and returned directly.
Simply calling luaopen_foo will not do any of this. Indeed, simply calling this function is a bad idea; it is a Lua function and needs to be called as a Lua function (ie: you need to push it onto the Lua stack with lua_pushcfunction and call it with lua_call and so forth).
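Purely to illustrate what "calling it as a Lua function" means, a manual version might look like the sketch below; note this skips the package.loaded bookkeeping that luaL_requiref (shown next) does for you:

lua_pushcfunction(L, luaopen_foo); // push the C function as a Lua function
lua_call(L, 0, 1);                 // call it; the module table is now on the stack
lua_setglobal(L, "foo");           // e.g. expose it as the global `foo` (pops the table)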
If you want to create a local module (one not in a Lua script or DLL, but exposed from your code), then you need to use the Lua facilities to do that. Specifically, use luaL_requiref:
luaL_requiref(L, "foo", luaopen_foo, 0);
Call this instead of calling luaopen_foo directly. This will automatically register the return value from luaopen_foo with require's internal database of loaded modules, so subsequent calls to require "foo" will return this table. The last argument of luaL_requiref controls whether the module table is also stored in a global of the same name; pass 1 (as the edit above does) if your script expects to call foo.bar() without requiring it first.
One more thing: do is a keyword in Lua; you should not use keywords for Lua table key names. You can, but you always have to quote them (ie: your script must do foo["do"](...) to call it).
luaopen_foo creates a table with one function in it, but it doesn't expose it to Lua in any way. You need to assign it to something your scripts can access if you want to access it. You can do this with the package mechanism, or just assign it to a global (which is what Lua's built-in libraries do).
You have a field named do (in the original version of this code), which is problematic if you want to use foo.do syntax, because do is a keyword.
The return value of a Lua C function tells Lua how many values you left on the stack. Your function returns 1 but pushes nothing, so it lies about its return value.
In the case of luaopen_foo, since you're calling it directly and ignoring its return value, there's no need for it to return anything at all.
Change your code to this:
static int l_bar( lua_State *L )
{
    puts("l_bar called.");
    return 0;
}

void luaopen_foo( lua_State *L )
{
    static const struct luaL_Reg foo[] = {
        { "bar", l_bar },
        { NULL, NULL }
    };
    luaL_newlib( L, foo );   // create a table containing `bar`
    lua_setglobal(L, "foo"); // assign that table to the global `foo`
}
And change your script to this:
foo.bar()
I'm trying to write a Node.js module, using C++, that wraps and exposes some classes from libhdf5.
I'm currently interested in two classes from libhdf5. The first one is File, and it opens an hdf5 file. The second one is Group, and it represents groups within that file. You get Group objects from a File object.
I've written some code in which I create a File object and attempt to get a Group from it. I am trying to make my Node.js module as JavaScripty as possible, so I want to return the group using a callback. So, I am trying to code my module so that it's used like this:
var hdf5 = require('hdf5');
var file = new hdf5.File('/tmp/example.h5');
file.getGroup('foobar', function (err, group) { console.log(group); });
So, in the C++ code for my File wrapper, I'd have a function that maps to the getGroup function here, and it'd call the given anonymous function, passing in any errors as well as the new Group object wrapper.
Given that this sounded to me like what the Node.js documentation shows to be a factory of wrapped objects, I have modeled my Group code after the examples there.
So, I have my Group wrapper coded up, but am stuck trying to instantiate it. I don't know enough yet to stray away from using the v8 Arguments class for function parameters. Because of that, I can't seem to pass in some parameters that I need for my v8 persistent constructor function (because I am instantiating this from C++, and not from JS-land).
You are almost there. You don't need to pass Arguments to Group::Instantiate. Just pass what you need and use the constructor to create the new instance of Group. For example:
Handle<Value> Group::Instantiate(const std::string& name) {
  HandleScope scope;
  Local<v8::Value> argv[1] = {
    Local<v8::Value>::New(String::New(name.c_str()))
  };
  return scope.Close(Constructor->NewInstance(1, argv));
}
The method Group::New does the rest of the construction work.
Handle<Value> Group::New(const Arguments& args) {
  HandleScope scope;
  if (!args[0]->IsString()) {
    return ThrowException(Exception::TypeError(String::New("First argument must be a string")));
  }
  const std::string name(*(String::Utf8Value(args[0]->ToString())));
  Group* const group = new Group(name);
  group->Wrap(args.This());
  return args.This();
}
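This assumes Constructor is a static Persistent<Function> held by Group and set up once at addon initialization. A minimal sketch of that setup (same v8 API era; the Init name is just a convention):

Persistent<Function> Group::Constructor;

void Group::Init() {
  // Wrap Group::New in a function template and keep the resulting
  // constructor function alive for later NewInstance() calls
  Local<FunctionTemplate> tpl = FunctionTemplate::New(New);
  tpl->SetClassName(String::NewSymbol("Group"));
  tpl->InstanceTemplate()->SetInternalFieldCount(1);
  Constructor = Persistent<Function>::New(tpl->GetFunction());
}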
In File::OpenGroup you can do this:
Handle<Value> File::OpenGroup (const Arguments& args) {
  HandleScope scope;
  if (args.Length() != 2 || !args[0]->IsString() || !args[1]->IsFunction()) {
    ThrowException(Exception::SyntaxError(String::New("expected name, callback")));
    return scope.Close(Undefined());
  }
  const std::string name(*(String::Utf8Value(args[0]->ToString())));
  Local<Function> callback = Local<Function>::Cast(args[1]);
  const unsigned argc = 2;
  Local<Value> argv[argc] = {
    Local<Value>::New(Null()),
    Local<Value>::New(Group::Instantiate(name))
  };
  callback->Call(Context::GetCurrent()->Global(), argc, argv);
  return scope.Close(Undefined());
}