Is it possible to modify the Lua script that require loads? - c++

When I call require 'name' in Lua, the name can be either a preloaded module name or a file that exists in the current working directory.
I have the following two questions:
A. I would like to know whether it's possible to find out, right before require runs, whether a preloaded module or a file is about to be loaded.
B. And if it's a file, I want to modify the script that will be required (by prepending/appending some code to the existing source) and then require the modified script.
Are A and B both possible?
P.S.: I'm using Lua with C++.

Are A and B both possible?
Yes, as you can write your own "require" function that does what you need (including everything you describe). You can also look at package.searchers, as registering your function as one of the searchers may be enough to implement what you want.
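For illustration, here is a rough sketch of the searcher route using the Lua C API (Lua 5.2+, where the table is called package.searchers; on 5.1 it is package.loaders). The injected prologue/epilogue strings and the function names are placeholders for whatever you actually need, so treat this as a starting point rather than a finished implementation. Because the searcher is inserted right after the package.preload searcher, anything that reaches it was not preloaded (question A), and rewriting the file's source before loading it covers question B.
#include <lua.hpp>
#include <fstream>
#include <sstream>
#include <string>

// Searcher that finds the module file, wraps its source, and returns the loaded chunk.
static int patched_searcher(lua_State* L) {
    const char* name = luaL_checkstring(L, 1);

    // Resolve the module name to a file via package.searchpath(name, package.path).
    lua_getglobal(L, "package");
    lua_getfield(L, -1, "searchpath");
    lua_pushstring(L, name);
    lua_getfield(L, -3, "path");
    lua_call(L, 2, 2);                         // returns filename, nil  or  nil, error message
    if (lua_isnil(L, -2)) {
        lua_pushvalue(L, -1);                  // no file: report why and let require keep searching
        return 1;
    }
    std::string filename = lua_tostring(L, -2);

    // B: read the file and prepend/append whatever code is needed.
    std::ifstream in(filename.c_str());
    std::stringstream patched;
    patched << "-- injected prologue\n" << in.rdbuf() << "\n-- injected epilogue\n";
    std::string chunk = patched.str();

    // Load the modified source; a searcher returns the loader plus an extra value
    // (here the file name) that require passes on to that loader.
    if (luaL_loadbuffer(L, chunk.c_str(), chunk.size(), ("@" + filename).c_str()) != LUA_OK)
        return lua_error(L);
    lua_pushstring(L, filename.c_str());
    return 2;
}

// Insert the searcher at position 2, right after the package.preload searcher.
void install_patched_searcher(lua_State* L) {
    lua_getglobal(L, "table");
    lua_getfield(L, -1, "insert");             // table.insert
    lua_getglobal(L, "package");
    lua_getfield(L, -1, "searchers");          // package.searchers (package.loaders on Lua 5.1)
    lua_remove(L, -2);                         // drop the package table
    lua_pushinteger(L, 2);                     // position 2: after the preload searcher
    lua_pushcfunction(L, patched_searcher);
    lua_call(L, 3, 0);                         // table.insert(package.searchers, 2, patched_searcher)
    lua_pop(L, 1);                             // drop the table global
}
install_patched_searcher would be called once, right after luaL_openlibs, so every later require for a non-preloaded module goes through the patched path.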

Related

Adding Custom c++ function in chromium and call them in browser

I am trying to write a custom function in bootstrapper.cc under v8/src/init.
int helloworld() {
    return 0;
}
When I try to call it from the Chromium console, it shows up as undefined.
Look around bootstrapper.cc to see how other built-in functions are installed. Examples you could look at include Array and DataView (or any other, really).
There is no way to simply define a C++ function of a given name and have that show up in JavaScript. Instead, you have to define a property on the global object; and the function itself needs to have the right calling convention, and process its parameters / prepare its return value appropriately so that it can be called from JavaScript. You can't just take or return an int.
If you find it inconvenient to work with C++, an alternative might be to develop a Chrome extension, which would allow you to use JavaScript for the implementation, and also remove the need to compile/maintain/update your own build (which is a lot of work!). There is no existing guide for how to extend V8 in the way you're asking, because that approach is so much work that we don't recommend doing it like this (though of course it is possible -- you just have to read enough of the existing C++ source to understand how it's done).
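To make the calling-convention point concrete, here is a small sketch using the public embedder API rather than bootstrapper.cc itself (inside bootstrapper.cc the installation goes through V8's internal helpers, such as SimpleInstallFunction). The callback receives a FunctionCallbackInfo and sets its return value explicitly; it only becomes reachable from scripts because it is installed as a property on the global object template. The name helloworld matches the question; everything else is just a plausible embedding skeleton.
#include <v8.h>

// A native function with the calling convention V8 expects: it cannot simply return an int.
void HelloWorld(const v8::FunctionCallbackInfo<v8::Value>& args) {
    args.GetReturnValue().Set(0);
}

v8::Local<v8::Context> MakeContext(v8::Isolate* isolate) {
    // Put the function on the global object template so scripts can call helloworld().
    v8::Local<v8::ObjectTemplate> global = v8::ObjectTemplate::New(isolate);
    global->Set(isolate, "helloworld",
                v8::FunctionTemplate::New(isolate, HelloWorld));
    return v8::Context::New(isolate, nullptr, global);
}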

Why am I getting some of my python functions when I import my module, but not others?

I'm writing function libraries in Python 2.7.8 for use in some UAT testing with froglogic Squish. It's for my employer, so I'm not sure how much I can share and still conform to company privacy regulations.
Early in the development, I put some functions in some very small files. There was one file that contained only a single function. I could import the file and use the function with no problem.
I am at a point where I want to consolidate some of those tiny files into a larger file. For some reason that completely eludes me, some of the functions that I copy/pasted into this larger file are not being found, and a "NameError: global name 'My_variableStringVerify' is not defined" error is displayed, for example. (I just added the "My_", in case there was a name collision with some other function...)
This worked with the EXACT same simple function in a separate 'module'. Other functions in this python file -- appearing both before and after this function in the new, expanded module -- are being found and used without problems. The only module this function needs is re. I am importing that. I deleted all the pyc files in the directory, in case they were not getting updated (I'm pretty sure they were, from the datetime on the pyc files).
I have created and used dozens of functions in a dozen of my 'library modules', all with no issues. What's so special about this trivial piece-of-crap function, as a part of a different module? It worked before, and it STILL works -- as long as I do not try to use it from the new library module.
I'm no python guru, but I have been doing this kind of thing for years...
Ugh. What a fool. The answer was in the error, after all: "global name xxx is not defined". I was trying to use the function directly inside a Squish API call, which is the global scope. Once I moved the call to my function outside of the Squish API call (using it in the local scope), it worked fine.
The detail that surprised me: I was using "from foo import *", in both cases (before and after adding it to another 'library' module of mine).
When this one function was THE ONLY function in foo, I was able to use it in the global scope successfully.
When it was just one of many functions in foo-extended (names have been changed, to protect the innocent), I could NOT use it in the global scope. I had to reference it in the local scope.
After spending more time reading https://docs.python.org/2.0/ref/import.html (yes, it's old), I'm surprised it appeared in the global scope in either case. That page did state that "(The current implementation does not enforce the latter two restrictions, but programs should not abuse this freedom, as future implementations may enforce them or silently change the meaning of the program.)" about scope restrictions with the "from foo import *" statement.
I guess I found an edge case that somehow skirted the restriction in this implementation.
Still... what a maroon! Verifies my statement that I am no python guru.

How to organize subroutines for use by multiple commands?

I am working on creating a package with two new commands, say foo and bar.
For example, suppose foo.ado contains:
program define foo
    ...
    rex
end

program define rex
    ...
end
But my other command, bar.ado, also needs to call rex. Where should I put rex?
I see the following few options:
Create a rex.ado file as well.
Create a rex.do file and include it from within both foo.ado and bar.ado using include "`c(sysdir_plus)'r/rex.do" at the bottom of each file.
Copy the code into both foo.ado and bar.ado, which seems ugly because now the code must be maintained in two places.
What is best practice for organizing subroutines that are needed by both foo and bar?
Also, should the subroutine be called rex, _rex, or something else — maybe _foobar_rex — to indicate it is actually a sub-command that foo and bar depend on to work correctly rather than a separate command intended to stand on its own?
Create a rex.ado file as well
Your question is a bit too broad. Personally, I would go with the first option to be safe, although it really depends on the structure of your project. Sometimes including rex in a single ado file may be enough. This will be the case, for example, if foo is a wrapper command. However, for most other use cases, including two commands sharing a common program, I strongly believe that you will need a separate ado file.
The second option is obviously unnecessary, since the first does the same thing, plus it does not have to load the program every single time you call it. The third option is probably the worst in a programming context, as it may create conflicts and will be difficult to maintain down the road.
With regards to naming conventions, I would recommend using something like _rex only if you include the program as a subroutine in an ado file. Otherwise, rex will do just fine and will also indicate that the program has a wider scope within your project. It is also better, in my opinion, to provide a more elaborate explanation about the intended use of rex using a comment at the start of the ado file, rather than trying to incorporate this in the name.

C++: Can function pointers be traced back to the original function before compilation without looking at the function name?

I want to set up a server on which students can upload and run code for a course. However, I don't want them to access various functions, like system(), which could give them unwanted access to my server. I can search the pre-processor output for an explicit function call, but if the user makes a function pointer like this:
int (*syst)(const char*) = system;
syst("rm *");
I'm still open to the threat. However, I can't just search for the string "system", for example, since it's otherwise a valid name - if the student didn't include cstdlib, they could use that name as a variable name. Since this is a beginning programming course, having a blacklist of variable names ten miles long is a bad idea.
Is there a way to define the functions other than by name and allow me to search for that other designation before compiling their code?
By far the easiest solution is to compile the code - that's pretty harmless - and then look at the actual library imports. Users may have defined their own system, but that wouldn't cause system to be imported from glibc.
Showing imported symbols
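As a hypothetical sketch of that check (assuming a POSIX system where nm is available and the student code is built into a dynamically linked binary), something like the following could scan the program's undefined dynamic symbols; the substring match is deliberately rough, and a real filter would parse the symbol names exactly.
#include <cstdio>
#include <string>

// Returns true if the compiled binary imports "system" from the C library.
// Renamed variables or spliced identifiers in the source no longer matter here:
// only a real import of the libc function shows up as an undefined dynamic symbol.
bool imports_system(const std::string& binary_path) {
    std::string cmd = "nm --dynamic --undefined-only \"" + binary_path + "\"";
    FILE* pipe = popen(cmd.c_str(), "r");
    if (!pipe) return false;
    char line[512];
    bool found = false;
    while (std::fgets(line, sizeof line, pipe)) {
        if (std::string(line).find(" U system") != std::string::npos)
            found = true;                      // e.g. "U system@GLIBC_2.2.5"
    }
    pclose(pipe);
    return found;
}

int main(int argc, char** argv) {
    if (argc < 2) return 2;
    return imports_system(argv[1]) ? 1 : 0;    // non-zero exit means "reject this submission"
}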
The main reason you can't look at the raw source code is that #define allows malicious users to hide the blacklisted symbol names. But there are plenty of other ways to do that, including
auto hidden = &sys\
tem;
So you need some processing of the source, and it's probably easiest just to fully process the whole source.
I would also suggest running this inside a chroot as a non-privileged user. It's lighter weight than a VM.
Alas, it's not (easily) possible to get a function's name back from a pointer. See How to get function's name from function's pointer in C? That question is from a C perspective, but it's essentially the same problem.

Calling a python 2.7 function in c++ using the default api?

Say I have a function
def pyfunc():
    print("ayy lmao")
    return 4
and I want to call it in c++
int j = (int)python.pyfunc();
how exactly would I do that?
You might want to have a look into this: https://docs.python.org/2/extending/extending.html
In order to call a Python function from C++, you have to embed Python in your C++ application. To do this, you have to:
Load the Python DLL. How you do this is system dependent: LoadLibrary under Windows, dlopen under Unix. If the Python DLL is in the usual path you use for DLLs (%path% under Windows, LD_LIBRARY_PATH under Unix), this will happen automatically if you try calling any function in the Python C interface. Manual loading will give you more control with regards to version, etc.
Once the library has been loaded, you have to call the function Py_Initialize() to initialize it. You may want to call Py_SetProgramName() or Py_SetPythonHome() first to establish the environment.
Your function is in a module, so you'll have to load that: PyImport_ImportModule. If the module isn't in the standard path, you'll have to add its location to sys.path: use PyImport_ImportModule to get the module "sys", then PyObject_GetAttrString to get the attribute "path". The path attribute is a list, so you can use any of the list functions to add whatever is needed to it.
Your function is an attribute of the module, so you use PyObject_GetAttrString on the module to get an instance of the function. Once you've got that, you pack the arguments into a tuple or a dictionary (for keyword arguments), and use PyObject_Call to call it.
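Putting those steps together, a minimal sketch for Python 2.7 might look like the following. The module name mymod and the directory /opt/scripts are placeholders for wherever your pyfunc actually lives, and the build flags typically come from python2.7-config --cflags --ldflags.
#include <Python.h>
#include <iostream>

int main() {
    Py_Initialize();                                       // start the embedded interpreter

    // Make sure the directory holding mymod.py is on sys.path.
    PyObject* path = PySys_GetObject(const_cast<char*>("path"));   // borrowed reference
    PyObject* dir = PyString_FromString("/opt/scripts");
    PyList_Append(path, dir);
    Py_DECREF(dir);

    PyObject* module = PyImport_ImportModule("mymod");     // import mymod
    if (!module) { PyErr_Print(); return 1; }

    PyObject* func = PyObject_GetAttrString(module, "pyfunc");   // mymod.pyfunc
    if (!func || !PyCallable_Check(func)) { PyErr_Print(); return 1; }

    PyObject* result = PyObject_CallObject(func, NULL);    // call pyfunc() with no arguments
    if (!result) { PyErr_Print(); return 1; }

    int j = static_cast<int>(PyInt_AsLong(result));        // the int the question wants
    std::cout << "pyfunc returned " << j << std::endl;

    // Drop the references we own, then shut the interpreter down.
    Py_DECREF(result);
    Py_DECREF(func);
    Py_DECREF(module);
    Py_Finalize();
    return 0;
}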
All of the functions, and everything that is necessary, is documented (extremely well, in fact) in https://docs.python.org/2/c-api/. You'll be particularly interested in the sections on "Embedding Python" and "Importing Modules", along with the more general utilities ("Object Protocol", etc.). You'll also need to understand the general principles with regards to how the Python/C API works: things like reference counting and borrowed vs. owned references; you'll probably want to read all of the sections in the Introduction first.
And of course, despite the overall quality of the documentation, it's not perfect. A couple of times, I've had to plunge into the Python sources to figure out what was going on. (Typically, when I'm getting an error back from Python, to find out what it's actually complaining about.)