Setting timeout for embedded Lua - c++

I have embedded Lua in a C/C++ application. I want to be able to set a timeout value to prevent getting trapped by badly written scripts that can result in infinite loops (or even string searches that take an effectively infinite time to complete).
Basically, I want to be able to set a time interval and if the script fails to complete running at the end of that time interval, I want to be able to kill the Lua script engine (gracefully, if possible).
Does anyone know of a best-practice way to do this?

One way to control the amount of time a script takes is to set a count hook and then raise an error in the hook. But this does not work if the script can call C functions that may take a long time.
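A minimal sketch of that count-hook approach, assuming Lua 5.x; the deadline variable, the instruction count of 10000, and the error message are illustrative:

extern "C" {
#include <lua.h>
#include <lualib.h>
#include <lauxlib.h>
}
#include <ctime>

static time_t deadline;

// Called every 10000 VM instructions; raises a Lua error once the deadline passes.
static void timeoutHook(lua_State *L, lua_Debug *) {
    if (time(nullptr) > deadline)
        luaL_error(L, "script timed out");
}

bool runWithTimeout(lua_State *L, const char *script, int seconds) {
    deadline = time(nullptr) + seconds;
    lua_sethook(L, timeoutHook, LUA_MASKCOUNT, 10000);
    bool ok = (luaL_dostring(L, script) == 0);
    if (!ok)
        lua_pop(L, 1);              // discard the error message
    lua_sethook(L, nullptr, 0, 0);  // remove the hook again
    return ok;
}

The count of 10000 trades timing granularity against hook overhead; lower it for a tighter bound at the cost of more frequent checks.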

Related

Creating an update method with custom rate processing in c++

If you've ever used XNA Game Studio 4 you are familiar with the update method. By default the code within is processed 60 times per second. I have been struggling to recreate such an effect in C++.
I would like to create a method that only processes its code x times per second. Every way I've tried, it processes everything at once, as loops do. I've tried for loops, while loops, and goto, and everything processes all at once.
If anyone could please tell me how, and whether, I can achieve such a thing in C++ it would be much appreciated.
With your current level of knowledge this is as specific as I can get:
You can't do what you want with loops, fors, ifs and gotos, because we are no longer in the MS-DOS era.
You also can't have code running at precisely 60 frames per second.
On Windows a system application runs within something called an "event loop".
Typically, from within the event loop, most GUI frameworks call the "onIdle" event, which happens when an application is doing nothing.
You call update from within the onIdle event.
Your onIdle() function will look like this:
void onIdle() {
    currentFrameTime = getCurrentFrameTime();
    if ((currentFrameTime - lastFrameTime) < minUpdateDelay) {
        // Sleep for a small amount of time (using Sleep or anything).
        // The delay should be much smaller than minUpdateDelay.
        // Doing this will reduce CPU load.
        sleepForSmallAmountOfTime();
        return;
    }
    update(currentFrameTime - lastFrameTime);
    lastFrameTime = currentFrameTime;
}
You will need to write your own update function; it should take the amount of time passed since the last frame. You also need to write the getCurrentFrameTime() function yourself, using either GetTickCount, QueryPerformanceCounter, or some similar function.
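A minimal sketch of such a timing function; the answer names the Windows APIs, and std::chrono::steady_clock used here is the portable equivalent:

#include <chrono>

// Seconds elapsed since the first call, as a double.
double getCurrentFrameTime() {
    using namespace std::chrono;
    static const steady_clock::time_point start = steady_clock::now();
    return duration<double>(steady_clock::now() - start).count();
}

With this version, minUpdateDelay is expressed in seconds, e.g. 1.0 / 60.0 for 60 updates per second.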
Alternatively you could use system timers, but compared to the onIdle() event they behave badly if your app runs too slowly.
In short, there's a long road ahead of you.
You need to learn some (preferably cross-platform) GUI framework, learn how to create a window and the concept of an event loop (you can't do anything without it these days), then write your own update() and get a basic idea of multithreaded programming and system events.
Good luck.
As you are familiar with XNA, I assume you are also familiar with "input" and "draw". What you could do is assign independent threads to these 3 functions and have a timer to see if it's time to run a thread.
E.g. the input would probably trigger draw, and both draw and input would trigger the update method.
Another way to handle this is via message events. If you're using Windows then look into the Windows message loop. This will make your input, update and draw events easier by executing on events triggered by the OS.

C++: How to set a timeout (not reading input, not threaded)?

I've got a large C++ function in Linux that calls a whole lot of other functions, making up an algorithm. At various points, given certain bad inputs, the algorithm can get "stuck" and go on forever. Adding a timeout seems appropriate, as not all potential "stuck" points can be predicted. But despite scouring the Internet for timeout examples I've only found how to apply timeouts when either the thing you're timing is a separate thread or it's reading input. My code is a single thread and does not modify file descriptors, so I'm not having any luck. Do I basically have no choice but to thread it?
I am not sure about your situation; actually, server applications and embedded applications often run for years in the background without stopping. I think one option is to let your program run in the background and log to a file (or the screen) periodically, and, if you really want to stop the program after a certain time, you can use the timeout command or a script to kill your program after that time, say, timeout 15s your-prog.
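If you do want an in-process, single-threaded timeout instead, one POSIX option (not part of the answer above) is alarm() together with sigsetjmp/siglongjmp; a minimal sketch, with the caveat that jumping out of an arbitrary point is unsafe if the algorithm holds locks or unreleased resources, and runAlgorithm() is a hypothetical stand-in for the function that can get stuck:

#include <setjmp.h>
#include <signal.h>
#include <stdio.h>
#include <unistd.h>

void runAlgorithm();  // hypothetical: the large function that may get stuck

static sigjmp_buf timeoutJmp;

static void onAlarm(int) {
    siglongjmp(timeoutJmp, 1);  // jump back out of the stuck algorithm
}

int main() {
    signal(SIGALRM, onAlarm);
    if (sigsetjmp(timeoutJmp, 1) == 0) {
        alarm(15);       // deliver SIGALRM after 15 seconds
        runAlgorithm();
        alarm(0);        // finished in time: cancel the pending alarm
    } else {
        fprintf(stderr, "algorithm timed out\n");
    }
    return 0;
}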

C++ executing a bash script which terminates and restarts the current process

So here is the situation: we have a C++ datafeed client program which we run ~30 instances of with different parameters, and there are 3 scripts written to run/stop them: start.sh, stop.sh, and restart.sh (which runs stop.sh and then start.sh).
When there is a high volume of data the client "falls behind" real time. We test this by comparing the system time to the most recent data entry times listed. If any of the clients falls behind more than 10 minutes or so, I want to call the restart script to start all the binaries fresh so our data is as close to real time as possible.
Normally I call a script using system("script.sh"); however, the restart script looks up and kills the process using kill, BUT calling system() also makes the current program ignore SIGQUIT and SIGINT until system() returns.
On top of this, if there are two concurrent executions with the same arguments they will conflict and the program will hang (this stems from establishing database connections), so I cannot start the new instance until the old one is killed, and I cannot kill the current one if it ignores SIGQUIT.
Is there any way around this? The current state of the binary, and missing some data, does not matter at all once it has reached the threshold. I also can't just have the program restart itself, since if one of the instances falls behind we want to restart all 30 of the instances (so gaps in the data occur at uniform times). Is there a clean way to call a script from within C++ which hands over control and allows the script to restart the program from scratch?
FYI we are running on CentOS 6.3
Use exec() instead of system(). It will replace your process with the new one. Note there is a significant difference in how exec() is called and how it behaves: system() passes its string argument to the system shell to run, while exec() actually executes an executable file, and you need to supply the arguments to the process one at a time, instead of letting the shell parse them apart for you.
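A minimal sketch of that replacement, using the exec-family variant execl(); the path to the restart script is illustrative:

#include <unistd.h>
#include <cstdio>
#include <cstdlib>

void restartEverything() {
    // On success this never returns: the restart script replaces the
    // current process image, so the old instance is gone by the time
    // the script tries to kill and restart the clients.
    execl("/path/to/restart.sh", "restart.sh", (char *)nullptr);
    std::perror("execl");  // only reached if the exec failed
    std::exit(1);
}

Note that exec'ing a shell script directly requires the script to have a shebang line (e.g. #!/bin/bash) and execute permission; otherwise exec /bin/sh with the script path as its argument.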
Here's my two cents.
Temporary solution: Use SIGKILL.
Long-term solution: Optimize your code or the general logic of your service tree, use other system calls like exec, or rewrite it to use threads.
If you want better answers, maybe you should post some code and/or make the issue more specific.

Inject sleep() into a function of an external process

I know how to inject a DLL into a running process and also how to utilize functions used internally by the process e.g.
void __stdcall remoteMethod(unsigned short id)
{
    typedef void (__stdcall *pFunctionAddress)(unsigned short);
    pFunctionAddress pMyFunction = (pFunctionAddress)(0xCAFEBABE);
    pMyFunction(id);
}
Now I want to add a sleep() into an existing method in the running process - this is the main loop of the program, which doesn't stop for a second and uses up all the processing power.
I know that with frameworks like Detours I could make a trampoline function which calls my function and then the original one - however, my problem is that the while(1) loop is somewhere within a function of the external process. I know the offset where the loop starts, and after that point I would like to first call sleep() and then continue with the normal route of the loop.
The only alternative I've seen so far is binary-editing the program, but this is not a good solution.
Any suggestions? Thanks
I think you are trying to be too cute here. Just call SuspendThread/ResumeThread alternately on a timer. I know it's ugly, but you aren't going to enter your solution in any beauty pageant I suspect.
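A minimal sketch of that idea, assuming it runs inside the injected DLL and that hTarget is a handle you have already obtained for the spinning thread; the 50 ms intervals (roughly a 50% duty cycle) are illustrative:

#include <windows.h>

// Run this in its own thread, e.g. via CreateThread, passing the
// target thread's handle as the parameter.
DWORD WINAPI throttleLoop(LPVOID param) {
    HANDLE hTarget = static_cast<HANDLE>(param);
    for (;;) {
        SuspendThread(hTarget);  // pause the spinning thread
        Sleep(50);               // keep it paused for 50 ms...
        ResumeThread(hTarget);
        Sleep(50);               // ...then let it run for 50 ms
    }
}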
1. Post the name of the spin-waiting program.
2. Wait for SO-ers to send hate mail to the developer.
3. Install the update the developer sends you as a bribe to stop the hate mail.
In principle, as long as your code has been executed once within the address space of the other process, and you know that the loop isn't currently executing, you could enable writing to the text pages and patch the actual loop code in situ. You'll need a few redundant bytes to write a call to your function over (extending the function would need a lot of rewriting, as all relative offsets would break).
This is not, however, terribly easy nor terribly robust. Consider why you want to do this, and whether you can achieve the goal another way.

How to specify a maximum amount of time a program can run in C++

In my C++ program, I will start other programs with exec. However, I want to be able to specify a maximum amount of time that the programs can run. How can that be done?
Is setrlimit the right thing to use?
A bit of a brute-force version, but... save/get the handle of the started program/process, start a timer, and kill the other process after the timer has expired?
Two solutions come to mind:
1. Send the duration to the second program via the command line and manage the duration internally in the second exe.
2. Create a timer in the first exe and, when the timer is triggered, kill the second process (a sketch follows below).
Max.
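A minimal POSIX sketch of the second option: start the program with fork()/exec(), then kill it if it outlives the limit. The one-second polling interval is illustrative:

#include <sys/types.h>
#include <sys/wait.h>
#include <signal.h>
#include <unistd.h>

// Runs `path` and kills it if it hasn't exited after `seconds`.
int runWithLimit(const char *path, unsigned seconds) {
    pid_t child = fork();
    if (child == 0) {                       // child: become the other program
        execl(path, path, (char *)nullptr);
        _exit(127);                         // only reached if exec failed
    }
    int status = 0;
    for (unsigned elapsed = 0; elapsed < seconds; ++elapsed) {
        if (waitpid(child, &status, WNOHANG) == child)
            return status;                  // finished within the limit
        sleep(1);                           // poll once per second
    }
    kill(child, SIGKILL);                   // limit reached: kill it
    waitpid(child, &status, 0);             // reap the zombie
    return status;
}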
In general, it can't be done using standard C++ - you will have to use whatever scheduling functions your operating system (which you haven't specified) provides.
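As for the setrlimit question: on POSIX systems, setrlimit(RLIMIT_CPU) is one such facility, called in the child between fork() and exec(). The caveat is that it limits CPU time, not wall-clock time, so a sleeping or blocked program is never killed. A minimal sketch:

#include <sys/resource.h>
#include <unistd.h>

// Intended to run in the forked child, just before exec().
void execWithCpuLimit(const char *path, rlim_t seconds) {
    struct rlimit rl;
    rl.rlim_cur = seconds;      // soft limit: process receives SIGXCPU
    rl.rlim_max = seconds + 1;  // hard limit: process receives SIGKILL
    setrlimit(RLIMIT_CPU, &rl);
    execl(path, path, (char *)nullptr);
}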