I'm trying to run one program from another. For this I use the QProcess class.
The program must be run with administrator privileges. To keep step debugging simple and to present the example here, I started Qt Creator with administrator privileges.
Now the fun part.
The following code runs the calculator.
QProcess * p = new QProcess();
p->start("C:\\Windows\\System32\\calc.exe");
p->waitForStarted();
delete p;
This code works.
Now another example, which should launch the Windows Services window.
QProcess * p = new QProcess();
p->start("C:\\Windows\\System32\\services.msc");
p->waitForStarted();
delete p;
This code does not launch services.msc. The file exists and runs from the command line without any problems.
Why does one work and the other not? How can I fix this?
Windows 7 x86.
Short answer: .msc is not an executable file type.
Long answer:
.msc is what's called a snap-in for the Microsoft Management Console.
From the command prompt, or even from Start -> Run (Win + R), running services.msc tells the operating system, "Hey, run this file with whatever program is associated with .msc files."
That program is mmc.exe, and even when you run services.msc from the command prompt and look in Task Manager, you'll see the window actually belongs to mmc.exe.
Try starting either mmc.exe services.msc or cmd.exe /C services.msc instead.
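For example, a minimal QProcess sketch (assuming Qt's two-argument startDetached() overload; adjust the paths to your system) would be:
// Sketch: QProcess cannot execute a non-executable .msc file directly,
// so start mmc.exe and pass the snap-in as an argument instead.
// startDetached() also avoids killing the child when the QProcess
// object is destroyed.
QProcess::startDetached("C:\\Windows\\System32\\mmc.exe",
                        QStringList() << "C:\\Windows\\System32\\services.msc");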
According to Microsoft, "If [the system] cannot locate the DLL, the system terminates the process and displays a dialog box that reports the error." This is the result I get when I run my application outside of the command line, but I do not get the same system error when I run the application from a shell environment such as Command Prompt or PowerShell.
Is there a way to show the same error message when the application is run from a command line interface?
https://msdn.microsoft.com/en-us/library/aa271571(v=vs.60).aspx
You can re-enable the error dialog with
SetErrorMode(GetErrorMode() & ~SEM_FAILCRITICALERRORS);
but I don't think you want to do this, as you do not know in which environment the user will run your application.
It is usually not a good idea to pop up a dialog box in, e.g., a service environment.
What is the problem with examining the error code of whatever is failing, e.g. LoadLibrary(), and reacting to that error?
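For example, a minimal sketch of checking LoadLibrary() yourself and turning the error code into readable text (the DLL name is just a placeholder):
#include <windows.h>
#include <iostream>

int main() {
    // "some.dll" is a placeholder for whatever DLL your application loads.
    HMODULE mod = LoadLibraryA("some.dll");
    if (mod == NULL) {
        DWORD err = GetLastError();
        char msg[512] = {0};
        // Translate the error code instead of relying on the system dialog box.
        FormatMessageA(FORMAT_MESSAGE_FROM_SYSTEM | FORMAT_MESSAGE_IGNORE_INSERTS,
                       NULL, err, 0, msg, sizeof(msg), NULL);
        std::cerr << "LoadLibrary failed (" << err << "): " << msg << "\n";
        return 1;
    }
    FreeLibrary(mod);
    return 0;
}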
I run the command:
multichaind mychain -daemon
to start up the multichain program.
As per: https://www.multichain.com/developers/creating-connecting/
I'm running on Win 2012. I have tried putting this in a startup script using Windows Task Scheduler so that when the machine reboots, it will restart.
The task scheduler history shows it ran successfully, but it's not running. When I run the command
multichain-cli mychain getinfo
it says it cannot connect to the server.
And of course, that same command works when I start up multichain in a separate command prompt window.
My theory is that the command runs, but then the command window is closed when it needs to remain open. Is there any way to do this, or to debug what is going on?
I tried adding the parameters as:
mychain -daemon >>d:\Software\Multichain\log.txt
hoping that it would write to the file so I could read it. But no file was created.
Update: I tried the .cmd file approach as recommended in the comment:
multichaind mychain -daemon
pause
That doesn't make sense to me because the first line should never finish.
It did create the log.txt file, but the whole thing still ran and quit in a few seconds:
d:\Software\Multichain>multichaind mychain -daemon 1>>d:\Software\Multichain\log.txt
d:\Software\Multichain>pause
Press any key to continue . . .
I want to make a console application in C++, and once the information is displayed, close the console and keep running in the background. Is this possible? Is there another way to do it? Python maybe?
You will have to either close the console window while the process is still running, which is system dependent, or start another process. Even though the standard library offers the system function to do that, its argument is a system-dependent command line.
So the upshot is: this is system dependent.
On Windows, the full version of Microsoft's Visual Studio IDE has always, as far back as I can remember, used a peculiar approach for this, with two executable files, devenv.com and devenv.exe. The former is a console subsystem executable, which by default runs the latter, which is a GUI subsystem executable:
[C:\]
> where devenv
C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\devenv.com
C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\devenv.exe
[C:\]
> _
The basic idea here is that for historical reasons the command interpreter's search for an executable finds the .com file first, so the command devenv just works, either for starting the IDE or just getting the help text via the /? option.
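As an illustration only (not Microsoft's actual implementation), such a console stub could be as small as this, with the target path hard-coded as a placeholder:
#include <windows.h>

// Console-subsystem stub: launches the GUI executable and waits for it,
// so that running the command from a prompt blocks until the IDE exits.
int main() {
    STARTUPINFOA si = { sizeof(si) };
    PROCESS_INFORMATION pi = {};
    // Placeholder path; the real devenv.com also relays console I/O.
    char cmd[] = "\"C:\\Program Files (x86)\\Microsoft Visual Studio 14.0\\Common7\\IDE\\devenv.exe\"";
    if (CreateProcessA(NULL, cmd, NULL, NULL, FALSE, 0, NULL, NULL, &si, &pi)) {
        WaitForSingleObject(pi.hProcess, INFINITE);
        CloseHandle(pi.hProcess);
        CloseHandle(pi.hThread);
    }
    return 0;
}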
Yes, this is possible with a small variant:
fork another process. But this is heavily system dependent:
POSIX/Linux lets you simply clone the process;
Windows requires a new process to be created from an executable. You then have to communicate the state, which is less trivial, as explained in this article in the paragraph on porting fork();
then exit the program (it's the only way to give back control to the console).
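On POSIX/Linux, a minimal sketch of that fork-then-exit approach (error handling trimmed) looks like this:
#include <unistd.h>
#include <iostream>

int main() {
    std::cout << "Some information is displayed...\n";

    // Fork; the parent returns immediately so the console gets its prompt back,
    // while the child keeps running in the background.
    pid_t pid = fork();
    if (pid < 0)
        return 1;        // fork failed
    if (pid > 0)
        return 0;        // parent: give control back to the console

    setsid();            // child: detach from the controlling terminal
    for (;;) {
        // ... background work ...
        sleep(1);
    }
}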
On Windows use ShowWindow(FindWindowA("ConsoleWindowClass", NULL), false) to hide the console window. It will still be running in the background and is not visible on the taskbar.
However you will have to run a task manager like Taskmgr.exe to find it and shut it down.
#include <windows.h>
#include <iostream>
using namespace std;

int main() {
    cout << "Some information is displayed.. \n\n";
    Sleep(5000);
    cout << "wait.. the console is going to hide and run in background.. \n";
    Sleep(5000);
    ShowWindow(FindWindowA("ConsoleWindowClass", NULL), false);
    while (true) {
        // Do your hidden stuff in here
    }
    return 0;
}
The other answers given here overcomplicate things. The easiest way to close the console window on Windows is simply to detach from it. Once the last user of a console window detaches, the console window gets closed.
If you start a program from a CLI (e.g. cmd.exe), then this CLI is also attached to the console, and thus the console window will not close.
Anyway, detaching from a console is as simple as calling
FreeConsole();
… done!
You can also attach to another process's console at any time using AttachConsole, which takes a process ID. In a CLI situation the parent will usually be the CLI shell, so you may want to attach to that console.
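A minimal sketch putting the two together (assuming the parent process is the shell whose console you might want to reattach to later):
#include <windows.h>
#include <iostream>

int main() {
    std::cout << "Some information is displayed...\n";

    // Detach from the console; if this process was its last user,
    // the console window closes.
    FreeConsole();

    // ... do background work here ...

    // Optionally reattach to the parent's console (usually the shell that
    // started us). The standard streams may need to be reopened afterwards;
    // that part is omitted here.
    AttachConsole(ATTACH_PARENT_PROCESS);
    return 0;
}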
This is a weird one for sure.
If I open a command prompt window directly (searching cmd in start, right click > open command window here, cmd within bat file, etc....) all commands entered run perfectly fine.
If I open a command prompt window from within my C++ application (system("cmd"); or QProcess::startDetached("cmd"); etc....) the commands I enter throw errors.
Here are a few commands that don't work in the cmd opened from my app:
vssadmin delete shadows /all
vssadmin list shadows
wmic
shadowcopy
and so on... I get "Class not registered" and "Initialization failure" errors all around. Anything to do with shadow copies isn't working at all. But again, the weird thing is, those same commands work perfectly fine when cmd was opened traditionally (not from a program). Both instances of cmd have admin privileges.
So my question is, how come the way I open cmd affects whether or not some commands work? Everything I can see says there should be no difference.
32-bit applications running on WOW64 are subject to file system redirection. Therefore, if your app is a 32-bit one, the call system("c:\\windows\\system32\\cmd.exe"); will be redirected to C:\Windows\SysWOW64\cmd.exe and the 32-bit cmd will always be invoked. You have a few solutions:
Use system("c:\\windows\\sysnative\\cmd.exe"); to access the real system32 folder and get the 64-bit cmd
Turn off file system redirection explicitly (should be avoided in general)
Or, better, compile it as a 64-bit app.
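A minimal sketch of the first two options (error handling trimmed; disabling redirection is generally discouraged, as noted above):
#include <windows.h>
#include <cstdlib>

int main() {
    // Option 1: from a 32-bit process, "sysnative" maps to the real System32,
    // so the 64-bit cmd.exe is started.
    std::system("c:\\windows\\sysnative\\cmd.exe");

    // Option 2: temporarily disable WOW64 file system redirection,
    // then restore it as soon as possible.
    PVOID oldValue = NULL;
    if (Wow64DisableWow64FsRedirection(&oldValue)) {
        std::system("c:\\windows\\system32\\cmd.exe");   // now the 64-bit one
        Wow64RevertWow64FsRedirection(oldValue);
    }
    return 0;
}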
I'm working on implementing a self-updater for a daemon on OS X. The update is published as a .pkg file, so what I'm trying to do is as follows:
When the daemon is notified that an update is available, it calls installer via the system() call to install the package. The package contains a newer version of the daemon, a preupgrade script that stops the daemon (launchctl unload /Library/LaunchDaemons/foo.plist), and a postflight script that starts it back up after the new version is installed. The problem I'm having is that the installer process is quitting prematurely. I suspect that it may be because the installer kills its parent process in order to update it, and then gets killed itself instead of continuing as its own orphan process. I've tried the following with no luck:
Appending '&' to the installer command to run it in the background
Wrapping the installer command with nohup
The install command completes consistently without error when I run it from the command line, and fails consistently when run from the installer. When called from the installer, I'm piping the output to a file, and sometimes it has nothing, and sometimes it shows the install getting to about 41% completion before output stops. Any ideas on how I can figure out what's happening to the process or make sure it stays alive without its parent?
When you call launchctl unload, it kills the entire process group (unlike a simple kill). You want to move your subprocess into a separate process group. The easiest way is with the C call setsid().
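A minimal sketch of that approach in C (the installer invocation and package path are placeholders):
#include <unistd.h>
#include <sys/types.h>

// Sketch: run the installer in its own session so that unloading the daemon
// via launchctl does not take the installer down with it.
void run_installer_detached(void) {
    pid_t pid = fork();
    if (pid != 0)
        return;                 // parent (the daemon): carry on, or exit

    setsid();                   // child: new session, new process group
    execl("/usr/sbin/installer", "installer",
          "-pkg", "/tmp/update.pkg", "-target", "/", (char *)NULL);
    _exit(127);                 // only reached if exec fails
}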
If you're in the middle of a shell script, you should look at the following approaches. I haven't tried these, since I was dealing with a C program and could call setsid():
Prior to calling the installer, use set -m. This is supposed to turn on monitor mode, which says "Background processes run in a separate process group and a line containing their exit status is printed upon their completion."
Try this sub-interactive shell trick: New process group in shell script