I have looked at this question but I am not sure whether I understood it correctly.
I opened PyCharm and one Python script is running (it does topic modeling).
I also have another Python script, which I opened in another PyCharm window on the same server, and I ran it as well.
Now these two programs are running on the same server; I should mention that I have not changed any configuration on either the server or PyCharm.
Do you think this is OK? Or will one script technically not run (in terms of progressing, I mean it shows as running but practically makes no progress) until the other script has finished?
Edit Configurations -> Allow parallel run. Done
First, PyCharm will create independent processes on the server, so both scripts will run. You can check it with something like htop - search for processes and verify that they're running.
Second, you don't have to open a second PyCharm window to run the second script. You can run both of them from a single one. There are at least two ways: with run configurations, or by spawning multiple terminal windows and running the scripts from there.
From the Run/Debug Configurations window you can add a Compound configuration that contains multiple configurations that will run in parallel. The Allow parallel run option for child configurations makes no difference in this case.
The default behaviour was changed starting from version 2018.3. You can allow multiple runs by selecting Allow parallel run within the Edit Configurations menu.
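If you prefer to verify from code rather than htop, here is a minimal sketch using the third-party psutil package (an assumption: psutil is installed via pip install psutil). It lists the Python processes on the machine together with the script each one is running:

import psutil

# iterate over all processes and print the Python ones with their command lines
for proc in psutil.process_iter(attrs=["pid", "name", "cmdline"]):
    name = (proc.info["name"] or "").lower()
    if "python" in name:
        print(proc.info["pid"], " ".join(proc.info["cmdline"] or []))

If both scripts show up with their own PIDs, they are genuinely running in parallel.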
Related
We developed a Flask webapp and want to deploy it on IIS. During development, we started the app via flask run, which launches a single instance of our app. On IIS, however, we observed (via the task manager) that our app runs multiple instances concurrently.
The problem is that our app is not designed to run in parallel. For example, our app reads a file from the file system and keeps it in memory for efficiency. This optimization is correct only if it is guaranteed that no other process changes the content of the file.
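For concreteness, a hypothetical sketch of the pattern our app relies on (the file path and route are placeholders): the file is read once at module level and cached, which is only safe when a single process owns it.

from flask import Flask

app = Flask(__name__)

# read once at import time and keep in memory for efficiency
with open("data/lookup.json") as fh:   # placeholder path
    CACHED_DATA = fh.read()           # becomes stale or duplicated if several processes run

@app.route("/")
def index():
    return CACHED_DATA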
Is there a way to prevent IIS from starting multiple instances?
In IIS, you can go to FastCGI settings, in there you can see all the applications used by websites on your server. In the column "Max. Instances", the script you are talking about is probably set to 0 (or some value greater than 1), meaning it can be started multiple times. Limiting this to 1 will solve your problem.
You could use the code below to run only one instance of a program:
from tendo import singleton
me = singleton.SingleInstance() # will sys.exit(-1) if other instance is running
The command to install it:
pip install tendo
Reference links:
Check to see if python script is running
How to make only one instance of the same script running at any time
https://github.com/pycontribs/tendo/blob/master/tendo/singleton.py
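For illustration only, here is a simplified, hypothetical sketch of the same "single instance" idea using nothing but the standard library: bind a fixed localhost port as a process-wide mutex. This is not how tendo implements it (tendo uses a lock file, see the link above); it just shows the concept.

import socket
import sys

def ensure_single_instance(port=47200):       # 47200 is an arbitrary, assumed-free port
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        sock.bind(("127.0.0.1", port))        # only the first instance can bind
    except OSError:
        sys.exit(-1)                          # another instance already holds the port
    return sock                               # keep the socket alive for the process lifetime

if __name__ == "__main__":
    _lock = ensure_single_instance()
    print("running as the only instance")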
In WebStorm I can very easily set up a JavaScript Debug configuration, and when I run it the IDE opens Chrome and all breakpoints are active. The problem begins when I need to run specific tasks prior to starting debugging, for example an npm build script. When I define it in Before launch (see the picture below), Chrome is not opened when I start this debug configuration, but is opened after I stop it.
This forces me to manually run the project from the command line and then start the browser debug session.
Can I define the additional tasks in a way that Chrome is opened as usual?
Thank you.
A process added to the Before launch section has to return an exit code; the main process waits for it and doesn't start until that first process terminates. This is the way Before launch is designed: it's supposed to run some sort of pre-processing before the main process starts. You can add a build task (a script that builds your app and then exits) to this section, but start:dev likely doesn't exit; it starts the server your application is hosted on, and it has to keep running for your application to work, doesn't it? Please remove your npm script from Before launch and either start it separately or use a Compound configuration to start both the npm script and the JavaScript Debug run configuration concurrently.
I am trying to find a way to run my Node.js app in the background. I did a lot of research and I am aware of the existing options (node-windows, forever, nssm, ...).
During this, the idea came to my mind to create my OWN service wrapper in C++ which executes the script as a child process (on Windows).
Therefore my question: is it possible? And what are the possibilities to communicate with the node.exe that is executing my script? On Google I find tons of articles about Node's "child_process" module, but nothing where node.exe itself is the child process.
BTW: in one of the answers here on SO I found a solution using sc.exe, but when I install node.exe with the script as a service, it gets terminated because it does not respond to the SCM commands. Did this ever work?
Thank you a lot in advance.
You can make the process run in the background using pm2:
pm2 start app.js --watch
This will start the process and also watch for changes in the file. More about the watch flag
At work we fully test the GUI components. The problem arises from the fact that, while the test suite is running, the various components pop up, stealing the focus or making it impossible to continue working. The first thing I thought of was Xnest, but I was wondering if there's a more elegant solution to this problem.
I think what you need to do here is have your tests run on a different Display than the one you're working on.
When we moved our TeamCity agents to EC2, we had to figure out a solution to running our UI unit tests on a headless Linux server. I found a way to do it in this blog post, which outlines how to use Xvfb.
For my case, all I had to do was:
yum install xorg-x11-server-Xvfb
Xvfb :100 -ac to run the server. I added this to my rc.local file on my EC2 agents to start it at machine startup.
Then I added env.DISPLAY :100 to my TeamCity build configuration.
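If you drive the tests from Python, the same steps can be scripted; here is a minimal, hypothetical sketch (it assumes Xvfb is already installed via the yum command above, and the pytest invocation is a placeholder for whatever test runner you actually use):

import os
import subprocess
import time

# start a headless X server on display :100, same as the rc.local entry above
xvfb = subprocess.Popen(["Xvfb", ":100", "-ac"])
time.sleep(1)                    # give Xvfb a moment to come up
os.environ["DISPLAY"] = ":100"   # equivalent of setting env.DISPLAY in TeamCity
try:
    subprocess.run(["python", "-m", "pytest", "tests/"], check=True)  # placeholder test command
finally:
    xvfb.terminate()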
I have a weird problem with Django-Kronos.
I've got it running successfully on my local machine and on our development server. However, on the production server, I can't get kronos to acknowledge my cron.py file. When I run installtasks, it runs but says "0 tasks installed". I've also tried running the tasks manually and kronos tells me the task doesn't exist.
We use git to push everything through to the server, so all the files and the structures are identical between the three locations. I've also checked and the cron.py file exists and has exactly the same content as the working servers.
The only differences between the servers are that the production server is running Postgres (SQLite on the dev server) and it's Ubuntu 12.10, whereas the dev server is 12.01.
Kronos is functioning properly, but it's not picking up our cron.py file for some reason....
Anyone got any ideas?!
Well, unfortunately, our solution was to scrap Django-Kronos altogether and create a custom management command which we're running from the crontab.
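For anyone in the same situation, a minimal sketch of what such a management command looks like (the app name and command name below are hypothetical):

# myapp/management/commands/run_scheduled_tasks.py
from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = "Runs the work that used to live in cron.py"

    def handle(self, *args, **options):
        # put the task logic here
        self.stdout.write("tasks finished")

The crontab then calls manage.py run_scheduled_tasks at whatever interval you need.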
This happens when one of the imports you are trying to make is not available: your production system might be missing a Python package that is imported in your cron.py.
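A quick way to surface the hidden ImportError is to import the module directly on the production box, for example from manage.py shell (the app name below is a placeholder):

import importlib
importlib.import_module("myapp.cron")   # any ImportError raised here points at the missing package

Whatever package that import complains about is the one missing from production.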