Waiting on another python process to continue - python-2.7

Python version: Python 2.7.13
I am trying to write a script that goes through a *.txt file and launches a batch file to execute a particular test.
The code below goes through the input file and changes the string 'N' to 'Y', which allows the particular test to be executed. I am in the process of creating a for loop to go through all the lines within the *.txt file and execute all the tests in sequence. However, my problem is that I do not want to execute the tests at the same time (which is what would happen if I simply looped over the test code).
Is there a way to wait until the initial test is finished to launch the next one?
Here is what I have so far:
from subprocess import Popen
import os

path = r'C:/Users/user1/Desktop/MAT'

# Remove leftover fort* files from a previous run
for fname in os.listdir(path):
    if fname.startswith("fort"):
        os.remove(os.path.join(path, fname))

# Flip the flag for this test from 'N' to 'Y'
with open('RUN_STUDY_CHECKLIST.txt', 'r') as file:
    data = file.readlines()
ln = 4
ch = list(data[ln])
ch[48] = 'Y'
data[ln] = "".join(ch)
with open('RUN_STUDY_CHECKLIST.txt', 'w') as file:
    file.writelines(data)

# Launch the test; communicate() waits for run.bat to finish
matexe = Popen('run.bat', cwd=r"C:/Users/user1/Desktop/MAT")
stdout, stderr = matexe.communicate()
In this particular instance I am changing the 'N' in line 2 of the *.txt file to a 'Y', which will be used as an input for another python script.
I should mention that I would like to do this task without having to interact with any prompt: I would like to execute the script and leave it running (since it would take a long time to go through all the tests).
Best regards,
Jorge

After looking through several websites I managed to find a solution to my question.
I used:
import subprocess

exe1 = subprocess.Popen(['python', 'script.py'])
exe1.wait()
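Applied to the batch file from my original question, the same pattern runs each test to completion before starting the next. A minimal sketch (the range of checklist lines below is hypothetical - adjust it to your file):
import subprocess

for ln in range(4, 10):  # hypothetical range of test lines in the checklist
    # Enable the test on line ln by flipping its flag to 'Y'
    with open('RUN_STUDY_CHECKLIST.txt', 'r') as f:
        data = f.readlines()
    ch = list(data[ln])
    ch[48] = 'Y'
    data[ln] = ''.join(ch)
    with open('RUN_STUDY_CHECKLIST.txt', 'w') as f:
        f.writelines(data)

    # Launch the test and block until it finishes
    exe = subprocess.Popen('run.bat', cwd=r'C:/Users/user1/Desktop/MAT')
    exe.wait()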
I wanted to post the answer just in case this is helpful to anyone.

Related

How to output to command line when running one python script from another python script

I have multiple python scripts, each with print statements and prompts for input. I run these scripts from a single python script as below.
os.system('python script1.py ' + sys.argv[1])
os.system('python script2.py ' + sys.argv[1]).....
Each run completes successfully; however, when I run all the scripts from a single file, I no longer see any print statements or prompts for input on the run console. I have researched and attempted many different approaches without success. Help would be much appreciated. Thanks.
If I understand correctly you want to run multiple python scripts synchronously, i.e. one after another.
You could use a bash script instead of python, but to answer your question of starting them from python...
Check out the subprocess module: https://docs.python.org/3.4/library/subprocess.html
In particular the call function: it accepts stdin and stdout arguments, to which you can pass sys.stdin and sys.stdout.
import sys
import subprocess
subprocess.call(['python', 'script1.py', sys.argv[1]], stdin=sys.stdin, stdout=sys.stdout)
subprocess.call(['python', 'script2.py', sys.argv[1]], stdin=sys.stdin, stdout=sys.stdout)
The above will work in Python 2.7 and 3. Another way of doing this is by importing your file (module) and calling the methods in it. The difference here is that you're no longer running the code in a separate process.
subroutine.py
def run_subroutine():
    name = input('Enter a name: ')
    print(name)
master.py
import subroutine
subroutine.run_subroutine()

How do I store the output into a variable or into a .txt file?

I am running the following code in Python 2.7:
values = os.system("bazel build tensorflow/examples/image_retraining:"
"label_image && bazel-bin/tensorflow/examples/image_retraining/label_image "
"--graph=/tmp/output_graph.pb --labels=/tmp/output_labels.txt "
"--output_layer=final_result:0 --image=$HOME/Desktop/Image-3/image1.png")
print values
But the values variable just comes back as 0. I believe this means that I am not getting any errors. How do I store the output in a variable or in a .txt file?
os.system returns the command's exit status (0 means it succeeded), not its output. To get the output into a file, you can simply redirect the output of the system call by appending > output.txt to your command.
The output of the command will then be in the file output.txt in the directory where you invoke the command (likely the same one you invoke your python script in).
Since I can't readily reproduce your command, I used a simple example - try switching to Popen from the subprocess module:
from subprocess import Popen
proc = Popen(['ls', '-t'], stdout=open('/path/redir.txt', 'w'))
Here you run the command in the square brackets and redirect its stdout, i.e. what would normally go to the terminal, to the file redir.txt.
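If you want the output in a Python variable instead, subprocess.check_output (available since Python 2.7) returns the command's stdout as a string; a small sketch using the same ls example:
import subprocess

# check_output runs the command, waits for it, and returns its stdout;
# it raises CalledProcessError if the command exits with a non-zero status
out = subprocess.check_output(['ls', '-t'])
print out

# ...or write the captured output to a file yourself
with open('redir.txt', 'w') as f:
    f.write(out)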

Open file inside Python

How can I open a file once I am inside Python, that is, once I have typed "python" in the terminal? I know how to open a file by typing something similar to the following in a script, and then running it:
from sys import argv
script, filename = argv
txt = open(filename)
print txt.read()
But I have no idea how to do it once I'm inside the Python interpreter. I've tried to type open (file.txt) and also open ("file.txt"), but I get a long error message either way.
What is the correct way to do this?
Pass the filename as a string, and optionally a mode: txt = open('filename.txt', 'r') for reading ('r' is the default; use 'w' or 'a' for writing or appending). open(file.txt) fails because the filename must be a string, and open("file.txt") raises an IOError if file.txt does not exist in the current directory. I just tried it, it works :)
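For example, at the interactive prompt (this assumes a file named file.txt exists in the directory where you started python):
>>> txt = open('file.txt', 'r')   # 'r' is optional; reading is the default
>>> contents = txt.read()
>>> print contents
>>> txt.close()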

Run LIWC as external program to python - subprocess

I would like to run LIWC (installed in my Mac) within a python 2.7 script.
I have been reading about subprocess (popen and check_output seem the way to go), but I do not get the syntax for:
opening the program;
getting a text file to be analysed;
running the program;
getting the output (analysis) and storing it in a text file.
This is my first time using subprocess - is this possible?
I appreciate the suggestions.
EDIT
This is the closest to implementing a solution (still does not work):
I can open the application.
subprocess.call(['open', '/file.app'])
But I cannot make it process the input file and produce an output file.
subprocess.Popen(['/file.app', '-input', 'input.txt', '-output', 'output.txt'])
Nothing comes out of this code.
EDIT 2
After reading dozens of posts, I am still very confused about the syntax for the solution.
Following How do I pipe a subprocess call to a text file?
I came out with this code:
g = open('in_file.txt', 'rb', 0)
f = open('out_file.txt', 'wb')
subprocess.call(['open', 'file.app'], stdin=g, stdout=f)
The output file comes out empty.
EDIT 3
Following http://www.cplusplus.com/forum/unices/40680/
When I run the following shell script on the Terminal:
cat input.txt | /Path/LIWC > output.txt
The output txt file is empty.
EDIT 4
When I run:
subprocess.check_call(['/PATH/LIWC', 'PATH/input.txt', 'PATH/output.txt'])
It opens LIWC, does not create an output file and freezes.
EDIT 5
When I run:
subprocess.call(['/PATH/LIWC', 'PATH/input.txt', 'PATH/output.txt'])
It runs LIWC, creates an empty output.txt file and freezes (the process does not end).
The problem with using 'open' in subprocess.call(['open', 'file.app'], stdin=g, stdout=f) is that it requests that the file be opened through a system service, so the launched application is not directly attached to your python process and your stdin/stdout redirection never reaches it. You'll need to instead invoke the LIWC executable by its path. I'm not sure that it supports reading from stdin, though, so you might even need to pass in the path to the file you'd like it to open.
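A minimal sketch of that approach (the path to the binary inside the .app bundle and the use of stdin/stdout redirection are assumptions - check LIWC's documentation for its actual command-line interface):
import subprocess

# Path to the actual executable inside the app bundle (hypothetical)
liwc = '/Applications/LIWC.app/Contents/MacOS/LIWC'

# Feed the input file on stdin and capture stdout, assuming LIWC supports this
with open('input.txt', 'rb') as g, open('output.txt', 'wb') as f:
    subprocess.call([liwc], stdin=g, stdout=f)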

Condor output file updating

I'm running several simulations using Condor and have coded the program so that it outputs a progress status to the console. This is done at the end of a loop where it simply prints the current time (this could also be a percentage or the elapsed time). The code looks something like this:
printf("START");
while (programNeedsToRun) {
    // Run repetitive code...

    // Print program status update
    printf("[%i:%i:%i]\r\n", hours, minutes, seconds);
}
printf("FINISH");
When executed normally (i.e. in the terminal/cmd/bash) this works fine, but on the condor nodes the printf() output doesn't seem to appear as it happens. Only once the simulation has finished are all the status updates written to the output file, but by then they are no longer of use. My *.sub file that I submit to condor looks like this:
universe = vanilla
executable = program
output = out/out-$(Process)
error = out/err-$(Process)
queue 100
When submitted, the program executes (this is confirmed in condor_q) and the output files contain this:
START
Only once the program has finished running does its corresponding output file show (for example):
START
[0:3:4]
[0:8:13]
[0:12:57]
[0:18:44]
FINISH
While the program executes, the output file only contains the START text. So I came to the conclusion that the file is not updated while the node executing the program is busy. So my question is: is there a way of updating the output files manually, or a better way to gather information on the program's progress?
Thanks already
Max
What you want to do is use the streaming output options. See the stream_error and stream_output options you can pass to condor_submit as outlined here: http://research.cs.wisc.edu/htcondor/manual/current/condor_submit.html
By default, HTCondor stores stdout and stderr locally on the execute node and transfers them back to the submit node on job completion. Setting stream_output to TRUE will ask HTCondor to instead stream the output as it occurs back to the submit node. You can then inspect it as it happens.
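Applied to the submit file from the question, that would look something like this (only the two stream_* lines are new; the option names are as documented in the condor_submit manual):
universe = vanilla
executable = program
output = out/out-$(Process)
error = out/err-$(Process)
stream_output = TRUE
stream_error = TRUE
queue 100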
Here's something I used a few years ago to solve this problem. It uses condor_chirp which is used to transfer files from the execute host to the submitter. I have a python script that executes the program I really want to run, and redirects its output to a file. Then, periodically, I send the output file back to the submit host.
Here's the Python wrapper, stream.py:
#!/usr/bin/python
import os, sys, time

os.environ['PATH'] += ':/bin:/usr/bin:/cygdrive/c/condor/bin'

# make sure the output file exists
open(sys.argv[1], 'w').close()

pid = os.fork()
if pid == 0:
    # child: run the real program, redirecting its output to the file
    os.system('%s > %s' % (' '.join(sys.argv[2:]), sys.argv[1]))
else:
    # parent: periodically ship the output file back to the submit host
    while True:
        time.sleep(10)
        os.system('condor_chirp put %s %s' % (sys.argv[1], sys.argv[1]))
        try:
            os.wait4(pid, os.WNOHANG)
        except OSError:
            break
And my submit script. The job ran sh hello.sh and redirected the output to myout.txt:
universe = vanilla
executable = C:\cygwin\bin\python.exe
requirements = Arch=="INTEL" && OpSys=="WINNT60" && HAS_CYGWIN==TRUE
should_transfer_files = YES
transfer_input_files = stream.py,hello.sh
arguments = stream.py myout.txt sh hello.sh
transfer_executable = false
It does send the output in its entirety each time, so take that into account if you have a lot of jobs running at once. Currently it's sending the output every 10 seconds; you may want to adjust that.
With condor_tail you can view the output of a running job.
To see stdout, just pass the job ID (add -f if you want to follow the output and see updates immediately). Example:
condor_tail 314.0 -f