Python : Use Popen and Communicate and get the stdout in a loop - python-2.7

I want to execute a bash command and get its output, and I decided to use Popen and communicate() to do this. The job has to be done in a loop. The problem is that in the first round of the loop everything is OK, but in the following iterations I get an error when calling communicate(). I know that communicate() can only be used once per process, but I create a new subprocess on each iteration, so calling communicate() again should be possible.
Here's the Code:
with open('ComSites2.txt') as file:
    for site in file:
        sf = site.split()
        temp = "http://wwww." + sf[0]
        command = "wget --spider -S '%s' 2>&1 | grep 'HTTP/' | awk '{print $2}'" % temp
        print command
        output = subprocess.Popen(command, stdout=subprocess.PIPE, shell=True)
        StatusCode = int(output.communicate()[0])
        if StatusCode == 200:
            print "page found successfully"
        elif StatusCode == 404:
            print "Page not found!!"
        else:
            print "no result got!!"
When I run this, I get this output:
wget --spider -S 'http://wwww.ACAServices.com' 2>&1 | grep 'HTTP/' | awk '{print $2}'
page found successfully
wget --spider -S 'http://wwww.ACI.com' 2>&1 | grep 'HTTP/' | awk '{print $2}'
Traceback (most recent call last):
File "/root/AghasiTestCases/URLConnector.py", line 17, in <module>
StatusCode = int(output.communicate()[0])
ValueError: invalid literal for int() with base 10: ''
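For what it's worth, the traceback points at the command's output rather than at communicate() itself: a fresh Popen is created on every iteration, so one communicate() call per process is fine, but when wget cannot resolve a host the grep/awk pipeline prints nothing and int('') raises the ValueError. A minimal sketch of one way to guard against the empty output (Python 2 syntax to match the question):
output = subprocess.Popen(command, stdout=subprocess.PIPE, shell=True)
raw = output.communicate()[0].strip()
if not raw:
    # wget produced no 'HTTP/' line, e.g. the host did not resolve
    print "no result got!!"
    continue
# a redirect can produce several status lines; keep the first one
StatusCode = int(raw.splitlines()[0])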

Related

gdb - assign pipe output to a variable

I have the following command, and I would like to assign its output to a variable:
(gdb) pipe monitor get info | grep cross2_Release.nss | cut -c 3-13
0x566f80400
I tried this:
set $main = 'pipe monitor get info | grep cross2_Release.nss | cut -c 3-14'
No symbol table is loaded. Use the "file" command.
The expected result would be to have $main equal to 0x566f80400.
At first, I thought this would work:
python output = gdb.execute('pipe monitor get info | grep cross2_Release.nss | cut -c 3-13', to_string=True)
python gdb.set_convenience_variable('main', gdb.parse_and_eval(output))
But this fails because the pipe command sends output to GDB's stdout rather than to a stream that gdb.execute can capture and put into a string. So here's something a little more crude:
shell printf 'set $main = ' > ~/.gdbtmp
pipe monitor get info | grep cross2_Release.nss | cut -c 3-13 >> ~/.gdbtmp
source ~/.gdbtmp
Using the Python subprocess.check_output() function, it's relatively simple to do that:
py gdb.set_convenience_variable('main', gdb.parse_and_eval(subprocess.check_output('grep cross2_Release.nss | cut -c 3-13', shell=True, input=gdb.execute('monitor get info', to_string=True), text=True)))
But that's a lot to type, so I would pack that into a gdb convenience function:
import subprocess

class Pipe(gdb.Function):
    def __init__(self):
        super(Pipe, self).__init__("pipe")

    def invoke(self, gdb_cmd, pipe_cmd):
        return gdb.parse_and_eval(subprocess.check_output(
            pipe_cmd.string(), shell=True,
            input=gdb.execute(gdb_cmd.string(), to_string=True),
            text=True))

Pipe()
Which then could be used like this:
set $main = $pipe("monitor get info", "grep cross2_Release.nss | cut -c 3-14")
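The check_output(..., input=..., shell=True, text=True) pattern used above is not gdb-specific; a small standalone sketch (the monitor output line here is made up so that cut -c 3-13 extracts the address) may make the data flow clearer:
import subprocess

# hypothetical stand-in for gdb.execute('monitor get info', to_string=True)
monitor_output = "  0x566f80400 cross2_Release.nss\n"

value = subprocess.check_output(
    "grep cross2_Release.nss | cut -c 3-13",  # the shell side of the pipe
    shell=True,
    input=monitor_output,                     # becomes the pipeline's stdin
    text=True)                                # str in / str out instead of bytes
print(value.strip())                          # -> 0x566f80400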

Creating an alert function in Bash

I wanted to create a function in bash similar to a default alias I got in Ubuntu, looking like:
alias alert='notify-send --urgency=low -i "$([ $? = 0 ] && echo terminal || echo error)" "$(history|tail -n1|sed -e '\''s/^\s*[0-9]\+\s*//;s/[;&|]\s*alert$//'\'')"'
This creates a simple notification after a command has been issued with it.
For example, using
history | grep vim; sleep 5; alert
gives a notification after the sleep is done, simply saying
history | grep vim; sleep 5;
I would like to write the alert as a bash function instead, which has given me some trouble with the regex.
I have tried:
function alert2 () {
    ICON=$([ $? = 0 ] && echo terminal || echo error)
    MSG=$(history | tail -n1 | sed -e s/^\s*[0-9]\+\s*//\;s/[\;\&\|]\s*alert$//)
    notify-send --urgency=low -i $ICON $MSG
}
which outputs the line number from history when called by itself, and gives an "Invalid number of options" error when called as in the first example.
Is this possible, and if so, how? Is it simply my regex that is faulty?
I'm running on WSL, so I don't have notify-send installed:
function alert2 () {
    ICON=$([ $? = 0 ] && echo terminal || echo error);
    MSG=$(history | tail -n1 | sed -e 's/^\s*[0-9]\+\s*//;s/[;&|]\s*alert2$//');
    echo $ICON $MSG;
}
jadams@Temp046317:~/code/data-extract$ cat /etc/apt/sources.list > /dev/null ; alert2
terminal cat /etc/apt/sources.list > /dev/null
I'm hoping that this would work for you (instead of the echo):
notify-send --urgency=low -i "$ICON" "$MSG"

Create conditional bash script code

I use the following code to load a text file with emails
and create users in the system with a user password.
The text file contains emails like the following:
abc@gmail.com
BDD@gmail.com
ZZZ@gmail.com
In case the name comes in upper case, I convert it to lower case; I was able to make that work.
Now I need to support another input instead of email
e.g.
P123456
Z877777
but for this type of input I don't want to convert it to lower case,
something like:
if (email pattern)
    convert to lower case
else
    do not convert
This is the code that currently works, but I failed to make it handle the new input:
for user in $(cat ${users} | awk -F";" '{ print $1 }'); do
    user=$(echo ${user} | tr "[:upper:]" "[:lower:]")
    log "cf create-user ${user} ${passwd}"
    #Here we are creating email user in the sys
    cf create-user ${user} ${passwd} 2>&1 |
        tee -a ${dir}/${scriptname}.log ||
        { log "ERROR cf create-user ${user} failed" ;
          errorcount=$[errorcount + 1]; }
done
You can use:
while IFS= read -r user; do
    # convert to lowercase only when $user has a @ character
    [[ $user == *@* ]] && user=$(tr "[:upper:]" "[:lower:]" <<< "$user")
    log "cf create-user ${user} ${passwd}"
    cf create-user ${user} ${passwd} 2>&1 |
        tee -a ${dir}/${scriptname}.log ||
        { log "ERROR cf create-user ${user} failed" ;
          errorcount=$[errorcount + 1]; }
done < <(awk -F ';' '{ print $1 }' "$users")
Assumptions:
input file consists of email addresses or names, each on a separate line
email addresses are to be converted to lower case
names are to be left as is (ie, no conversion to lower case)
all of the log/cf/tee/errorcount code functions as desired
Sample input file:
$ cat userlist
abc@gmail.com
BDD@gmail.com
ZZZ@gmail.com
P123456
Z877777
We'll start by using awk to conditionally convert email addresses to lower case:
$ awk '/@/ {$1=tolower($1)} 1' userlist
abc@gmail.com
bdd@gmail.com
zzz@gmail.com
P123456
Z877777
first we'll run the input file (userlist) through awk ...
/@/ : for lines that include an email address (ie, contains @) ...
$1=tolower($1) : convert the email address (field #1) to all lowercase, then ...
1 : true for all records and implies print all inputs to output
Now pipe the awk output to a while loop to perform the rest of the operations:
awk '/@/ {$1=tolower($1)} 1' userlist | while read user
do
    log "cf create-user ${user} ${passwd}"
    #Here we are creating email user in the sys
    cf create-user ${user} ${passwd} 2>&1 |
        tee -a ${dir}/${scriptname}.log ||
        { log "ERROR cf create-user ${user} failed" ;
          errorcount=$((errorcount + 1)) ;
        }
done
updated to correctly increment errorcount by 1
bash can lower-case text:
while IFS= read -r line; do
    [[ $line == *@* ]] && line=${line,,}
    # do stuff with "$line"
done

Is there a way to pipe binary stream to remote server using ssh and Python?

NOTE: I am not interested in any modules like Paramiko.
I'm trying to save some binary data on a remote server without creating a local file.
As a test I read from a file, but later I'll replace it with a data feed:
ps = subprocess.Popen(['cat', "/delta/ftp/GSM.PRICINT_TBL.dmp"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
Next, I want to ssh the data to the remote server:
ssh = subprocess.Popen(["ssh", '-XC', '-c', 'blowfish-cbc,arcfour', 'deltadmin@archiveserver', 'echo - >/tmp/test.log'],
                       shell=False,
                       stdin=ps.stdout,
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE)
result = ssh.stdout.readlines()
if result == []:
    error = ssh.stderr.readlines()
    print >>sys.stderr, "ERROR: %s" % error
else:
    print result
I use '-' so cat can accept standard input.
The expected result is the data in /tmp/test.log, but I see only
'-\n'
Any idea how to make it work?
I figured it out:
echo 'test' | ssh -XC -c blowfish-cbc,arcfour bicadmin@nitarchive -T 'gzip - >/tmp/test.gz'
then on remote server:
zcat /tmp/test.gz
test
For cat we need a space after the redirect:
cat - > /tmp/test.txt
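Translated back into subprocess terms, the same idea is to hand the data to ssh's stdin and let the remote command read it from there; a rough sketch (remote host and paths are taken from the question, so adjust as needed):
import subprocess
import sys

# stream the payload straight into ssh's stdin; the remote
# 'cat - > /tmp/test.txt' writes whatever arrives on stdin
ssh = subprocess.Popen(["ssh", "-XC", "-c", "blowfish-cbc,arcfour",
                        "deltadmin@archiveserver", "cat - > /tmp/test.txt"],
                       stdin=subprocess.PIPE,
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE)

with open("/delta/ftp/GSM.PRICINT_TBL.dmp", "rb") as src:
    out, err = ssh.communicate(src.read())

if ssh.returncode != 0:
    print >>sys.stderr, "ERROR: %s" % err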

passing Python variable to triple quoted curl command

SIZ=100
imap_cmd="""
curl -s -X GET --insecure -u xxx https://xxxxx/_search?pretty=true -d '{
"from":0,
"size":%SIZ,
"query":{ "match_all": {} },
"_source":["userEmail"]
}' | grep -i userEmail|awk {'print $3'} | cut -d ',' -f1
"""
def run_cmd(cmd):
    p = Popen(cmd, shell=True, stdout=PIPE)
    output = p.communicate()[0]
    return output
I'm trying to pass the SIZ (Python) variable to the curl command, but it is not interpolating the value when I execute the command. What am I missing here?
It looks like you're trying to use the % formatter in this line,
"size":%SIZ,
try
imap_cmd="""
curl -s -X GET --insecure -u xxx https://xxxxx/_search?pretty=true -d '{
"from":0,
"size":%d,
"query":{ "match_all": {} },
"_source":["userEmail"]
}' | grep -i userEmail|awk {'print $3'} | cut -d ',' -f1
""" % SIZ
Here is more info on formatting strings.
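With imap_cmd already formatted as above, the helper from the question can run it unchanged; a short usage sketch (endpoint and credentials stay the placeholders from the question):
from subprocess import Popen, PIPE

def run_cmd(cmd):
    # run the shell pipeline and return whatever it printed on stdout
    p = Popen(cmd, shell=True, stdout=PIPE)
    return p.communicate()[0]

emails = run_cmd(imap_cmd)   # imap_cmd already had %d replaced by SIZ
print emails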