SSH to singleconnect (SSO) server using paramiko - python-2.7

I wrote a script that uses SSH to connect to a SingleConnect (single-sign-on) server, and from there I would like to connect to a node. My problem is that when I send a command through the SSH invoke_shell, I can see it in the output but the command is not executed. Any help related to SingleConnect and paramiko would be much appreciated.
#!/usr/bin/env python
import paramiko
import time
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("host",port=2222,username="username",password="pass")
remote_con = ssh.invoke_shell()
print "Interactive SSH session established"
output = remote_con.recv(5000)
remote_con.send("device_ip_here_to_connect_from_SSO \n")
print output
time.sleep(3)
output = remote_con.recv(5000)
print output

Make sure you use send('command\n') instead of send('command') when sending the command.

I found the solution.
The problem was sending the command like this:
remote_con.send("device_ip_here_to_connect_from_SSO \n")
It should be like this:
remote_con.send("device_ip_here_to_connect_from_SSO \r")
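For reference, a minimal sketch of the full corrected session, reusing the placeholder host, port and credentials from the question:
#!/usr/bin/env python
import time
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("host", port=2222, username="username", password="pass")

remote_con = ssh.invoke_shell()
print "Interactive SSH session established"
print remote_con.recv(5000)  # banner/prompt from the SSO server

# End the line with \r so the SingleConnect shell actually executes it
remote_con.send("device_ip_here_to_connect_from_SSO \r")
time.sleep(3)
print remote_con.recv(5000)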

Related

How can I send an email from Apache Superset?

I'm facing an "Authentication unsuccessful" error when I try to send mail from Apache Superset.
I checked the documentation's SMTP integration chapter: https://apache-superset.readthedocs.io/en/latest/installation.html
I changed the SMTP_PASSWORD parameter and entered the password manually in my config.py file, like SMTP_PASSWORD = "'xxx!!'" (I added the extra quotes because my password includes special characters like '!'). I also tried SMTP_PASSWORD = "xx!!", but I still get the error:
smtplib.SMTPAuthenticationError: (535, b'5.7.3 Authentication unsuccessful')
I can log in with my SMTP user via the browser (Exchange). Could you please help me solve this issue?
Apache Superset version: 1.3.2
Python version:3.8
OS version: RHEL 8
Thanks
You might be able to poke at this by checking your SMTP connection from inside your Superset instance.
From where your Superset instance should be sending the emails, start a Python script and try:
import smtplib
server = smtplib.SMTP('smtpservername.yourdomain', yourportnumber)
If that command fails or hangs indefinitely, you know there's a connection problem. You might try connecting to the GMail SMTP server or another known-to-work set of credentials and seeing if that works to narrow it down.
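If the plain connection works, the next thing I would check is the login step itself, since that is where the 535 comes from. A rough sketch, assuming a STARTTLS setup on port 587 with placeholder host and credentials (adjust to your Exchange server):
import smtplib

server = smtplib.SMTP('smtpservername.yourdomain', 587)  # placeholder host and port
server.ehlo()
server.starttls()  # drop this if your server does not use STARTTLS
server.ehlo()
server.login('your_smtp_user', 'xxx!!')  # same values as SMTP_USER / SMTP_PASSWORD in config.py
server.quit()
If login() raises the same 535 error here, the problem lies between the credentials and Exchange (for example, a disabled SMTP AUTH mechanism) rather than anything in Superset's configuration.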
For instance, if running Superset with docker-compose you would enter the worker container with docker exec -it superset_worker /bin/bash, run python, and try this there.
My scheduled reports were failing and doing this helped me isolate a cause: my Superset instance could not connect to the target unauthenticated SMTP server.

Shell Script stops after connecting to external server

I am in the process of trying to automate deployment to an AWS server as a cool project for my coding course. I'm using a shell script to automate the different processes, but I have a problem when connecting to the AWS EC2 Ubuntu server: once connected, the script will not run any other shell command until I close the connection. Is there any way to have it continue sending commands while connected?
read -p "Enter Key Name: " KEYNAME
read -p "Enter Server IP With Dashes: " IPWITHD
chmod 400 $KEYNAME.pem
ssh -i "$KEYNAME.pem" ubuntu#ec2-$IPWITHD.us-east-2.compute.amazonaws.com
ANYTHING HERE AND BELOW WILL NOT RUN UNTIL SERVER IS DISCONNECTED
A couple of basic points:
A shell script is a sequential set of commands for the shell to execute. It runs a program, waits for it to exit, and then runs the next one.
The ssh program connects to the server and tells it what to do. Once it exits, you are no longer connected to the server.
The instructions that you put in after ssh will only run when ssh exits. Those commands will then run on your local machine instead of the server you are sshed into.
So what you want to do instead is to run ssh and tell it to run a set of steps on the server, and then exit.
Look at man ssh. It says:
ssh destination [command]
If a command is specified, it is executed on the remote host instead of a login shell
So, to run a command like echo hi, you use ssh like this:
ssh -i "$KEYNAME.pem" ubuntu#ec2-$IPWITHD.us-east-2.compute.amazonaws.com "echo hi"
Or, for longer commands, use a bash heredoc:
ssh -i "$KEYNAME.pem" ubuntu#ec2-$IPWITHD.us-east-2.compute.amazonaws.com <<EOF
echo "this will execute on the server"
echo "so will this"
cat /etc/os-release
EOF
Or, put all those commands in a separate script and pipe it to ssh:
cat commands-to-execute-remotely.sh | ssh -i "$KEYNAME.pem" ubuntu@ec2-$IPWITHD.us-east-2.compute.amazonaws.com
Definitely read What is the cleanest way to ssh and run multiple commands in Bash? and its answers.

Python connect to Hadoop Kerberos with username-password

I need to connect to a Hadoop cluster that uses Kerberos authentication, from Python 2.7.15. First I connect to the Hadoop edge node via SSH:
import paramiko
client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.WarningPolicy)
client.connect("myhost", port=22, username="myusername_1", password="mypwd_1")
Second, I request the Kerberos ticket. Because of company security policy I use different credentials (here "myusername_2" and "mypwd_2").
kinitcommand = "echo mypwd_2 | kinit myusername_2@MYDOMAIN"
stdin, stdout, stderr = client.exec_command(kinitcommand)
Third, I can run a command such as "ls":
lscommand = "hadoop fs -ls"
stdin, stdout, stderr = client.exec_command(lscommand)
What I am looking for is a Python library that allows the connection described above without going through SSH. I tried hdfs, but the class hdfs.ext.kerberos.KerberosClient does not accept user credentials as input, as the hdfs webpage shows. Thank you in advance.
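For reference, here is the SSH-based workaround above gathered into a single script (same placeholder host and credentials as in the question), including reading the command output:
import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.WarningPolicy)
client.connect("myhost", port=22, username="myusername_1", password="mypwd_1")

# Obtain a Kerberos ticket with the second set of credentials
stdin, stdout, stderr = client.exec_command("echo mypwd_2 | kinit myusername_2@MYDOMAIN")
stdout.channel.recv_exit_status()  # wait for kinit to finish before running hadoop

# Run an HDFS command once the ticket is in the credential cache
stdin, stdout, stderr = client.exec_command("hadoop fs -ls")
print stdout.read()

client.close()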

snmpd pass to run python

I'm trying to query a Modbus device through SNMP, using the snmpd pass directive to run a Python script that retrieves the data. This is my pass script, /usr/local/bin/volt.sh:
#! /bin/bash
if [ "$1" = "-g" ]
then
echo .1.3.6.1.4.1.52612.10.3.1
echo string
python /usr/local/bin/readvolt.py
fi
exit 0
And this is what readvolt.py looks like:
#!/usr/bin/python
import minimalmodbus
eqp = minimalmodbus.Instrument('/dev/ttyUSB0',1) # port name, slave address (in decimal)
# skip some other lines for serial port initialization
volt = eqp.read_float(0,4,2) # getting data from serial modbus
print volt
And this is the line from my snmpd.conf:
pass .1.3.6.1.4.1.52612.10.3.1 /bin/sh /usr/local/bin/volt.sh
My question: I get a traceback from Python saying it could not find the minimalmodbus module, but when I run readvolt.py directly on the host, it works as expected (it prints out the voltage reading):
pi@raspberrypi:/usr/local/bin $ readvolt.py
220.25
I also tried using a simple Python script (test.py), just to make sure snmpd pass can run a Python script in response to an snmpget from the SNMP manager:
#!/usr/bin/python
import sys
print "test"
It runs OK:
suryo@r50e:~$ snmpwalk -v2c -c public 192.168.1.5 .1.3.6.1.4.1.52612.10.3.1
iso.3.6.1.4.1.52612.10.3.1 = STRING: "test"
suryo@r50e:~$
What is the problem here? It seems that Python cannot import the external module when it is run via snmpd pass.
I'm thinking this is an access control issue: Debian-snmp doesn't have the rights to access the serial port.
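One way I could confirm that is to have the pass script run a throwaway Python debug script that prints which user it is running as and whether that user can access the serial port, something like:
#!/usr/bin/python
# throwaway debug script: print the effective user and whether it can
# read/write the Modbus serial port
import getpass
import os
print getpass.getuser()
print os.access('/dev/ttyUSB0', os.R_OK | os.W_OK)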
The problem is solved by finding out which user the snmpd daemon runs as. I put whoami in the script and it returned 'Debian-snmp'; from there it became straightforward. I checked its group membership by running:
pi@raspberrypi:~$ groups Debian-snmp
Debian-snmp : Debian-snmp
Add Debian-snmp to the dialout group to grant it access to the serial ports:
pi@raspberrypi:~ $ sudo usermod -a -G dialout Debian-snmp
Restart snmpd so it runs with the new group membership, and voilà: it can now read the slave Modbus device via snmpget.

How to connect to memcachedb and use its API

I have installed memcachedb according to Memcachedb: The Complete Guide, and I am able to set and get key/value pairs using telnet as explained in the guide.
What I really want to do is to set and get the key, value pairs from a python script.
I have memcachedb running on the Ubuntu machine with the following command:
sudo ./memcachedb -vv -u root -H ~/mcDB/ -N
I read that the libmemcached Python client can be used to communicate with memcachedb, so I am using the following test script:
import memcache
client=memcache.Client([('localhost',21201)]) # port for memcachedb
print "return value " + str(client.set("key","value"))
print "get result " + str(client.get("key"))
But it gives the following output:
return value 0
get result None
I have also tried replacing localhost with 127.0.0.1; that does not work either.
In fact, memcachedb (run with -vv) prints no output at all when I run the Python script, while it does when I set and get via telnet.
So how can I connect to memcachedb and execute commands through python (get and set)?
So instead of python-memcached, I tried pylibmc, and now the script is working.
There is probably some problem with python-memcached.
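For what it's worth, the original failure may simply have been the server address format: python-memcached expects each server as a "host:port" string (its tuple form is ("host:port", weight)), so ('localhost', 21201) would be read as a host plus weight rather than a host plus port. A sketch of the original client with the string form, in case anyone wants to retry it:
import memcache

# address as a single "host:port" string; 21201 is the memcachedb port
client = memcache.Client(["127.0.0.1:21201"])
print "return value " + str(client.set("key", "value"))
print "get result " + str(client.get("key"))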
The updated pylibmc script looks as follows:
import pylibmc
client=pylibmc.Client(["127.0.0.1:21201"]) # port for memcachedb
print "return value " + str(client.set("key","value"))
print "get result " + str(client.get("key"))