I need to connect to a Hadoop cluster that uses Kerberos authentication, from Python 2.7.15. First I connect to the Hadoop edge node via SSH:
import paramiko
client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.WarningPolicy())
client.connect("myhost", port=22, username="myusername_1", password="mypwd_1")
Second, I request a Kerberos ticket. Because of company security policy I use a different set of credentials (here "myusername_2" and "mypwd_2").
kinitcommand = "echo mypwd_2 | kinit myusername_2@MYDOMAIN"
stdin, stdout, stderr = client.exec_command(kinitcommand)
Third, I can run a command such as ls:
lscommand = "hadoop fs -ls"
stdin, stdout, stderr = client.exec_command(lscommand)
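A small helper makes it easier to spot when one of these remote commands fails, since exec_command does not raise on a non-zero exit. This is only a sketch (the helper name is mine; client is the connected SSHClient from above):

```python
def run_remote(client, command):
    """Run a command over an existing paramiko SSHClient and fail loudly.

    exec_command returns immediately; recv_exit_status() blocks until the
    remote command finishes, so we can inspect its exit code.
    """
    stdin, stdout, stderr = client.exec_command(command)
    exit_code = stdout.channel.recv_exit_status()  # wait for completion
    out, err = stdout.read(), stderr.read()
    if exit_code != 0:
        raise RuntimeError("%r failed (%d): %s" % (command, exit_code, err))
    return out

# Usage with the client above, e.g.:
# listing = run_remote(client, "hadoop fs -ls")
```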
What I am looking for is a Python library that allows the connection described above without using SSH. I tried hdfs, but the class hdfs.ext.kerberos.KerberosClient does not accept user credentials as input, as the hdfs documentation shows. Thank you in advance
What I am trying to achieve: I have a Mosquitto MQTT Broker running on a Google Cloud virtual machine (Ubuntu), and I want to be able to connect to it from my local PC using mqtt.js
My setup
I have created a VM instance in Google Cloud, running Ubuntu 20.04 LTS.
Some of the settings:
Firewall – allow HTTPS and allow HTTP
Firewall rule – opens port 1883
I installed Mosquitto MQTT Broker (version 1.6.9) on this VM.
I was able to verify the installation and that it was running by opening two SSH terminals, one to publish and one to subscribe:
mosquitto_sub -t test
mosquitto_pub -t test -m "hello"
Then I read that when I want to connect to VMs using third-party tools, I must create and upload my own SSH keys to VMs:
ssh-keygen -t rsa -f C:\keys\VM_KEYFILE -b 2048
passphrase: ****
I got two files now, the private and public keys:
VM_KEYFILE
VM_KEYFILE.pub
I then used icacls to modify the private key’s permissions:
icacls.exe VM_KEYFILE /reset
icacls.exe VM_KEYFILE /grant:r "$($env:username):(r)"
icacls.exe VM_KEYFILE /inheritance:r
I then successfully connected to the VM from a Windows console:
ssh -i "VM_KEYFILE" username@vm_public_ip_address
So now I want to try and connect using node.js
I already have a javascript file that uses mqtt.js to connect to some of the public MQTT brokers, e.g. HiveMQ
Some of its settings are:
let broker_host = 'broker.hivemq.com';
let broker_port = 1883;
let client_id = 'my_client_1';
const connection_options = {
port: broker_port,
host: broker_host,
clientId: client_id,
clean: true,
keepalive: false
};
My question: How would I modify this JavaScript file to connect to the MQTT broker that is running in the Google Cloud VM?
There is no username/password/authentication set up for the broker itself, just the VM.
I tried something like this, but I have no idea how to use the SSH key
let broker_host_gcm_vm = 'https://<vm_public_ip_address>';
UPDATE
I can connect to the broker from both (a) MQTT Explorer, and (b) the MQTTX desktop app.
All I have to enter for the connection details is:
Host: mqtt://<ip address>
Port: 1883
Then I can publish / subscribe successfully.
I tried changing my JavaScript connection to the following, but I still can't connect from here:
let broker_host_gcm_vm1 = 'mqtt://<ip address>';
I found the problem.
Let's say the host IP address is 11.22.33.44
The host was none of these:
let broker_host = 'http://11.22.33.44';
let broker_host = 'https://11.22.33.44';
let broker_host = 'mqtt://11.22.33.44';
let broker_host = 'mqtts://11.22.33.44';
But was simply this:
let broker_host = '11.22.33.44';
Simple when you know how :)
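The fix can be expressed as a tiny guard (sketched in plain Python for illustration; the helper name is mine, and the same logic applies to the mqtt.js options object): the host option expects a bare hostname or IP address, so strip any URL scheme before using it:

```python
def bare_host(host):
    """Strip a URL scheme, since MQTT client option objects expect a bare
    hostname or IP in the 'host' field (schemes belong in a full URL,
    not in the host option)."""
    for scheme in ("mqtt://", "mqtts://", "ws://", "wss://",
                   "http://", "https://"):
        if host.startswith(scheme):
            return host[len(scheme):]
    return host

# bare_host('mqtt://11.22.33.44') gives '11.22.33.44'
```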
I'm facing an "Authentication unsuccessful" error when I try to send mail from Apache Superset.
I checked the documentation's SMTP integration chapter: https://apache-superset.readthedocs.io/en/latest/installation.html
I changed the SMTP_PASSWORD parameter and entered the password manually in my config.py file as SMTP_PASSWORD = "'xxx!!'" (I added the extra quotes because my password includes a special character, '!'). I also tried SMTP_PASSWORD = "xxx!!", but I'm still getting the error:
smtplib.SMTPAuthenticationError: (535, b'5.7.3 Authentication unsuccessful')
I can log in with my SMTP user via the browser (Exchange). Could you please help me solve this issue?
Apache Superset version: 1.3.2
Python version:3.8
OS version: RHEL 8
Thanks
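One thing to rule out first: in config.py the password is an ordinary Python string, so a '!' needs no extra quoting, and wrapping the value in a second pair of quotes makes those quotes part of the password that is sent to the server. A minimal illustration (the values are placeholders):

```python
# The value sent to the SMTP server is exactly the string's contents.
SMTP_PASSWORD = "xxx!!"      # sends xxx!!  -- '!' needs no escaping
WRONG_PASSWORD = "'xxx!!'"   # sends 'xxx!!' -- literal quotes included
```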
You might be able to poke at this by checking your SMTP connection from inside your Superset instance.
From where your Superset instance should be sending the emails, start a Python script and try:
import smtplib
server = smtplib.SMTP('smtpservername.yourdomain', yourportnumber)
If that command fails or hangs indefinitely, you know there's a connection problem. You might try connecting to the GMail SMTP server or another known-to-work set of credentials and seeing if that works to narrow it down.
For instance, if running Superset with docker-compose you would enter the worker container with docker exec -it superset_worker /bin/bash, run python, and try this there.
My scheduled reports were failing and doing this helped me isolate a cause: my Superset instance could not connect to the target unauthenticated SMTP server.
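Going one step further than the bare connection test, a sketch that also attempts STARTTLS and the login itself. The function name and the smtp_factory hook are my own additions (the hook only exists so the function can be exercised without a live server), and you may need to drop the starttls() call for servers that do not offer TLS:

```python
import smtplib

def smtp_auth_ok(host, port, user, password, smtp_factory=smtplib.SMTP):
    """Return True if SMTP login succeeds, False on an authentication
    error such as 535 5.7.3; any other failure propagates."""
    server = smtp_factory(host, port, timeout=10)
    try:
        server.ehlo()
        server.starttls()   # may need removing for plain-text servers
        server.ehlo()
        server.login(user, password)
        return True
    except smtplib.SMTPAuthenticationError:
        return False
    finally:
        server.quit()
```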
I wrote a script that uses SSH to connect to a single connect (single-sign-on) server, and from there I would like to connect to a node. My problem is that when I send a command through the invoke_shell session, I can see it in the output but the command is not executed. Any help related to single connect and Paramiko would be appreciated.
#!/usr/bin/env python
import paramiko
import time
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("host",port=2222,username="username",password="pass")
remote_con = ssh.invoke_shell()
print "Interactive SSH session established"
output = remote_con.recv(5000)
remote_con.send("device_ip_here_to_connect_from_SSO \n")
print output
time.sleep(3)
output = remote_con.recv(5000)
print output
Make sure you use send('command\n') instead of send('command') when sending the command.
I found the solution. The problem was sending the command like this:
remote_con.send("device_ip_here_to_connect_from_SSO \n")
It should be like this:
remote_con.send("device_ip_here_to_connect_from_SSO \r")
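The two answers above can be combined into a small helper (a sketch; the function name is mine, and chan is the object returned by invoke_shell()). An interactive shell emulates a terminal, so the command must end with a carriage return, i.e. the Enter key:

```python
import time

def send_and_read(chan, command, wait=3.0, bufsize=5000):
    """Send one command to a paramiko interactive shell and collect output.

    invoke_shell() gives a terminal session, so the command must end with
    a carriage return (Enter), not just a newline.
    """
    chan.send(command + "\r")
    time.sleep(wait)          # give the remote side time to respond
    output = b""
    while chan.recv_ready():  # drain everything currently buffered
        output += chan.recv(bufsize)
    return output
```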
I have to connect to a remote SFTP server to upload some files using Python 2.7. I have been given a username, a password and an authentication key file. When I execute this code:
srv= pysftp.Connection(host='54.172.33.121',username='xxx',password='xxx',private_key='c:\SFTPKey\hajjcoreit.ppk')
i get this error:
BadAuthenticationType: ('Bad authentication type', [u'publickey']) (allowed_types=[u'publickey'])
I can access the server through WinSCP, though.
What the error means
The error is telling you what is wrong: allowed_types=[u'publickey'] means the server accepts only public-key authentication, so your password is never even considered. The hostname is fine; the key is the problem.
Two things to check:
paramiko (which pysftp uses under the hood) cannot read PuTTY .ppk files. Export the key to OpenSSH format first (in PuTTYgen: Conversions -> Export OpenSSH key) and point private_key at the exported file.
Drop the password argument so the key is actually used; if the key itself has a passphrase, pass it as private_key_pass instead.
import pysftp
srv = pysftp.Connection('54.172.33.121',
                        username='xxx',
                        private_key=r'c:\SFTPKey\hajjcoreit_openssh')
#
# ... do sftp operations
#
srv.close()  # close your connection
WinSCP works because it reads .ppk files natively. You can also cross-check the host, user and key with FileZilla.
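Since the server only advertises publickey authentication, it can help to check which format the key file is actually in before connecting; paramiko can only load OpenSSH/PEM keys, not PuTTY .ppk files. A crude sketch (the helper name is mine) that looks only at the file's first line:

```python
def looks_like_openssh_key(first_line):
    """Crude format check: OpenSSH/PEM private keys start with a
    '-----BEGIN' header, while PuTTY .ppk files start with
    'PuTTY-User-Key-File-<n>:'. paramiko can only load the former."""
    return first_line.lstrip().startswith("-----BEGIN")

# Example of checking an actual file (path is a placeholder):
# with open(r'c:\SFTPKey\hajjcoreit.ppk') as f:
#     print(looks_like_openssh_key(f.readline()))
```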
I want to use a batch file to connect to an Amazon server by using the SSH private or public key.
I have tried
open sftp://user:password@example.com/ -hostkey="ssh-rsa 2048 xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx"
psftp fred#server.example.com
in the script, but I am not able to connect to the server. I am getting this error:
disconnected no supported authentication methods available (server sent publickey)
I am able to connect from PuTTY. I want a batch file to connect to the server and restart the Django project on it.
If I am able to connect, then I just need to go to the path of my Django project and run the command
manage.py runserver
ssh -i PrivateKey.pem username@server.example.com