Scapy sniff in a non-blocking way - python-2.7

In the blocking way I can do this:
from scapy.all import *
sniff(filter"tcp and port 80", count=10, prn = labmda x:x.summary())
# Below code will be executed only after 10 packets have been received
do_stuff()
do_stuff2()
do_stuff3()
I want to be able to sniff packets with scapy in a non blocking way, something like this:
def packet_received_event(p):
    print "Packet received event!"
    print p.summary()

# The "event_handler" parameter is my wishful thinking
sniff(filter="tcp and port 80", count=10, prn=lambda x: x.summary(),
      event_handler=packet_received_event)
# I want this to be executed immediately
do_stuff()
do_stuff2()
do_stuff3()
To sum up: I want to be able to continue executing code without the sniff function blocking it.
One option is to open a separate thread for this, but I would like to avoid that and use Scapy's native tools if possible.
Environment details:
python: 2.7
scapy: 2.1.0
os: ubuntu 12.04 64bit

This functionality was added in https://github.com/secdev/scapy/pull/1999.
It will be available with Scapy 2.4.3+ (or the GitHub branch). Have a look at the docs over at: https://scapy.readthedocs.io/en/latest/usage.html#asynchronous-sniffing
>>> import time
>>> t = AsyncSniffer(prn=lambda x: x.summary(), store=False, filter="tcp")
>>> t.start()
>>> time.sleep(20)
>>> t.stop()
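If you also need the captured packets afterwards, leave store enabled and stop() hands them back; a minimal variation on the example above, with do_stuff as the question's placeholder:
>>> t = AsyncSniffer(filter="tcp")
>>> t.start()
>>> do_stuff()  # runs immediately; sniffing continues in the background
>>> packets = t.stop()  # with store=True (the default), stop() returns the capture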

Scapy doesn't have an async version of the sniff function. You're going to have to fire up threads.
There may be other issues with this, mostly having to do with resource locking.
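For reference, a minimal sketch of the thread-based workaround; the do_stuff functions come from the question, and stop_filter is a sniff parameter available in newer Scapy releases, so this assumes a version that has it:
import threading
from scapy.all import sniff

stop_event = threading.Event()

def packet_received_event(p):
    print p.summary()

def sniff_in_background():
    # stop_filter is evaluated per captured packet, so setting the event
    # makes sniff() return after the next packet arrives
    sniff(filter="tcp and port 80", prn=packet_received_event, store=0,
          stop_filter=lambda p: stop_event.is_set())

t = threading.Thread(target=sniff_in_background)
t.daemon = True  # don't let the sniffer keep the interpreter alive
t.start()

# these run immediately while sniffing continues in the background
do_stuff()
do_stuff2()
do_stuff3()

stop_event.set()  # ask the sniffer thread to exit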

Related

Process Several Pcap Files Simultaneously - Django

In essence, the following function, called by the user of the Django application that I am developing, uses the Scapy library to process 80-odd fairly large pcaps in order to initially parse their destination IP addresses.
I was wondering whether it would be possible to process several pcaps simultaneously, ideally using multi-threading, as the CPU is not being utilised to its full capacity.
def analyseall(request):
    allpcaps = Pcaps.objects.all()
    for individualpcap in allpcaps:
        strfilename = str(individualpcap.filename)
        print(strfilename)
        pcapuuid = individualpcap.uuid
        print(pcapuuid)
        packets = rdpcap(strfilename)
        print("hokay")
        for packet in packets:
            if packet.haslayer(IP):
                # print(packet[IP].src)
                # print(packet[IP].dst)
                dstofpacket = packet[IP].dst
                PcapsIps.objects.update_or_create(ip=dstofpacket, uuid=individualpcap)
    return render(request, 'about.html', {"list": list})
You can use the above answer (multiprocessing), and also improve Scapy's reading speed by using the PcapReader generator rather than rdpcap:
with PcapReader(filename) as fdesc:
    for pkt in fdesc:
        # actions on the pkt
        pass
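Combining both suggestions, a rough sketch of a worker-pool version; the worker function, pool size, and file names are illustrative, not from the original answer, and database writes are best kept in the parent process since the Django ORM and fork don't mix well:
from multiprocessing import Pool
from scapy.all import PcapReader, IP

def extract_destinations(filename):
    # stream packets with PcapReader instead of loading everything via rdpcap
    dsts = []
    with PcapReader(filename) as fdesc:
        for pkt in fdesc:
            if pkt.haslayer(IP):
                dsts.append(pkt[IP].dst)
    return filename, dsts

if __name__ == '__main__':
    filenames = ['capture1.pcap', 'capture2.pcap']  # hypothetical paths
    pool = Pool(processes=4)
    try:
        # each pcap is parsed in a separate process
        for fname, dsts in pool.map(extract_destinations, filenames):
            print(fname, len(dsts))  # do the update_or_create calls here
    finally:
        pool.close()
        pool.join()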
I consider mixing multiprocessing and Django tricky. I was working on such a solution once and finally decided to use Celery and RabbitMQ.
Using Celery you can easily define a task that processes a single pcap (see the sketch below). Then you can start a few independent workers to process files in the background. Such a solution results in a slightly more complicated architecture (you need to provide a message queue, e.g. RabbitMQ, and the Celery workers), but you gain much simpler code.
http://docs.celeryproject.org/en/latest/django/first-steps-with-django.html
In my case Celery saved a lot of time.
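As an illustration only, a minimal sketch of such a task, assuming a configured Celery app; the model names come from the question, while the app import path is hypothetical:
# tasks.py
from celery import shared_task
from scapy.all import PcapReader, IP
from pcapapp.models import Pcaps, PcapsIps  # hypothetical app path

@shared_task
def analyse_pcap(pcap_pk):
    # each worker processes one pcap independently in the background
    individualpcap = Pcaps.objects.get(pk=pcap_pk)
    with PcapReader(str(individualpcap.filename)) as fdesc:
        for packet in fdesc:
            if packet.haslayer(IP):
                PcapsIps.objects.update_or_create(ip=packet[IP].dst,
                                                  uuid=individualpcap)
The view then only enqueues work, e.g. analyse_pcap.delay(individualpcap.pk) for each pcap, and returns immediately.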
You can also check this question and answers:
How to use python multiprocessing module in django view

Poloniex & websockets

===SIMPLE & SHORT===
Does anybody have working application that talks with Poloniex through WAMP in these days (January, 2018)?
===MORE SPECIFIC===
I used several info sources to make it work with the combo autobahn-cpp & C++, on Windows 10.
I was able to connect to wss://api.poloniex.com, realm1. I was also able to subscribe and get a subscription ID. But I never got any events, even once everything was established.
===RESEARCH===
During research in the web I saw a lot of controversial information:
1. Claims that wss://api2.poloniex.com should be used, and that channel names are actually numbers - How to connect to poloniex.com websocket api using a python library
2. This answer gave me base code, but I am not getting anything more than just connections, even when following it; it states that wss://api.poloniex.com is the correct address - Connecting to Poloniex Push-API
3. I saw a post (sorry, lost the link) with comments saying that the websockets implementation is basically broken on Poloniex. They were posted 6 months ago.
===SPECS===
1. Windows 10
2. Autobahn-Cpp
3. wss://api.poloniex.com:443 ; realm1
4. Different subscriptions: ticker, BTC_ETH, 148, 1002, etc..
5. Source code I got from here
===WILL HELP AS WELL===
Is there any way to get all valid subscriptions or, perhaps, those that have more than 0 subscribers? I mean, does WAMP have a way to do that?
Are there any known issues with the Autobahn-Cpp and Poloniex combo?
Is there any simpler way to test WAMP elsewhere to make sure Autobahn isn't the problem? Like any other well documented & supported online project that accepts WAMP websocket communication?
I can receive the correct tick and order-book data from wss://api2.poloniex.com using python3,
but sometimes channel 1002 may stop sending new tick info.
wss://api.poloniex.com:443 ; realm1
This may be the issue, as I've been using api2. Here is the code that works, and has been working for the past 2 quarters non-stop. It's in Python, but should be easy enough to port to C++.
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import websocket
import json

def on_error(ws, error):
    print(error)

def on_close(ws):
    print("### closed ###")

def on_open(ws):
    print("ONOPEN")
    ws.send(json.dumps({'command': 'subscribe', 'channel': 'BTC_ETH'}))

def on_message(ws, message):
    message = json.loads(message)
    print(message)

websocket.enableTrace(True)
ws = websocket.WebSocketApp("wss://api2.poloniex.com/",
                            on_message=on_message,
                            on_error=on_error,
                            on_close=on_close)
ws.on_open = on_open
ws.run_forever()
The code is pretty much self-explanatory (you can check all channels/pairs on the Poloniex API website); just save it and run in a terminal:
python3 fileName.py
It should provide you with the raw BTC_ETH stream of orders and trades on console output.
Playing with the message/subscriptions, you can then do as you please with it.
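For instance, to try the ticker the question mentions, only the subscription in on_open changes; 1002 is the ticker channel number referenced in the question:
def on_open(ws):
    # subscribe to the ticker channel instead of the BTC_ETH order book
    ws.send(json.dumps({'command': 'subscribe', 'channel': 1002}))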
It seems that websockets on Poloniex are unstable. Therefore I will stop my attempts to make Autobahn-Cpp work with it, at least for now, and move on.

Killing a blocking thread

I'm having a tough time trying to develop a threaded app wherein the threads are each doing REST calls over the network.
I need to kill active threads after a certain timeout. I've tried every python 2.7.x approach I've seen on here and can't get this working.
I'm using python 2.7.6 on OEL linux (3.8.13-44.1.1.el6uek.x86_64).
Here is a simplified snippet of code:
class cthread(threading.Thread):
    def __init__(self, cfg, host):
        self.cfg = cfg
        self.host = host
        self.runf = False
        self.stop = threading.Event()
        threading.Thread.__init__(self, target=self.collect)

    def terminate(self):
        self.stop.set()

    def collect(self):
        try:
            self.runf = True
            while not self.stop.wait(1):
                # Here I do a urllib2 GET request to a REST service which could hang
                # <rest call>
                pass
        finally:
            self.runf = False

timer_t1 = 0
newthr = cthread(cfg, host)
newthr.start()
while True:
    if timer_t1 > 600:
        newthr.terminate()
        break
    time.sleep(30)
    timer_t1 += 30
Basically after my timeout period I need to kill all remaining threads, either gracefully or not.
I'm having a heck of a time getting this to work.
Am I going about this the correct way?
There's no official API to kill a thread in python.
Since your code relies on urllib2, you could periodically compute, in the main loop, how much time the threads have left to run and pass that to urllib2's timeout option. Or you could track the timers inside the threads themselves, using the same urllib2 timeout approach, as in the sketch below.
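A minimal sketch of that idea, written as a drop-in variant of the collect method above; the URL is a placeholder, and urllib2.urlopen's timeout parameter is standard since Python 2.6:
import socket
import time
import urllib2

REST_URL = 'http://example.com/rest'  # placeholder endpoint

def collect(self):
    # drop-in variant of cthread.collect: give the whole thread a deadline
    deadline = time.time() + 600
    try:
        self.runf = True
        while not self.stop.wait(1):
            remaining = deadline - time.time()
            if remaining <= 0:
                break  # overall budget exhausted; exit instead of hanging
            try:
                # timeout bounds how long a single GET may block
                resp = urllib2.urlopen(REST_URL, timeout=min(remaining, 30))
                data = resp.read()
            except (urllib2.URLError, socket.timeout):
                continue  # timed out or failed; loop and re-check the budget
    finally:
        self.runf = False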

Python - Passing a TCP socket object to a multiprocessing Queue

I have a TCP server and client. At some point in the server script I start a process which needs to be able to get every new connection and send data to it. In order to do so, I have a multiprocessing.Queue() to which I want to put every new connection from the main process, so that the process I opened can get the connections from it and send data to them. However, it seems that you cannot pass just anything you want through a Queue. When I try to pass the connection (a socket object), I get:
Traceback (most recent call last):
File "/usr/lib/python2.7/multiprocessing/queues.py", line 266, in _feed
send(obj)
TypeError: expected string or Unicode object, NoneType found
Are there any alternatives that I could use?
Sending a socket through a multiprocessing.Queue works fine starting with python3.4, because from that version on a ForkingPickler is used to serialize the objects put in the queue, and that pickler knows how to serialize sockets and other objects containing a file handle.
The multiprocessing.reduction.ForkingPickler class already exists in python2.7 and can pickle sockets; it's just not used by multiprocessing.Queue.
If you can't switch to python3.4+ and really need similar functionality in python2.7, a workaround is to create a function that uses the ForkingPickler to serialize objects, e.g.:
from multiprocessing.reduction import ForkingPickler
import StringIO

def forking_dumps(obj):
    buf = StringIO.StringIO()
    ForkingPickler(buf).dump(obj)
    return buf.getvalue()
Instead of sending the socket directly you then need to send its pickled version and unpickle it in the consumer. Simple example:
from multiprocessing import Queue, Process
from socket import socket
import pickle

def handle(q):
    sock = pickle.loads(q.get())
    print 'rest:', sock.recv(2048)

if __name__ == '__main__':
    sock = socket()
    sock.connect(('httpbin.org', 80))
    sock.send(b'GET /get\r\n')
    # first bytes read in parent
    print 'first part:', sock.recv(50)
    q = Queue()
    proc = Process(target=handle, args=(q,))
    proc.start()
    # use the function from above to serialize the socket
    q.put(forking_dumps(sock))
    proc.join()
Making sockets pickleable only makes sense here in the context of multiprocessing. It would not make sense to write one to a file and use it later, or to try to use it on a different PC or after the original process has ended. Therefore it wouldn't be a good idea to make sockets pickleable globally (e.g. via the copyreg mechanism).

pySerial could not open port COM6: Element not found

I tried to use the following code:
import serial
ser = serial.Serial()
ser.baudrate = 56700
ser.port = 'COM6'
ser.open() # HERE IS THE ERROR
When I do it from the Python shell, typing line by line, it works and I can receive data from a bluetooth device. But when I run it from cmd (C:\>python serial.py) it raises the error "could not open port COM6: Element not found". I can see on the bluetooth device that it connects for a second and then disconnects.
Anyone know what this is?
BTW, I'm using Windows 7 64-bit and Python 2.7. Thanks!
Perhaps you forgot to run the Command prompt as Administrator?
I had the same problem. I fixed it by adding time.sleep(5) around ser.open(), so it looks like this:
import serial
import time
ser = serial.Serial()
ser.baudrate = 56700
ser.port = 'COM6'
time.sleep(5)
ser.open()
time.sleep(5)
I didn't experiment much with the code, but you probably don't have to sleep for 5 seconds; sleeping for just 0.1 seconds would probably still work. This is probably not the best way to fix it, but it works.
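If the fixed delay feels fragile, one alternative (a sketch, not from the original answer) is to retry open() with short sleeps until the port becomes available:
import serial
import time

ser = serial.Serial()
ser.baudrate = 56700
ser.port = 'COM6'

# retry instead of guessing one fixed delay
for attempt in range(50):
    try:
        ser.open()
        break
    except serial.SerialException:
        time.sleep(0.1)  # give the Bluetooth stack time to expose COM6
else:
    raise RuntimeError('COM6 never became available')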