Tweepy location on Twitter API filter always throws 406 error - python-2.7

I'm using the following code (from Django management commands) to listen to the Twitter stream. I've used the same code in a separate command to track keywords successfully, and I've branched it out to use location - and (apparently rightly) wanted to test this without disrupting my existing analysis that's running.
I've followed the docs and made sure the box is in long/lat format (in fact, I'm now using the example long/lat from the Twitter docs). It looks broadly the same as the question here, and I tried their version of the code from the answer - same error. If I switch back to using 'track=...', the same code works, so the problem is with the location filter.
I added a print debug inside streaming.py in tweepy so I can see what's happening. Printing out self.parameters, self.url and self.headers from _run, I get:
{'track': 't,w,i,t,t,e,r', 'delimited': 'length', 'locations': '-121.7500,36.8000,-122.7500,37.8000'}
/1.1/statuses/filter.json?delimited=length and
{'Content-type': 'application/x-www-form-urlencoded'}
respectively - the request seems to be missing the location filter in some way, shape or form. I'm obviously not the only one using tweepy's location search, so I think it's more likely a problem in my use of it than a bug in tweepy (I'm on 2.3.0), but my implementation looks right as far as I can tell.
My stream handling code is here:
consumer_key = 'stuff'
consumer_secret = 'stuff'
access_token = 'stuff'
access_token_secret_var = 'stuff'

import tweepy
import json

# This is the listener, responsible for receiving data
class StdOutListener(tweepy.StreamListener):
    def on_data(self, data):
        # Twitter returns data in JSON format - we need to decode it first
        decoded = json.loads(data)
        #print type(decoded), decoded
        # Also, we convert UTF-8 to ASCII ignoring all bad characters sent by users
        try:
            user, created = read_user(decoded)
            print "DEBUG USER", user, created
            if decoded['lang'] == 'en':
                tweet, created = read_tweet(decoded, user)
                print "DEBUG TWEET", tweet, created
            else:
                pass
        except KeyError, e:
            print "Error on Key", e
        except DataError, e:
            print "DataError", e
        #print user, created
        print ''
        return True

    def on_error(self, status):
        print status

l = StdOutListener()
auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
auth.set_access_token(access_token, access_token_secret_var)
stream = tweepy.Stream(auth, l)
# locations must be long, lat
stream.filter(locations=[-121.75, 36.8, -122.75, 37.8], track='twitter')

The issue here was the order of the coordinates.
The correct format is:
SouthWest corner (long, lat), then NorthEast corner (long, lat). I had them transposed. :(
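As a sanity check, a small helper (hypothetical, not part of tweepy) can reorder an arbitrary pair of corners into the SouthWest-then-NorthEast order the streaming API expects:

```python
def normalize_bbox(lon1, lat1, lon2, lat2):
    """Return [sw_lon, sw_lat, ne_lon, ne_lat] for Twitter's locations filter."""
    return [min(lon1, lon2), min(lat1, lat2), max(lon1, lon2), max(lat1, lat2)]

# The transposed box from the question becomes a valid SW/NE pair:
bbox = normalize_bbox(-121.75, 36.8, -122.75, 37.8)
# bbox == [-122.75, 36.8, -121.75, 37.8]
```

Passing the result to stream.filter(locations=bbox) avoids the transposition mistake entirely.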

Note also that the streaming API doesn't allow filtering by location AND keyword simultaneously.
You can refer to this answer - I had the same problem earlier:
https://stackoverflow.com/a/22889470/4432830

Related

Typeform Security API and Django: Not Verifiying Hash Correctly

I am trying to use Typeform's security for their webhooks. This involves
1) Receiving the signed packets and extracting the signature
2) Getting the body of the request
3) Creating a hash with a secret key on the payload
4) Matching the hash with the received signature
My web framework is Django (Python based). I am following the example at the TypeForm link here: https://developer.typeform.com/webhooks/secure-your-webhooks/.
For the life of me, I can't figure out what's going on. I've tried it in both Python and Ruby, and I can't get the hash right. I call a Ruby script from Python to match the output, but they are different and neither works. I'm starting to think that it might have something to do with the way that Django sends request bodies. Does anyone have any insight?
Python implementation:
import os
import hashlib
import hmac
import base64
import json
import os
import hashlib
import hmac
import base64
import json

class Typeform_Verify:
    # take the request body in and encrypt it with the secret string
    def create_hash(payload):
        # write the payload to a file for the ruby script to read later
        file = open("/payload.txt", "w")
        file.write(str(payload))
        file.close()
        # access the secret string
        secret = bytearray(os.environ['DT_TYPEFORM_STRING'], encoding="utf-8")
        # create a hash with the payload as the message
        # and the secret as the key
        pre_encode = hmac.new(secret,
            msg=payload, digestmod=hashlib.sha256).digest()
        post_encode = base64.b64encode(pre_encode)
        return post_encode

    # another approach is to make a ruby script
    # that returns a value and call it from here
    def verify(request):
        file = open("/output.txt", "w")
        # check the incoming hash values
        received_hash = request.META["HTTP_TYPEFORM_SIGNATURE"]
        # create the hash of the payload
        hash = Typeform_Verify.create_hash(request.body)
        # call the ruby script on it
        os.system(f"ruby manager/ruby_version.rb {received_hash} &> /oops.txt")
        # concatenate the strings together to make the full signature
        encoded_hash = "sha256=" + hash.decode("utf-8")
        file.write(f"Secret string: {os.environ['DT_TYPEFORM_STRING']}\n")
        file.write(f"My hash : {encoded_hash}\n")
        file.write(f"Their hash : {received_hash}\n")
        file.close()
        return received_hash == encoded_hash
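One possible source of mismatched hashes here (not necessarily the only one): in Python 3, str() on a bytes request body produces the "b'...'" repr, not the raw body text, so the file the Ruby script reads is not what Python signed. A minimal illustration with made-up values:

```python
import hashlib
import hmac

body = b'{"event": "form_response"}'

# str() on bytes in Python 3 yields "b'...'" quoting, not the body text
assert str(body) != body.decode("utf-8")

secret = b"secret"
sig_raw = hmac.new(secret, msg=body, digestmod=hashlib.sha256).digest()
sig_str = hmac.new(secret, msg=str(body).encode("utf-8"), digestmod=hashlib.sha256).digest()
# The two signatures differ, so a verifier reading the stringified file
# can never agree with one that signed the raw body.
```

Writing the raw bytes with open("/payload.txt", "wb") would keep the two sides in sync.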
Ruby script (called from Python)
require 'openssl'
require 'base64'
require 'rack'

def verify_signature(received_signature, payload_body, secret)
  hash = OpenSSL::HMAC.digest(OpenSSL::Digest.new('sha256'), secret, payload_body)
  # the created signature
  actual_signature = 'sha256=' + Base64.strict_encode64(hash)
  # write the created signature to the file
  out_file = File.new("/output.txt", "a")
  out_file.write("Ruby output: ")
  out_file.write(actual_signature)
  out_file.close()
  return 500, "Signatures don't match!" unless Rack::Utils.secure_compare(actual_signature, received_signature)
end

# MAIN EXECUTION
# get the hash from the python script
received_hash = ARGV[0]
# read the content of the file into the f array
# note that this is the json payload from the python script
f = IO.readlines("/payload.txt")
# declare the secret string
secret = "SECRET"
# call the function with the received hash, file data, and key
result = verify_signature(received_hash, f[0], secret)
Code output:
Typeform hash: sha256=u/A/F6u3jnG9mr8KZH6j8/gO+Uny6YbSYFz7+oGmOik=
Python hash: sha256=sq7Kl2qBwRrwgGJeND6my4UPli8rseuwaK+f/sl8dko=
Ruby output: sha256=BzMxPZGmxgOMeJ236eAxSOXj85rEWI84t+6CtQBYliA=
UPDATED: First, see this GitHub article, as the example you referred to may be based on it.
The idea is that your requests should be signed. Here is a more basic, pure-Ruby example which should illustrate how this works.
# test.rb
ENV['SECRET_TOKEN'] = 'foobar'

require 'openssl'
require 'base64'
require 'rack'

def stub_request(body)
  key = ENV['SECRET_TOKEN']
  digest = OpenSSL::Digest.new('sha256')
  hmac_signature = OpenSSL::HMAC.hexdigest(digest, key, body)
  { body: body, hmac_signature: hmac_signature }
end

def verify_signature(payload_body, request_signature)
  hmac = OpenSSL::HMAC.hexdigest(OpenSSL::Digest.new('sha256'), ENV['SECRET_TOKEN'], payload_body)
  if Rack::Utils.secure_compare(request_signature, hmac)
    puts "They match"
  else
    puts "They don't match"
  end
  puts "request_signature: #{request_signature}"
  puts "             hmac: #{hmac}"
  puts "             body: #{payload_body}"
end

request = stub_request(ARGV[0])
verify_signature(request[:body], request[:hmac_signature])
Now to test this, just run:
ruby test.rb 'this is some random body string'
Here is a Python version of the same code. But note this version is vulnerable to a timing attack, because it compares the signatures with ==. There is probably a Python equivalent of Rack's secure_compare somewhere, but I didn't do the research to find it. It shouldn't be hard to write something like the Ruby Rack version in Python if your server doesn't already provide one.
# test.py
import sys
import hashlib
import binascii
import hmac

KEY = 'foobar'

def stub_request(body):
    key = bytes(KEY, 'utf-8')
    body_bytes = bytes(body, 'utf-8')
    hmac_signature = hmac.new(key,
        msg=body_bytes, digestmod=hashlib.sha256).digest()
    return {'body': body, 'hmac_signature': hmac_signature}

def verify_signature(payload_body, request_signature):
    key = bytes(KEY, 'utf-8')
    hmac_sig = hmac.new(key, msg=bytes(payload_body, 'utf-8'), digestmod=hashlib.sha256).digest()
    if hmac_sig == request_signature:
        print("They match")
    else:
        print("They don't match")
    print(f"request_signature: {binascii.hexlify(request_signature)}")
    print(f"             hmac: {binascii.hexlify(hmac_sig)}")
    print(f"             body: {payload_body}")
    return request_signature

body = sys.argv[-1]
request = stub_request(body)
verify_signature(request['body'], request['hmac_signature'])
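The Python equivalent the answer alludes to does exist in the standard library: hmac.compare_digest performs a constant-time comparison. A short sketch with the same key and body as the Ruby test:

```python
import hashlib
import hmac

key = b"foobar"
body = b"this is some random body string"
expected = hmac.new(key, msg=body, digestmod=hashlib.sha256).hexdigest()

# Constant-time comparison: the runtime does not depend on how many
# leading characters of the two digests match.
ok = hmac.compare_digest(expected, hmac.new(key, msg=body, digestmod=hashlib.sha256).hexdigest())
bad = hmac.compare_digest(expected, hashlib.sha256(body).hexdigest())
```

Swapping `hmac_sig == request_signature` for `hmac.compare_digest(hmac_sig, request_signature)` removes the timing-attack concern noted above.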
I ended up figuring it out. The Python implementation I had worked fine. The problem was in how I was saving the secret string: the way I was setting the environment variable mangled characters like $ or *. My Ruby implementation started working when I hardcoded the secret into the code, which led me to the environment variable as the culprit. I recommend the Python implementation to anyone trying to do this kind of authentication. Cheers!

Pulling data from datastore and converting it in Json in python(Google Appengine)

I am creating an application using Google App Engine, in which I fetch data from a website and store it in my database (the datastore). Whenever a user hits my application URL as "application_url?name=xyz&city=abc", I fetch the matching data from the DB and want to return it as JSON. Right now I am using a filter to fetch data based on the name and city, but I'm getting the output [] and I don't know how to get the data out of it. My code looks like this:
class MainHandler(webapp2.RequestHandler):
    def get(self):
        commodityname = self.request.get('veg', "Not supplied")
        market = self.request.get('market', "No market found with this name")
        self.response.write(commodityname)
        self.response.write(market)
        query = commoditydata.all()
        logging.info(commodityname)
        query.filter('commodity = ', commodityname)
        result = query.fetch(limit = 1)
        logging.info(result)
and the db structure for "commoditydata" table is
class commoditydata(db.Model):
    commodity = db.StringProperty()
    market = db.StringProperty()
    arrival = db.StringProperty()
    variety = db.StringProperty()
    minprice = db.StringProperty()
    maxprice = db.StringProperty()
    modalprice = db.StringProperty()
    reporteddate = db.DateTimeProperty(auto_now_add = True)
Can anyone tell me how to get data from the DB using name and market and convert it to JSON? Getting data from the DB is the higher priority. Any suggestions would be of great use.
If you are starting with a new app, I would suggest to use the NDB API rather than the old DB API. Your code would look almost the same though.
As far as I can tell from your code sample, the query should give you results as far as the HTTP query parameters from the request would match entity objects in the datastore.
I can think of some possible reasons for the empty result:
1) you only think the output is empty, because you call write() too early; App Engine doesn't stream the response - you must write everything in one go, and you should do this after you have queried the datastore
2) the properties you are filtering on are not indexed (yet) in the datastore, at least not for the entities you were looking for
3) the filters are just not matching anything (check the log for the values you got from the request)
4) your query uses a namespace different from the one the data was stored in (but this is unlikely if you haven't explicitly set namespaces anywhere)
In the Cloud Developer Console you can query your datastore and even apply filters, so you can see the results with-out writing actual code.
Go to https://console.developers.google.com
On the left side, select Storage > Cloud Datastore > Query
Select the namespace (default should be fine)
Select the kind "commoditydata"
Add filters with example values you expect from the request and see how many results you get
Also look into Monitoring > Log which together with your logging.info() calls is really helpful to better understand what is going on during a request.
The conversion to JSON is rather easy, once you got your data. In your request handler, create an empty list of dictionaries. For each object you get from the query result: set the properties you want to send, define a key in the dict and set the value to the value you got from the datastore. At the end dump the dictionary as JSON string.
import json

class MainHandler(webapp2.RequestHandler):
    def get(self):
        commodityname = self.request.get('veg')
        market = self.request.get('market')
        if not commodityname and not market:
            # the request is complete after this:
            self.response.out.write("Please supply filters!")
            return
        # everything ok, try the query:
        query = commoditydata.all()
        logging.info(commodityname)
        query.filter('commodity = ', commodityname)
        result = query.fetch(limit = 1)
        logging.info(result)
        # now build the JSON payload for the response
        dicts = []
        for match in result:
            # datetime objects are not JSON-serializable, so convert explicitly
            dicts.append({'market': match.market,
                          'reporteddate': match.reporteddate.isoformat()})
        # set the appropriate header of the response:
        self.response.headers['Content-Type'] = 'application/json; charset=utf-8'
        # convert everything into a JSON string
        jsonString = json.dumps(dicts)
        self.response.out.write(jsonString)
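The datetime pitfall is worth calling out on its own: json.dumps raises a TypeError on datetime objects, so either convert such fields explicitly or pass a default callable. A stdlib-only sketch with made-up values:

```python
import json
from datetime import datetime

record = {'market': 'abc', 'reporteddate': datetime(2015, 4, 1, 12, 30)}

# Option 1: convert the datetime explicitly before dumping
explicit = json.dumps({'market': record['market'],
                       'reporteddate': record['reporteddate'].isoformat()})

# Option 2: let json.dumps stringify anything it can't serialize natively
fallback = json.dumps(record, default=str)
```

Option 1 gives you control over the exact format; option 2 is a one-line safety net for any non-serializable property.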

Create/Update Tag for Intercom.io User in Python

Unfortunately there's no way to create a user in Intercom.io with a tag, so I'm trying to write some code that will look for an existing tag in Intercom, and if it's there, add a user to that tag, and if it's not, create the tag and add the user to it. I've tried several different variations by looking at the docs for the python-intercom library, but there are conflicting methods (Intercom.update_tag vs. Tag.update), and nothing has worked yet.
Here's how users are created in Intercom (this works):
import time
from members.models import Member
from intercom import Intercom, Tag

Intercom.app_id = settings.INTERCOM_TEST_APP_ID
Intercom.api_key = settings.INTERCOM_TEST_API_KEY

member = Member.objects.get(email="exampleemail@example.com")
Intercom.create_user(
    email=member.email,
    user_id=member.email,
    name="%s %s" % (member.first_name, member.last_name),
    created_at=int(time.time()),
    city_name=member.city,
    last_seen_ip=member.last_ip,
)
Here's what I currently have to look for and create or update tags, which triggers no errors, but doesn't successfully tag the user:
tag = Intercom.get_tag(name=member.referral_code)
if tag['id'] != None:
    Intercom.update_tag(member.referral_code, "tag", user_ids=[member.pk])
else:
    Intercom.create_tag(tag, "tag", user_ids=[member.pk])
I've also tried variations of the following, but it gets the error "descriptor 'update' requires a 'dict' object but received a 'unicode'":
if Tag.find_by_name(member.referral_code) != 0:
    Tag.update(member.referral_code, "tag", user_ids=[member.pk])
else:
    Tag.create(member.referral_code, "tag", user_ids=[member.pk])
What do I need to change to get tagging to work?
My name's Jeff, I'm one of the customer success engineers at Intercom. Unfortunately the intercom-python library is still using our deprecated V1 API which is likely causing some of the confusion here. Until that library updates to use our newer REST API I would suggest that you use the python requests library and call our API directly. I've got minimal python experience but something like this should get you started on the right track.
import requests
from requests.auth import HTTPBasicAuth
import json

tags_url = 'https://api.intercom.io/tags'
app_id = 'YOUR_APP_ID'
api_key = 'YOUR_API_KEY'
headers = {'content-type': 'application/json', 'Accept': 'application/json'}
tag_name = 'My sweet tag'

# Get the existing tags, then look for ours
list_tag_response_as_json = requests.get(tags_url, auth=(app_id, api_key), headers=headers).json()
tag_names = [tag['name'] for tag in list_tag_response_as_json['tags']]
if tag_name not in tag_names:
    # Create the tag (the body must be JSON-encoded to match the content-type)
    tag_response = requests.post(tags_url, auth=(app_id, api_key), headers=headers,
                                 data=json.dumps({'name': tag_name}))

# Tag users
tag_and_users = {'name': tag_name, 'users': [{'email': 'abc@example.com'}, {'email': 'def@example.com'}]}
tagged_user_response = requests.post(tags_url, auth=(app_id, api_key), headers=headers,
                                     data=json.dumps(tag_and_users))
Also, feel free to give us a shout in Intercom if you're still having trouble and we can help you there.

Python library to access a CalDAV server

I run ownCloud on my webspace for a shared calendar. Now I'm looking for a suitable python library to get read only access to the calendar. I want to put some information of the calendar on an intranet website.
I have tried http://trac.calendarserver.org/wiki/CalDAVClientLibrary but it always returns a NotImplementedError with the query command, so my guess is that the query command doesn't work well with the given library.
What library could I use instead?
I recommend the library, caldav.
Read-only access works really well with this library and looks straightforward to me. It does the whole job of fetching calendars and reading events, returning them in the iCalendar format. More information about the caldav library can be found in its documentation.
import caldav

client = caldav.DAVClient(<caldav-url>, username=<username>,
                          password=<password>)
principal = client.principal()
for calendar in principal.calendars():
    for event in calendar.events():
        ical_text = event.data
From this on you can use the icalendar library to read specific fields such as the type (e. g. event, todo, alarm), name, times, etc. - a good starting point may be this question.
I wrote this code a few months ago to fetch data from CalDAV and present it on my website.
I converted the data into JSON format, but you can do whatever you want with it.
I have added some prints so you can see the output; you can remove them in production.
from datetime import datetime
import json
from pytz import UTC  # timezone
import caldav
from icalendar import Calendar, Event

# CalDAV info
url = "YOUR CALDAV URL"
userN = "YOUR CALDAV USERNAME"
passW = "YOUR CALDAV PASSWORD"

client = caldav.DAVClient(url=url, username=userN, password=passW)
principal = client.principal()
calendars = principal.calendars()
if len(calendars) > 0:
    calendar = calendars[0]
    print ("Using calendar", calendar)
    results = calendar.events()
    eventSummary = []
    eventDescription = []
    eventDateStart = []
    eventdateEnd = []
    eventTimeStart = []
    eventTimeEnd = []
    for eventraw in results:
        event = Calendar.from_ical(eventraw._data)
        for component in event.walk():
            if component.name == "VEVENT":
                print (component.get('summary'))
                eventSummary.append(component.get('summary'))
                print (component.get('description'))
                eventDescription.append(component.get('description'))
                startDate = component.get('dtstart')
                print (startDate.dt.strftime('%m/%d/%Y %H:%M'))
                eventDateStart.append(startDate.dt.strftime('%m/%d/%Y'))
                eventTimeStart.append(startDate.dt.strftime('%H:%M'))
                endDate = component.get('dtend')
                print (endDate.dt.strftime('%m/%d/%Y %H:%M'))
                eventdateEnd.append(endDate.dt.strftime('%m/%d/%Y'))
                eventTimeEnd.append(endDate.dt.strftime('%H:%M'))
                dateStamp = component.get('dtstamp')
                print (dateStamp.dt.strftime('%m/%d/%Y %H:%M'))
                print ('')
    # Modify or change these values based on your CalDAV
    # Converting to JSON
    data = [{'Events Summary': eventSummary[0],
             'Event Description': eventDescription[0],
             'Event Start date': eventDateStart[0],
             'Event End date': eventdateEnd[0],
             'At:': eventTimeStart[0],
             'Until': eventTimeEnd[0]}]
    data_string = json.dumps(data)
    print ('JSON:', data_string)
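Note that the parallel lists above only make it into the JSON at index 0, i.e. the first event. A sketch of an alternative (field names are illustrative, the values stand in for what the VEVENT loop extracts) that builds one dictionary per event instead:

```python
import json

# Stand-ins for the values pulled out of each VEVENT in the loop above
events = [
    {'summary': 'Standup', 'start': '04/01/2015 09:00', 'end': '04/01/2015 09:15'},
    {'summary': 'Review',  'start': '04/01/2015 10:00', 'end': '04/01/2015 11:00'},
]

# Serializing the whole list keeps every event in the JSON,
# not just the first one
data_string = json.dumps(events)
```

Appending one such dict per iteration of the VEVENT loop would replace the six parallel lists entirely.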
pyOwnCloud could be the right thing for you. I haven't tried it, but it should provide a command-line tool/API for reading the calendars.
You probably want to provide more details about how you are actually making use of the API, but in case the query command is indeed not implemented, there is a list of other Python libraries at the CalConnect website (archived version, the original link is dead now).

Decoding utf-8 in Django while using unicode_literals in Python 2.7

I'm using Django to manage a Postgres database. I have a value stored in the database representing a city in Spain (Málaga). My Django project uses unicode strings for everything by putting from __future__ import unicode_literals at the beginning of each of the files I created.
I need to pull the city information from the database and send it to another server using an XML request. There is logging in place along the way so that I can observe the flow of data. When I try and log the value for the city I get the following traceback:
UnicodeEncodeError: 'ascii' codec can't encode character u'\xe1' in position 1: ordinal not in range(128)
Here is the code I use to log the values I'm passing.
def createXML(self, dict):
    """
    .. method:: createXML()
       Create a single-depth XML string based on a set of tuples
    :param dict: Set of tuples (simple dictionary)
    """
    xml_string = ''
    for key in dict:
        self.logfile.write('\nkey = {0}\n'.format(key))
        if (isinstance(dict[key], basestring)):
            self.logfile.write('basestring\n')
            self.logfile.write('value = {0}\n\n'.format(dict[key].decode('utf-8')))
        else:
            self.logfile.write('value = {0}\n\n'.format(dict[key]))
        xml_string += '<{0}>{1}</{0}>'.format(key, dict[key])
    return xml_string
I'm basically saving all the information I have in a simple dictionary and using this function to generate an XML formatted string - this is beyond the scope of this question.
The error I am getting had me wondering what was actually being saved in the database. I have verified the value is utf-8 encoded. I created a simple script to extract the value from the database, decode it and print it to the screen.
from __future__ import unicode_literals
import psycopg2

# Establish the database connection
try:
    db = psycopg2.connect("dbname = 'dbname' \
                           user = 'user' \
                           host = 'IP Address' \
                           password = 'password'")
    cur = db.cursor()
except:
    print "Unable to connect to the database."

# Get database info if any is available
command = "SELECT state FROM table WHERE id = 'my_id'"
cur.execute(command)
results = cur.fetchall()
state = results[0][0]
print "my state is {0}".format(state.decode('utf-8'))
Result: my state is Málaga
In Django I'm doing the following to create the HTTP request:
## Create the header
http_header = "POST {0} HTTP/1.0\nHost: {1}\nContent-Type: text/xml\nAuthorization: Basic {2}\nContent-Length: {3}\n\n"
req = http_header.format(service, host, auth, len(self.xml_string)) + self.xml_string
Can anyone help me correct the problem so that I can write this information to the database and be able to create the req string to send to the other server?
Am I getting this error as a result of how Django is handling this? If so, what is Django doing? Or, what am I telling Django to do that is causing this?
EDIT1:
I've tried to use Django's django.utils.encoding on this state value as well. I read a little from saltycrane about a possible hiccup Django might have with unicode/utf-8 stuff.
I tried to modify my logging to use the smart_str functionality.
def createXML(self, dict):
    """
    .. method:: createXML()
       Create a single-depth XML string based on a set of tuples
    :param dict: Set of tuples (simple dictionary)
    """
    xml_string = ''
    for key in dict:
        if (isinstance(dict[key], basestring)):
            if (key == 'v1:State'):
                var_str = smart_str(dict[key])
                for index in range(0, len(var_str)):
                    var = bin(ord(var_str[index]))
                    self.logfile.write(var)
                    self.logfile.write('\n')
                self.logfile.write('{0}\n'.format(var_str))
        xml_string += '<{0}>{1}</{0}>'.format(key, dict[key])
    return xml_string
I'm able to write the correct value to the log doing this but I narrowed down another possible problem with the .format() string functionality in Python. Of course my Google search of python format unicode had the first result as Issue 7300, which states that this is a known "issue" with Python 2.7.
Now, from another stackoverflow post I found a "solution" that does not work in Django with the smart_str functionality (or at least I've been unable to get them to work together).
I'm going to continue digging around and see if I can't find the underlying problem - or at least a work-around.
EDIT2:
I found a work-around by simply concatenating strings rather than using the .format() functionality. I don't like this "solution" - it's ugly, but it got the job done.
def createXML(self, dict):
    """
    .. method:: createXML()
       Create a single-depth XML string based on a set of tuples
    :param dict: Set of tuples (simple dictionary)
    """
    xml_string = ''
    for key in dict:
        xml_string += '<{0}>'.format(key)
        if (isinstance(dict[key], basestring)):
            xml_string += smart_str(dict[key])
        else:
            xml_string += str(dict[key])
        xml_string += '</{0}>'.format(key)
    return xml_string
I'm going to leave this question unanswered as I'd love to find a solution that lets me use .format() the way it was intended.
This is the correct approach (the problem was with opening the file; with UTF-8 you MUST use codecs.open()):
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import codecs

class Writer(object):
    logfile = codecs.open("test.log", "w", 'utf-8')

    def createXML(self, dict):
        xml_string = ''
        for key, value in dict.iteritems():
            self.logfile.write(u'\nkey = {0}\n'.format(key))
            if (isinstance(value, basestring)):
                self.logfile.write(u'basestring\n')
                self.logfile.write(u'value = {0}\n\n'.format(value))
            else:
                self.logfile.write(u'value = {0}\n\n'.format(value))
            xml_string += u'<{0}>{1}</{0}>'.format(key, value)
        return xml_string
And this is from python console:
In [1]: from test import Writer
In [2]: d = { 'a' : u'Zażółć gęślą jaźń', 'b' : u'Och ja Ci zażółcę' }
In [3]: w = Writer()
In [4]: w.createXML(d)
Out[4]: u'<a>Za\u017c\xf3\u0142\u0107 g\u0119\u015bl\u0105 ja\u017a\u0144</a><b>Och ja Ci za\u017c\xf3\u0142c\u0119</b>'
And this is test.log file:
key = a
basestring
value = Zażółć gęślą jaźń
key = b
basestring
value = Och ja Ci zażółcę
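The same pattern also works with io.open, which accepts an encoding argument on both Python 2.7 and 3.x (file name and value here are illustrative):

```python
# -*- coding: utf-8 -*-
import io

# io.open behaves like Python 3's built-in open() and takes an encoding,
# so unicode strings can be written without manual .encode() calls
with io.open("test_utf8.log", "w", encoding="utf-8") as logfile:
    logfile.write(u"value = {0}\n".format(u"M\xe1laga"))

with io.open("test_utf8.log", "r", encoding="utf-8") as logfile:
    content = logfile.read()
```

This sidesteps the UnicodeEncodeError from the question the same way codecs.open does, while using the interface Python 3 standardized on.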