I am trying to migrate my code from WSGI to ASGI/Channels' AsyncHttpConsumer. I can receive an HTTP request from Dialogflow, and then I can respond with either send() or send_response().
I can do something like
await self.send_response(200, b'response text',
                         headers=[(b"Content-Type", b"text/plain")])
and my Heroku server sends it out normally, but Dialogflow does not get anything back.
I have another WSGI application that just uses
from django.http import JsonResponse
...
fulfillmentText = {'fulfillmentText': "server works correctly"}
return JsonResponse(fulfillmentText, safe=False)
where this actually returns to Dialogflow correctly.
I tried to use JsonResponse on the ASGI/Channels side, but it just gives me an error that basically says I'm not using send_response correctly.
What do I need to do to convert my response correctly on the AsyncHttpConsumer side?
Figured it out.
I just needed to convert my dict to bytes using
response = json.dumps(mydict).encode('utf-8')
await self.send_response(200, response, headers=[(b"Content-Type", b"text/plain")])
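For completeness, a minimal sketch of the whole consumer, assuming a Dialogflow-style fulfillment reply; only the send_response() call itself comes from the answer above, and application/json is arguably the better Content-Type for a JSON body:
import json
from channels.generic.http import AsyncHttpConsumer

class DialogflowConsumer(AsyncHttpConsumer):
    async def handle(self, body):
        # send_response() expects a bytes body, so serialize and encode the dict
        reply = {"fulfillmentText": "server works correctly"}
        await self.send_response(
            200,
            json.dumps(reply).encode('utf-8'),
            headers=[(b"Content-Type", b"application/json")],
        )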
I have a Flask method that takes a URL and returns the zip file's content. The content should be returned to the axios get method. Flask code:
zip_file = requests.get(url)
if zip_file.ok:
    return {
        'file': zip_file.content
    }
But it doesn't work; it raises the exception TypeError(repr(o) + " is not JSON serializable").
How do I fix it? I saw some solutions with encoding/decoding, but I'm not sure how to use them, especially on the axios side after it gets the response (I can't use a direct link for security reasons).
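One common fix, sketched here as an assumption rather than a tested answer: base64-encode the raw bytes so the dict becomes JSON-serializable, and base64-decode the string again on the axios side.
import base64
import requests

def fetch_zip(url):
    zip_file = requests.get(url)
    if zip_file.ok:
        # b64encode returns bytes; decode to str so json can serialize it
        return {
            'file': base64.b64encode(zip_file.content).decode('ascii')
        }
If the endpoint does not have to return JSON, returning the raw bytes with Flask's send_file would avoid the encode/decode round trip entirely.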
In tests I am making an API call which returns 400. That is expected, but I can't find a way to debug it. Does Django keep logs in a file somewhere, or can I enable log output?
res = self.client.post(self.url, data=payload, format='json')
print(res)
# <Response status_code=400, "application/json">
I know something went wrong, but how do I debug the server?
Thanks
You can use response.content to view the final content/error messages that would be rendered on the web page, as a bytestring. docs
>>> response = c.get('/foo/bar/')
>>> response.content
b'<!DOCTYPE html...
If you are returning a JSON response (which you probably are if you're using rest framework), you can use response.json() to parse the JSON. docs
>>> response = client.get('/foo/')
>>> response.json()['name']
'Arthur'
Note: If the Content-Type header is not "application/json", then a ValueError will be raised when trying to parse the response. Be sure to handle it properly.
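Putting both together, a hedged sketch for the test in the question (self.client, self.url, and payload are assumed from the snippet above):
res = self.client.post(self.url, data=payload, format='json')
try:
    # DRF validation errors show up here, e.g. {'name': ['This field is required.']}
    print(res.json())
except ValueError:
    # the body was not JSON; fall back to the raw bytes
    print(res.content)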
Request is {"name":"vikas","age":20}
I am using the following code to return the response from the server.
URL: "localhost:3000"
status_code, employee_data = webutils.make_request(settings.URL + "get_data/",
                                                   method='POST',
                                                   body=json.dumps(request.data),
                                                   json=True)
return Response(employee_data)
When I use Postman to access the URL localhost:3000/get_data/, it returns the correct format.
But through the Django framework the response comes back as false.
Please let me know whether the following code I have written in Django is correct:
webutils.make_request(settings.URL + "get_data/", method='POST', body=json.dumps(request.data), json=True)
Try returning the JSON-dumped data with the HttpResponse class.
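A minimal sketch of that suggestion; employee_data here is a stand-in for whatever webutils.make_request returned:
import json
from django.http import HttpResponse

def get_data(request):
    # stand-in for the data returned by make_request
    employee_data = {"name": "vikas", "age": 20}
    return HttpResponse(json.dumps(employee_data),
                        content_type='application/json')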
Alternatively, try the Python requests library.
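For example, a hedged sketch that swaps webutils.make_request for requests; settings.URL and request.data are taken from the question, the view name is an assumption:
import requests
from django.conf import settings
from rest_framework.decorators import api_view
from rest_framework.response import Response

@api_view(['POST'])
def get_data(request):
    # Forward the incoming payload to the backend service and relay its JSON
    resp = requests.post(settings.URL + "get_data/", json=request.data)
    return Response(resp.json())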
How can I test a Scrapy spider against online data?
I know from this post that it is possible to test a spider against offline data.
My goal is to check whether my spider still extracts the right data from a page, or whether the page has changed. I extract the data via XPath, and sometimes the page receives an update and my scraper stops working. I would love to have the test as close to my code as possible, e.g. using the spider and Scrapy setup and just hooking into the parse method.
Referring to the link you provided, you could try this method for online testing, which I used for a problem similar to yours. Instead of reading the request from a file, use the requests library to fetch the live web page and compose a Scrapy response from the result, like below:
import requests
from scrapy.http import Request, TextResponse

def online_response_from_url(url=None):
    if not url:
        url = 'http://www.example.com'
    request = Request(url=url)
    # Fetch the live page with requests, then wrap it in a Scrapy response
    oresp = requests.get(url)
    response = TextResponse(url=url, request=request,
                            body=oresp.text, encoding='utf-8')
    return response
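A possible usage sketch, hooking the live response into the spider's parse method from a test; MySpider and the assertion are assumptions, not part of the answer above:
def test_parse_live_page():
    spider = MySpider()
    response = online_response_from_url('http://www.example.com')
    items = list(spider.parse(response))
    # If the page layout changed, the XPath selectors come back empty
    assert items, "parse() extracted nothing - the page may have changed"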
I have a view in Django that returns error 500 and I can't figure out why. It looks something like this:
import simplejson
from django.http import HttpResponse

def some_view(request):
    result = some_func(request.raw_post_data)
    response = HttpResponse(status=200)
    response['Content-Type'] = 'application/json'
    response.content = simplejson.dumps(result)
    # if I log something here, it will be printed, so control reaches here
    return response
So it looks like my view is working correctly and something then happens in Django internals, but I'm unable to trace where exactly it happens. Any hint on what it is or how to find it?
Things that might be important:
I'm running Python 2.5 and Django 1.1.4
POST data contains a JSON array with around 1000 string entries, 50 bytes each
the response is around 100KiB
other views seem to work perfectly fine
DB operations are involved
The problem was that the FastCGI module in Apache has a 30-second timeout, and it took Django more than 30 seconds to prepare the response. Apache then returned its generic 500 error message.
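If raising the limit is acceptable, mod_fastcgi exposes an -idle-timeout option (default 30 seconds); a hedged Apache config sketch with placeholder paths and ports:
# Give Django up to 120 s before Apache gives up on the FastCGI backend
FastCgiExternalServer /var/www/app.fcgi -host 127.0.0.1:8000 -idle-timeout 120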