libcurl multi asynchronous support - libcurl

From the examples and documentation, it seems the libcurl multi interface provides asynchronous support in batch mode, i.e. easy handles are added to the multi handle and then the requests are finally fired simultaneously with curl_multi_socket_action. Is it possible to trigger a request as soon as an easy handle is added, with control returning to the application once the request has been written to the socket?
EDIT:
It would help to fire requests in the model below, instead of firing them in a batch (assuming request creation on the client side and processing on the server take the same duration):
Client -----|-----|-----|-----|
Server < >|-----|-----|-----|----|

The multi interface returns "control" to the application as soon as it would otherwise block. It will therefore also return control after it has sent off the request.
But I guess you're asking how you can figure out exactly when the request has been sent? I think that's only really possible by using CURLOPT_DEBUGFUNCTION and watching for when the request goes out. Not really a convenient way...
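As a rough illustration of that approach, here is a minimal sketch (not from the thread) that watches for outgoing data in a CURLOPT_DEBUGFUNCTION callback; the callback name, the example URL and the perform/wait driving loop are my own choices, and CURLOPT_VERBOSE must be enabled for the callback to be invoked:

#include <curl/curl.h>
#include <cstdio>

// Sketch only: log when libcurl hands request headers/body to its send path.
// The exact moment the bytes leave the machine still depends on the socket.
static int debug_cb(CURL *handle, curl_infotype type,
                    char *data, size_t size, void *userptr)
{
    (void)handle; (void)data; (void)userptr;
    if (type == CURLINFO_HEADER_OUT)
        std::fprintf(stderr, "request headers handed off (%zu bytes)\n", size);
    else if (type == CURLINFO_DATA_OUT)
        std::fprintf(stderr, "request body chunk handed off (%zu bytes)\n", size);
    return 0;
}

int main()
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *easy = curl_easy_init();
    curl_easy_setopt(easy, CURLOPT_URL, "https://example.com/");
    curl_easy_setopt(easy, CURLOPT_DEBUGFUNCTION, debug_cb);
    curl_easy_setopt(easy, CURLOPT_VERBOSE, 1L); // required for the debug callback

    CURLM *multi = curl_multi_init();
    curl_multi_add_handle(multi, easy);

    int running = 0;
    do {
        curl_multi_perform(multi, &running);
        if (running)
            curl_multi_wait(multi, nullptr, 0, 1000, nullptr);
    } while (running);

    curl_multi_remove_handle(multi, easy);
    curl_easy_cleanup(easy);
    curl_multi_cleanup(multi);
    curl_global_cleanup();
    return 0;
}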

You can check this example:
https://curl.haxx.se/libcurl/c/hiperfifo.html
It's combined with libevent and libcurl.
When running, the program creates the named pipe "hiper.fifo".
Whenever there is input into the fifo, the program reads the input as a list of URLs and creates new easy handles to fetch each URL via the curl_multi "hiper" API.
The fifo buffer is handled almost instantly, so you can even add more URLs while the previous requests are still being downloaded.
libcurl then drives all the easy handles asynchronously via curl_multi_socket_action, so control returns to the application.
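To give a flavour of what that example does per URL, here is a small sketch (my own, not code from hiperfifo) of a helper that attaches a new easy handle to a multi handle that is already being driven elsewhere, e.g. by a libevent loop calling curl_multi_socket_action; the helper name and the options set are illustrative:

#include <curl/curl.h>
#include <string>

// Sketch: create an easy handle for one URL and hand it to an already-running
// multi handle. Adding the handle is all that is needed; the event loop that
// drives the multi handle picks up the new transfer on its next iteration.
static CURL *add_transfer(CURLM *multi, const std::string &url)
{
    CURL *easy = curl_easy_init();
    if (!easy)
        return nullptr;
    curl_easy_setopt(easy, CURLOPT_URL, url.c_str());
    curl_easy_setopt(easy, CURLOPT_FOLLOWLOCATION, 1L);
    curl_multi_add_handle(multi, easy); // non-blocking: just registers the transfer
    return easy;
}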

Related

Testing: When writing an HTTP response to the socket, write the headers, then sleep before writing the body

This is surely a weird one... I'm doing some extreme integration-style testing on a custom Java HTTP client for a backend service I'm working with. For reasons which don't matter here, the client has some specific quirks and a custom solution was the only real option.
For automated testing, I've built a "fake" version of the backend service by spinning up a Jetty server locally and having it behave in different ways, e.g. return 500, or wait 4 seconds before responding to simulate latency, and firing off a battery of tests against it with the client at build time.
Given the nature of this client, there is an unusual and specific scenario I need to test, and I'm trying to find a way to make my Jetty server behave in the correct fashion. Basically, when returning the HTTP response, I need to immediately return the HTTP headers and the first few bytes of the HTTP body and then sleep. The goal is to trigger a socket timeout in the client specifically when it is reading the HTTP body.
Anyone know where in Jetty I could plug something in to force this behaviour? I was looking at the Connector interface, but I'm not so sure that's the right place.
Thanks for any suggestions.
Write a few bytes to the HttpServletResponse.getOutputStream(), then call HttpServletResponse.flushBuffer() to immediately commit the response.
Bonus tip: use HttpServletResponse.sendError(-1) to terminate the connection abruptly.

Asynchronous Request Handling Using Multithreading

I am working on a module which uses 10 queues to handle threads, and each thread sends curl requests using the curl_easy interface (along with a lock) so that a single connection is maintained until the response is received. I want to enhance request handling by using the curl_multi interface, where curl requests are sent by the threads and handled in parallel.
I have written separate code to implement it. For instance, I created 3 threads, handled one by one; the first thread sends requests through curl_multi while it is running and transfers remain, allocating resources via the curl_easy interface for each transfer.
I have gone through a lot of examples but cannot figure out how to implement it in C++. Also, because I have only recently learnt multithreading and curl concepts in C++, I need assistance with the approach.
I expect a single thread to be able to keep sending curl requests until the user stops sending.
Update: I have added two threads and each sends two requests simultaneously. The curl_multi handle is fed from an array of curl_easy handles.
I want to keep it free of arrays because that limits the number of requests.
Can it be made asynchronous, accepting all transfers and exiting only when the client/user does? There are plenty of curl_multi examples, yet I am still not clear on its implementation.
Reading the curl_multi documentation, it doesn't seem as though you have to create different threads for this, as it works via multiple easy handles added to the multi handle object. You then call curl_multi_perform to drive all transfers in a non-blocking way.
I expect a single thread to be able to keep sending curl requests until the user stops sending.
I don't understand what you mean by this. Do you mean that you just want to keep those connections alive until everything is transferred? If so, curl_multi already gives you info on the progress of your transfers, which can help you determine what to do.
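As a rough sketch of the single-threaded approach described above (the URLs and the discard write callback are placeholders, not anything from the question):

#include <curl/curl.h>
#include <cstdio>
#include <string>
#include <vector>

// Sketch: one thread, one multi handle, one easy handle per URL. All transfers
// progress concurrently inside the curl_multi_perform loop; no extra threads.
static size_t discard(char *, size_t size, size_t nmemb, void *)
{
    return size * nmemb; // ignore response bodies in this sketch
}

int main()
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURLM *multi = curl_multi_init();

    std::vector<std::string> urls = {"https://example.com/a",
                                     "https://example.com/b"};
    std::vector<CURL *> handles;
    for (const auto &u : urls) {
        CURL *easy = curl_easy_init();
        curl_easy_setopt(easy, CURLOPT_URL, u.c_str());
        curl_easy_setopt(easy, CURLOPT_WRITEFUNCTION, discard);
        curl_multi_add_handle(multi, easy);
        handles.push_back(easy);
    }

    int running = 0;
    do {
        curl_multi_perform(multi, &running);                   // never blocks
        if (running)
            curl_multi_wait(multi, nullptr, 0, 1000, nullptr); // wait for activity

        int queued = 0;
        while (CURLMsg *msg = curl_multi_info_read(multi, &queued)) {
            if (msg->msg == CURLMSG_DONE)
                std::fprintf(stderr, "a transfer finished: %s\n",
                             curl_easy_strerror(msg->data.result));
        }
    } while (running);

    for (CURL *easy : handles) {
        curl_multi_remove_handle(multi, easy);
        curl_easy_cleanup(easy);
    }
    curl_multi_cleanup(multi);
    curl_global_cleanup();
    return 0;
}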
Hope it helps

Is there any way to build an interactive terminal using Django Channels with its current limitations?

It seems that with Django Channels there is no persistent state each time anything happens on the websocket. Even within the same websocket connection, you cannot preserve anything between calls to receive() on a class-based consumer. If it can't be serialized into the channel_session, it can't be stored.
I assumed that the class-based consumer would be persisted for the duration of the websocket connection.
What I'm trying to build is a simple terminal emulator, where a shell session would be created when the websocket connects. Received data would be passed as input to the shell, and the shell's output would be passed back out over the websocket.
I cannot find a way to persist anything between calls to receive(). It seems like they took all the bad things about HTTP and brought them over to websockets. With each call to connect(), receive(), and disconnect(), the whole Consumer class is reinstantiated.
So am I missing something obvious? Can I make another thread and have it read from a Group?
Edit: The answers to this can be found in the comments below. You can hack around it. Channels 3.0 will not instantiate the Consumers on every receive call.
The new version of Channels does not have this limitation. Consumers stay in memory for the duration of the websocket request.

libcurl multi interface single easy interface multiple requests

I am looking to use libcurl for asynchronous HTTP requests. For that I am using the multi interface provided by libcurl. My application will have many requests coming in periodically, for which I am looking to use a single easy handle added to the multi handle. I am not planning to use a new easy handle for each and every HTTP request because that opens a new connection with a new session. I need to make all requests over a single connection/session, so I am looking to use a single easy handle for all requests.
With this model, I am having issues making multiple HTTP requests. The first request goes through curl_multi_perform with no issue and the response is processed. The second request does not go through: when curl_multi_perform is called the second time, the running_handles parameter comes back as 0 and not 1.
This is the flow of API calls I am using, at a high level:
curl_easy_init()
curl_multi_init()
curl_multi_add_handle()
curl_multi_perform() // running_handles returned is 1.
//look for response (curl_multi_timeout, curl_multi_fdset, select, curl_multi_info_read, ...)
curl_multi_perform() // This does not work and running_handles returned is 0
...
curl_multi_cleanup()
curl_easy_cleanup()
Can't the libcurl multi interface be used with a single easy handle, added repeatedly for multiple requests coming in over a period of time?
Please help. Thanks in advance.
When an easy handle has completed its transfer and you want to re-use that same handle for a subsequent transfer, you need to first remove it from the multi handle (curl_multi_remove_handle) and (possibly set new options and then) re-add it with curl_multi_add_handle to make it start another transfer.
But note that when using the multi interface, the connection pool and reuse mechanism are owned by the multi handle rather than the easy handle, so connections can and will be re-used across easy handles as long as you keep the multi handle alive.
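A minimal sketch of that remove/re-add pattern with one easy handle, assuming placeholder URLs and a simple perform/wait driving loop (none of this is from the original question):

#include <curl/curl.h>
#include <cstdio>

// Sketch: drive the multi handle until the current transfer completes.
static void run_until_done(CURLM *multi)
{
    int running = 0;
    do {
        curl_multi_perform(multi, &running);
        if (running)
            curl_multi_wait(multi, nullptr, 0, 1000, nullptr);
    } while (running);
}

int main()
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *easy = curl_easy_init();
    CURLM *multi = curl_multi_init();

    const char *urls[] = {"https://example.com/first",
                          "https://example.com/second"};
    for (const char *url : urls) {
        curl_easy_setopt(easy, CURLOPT_URL, url);
        curl_multi_add_handle(multi, easy);    // starts a new transfer
        run_until_done(multi);
        curl_multi_remove_handle(multi, easy); // required before the next add
        std::fprintf(stderr, "finished %s\n", url);
    }
    // The multi handle owns the connection pool, so the second request can
    // reuse the connection opened by the first one.

    curl_easy_cleanup(easy);
    curl_multi_cleanup(multi);
    curl_global_cleanup();
    return 0;
}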

How to send a request and receive the response asynchronously from a .NET web service using gSOAP2

I have a .NET web service and a client program written in C++. The client program uses gSOAP2 to access the web service. The problem is that I need to make a client request and receive the response from the server asynchronously. I have searched a lot on Google and also read sections 7.3 and 7.4 of the gSOAP user guide, but I still can't figure out how to do it. Please help me if you know.
Many thanks,
Tien
I don't think that gSOAP means the same thing by asynchronous as you do: an asynchronous gSOAP client fires off a message and then forgets about it. From reading your question, my understanding is that you want to start the SOAP request/response process, go away and do something else, and then come back later or be notified when the response has been returned.
If this is the case, then I'd suggest you look at using threads to get the behaviour you want. Start a new thread to make the call; your main thread can then be notified, or can check back, when the call has completed. If you need data back from the call, I'd be tempted to write a worker thread that communicates via a pair of threadsafe queues: one queue to send requests into the thread and one to pass responses back out. So the main thread writes to the input queue and reads the output queue. If you search on here for "C++ threadsafe queue" you'll get lots more info.
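For illustration, here is a bare-bones sketch of that two-queue pattern in C++ (SafeQueue, do_blocking_call and the request strings are all invented placeholders; the blocking gSOAP call would go where do_blocking_call is):

#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <string>
#include <thread>

// Sketch: a worker thread pops requests from an input queue, performs the
// blocking call, and pushes results onto an output queue read by the caller.
template <typename T>
class SafeQueue {
public:
    void push(T value)
    {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            queue_.push(std::move(value));
        }
        cond_.notify_one();
    }
    T pop() // blocks until an item is available
    {
        std::unique_lock<std::mutex> lock(mutex_);
        cond_.wait(lock, [this] { return !queue_.empty(); });
        T value = std::move(queue_.front());
        queue_.pop();
        return value;
    }
private:
    std::mutex mutex_;
    std::condition_variable cond_;
    std::queue<T> queue_;
};

static std::string do_blocking_call(const std::string &request)
{
    return "response to " + request; // stand-in for the synchronous SOAP call
}

int main()
{
    SafeQueue<std::string> requests;
    SafeQueue<std::string> responses;

    std::thread worker([&] {
        std::string request = requests.pop();      // wait for work
        responses.push(do_blocking_call(request)); // hand the result back
    });

    requests.push("GetSomething"); // main thread is free to do other work here
    std::cout << responses.pop() << "\n";
    worker.join();
    return 0;
}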