I'm having a difficult time adapting libcurl to a particular situation. What I'm doing is essentially loading a variable number of objects into memory, performing various transforms on them, and then uploading them (serialized binary data, of course) as parts of a multipart POST.
The part I'm struggling with is that I want to add each object as a part as it comes off this pipeline, then delete it after that particular part is posted.
I have thought about giving it a read function pointer and manually feeding the buffer with the part headers and data in the callbacks, but this approach seems like quite a hack.
I have tried the regular multipart approach (with the multi handle), but that seems to require all the data up front, or to read it from a file, which I do not want libcurl to deal with.
To recap, I want to open a connection, start an HTTP multipart POST request -> get an in-memory buffer -> add it as a POST attachment (multipart) -> send that off -> wait for the next chunk of data -> repeat until done.
Thanks in advance.
Use the curl_formadd() function to prepare a multipart/form-data HTTP POST, and then use the CURLOPT_HTTPPOST option to actually send it. curl_formadd() has a CURLFORM_STREAM option that enables the connection's CURLOPT_READFUNCTION callback, so you can custom-stream each part's data.
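A minimal sketch of that shape (the PartSource struct, field names, and URL are placeholders, not libcurl API): each part carries its own stream state via CURLFORM_STREAM, and libcurl pulls the bytes through the read callback. Note that CURLFORM_CONTENTSLENGTH must be supplied when streaming, so you need each object's size up front. (On libcurl 7.56+, the curl_mime API with curl_mime_data_cb() supersedes curl_formadd(), but the shape is the same.)

```cpp
#include <curl/curl.h>
#include <cstring>

// Per-part stream state; the struct and its fields are placeholders.
struct PartSource {
    const char* data;  // serialized object bytes
    size_t len;
    size_t pos;
};

// CURLOPT_READFUNCTION callback: libcurl pulls each part's bytes from here.
// Returning 0 tells libcurl this part is finished.
static size_t read_cb(char* buffer, size_t size, size_t nitems, void* userdata) {
    PartSource* src = static_cast<PartSource*>(userdata);
    size_t want = size * nitems;
    size_t left = src->len - src->pos;
    size_t take = left < want ? left : want;
    std::memcpy(buffer, src->data + src->pos, take);
    src->pos += take;
    return take;
}

int main() {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL* curl = curl_easy_init();

    PartSource part{"example payload", 15, 0};

    curl_httppost* post = nullptr;
    curl_httppost* last = nullptr;
    // CURLFORM_STREAM hands &part to the read callback as userdata;
    // CURLFORM_CONTENTSLENGTH is mandatory when streaming a part.
    curl_formadd(&post, &last,
                 CURLFORM_COPYNAME, "object1",
                 CURLFORM_STREAM, &part,
                 CURLFORM_CONTENTSLENGTH, static_cast<long>(part.len),
                 CURLFORM_END);

    curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/upload");  // placeholder
    curl_easy_setopt(curl, CURLOPT_READFUNCTION, read_cb);
    curl_easy_setopt(curl, CURLOPT_HTTPPOST, post);
    curl_easy_perform(curl);

    curl_formfree(post);
    curl_easy_cleanup(curl);
    curl_global_cleanup();
}
```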
Sorry for absolutely murdering the title, but I am not sure how to frame this question; please edit it if there is a better way of explaining my problem.
I am reading a bitstream from a program, converting it into JSON data, and writing it to a socket, where another program reads this data and appends it to a log.json file. I am doing all of this in C++.
Now I want to display this data in a better way. So why not display it in an HTML document, with some CSS applied to it?
My first thought was to simply fetch the file with JavaScript, but nowadays this throws an error: browsers block fetch/XMLHttpRequest for local file:// URLs under the same-origin policy.
My second thought was to create a simple Node.js server which accepts GET requests and use it to serve the file. But this feels like a bit of overkill.
My third thought is to reuse my original server (which continuously reads from the socket) and have it also accept HTTP requests. But then I would have to multithread it, which again seems like overkill (a single-threaded alternative is sketched below).
So I'm falling back to needing two different "servers": one that reads from the socket and appends to the log file, and another to serve this file to the website.
Am I thinking about this wrong? What would be a good way to solve this?
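One way to avoid both the second server and the threads is to multiplex the two sockets in a single loop. A minimal POSIX poll() skeleton, where the ports and the HTTP handling are placeholders:

```cpp
#include <poll.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>

// Placeholder setup: the usual socket/bind/listen dance for one port.
static int make_listener(int port) {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = INADDR_ANY;
    addr.sin_port = htons(port);
    bind(fd, reinterpret_cast<sockaddr*>(&addr), sizeof addr);
    listen(fd, 8);
    return fd;
}

int main() {
    int data_fd = make_listener(6000);  // the existing JSON feed
    int http_fd = make_listener(8080);  // HTTP endpoint for the page

    pollfd fds[2] = {{data_fd, POLLIN, 0}, {http_fd, POLLIN, 0}};
    for (;;) {
        poll(fds, 2, -1);  // sleep until either socket has work
        if (fds[0].revents & POLLIN) {
            // accept()/read the JSON producer and append to log.json here
        }
        if (fds[1].revents & POLLIN) {
            // accept() the browser, write a minimal "HTTP/1.1 200 OK"
            // header followed by the contents of log.json, then close
        }
    }
}
```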
I want to use Bokeh to display a time series and provide the data with updates via source.stream(someDict). The data is, however, generated by a C++ application (server) that may run on the same machine or on a machine in the network. I was looking into transmitting the updated data (only the newly added lines of the time series) via ZMQ to the Python program (client).
The transmission of the message seems easy enough to implement, but the dictionary is column-based. Would it not be more efficient to append rows, i.e. one line per point in time, and send those?
If there is no good way to do the first, what kind of object should I send? Do I need to marshal the information, or is it sufficient to build a long string like {col1:[a,b,c,...], col2:[...],...} and send this to the client? I expect to send no more than a few hundred lines with 10 floats per second.
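For illustration, a minimal sketch of the row-per-timestep variant on the C++ side, assuming the cppzmq binding (>= 4.3); the endpoint, the PUSH/PULL pattern, and the CSV framing are all assumptions, not requirements:

```cpp
#include <zmq.hpp>  // cppzmq header-only binding, >= 4.3 assumed
#include <sstream>
#include <string>
#include <vector>

int main() {
    zmq::context_t ctx(1);
    zmq::socket_t sock(ctx, zmq::socket_type::push);  // PUSH/PULL pair assumed
    sock.bind("tcp://*:5556");                        // placeholder endpoint

    for (double t = 0.0; t < 10.0; t += 0.1) {
        std::vector<double> samples(10, t);  // stand-in for the real values
        std::ostringstream line;
        line << t;
        for (double v : samples) line << ',' << v;  // "t,v0,v1,...,v9"
        const std::string msg = line.str();
        sock.send(zmq::buffer(msg), zmq::send_flags::none);  // one row per message
    }
}
```

At a few hundred rows of ten floats per second this is tiny either way; the Python client can split each line and assemble the column-based dict that source.stream() expects.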
Thanks for all helpful answers.
I am quite new to parsing text files. While googling a bit, I found out that a parser usually builds a tree structure out of a text file. Most of the examples consist of parsing files, which in my view is quite static: you load the file into the parser and get the output.
My problem is somewhat different from parsing files. I have a stream of JSON data coming from a server socket at TCP port 6000, and I need to parse the incoming data. I have some questions in mind:
1) Do I need to save the incoming JSON data on the client side in some sort of buffer? I think yes, I need to save it, but are there any parsers which can work on it directly, like passing the JSON object as an argument to the parse function?
2) What would the structure of the real-time parser look like? On Google only static parse-tree structures are described. In my view each object is parsed into some sort of parse tree and then deleted from memory; otherwise it would cause memory overflow, because the data is continuous.
There are some parser libraries available, like JSON-C and JSON lib. One more thing which comes to my mind: can we save a JSON object in a C/C++ array? I just thought of that but could not figure out how to do it.
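A streaming tokenizer addresses both questions at once: you keep only a small per-read buffer, the tokener carries its own state across reads, and each completed object is handed to you and then freed. A minimal sketch with json-c (the function name, buffer size, and the already-connected descriptor fd are assumptions; json_tokener_get_parse_end() needs json-c 0.14+):

```cpp
#include <json-c/json.h>
#include <sys/socket.h>
#include <sys/types.h>
#include <cstdio>

// fd is assumed to be a TCP socket already connected to port 6000.
void parse_stream(int fd) {
    json_tokener* tok = json_tokener_new();
    char buf[4096];
    ssize_t n;
    while ((n = recv(fd, buf, sizeof buf, 0)) > 0) {
        size_t off = 0;
        while (off < static_cast<size_t>(n)) {
            json_object* obj =
                json_tokener_parse_ex(tok, buf + off, static_cast<int>(n - off));
            enum json_tokener_error jerr = json_tokener_get_error(tok);
            if (obj) {
                // One complete object arrived: use it, then release it so
                // memory stays flat while the stream runs forever.
                std::printf("%s\n", json_object_to_json_string(obj));
                off += json_tokener_get_parse_end(tok);  // resume after this object
                json_object_put(obj);
                json_tokener_reset(tok);
            } else if (jerr == json_tokener_continue) {
                break;  // object spans reads; the tokener remembers its state
            } else {
                std::fprintf(stderr, "parse error: %s\n",
                             json_tokener_error_desc(jerr));
                json_tokener_reset(tok);
                break;  // drop the rest of this chunk
            }
        }
    }
    json_tokener_free(tok);
}
```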
I have a Qt TCP server and client program which can interact with each other. The server can send some function-generated data to the socket using QTextStream, and the client reads the data from the socket using a simple readAll() and displays it in a QTextEdit.
Now the data on my server side is huge (around 7000+ samples), and I need the data to appear on the client side instantaneously. I had learned that using XML would help in my case, so I made a Qt XML server which generates the whole XML data set into a .xml file. I read the .xml file on the client side and can display its contents; I used the DOM method for parsing. But the data is displayed only once all 7000+ samples have been generated on the server side.
I need clarifications on these questions:
How do I write each element on the XML server side into a string and send it through the socket? I learned that tagName() can help me, but I have not been able to figure out how.
Is there any way, other than the string method, to get a single element generated on the server side to appear on the client side?
PS: I am a newbie, forgive my ignorance. Thank you.
Most DOM XML parsers require a complete, well-formed XML document before they'll do anything with it. That's precisely what you see: your data is processed only after all of the samples have been received.
You need to use an incremental parser that doesn't care about the XML document not being complete yet.
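In Qt that means QXmlStreamReader, which supports exactly this incremental mode: feed it whatever bytes have arrived and it resumes where it left off after the next addData(). A minimal sketch (Qt 5 connect syntax; the <sample> element name is an assumption about your wire format):

```cpp
#include <QtNetwork/QTcpSocket>
#include <QXmlStreamReader>
#include <QDebug>

// Attach an incremental parser to the client socket.
void attachParser(QTcpSocket* socket) {
    auto* xml = new QXmlStreamReader;
    QObject::connect(socket, &QTcpSocket::readyRead, [socket, xml]() {
        xml->addData(socket->readAll());  // feed whatever has arrived so far
        while (!xml->atEnd()) {
            xml->readNext();
            if (xml->isStartElement() && xml->name() == QLatin1String("sample")) {
                // A sample element has started: extract and display it here.
            }
        }
        // PrematureEndOfDocumentError simply means "wait for more bytes".
        if (xml->hasError() &&
            xml->error() != QXmlStreamReader::PrematureEndOfDocumentError) {
            qWarning() << "XML error:" << xml->errorString();
        }
    });
}
```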
On the other hand: if you're not requiring XML for interoperability with 3rd party systems, you're probably wasting a lot of resources by using it. I don't know where you've "learned" that XML will "help in your case". To me it's not learning, it's just following the crowd without understanding what's going on. Is your requirement to use XML or to move the data around? Moving data around has been a well understood problem for decades. Computers "speak" binary. No need to work around it, you know. If all you need is to move around some numbers, use QDataStream and be done with it. It'll be two orders of magnitude faster than the fastest XML parsers, you'll transmit an order of magnitude less data, and everyone will live happily ever after*.
*living happily ever after not guaranteed, individual results may vary.
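If you go the QDataStream route, a minimal sketch of the idea; the record layout (one quint64 index plus one double, 16 bytes on the wire) is an assumption to adapt to your samples:

```cpp
#include <QtNetwork/QTcpSocket>
#include <QDataStream>

// Server side: write one sample per fixed-size record.
void sendSample(QTcpSocket* socket, quint64 index, double value) {
    QDataStream out(socket);
    out.setVersion(QDataStream::Qt_5_0);  // pin the format on both ends
    out << index << value;
}

// Client side: on readyRead, consume as many complete records as arrived.
void readSamples(QTcpSocket* socket) {
    QDataStream in(socket);
    in.setVersion(QDataStream::Qt_5_0);
    const qint64 recordSize = sizeof(quint64) + sizeof(double);  // 16 bytes
    while (socket->bytesAvailable() >= recordSize) {
        quint64 index;
        double value;
        in >> index >> value;
        // Append (index, value) to the QTextEdit/plot immediately.
    }
}
```

Because every record has a fixed size, the bytesAvailable() guard is all the framing you need; each sample can be displayed the moment it arrives.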
We're using XTK to display data processed and created on a server; in our particular case, it's a parallel isocontouring application. As it currently stands, we're converting to the (textual) VTK format and passing the entire (imaginary) VTK file over the wire to the client, where XTK renders it. This incurs substantial overhead, as the text format outweighs the in-memory format by a considerable amount.
Is there a recommended mechanism available for transmitting binary data directly, either through an alternate format that is well-described or by constructing XTK primitives inside the JavaScript code itself?
Parsing an X.object from JSON should be supported. So you could generate the JSON on the server side and use the X.object(jsonobject) copy constructor to safely down-cast it. This should also give the advantage that the objects can be 'webgl-ready' and do not require any client-side parsing, which should result in instant loading.
I was planning to play with that myself soon, but if you get anything to work, please let us know.
Just keep in mind that you need to match the X.object structure even in JSON. The best way to see what xtk expects is to JSON.stringify a webgl-ready X.object.
XMLHttpRequest, in its second specification (the latest one), allows cross-domain HTTP requests (but you must have control of the response headers on the server side, e.g. via PHP's header()).
In addition, it allows sending ArrayBuffers, Blobs, or Documents (look here). Then on the client side you can write your own parser for that Blob or (I think it fits your case better) that ArrayBuffer, using typed-array views (see the doc here). However, XMLHttpRequest goes from client to server; but look at HTML5 WebSockets, it seems they can transfer binary arrays too (they say it here: ).
In every case you will need a parser on the client side to transform the binary data into a string or an X.object.
I hope this helps you.