How to use libpqxx to receive notifications from the PostgreSQL database? - c++

I'm writing a C++ application which needs to receive notifications of data changes from PostgreSQL through the libpqxx library, but its tutorial doesn't cover this use case. The notifications must be received on multiple channels. Also, I'm using boost::asio as my networking library, and if possible I'd prefer to use asio socket classes with asynchronous callbacks for notification events instead of polling raw BSD-style sockets. Can someone provide sample code for this, or links to external resources describing how this can be achieved?

You need a class derived from pqxx::notification_receiver; see http://pqxx.org/devprojects/libpqxx/doc/4.0/html/Reference/a00208.html ("Notifications and Receivers") and
http://pqxx.org/devprojects/libpqxx/doc/4.0/html/Reference/a00062.html, which is the API reference for notification_receiver.
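
To make that concrete, below is a minimal sketch assuming a libpqxx 4.x-style API and Boost.Asio 1.66 or later (POSIX only). The connection string, the channel names, and the idea of dup()-ing the connection's socket so Asio can watch it for readability are assumptions made for the example, not something the libpqxx docs prescribe.

```cpp
// Sketch only: one notification_receiver per channel, dispatched from an
// asio readability callback instead of a blocking await_notification() loop.
#include <functional>
#include <iostream>
#include <string>
#include <unistd.h>            // ::dup

#include <boost/asio.hpp>
#include <pqxx/pqxx>

class change_listener : public pqxx::notification_receiver
{
public:
    change_listener(pqxx::connection_base &c, const std::string &channel)
        : pqxx::notification_receiver(c, channel) {}

    // Called by libpqxx for every NOTIFY delivered on this channel.
    void operator()(const std::string &payload, int backend_pid) override
    {
        std::cout << "channel=" << channel()
                  << " payload=" << payload
                  << " backend=" << backend_pid << '\n';
    }
};

int main()
{
    boost::asio::io_context io;
    pqxx::connection conn("dbname=mydb");            // example connection string

    change_listener orders(conn, "orders_changed");  // issues LISTEN orders_changed
    change_listener prices(conn, "prices_changed");  // issues LISTEN prices_changed

    // Let asio watch the connection's underlying socket.  dup() it so that
    // the stream_descriptor's close() does not close libpqxx's own fd.
    boost::asio::posix::stream_descriptor pg_socket(io, ::dup(conn.sock()));

    std::function<void()> arm = [&]
    {
        pg_socket.async_wait(
            boost::asio::posix::stream_descriptor::wait_read,
            [&](const boost::system::error_code &ec)
            {
                if (ec) return;
                conn.get_notifs();   // invoke the receivers for pending NOTIFYs
                arm();               // re-arm for the next notification
            });
    };
    arm();

    io.run();
}
```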

Related

NodeJS server send data to C++ process

I have a Node.js server which receives user POST/streaming requests from a web UI.
I have a C++ back-end engine process which does some calculations and sends API calls to other 3rd-party services. The API calls require certain info provided by the web users.
My question is: what is the best way to pass the request data received by Node.js over to the C++ process?
WebUI -> Node.js -> ??? -> C++ engine
Make your C++ application listen on a TCP or Unix socket.
Make your Node.js application connect to that socket and exchange messages. For messages you can use Google Protocol Buffers, JSON, etc.
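As a rough illustration of this approach, here is a minimal sketch of the C++ side using Boost.Asio; the port number and the newline-delimited framing are assumptions made for the example, and the Node.js process would simply connect to the same port and write one JSON string per line.

```cpp
// Sketch only: accept one local client and read newline-delimited messages.
#include <iostream>
#include <string>
#include <boost/asio.hpp>

int main()
{
    using boost::asio::ip::tcp;
    boost::asio::io_context io;          // io_service on older Boost

    tcp::acceptor acceptor(io, tcp::endpoint(tcp::v4(), 5555));  // example port
    tcp::socket socket(io);
    acceptor.accept(socket);             // wait for the Node.js client

    boost::asio::streambuf buf;
    for (;;)
    {
        boost::system::error_code ec;
        boost::asio::read_until(socket, buf, '\n', ec);  // one message per line
        if (ec) break;                                   // client closed or error

        std::istream is(&buf);
        std::string line;
        std::getline(is, line);
        std::cout << "got message: " << line << '\n';    // parse JSON/protobuf here
    }
}
```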
If the information you have is still at the JavaScript layer, then you have to implement a C/C++ addon. If you already have some kind of native module, then you may follow the same design (very likely the existing module is based on NAN). If you plan to introduce a brand-new native module, then it is a good time to consider N-API. You can get more information about it from:
https://nodejs.org/dist/latest-v11.x/docs/api/n-api.html
https://github.com/nodejs/node-addon-api
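
For the N-API route, a minimal module built with node-addon-api looks roughly like the following (the module and function names are placeholders); it is compiled with node-gyp and then require()'d from the Node.js server like any other module.

```cpp
// addon.cc - node-addon-api sketch; names are illustrative only.
#include <string>
#include <napi.h>

// Receives a string from JavaScript and hands it to the C++ engine.
Napi::Value SendToEngine(const Napi::CallbackInfo &info)
{
    Napi::Env env = info.Env();
    std::string payload = info[0].As<Napi::String>().Utf8Value();
    // ... forward `payload` to the engine here ...
    return Napi::String::New(env, "queued: " + payload);
}

Napi::Object Init(Napi::Env env, Napi::Object exports)
{
    exports.Set("sendToEngine", Napi::Function::New(env, SendToEngine));
    return exports;
}

NODE_API_MODULE(engine_addon, Init)
```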

Recommended way to communicate from a C++ application to an Akka actor

I have a C++ application that needs to send structured data to an Akka actor. The best option I found (Google, Stack Overflow...) is to use Protocol Buffers and ZeroMQ, since it looks like everyone recommends it.
However, I struggled the whole day trying to make it work, getting various crashes in my Scala actor code (with strange Windows socket errors). When I took a deeper look, I noticed that ZeroMQ seems to have disappeared from the official Akka documentation a while ago, and the most recent documentation I read said that ZeroMQ 3 was still not supported by the underlying zeromq-scala-bindings (while version 4 is already out).
Would it be a better option to use the Camel-netty extension and pass the information through JSON?
Thanks!
A fairly simple way would be to write an HTTP endpoint using Spray.io. Spray has JSON support, and since it is built on Akka it communicates seamlessly with other actors. This has the advantage that the data you send to the endpoint does not have to match the message format the actor is expecting: you can change the message the actor expects without changing what your C++ code sends. For bi-directional communication there is also WebSocket support.
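On the C++ side, posting JSON to such an HTTP endpoint can be as simple as the following libcurl sketch; the URL, route, and payload are made up for the example.

```cpp
// Sketch only: POST one JSON message to the HTTP endpoint exposed by Spray.
#include <string>
#include <curl/curl.h>

int main()
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl) return 1;

    const std::string body = R"({"sensor": "s1", "value": 42})";   // example payload

    curl_slist *headers = nullptr;
    headers = curl_slist_append(headers, "Content-Type: application/json");

    curl_easy_setopt(curl, CURLOPT_URL, "http://localhost:8080/messages");  // example URL
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, body.c_str());

    CURLcode res = curl_easy_perform(curl);   // blocks until the response arrives

    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return res == CURLE_OK ? 0 : 1;
}
```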

How to communicate between C++ server app and django web app

I have some framework doing a specific task in C++ and a Django-based web app. The idea is to launch this framework, receive some data from it, send it some data or requests, and check its status periodically.
I'm looking for the best way of communicating. Both apps run on the same server. I was wondering if a JSON server in C++ is a good idea. Django would send a request to this server, and the server would parse it and delegate a worker thread to complete the task. Almost all data that needs to be sent is string-like. Other data will be stored in the database, so there is no problem with that.
Is JSON a good idea? Maybe you know some better mechanism for local communication between C++ and django?
If your requirement is guaranteed to always have the C++ application on the same machine as the Django web application, include the C++ code by converting it into a shared library and wrapping Python around it, just like this: Calling C/C++ from python?
JSON and other serializations make sense if you are going to do remote calls and the code needs to communicate across machines.
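A minimal sketch of the shared-library approach: expose the C++ entry points with C linkage, build a shared library, and load it from Django with ctypes. The function name, signature, and library name below are illustrative.

```cpp
// engine.cpp - compile with: g++ -shared -fPIC -o libengine.so engine.cpp
// Exposed with C linkage so Python's ctypes can load it directly, e.g.
//   ctypes.CDLL("./libengine.so").run_task(b'{"cmd": "..."}')
#include <string>

extern "C" {

// Illustrative entry point: run one task described by a string and
// report success (0) or failure (-1).  Real code would return richer data.
int run_task(const char *request)
{
    std::string req(request ? request : "");
    // ... perform the actual work here ...
    return req.empty() ? -1 : 0;
}

}  // extern "C"
```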
JSON seems like a fair enough choice for data serialization - it's good at handling strings and there's existing libraries for encoding/decoding JSON in both python & C++.
However, I think your bigger problem is likely to be the transport protocol that you use for transferring JSON between your client and server. Here's some options:
You could build an HTTP server into your C++ application (which I think might be what you mean by "JSON server" in your question), which would work fine, though might be a bit of a pain to implement unless you get a hold of a library to handle the hard work for you.
Another option might be to use the 0MQ library to send JSON (or other) messages between your client and server; a minimal sketch of this option follows this answer. I think this would probably be a lot easier than implementing a full HTTP server, and 0MQ has some interprocess communication transports that would likely be a lot faster than sending things over the network.
A third option would be to run your C++ as a standalone application and pass the data to it via stdin or command-line parameters. This is probably the simplest way to do things, though it may not be the most flexible. If you were to go this way, you might be better off just building a Python/C++ binding as suggested by ablm.
Alternatively, you could attempt to build some sort of job queue based on redis or another database system. The idea is that your Django application puts some JSON describing the job into the job queue, the C++ application periodically polls the queue, and a separate redis entry is used to pass the results back to the client. This has the advantage that you could reasonably easily have several "workers" reading from the job queue.
There are almost certainly other ways to go about it, but those are the ones that immediately spring to mind.
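For the 0MQ option above, the C++ side could look roughly like the sketch below (written against the older cppzmq API); the ipc endpoint and the JSON payloads are just examples, and the Django side would connect a REQ socket to the same endpoint with pyzmq.

```cpp
// engine_server.cpp - minimal 0MQ REP server sketch (older cppzmq API).
#include <cstring>
#include <iostream>
#include <string>
#include <zmq.hpp>

int main()
{
    zmq::context_t ctx(1);
    zmq::socket_t sock(ctx, ZMQ_REP);
    sock.bind("ipc:///tmp/engine.sock");   // local-only endpoint; use tcp:// for remote

    while (true)
    {
        zmq::message_t request;
        sock.recv(&request);               // blocks until Django sends a job

        std::string job(static_cast<char *>(request.data()), request.size());
        std::cout << "received job: " << job << '\n';   // parse the JSON, do the work

        std::string result = R"({"status": "ok"})";
        zmq::message_t reply(result.size());
        std::memcpy(reply.data(), result.data(), result.size());
        sock.send(reply);                  // answer the waiting REQ socket
    }
}
```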

how to implement a protocol adapter

I want to implement an adapter which provides a universal interface for clients to use sockets, OPC, message queues, etc. In other words, it is a non-trivial job to learn to use the APIs of the three protocols above.
For example, if the client wants to communicate with an external socket server, the only thing they should have to do is use our simple API rather than the complex BSD socket API.
I want to know whether there is any existing implementation I can learn from. Thanks!
ZeroMQ provides a socket like API that allows you to abstract away the transport mechanism. Currently it supports in process, shared memory, PGM, and TCP as transport mechanisms.
Google has protobuf, I think it's called, and there is another one I've seen mentioned, but it escapes me at the moment. Check here for information on protobuf.

Client and server

I would like to create a connection between two applications. Should I be using client-server, or is there another way of efficiently communicating between them? Are there any premade C++ networking client/server libraries which are easy to use/reuse and implement?
Application #1 <---> (Client) <---> (Server) <---> Application #2
Thanks!
Client / server is a generic architecture pattern (much like factory, delegation, inheritance, bridge are design patterns). What you probably want is a library to eliminate the tedium of packing and unpacking your data in a format that can be sent over the wire. I strongly recommend you take a look at the protocol buffers library, which is used extensively at Google and released as open source. It will automatically encode / decode data, and it makes it possible for programs written in different languages to send and receive messages of the same type with all the dirty work done for you automatically. Protobuf only deals with encoding, not actually sending and receiving. For that, you can use primitive sockets (strongly recommend against that) or the Boost.Asio asynchronous I/O library.
I should add that you seem to be confused about the meaning of client and server, since in your diagram you have the application talking to a client which talks to a server which talks to another application. This is wrong. Your application is the client (or the server). Client / server is simply a role that your application takes on during the communication. An application is considered to be a client when it initiates a connection or a request, while an application is considered to be a server when it waits for and processes incoming requests. Client / server are simply terms to describe application behavior.
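To illustrate the Protocol Buffers suggestion above: given a hypothetical `Person` message with `name` and `id` fields compiled by protoc, the C++ side would serialize and parse it roughly like this, and the resulting bytes can then be shipped with Boost.Asio or any other transport.

```cpp
// Sketch only: assumes protoc has generated person.pb.h from a .proto file
// declaring   message Person { string name = 1; int32 id = 2; }
#include <iostream>
#include <string>
#include "person.pb.h"

int main()
{
    // Sender side: fill the message and serialize it to a byte string.
    Person out;
    out.set_name("Alice");
    out.set_id(42);

    std::string wire;
    out.SerializeToString(&wire);          // bytes to send over the socket

    // Receiver side: parse the same bytes back into a message.
    Person in;
    if (in.ParseFromString(wire))
        std::cout << in.name() << " / " << in.id() << '\n';
    return 0;
}
```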
If you know the applications will be running on the same machine, you can use sockets, message queues, pipes, or shared memory. Which option you choose depends on a lot of factors.
There is a ton of example code for any of these strategies as well as libraries that will abstract away a lot of the details.
If they are running on different machines, you will want to communicate through sockets.
There's a tutorial here, with decent code samples.