I have a need to build a custom communication protocol in a distributed system. The logic on the individual nodes is implemented in C++.
In my past experience, when I had to do this thing in Java, I relied on Netty. Is there a similar framework/library in C++ that allows me to implement my own custom protocols?
I looked at ZeroMQ briefly. However, the docs I found seem to over-emphasize using the pre-defined patterns like REQ/REP, PUB/SUB, etc. Is there a more foundational layer in ZeroMQ that does not force me to use these patterns, but still provides enough support to implement custom communication protocols?
If there are other libraries (heard of Boost.Asio) that are a better fit, then that is also welcome.
ZeroMQ or nanomsg (both cool, broker-less tools) give you a great messaging I/O layer, and you can mostly forget about their smart internals.
You can build whatever protocol abstraction you need on your own, on top of that layer.
If you got the impression that PUB/SUB is the focus of ZeroMQ, it seems you have missed its greatest powers.
Have you had a chance to read any of Pieter Hintjens' books on the advanced design principles behind zero-copy, zero-energy, zero-sharing, zero-latency ;o) ?
They are worth one's time. [More gems included.]
The very protocol-oriented design approach can help a lot with designing and validating your own protocol's finite-state automaton, all the more if you strive for professional-grade, multi-threaded, heterogeneous, distributed, scalable, self-healing, fast, low-latency formal communication patterns.
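To make the "ZeroMQ only moves frames, the protocol is yours" point concrete, here is a minimal sketch (not production code), assuming libzmq is installed and linked with -lzmq. The one-byte type tag and the payload format are invented purely for illustration; ZeroMQ never inspects them.

```cpp
#include <zmq.h>
#include <cstring>
#include <cstdio>

int main() {
    void* ctx = zmq_ctx_new();

    // A PAIR socket pair over inproc stands in for two nodes of the system.
    void* server = zmq_socket(ctx, ZMQ_PAIR);
    void* client = zmq_socket(ctx, ZMQ_PAIR);
    zmq_bind(server, "inproc://demo");
    zmq_connect(client, "inproc://demo");

    // "Custom protocol": type tag 0x01 = STATUS, followed by a text payload.
    char out[64];
    out[0] = 0x01;
    std::strcpy(out + 1, "node-7:OK");
    zmq_send(client, out, 1 + std::strlen(out + 1), 0);

    // Receiving side decodes the same ad-hoc framing.
    char in[64];
    int n = zmq_recv(server, in, sizeof(in), 0);
    if (n > 1 && in[0] == 0x01)
        std::printf("STATUS frame: %.*s\n", n - 1, in + 1);

    zmq_close(client);
    zmq_close(server);
    zmq_ctx_destroy(ctx);
    return 0;
}
```

Swap the inproc:// endpoint for a tcp:// one and the same code spans machines; the framing and semantics stay entirely under your control.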
You probably want to look into middlewares like CORBA (many different brokers are available, but I would not recommend using it), ICE from ZeroC, Protocol Buffers from Google, SOAP, or even plain RPC. Each has its pluses and minuses, but I would recommend using an existing middleware rather than developing your own. You can start with the middleware article on Wikipedia and then decide which one best fits your needs.
Related
We currently have a number of C++/MFC applications that communicate with each other via DCOM. Now we will update the applications and also want to replace DCOM with something more modern, something that is easier to work with. But we do not know what. What do you think?
Edit
The data exchanged is not something that may be of interest to others. It is only status information between the different parts of the program running on different computers.
There are many C++ messaging libraries, from the old ACE to newer ones like Google's Protocol Buffers, Facebook's (now Apache's) Thrift, or Cisco's Etch.
Currently I'm hearing good things about ZeroMQ, which might give you more than you are used to.
DCOM is nothing more than sugar-coating over a messaging system.
Any proper messaging system would do, and would let you actually see where messages are exchanged (which may be important for localizing points of failure and performance bottlenecks in waiting).
There are two typical ways to do so nowadays:
A pure messaging system, for example using Google Protocol Buffers as the exchange format (a minimal sketch follows after this list)
A web service (either a full web service in JSON or a REST API)
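Here is a minimal sketch of the Protocol Buffers route. The status.proto file, the NodeStatus message, and its fields are all invented for illustration; the serialized bytes would travel over whatever socket or queue layer you already have.

```cpp
// Hypothetical status.proto, compiled with `protoc --cpp_out=. status.proto`:
//
//   syntax = "proto3";
//   message NodeStatus {
//     string node  = 1;
//     int32  state = 2;
//   }
//
#include "status.pb.h"   // generated by protoc (assumed to exist for this sketch)
#include <string>
#include <iostream>

int main() {
    // Sender side: fill in the message and serialize to a byte string.
    NodeStatus out;
    out.set_node("render-03");
    out.set_state(2);

    std::string wire;
    out.SerializeToString(&wire);       // bytes to put on your socket/queue

    // Receiver side: parse the same bytes back into a typed object.
    NodeStatus in;
    if (in.ParseFromString(wire))
        std::cout << in.node() << " -> state " << in.state() << "\n";
    return 0;
}
```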
I've been doing lots of apps in both C++ and Java using REST, and I'm pretty satisfied. Far from the complexity of CORBA and SOAP, REST is easy to implement and flexible. I had a bit of a learning curve getting used to modelling things as CRUD, but now it seems even more intuitive that way.
Now, for the C++ side I don't use a specific REST library, just cURL and an XML parser (in my case, CPPDOM), because the C++ apps are only clients and the servers are Java (using the Restlet framework). If you need one, there's another question here at SO that recommends one:
Can anyone recommend a good C/C++ RESTful framework
I'd also mention that my decision to use XML was arbitrary, and I'm seriously considering replacing it with JSON. Unless you have a specific need for XML, JSON is simpler and more lightweight. And the beauty of REST is that you could even support both, along with other representations, if you want to.
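For the C++ client side, here is a minimal sketch of the cURL approach described above, assuming libcurl is installed and linked with -lcurl. The URL is a placeholder, and dumping the response body into a std::string is just one simple way to hand it off to your parser.

```cpp
#include <curl/curl.h>
#include <string>
#include <iostream>

// libcurl write callback: append each chunk of the HTTP body to a std::string.
static size_t append_chunk(char* data, size_t size, size_t nmemb, void* userp) {
    static_cast<std::string*>(userp)->append(data, size * nmemb);
    return size * nmemb;
}

int main() {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL* curl = curl_easy_init();
    std::string body;

    // Plain GET on a REST resource (URL is a placeholder).
    curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/api/orders/42");
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, append_chunk);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &body);

    CURLcode rc = curl_easy_perform(curl);
    if (rc == CURLE_OK)
        std::cout << body << "\n";     // hand the XML/JSON to your parser here
    else
        std::cerr << curl_easy_strerror(rc) << "\n";

    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
}
```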
I'm working on a personal project which is an RPC (client-server) library in C++. The RPC will communicate over TCP/IP or HTTP. The spec of the RPC is here:
http://groups.google.com/group/json-rpc/web/json-rpc-2-0
I'm wondering if there is an existing design pattern (or a combination of patterns) that could help me produce a clean and flexible design. I would appreciate code examples, UML diagrams, or articles.
Thanks.
You are probably going to need a Proxy on the client to represent the server-side methods you will be calling and make them callable locally.
Under the covers, an Abstract Factory could be useful to encapsulate provision of a concrete network connection, selected according to a configured or requested protocol (TCP, HTTP).
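A rough sketch of how those two patterns might fit together, with the Abstract Factory collapsed to a single factory function for brevity. All names (Transport, TcpTransport, CalculatorProxy, the "add" method) are invented for illustration, and the actual JSON-RPC encoding and socket code are elided.

```cpp
#include <memory>
#include <string>
#include <stdexcept>

// Abstract product: something that ships a JSON-RPC request and returns the reply.
struct Transport {
    virtual ~Transport() = default;
    virtual std::string roundTrip(const std::string& jsonRequest) = 0;
};

// Concrete products (bodies elided; they would wrap a TCP socket / HTTP client).
struct TcpTransport  : Transport { std::string roundTrip(const std::string&) override { return "{}"; } };
struct HttpTransport : Transport { std::string roundTrip(const std::string&) override { return "{}"; } };

// Factory: picks the concrete transport from configuration.
std::unique_ptr<Transport> makeTransport(const std::string& scheme) {
    if (scheme == "tcp")  return std::make_unique<TcpTransport>();
    if (scheme == "http") return std::make_unique<HttpTransport>();
    throw std::invalid_argument("unknown scheme: " + scheme);
}

// Proxy: looks like a local object, but forwards each call as a JSON-RPC request.
class CalculatorProxy {
public:
    explicit CalculatorProxy(std::unique_ptr<Transport> t) : transport_(std::move(t)) {}
    int add(int a, int b) {
        // Real code would build and parse JSON-RPC 2.0 here; this only shows the shape.
        std::string reply = transport_->roundTrip(
            R"({"jsonrpc":"2.0","method":"add","params":[)" +
            std::to_string(a) + "," + std::to_string(b) + R"(],"id":1})");
        return 0;  // real code would extract "result" from reply
    }
private:
    std::unique_ptr<Transport> transport_;
};

int main() {
    CalculatorProxy calc(makeTransport("tcp"));
    calc.add(2, 3);   // feels local, actually goes over the wire
}
```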
I would go for Observer.
Details and diagrams.
It's not clear if you are asking about high-level design (observer patterns, JSON/XML processing techniques, etc.), low-level design (sockets, HTTP client/server handling, etc.), or both.
If you are interested in the lower-level aspects, including scalability, it might be worthwhile to study the design and motivations behind ASIO: http://think-async.com/Asio/asio-1.3.1/doc/asio/overview.html
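If it helps to see the flavor of that lower-level option, here is a minimal, single-threaded Boost.Asio echo-server sketch. It assumes a reasonably recent Boost (the io_context/move-accept API); the port number and 1 KiB buffer are arbitrary choices for the example.

```cpp
#include <boost/asio.hpp>
#include <array>
#include <memory>

using boost::asio::ip::tcp;

// Accept connections one after another; each connection echoes what it reads once.
void accept_next(tcp::acceptor& acceptor) {
    acceptor.async_accept(
        [&acceptor](boost::system::error_code ec, tcp::socket socket) {
            if (!ec) {
                auto sock = std::make_shared<tcp::socket>(std::move(socket));
                auto buf  = std::make_shared<std::array<char, 1024>>();
                sock->async_read_some(boost::asio::buffer(*buf),
                    [sock, buf](boost::system::error_code ec, std::size_t n) {
                        if (!ec)
                            boost::asio::async_write(*sock,
                                boost::asio::buffer(buf->data(), n),
                                [sock, buf](boost::system::error_code, std::size_t) {});
                    });
            }
            accept_next(acceptor);   // keep accepting
        });
}

int main() {
    boost::asio::io_context io;
    tcp::acceptor acceptor(io, tcp::endpoint(tcp::v4(), 9000));
    accept_next(acceptor);
    io.run();   // single-threaded event loop; call run() from more threads to scale
}
```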
I know that there are a lot of discussions already on SO about SOAP, bloat, XML, and alternative mechanisms like REST.
Here's the situation. A new team member is really talking up SOAP, based on the difficulty of implementing protocols by hand. He recommends gSOAP (the project is all in C++). He is stating things like WSDL cleaning up lots of messy hand-coded C++.
Right now I am handling most networking with XML based text messages, and the expat XML library. So I have some programming effort (not much) associated with modifications to message formats or additions to parameter lists. At the sender end I package up an XML request and send it over a plain old TCP socket. At the receiver I parse the XML using DOM or SAX. Etc. It has worked fine so far. The XML messages are quite compact and average a couple of hundred characters, at most. And I understand every item in those messages.
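For readers unfamiliar with that style, here is a minimal sketch of the expat (SAX) side of such a setup. The <status> message format and its attributes are invented purely for illustration; assume expat headers and -lexpat.

```cpp
#include <expat.h>
#include <cstring>
#include <cstdio>

// SAX-style callback: expat calls this for every opening tag it encounters.
static void XMLCALL on_start_element(void* /*user*/, const XML_Char* name,
                                     const XML_Char** attrs) {
    if (std::strcmp(name, "status") == 0)
        for (int i = 0; attrs[i]; i += 2)
            std::printf("  %s = %s\n", attrs[i], attrs[i + 1]);
}

int main() {
    // A compact, hand-defined message of the kind described above (made up here).
    const char* msg = "<status node=\"render-03\" state=\"idle\"/>";

    XML_Parser parser = XML_ParserCreate(nullptr);
    XML_SetElementHandler(parser, on_start_element, nullptr);

    if (XML_Parse(parser, msg, (int)std::strlen(msg), /*isFinal=*/1) == XML_STATUS_ERROR)
        std::fprintf(stderr, "parse error: %s\n",
                     XML_ErrorString(XML_GetErrorCode(parser)));

    XML_ParserFree(parser);
    return 0;
}
```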
We want one portion of the product (a server) to be accessible to web sites that are coded using PHP. That is partly driving this idea here, that a SOAP interface will be "easier" for script writers. Everyone on this project believes that SOAP is their salvation.
I see the introduction of a new large library like gSOAP as highly disruptive to the momentum of a mature project.
What I am wondering is if there is a different and more compact way of doing what SOAP gives us. And how to balance claims of gSOAP or other SOAP tools making development life easier against hard reality.
I.e., I am being told that WSDL is better, easier, more workmanlike, etc. than hand-coding C++ with an XML library. That it puts the semantics of the C++ objects right into the declaration of the network messages. The thing is, many of the XML messages I have defined don't map one-for-one to a single distinct object at the receiving end.
Or, it is possible that I am worrying about nothing.
But the reality as I scan messages here seems to contradict what I have been told locally.
I'm not buying SOAP.
Don Box's original vision of a Simple Object Access Protocol is anything but simple now. It's become a bloated, design-by-committee mess.
Throw in all the additional dependencies on bloated libraries and you have a potential mess on your hands.
Tool vendors love SOAP, but I don't see much for anyone else.
I think that you will find that PHP developers are more likely to prefer RESTful interfaces. Here is a 2003 article about it.
http://onlamp.com/pub/a/php/2003/10/30/amazon_rest.html
RESTful interfaces are a growing phenomenon and if you need to attract developers to your platform it will be easier if you catch the wave.
Having said that, is there a good reason why you cannot support multiple interfaces? This is fairly common in web services that do not have a captive audience. You could support your legacy model, a clean RESTful model and a SOAP/WSDL model. Then take stock after 6 months to a year to see which model is the most popular and least effort to support.
When it comes to making the site more accessible to outsiders, REST has more widespread usage. As far as saving your project goes, it is possible that SOAP would do this because it demands a certain amount of rigor in interface design; however, the same could be said of REST. If this is a key criterion, then you should probably abandon the hand-coded XML and go with a high-level interface design that could be implemented as both REST and SOAP.
I know some people believe that SOAP and REST are fundamentally different approaches, but if you take a RESTful approach to the interface design, you shouldn't have great difficulty in creating a SOAP version. Don't try to do it the other way around though.
Here is a classic, hilarious debunking of SOAP: "The 'S' stands for 'Simple'". The community I move in is completely converted to REST.
If you look around at RESTful interfaces on the net, you'll notice that SOAP is nearly universally avoided. SOAP is such a complex beast that it effectively locks out languages with no existing SOAP package, since nobody is going to implement it themselves. Raw XML, on the other hand, is pretty universal at this point, and not that difficult to implement in-house if necessary.
I'm implementing an SOA at a large company, and I'm not sure which web service specifications (WS-*) actually make sense to implement. At a minimum, I'm looking at WS-Addressing, WS-Security, WS-Eventing, and WS-ReliableMessaging. However, there are several other standards that look interesting, but I don't know which ones are widely adopted. I don't want to implement a standard (and force all the developers to follow it) if it's not mature or necessary.
EDIT:
I'm asking this question not about a specific situation, but in general. There are quite a few WS-* standards that don't seem to have a lot of practical use (at least to me), so I'm really curious about which ones are widely used.
Thanks for your help!
KA
WS-Addressing is widely used, and quite useful. For WS-Security, consider the set of mechanisms you'll need (based on your usage scenarios).
Only SOAP is widely adopted. If you care about reach, going beyond WS-Security and WS-Addressing is asking for trouble (even WS-Security can be hard for a lot of people). If you are creating services for internal use in a large company, then I wouldn't worry as much. Something like WCF would allow you to provide endpoints with different bindings for a wide range of consumers without writing any additional code.
There are two types of web services: REST and SOAP. They represent different approaches to sending data over the Internet.
SOA is an acronym standing for Service Oriented Architecture. It is a way of architecting your system using multiple tiers (applications) one atop the other. Web services, mostly SOAP-based, are used to implement this architecture, but they are not the only way.
I have no experience with web services. Historically I've built client-server systems using proprietary communication protocols (even if they happen to be XML). I just spent a few hours looking over Axis2, and it sent a shudder down my spine. The learning curve of WS scares me, and seeing all that XML surrounding so little functionality makes me wonder if it's worth the trouble.
How do you decide whether you need to use Web Services or a custom communication protocol? What are the advantages/disadvantages of each approach and what use-cases are they best suited for?
Please post a clear guideline, not an opinion piece :)
Build RESTful web APIs; then you get a lot of benefits, such as automatic caching, that you don't get if you use other methods (SOAP, XML-RPC, etc.).
See this post for more details
Another benefit is that if you build a RESTful API for your code to use, you can potentially let your users take advantage of it too - they often have uses for your product that you never dreamed of.
"Web Services" as defined by the W3C means using SOAP over HTTP. SOAP is severe overkill in most cases; it's only really appropriate (IMO) when you're making a public service available to the world, like an API for interacting with your website, for example.
Anything else (especially internal, private communications) rarely needs anything more complex than XML-RPC. Only if performance is an issue should you consider a more condensed protocol; XML-RPC is so simple and widely supported that the ease of development and debugging more than makes up for the performance loss of using bloaty ol' XML.
Remember that there are a number of frameworks out there that make programming web services trivial. In the VB/C# world, .NET makes it a joy. I'm not really sure about specific frameworks for other languages, but I am sure most have at least one.
The standardisation and the simplicity of implementation and reuse of web services make them very attractive. As previously pointed out, yes, they make communications very verbose. If you are worried about this, why not calculate how much data you will actually be transmitting? Chances are, with current network and internet speeds, it will be trivial, even with the XML overhead.
I would always use custom data formats as a last resort and not a first. Which widely used method you pick is up to you, but it's unlikely you would go wrong with the web services model.
Maintainability and extensibility are the main benefits. Because you use widely adopted technology, your solution will be easier for someone else to understand, plus you can use ready-to-roll libraries as consumers and providers.
I have recently broken my custom-protocol habit. I am now using Apache on the server side, and libcurl plus libxml2 to load and parse the XML on the client, which is written in C++.
The server side can be either PHP or a CGI written in a more serious language. It depends on what you want to do.
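As a minimal sketch of the libxml2 half of such a client: the XML snippet, the element names, and the attributes below are made up for illustration, and in real use the buffer would come from libcurl rather than being inlined (assumes libxml2 headers and -lxml2).

```cpp
#include <libxml/parser.h>
#include <libxml/tree.h>
#include <cstring>
#include <cstdio>

int main() {
    // In the real client this buffer would come from libcurl; it's inlined here.
    const char* xml =
        "<jobs><job id=\"17\" state=\"done\"/><job id=\"18\" state=\"running\"/></jobs>";

    xmlDocPtr doc = xmlReadMemory(xml, (int)std::strlen(xml), "reply.xml", nullptr, 0);
    if (!doc) { std::fprintf(stderr, "failed to parse reply\n"); return 1; }

    // Walk the children of the root element and pull out a couple of attributes.
    xmlNodePtr root = xmlDocGetRootElement(doc);
    for (xmlNodePtr n = root->children; n; n = n->next) {
        if (n->type != XML_ELEMENT_NODE) continue;
        xmlChar* id    = xmlGetProp(n, BAD_CAST "id");
        xmlChar* state = xmlGetProp(n, BAD_CAST "state");
        std::printf("job %s is %s\n", (const char*)id, (const char*)state);
        xmlFree(id);
        xmlFree(state);
    }

    xmlFreeDoc(doc);
    xmlCleanupParser();
    return 0;
}
```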
Webservices have the advantage of being somewhat standard, so it's possible for programs you've never heard of to use a webservice you wrote. Using HTTP can help them communicate over proxies and other network obstacles without any extra work from you. The XML, although rather verbose and ugly, is rather easier to read when debugging than binary data.
When you're transferring stuff over the network, it's unlikely that serialisation/deserialisation to xml will be the limiting factor in performance. It can be a bit of hassle, although a library to do it for you will help a lot.
SOAP and XML -- "all that XML surround so little functionality makes me wonder if it's worth the trouble."
Totally. SOAP is heavy-weight, and -- to a large extent -- a workaround to the need for static binding throughout the Java technology stack.
REST, on the other hand, is much lighter weight. Further, REST with JSON or REST with YAML is very lightweight and very easy to implement. It builds right on top of the off-the-shelf HTTP protocol.
REST requires you to define resources (named via URIs) and transactions based on the canonical CRUD rules (GET, POST, PUT, and DELETE). For example, GET /orders/42 reads order 42, PUT /orders/42 updates it, and DELETE /orders/42 removes it. Very simple and canonical.
In my personal (old cranky dude) opinion, web services should only be used as a way to make some of your internal information available to third parties (i.e. other companies, people outside your organization etc.). Of course, that is also the originally intended purpose of XML. :-)
If you have access to a direct connection with the databases containing the information your application needs - that is the way to go. It's faster and simpler - which in application development means "better" and "less buggy".