1. I need to build a "Web Service Server (Simulator)" which generates the XML files and also sends async calls to the client for notification. At this point, I am writing code to generate dummy XML files that will be used for testing (FileGeneratorClass -- a Builder?).
2. Also, can I implement this in a way that I do not have to write complete code from scratch to simulate another web service server or another file format? What pattern can I leverage there?
3. The objects/classes are generated from a schema file (for the XML files) and WSDLs (for the web service). How can I make my code immune to changes in these files (newer versions)? Which design pattern applies?
(Please let me know if the information I provided is too much or too little, or if you need me to edit.)
Thank you very much.
Disclaimer: I am a complete newbie, and using patterns for this small project might be overkill, yet I want to do it so that I learn and understand them. That, I think, will give me confidence and clarity when I need to apply them in a more complex project.
Patterns don't do anything. You are asking whether you should use prepositional phrases when you are planning to write a mystery novel. You don't start a design by asking "what patterns do I need?" Patterns emerge from the design process. You say: my program will need X and Y; that's similar to the such-and-such pattern; I should see if that pattern fits. If it does, use it. If it doesn't fit, don't force it to fit.
You are treating patterns like classes. Don't do that. That's not their purpose. They are not building blocks. They are not checklist entries. They are exactly what the mundane meaning of patterns implies. They are things you see repeated over and over. Many times you sense their necessity ahead of time and so you include them in the design. But they are not a starting point.
Sometimes there just isn't any other way than doing the research. If you wish to learn design patterns then start studying design patterns. Learn a little each day, and as you do your normal coding you will begin to see uses for what you've learned. Personally, I like the way Wikipedia has tackled the topic (as opposed to some of the books). Read the overview, and then dive into the ones that you foresee as having some relevance to what you are trying to do.
Also, you may be interested in Enterprise Integration Patterns, as opposed to design patterns, which apply more to web services than to algorithms.
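To make the third point of the question concrete (insulating code from regenerated schema/WSDL classes), the pattern that usually falls out of that need is a thin adapter or facade layer: the rest of the code programs against an interface you own, and only the adapter touches the generated types. Here is a minimal Java sketch; every class name is hypothetical and stands in for whatever your schema compiler actually produces:

    // Application-owned abstraction: the rest of the codebase depends only on this.
    interface OrderReader {
        String customerId();
        double total();
    }

    // Stand-in for a class produced by the schema/WSDL compiler (hypothetical).
    class GeneratedOrderV1 {
        String getCustId()     { return "C-42"; }
        double getGrandTotal() { return 19.99; }
    }

    // Adapter: the only place that knows about the generated type. When a new
    // schema version changes the generated classes, only this class is touched.
    class GeneratedOrderAdapter implements OrderReader {
        private final GeneratedOrderV1 generated;

        GeneratedOrderAdapter(GeneratedOrderV1 generated) { this.generated = generated; }

        @Override public String customerId() { return generated.getCustId(); }
        @Override public double total()      { return generated.getGrandTotal(); }
    }

    class AdapterDemo {
        public static void main(String[] args) {
            OrderReader order = new GeneratedOrderAdapter(new GeneratedOrderV1());
            System.out.println(order.customerId() + " owes " + order.total());
        }
    }

The same shape answers the second point: simulating a different service or file format means writing one more implementation of the application-owned interface (a Strategy, essentially), not rewriting the callers.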
I am building a C# project. This project is going to use NVidia's Tesla through CUDA. The native CUDA C implementation is not exposed directly to C#, and, in my opinion, the available C# wrappers (like Brahma, CUDAfy, Linq to GPU) are not mature enough for production.
I decided to go ahead and build my math logic in a C++ component that is going to access CUDA, which is the officially supported way. C++/CLI is not an option, as I am using the Intel C++ Compiler for performance and it doesn't support CLR extensions.
My most important criterion is performance, so I want to minimise marshalling and copying of arrays between C++ (where my business logic lives) and .NET (the rest of my application).
I am aware that this question has been asked before, but in most cases the C++ library already exists, and in others C++/CLI is an option; here, neither is the case.
Given that I am going to write the C++ library from scratch, I am in a position to decide the best way to expose it to C#. Do you have any recommendations or best practices that I should follow to get the easiest and highest-performing integration between C++ and .NET? Note that what I will be exchanging are mostly large arrays.
Edit: to clarify, I am building my business logic (math) in C++, not an infrastructure library to facilitate access to the GPU.
While it is certainly possible to outperform the already existing libraries that you deemed not mature enough, the very fact that you are asking this question here should make you think twice about deciding to roll your own library/implementation!
Considerations beyond raw performance, such as stability and reliability, should be your primary concern if this is going to production. Generally, unless you know what you're doing, duplicating the effort of the community or of other teams of developers can be a slippery slope.
I know this answer doesn't really address your question, but as formulated the question is, in my opinion, overly broad and there is no simple answer. Initially I was going to post this as a comment but decided it was too long to fit the format.
So, in closing, I recommend you try out the already existing libraries, and if you find them lacking from a performance standpoint, start asking specific questions.
UPDATE
If you're going to implement most of the logic in C++ and you're expecting to just be transferring some results back to your managed code in the form of arrays then there isn't much that you need to do. In general the automatic marshalling of arrays is as efficient as you're going to get.
The one thing I would recommend, though, is to read as much as possible about marshalling and to use a performance profiler before deciding to get "creative" in order to improve things.
And here's one last idea that might be interesting, but again, you should profile before attempting it: you could use a memory-mapped file as the backing store for your data and open the file from both ends. Ultimately this may or may not be useful, so definitely profile before you buy ;)
I was recently working on a Web Service Project and realized my choice of architecture was extremely inefficient.
I wrote this in a very procedural manner with a hint of OOP and standard Exception Handling using Python. Basically, it would procedurally step through the data, validate existence of expected data, validate the data against a regular expression, validate some data against a database, perform some specific logic, check for errors, and then finally return a response. It might be helpful to mention that all data was exchanged using JSON.
I tried to go back through the code, find any duplicated exceptions, and push their handling to the top of the logic chain. This was not as easy to do as I had hoped and actually cost more time. It also made my code more prone to bugs by being less Unit Testable and harder to read.
I've noticed that this pattern of procedural code for handling user data is very easy to fall into with web development. For example, while handling a form in PHP one may run a consecutive series of isset() and !empty() checks on the data. My problem with this coding style is that I feel like I'm spending an enormous amount of time coding for error events, and it's difficult to generalize and re-use code for this particular purpose.
Various frameworks offer great ways around this through the use of Form Classes (e.g. Django). However, I have noticed that while you save time by reducing the duplication of Validation Logic, you will still need to "build" a Form for each expected input. When dealing with Software as a Service, there can be potentially hundreds of API Methods that you must code for. OOP offers a benefit here but there are times where a client may set an odd requirement which removes any efficiency gained.
Web applications can benefit greatly from following paradigms/architectures such as MVC. In my personal experience, MVC (and the frameworks that use its principles) is not well tailored to this type of problem. I've considered the use of functional languages but have yet to give them a try.
Are there any particular languages, architectures/paradigms, conventions, or even example frameworks that are well suited for the development of custom SaaS or web service projects?
As someone who does a lot of this work, I would say that part of your problem with OOP and PHP is caused by the fact that PHP was not initially an OOP language. OOP was added to the language later on, so when you look at code examples they often have a procedural feel.
In recent years I've been most happy with either Spring (Java) or WCF (C#). Both are built on strongly typed OO languages. From a conceptual standpoint this leads to a paradigm that works well for my projects. Here's the overview (a small framework-free sketch follows the list):
Endpoints (either REST or WSDL) -- similar to the view in MVC.
Services -- these feed the endpoints and coordinate DAOs as needed. Organize these around your business logic.
Data Access Objects -- convert data into native objects and vice versa. Organize these around your data sources.
Model / API -- native objects to support the application and automatically provide documentation for your service.
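To make that layering concrete, here is a skeletal, framework-free Java sketch; all names are made up, and in a real Spring or WCF project the endpoint and service would carry the usual framework annotations instead of being plain classes:

    import java.util.Map;

    // Model / API: a plain value object shared between layers.
    record Customer(String id, String name) {}

    // Data Access Object: converts rows (or JSON, or a web-service payload) into native objects.
    class CustomerDao {
        Customer findById(String id) {
            // In a real project this would hit a database or remote source.
            return new Customer(id, "Example Customer");
        }
    }

    // Service: coordinates DAOs and holds the business logic.
    class CustomerService {
        private final CustomerDao dao = new CustomerDao();

        Customer lookUp(String id) {
            // Validation and business rules live here, not in the endpoint.
            if (id == null || id.isBlank()) throw new IllegalArgumentException("id is required");
            return dao.findById(id);
        }
    }

    // Endpoint: the "view" of the service; it only translates requests and responses.
    class CustomerEndpoint {
        private final CustomerService service = new CustomerService();

        Map<String, String> get(String id) {
            Customer c = service.lookUp(id);
            return Map.of("id", c.id(), "name", c.name());
        }
    }

    class Demo {
        public static void main(String[] args) {
            System.out.println(new CustomerEndpoint().get("42"));
        }
    }

The point of the separation is that validation and business rules live in one layer (the service), so adding another endpoint or another data source does not mean duplicating them.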
Hope that helps
I am almost 100% locked in to Django for the projects I have been planning.
The final "myth" I'd like to "dispel" is that Django is "mediocre" at
conveying business-logic.
Direct quote by Peter Shangov:
Whatever your choice of framework your real-life needs will very quickly outgrow the functionality available in the ecommerce modules that you started with, and you will end up needing to make non-trivial changes to them or even rewriting from scratch sooner rather than later. This is because open source has always been exceptional at building infrastructure tools (think web servers, templating languages, databases, cacheing, etc.), but relatively mediocre at implementing business logic. So what I'd be looking for if I were you is the library that I'd be happiest to hack on rather than the one that looks most mature.
"Products" which I am putting Django (with satchmo) up against:
Ruby on Rails (with spree) [Ruby]
Catalyst [Perl]
JadaSite [Java]
KonaKart [Java]
Shopizer [Java]
Could you please alleviate (or confirm) my concerns regarding the aforementioned quote about Django?
The short answer: of course it's bad at that, because it's not business process management software; it is a framework for web development and getting things done.
The long answer: you need to clarify what you mean by business logic (and "conveying" it). Are you talking about process mapping, workflow management, or the execution of the process itself?
I do not see how the other projects you listed "convey" business logic - because they are not business process diagramming or testing or validation packages. They are simply frameworks to do some work. Once the process has been defined and validated (using some external tools), you can then execute that process in your code.
In terms of online shopping, the business "process" as far as the store front is concerned is quite standard, and you can easily map it to any of the packages you have listed. You did not mention what kind of store you'll be running or what your fulfillment or delivery processes are, so I cannot give you a detailed answer as to whether Satchmo has those components built in or whether you would have to write them from scratch.
The only possible negative when it comes to Django is that it doesn't have a mature workflow engine (the two main projects, GoFlow and django-workflows, are stalled), but this is hardly a criticism of Django, since it is not a generic web framework: it is designed for a specific kind of application for which a complex multi-state workflow engine is not a primary need.
Finally, as far as the quote is concerned, without knowing the context I can only say that one of the most popular pieces of business process mapping software is actually the open source JBoss BPM engine.
I don't doubt that closed-source/proprietary shops are great at building infrastructure tools and frameworks too. What they don't do is release them, or let people play with them. They build on them themselves, making money by bolting on "business logic" specific to the businesses that pay them for it.
If you go for a proprietary solution there will doubtless also be some non-trivial changes required, and you'll pay through the nose to the one company that provided you with the (not quite) solution. "Oh, another $4000 to add an extra class of fields to the database? Hmmm. Oh I guess we've already paid you $100,000 and your software is closed source so we can't subcontract it to a tendering process... Here you go..."
Open source is better at implementing business logic because, when it comes down to it, it's people that implement business logic, not frameworks, and open source means more people can work with it.
I have compiled the list below based on my own knowledge, but I would like to enrich and prioritize it using this community's input. I understand that having a centralized rule repository is itself debatable, but we can have a separate question for that.
Business users' ability to use the platform for writing rules [definition, classification, deciding which rules might go into a rule repository]
Ease of rule invocation and consumption from different applications
Rule portability [how important is RIF (Rule Interchange Format)?]
Rule maintenance -- BRMS [Business Rule Management System]
Rule engine performance [how much, how fast, and how reliably]
Here is my take, based on 12 years of experience with 3 rule engines (ordered by importance):
Ability to create, edit and deploy rules without any IT involvement after the system has been installed and tested. The ability to version, approve, test and debug my rules is nice to have but not critical, as long as the engine comes with a normal API so I can build that functionality myself, the way I need it. I'm not sure about "the platform"; just give me a decent UI for rule authoring and editing, preferably a web-based one.
Rule execution performance should be EXCELLENT. I cannot emphasize this enough: slow engines almost always lead to lost profits. In my line of work, the engine must be capable of evaluating a rule set of 50-80 conditions (with no external calls) in under 2 milliseconds per evaluation (about 1.5 milliseconds on average is good, 0.5 milliseconds is great). It must be thread-safe, and all rule evaluations must be completely independent of each other and of the engine itself (other than rule caching).
Rules should be represented in a portable format such as XML so they can be saved anywhere. I don't care what kind of format it is as long as it works and stays consistent between engine versions. I doubt that there is a huge need out there for "sharing" rules between different organizations; I definitely don't expect to share my rules with anyone :) Rule repositories can be a total evil simply because I may need to move my rules from one storage to another (say, in the case of a merger, or the acquisition of the entire system by others with other types of storage). That should be fine as long as the brand of engine stays the same. Rules are just logical sets; it should be completely irrelevant where they are stored at the moment. If the engine cannot just load a rule in an expected format from anywhere, then I don't need such an engine.
The ability to create, NAME and save small rules and later combine them into rule sets by name would be a huge plus.
All other features of rule engines that I can think of at the moment are irrelevant to me and to what I do. Hope this helps.
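To illustrate that last point (naming small rules and later combining them into rule sets by name), here is a minimal Java sketch of the idea; the interfaces and names are hypothetical and not taken from any particular engine's API:

    import java.util.ArrayList;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;
    import java.util.function.Predicate;

    // A rule is just a named predicate over some fact object.
    record NamedRule<T>(String name, Predicate<T> condition) {}

    // A registry of small rules plus named rule sets that combine them.
    class RuleRepository<T> {
        private final Map<String, NamedRule<T>> rules = new LinkedHashMap<>();
        private final Map<String, List<String>> ruleSets = new LinkedHashMap<>();

        void register(String name, Predicate<T> condition) { rules.put(name, new NamedRule<>(name, condition)); }
        void defineRuleSet(String setName, List<String> ruleNames) { ruleSets.put(setName, ruleNames); }

        // Returns the names of the rules in the set that the fact fails.
        List<String> violations(String setName, T fact) {
            List<String> failed = new ArrayList<>();
            for (String ruleName : ruleSets.getOrDefault(setName, List.of())) {
                NamedRule<T> rule = rules.get(ruleName);
                if (rule != null && !rule.condition().test(fact)) failed.add(ruleName);
            }
            return failed;
        }
    }

    class RuleDemo {
        record Order(double amount, String country) {}

        public static void main(String[] args) {
            RuleRepository<Order> repo = new RuleRepository<>();
            repo.register("amount-positive", o -> o.amount() > 0);
            repo.register("known-country", o -> !o.country().isBlank());
            repo.defineRuleSet("order-intake", List.of("amount-positive", "known-country"));

            System.out.println(repo.violations("order-intake", new Order(-5, "NL"))); // [amount-positive]
        }
    }

The sketch deliberately keeps rules as plain, storage-agnostic objects, in line with the point above that it should be irrelevant where the rules happen to be stored.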
Can somebody point out the advantages of Clojure and what types of applications it is suited for?
I don't intend to compare it to any other languages as such. As a language in itself, what is it suitable for? My intention is to know the right tool for the right job, and where Clojure fits in that kind of scenario.
Advantages:
all the benefits of functional programming, without the purity straitjacket
lispy: allows dynamic, compact code with late binding, macros, multimethods
Java interoperability
can code functions against the sequence abstraction instead of against specific data structures
concurrency goodies baked in: functional data structures, software transactional memory
runs on the JVM: portability and fast garbage collection
Suited for:
bottom-up design
embedded languages
highly-concurrent applications
Probably not suited for:
cases where you want static typing
if you want the language to be amenable to static analysis
anything that needs a fast startup time
hordes of clueless Java monkeys
In general I find the following to be strong points of Clojure (in no particular order):
1) The REPL, to try things out interactively.
2) Everything is immutable by default, and mutation is channelled through several well-chosen standard patterns for modifying state safely in a multithreaded environment (a rough Java analogue is sketched below).
3) Tail recursion is made explicit. Until there is proper support for tail calls on the JVM, this is probably the best compromise.
4) Very expressive language which favors a functional approach over an imperative approach.
5) Very good integration with the Java platform making it painless to mix in Java libraries
6) Leiningen as a build and dependency management tool together with the clojars site
Ok, point 6 has nothing to do with the language per se, but definitely with my enjoyment of using it.
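For readers coming from Java, point 2 above is roughly the idea behind Clojure's atoms: state is an immutable value that gets swapped atomically using a pure function. A loose Java analogue of that idea (only an illustration of the concept, not how Clojure is implemented internally):

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;
    import java.util.concurrent.atomic.AtomicReference;

    class AtomAnalogue {
        public static void main(String[] args) {
            // The reference is the only mutable thing; the list it points to never changes.
            AtomicReference<List<Integer>> state =
                    new AtomicReference<>(Collections.unmodifiableList(new ArrayList<Integer>()));

            // Similar in spirit to (swap! state conj 42): build a new value and
            // install it atomically; concurrent updaters retry instead of corrupting state.
            state.updateAndGet(current -> {
                List<Integer> next = new ArrayList<>(current);
                next.add(42);
                return Collections.unmodifiableList(next);
            });

            System.out.println(state.get()); // [42]
        }
    }

Clojure's persistent data structures make the "build a new value" step far cheaper than copying an ArrayList, which is a large part of why this style is pleasant there.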
Regarding applications: it targets multithreaded applications, but the way things are going right now that could mean almost anything, as everyone is trying to keep all those cores busy while the user is not kept waiting. On the other hand, apparently a lot of people use it to deploy to Google App Engine, which is radically SINGLE-threaded.
The functional approach works well, in my (limited) experience, for implementing data transformations and calculations, where information and events can be "streamed" through the application. Web apps largely fall into this category, where we "transform" a request into a "response".
But I have yet to use it in real production code. Currently I use it for personal projects and prototyping/benchmarking.