Webservice on Navision / Microsoft Dynamics version 5... or else? - web-services

Go bounty!
This question has earned me a tumbleweed badge (7 views in 7 days!), which is a pretty strong confirmation that Navision has a very limited market share - and, I suspect, a hint that Navision isn't all that great a piece of software...
But hey... that's what we got as a back-end, so I am ready to fight with this. :-O
If there is some daring navision developer who is able to shed light onto this... the bounty is there for you! :)
Original Post
I have recently implemented a rather complex e-commerce system that interacts with a legacy back-end based on Navision 5. So far the exchange of data between the two platforms has happened via XML files, but this method is quite clumsy and very much prone to mishaps.
Our needs are:
To expose certain elements of the business logic of each platform to the other one (for example: "what's the total amount ever purchased by this customer?", "what are the products currently on offer?", "how many new customers have registered on the website?", etc...).
To have mechanisms of feedback / validation for the various transactions (for example: "Here's a new order from customer X" ... "Ok, got it, the order will now start to be processed" ... "Ok, copy that, bye!").
If possible, to avoid playing around with files and keep all of this happening in terms of calls/ports/services...
The most natural way I could think of would be to integrate the two systems via webservice, but Navision 5 does not support this natively. So I did my "due diligence" and found a few things on MSDN including this article and this other one.
According to these articles it should not be that difficult to create a webservice on Navision 5, but when I suggested this solution to the team in charge of the legacy system, they told us that it is "pure theory" and they do not know of anybody who ever implemented it.
I have no reason to doubt their word, but mileage can vary... and I thought that maybe in the SO community there are professionals from other countries who actually implemented something similar and are available to share their experience.
So, my question is two-fold:
Is there anybody who has tried this at home and would be available to share a bit about what the greatest difficulties were, whether the final result is reliable, whether they think the outcome is worth the effort, etc...?
Is there anybody who faced a similar problem but solved it with a different approach and would be available to present their solution ("I never did it myself, but if I had to do it I would do it like this..." type of answers are also welcome)?
Thank you in advance for your time! :)

I too will chime in with a not-too-helpful answer about Nav 6 :)
I've just completed a project using Nav 6. Surprisingly, the webservices are VERY easy to expose and consume. It's really a trivial matter to go find an object in the webservices interface and tick a box to tell it to expose itself.
Unsurprisingly, the webservices don't work as you'd expect: you often have to use some trial and error to get objects and properties to persist, as it's really touchy about the sequence of events you use to save an object. And each object seems to work slightly differently. E.g., to create a customer, I eventually found out that you have to create and save a blank customer, massage this new record with a codeunit, then fetch the record, write the customer's attributes and save again. I expected to just create a new customer(), set the attributes and save in one quick swoop.
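To make that sequence concrete, here is a rough sketch of what those calls can look like against a NAV 2009 page web service, written in Python with the zeep SOAP client. The endpoint URLs, company name, field names and the "CustomerInit" codeunit are illustrative assumptions, not the exact names from my project:

```python
# Sketch only: talking to a hypothetical NAV 2009 "Customer" page web service.
# Endpoint URLs, company name, field names and the codeunit are assumptions.
from zeep import Client

CUSTOMER_WSDL = "http://navserver:7047/DynamicsNAV/WS/MyCompany/Page/Customer"
INIT_WSDL = "http://navserver:7047/DynamicsNAV/WS/MyCompany/Codeunit/CustomerInit"

customer_svc = Client(CUSTOMER_WSDL).service
init_svc = Client(INIT_WSDL).service

# 1) Create and save a blank customer record (NAV assigns the No. itself).
blank = customer_svc.Create(Customer={})

# 2) "Massage" the new record with a codeunit exposed as a web service.
init_svc.InitialiseCustomer(customerNo=blank.No)

# 3) Fetch the record again, set the attributes, and save (update).
cust = customer_svc.Read(No=blank.No)
cust.Name = "Acme Ltd."
customer_svc.Update(Customer=cust)
```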
I guess you're not too keen on upgrading to Nav 6, but off the top of my head, here is how you could simulate web services:
Sharepoint can already consume and expose webservices, so that tier isn't a problem.
Nav 5 doesn't have them 'naturally', but you could cook up your own program that acts as a webservice 'broker' - you're already getting info into and out of NAV, mostly via XML. You could build this broker to take the XML files as input and massage them for use in a webservice call. You could even forgo the XML and write and read directly from the DB, as all the NAV info is stored there anyway. So here's what I'm thinking:
NAV <-> SQL SERVER <-> New 'broker' webservice <-> Sharepoint
Or if you already have the NAV API down pat and want to reuse your XML:
NAV <-> XML files <-> New 'broker' webservice <-> Sharepoint
If you are using XML with file watchers, the latency shouldn't be too bad - file watchers usually pick up on a drop or change in milliseconds.
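As a rough illustration of that broker idea, here is a minimal Python sketch that consumes NAV's XML exports from a drop folder and holds the parsed data in memory for a web layer (Flask, for instance) to serve. The folder path, XML element names and the "orders" structure are all hypothetical, and it polls for brevity where a real broker would use a file watcher:

```python
# Minimal broker sketch: read XML files exported by NAV from a drop folder,
# parse them, and hold the parsed data for a web service layer to expose.
# The folder path and XML element names are assumptions.
import os
import time
import xml.etree.ElementTree as ET

DROP_FOLDER = r"C:\nav_exchange\out"
orders = {}  # order number -> parsed order dict

def poll_once():
    for name in os.listdir(DROP_FOLDER):
        if not name.lower().endswith(".xml"):
            continue
        path = os.path.join(DROP_FOLDER, name)
        root = ET.parse(path).getroot()
        for order in root.findall("Order"):
            no = order.findtext("No")
            orders[no] = {
                "customer": order.findtext("CustomerNo"),
                "amount": float(order.findtext("Amount", "0")),
            }
        os.remove(path)  # consume the file once processed

if __name__ == "__main__":
    while True:          # a real broker would use a file watcher instead of polling
        poll_once()
        time.sleep(1)
```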
Hmm, but I'm thinking you are probably supposed to use BizTalk for stuff like this:
NAV <-> BizTalk <-> Sharepoint
But I don't know how easy it would be to set up BizTalk to communicate with Nav. I'll bet that it's pretty straightforward to get comms working, but this is speculation.
In any case, I don't know how helpful this post is, but maybe it gives you a few ideas to go with.
Cheers,
Lance

Where I work, we were able to use one of the web services from NAV 6 to integrate with SharePoint, so you can look up a customer or record and show it in a web part in SharePoint. I know your question is about NAV 5 in particular, but I have only seen this working on NAV 6. And I wasn't the developer who worked on this, so I don't have any more specifics I'm afraid.
Did you try asking on mibuso.com? They're much more Navision-focused.

When you say expose business logic, does this include executing AL code (e.g. a CodeUnit)? If you only need to perform queries against the database you could use NODBC & System.Data.Odbc or the CFront .NET API. Either of these can easily be wrapped using a .NET web service, and both support the native NAV database. To pass messages back you would still need to use COM, as described in the first article you mention.
Any of the above are entirely possible, and relatively easy depending on your proficiency in .NET, COM and NAV.
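For the query-only case, here is a rough sketch of the shape such a wrapper service could take, written in Python with pyodbc and Flask for brevity rather than .NET. The DSN, table and column names are hypothetical and depend on how NODBC exposes your database:

```python
# Sketch: expose a read-only NAV query over HTTP via an ODBC connection.
# The DSN name, table name and column names are assumptions.
import pyodbc
from flask import Flask, jsonify

app = Flask(__name__)

def total_purchased(customer_no):
    conn = pyodbc.connect("DSN=NAV_NODBC")  # DSN configured for the NODBC driver
    try:
        cur = conn.cursor()
        cur.execute(
            'SELECT SUM("Amount") FROM "Cust Ledger Entry" WHERE "Customer No" = ?',
            customer_no,
        )
        row = cur.fetchone()
        return float(row[0] or 0)
    finally:
        conn.close()

@app.route("/customers/<customer_no>/total-purchased")
def customer_total(customer_no):
    return jsonify({"customerNo": customer_no,
                    "totalPurchased": total_purchased(customer_no)})

if __name__ == "__main__":
    app.run(port=8080)
```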
The second article you linked to describes using the NAS. I'm no expert on this, but I think this might require a special license granule. Before spending time implementing anything it's worth checking whether your license includes the NAS, CFront or NODBC.

You can actually do a "technical upgrade" of NAV 5 to NAV 2009 and then use the native webservices. Meaning replacing the exe-files and the entire application (but not the objects, which will still be version 5), and a couple of other tricks. But it works, and thus you have 2009-webservice-functionality on your NAV 5 :-)

For the benefit of anyone googling this, if you're still on Navision 6 with the native database, a good way of connecting it is via the Windows message queue (MSMQ).
You can write your own web service that puts XML requests into the queue and then waits for the reply to pop up. On the Navision side, you have one client that polls the queue and puts answers into the reply queue. There's a special non-GUI client called NAS for that.
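The pattern is classic request/reply over paired queues with correlation IDs. The sketch below illustrates that pattern with Python's in-process queue module standing in for MSMQ (the real thing would go through the MSMQ COM objects or System.Messaging), and the payload format is an assumption:

```python
# Request/reply over paired queues with correlation IDs.
# queue.Queue stands in for the MSMQ request and reply queues here;
# the payload format is an assumption.
import queue
import threading
import uuid

request_queue = queue.Queue()
reply_queue = queue.Queue()

def web_service_call(xml_request, timeout=10):
    """Called by the web service layer: enqueue a request, wait for its reply."""
    correlation_id = str(uuid.uuid4())
    request_queue.put((correlation_id, xml_request))
    while True:
        cid, xml_reply = reply_queue.get(timeout=timeout)
        if cid == correlation_id:
            return xml_reply
        reply_queue.put((cid, xml_reply))  # not ours: re-queue (fine with one waiter)

def nas_worker():
    """Stands in for the NAS client pulling the queue and answering."""
    while True:
        correlation_id, xml_request = request_queue.get()
        xml_reply = "<reply>processed</reply>"  # NAV business logic would run here
        reply_queue.put((correlation_id, xml_reply))

threading.Thread(target=nas_worker, daemon=True).start()
print(web_service_call("<request>price-check</request>"))
```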
I've been using this approach for 15 years to connect booking engines to our Navision back end. It works very well and has the advantage that you can peek at requests before they reach Navision, so you can protect your back end from too many or faulty requests.
The only problem with this approach is that the COMMIT command within Navision is super expensive. It's difficult to serve high volumes of requests. As soon as the back end needs committing, you are down to just a few requests per second. This may be fine for low volume.
For high volume you need to implement caching or move some of the business logic out. For me it was getting hit by price comparison websites, so the solution was to serve those from web services written in Python and only pass the requests on to Navision when someone actually buys something...
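That caching front can be as simple as a small TTL cache in the web service, so repeated price checks never reach Navision and only a cache miss triggers a back-end request. A minimal sketch (the TTL and the back-end call are placeholders):

```python
# Minimal TTL cache in front of the back-end lookup; repeated requests within
# the TTL are served from memory and never hit Navision.
import time

CACHE_TTL = 300  # seconds; tune to how stale a price may be
_cache = {}      # key -> (expiry_timestamp, value)

def cached_price(item_no, fetch_from_navision):
    now = time.monotonic()
    hit = _cache.get(item_no)
    if hit and hit[0] > now:
        return hit[1]
    value = fetch_from_navision(item_no)   # the expensive back-end call
    _cache[item_no] = (now + CACHE_TTL, value)
    return value
```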

Related

Creating a browser front-end for TigerLogic D3 DB application

I have an interesting conundrum. I have been challenged with identifying the most suitable process in which to create a "browser front-end" to an existing multi-user application built within the TigerLogic/Pick D3 environment. My research indicates there are many ways to do this; but I am struggling to decide which method is best or where to start. I have "played" with a few technologies but a commitment to one is needed to get started.
These methods include:
Creation of a complex webservice using MVS Toolkit, and engineering a client from WSDL either from scratch or using maven/wsimport. Tests indicate there is a lot more to this process than originally thought for a simple WSDL.
Development of a Java based web app that harnesses the MVSP Java API - I am not a Java developer so this means learning a new language. Development would most likely take place in Eclipse.
Using TigerLogic's FlashCONNECT - resulting in additional expenditure for clients, so not preferable - and more or less ruled out.
There is also the .NET option - but I have ruled this out on the basis of needing portability.
My question is, has anyone else out there done anything like this and could you share your experience? My first task is to build a web-app that will reliably give me the D3 TCL prompt in a browser that I can customise.
I am not sure there is a definitive answer here but would like people's thoughts and will label the most useful as the answer.
What path you choose depends in part on your existing skill-set and whether that fits in with your portability needs. It is very difficult to give you a concrete answer to your question because of not knowing in which part of the chain you need the portability.
It is however possible to develop a web-browser front-end using .NET which will run on Linux or Windows, so I don't see an issue with portability here. Your web server will have to be windows based but it shouldn't matter whether D3 is running on Linux or Windows at the server-end, or whether the client desktops are running Linux or Windows.
You could try TigerLogic's MVSP .NET API, but I do not know if it has the power to deliver based on your needs. I believe you may find mv.NET from Bluefinity could fulfill your needs. This is in my opinion the leading product on the MultiValue market for achieving the goals you have in mind. This will mean spending money of course. For this you will get a very powerful set of tools. Also, the cost of investing in a good tool could end up smaller than the cost in terms of time, effort and potential complications of trying to do this piecemeal without spending any extra money. I am sure FlashCONNECT would also do the job. You would have to weigh up the cost of the different options to find out which one is right for you both technically and financially.
Not knowing whether or not you have .NET in your skill-set, I don't know whether the .NET option would be easier for you. It is however technically possible.
I would suggest using Rocket's D3 (formerly TigerLogic D3) .NET APIs and creating a Web API RESTful service that you can consume with JavaScript or any other web technology; and if you need to call it from a D3 subroutine, use the MVS Toolkit.
Requirements though are D3 9.0 or later.
I've used all of the technologies described, and many more, to interface with D3. I agree with @Glenn and will add... I understand you're edging away from .NET. That's fine, you don't need it. But consider that most LAMP implementations separate the DBMS servers from the web servers. That topology introduces short delays between the tiers but decouples them in case you want to use multiple web servers or multiple databases - a common topology even with D3 / MV.
I have a client where we have a Java/Grails front-end over Linux, with all data queries filtered through a single, elegant data provider class that's abstracted from application logic. That uses a web service call which I wrote in Java, calling to a .NET web service. The service is easily generated/modified, as is the client from the WSDL. From there IIS carries inbound queries to D3 via mv.NET, and at this point it doesn't matter if the D3 DBMS is in Linux or Windows. My web service could have as easily been in Linux with Java but it would then lack a pooling mechanism - see below.
If you want all Linux then you can go with the MVSP Java library. TigerLogic (now acquired by Rocket Software) committed to a PHP binding for MVSP some months ago. Rather than wait, one of my clients created a PHP wrapper around mv.NET, though MVSP is as easy. So the resulting application is essentially LAMP, but with the M = Multivalue. I have written code like this too - we can write a wrapper in any language which exposes a useful API and abstracts both connectivity method and OS dependencies. In other words it doesn't matter what languages we want to use or what OS's are involved. That part is rather trivial and subject to change later. It's better to focus on the application than the communications.
You can also go off the menu, so to speak, and create your own Java/PHP wrapper around the OS-level d3tcl command, which is a script/wrapper around the d3 executable. This allows you to open a connection yourself and pass in commands.
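If you do go off the menu like that, the wrapper can be as thin as a process spawned around the d3tcl script with commands written to its standard input. The sketch below is only the shape of the idea; the exact way d3tcl is invoked and how its output is framed are assumptions you would need to verify:

```python
# Thin wrapper around a hypothetical OS-level d3tcl invocation.
# How d3tcl is started and how its output is delimited are assumptions.
import subprocess

class D3TclSession:
    def __init__(self, command=("d3tcl",)):
        self.proc = subprocess.Popen(
            command, stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True
        )

    def execute(self, tcl_command):
        """Send one command to the TCL prompt and return one line of output."""
        self.proc.stdin.write(tcl_command + "\n")
        self.proc.stdin.flush()
        return self.proc.stdout.readline().rstrip("\n")

    def close(self):
        self.proc.stdin.close()
        self.proc.wait()

# session = D3TclSession()
# print(session.execute("LIST CUSTOMERS"))
# session.close()
```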
Whatever option you select, you need to consider that opening and closing a DBMS connection is a slow process. You do not want to script a login around every data request. You do want to open a connection and keep that open persistently, while your client code accesses and releases that persistent connection as required. This is why we like mv.NET and FlashCONNECT. With MVSP and other mechanisms you need to create your own persistence model. You'll also need to manage a pool of connection resources - what happens when you get 10 simultaneous queries, or just 1 short one after one long one? You don't want queries to back up, you don't want to reject or timeout connections, and you don't want to fire up a connection for every client. You do want the proper number of DBMS sessions waiting for inbound connections. mv.NET and FlashCONNECT do this for you, the others do not.
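To make the pooling point concrete, here is a minimal sketch of a fixed-size pool of persistent sessions that a web tier checks out and returns, instead of logging in per request. The D3TclSession class from the previous sketch is just a placeholder for whatever connection object (mv.NET, MVSP, etc.) you end up using:

```python
# Fixed-size pool of persistent DBMS sessions: open them once at startup,
# then check out / return per request instead of logging in every time.
import queue
from contextlib import contextmanager

class SessionPool:
    def __init__(self, factory, size=5):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(factory())  # e.g. factory=D3TclSession

    @contextmanager
    def session(self, timeout=30):
        # Blocks (up to timeout) when all sessions are busy, so requests queue
        # rather than spawning a new login per client.
        s = self._pool.get(timeout=timeout)
        try:
            yield s
        finally:
            self._pool.put(s)

# pool = SessionPool(D3TclSession, size=5)
# with pool.session() as s:
#     print(s.execute("LIST CUSTOMERS"))
```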
Personally I'd shy away from FlashCONNECT. I was there for its initial development and testing and for years of end-user implementations. It's not as widely used as the other options and is more a tool for those who aren't familiar with other options. If you're talking about Java then you're probably not inclined to use FlashCONNECT. That said, if you have developers who are not familiar with anything outside D3 then FlashCONNECT is a decent server-side tool for them while someone else is focusing on the client-side with other technologies. Everyone should use their best skillset.
Finally, (already?) if someone is not familiar with external technologies, and more intimate with D3, then other options exist like DesignBAIS and Viságe, mostly removing the burden of communications and allowing developers to work on the client-side features and back-end rules in BASIC.
I discuss all of these topics plus mobile and telephony on my blog.
HTH

How should a Windows 8 Metro Application connect to a central database?

How should a Windows 8 Metro application connect to a central database?
I've read about local storage, but I haven't read anything about connecting to a central database.
Obviously, this architectural design decision needs to support the disconnected scenario.
WCF web services seem to make sense.
But even if they do make sense, should we really create separate methods for all read/write operations?
Or are OData WCF services the way to go?
It seems like tablet software architecture should be able to borrow a lot from smartphone software architecture (but I am new to both).
Has Microsoft made any recommendations in its app samples?
It appears that others are asking similar questions on the Microsoft Developer Forums.
Here is what I've found:
According to Tim Heuer:
...You cannot directly have a SQL db embedded in your app or use something like ADO.NET. This is more of an async/services infrastructure. So if your data was exposed via services, then of course you could connect that way. There are some other light-weight methods you could use for local storage as well using things like the Windows.Storage namespace (which is similar to Isolated Storage in .NET).
Morten Nielsen agrees:
You can use HttpClient to download pretty much anything from the web. Why don't you configure your WCF service to return data as JSON, and use the DataContractJsonSerializer to deserialize the results?
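That fetch-and-deserialize pattern is the same regardless of client stack; a Metro app would use HttpClient and DataContractJsonSerializer as quoted, but the shape of it is sketched here in Python against a hypothetical endpoint just to show the idea:

```python
# Shape of the "call a JSON service and deserialize" pattern.
# The endpoint URL and response fields are hypothetical.
import json
import urllib.request

def fetch_orders(customer_id):
    url = f"https://example.com/api/customers/{customer_id}/orders"
    with urllib.request.urlopen(url) as response:
        payload = json.loads(response.read().decode("utf-8"))
    # Deserialized into plain dicts/lists; a typed client would map these to DTOs.
    return payload["orders"]
```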
Also, Tim Heuer cautions:
...Please note that while awesome, the SQLWinRT project on codeplex is a wrapper to communicate with the classic SQLite engine... which uses APIs that would not pass store validation currently.
Generic Object Storage Helper for WinRT and WinRTFile Based Database seem to have some promise.
But Daniel Stolt raises some good points:
It's awesome that there is good support for building OData clients and other REST clients - but this only addresses the online scenario. The "structured" part of Windows.Storage is a very limited model, essentially limited to name/value pairs, insufficient for all but the most basic scenarios. Yes there is local file storage, which is great of course. But forcing every app developer out there to build her own DBMS on top of local file storage will simply not cut it, especially with all of System.Data having been removed from the profile. If local file storage was sufficient for most device apps, then things like SQLCE would have no purpose today already. And SQLCE clearly has a purpose, and has played a very important role for occasionally connected device apps for a very long time.
There is also a tremendous need for synchronization with a server-side database such as SQL Azure, mostly to be able to roam data between devices. Yes there is the roaming storage model in WinRT, but it shares the same limitations of local storage mentioned above, and on top of this is very limited in capacity (currently 30KB if memory serves). It is simply insufficient for all but the simplest roaming data needs. Again, forcing every app developer to design and implement her own synchronization solution is very bad. You can do much better to enable developers.
Many people are disappointed that the System.Data namespace is not supported in WinRT.
Richard Bethell said:
I don't even have words for this. This is astonishing. Leave aside for the moment they want to force you to abstract to middleware for database connectivity - I don't agree, but I can quasi understand a rationale for that. I can even see pathways for developing like that. But no System.Data.... at all? Do you even understand what you've done to us?
What System.Data can do, outside of just having providers for Sql, OleDb and other custom providers like Oracle, is provide a rich abstraction of XML datasets that allow you to very quickly build a data oriented Service Oriented Architecture.
For instance, I can easily create a web service using SOAP or WCF that returns DataSets or DataTables, and then consume those objects easily and directly. Being able to do this allows very rapid construction of n-tier architectures, even without direct data connections available.
Without System.Data, and the power of DataViews, DataTables, etc. this gets a lot harder. Sure you can custom create structs, put data in there, and serve up structs, and use Linq to do whatever sorting, filtering, etc. you want to do.... but it ends up being twice the work, and makes code reuse a lot harder. And it means using our existing service oriented architecture is impossible (without a big overhaul.)
The withdrawal of System.Data is as big a thing for developers to deal with as the loss of the Printer object in VB6 to vb.net 1.0 was. What is harder to understand in this case is why it is necessary - re-enabling it in the Metro profile can't possibly be a technical difficulty of the product, can it?
It is valuable enough that I would seriously consider including Mono's System.Data classes as part of any app I create (which would obviously have to be open source.)
I think that this is another of those "it depends" questions...
The first and most obvious issue is that whether your premise ("Obviously... support... disconnected") is actually true depends very much on the context in which the application is running - if the app is an internal corporate app then quite possibly not, and in that case no db == not work.
Secondly you could look (hmm, rash... one assumes you could look, this could be a bad assumption) at database synchronisation between a local SQL database and the remote db and so on and so forth.
Taking a step back... yes - you're absolutely right, look at it as being the same as phone or Silverlight (although I don't know if there is RIA support yet) - but the thing is, at this point it's very hard to be prescriptive, because given a general purpose platform one can write applications to suit all sorts of purposes.
Not a hugely helpful answer really - but a start.
Having read @Jim G's answer it seems that I should probably withdraw mine?

Upgrade Path For Legacy Reporting ("Embedded" Access 2003)?

Update: Albert D. Kallal has kindly started the discussion off, and to get some more opinions I'm adding a bounty.
This is a nontrivial question about maintenance of a legacy application myself and two other developers support. We are not the original developers, and the code base is 300,000 lines of MFC and business logic tightly coupled together. We don't know every single line of code 100%.
We do know the code behind the major components, and we know that it's poorly written. Our objective is to refactor the application out of 1995 and into 2010. Between the three of us there is (in aggregate) enough experience in software architecture and database design for us to fix the components that are poorly architected in code or incorrectly modelled in the database, but we don't have a lot of experience with modern reporting systems. Thus my question (once you get to the end of it...) is about reporting systems.
For anybody who reads this entire post, I am appreciative of your time. For anybody who reads this post and replies with solutions, experience (or sympathy!), I am both appreciative and thankful.
At work I have inherited the maintenance of an Access 2003 database that contains approximately 250 reports (and thousands of supporting queries) that acts as a reporting engine for our application.
The reports all have swathes of VBA in them for particular formatting or pulling extra information into the report. For this reason we are entirely locked into the Access platform, we can't use tools like BIDS to import the Access report objects without messing around to make the report display the same without VBA.
So to get ourselves out of this Access solution we need to put some time into going over every single report. This means we're looking to pick the best long-term solution, since we're going to have to redevelop every report regardless of the platform we choose.
Furthermore our customers have a choice of Microsoft Access or SQL Server as their database. This means that all our SQL has to be written with the lowest common denominator in mind - JET SQL. We've got some wiggle room to drop support for Microsoft Access, but we'd need to build a case for it. If the best reporting system we can identify has strong support for SQL Server but little or no support for Microsoft Access this will accelerate us dropping support for Microsoft Access as a database.
The overall implementation of the report system is quite mediocre, when we want to display reports in our application we start a Microsoft Access process, find its window and reparent it to our application, strip off its window styles and then use the Access.Application COM interface to invoke some VBA that creates linked tables to the database (either a Microsoft Access MDB or a SQL Server database) and then opens up the report we want. Probably the only supported part of the process is using the public COM interfaces, the rest is an ugly hack. The other components in the application are equally underwhelming.
To "fix" our application we've got a new development plan, with development of our application split into (approximately) three parts every year.
4 months upgrading our application to support the latest government legislation in our industry
4 months delivering a new major feature
4 months "consolidation" (fixing what is broken)
We're currently at #3 now (for this year), and we really want to take advantage of the downtime to fix up the application, refactoring the major components. We have three developers, and want AppName v5.0 out at the end of 2012 (it's currently AppName v4.12). This gives us 36 months of development effort to apportion between several components (user interface, underlying database structure, reporting, etc.) over the three consolidation periods we will have before then. The sum of the components that we fix will give us v5.0.
We've scoped out what we'd like to do with most of the components except for our reporting engine, and I'm posting on SO in the hope of getting some good ideas, or at least a feel for the work that's required.
I have two ideas for improving our reporting system. Both of them involve a moderate amount of work, and there is one consideration that neither solution addresses completely: in addition to the reports that we develop, our customers also have the opportunity to request bespoke development of reports. They're customer-specific, we take their Access database, augment it with their report and give it back to the customer. There's hundreds of unique reports out there - unusable if we turned the old system off. (And we have to turn the old system off eventually - we don't know how much longer we're going to be able to mess around with the Microsoft Access window to make it look like an embedded report. We already have two distinct code paths for Access 2003 and 2007. What if we can't hack up a code path for Access 2010 and all our customers have to use Access 2007?)
For both ideas, the intention is to stop supporting our current reporting system and let it run for as long as it will without maintenance. Maybe we can hack in Access 2010 and Access 2014 support, and the customer reports that were developed keep puttering along for 5 more years. Over time, we'd migrate the most commonly used reports from the old Access database into their new format.
Idea 1: Microsoft.Reporting.WinForms.ReportViewer
The first idea is to write a wrapper around the ReportViewer control as a replacement reporting engine.
We'd need to move the project to C++/CLI (already on the cards), and instead of having to launch an entire process each time we needed to view a report we could simply instantiate this control. A bonus of this is that the RDLC files that contain the reports are much easier to version control in Subversion than the Access 2003 database we currently have (we use Visual SourceSafe because the tools to integrate SVN with Access don't work well with the size of our Access database). The visual designer for RDLC files is also nicely integrated into Visual Studio.
This is more of an evolutionary rather than revolutionary change to the way we do reports, the ReportViewer control will take an RDLC file that has the report layout, and our application will take care of querying the data. Because our database might be SQL Server or Microsoft Access, we still have to write simple JET SQL. We're gaining better reporting (drill down looks nice), stronger authoring tools and easier version control, but is this worth the effort?
Idea 2: SQL Server Reporting Services and SharePoint 2010 with Access Services
The second idea is to kill Access as a database platform and migrate all our customers to SQL Server (we have hosted instances of our application for those customers who don't have the skill set to set up their own SQL Server instances). Once they're migrated we would use SQL Server Reporting Services as the reporting engine, with the ReportViewer control in server rendering mode.
In addition to SQL Server Reporting Services, I am curious as to whether SharePoint 2010 with Access Services could be used to rapidly migrate existing Access reports into a more manageable format. We'd take the Access report that the customer uses, convert it to an Access Web Report, then make it available for them on a SharePoint site. This would only be for our hosted customers, but if we find a way to quickly massage the VBA out of customer reports we could churn through the several hundred custom reports our customers have.
I'm also interested in the ability to use an Access Web Navigation Form to act as a portal to all our reports. We'd host a web browser control inside our application which would give customers access to their own reports and to our standard suite.
We'd get all the benefits of Idea #1 plus the ability to write in full Transact SQL, a reports portal, and (hopefully) a reasonable upgrade path for customer's proprietary reports.
So, my question is: am I going about this the right way? Are these viable solutions for modern reporting systems, or laughable? We have a strong preference for using the ReportViewer control either in client rendering mode where our application processes the data, or in server rendering mode in conjunction with SQL Server - but are there reporting systems like Crystal Reports which offer better reporting and better migration paths for our legacy Access reports?
If you had up to 36 months of developer time, how would you do this?
Well, ok, since no one else is jumping in, I'll give this a go.
Quite interesting that you're talking about a report writer that's 15+ years old. Back then the Access report writer was beyond state of the art. It was a country mile ahead of everything else in the industry. Even today a lot of competing report writers don't have the concept of sub-reports, which allows modeling of relational data without having to resort to code or even SQL. Throw in programmable VBA, and the result is something that's very unique and powerful.
For Access 2007, the report writer received some more nice upgrades in terms of layout controls, but that's going to be of little help here.
And, for 2010 we can now display reports in a sub-form control. This feature was added to facilitate use of the new Access navigation control. Access 2010 has a new web browser control (works in forms or reports), and there is also a new navigation control. Your post hints that the new navigation control and the web control are somehow related to each other, but they are completely different features.
Both the new web browser control and navigation control can be used in web applications or 100% client-only applications. The navigation control is nice since you can build it by dragging and dropping reports onto it to build up a list of reports to choose from (it is slick and easy and nice). And with this navigation control, we can actually build some nice drill-down type interfaces for reports.
As you noted, for Access 2010 we now have web publishing of Access reports, and this feature is based on SQL Server Reporting Services (they are RDL reports). However, there are two important issues here: no VBA is allowed inside of the web reports, and there is no automatic conversion utility built into Access that will convert existing reports into web-based reports. So to build a report that's going to be published to the web, you have to specifically choose to create a web report. This clears up one of your questions - will this help you convert existing reports to SQL Server? - and the answer is no. Access will not help you convert existing reports to web-based RDL reports (as noted, Access uses RDL and SQL Server Reporting for those web reports - those reports also render in the Access client without conversion).
Access has a great path for web based reports via SharePoint and also Access Web is coming to Office 365. However, keep in mind this ability is not going to help much with the existing reports that you have.
In fact, one of the things I would be looking at if you're going to use the WinForms report viewer is where that existing VBA report code will be moved to. You haven't really mentioned this issue. As noted, one really interesting and great feature of those reports is the embedded VBA code. Often that VBA will have been used because SQL or something like RDL will NOT work, since neither of those languages (SQL and RDL) is procedural code.
I can't stress how important this concept is. It pretty much means that with any report writer replacement, that code will now have to live OUTSIDE of the reports and be moved into your application. So keep this issue in mind: when you issue new reports, you will also be issuing new procedural code that is NOT contained in those reports. This code will have to become part of your application (so, to issue new reports, you will thus also be issuing a new version of your software).
You are not likely to find much that allows procedural code to be embedded inside the report like you can with Access. So that report code and logic will now have to be built and maintained within your main application and outside of the reports.
At the end of the day, I should point out the old adage: if it ain't broke, don't change it. Access has been around for a very long time, but we've seen significant investments from the folks in Redmond in this product during the last few years, so it shows no signs of dying anytime soon.
So, one possible suggestion is to keep the status quo and continue the way it works now. I mean, you stated that you have to continue supporting JET anyway, so you're not getting away from a major part of Access: you're just dumping the report side while still having to use the JET data engine.
However, assuming this decision's been made, I can't really suggest what report writer you should replace the access one with. Obviously considerations for the next report writer should have a seamless path to web even if they are NOW going to be rendered on the desktop. It makes no sense to make a large investment today without web considerations in some fashion.
I do think SQL Server Reporting Services is a good choice due to the web ability. And, as an Access developer, we also have the option to create web-based reports that render perfectly in the Access client on the desktop side (and this works when you have no server, and no conversion issues exist when publishing these reports to the web or using them locally on the client). So, even if you don't use Access, do choose something that allows reports to render both on the desktop and on the web, like Access 2010 allows.
I would consider building the report system around some .NET tools. This would likely not play too well as an embedded report system inside your existing application, but it would allow you to issue new reports without having to touch your existing code base for each new report issued. This issuing of new reports that contain procedural code needs to be resolved: right now you can issue new reports without modifying the main application because those reports can contain code inside. I would be looking for something that allows new reports to be built and issued without you having to ship a new edition of your main software. You might not embed the code in the reports anymore, but you need to place it somewhere, and hopefully outside of your main application.
Wow, this is a great question and Albert has given you a terrific answer.
Unfortunately I do not believe there are any magic bullets to solve your problem. I have used Microsoft Access since its first version, and always felt its strongest feature was as a report generator, particularly when used with SQL Server. As you undoubtedly know, one can often have issues with corrupted Access databases in a multi-user environment, and SQL Server addresses that issue very nicely.
To my way of thinking the biggest problem with Access is that Microsoft brought out managed code (.Net) ten years ago now but Access is still a native application. In an ideal world Microsoft would rewrite Access in C# using all the latest features such as improved support for multiple processors etc. Unfortunately I do not expect this to happen any time soon.
Visual Basic for Applications (VBA) was definitely far ahead of "state of the art" when it was introduced, but today I believe most would agree that coding in VB.Net with Visual Studio is much more productive than continuing to develop in VBA.
Since the selection of a new report generator is something you will need to live with for several years, perhaps it would be helpful to consider what an "ideal" report generator for the next ten years should look like?
Personally I would want:
1) All the great graphics and ease of skinning and branding that Silverlight provides.
2) Great multiprocessor support (you must have noticed how the UI thread in Access often appears "unresponsive" when running long queries or reports).
3) Support for lots of devices such as cellphones, iPads etc. While today the desktop and web dominate, these are becoming increasingly important (unless for some particular reason they are not important to your customers going forward).
4) Support for modern programming practices such as test driven development, dependency injection etc.
Please do let us know what you decide upon.
This is a long shot, but is there a possibility of using Access to generate a saved PDF and displaying that in your app in a PDF-viewing control that is part of your app, rather than external? Or export to XML or something (I haven't a clue what XML export options are available for reports in recent versions of Access, if any)?
The point is that you'd not have to rewrite the Access reporting logic, but you'd have eliminated the fake embedding and replaced it with something that's really embedded in your application.
What you'd be giving up is perhaps the options that the Access UI gives the user, but I'm not sure how useful that is (I'd tend to not want those options available!).
Also, you'd be persisting the reports to disk, but I'm not sure this is much of any kind of significant issue, either, but it would entirely depend on the context (I'm assuming you have no 1000-page reports with heavy graphics, etc.).
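For what it's worth, the save-to-PDF step could be automated through the same Access.Application COM interface the question already uses. The sketch below assumes a version of Access with built-in PDF export (2007 SP2 or later; Access 2003 would need a PDF printer driver instead), and the database path and report name are placeholders:

```python
# Sketch: drive Access via COM to export a report to PDF, then display that
# PDF in an embedded viewer instead of re-parenting the Access window.
# Requires pywin32; the paths and report name are placeholders.
import win32com.client

AC_OUTPUT_REPORT = 3                      # acOutputReport
PDF_FORMAT = "PDF Format (*.pdf)"         # acFormatPDF (Access 2007 SP2+)

def export_report_to_pdf(db_path, report_name, pdf_path):
    access = win32com.client.Dispatch("Access.Application")
    try:
        access.OpenCurrentDatabase(db_path)
        access.DoCmd.OutputTo(AC_OUTPUT_REPORT, report_name, PDF_FORMAT, pdf_path)
    finally:
        access.Quit()

# export_report_to_pdf(r"C:\reports\reporting.mdb", "rptMonthlySales",
#                      r"C:\reports\out\monthly_sales.pdf")
```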
You could take a look at ActiveReports by Data Dynamics. We use it within our apps for paperwork-type reports (e.g., invoices) and it's extremely flexible, far more so than what you can achieve with the MS reporting tools. For reports that are genuine reports rather than paperwork we use Reporting Services. It's been a while since I had to port an Access report to ActiveReports, but there is little or nothing you could do in Access that you can't do in ActiveReports. I'm also fairly certain that it has a decent tool for importing Access reports. There's a fully functional evaluation version available for download, which, unless they've changed things, just prints a watermark in the report footer rather than expiring after a fixed evaluation period. Well worth a look, I'd say - here's a link to their site.
I won't get into any specifics since I'm not a Microsoft developer, but I can answer on how to integrate a legacy product into the current or new product. As for the 36-month question, see the end of this answer.
Usage Requirements - how do you intend to use the legacy code in the context of your new code?
Identify Use Cases - drill down into usage and create a use case for each transaction between the new and old code.
Identify I/O - drill down into each use case and identify I/O requirements
Write Tests - for each I/O pair, write tests to determine the best way to handle that I/O pair.
Reuse - reuse your tests to create a wrapper/API for the legacy code.
Future - as you replace legacy code with new code, let it match your wrapper/API so you can keep refactoring to a minimum.
If I had 36 months of development time to spend, I would spend 3-6 months writing a wrapper/API and then replace each unit tested I/O pair with new code every 7-10 days utilizing sprints (scrum/agile).
For the data store, I would absolutely move from Access to some SQL server product and prioritize that requirement for the new code.
I've used Crystal, Access (2 - 2007), SQL Reporting and now DevExpress, and am very happy with DevExpress's reporting engine. It is specific to .NET, but can be utilized by Windows Forms, ASP.NET web pages, WPF and Silverlight. If you are willing to utilize some .NET controls, I highly recommend it. It can use just about anything as a datasource and is very flexible. My current projects aren't as complex as some things I have done in the past, but I would venture to say that I would rather do complex reports using the DX engine than any other I have used.
They have an end-user designer that includes scripting capabilities, and DX is actively adding functionality.
I would recommend taking a look at: http://devexpress.com/Products/Index/Reporting.xml

What would be a good Coldfusion-based bug tracking software?

What I am looking for is a tool that easily or automatically sends ColdFusion error messages to its system.
Then I can use the web-based interface to manage priorities, track who fixed what and so forth.
I want to use this to help us deal with errors better, but also to show the importance of a bug tracking system to my fellow workers.
System Requirements: Apache, Windows, ColdFusion 8 Standard, SQL Server 2005.
Financial Requirements: Free or Open Source
Goal Or Purpose: To encourage my fellow workers to want and use a bug tracking system.
Does this re-write make more sense?
Thanks
Craig
Wikipedia has a list of issue tracking software; maybe this list could help.
http://en.wikipedia.org/wiki/Comparison_of_issue_tracking_systems
You may be able to find a hosted service and use either email or web services to create the ticket using onError. With that said, a simple issue tracking app could be created for your site using the same DB used to drive the content. 2 or 3 tables would take care of the data storage and you're already using CF so the application layer is already there.
HTH.
I have been heavily using this type of a setup for several years by email only, and the last 3 years with a Bug Tracking Software.
I must say, the bug tracking software has made my life so much more peaceful. Nothing is left, forgotten, or slips through the cracks. It's easy to find trends in errors, and remember "all the times" it happened.
Our setup is like this:
1) ColdFusion + appropriate framework with error reporting - It doesn't matter what you use. I have used Fusebox extensively and am making the transition to ColdBox. Both are very capable, in addition to Mach-II, FW/1, Model-Glue, etc. The key part you have to find in them is their ability to catch "onError", usually in the application CFC.
2) Custom OnError Script - Wherever an error occurs, you want to capture the maximum amount of information about that error and email it in. What we do is, when an error occurs, we log the user out with a message of "oops, log in again". Before logging them out, the application captures the error and emails it to Fogbugz. Along with it, at the top we include the CGI variables for the IP address, browser being used, etc. Over time you will find the things you need to add.
3) Routing in Fogbugz - A 2-user version of Fogbugz is free, and hosted online. There are two main ways to submit bugs. One is to email them in one at a time. So if an error happens 2000 times, you get 2000 emails, and 2000 cases - not always the best for linking them together, etc. They have a feature called BugzScout, which is essentially an HTTP address that you do a form post to with cfform, with all of the same information you would have put into the email. There's plenty of documentation on this, and it's something I've always wanted to get around to. I had a scenario of 2000 emails happen for the first time a few weeks ago, so I'll be switching over to this.
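As an illustration of that BugzScout-style submission, the sketch below posts error details to a tracker's HTTP intake URL. It's written in Python for brevity (the ColdFusion equivalent would be an HTTP POST from the onError handler), and the URL and field names are placeholders - check the BugzScout documentation for the real ones:

```python
# Sketch: submit an error report to a BugzScout-style HTTP intake endpoint.
# URL and form field names are placeholders; consult the tracker's docs.
import urllib.parse
import urllib.request

SCOUT_URL = "https://example.fogbugz.com/scoutSubmit.asp"

def report_error(message, stack_trace, user_agent, remote_ip):
    fields = {
        "ScoutUserName": "AutoSubmit",          # placeholder field names
        "ScoutProject": "MyWebApp",
        "ScoutArea": "Errors",
        "Description": message,                  # used to group duplicate errors
        "Extra": f"{stack_trace}\nUA: {user_agent}\nIP: {remote_ip}",
    }
    data = urllib.parse.urlencode(fields).encode("utf-8")
    with urllib.request.urlopen(SCOUT_URL, data=data) as resp:
        return resp.status == 200

# report_error("NullPointer in checkout", "...stack...", "Mozilla/5.0", "10.0.0.5")
```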
Hope that helps. Share what you ended up doing and why so we all can learn too!
I'm surprised no one mentioned LighthousePro (http://lighthousepro.riaforge.org). Open source - 100% free - and ColdFusion. As the author I'm a bit biased though. :)
Hard question to answer without knowing what kind of restrictions there are. Do you have permission to install anything? Also, most bug-tracking systems require some kind of database support.
I have a suggestion. You can put in place a basic bug-tracking system, that just allows people to create tickets, and allows you/someone else to close it.
More Windows based tools are mentioned here
Good open-source bug tracking / issue tracking sofware for Windows
Any reason why coldfusion specifically?
I really like Fogbugz from the makers of Stack Overflow. For one user it's quite reasonably priced. I enter some bugs manually and have others emailed in.
A lot of bug tracking software will expose SOAP methods for entering data into them.
For example, we used Axosoft's OnTime and that exposed some WSDL pages that I consumed in my application. I was told that Jira did as well.
There are a few in the CF411 list: Bug Tracking/Defect Tracking/Trouble Ticket/Help Desk Tools Written in CFML
We use HopToad. There is another bug-tracking app called LightHouse that integrates with HopToad so you can easily create a [bug] ticket from an incoming exception. HopToad has an API of which there are many clients, you want the CF based one:
http://github.com/timblair/coldfusion-hoptoad-notifier
Even if you don't use HopToad and you end up using a different service or rolling your own, if you needed to write your own API client you could leverage the code or pattern(s) of the above HopToad client.
A lot of good information from everyone, and I really do appreciate the efforts given. But not the answer I was looking for. Which maybe means that what I want does not exist yet.
So I may have to roll my own solution... or maybe integrate with another existing app...
Thank You all.

Why use Oracle Application Express for web app?

I believe we're moving to Oracle Apex for future development. I've read about Oracle Apex on Wikipedia, including its pros and cons. It seems to me the cons outweigh the pros, but maybe I'm wrong. I get the sense that Oracle Apex is for DBAs with little or no programming knowledge to set up a web app quickly, sort of like MS Access for non-programmers.
If you have Oracle Apex working experience, can you share your thoughts? From Wikipedia's entry, it doesn't seem like you need to know any programming language at all, just PL/SQL?
edit: Is Oracle Apex scalable? Can it handle traffic of Facebook's size?
edit: after working nearly two years on Oracle Apex 3.2, I can safely say that I hate it, and I don't see why anyone would want to create a web app/page in a browser, with PL/SQL and no way to do version control.
Thanks.
Jack
Please note my experiences are with APEX 2.x-3.0.
I used Apex for a few internal apps over a 12 month period but eventually dumped it for ASP.NET.
Some Oracle evangelists claim it is capable of creating highly dynamic content on par with the more mainstream frameworks like ASP.NET/J2EE. Technically this is true, but then technically it's also true that you could cross the Atlantic in a one-man canoe. If you are tempted to throw yourself into an APEX project of even moderate complexity then I suggest you look at the APEX sample of a simple discussion forum. Compare it to an ASP.NET MVC discussion forum sample or a RoR implementation.
Having said that:
The Good
Incredibly easy to generate a respectable web app with basic CRUD data entry and simple reporting, and to populate it with data. If you're the IT guy who's been tasked with consolidating a company's mess of Excel/Access DBs into a central DB/web environment then you should take a look at APEX; it's very well suited for this task. If you expect the scope to grow to something of even moderate complexity then I would move straight to a more flexible framework.
If you are a DBA/PLSQL guru but have no experience with traditional web development, you'll be well prepped to expose existing business logic in a web app without stuffing around with HTML/CSS/JavaScript if you don't want to.
APEX support forum has a ton of info and is well staffed by APEX devs.
The Bad
My experience with Apex began to go downhill when apps moved beyond CRUD data entry and required more dynamic and event driven behaviours.
The web based GUI is not cool. Debugging is painful.
Version Control? Who needs version control?
When you (inevitably) need to do anything outside the limited scope of the framework, you'll have to get your hands dirty with PL/SQL. Writing business logic against the database is fine, but generating HTML from PL/SQL procedures felt uncomfortably archaic in 2007.
Given the large number of sneaky places you can hide page and redirection logic, the program flow is both difficult to visualise and not naturally conducive to modular, separable and reusable code. OOP developers will not be impressed. It's possible to have well-structured, maintainable applications with APEX, but it's harder than it should be. This is worlds away from MVC.
Unacceptable number of framework bugs in the versions I used. I'd hope this has improved with recent versions, but the paradigm of integrating the IDE into the APEX platform itself caused me some of the darkest, soul-destroying debugging sessions of my life. As an example, I was trying to reproduce an intermittent bug that would cause a user to lose their session data. Using the session information popup I saw that occasionally the session data would change when it shouldn't have. I spent 2 days trying to find the error in my code with no luck. Near delirious, I noticed by pure chance that I could reproduce erroneous session data in the debug window but the application itself wouldn't go into an error state. My heart sunk when I realised what might be happening. Oracle later confirmed that I'd found a bug in APEX that caused the session information window to intermittently show me data from a prior session. I'd wasted 2 days debugging a session-related bug with a buggy session debug window. That was the last Apex app I built.
PL/SQL is not and will never be the Next Big Thing in web development. After working with APEX for a while I realised it wasn't going to make me a better web developer. Mastering APEX is really about PL/SQL. That's fine if you plan to focus your career on Oracle technology; just be aware that APEX is so tangential to the direction of mainstream web technologies that the portable set of skills you can take from APEX to other web frameworks is minimal.
If you are considering APEX to provide simple web based data entry and reporting, its worth a look. If you are looking for an alternative to .NET/JAVA/PHP for dynamic web content and rich UI interaction I'd advise you to look elsewhere.
I read this page with great interest. Our development team has been using Apex for about 2 years now, and I'd like to sum up our experience.
For building basic CRUD applications, Apex really is excellent. In fact I recommend you try it yourself. We did face some initial minor difficulties setting it up, but these seem to have been ironed out in the 3.2 release.
The good
Great for simple applications. If your app will grow in complexity, consider an alternative solution.
The built in templates mean your app looks quite professional (although some will debate this).
A good support forum and community, with plenty of eager people on hand to assist you.
Some superb built in controls. Love the graphs and reports (but see below).
The bad
The debugger is abysmal. If you have used Visual Studio (and even ancient versions of Microsoft Access), you will cringe at the debugger. No breakpoints, debug messages spewing out to screen in a big list, having to manually print debug messages to the screen. Horrible. The cause of many, many hours lost to support.
As soon as your application becomes complex or requires any rich functionality, you have to resort to Javascript and HTML / CSS hacks, which make debugging and support even more complicated (although you can use tools like Firebug or Visual Studio to assist with this).
We've encountered unexplained session state bugs, and stylesheets becoming 'detached' from the application without explanation - to name a couple of issues.
Supporting unfamiliar apps can be challenging, as it can be difficult to follow the page logic flow without a good debugger. And I don't buy the stock response of 'well - apps should be coded better'. Because in the real world, they aren't - especially when you're using a contractor.
Reports look good but not much good if you can't print them or export to PDF. Of course you can shell out for a reporting server, in the end we used another solution.
Overall
I would say by all means use Apex for simple CRUD apps. For anything of more than mild complexity go for .Net or Java. I wouldn't take any notice of the Wiki article on Apex as it's very skewed. Note how 'difficult to debug' (in my opinion the biggest failing) has been erased from the article.
Something to be very wary of as well is the ludicrous claim that you can quickly convert Access databases straight to Apex. Yes, it will work if your Access DB is very, very simplistic. Anything moderately complex, forget it, as we found.
We would definitely not use it for web facing apps, only internal. There are simply too many difficulties doing things you would take for granted in say, .Net. I know there are sites out there such as AskTom, but these are not exactly complex. Will we see the next Facebook on it? I think not - although I am sure someone reading this will have a crack at it.
Apex is summed up in a previous comment - managers see the demos and quickly buy in, convinced that they've found a silver bullet that will slash development times. I've had managers calling me and saying, we need a DB app with 40 tables building in a week in Apex please - that's how far the myth has spread. The reality is somewhat different. Yes, some things are quicker, substantially quicker, but you will lose the time in other areas - debugging, support, and customisation.
Of course you are best deciding for yourself. Install it, give it a go, you may like it. But don't be fooled by the fast development time claims until you've given it a good going over on a realistic application.
I am involved in a huge project to migrate a 5000 module Oracle Forms application to APEX. This is an extreme use of APEX, but it's working just fine. It is a complete myth that APEX is suitable only for small internal apps built by DBAs, interns or end users: it is certainly suitable for those too (and more suitable than most other tools), but it can also be used to build extremely sophisticated applications.
To build a sophisticated application (rather than a default out-of-the box APEX one) you will need someone on the team with Javascript skills, and someone with CSS skills. But most developers will just need PL/SQL initially.
Is it scalable? Yes: probably more scalable than most other solutions! APEX adds very little overhead to the database server, and only the most minimal of application servers is required. "Facebook size"? I don't know for sure but I don't see why not, assuming you have an Oracle database on a machine large and powerful enough to handle "Facebook size" data and transaction volumes. Like any Oracle project, scalability is impeded mostly by bad database designs and poorly written SQL, not by the tool. Not many people ever find themselves building "Facebook size" systems though: are you?
APEX is a framework that uses the database and PL/SQL to produce web pages. If you can figure out what the output to the browser will need to be you can create it in APEX. If you find any part of the framework inhibiting you can write PL/SQL procedures and expose them to the web server directly but still take advantage of the security, logging, session state, etc that the APEX system manages for you.
You should know PL/SQL, SQL, HTML, JavaScript and CSS. Sure the interface looks like a big data entry application but the data you enter will mostly be code snippets in each of these languages.
It scales as well as the database does. It typically uses Apache as a web server but is only used to serve static files and pass requests back to the database, where the web pages are created by the PL/SQL code in the APEX schema. You can use AJAX to minimize the size of the traffic traveling up and down the pipe. You can set caching for specific items, lists, page regions, pages, etc.
Since most things are pretty simple to do with the framework, naturally there will be some things that are a little more complicated to do within the framework. The color-coding example given above might be something you do with CSS, or maybe you would need to turn to print statements to produce the output you need. The thing is to learn how the framework makes life easier, and then when you hit a limit you can easily resort to more direct methods.
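As a hedged illustration of the CSS route for that kind of color-coding (table, column and class names are invented here): the report query can compute a CSS class per row, and the column's HTML Expression can then wrap the value in a span carrying that class.

    -- Hypothetical classic report source: derive a CSS class from the data
    SELECT order_id,
           order_total,
           CASE
              WHEN order_total >= 10000 THEN 'cell-red'
              WHEN order_total >= 1000  THEN 'cell-amber'
              ELSE                           'cell-green'
           END AS total_css_class
      FROM orders

Then, in the ORDER_TOTAL column's HTML Expression, something like <span class="#TOTAL_CSS_CLASS#">#ORDER_TOTAL#</span>, with the three classes defined in the page or theme CSS.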
Coming from VB.Net you will miss the step-by-step debugging and the drag-and-drop. You will never miss the fact that some part of the page lifecycle does a bind and resets the values you bound to an object in another part of the page.
Good luck.
Greg
I am a DBA and I have never had to program with APEX or, recently, anything else (aside from some bash scripting and custom SQL scripts for administration purposes), because my job is far away from developing applications (except being a pain in the ass for developers, that is). My background is in development, though, and I do believe APEX is the future for strictly Oracle-based, data-centric programs.
Now the keyword here is data centric, since I disagree with many other DBAs that all applications are data centric (you know the kind of DBAs who still think ODBC stands for ORACLE Database Connectivity). Of course all applications involve data, but are all applications data centric? I doubt it, just as I doubt APEX will ever be used for image-processing or mobile-gaming kinds of apps. However, despite all the hype around RIA and Web 2.0, the fact is that most of the businesses around us are hungry for those plain old data-centric applications, and Oracle is the best database around. I can assure you Oracle and APEX can handle much, much more than Facebook-level scalability, provided of course you put the same amount of money into the underlying infrastructure as the Facebook guys do.
By the way, I also hate Oracle's design of the APEX themes (awful, unprofessional UI - just imagine it as the main UI for a bank or an airline), the limited capabilities (although that seems about to change in the future), and many, many more issues (professional PDF reporting without paying the price of an Enterprise Database license for BI Publisher??), but most of all the marketing of APEX as a substitute for Access or Excel, because it gives the bad impression that it is for kids, and I can assure you, my friend, I would never allow kids to touch my databases :)
You see, Oracle has a gem called PL/SQL which was perfected over the years to handle data in a much more intuitive way than any other language. Now that gem is withering with the slow death of Forms/Reports, and I am positive no fresh graduate will ever bother learning it strictly for database stored procedures (just look at the raging war between Java and .NET developers and you realize that once you touch curly brackets {} anything else becomes a heresy). Alas, for thousands upon thousands of excellent PL/SQL developers, APEX remains the only sanctuary where they can remain productive and develop outstanding data-centric applications; without APEX, PL/SQL will surely become the next COBOL. This is why the PL/SQL community will drive Oracle to transform APEX into a grade-A platform much more powerful than what we are seeing today. Either that, or say bye-bye to PL/SQL and join the curly-brackets front (by the way, it is never a bad idea to at least try different technologies when you are a developer; at least you get an idea of why the grass is not necessarily greener on the other side).
I'm not sure why you don't consider PL/SQL a programming language...
APEX is ideal for internal applications where you want a lightweight UI on top of your data. You can build that rather easily without having to write any code.
I also find APEX to be very good for developing smaller customer-facing applications. I wouldn't want to build a giant application that is going to have hundreds of developers working on it using APEX. But if you have a case where 3 or 4 developers are building a smallish site, APEX is likely to be just as good as Java/PHP/ASP.Net/whatever assuming equally skilled developers. If your developers all have lots of ASP.Net expertise, for example, they're going to have a learning curve to write APEX apps. You'd have at least the same level of difficulty, though, if you had a bunch of PL/SQL developers try to learn how to build ASP.Net sites.
That Apex is only suitable for non-programmers and DBAs is an unfortunate misconception. We have used it to build several line-of-business, mission-critical, customer-facing web applications.
The GUI is handled by Apex page templates (HTML), CSS and a bit of Javascript to enhance the user experience. All business logic is placed in PL/SQL packages. This is key to making your application easy to maintain, and to reuse the business logic in other Apex applications and from other client tools, such as C# WinForms, Delphi, Java apps, etc.
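As a rough sketch of what that split can look like (the package, item and parameter names below are mine, not from this project): the APEX page process is reduced to a call into the packaged business logic, using page items as bind variables, and the very same package can be called from C#, Delphi or Java clients over their Oracle drivers.

    -- Hypothetical APEX page process (PL/SQL block). order_api is an invented
    -- business-logic package; :P20_* are page items on an invented page 20.
    BEGIN
       order_api.place_order (p_customer_id => :P20_CUSTOMER_ID,
                              p_product_id  => :P20_PRODUCT_ID,
                              p_quantity    => :P20_QUANTITY);
    END;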
As for performance, the Apex engine adds little overhead and the response times and scalability of your application depends largely on the quality of your SQL queries (and the data model). Think about it this way: With Apex, the only thing between your user and the database is a thin layer of PL/SQL. It's only common sense that this has to be faster than a typical .NET or Java application that has seventeen layers of complexity (typically including lots of web services and object-relational mapping layers) between the GUI and the database.
Don't put business logic into Apex. Use it for presentation only.
If you put the code in the app you will not be able to maintain it, and you'll get RSI from all that clicking. I always create a wrapper layer and, in the Oracle world, follow Tom Kyte's advice - put the business logic as close to the data as possible. This also means that your PL/SQL modules can be called by other systems etc. - and best of all - the real meat of your application will be in straight text files that can be manipulated with your favourite text editor / IDE.
Create a view with all of the data to be retrieved for each screen.
Create a single wrapper package for all CRUD operations (that is Create, Read, Update and Delete, I presume) - see the sketch below.
In short:
DO NOT PUT YOUR APP LOGIC IN APEX.
That's my advice . . . .
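A rough sketch of that pattern, with all names invented: one view backing the screen, one package wrapping the DML, and the APEX page processes calling the package instead of relying on automatic row processing.

    -- View with everything one screen needs to display (hypothetical names)
    CREATE OR REPLACE VIEW customer_screen_v AS
       SELECT c.customer_id, c.cust_name, c.email, r.region_name
         FROM customers c
         JOIN regions   r ON r.region_id = c.region_id;

    -- Single wrapper package for that screen's CRUD operations
    CREATE OR REPLACE PACKAGE customer_crud AS
       PROCEDURE create_customer (p_name        IN  VARCHAR2,
                                  p_email       IN  VARCHAR2,
                                  p_region_id   IN  NUMBER,
                                  p_customer_id OUT NUMBER);
       PROCEDURE update_customer (p_customer_id IN NUMBER,
                                  p_name        IN VARCHAR2,
                                  p_email       IN VARCHAR2);
       PROCEDURE delete_customer (p_customer_id IN NUMBER);
    END customer_crud;
    /

Both scripts live in plain .sql files under version control, which is exactly the point of keeping the logic out of the Apex pages.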
Oracle's Metalink support site was written in Apex, so it definitely CAN scale. They're migrating to a newer Flash-based support site now, though. I understand they acquired that platform through an acquisition of another company, rather than building it in response to any Apex limits.
If you want 'super sexy' with any web app, you'd probably need to go Flash/Silverlight/Air. Under that, any HTML-based site, including an Apex one, can be prettied up with Javascript. The jQuery library will be included with the next main version of Apex (4.0), though you can include that (or any other library) now.
The caching issue mentioned in the Wikipedia article has been addressed, though most installations would still put images and scripts on a conventional directory structure rather than serving them out of the database.
While you are locked into the Oracle database, I don't get the 'platform lock' con in the article. Oracle is available on Windows, Linux and AIX (amongst others). That's a lot less lock-in than ASP / SQL Server.
On my project, we use Oracle APEX for internal views of our system. It works very well for that purpose.
There's no programming required. PL/SQL and even SQL are optional. As a result, our DBA and operator can mold the view to their liking.
On the downside, if there's a feature you need which is not programmed into the system, it's very hard to add it. For instance, we wanted to color-code our output and have not been able to do that.
I would not want to have a customer-facing site built on APEX.
On the question of scalability, one nice thing about APEX is that it's built on Oracle. Focus on writing good SQL and designing the tables properly, and things should scale just fine. I'd be more concerned about getting enough users for scalability to be the problem.
I very much enjoyed reading the thread from top to bottom, as it felt like a hot debate. To recap, the thread started with "I believe we're moving to Oracle Apex for future development...": jack, being a .NET programmer, was worried about his management's decision and set out to find counter-facts against Oracle Apex, which ultimately ended up airing the dirty linen (of all web frameworks) in public.
Despite the fact that the victim was Oracle Apex, the same could happen to .NET or J2EE if the debate were between .NET and J2EE gurus. My point is that all frameworks have their own pros and cons; that is actually why we have so many. It's a waste of time debating what is more important to live (sex, food or water?). Naturally we select the most appropriate item when it is needed.
Oracle APEX suits environments where you have lots of Oracle databases and genuine PL/SQL enthusiasts. You can build rich, complex, Web 2.0, data-centric applications really easily (APEX 4.0), but debugging and version control are still a mess, and you will also have to stick to an Oracle database (yes, there are workarounds, but they are not robust).
If you'd like to see an external web site done in APEX, I suggest looking at the Oracle Tools Users Group site, or Ask Tom. Both are large, frequently used sites with much customization.
Your impression from the Wikipedia article is correct. The only programming knowledge you need is PL/SQL. If most of your site will be simple reports, you don't even need to write the SQL queries, and the wizard interface will build the query and the output for you. If you want cool client side work, you will need to know CSS and Javascript. The PL/SQL is only for the more complex data validation.
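For instance, the PL/SQL you do end up writing is often just a small validation like this hedged sketch, of type "PL/SQL Function Body Returning Boolean" (the page item and table names are made up):

    -- Reject the submit if the e-mail address is already taken
    DECLARE
       l_count NUMBER;
    BEGIN
       SELECT COUNT(*)
         INTO l_count
         FROM customers
        WHERE email = :P10_EMAIL;   -- :P10_EMAIL is the page item
       RETURN l_count = 0;          -- TRUE means the validation passes
    END;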
I disagree. It is not only suitable for developers with limited development skills or for DBAs.
We actually produce highly customized apps using CSS templates of our own, a lot of dynamic actions and interaction (using jQuery and several frameworks), fine-tuned security, our own APEX plugins and complex PL/SQL processes.
Of course, I am using APEX > 4.0.
So, you can build complex apps (we have up to 100 different processes/validations and dynamic actions per page) if needed. And it can require strong programming skills to code properly in JavaScript and PL/SQL (OOP) or Java stored procedures, plus a good knowledge of SQL to define optimized queries of up to 500 lines of code using recursive SQL and some funny features.
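For what it's worth, "recursive SQL" here usually means something like Oracle's recursive subquery factoring (11gR2 and later); a tiny, hypothetical example with an invented employees table:

    -- Walk an org chart top-down with a recursive WITH clause
    WITH org (employee_id, emp_name, manager_id, lvl) AS (
       SELECT employee_id, emp_name, manager_id, 1
         FROM employees
        WHERE manager_id IS NULL
       UNION ALL
       SELECT e.employee_id, e.emp_name, e.manager_id, o.lvl + 1
         FROM employees e
         JOIN org o ON o.employee_id = e.manager_id
    )
    SELECT lvl, LPAD(' ', 2 * (lvl - 1)) || emp_name AS org_chart
      FROM org;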