Business Process "Observer" application - unit-testing

My client is requesting to be notified any time one of their business processes fails for any reason. I had the idea of writing a separate application that will run as an "observer" and check various parts of the process.
An example would be that a daily file was generated and uploaded to an FTP location. The "Observer" might have the following "tests":
Connect to the FTP
Go to folder where file should exist
Find file with naming convention
Verify create date of file
Failure at any step would send an alert email and also log to a report (both, in case the database is down or email is down).
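For reference, each of those tests is small enough to script directly; a rough sketch of the daily-file check (Python just to make it concrete; the host, folder, credentials, and naming convention are all placeholders):

    from datetime import datetime, timedelta
    from ftplib import FTP, error_perm

    def check_daily_file():
        # Run the four checks above; return a list of failure messages.
        failures = []
        try:
            ftp = FTP("ftp.example.com")        # placeholder host
            ftp.login("user", "password")       # placeholder credentials
            ftp.cwd("/outgoing/daily")          # folder where the file should exist
            name = "daily_%s.csv" % datetime.now().strftime("%Y%m%d")  # assumed convention
            if name not in ftp.nlst():
                failures.append("File %s not found" % name)
            else:
                # MDTM returns '213 YYYYMMDDHHMMSS'; verify the file is fresh.
                stamp = ftp.voidcmd("MDTM " + name)[4:].strip()
                modified = datetime.strptime(stamp, "%Y%m%d%H%M%S")
                if datetime.now() - modified > timedelta(days=1):
                    failures.append("File %s is stale (%s)" % (name, modified))
            ftp.quit()
        except (error_perm, OSError) as e:
            failures.append("FTP failure: %s" % e)
        return failures  # caller emails these and writes the report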
My question is: are there any products out there that do something close to this? I'd rather buy than build if something robust exists. If not, this almost seems like a unit-testing platform... is there anything out there for testing that I could potentially repurpose?
As an FYI, we are a Microsoft/Windows based shop.
Thx in advance!

You could even use a Continuous Integration framework for this. They normally monitor source code repositories and build and test things, but they could be used for this as well.
For instance, Hudson, Jenkins, and CruiseControl.NET are a few open source ones that are good and can easily be set up for something like this. Just change the monitoring from a repository to either the filesystem or FTP, and write a small script that checks what you need. Everything else comes for free with the framework, i.e. email, a web interface for monitoring, and running things.
Just an idea.

How to write automated tests when using cloud APIs?

I'm adding to an open source project that uses, in this case, some Azure cloud functionality, but the same general problem applies to any cloud API. I want to write tests for my code, but the results of the tests depend on something having happened in the cloud service I'm using, and for that to happen, I need to supply credentials for the cloud service. In a private project, I could certainly just add my cloud credentials to the testing environment, but for public/open source projects, I can't do this. I can test locally easily enough, but this project uses CI (as do many OSS projects), so this can't really be done.
One approach seems to be using mocks or something similar, but that doesn't actually seem to test that things are happening as they should, and strikes me as a mostly pointless way to achieve 100% coverage.
Are there any 'virtual test cloud' environments that can be spun up to create an identical interface to the cloud service in question, but only for testing? How do these deal with side effects (the code in question creates a DNS entry, and ideally would test for the actual existence of a DNS entry using the system's resolver rather than another cloud call)?
How do people do this kind of testing?
I start with a spike solution to learn how to pass the required credentials. With this knowledge, I can TDD an acceptance test to call a simple API and get a "success" result.
I exclude the credentials from my repository. Instead, I include a template file with instructions.
From there, I drop down to unit tests to TDD sending requests and receiving responses. I don't test actual communication with any service. Instead:
Test the contents of requests.
Create responses and test how they're handled. This makes it really easy to test all sorts of error conditions.
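For example, a minimal sketch of those two points with Python's unittest.mock (the DnsClient wrapper and its endpoint are invented for illustration):

    import unittest
    from unittest.mock import Mock

    class DnsClient:
        # Thin wrapper under test; `session` does the actual HTTP.
        def __init__(self, session):
            self.session = session

        def create_record(self, name, ip):
            resp = self.session.post("/dns/records", json={"name": name, "ip": ip})
            if resp.status_code == 409:
                raise ValueError("record %s already exists" % name)
            resp.raise_for_status()
            return resp.json()["id"]

    class DnsClientTests(unittest.TestCase):
        def test_sends_expected_request(self):
            session = Mock()
            session.post.return_value = Mock(status_code=200, json=lambda: {"id": "abc"})
            DnsClient(session).create_record("www.example.org", "10.0.0.1")
            # Assert on the contents of the request, not on the cloud.
            session.post.assert_called_once_with(
                "/dns/records", json={"name": "www.example.org", "ip": "10.0.0.1"})

        def test_conflict_is_reported(self):
            # Craft an error response to exercise a failure path cheaply.
            session = Mock()
            session.post.return_value = Mock(status_code=409)
            with self.assertRaises(ValueError):
                DnsClient(session).create_record("www.example.org", "10.0.0.1")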
Once I've TDD'd credentials, requests, and responses, I use what I call a spike test to confirm that everything is in fact working. Basically, this uses non-automated confirmation in anything I can quickly hack together.

Application update server

My application (Qt, MSVC2010) requires constant updates to both code (the executable itself) and data (files to be used by the customer).
The main issue is that not every user has the right to download the whole set of updates, so I need a way to send each user only the appropriate files.
I decided to do something like this:
Client: send user ID
Server: check user ID in database, send the appropriate updates
Client: receive updates
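To make the exchange concrete, a minimal sketch (Python only for illustration; the wire format and schema are invented, and the actual file transfer would follow):

    import json
    import sqlite3
    import socketserver

    class UpdateHandler(socketserver.StreamRequestHandler):
        def handle(self):
            user_id = self.rfile.readline().strip().decode()      # Client: send user ID
            db = sqlite3.connect("updates.db")                    # Server: check user ID
            rows = db.execute("SELECT filename FROM updates WHERE user_id = ?", (user_id,))
            files = [r[0] for r in rows]
            self.wfile.write(json.dumps(files).encode() + b"\n")  # send the update list

    if __name__ == "__main__":
        socketserver.TCPServer(("", 9000), UpdateHandler).serve_forever()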
At this stage I'm not focusing on security issues (authentication, encryption), I'd just like to know if there is any ready solution I could use or if I have to code this by myself. Even a partial solution would be of great help.
I'm not aware of any server-side application that can handle this kind of situation, but I must admit this is really not my field.
Last point: I need to avoid any web based solution (user logging in a website, PHP and so on) for a very long list of reasons.
Thank you!
I don't know if this is really an answer; I can just describe how I implemented a very similar design in a simple way some time ago.
1) The client has version information built in (through the .rc file) and user credentials.
2) The client accesses a central database, checking whether there is a URL for its current version and user credentials:
select url from updates where user_id = ? and version > ? -- table and column names illustrative
3) The client fetches the updates as a single zip file from that URL over HTTP/FTP using standard Qt classes. If you need a custom protocol, you might want to implement some logic on top of it.
4) The client updates itself based on the received data.
5) The client notifies the server that the update is complete, so we know what's installed where.
It's a really simple skeleton with a lot of limitations, but it solved everything I needed in that project perfectly. You can deploy an update for a particular user without affecting others.
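For what it's worth, the client side of that skeleton is only a few lines; a sketch (in Python rather than Qt, purely for brevity; the table, columns, and paths are invented):

    import sqlite3
    import urllib.request
    import zipfile

    CURRENT_VERSION = 42  # baked into the client, e.g. from the .rc file

    def check_and_update(user_id):
        db = sqlite3.connect("central.db")   # step 2: look for a newer version
        row = db.execute(
            "SELECT url, version FROM updates WHERE user_id = ? AND version > ? "
            "ORDER BY version DESC LIMIT 1", (user_id, CURRENT_VERSION)).fetchone()
        if row is None:
            return                                     # already up to date
        url, version = row
        path, _ = urllib.request.urlretrieve(url)      # step 3: fetch the zip
        zipfile.ZipFile(path).extractall("app_dir")    # step 4: apply the update
        db.execute("INSERT INTO installs (user_id, version) VALUES (?, ?)",
                   (user_id, version))                 # step 5: report completion
        db.commit()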

Virus scan for files being uploaded to Sitecore

Are there any best practices on virus scanning all files being uploaded to the Sitecore media library (and ultimately stored in Sitecore's DB)?
I searched all over the web, but there is too much noise caused by the word "virus", since many people seem to have performance issues on servers that have anti-virus software installed.
I don't know if it is an established best practice, but I would probably add a processor for the uiUpload pipeline that used an API or command-line process for a commercial antivirus product. Other than the fact that it is in a pipeline processor, it shouldn't really be much different from how you would do it in any other ASP.NET application. Performance will definitely be a concern, but you could create a dialog with a pseudo progress bar to give some feedback to the user.
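To make the command-line idea concrete, the scan step would be roughly this shape (sketched in Python; a real uiUpload processor would be C#, and the scanner binary and flags depend on your antivirus product - clamscan is just one example):

    import os
    import subprocess
    import tempfile

    def is_clean(upload_bytes):
        # Write the upload to a temp file and hand it to the scanner.
        with tempfile.NamedTemporaryFile(delete=False) as tmp:
            tmp.write(upload_bytes)
            path = tmp.name
        # Most CLI scanners signal "infected" via a nonzero exit code
        # (clamscan returns 1 for infected files).
        result = subprocess.run(["clamscan", "--no-summary", path])
        os.unlink(path)
        return result.returncode == 0  # abort the pipeline if False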
Take a look at this post by Mike Reynolds. It may help you out:
http://sitecorejunkie.com/2013/11/09/perform-a-virus-scan-on-files-uploaded-into-sitecore/
I am not aware of any published best practices, but if you are able to add a step to the upload process, you might want to take a look at Metascan, which provides API-level integration with multiple antivirus engines. Using this, you could build a workflow for those uploaded files to scan them prior to them hitting your Sitecore media library, by establishing rules based on the results of the antivirus engines used in your Metascan deployment. There's also a hosted version at metascan-online(dot)com.
Disclaimer /// I am an employee of OPSWAT, who produces Metascan, but it appears to be a potential solution to your issue.
In one of our recent projects, we were faced with a requirement to scan incoming files for viruses. The problem in the project was that the files, after being uploaded, were made publicly available on the website.
The way we solved the problem was by integrating https://www.virustotal.com/. It's a free online virus scanner that has a public API. You can send files via SSL.
We implemented the solution by adding newly uploaded files to a Sitecore workflow. The workflow would handle the scanning of the files and move them to the final stage of the workflow if they weren't infected. If a file was infected, it would be deleted.
A scheduler runs every 5 minutes to check for new incoming files in the workflow.
This also means that the files aren't available straight away, as the scheduler has to check each file first, but you should be able to implement the functionality to run directly when the user has uploaded the file by adding your custom code to the upload pipeline.
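A rough sketch of that scanning step against the VirusTotal v2 public API (Python for illustration; check the current API docs, and the key is obviously a placeholder):

    import requests

    API_KEY = "your-virustotal-api-key"  # placeholder

    def submit(path):
        # Submit the file for scanning; returns a resource id to poll later.
        with open(path, "rb") as f:
            resp = requests.post("https://www.virustotal.com/vtapi/v2/file/scan",
                                 data={"apikey": API_KEY}, files={"file": f})
        resp.raise_for_status()
        return resp.json()["resource"]

    def is_infected(resource):
        # Poll the report; response_code 1 means the scan is finished,
        # anything else means it is still queued (retry on the next run).
        resp = requests.get("https://www.virustotal.com/vtapi/v2/file/report",
                            params={"apikey": API_KEY, "resource": resource})
        resp.raise_for_status()
        report = resp.json()
        if report["response_code"] != 1:
            return None
        return report["positives"] > 0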

Is there an ideal way to move from Staging to Production for ColdFusion code?

I am trying to work out a good way to run a staging server and a production server for hosting multiple ColdFusion sites. Each site is essentially a fork of a repo, with site-specific changes made to each. I am looking for a good way to have this staging server move code (upon QA approval) to the production server.
One fanciful idea involved compiling each site into an EAR file to be run on the production server, but I cannot seem to wrap my head around ColdFusion archives, plus I cannot see any good way of automating this, especially the deployment part.
What I have done successfully before is use Subversion as a go-between for a site: once a site is QA'd, the code is committed, and then an SVN update is run in the production server's working directory, which triggers a code copy from the working directory to the actual live code. This worked fine, but has many moving parts, and still required some form of server access to each server to run the commits and updates. Plus, this worked for an individual site; I think it would be a nightmare to set up and maintain this architecture for multiple sites.
Ideally I would want a group of developers to have FTP access with the ability to log into some control panel to mark a site for QA, then have a QA person check the site and mark it as stable/production-worthy, and then have someone see that a site is pending and click a button to deploy the updated site. (Any of those roles could be filled by the same person, mind you.)
Sorry if that last part wasn't so much the question, just a framework to understand my current thought process.
Agree with @Nathan Strutz that Ant is a good tool for this purpose. Some more thoughts.
You want a repeatable build process that minimizes opportunities for deltas. With that in mind:
SVN export a build.
Tag the build in SVN.
Turn that export into a .zip, something with an installer, etc. The idea being one unit to validate, with a set of repeatable deployment steps.
Send the build to QA.
If QA approves, deploy that build into production.
Move whole code bases over as a build, rather than just changed files. This way you know what's put into place in production is the same thing that was validated. Refactor code so that configuration data is not overwritten by a new build.
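To make those steps concrete, a rough sketch of the export/tag/package sequence (Python standing in for whatever scripting you prefer; the repository URLs and layout are assumptions):

    import pathlib
    import shutil
    import subprocess

    REPO = "https://svn.example.com/repo/mysite/trunk"  # assumed layout
    TAGS = "https://svn.example.com/repo/mysite/tags"

    def make_build(version):
        export_dir = pathlib.Path("build") / ("mysite-" + version)
        export_dir.parent.mkdir(parents=True, exist_ok=True)
        # 1. Export a clean copy (no .svn metadata).
        subprocess.run(["svn", "export", REPO, str(export_dir)], check=True)
        # 2. Tag the build in SVN so it can be reproduced later.
        subprocess.run(["svn", "copy", REPO, TAGS + "/" + version,
                        "-m", "Tag build " + version], check=True)
        # 3. Package the export as the single unit QA will validate.
        return shutil.make_archive(str(export_dir), "zip", root_dir=str(export_dir))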
As for actual production deployment, I have not come across a tool to solve the multiple servers, different code bases challenge. So I think you're best served rolling your own.
As an aside, in your situation I would think through an approach that allows for a standardized codebase, with a mechanism (i.e. an API) that allows for the customization you're describing. Otherwise managing each site as a "custom" project is very painful.
Update
Learning Ant: Ant in Action [book].
On Source Control: for the situation you describe, I would maintain a core code base and overlays per site. Export the core, then the site-specific files over it. This ensures that any core updates which site-specific changes don't override make it in.
Call this combination a "build". Do builds with Ant. Maintain an Ant script - or, perhaps more flexibly, an Ant configuration file - per core and site combination. Track the version numbers of core and site as part of a given build.
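The overlay itself is trivial to script; a sketch of the idea (Python for illustration - in an Ant build this would be two copy tasks; the paths are placeholders):

    import shutil

    def assemble_build(core_dir, site_dir, out_dir):
        # Start from the shared core code base...
        shutil.copytree(core_dir, out_dir)
        # ...then copy the site-specific tree over it, so site files win
        # wherever both trees define the same path (Python 3.8+).
        shutil.copytree(site_dir, out_dir, dirs_exist_ok=True)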
If your software is stuffed inside an installer (NSIS, the Nullsoft installer, for instance) that should be part of the build. Otherwise you should generate a .zip file (.ear is a possibility as well, but I haven't seen anyone actually do this with CF). The point being one file that encompasses the whole build.
This build file is what QA should validate. So validation includes deployment, configuration, and functionality testing. See the Deployment section below for how this can flow.
Deployment:
If you want to automate deployment, QA should be involved as well to validate it. Meaning QA would deploy/install builds using the same process on their servers before doing a staging-to-production deployment.
To do this I would create something that tracks which server receives which build file, along with whatever credentials and connection information are necessary to make that happen. Most likely via FTP. Once transferred, the tool would then extract the build file / run the installer. This last piece is an area I would have to research: how to let one server run commands such as extraction or installation remotely.
You should look into Ant as a migration tool. It allows you to package your build process with a simple XML file that you can run from the command line or from within Eclipse. Creating an automated build process is great because it documents the process as well as executes it the same way, every time.
Ant can handle zipping and unzipping, copying around, making backups if needed, working with your Subversion repository, transferring via FTP, compressing JavaScript, and even calling a web address if you need to do something like flush the application memory or server cache once it's installed. You may be surprised with the things you can do with Ant.
To get started, I would recommend the Ant manual as your main resource, but look into existing Ant builds as a good starting point to get you going. I have one on RIAForge, for example, that does some interesting stuff and calls a Groovy script to do some more processing on my files during the build. If you search RIAForge for build.xml files, you will come up with a great variety of them, many of which are directly for ColdFusion projects.

Are there any decent compiler web services?

Here's the idea: you commit your code to a repository and call a web service (or enter the request through a web app) to have it compiled. The results are then pushed up to an FTP server, S3 bucket, etc. Is there anything like this out there on the public internet?
TFS has a build queuing feature, but I'm thinking more along the lines of an internet (not intranet) web service. And if it can pull from known source control interfaces (Subversion, CVS, etc.), then the caller needs to pass very little besides selecting a compiler and specific compilation options.
My reasoning is more along the lines of removing a lot of software installation and configuration hassles, especially when working across different languages/platforms/frameworks/projects.
See "Is there build farm for checking open source apps against different OS'es?" for related information.