I am very interested in the replacement ASP.NET session manager portion of AppFabric, and somewhat interested in the distributed cache manager. We don't have a need for its hosting features. While we do have a clustered SQL Server in-house, adding it as a dependency for our ASP.NET/Oracle application probably would not be well received.
There is a network-based XML file option that the AppFabric videos suggest is okay for small deployments, which ours would be (one 2-node farm, one 5-node farm).
So are there any success stories without SQL Server on the back end? Would a DFS network share prove reliable enough for AppFabric instead of SQL Server?
I think this is precisely the situation where the AppFabric team intended the XML provider to be used, i.e., where SQL Server is not available or not desired. I doubt there are any case studies available yet where this has been done, purely because AppFabric is so new that they haven't been written. However, I don't believe there are any quirks to using the XML provider over the SQL provider - all I can suggest is try it and see! You could always switch over to SQL Server at a later date if the XML provider proves problematic. Or, if you're feeling brave, you should be able to write an Oracle provider (though the documentation on this seems, um, sketchy).
What is a better mBaaS that supports offline sync and caching?
I am evaluating several mBaaS solutions for my hybrid mobile app under development. I have looked at Kinvey, Kii, Buddy, and the Telerik Backend platform. I have also come across some open-source solutions like OpenMobster and DreamFactory. I am looking to store data in SQLite on the mobile app and then sync it back to an online data store. Kinvey has this support, but their per-user pricing model is not suitable in my scenario. I can see that OpenMobster does this, but I need to understand how. Can I host it on an Azure VM or something? Please also suggest any other solution, commercial or open source, capable of doing offline sync and caching with push notifications and data storage.
DreamFactory could be a good fit for your scenario. It is open source and comes with a full 30 days of free support, after which it's only about $25/month for a developer account - and that isn't even a requirement to use the product; it's specifically a support package.
To address your question a little more in-depth: I don't believe DreamFactory supports offline syncing at the moment, though they plan to very soon. As for SQLite, DreamFactory's DSP (DreamFactory Services Platform) product has a built-in SQLite driver to connect to that DB. However, it hasn't been tested enough for them to call it a fully supported RDBMS. One of the beautiful things about DreamFactory is that you can host the DSP on Azure or Amazon EC2 instances (cloud solutions), host it locally on your own server, or even use its free hosted edition!
I would definitely take a little time to look into DF. It doesn't seem to me like you have much to lose, especially considering it's a free open-source product!
Feel free to ask me any questions you may have about DreamFactory!
-Mark
I'm planning to get GREG from WSO2 as a business service registry. We're currently storing services in a spreadsheet as a delimited text file. The services are still abstract concepts (the operations are not).
What is the best approach (painless, programming-free...) to do a bulk load of about 660 business services and 12,000 operations?
The most painless way is probably to use the registry client. WSO2 provides a Java-based client you can use to easily access the registry. It won't be completely painless, but with a couple of lines of code you could easily add this information (see the sketch below).
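As a rough illustration, a bulk load with the Java client could look something like this. The registry URL, credentials, file layout, and target path are all assumptions, and GREG's dedicated service media type may be a better fit than plain resources, so treat this purely as a starting point:

```java
// Minimal bulk-load sketch, assuming a semicolon-delimited export with
// "serviceName;description" per line. Host, credentials, and the target
// registry path are hypothetical - adjust them to your environment.
import java.io.BufferedReader;
import java.io.FileReader;
import java.net.URL;

import org.wso2.carbon.registry.app.RemoteRegistry;
import org.wso2.carbon.registry.core.Registry;
import org.wso2.carbon.registry.core.Resource;

public class BulkServiceLoader {
    public static void main(String[] args) throws Exception {
        Registry registry = new RemoteRegistry(
                new URL("https://greg.example.com:9443/registry"), // hypothetical host
                "admin", "admin");

        try (BufferedReader in = new BufferedReader(new FileReader("services.csv"))) {
            String line;
            while ((line = in.readLine()) != null) {
                String[] cols = line.split(";"); // assuming two columns per line
                String name = cols[0].trim();

                // One registry resource per business service.
                Resource service = registry.newResource();
                service.setProperty("description", cols[1].trim());
                service.setContent(""); // abstract service: no payload yet
                registry.put("/_system/governance/services/" + name, service);
            }
        }
    }
}
```

The same loop can be extended to hang the 12,000 operations off their parent services as child resources or properties, depending on how you choose to model them.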
Another option would be to plug directly into the underlying JCR repository or database, but then I think you're entering painful territory.
We currently have a single installation multi-site setup, hosted in Europe, and are looking to move content delivery for a single site to China. This is partly for SEO purposes and partly to improve content delivery performance there. Content management performance isn't an issue.
Given that we'll be having to transfer data between two separate hosting companies we'd like to limit both how much gets sent, and if possible not send any data we wouldn't be happy to publish.
We have Sitecore analytics enabled, so this might be a complicating factor.
I've read the scaling guide, which suggests we'll need a minimum of both web and core databases in the new CD environment. It does suggest that, if no extranet security is configured, it is possible to do without the Core database in a pure CD environment.
Does anyone have any experience with this? What are the benefits/pitfalls? What is the bare minimum installation we can get away with?
Edit: Sitecore.NET 6.4.1 (rev. 111003)
Like divamatrix said, knowing the version number is essential.
But even though the older versions can run without the Core, I would stick to an installation that includes the Core so you will have less trouble upgrading in the future.
What you need on the Content Delivery side is:
Web database
Core database
Analytics database
Then on the Content Management side you need your usual:
Master database
Web database
Core database
Analytics database
Then set up SQL replication between the Core databases.
Analytics can be configured to run reports using data from CD and store them on CM.
You also need to set up Web Deployment for file replication between the instances.
Besides all this, you need some extra configuration, as explained in the Scaling Guide.
If you are not using Sitecore 6.4 or higher, I would recommend upgrading first. Once you have this set up properly, it will work like a charm!
To answer your question: older versions of Sitecore worked without the Core database. You didn't say which version of Sitecore you're using, but if it's anything current, the answer is that you need both a web database and a core database. Having analytics enabled is also definitely a consideration. You should probably look at setting up your Analytics database local to your CD hosting, as this database can see a lot of traffic depending on the traffic of your site. You can have publishing set up to either publish to a local web database and then replicate, or you can just let publishing handle the transfer of data between your CM and CD environments.
I’m setting up AppFabric and I’m wondering if using XML (instead of SQL Express) for the “Caching Service Configuration Provider” has any impact on performance or may eventually lead to other problems. To keep dependencies (and things that can go wrong) to a minimum, using a plain XML file seems like the simpler solution.
XML is fine in non-HA scenarios; make sure the share is available to all account contexts on all hosts and you're good to go. Performance is a non-issue - configuration is only checked/used at certain times, such as startup or adding/removing a host. SQL Server configuration is really targeted at higher availability (though, absurdly enough, it is itself subject to crashing the service when SQL Server becomes unavailable).
Incidentally, disk filestore will almost always be faster than DB access for this sort of work.
I have a web-based interface for handling invoices, customer records, and other transaction records, which currently interacts with a database of all the aforementioned stored on the same machine. As you can imagine, this is quite a simple set-up consisting of a web app (PHP) and a database (MySQL). However, the ideal scenario is to keep the records on the machine they are currently on (easy) and move the web app to another server within the same network (again, easy), but in addition provide facilities on a public-facing website for customers to manage their accounts and so forth. The problem is this: the public-facing web server is located in a completely separate location, as it is a dedicated server provided by a well-known ISP.
What would be the best way to enable the records to be accessible from this other server while ensuring that all communications are secure? Speed is not a huge factor, although any outages on either side should be handled gracefully. Initially my thoughts went towards web services (XML-RPC/SOAP/Hessian), but these options seem to present difficulties (security being the main one, along with overcomplexity).
The web app must remain PHP-based. The public-facing site is likely to be PHP-based as well, although Python (likely using Django) is another option. The introduction of other technologies (Java etc.) is not a problem, although it is preferred that they be Linux-friendly (so .NET would not be the best fit here).
Apologies if this question is somewhat verbose and vague. I am testing the water somewhat in regards to this kind of problem. Any advice or suggestions gratefully received.
I've done something similar. You can expose a web service to the internet that will do the database access, but requests to the service must carry a strong hashed and salted password (which will be secured on the ISP's server in the DMZ).
Either this or some sort of public/private key encryption scheme.
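For illustration, here's a minimal sketch of the first approach, written in Java since the question allows other technologies. The endpoint path, header name, and environment variable are hypothetical, and TLS termination in front of the service is assumed:

```java
// Sketch of a shared-secret-protected internal endpoint. The secret is
// expected in the GATEWAY_SECRET environment variable (hypothetical name);
// in production it would come from protected configuration, never source code.
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class RecordsGateway {
    private static final byte[] SHARED_SECRET =
            System.getenv("GATEWAY_SECRET").getBytes(StandardCharsets.UTF_8);

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8443), 0);
        server.createContext("/records", exchange -> {
            String presented = exchange.getRequestHeaders().getFirst("X-Auth-Token");
            if (presented == null
                    || !authentic(presented, exchange.getRequestURI().getPath())) {
                exchange.sendResponseHeaders(403, -1); // reject unauthenticated callers
                return;
            }
            byte[] body = "{\"status\":\"ok\"}".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start();
    }

    // Compare an HMAC of the request path against the presented token,
    // using a constant-time comparison to avoid timing leaks.
    private static boolean authentic(String presentedHex, String path) {
        try {
            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(SHARED_SECRET, "HmacSHA256"));
            byte[] expected = mac.doFinal(path.getBytes(StandardCharsets.UTF_8));
            return MessageDigest.isEqual(expected, fromHex(presentedHex));
        } catch (Exception e) {
            return false; // malformed token or crypto failure: deny
        }
    }

    private static byte[] fromHex(String s) {
        byte[] out = new byte[s.length() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Integer.parseInt(s.substring(2 * i, 2 * i + 2), 16);
        }
        return out;
    }
}
```

Sending an HMAC of the request rather than the password itself means the shared secret never travels over the wire; the PHP caller would compute the matching token with hash_hmac() before making the request.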
OK, this might seem a bit silly, but what if you just used MySQL replication?
Instead of using all sorts of fancy web services, just have a master SQL server on one machine, then have it replicate to another server that holds the slave SQL server as well as the web app.
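As a rough sketch of the resulting read/write split (in Java for illustration, though the same pattern applies in PHP; the hostnames, credentials, and invoices schema are all hypothetical):

```java
// Writes go to the master on the internal network; reads hit the local
// read-only slave, so the public web server keeps serving reads even if
// the link to the master drops. Assumes MySQL replication is already
// configured between the two servers.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ReplicatedRecords {
    private static final String MASTER_URL = "jdbc:mysql://master.internal.example:3306/records";
    private static final String SLAVE_URL  = "jdbc:mysql://127.0.0.1:3306/records";

    public static void insertInvoice(int customerId, double amount) throws Exception {
        try (Connection master = DriverManager.getConnection(MASTER_URL, "app", "secret");
             PreparedStatement ps = master.prepareStatement(
                     "INSERT INTO invoices (customer_id, amount) VALUES (?, ?)")) {
            ps.setInt(1, customerId);
            ps.setDouble(2, amount);
            ps.executeUpdate(); // replication pushes this to the slave shortly after
        }
    }

    public static double totalFor(int customerId) throws Exception {
        try (Connection slave = DriverManager.getConnection(SLAVE_URL, "app", "secret");
             PreparedStatement ps = slave.prepareStatement(
                     "SELECT COALESCE(SUM(amount), 0) FROM invoices WHERE customer_id = ?")) {
            ps.setInt(1, customerId);
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                return rs.getDouble(1);
            }
        }
    }
}
```

The replication link itself should still run over a secured channel (SSL or a VPN) given the sensitivity of the records, but this keeps the public-facing app reading locally and addresses the graceful-outage requirement, at least for reads.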