What are the main differences between vSphere 5.5 and vSphere 6? Are there any feature additions? Are they backwards compatible?
Please check this table for a comparison.
vSphere 6/6.5 is a major departure from 5.5. There are a number of small changes, but the two biggest you will likely notice are that in 6+ they deprecated the C# vCenter client, and they are moving vCenter to an in-house Linux-based appliance instead of hosting it on Windows.
vSphere 6.5 uses an HTML5 client for vCenter data center management, accessible from any system in Google Chrome, and completely deprecates the old Windows-based desktop client.
Lots of features were added, such as HCI (hyper-converged infrastructure), in which all of your compute, storage, and network are integrated into a single device.
You can also cut down on manual management tasks: creating switches and deploying the same configuration across multiple data centers is straightforward, and replication jobs become easier.
You can find more info in VMware's documentation.
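To give a concrete feel for that kind of automation, here is a minimal Python sketch using pyvmomi (VMware's Python SDK for the vSphere API). The vCenter address and credentials are placeholders; it just connects and lists every host in the inventory, the kind of check you would otherwise click through manually:

```python
# Minimal sketch, assuming pyvmomi is installed (pip install pyvmomi)
# and a reachable vCenter. Host and credentials below are placeholders.
import ssl

from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

# Lab-only shortcut: skip certificate checks for a self-signed vCenter cert.
context = ssl._create_unverified_context()

si = SmartConnect(
    host="vcenter.example.com",          # placeholder vCenter address
    user="administrator@vsphere.local",  # placeholder account
    pwd="changeme",
    sslContext=context,
)
try:
    content = si.RetrieveContent()
    # Walk every ESXi host in the inventory and print its name and state.
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.HostSystem], True
    )
    for host in view.view:
        print(host.name, host.runtime.connectionState)
    view.Destroy()
finally:
    Disconnect(si)
```

The same session object can drive switch creation, host configuration, and replication setup across data centers, which is where the real time savings over the GUI come from.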
What is a better mBaaS that supports offline sync and caching?
I am evaluating several mBaaS solutions for my hybrid mobile app under development. I looked at Kinvey, Kii, Buddy, and the Telerik Backend platform. I have also come across some open-source solutions like OpenMobster and DreamFactory. I am looking to store data in SQLite on the mobile app and then sync it back with an online data store. Kinvey has this support, but their per-user pricing model is not suitable in my scenario. I can see that OpenMobster does this, but how it works is what I need to understand. Can I host it on an Azure VM or something? Also, please suggest any other solution, commercial or open source, capable of doing offline sync and caching along with push notifications and data storage.
DreamFactory could be a good fit for your scenario. It is open source and comes with a full 30 days of free support, after which it's only about $25/month for a developer account. Even that isn't a requirement to use the product; it's specifically a support package.
To address your question a little more in-depth: I don't believe DreamFactory supports offline syncing at the moment, though they plan to very soon. In regards to SQLite, DreamFactory's product, the DSP (DreamFactory Services Platform), has a built-in SQLite driver to connect to that DB. However, it hasn't been tested enough for them to call it a fully supported RDBMS. One of the beautiful things about DreamFactory is that you're able to host the DSP on Azure or Amazon EC2 instances (cloud solutions), host it locally on your own server, or even use their free hosted edition!
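For what it's worth, the offline-first flow you describe is simple enough to sketch independently of any particular mBaaS. Here is a minimal Python sketch of the pattern (the REST endpoint is hypothetical and stands in for whichever backend you pick): writes always land in a local SQLite table flagged as dirty, and a sync step pushes dirty rows whenever connectivity returns:

```python
# Minimal offline-sync sketch; the endpoint URL is a placeholder, and
# "requests" is assumed to be installed (pip install requests).
import sqlite3

import requests

SYNC_URL = "https://example.com/api/records"  # hypothetical backend endpoint

def init_db(path="local.db"):
    conn = sqlite3.connect(path)
    conn.execute("""CREATE TABLE IF NOT EXISTS records (
        id INTEGER PRIMARY KEY,
        payload TEXT NOT NULL,
        dirty INTEGER NOT NULL DEFAULT 1  -- 1 = not yet pushed to the server
    )""")
    return conn

def save_offline(conn, payload):
    # Writes always land locally first, so the app keeps working offline.
    conn.execute("INSERT INTO records (payload, dirty) VALUES (?, 1)", (payload,))
    conn.commit()

def sync(conn):
    # Push any dirty rows once connectivity returns; mark them clean on success.
    rows = conn.execute("SELECT id, payload FROM records WHERE dirty = 1").fetchall()
    for row_id, payload in rows:
        resp = requests.post(SYNC_URL, json={"payload": payload}, timeout=10)
        if resp.ok:
            conn.execute("UPDATE records SET dirty = 0 WHERE id = ?", (row_id,))
    conn.commit()

conn = init_db()
save_offline(conn, "hello from the device")
sync(conn)  # safe to call repeatedly; only dirty rows are sent
```

Real conflict resolution and pulling down server-side changes are where products like Kinvey or OpenMobster earn their keep, but this is the core of what "offline sync" means on the wire.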
I would definitely take a little time to look into DF. It doesn't seem to me like you have much to lose, especially considering it's a free open-source product!
Feel free to ask me any questions you may have about DreamFactory!
-Mark
I just learned today about the System Center AVIcode product, which is a .NET application monitoring tool. I don't know much about it and I was wondering how it would compare to AppFabric. The latter also has monitoring features as well as other useful features. How much do these two products overlap, and for which scenario is each one better suited?
Thanks for any insights!
AVIcode (now simply called the "APM" feature in System Center 2012 - Operations Manager) and AppDynamics are monitoring products playing in the same space/market.
They both provide visibility into code-level performance issues with your application. If you are interested in AVIcode technology, you can watch my talk at TechEd 2012 to see APM in Operations Manager in action: http://channel9.msdn.com/Events/TechEd/NorthAmerica/2012/MGT302
AppFabric provides hosting and activation services, so it is orthogonal to the above: while it provides some "infrastructure" monitoring capability (i.e., whether the host running your code is up or down), it doesn't go down to the code level to show "what was slow" or "what threw exceptions" in your code.
AppFabric is applicable only to .NET Framework 4.0, in terms of monitoring WCF transactions and Workflows. It's integrated into IIS Manager through extensions.
AVIcode monitors a broader range of .NET frameworks and protocols and is available standalone or through integration with SCOM.
So the overlap would be the visibility they both provide for apps that leverage WCF and Workflows.
If you're interested in .NET application monitoring you might want to check out http://www.appdynamics.com/. We're currently in the middle of our .NET beta program and have had a great response so far from users. I can sign you up for a no-hassle free trial if you want to have a play and see what visibility we can provide. Drop me a line at appman#appdynamics.com if you're keen.
I need a way to install a distributable application without user intervention. Of course, I currently have a distribution profile installed on my device (I can install or uninstall the application by means of iTunes or iPCU); the problem lies on the automation side: no user intervention should be required. Basically, I need to develop software (maybe by hacking iTunesMobileDevice.dll) that installs the application when a valid device (one with a valid distribution profile) is connected to a machine (application server). Any ideas?
Thanks in advance!
There is absolutely nothing in the standard API that will let you do this. I can't imagine a bigger security hole than a mechanism for installing software without the user's intervention/knowledge. If Apple did find such a hole they would plug it so fast it would cause physicists to question certain assumptions about the speed of light.
You might be able to do this on a jailbroken device, but AFAIK all the open development tools require human interaction. You would probably have to write quite a bit from scratch, and you would have all the security and software-availability problems of a jailbroken device. You would also run the risk of Apple breaking the loophole you exploited in a future release.
If I may ask, why exactly are you trying to automatically install software? What advantage do you hope to gain by undermining your security to that extent? There might be a better way to go about it.
I wondered if anyone uses virtualized desktop PCs (running WinXP Pro or older) to keep some seldom-used old applications available for ongoing tasks.
Say you have a really old project that every once in a while needs a document update in a database system or something like that. The database application is running on a virtualized desktop that is only started when needed.
I think we could save energy, hardware and space if we would virtualize some of those old boxes. Any setups in your company?
Edit: Licensing could be of concern, but I guess you have a valid license for the old desktop box. Maybe the license isn't valid in a VM environment; I'd definitely check that beforehand.
Sure enough, if the application is performance-critical, virtualization could hurt. But I'm thinking about some kind of outdated application that is still used to perform, say, a calculation every 12 weeks for a certain customer/service.
I use virtualized desktops for:
Support that requires VPN software I do not want on my own desktop. This also lets a whole team share the support computer for a specific customer.
A legacy system which we use several different versions of (depending on the customer's version); they're not really compatible, so it's good to have a virtualized desktop for each version.
We use virtualisation to test on a variety of operating systems: the server application runs under Linux, and we have a production (real) server and a couple of test servers, which are all VMs.
The client runs under Windows, which, being an OS X user, I have to run in a VM, and the other developer I work with runs an XP VM on his 8-core Vista box.
(I also have a separate VM for running CAD software, but that's not really programming.)
It depends on the requirements of the legacy systems. Very often, if a system is reliant on a certain clock frequency, then it is better and more reliable to keep it running on the older physical system, as virtualized OSes can do funny things to performance.
If the legacy systems aren't that critical, then go for it! One piece of advice I would give is to ensure that the system works FULLY before chucking out your old 3.11 systems, as I have been stung before! Performing the testing fully can cost more money than you might save, but it's up to whoever makes the decisions to ensure that is considered.
We use virtualisation for testing out applications on Vista. Or rather, customers do the testing, and we use virtualisation to reproduce the bugs they complain about.
I guess the thing that would stop me from using lots of virtual instances of my favourite proprietary OS would be licencing. I presume Microsoft would want me to have a licence for every installation, virtual or otherwise?
We use VMWare with a virtual windows XP here at work to run some old development tools with very expensive licenses that don't run at all on Vista. So VMWare saved us about $5000 in licenses.
Since my last machine upgrade I have been running virtualised OSes for a number of tasks. For example, I use a different set of Visual Studio plugins for managed and unmanaged C++ development. Some things I found:
Run your VMware setup on a machine with plenty of resources. I'll repeat... plenty of resources! A fast quad-core and 8GB of memory is what my current machine is running, and it runs sweet (warning: you need a 64-bit OS for the 8GB!).
I wouldn't worry about app performance if your current physical hardware is old (2+ years). With a decent machine I find the virtualized apps run faster than on the legacy hardware!
When upgrading to a new workstation, P2V your old workstation. No need to worry about Synergy or a KVM in the transition period any more!
I've used virtualisation so I could take my development environment around with me while travelling. As long as I could install MS Virtual PC (and the PC/laptop had generous enough RAM), I could access all my tools, VPN, Remote Desktop links, SQL databases, etc.
It worked fairly well, just a little slower than I'd like. I could have carted a laptop around, but found a small portable hard drive to be lighter/easier and just as effective.
However, consulting for several clients, all with different VPN requirements/passwords/databases/versions of frameworks and tools, etc., I've found that having a virtualised support environment for each is well worth it. Then multiple users have access to what is needed when supporting each client; they just need to remote desktop into (or run directly) the virtualised instance.
I've used VMs to handle work-related tasks that I didn't want to / couldn't do on the company-issued laptop. Specifically, I needed to have several editions of the JRE running at the same time, which Java doesn't really like.
To get around this, I built several VMs that each ran the one tool I needed in trimmed-down XP instances.
Another thing to consider is that if you have a 5-year-old server running some app, it's probably going to run just fine in a VM on new hardware. So, if you have a rack of old devices, buy one or two "real" servers, install something like ESX (I'm most familiar with that tool, though Xen and others exist), then use a physical-to-virtual conversion tool to switch those old devices to VMs. You can reduce your electricity consumption, management headaches, and worries about a critical device failing and not being able to find hardware for it.
We use VMs for legacy apps and have retired the old machines that served up those apps. This eliminated the concern of matching drivers from NT to Win2k3. From a disaster-recovery perspective it also helped, as we couldn't find boxes to support the old apps at the DR data center.
The likes of VMWare are invaluable tools for browser testing of web applications. You can pretty easily test many combinations of OS and browser without having rank upon rank of physical machines running that software.