Is there any way to host a Windows update server for software updates? I am aware of WSUS, but it does not seem to be intended for public use.
There is SCCM, which is distributed by Microsoft itself.
Intune (again offered by MS) is a cloud-based solution and could
also be useful, depending on your specific requirements.
Maybe Opsi is worth a look, too. It uses Linux servers to manage
Windows clients. Some parts of it (the ones you seem to be interested
in) are free to use and open source.
In SCCM, the WMI class CCM_Application in the ClientSDK namespace does not list User-Available applications (Target = User Collection, Purpose = Available) until you install them manually from the Software Center once. As a result, you cannot trigger installation (the "Install" method of the CCM_Application class) of an application that has been made available to a user but has not yet been installed through the Software Center. Is there any way to trigger the installation of such an application from the client machine using WMI or other code, without having to go to the Software Center for that first install? (Otherwise this defeats the purpose of the ClientSDK, IMO.)
I do realize that there is a procedure followed behind the scenes whenever a deployment is made available to a user: the policy is downloaded (or enforced) on the client machine only once the installation is triggered through the Software Center, and not before. But is it possible to carry out all of these steps of discovering and installing a User-Available application from the client machine, just like the Software Center does, WITHOUT ever having to use the Software Center? There has to be a way.
EDIT:
Our purpose is not to automate the process or to use Purpose = Required. What we are trying to achieve is to trigger the installation of User-Available applications externally (not automatically) through WMI, without having to go to the Software Center, just as we can install/uninstall device-targeted applications through the methods exposed by the WMI class CCM_Application without using the Software Center at all. We can do the same for User-Available apps as well, but only once they have been installed through the Software Center, because then they get added to CCM_Application like the other apps and can be installed/uninstalled without the Software Center. Why can't we perform that first initial detection of User-Available apps and install them for the first time as well? If the SDK allows all of the functionality above, why are we forced to use the Software Center for just this one step? There should/must be a way.
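For reference, this is the kind of call that already works today for device-targeted apps (or for user-available apps that have been installed once through the Software Center). It is a minimal sketch using the third-party Python `wmi` package on the SCCM client itself; the application name is a placeholder, the script must run in the right user context, and the parameters follow the documented CCM_Application.Install signature. It does not solve the user-available discovery problem described above.

```python
# Minimal sketch: trigger Install() on an app that is already visible in
# root\ccm\ClientSDK (device-targeted, or user-available and previously
# discovered). Requires the third-party "wmi" package (pip install wmi)
# and must run on the SCCM client machine.
import wmi

APP_NAME = "7-Zip"  # hypothetical application name

sdk = wmi.WMI(namespace=r"root\ccm\ClientSDK")
for app in sdk.CCM_Application():
    if app.Name == APP_NAME and app.InstallState != "Installed":
        # Parameter names follow the documented CCM_Application.Install signature.
        app.Install(
            Id=app.Id,
            Revision=app.Revision,
            IsMachineTarget=app.IsMachineTarget,
            EnforcePreference=0,      # 0 = enforce immediately
            Priority="Normal",
            IsRebootIfNeeded=False,
        )
```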
Can anyone refer an SCCM expert to this question? Is there anyone from Microsoft who actually has a solution to this?
I have already read many long blog posts on the internet related to this query, but none seem to have figured it out.
Related Questions:
CCM_Application User available software missing
(unsolved)
Automated Software deployment through SCCM 2012 using wmi
That question has been answered, but the objective there is different: it simply creates and starts the deployment of software from the Configuration Manager console. What I am trying to achieve is to install the already deployed software (which is Available and hence not automatically installed) right from the client machine, not the server, just like the Software Center does.
We are going to be switching to Sitecore for our CMS, and my team uses Macs. We have no .NET or C# experience but are excited to learn. I understand Microsoft recently released Visual Studio Code, which runs on the Mac, and I've looked into Xamarin. Can someone provide any tips for a Mac guy?
Visual Studio for Mac will not help you, because Sitecore relies quite tightly on Windows features such as IIS and the Windows filesystem with its drives and paths; the rest of the cross-platform ASP.NET 5 features (like OWIN etc.) are also not yet supported by Sitecore.
At the moment the best way to work with Sitecore on a Mac is virtualisation, in particular Parallels Desktop for Mac. I have been using it myself for the last 3 years, and it is the most convenient way. Parallels Desktop is a virtual machine solution that integrates your Windows VM very tightly into the Mac. You can run multiple (say 2-4) Windows virtual machines at the same time (nice for testing Content Management / Content Delivery distributed between separate "machines" on just one Mac), and they are all connected to each other and to the Mac by a virtual network. You will also need an instance of SQL Server; for that you may allocate a separate VM or simply reference an external SQL Server.
Parallels Desktop has a mode called Coherence, in which the Windows and Mac environments are effectively merged into each other, so you can, for example, drag and drop from Finder into Windows Explorer as if you were doing it natively, get the Windows Start button in your Dock, and use many other great features.
However, I prefer to run Parallels in full-screen mode on a second monitor so that it behaves 1-to-1 like a regular Windows machine. By editing the hosts file on the Mac I can run the CMS and the hosted websites right from Safari on the Mac.
Also, virtual machines are stored as folders on your hard drive, so you can back up the current state of the OS simply by archiving that folder, and later revert to the moment you "saved". This is very helpful for experimenting, especially if you are a beginner in Sitecore, because you will not be afraid of accidentally breaking anything.
A good place to start is the official website; you can also quickly get a feel for all its magic from YouTube reviews.
P.S. Of course, you may use any alternative virtualisation software, like VMware etc.
I use Visual Studio for Mac to build my Sitecore solutions. We use a gulp task, based on the one that comes with Habitat, to deploy changed files (binaries, views, config, etc.) into a Windows virtual machine running in Parallels on my Mac.
There are only two things I miss from Visual Studio on Windows: debugging and Sitecore Rocks.
If you can live without those two things you can definitely develop your Sitecore solutions from a Mac with Sitecore running in Windows.
I am a Qt/C++ developer. I would like to set up a continuous integration environment whereby committing the source code triggers a build process that builds the code for the 3 platforms I'm using:
Linux
OS X
Win32
If possible, how do I set up such an environment? Any hints or links are welcome.
I've read a bit about Jenkins, but I can't find a good tutorial for it.
I also suggest Jenkins for several reasons:
It will run on all of the platforms you listed.
It can be configured to start a build when the repository is updated (hint: configure the Job to "Poll SCM" and you won't have to muck with your SCM tool to get it to tell Jenkins to start building).
It provides good support (mostly through plugins) for Unit Testing. [Your project is doing unit testing, right?]
The price is right
A bigger issue you are going to have is that, AFAIK, Qt doesn't really do cross-compiling for other platforms well. Using Jenkins (and the appropriate plugins), you should be able to solve this.
One method that comes quickly to mind is to have an instance of Jenkins on each platform. Each instance is responsible for building the version for its own platform. At the end of the build, the created artifacts are all put into a common, shared location.
Jenkins supports this feature via plugins for all major source control systems. If you are seriously considering using Jenkins (and I would highly recommend it), consider buying John Ferguson Smart's Jenkins: The Definitive Guide.
Two solutions come to mind:
BuildBot
BuildBot is a highly customizable continuous integration system written in Python. The master component offers a nice web-based GUI to monitor and trigger builds; slave components are installed on the target machines (usually virtual machines, but one could be the Mac laptop of one of the developers). The docs are good enough to build up a basic system; customization can be a little tricky (at least it was for me). Using the commit/push hooks provided by version control systems, you can easily notify the master and trigger builds across the slaves. It also supports incremental builds (a must if your project is big).
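To give a feel for what the configuration looks like, here is a minimal sketch of a master.cfg driving one build machine per platform. The repository URL, worker names and passwords are placeholders, and newer Buildbot releases use the term "worker" instead of "slave":

```python
# master.cfg -- minimal sketch of a Buildbot master with one build machine
# per platform. Repository URL, names and passwords are placeholders.
from buildbot.plugins import worker, changes, schedulers, steps, util

c = BuildmasterConfig = {}

# One build machine per target platform.
c['workers'] = [
    worker.Worker("linux-worker", "secret"),
    worker.Worker("osx-worker", "secret"),
    worker.Worker("win32-worker", "secret"),
]
c['protocols'] = {'pb': {'port': 9989}}

# Watch the repository for changes (a push hook could notify the master instead).
c['change_source'] = [
    changes.GitPoller("https://example.com/project.git",
                      branch="master", pollInterval=300),
]

# Build every change on all three platforms.
c['schedulers'] = [
    schedulers.SingleBranchScheduler(
        name="all-platforms",
        change_filter=util.ChangeFilter(branch="master"),
        builderNames=["linux", "osx", "win32"],
    ),
]

# A simple qmake/make build; "incremental" keeps the checkout between builds.
factory = util.BuildFactory([
    steps.Git(repourl="https://example.com/project.git", mode="incremental"),
    steps.ShellCommand(command=["qmake"]),
    steps.Compile(command=["make"]),
])
c['builders'] = [
    util.BuilderConfig(name=name, workernames=[wname], factory=factory)
    for name, wname in [("linux", "linux-worker"),
                        ("osx", "osx-worker"),
                        ("win32", "win32-worker")]
]
```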
CDash
Developed by the authors of CMake, CDash is a web application that collects builds coming from across the network. It is not exactly what you asked for, but I think it's worth a try. It is very powerful if you have a team of developers who can continuously submit build results from their machines to the server (and if you use CMake it's almost transparent). You cannot trigger builds from the server as Buildbot does, but you could set up a bunch of VMs with a cron job that checks for changes and, when there are any, performs the build and sends the results to CDash.
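As a rough illustration of that last approach, here is a minimal sketch of the script a cron job on each VM could run. It assumes the project is already configured with CMake and that the source checkout contains a CTestConfig.cmake pointing at your CDash server; `ctest -D Nightly` then updates, configures, builds, tests and submits in one go. The paths are placeholders:

```python
# run_nightly.py -- minimal sketch of a cron-driven CDash submission.
# Assumes an already-configured CMake build tree whose source checkout
# contains a CTestConfig.cmake pointing at the CDash server.
import subprocess

BUILD_DIR = "/home/tester/project/build"  # placeholder path

# "ctest -D Nightly" performs update, configure, build, test and submit.
subprocess.run(["ctest", "-D", "Nightly"], cwd=BUILD_DIR, check=True)

# Example crontab entry (runs every night at 02:00):
#   0 2 * * * /usr/bin/python3 /home/tester/run_nightly.py
```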
Sure, it's possible. Most version control systems are able to execute a custom script on the server side. Some of them (Git, for example) have hooks to achieve the same locally. Have a look at Git's post-commit hook.
All you need to do is create a script that triggers the cross-platform builds.
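As a rough sketch of what such a hook might look like, the script below could be saved as an executable file named `.git/hooks/post-commit`. It assumes a Jenkins job with "Trigger builds remotely" enabled; the server URL, job name and token are placeholders:

```python
#!/usr/bin/env python3
# .git/hooks/post-commit -- minimal sketch of a hook that asks the CI
# server to start a build after every local commit. The Jenkins URL, job
# name and token are placeholders; the job must have "Trigger builds
# remotely" enabled for this URL to work.
import urllib.request

JENKINS_URL = "http://ci.example.com:8080"
JOB = "myproject-all-platforms"
TOKEN = "changeme"

url = f"{JENKINS_URL}/job/{JOB}/build?token={TOKEN}"
try:
    urllib.request.urlopen(url, timeout=10)
except Exception as exc:  # don't fail the commit if CI is unreachable
    print(f"warning: could not trigger CI build: {exc}")
```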
Most version control systems provide post-commit hooks that let you kick off events like builds. Alternatively, build systems can be configured to regularly poll a source control repository and manage their own build scheduling (this is how we use Jenkins).
Something to bear in mind is how long a complete build across all platforms will take and the typical number of check-ins in that interval. You might find batching check-ins a better way of doing continuous integration builds if you have a fair-sized team or limited build server resources; otherwise your build system could quickly end up playing catch-up.
As for whether it is possible to build on all target platforms, that depends on your tool chain.
Is there a way to meet the following criteria in distributing a Web Service to Windows machines?
1) Automatic installation and configuration of the Web Server.
2) No configuration (or even awareness) of a Web Server required by the customer.
3) No prompts to download and install Java or .NET - especially anything after .NET 2.0; those installs / restarts can take forever!
In short, is there a way to deliver a single install process that installs the Web Server along with a simple web app without requiring lengthy installations of pre-requisites? Something for even the most non-technical of users?
.NET's WCF almost meets the requirements but getting .NET updated up to 3.0 / 3.5 is a lengthy process and can be a turn-off for customers, even if the install holds their hand through the whole thing.
RubyScript2Exe was also very close, but it is extremely touchy and outdated.
I am open to any technology / programming language - just looking for the slickest distribution process for my customers that meets the above three criteria.
I've been doing quite a bit of research on this as it is extremely important to me that my users have a simple installation experience. Here are a few things that I've found:
UltiDev Cassini: Cassini is that convenient mini-server that runs when you debug your web apps from Visual Studio or Visual Web Developer. UltiDev Cassini builds on that and looks pretty promising. It offers support for all non-beta flavors of .NET and integrates right into Visual Studio. Most interesting to me is the ability to include it as part of your installer. The only downside is that pesky .NET prerequisite. I can handle helping users get installed up to 2.0, but the install process to move to 3.0 and 3.5 is way too heavy for the typical user.
RubyScript2Exe: I like the premise of an executable Rails app. However, I attempted to use this on a Mac and it is simply too outdated and requires too many workarounds for my tastes. It's too bad, because I love Ruby on Rails development.
Server2Go: This is my favorite of the three options. It is easily distributable (just send off a zip file) and has a lot of nice options. For example, you can configure it to leave the included Apache server running even after the browser closes - that is PERFECT for a nicely packaged web service. It can also provide a customizable icon in the task bar for shutting down the service if necessary. I think this best meets my needs for the time being.
Please, if you know of any other options, let me know.
Also, you may be wondering, "Why not just write a desktop app?" The simple answer is that I don't need much of a GUI, if any. I need a simple-to-install web service that can be consumed by various other applications (web, mobile, and desktop included).
We need to perform tests on localized platforms that put some burden on our hardware resources, because for just a few weeks we might need plenty of servers and clients (Windows 2003 and Windows 2008, Vista, XP, Red Hat, etc.) in multiple languages.
We have typically relied on blades with Windows 2003 and VMware, but sometimes these are outgrown by short-term needs, and there is also the issue that the acquisition and deployment process is quite slow if the environment needs to grow.
Is Amazon EC2/S3 usable in the following scenario?
Install VMware (the desktop product, because we need the ability to take snapshots) on an Amazon AMI.
Load existing VMware images from S3 and run them on EC2 instances (perhaps 3 or 4 server or client OSes on each EC2 instance).
We are more interested in the ability to very easily start or stop VMware snapshots for relatively short tests. This is just for testing configurations, not a production environment that actually serves a user workload; the only real user is the tester. These configurations might be required for just a few weeks and then turned off for a few months until the next release requires them again.
Is EC2/S3 a viable alternative for this type of testing purpose?
Do you actually need VMware, or are you testing software that runs in the VMware VMs? You might actually need VMware if you are testing, e.g., VMware deployment policy, or are running code that exercises the VMware APIs. An example of the latter might be that you are testing an application server stack and currently use VMware to test it on many platforms.
If you actually need VMware, I do not believe that you can install VMware in EC2. Someone will correct and enlighten me if this is not the case.
If you don't actually need VMware, you have more options. If you can use one of the zillion public AMIs as a baseline, clone the appropriate AMIs and customize them to suit your needs (save the customized version as a private AMI for your team); then you can use as many of them as you like. Perhaps you already have a bunch of VMware images that you need to use in your testing. In that case, you can migrate your VMware images to EC2 AMIs as described in various places on Google, for example:
http://thewebfellas.com/blog/2008/9/1/creating-an-new-ec2-ami-from-within-vmware-or-from-vmdk-files
(Apologies to the SO censors for not pasting the entire article here. It's pretty long.) But that's a shortcut; you can always use the documented AMI creation process to convert any machine (VMware or not) to an AMI. Perform that process for each VMware VM you have, and you'll be all set. Just keep in mind that when you create an AMI you have to upload it to S3, and that will take a lot of time for large VMs.
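Once a VMware image has been turned into a private AMI, bringing a test machine up for a few weeks and shutting it down again is just a couple of API calls. This is a minimal sketch using today's boto3 SDK (which postdates this answer); the AMI ID, key pair, region and instance type are placeholders:

```python
# Minimal sketch: launch a test instance from a private AMI, then stop it
# when the test run is over. The AMI ID, key pair, region and instance
# type are placeholders. Requires boto3 and configured AWS credentials.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Start a test machine from the AMI created out of the VMware image.
resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # hypothetical private AMI
    InstanceType="m5.large",
    KeyName="test-lab-key",
    MinCount=1,
    MaxCount=1,
)
instance_id = resp["Instances"][0]["InstanceId"]

# ... run the localized-platform tests against the instance ...

# Stop (not terminate) so the machine can be brought back for the next release.
ec2.stop_instances(InstanceIds=[instance_id])
```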
This is a bit of a shameless plug, but we have a new startup that may deal with exactly your problem. Amazon EC2 is excellent for on-demand computing, but is really targeted at just a single user launching production servers. We've extended EC2 to make it a Virtual Lab Management environment, with self-service, policies and VM sharing. You can check it out at http://LabSlice.com and see if it meets your needs.
Amazon provides a solution themselves now: http://aws.typepad.com/aws/2010/12/amazon-vm-import-bring-your-vmware-images-to-the-cloud.html