Does it Build? - A VirtualBox image to build software packages successfully - build

I would like to set up a minimal Linux system that has nothing to do other than build software packages, but does so reliably and successfully!
Goal:
Set up and publicly provide a VirtualBox image, plus a set of instructions for major software packages that clearly run through the whole build process successfully, creating a reliable basis for interested people to start understanding each package.
1. First important question: does something like this already exist?
It doesn't make sense to reinvent the wheel. I have done a bit of research but didn't really find anything to my taste.
Do you know of anything that's already out there?
2. How should I start?
VirtualBox.
Setting up VirtualBox is easy.
I have worked quite a bit with VirtualBox. But is it the right choice?
Basic Linux.
There are many Linux flavors, especially for a case like this where only the build features are required. A GUI is definitely not necessary, so the footprint could be a lot smaller. I first thought about Mini Ubuntu, but on second thought that is just a smaller CD image for installing the whole of Ubuntu from the internet. The next thought was the Ubuntu Server Edition. That's a CD of 670 MB; why is it so big? I also remember that I once worked with LFS... The goal should be to create, with limited effort, a system that can build the main software packages without problems.
3. Any more suggestions on how to proceed?
...

Your purpose is pretty much the same as that of the Open Build Service, for which you can download the installer ISO. Then, if you like, you can install it within VirtualBox onto a virtual hard drive.

You can probably find what you need here:
http://virtualboxes.org/images/ubuntu/
Afterwards you can just add your required tools and repackage the image for distribution.
But you should add more details of what you plan to do: what kind of packages, for what kind of distributions?


Is there any package manager for C++? [duplicate]

Just wondering about the best way to install C++ packages. My background is in JS/Ruby/etc., so it seems so weird that there's no cpm or the like. What am I missing? I assume it's not that simple...
For an example, I can't even run a .cpp file with #include <iostream> as I get fatal error: 'iostream' file not found
Edit for clarity: iostream was a bad example, my system config was wonked back when I wrote this. Replace it in your imagination with a non-standard library.
There seem to be a few, though I've never used them.
bpt
https://github.com/vector-of-bool/dds
cpm
http://www.cpm.rocks/
conan
https://conan.io/
poac
https://github.com/poacpm/poac
pacm
http://sourcey.com/pacm/
spack
https://spack.io
buckaroo
http://buckaroo.pm
hunter
https://github.com/ruslo/hunter
vcpkg
https://github.com/Microsoft/vcpkg
Conan is the clear winner today, based on its 36+ GitHub contributors and the fact that I found its Getting Started documentation easy enough to get working. It's MIT-licensed too.
Conan's documentation even compares it to biicode, which I was surprised wasn't mentioned in other answers; biicode, though, seems to be abandoned, much like cpm.
pacm has some activity but is LGPL-licensed, which may be an issue for some projects.
This builds on user3071643's answer, thanks!
No, there certainly isn't an official package manager for C/C++, but I'll give you a few suggestions to hopefully make your days better.
First, I would suggest investigating CMake or GENie for ways of incorporating libraries in your build system in an extensible and cross-platform way. However, they both assume that you have libraries that are in an "easy to find" place or that they have the same build system as your project. There are ways to teach both how to find libraries that you need. All other solutions are platform specific.
If you have Node.js or Haxe in your project, then both npm and haxelib do have ways that you can use C++ (in a precompiled dll/so form) from JavaScript or Haxe respectively, but that's a big, and probably wrong, assumption that you'd be using Node.js or Haxe in a project that really needs the benefits that C/C++ can provide.
For the Microsoft world, I believe that NuGet has some C++ libraries, although its platform support is limited to Visual Studio, and probably Windows. Still, it probably best fits what you mean by "package system" given your examples (assuming that you meant cpm as a C package manager in the way that npm is the Node Package Manager).
For the Linux world, technically rpm, yum, and apt-get do function as C/C++ development package managers just as much as system package managers. Unlike with npm, you must always install packages globally; but given that your app, if it's a Linux app, would likely be packaged for one or more of these managers anyway, and packages carry an embedded dependency list, that's not much of a problem.
For the macOS/iOS world there's CocoaPods but, like NuGet, you're locked in to the Apple platform. There is always MacPorts, if you are happy with a Linux-style system package manager as a dev package manager, as described in the previous paragraph.
I want this npm-style, local-install sort of functionality cross-platform as well, but since C/C++ is used on so many platforms (all of them?), and C/C++ programmers like myself tend to roll their own... everything, which keeps us all (unnecessarily?) busy, there hasn't been much of a push to make such a project, which is a shame. Perhaps you should make one? It would certainly make my days better.
UPDATE
Conan is the C/C++ package manager that we've all been wanting. It has both local and global servers, so it's good for both business and open source packages. It's young, and its global repository doesn't have very many packages. We should all work to add packages to it!
UPDATE 2
I have found that vcpkg has been very useful for Windows and Android. If you can't get over the fact that Conan.io is written in Python, then it might be worth a look.
Also, although it mandates that you use it for your own project and all of your dependencies, I believe that build2 should be the ultimate winner in a few years; as of the time of writing, though, it's still maturing.
The other answers have mentioned the competing solutions, but I thought I would add my experiences. I did some research into package managers and build systems for $WORK. We were greenfield so anything was on the table. These are my findings, YMMV.
Conan
Claims to support every build system, but you need to write Python scripts to tell Conan how your build works. Lots of magic, and easy to misconfigure. You also need to manage remotes, local remotes, etc. using conan create. We wanted something simple, and this put me off. IDE integration did not work reliably (due to the Python scripts). I asked about reproducible builds and they said it is as reproducible as you want to make it; so it is not really reproducible.
https://conan.io/
Hunter
All packages are defined inside a single repository; you need to fork it to add packages. Everything is driven by CMake. We want to deprecate CMake internally due to its poor syntax, non-reproducible builds, and all the other issues you probably know already. Hunter does offer reproducible installations of packages, because you put Hunter itself in source control, which is excellent.
https://github.com/ruslo/hunter
Buckaroo
Opinionated but the simplest solution. No need to manage remotes or forks of package lists, since all packages are just Git repos. We use private GitHub repos, so this was a plus for us. We were a bit hesitant about the Buck build system, which they require. It turns out Buck gets most things right (I used and liked Meson and Bazel in the past), and writing Buck files was less work than integrating CMake projects anyway. Also, and this was big for us, Buckaroo actually supports Java too, although the Maven support was hacky. We were able to create iOS and Android builds from a single build tool / package manager. Documentation is poor, but they were responsive to my emails. Needs more packages.
https://buckaroo.pm/
VCPKG
Similar to Hunter, but from Microsoft. They don't keep older versions of packages, which might be a problem. Again, everything is done in CMake, so builds get more complex and slower over time. I think vcpkg has the most packages of all these solutions.
https://github.com/Microsoft/vcpkg
No, there's no package manager for C++ libraries. There are various ways to install C++ libraries, as with any other software: through your operating system's package manager, by building from a source tarball, or, in the case of proprietary software, by running some installation program.
Note that if #include <iostream> doesn't work, then your compiler or development environment is simply not installed correctly. I believe Super User is the site where you can ask for help with that sort of thing.
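A quick way to check is to compile a minimal translation unit; if even this fails, the problem is the toolchain installation, not a missing package manager (a minimal sketch; the file name and compile command are just examples):

// hello.cpp - if the compiler is set up correctly, this builds as-is.
// For example: g++ hello.cpp -o hello && ./hello
#include <iostream>

int main() {
    std::cout << "toolchain OK" << std::endl;
    return 0;
}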

SDL - Cross platform development

I'm going to enter a small game competition in the coming months. They require that the submission can be compiled on Linux (and it will be, before being run/evaluated for the contest). I'm going to be using SDL and C++. I've only ever developed on Windows before, and I've grown quite accustomed to the benefits Visual Studio gives. I'd like to be able to develop on Windows with VS, and then near the end of the process migrate it over to Linux. Besides making sure SDL is already installed on the Linux machine, are there things I can do throughout development that will make the process easier? Also, the contest rule for all of this states:
it must also work on an open platform (we strongly recommend making sure that your program run on modern flavors of GNU/Linux, as all of the judges will have access to it).
I assume compiling/running in Ubuntu (already have a home server with this) would be sufficient for this?
Your question is slightly open-ended, but my first suggestion would be to use a proper cross-platform build system such as CMake from day one. I would refrain from "migrating" to Linux at the very end; you may be on a tight schedule (and may run into problems you did not anticipate). A continuous build of (working) Linux versions will help ease your worries.
Furthermore, if the game is meant to run solely under Linux, why not install Ubuntu in a virtual machine somewhere and get acquainted with one of the development environments such as KDevelop or Qt Creator? Wouldn't direct contact with the platform you are developing for make things a little easier?
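To make the "build on both from day one" advice concrete, here is a minimal sketch of the kind of portable SDL entry point that should compile unchanged under Visual Studio and g++ (assuming the SDL2 API; SDL 1.2 differs slightly):

// Minimal portable SDL2 skeleton (a sketch, assuming SDL2 is installed).
#include <SDL.h>

int main(int argc, char* argv[]) {  // SDL requires this exact signature
    if (SDL_Init(SDL_INIT_VIDEO) != 0) return 1;
    SDL_Window* win = SDL_CreateWindow("demo",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, 0);
    bool running = true;
    while (running) {
        SDL_Event e;
        while (SDL_PollEvent(&e))
            if (e.type == SDL_QUIT) running = false;
    }
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}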
I develop games and started out like you. I'd advise you to use the SFML library for this purpose. It's not very big and is a very good thing to start from.
There you can use:
2D renderer (OpenGL)
Fonts
Timers
Wrappers around images/sprites
Post effects/shaders
Sound
Network
On that page you can find a few starter examples.
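As a taste of the API, a minimal SFML window loop looks roughly like this (a sketch assuming the SFML 2.x API; link against sfml-graphics, sfml-window and sfml-system):

// Minimal SFML 2.x window loop (sketch).
#include <SFML/Graphics.hpp>

int main() {
    sf::RenderWindow window(sf::VideoMode(800, 600), "SFML demo");
    while (window.isOpen()) {
        sf::Event event;
        while (window.pollEvent(event))
            if (event.type == sf::Event::Closed)
                window.close();
        window.clear(sf::Color::Black);
        // ... draw sprites, text, post effects here ...
        window.display();
    }
    return 0;
}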

Learning Linux from Windows: newbie questions [closed]

I am a newbie at Linux and programming. I want to learn Linux commands and create some C and C++ programs that interact with the Linux API, starting from my Windows XP machine. However, I am not sure how to set up the environment on my Windows-based computer.
What programs should I install? Also, it seems Linux comes as Ubuntu, Fedora, and I have heard of Red Hat as well. What is the difference, and which one should I install?
Also, is there any difference between using Linux with a user interface like Ubuntu's and an IDE to create programs, versus using the command-line terminal with Vim?
Besides, I have heard of using Valgrind to debug programs. Does Valgrind work together with an IDE in Ubuntu, or only in the command-line terminal? If my IDE already has a debugger, do I still need Valgrind?
Sorry for such newbie questions.
Thanks.
Your question is very vague and likely to start arguments and fights. Also, you misused a lot of terminology there, and before you even start programming on Linux, you should first get acquainted with the OS, especially the terminal. But first things first: programming in C/C++ for Windows is not entirely the same as programming in C/C++ for Linux. If you want the latter, then use the latter.
My suggestion is this:
Grab VirtualBox and install it.
Download Ubuntu; IMHO, Ubuntu is best for starters (or anyone, as a matter of fact) because it has a lot of support, a good user base, and is compatible with pretty much any Linux software installer (RPM, deb, etc.). You can choose any other distribution, it doesn't really matter, but I recommend this one. [1]
Start VirtualBox and create a new Ubuntu virtual machine (the steps are pretty straightforward; consult the documentation for any assistance). Your virtual machine may look something like:
1GB of RAM will be enough;
10GB of hard disk (you won't need much more, but you may increase the size if you think you'll need more space for /home; see the next point);
a network adapter set to bridged;
etc.
Install Ubuntu from the ISO that you have just downloaded (mounted into VirtualBox as a CD-ROM device). You'll only need about 8GB of hard disk total for a typical, minimal Ubuntu installation (ext4+swap), but I recommend the setup above.
Enjoy your installation. (Tip: now you can install VirtualBox's Guest Additions.)
Open a terminal in your Ubuntu VirtualBox window and type sudo apt-get install build-essential to install the GCC compiler and basic build tools.
Gedit is already installed by default with Ubuntu, and it's a fairly good text editor compared to Windows' Notepad. However, vim is not, but you can install it with sudo apt-get install vim in the terminal.
And voilà! You're all set to do some C/C++ programming in a Linux environment, while still having Windows around in case you get stuck.
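As a first test that also touches the Linux API the question asks about, a tiny program like this can be compiled straight from the terminal (a minimal sketch; the file name is made up):

// linuxinfo.cpp - a first taste of the Linux API from C++.
// Build and run in the terminal: g++ linuxinfo.cpp -o linuxinfo && ./linuxinfo
#include <iostream>
#include <unistd.h>       // getpid(), POSIX API
#include <sys/utsname.h>  // uname()

int main() {
    utsname info;
    if (uname(&info) != 0) return 1;
    std::cout << "Running " << info.sysname << " " << info.release
              << " on " << info.machine << ", pid " << getpid() << "\n";
    return 0;
}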
I also recommend you do most of your learning using the terminal (aka the command line) so you know how things work under the hood. Then, when you are familiar with the GCC compiler, makefiles, etc., you can install an IDE to avoid repetitive tasks; the two best I have found so far are Anjuta and MonoDevelop. Both are available from the repositories.
Now, if you want to "create some C and C++ programs to interact with the linux API from [your] windows XP", you need, for example, to learn sockets and SSH, so you can connect to your Linux machine from your Windows machine and execute commands remotely from your Windows applications. But before you do that, learn C/C++ and play around with Linux. For a newbie, you already have your hands full right there.
Good luck!
[1] Ubuntu (a Linux distribution) comes with GNOME as its GUI, while Kubuntu comes with KDE and Xubuntu with Xfce. All of these GUIs are separate projects, and you could have all of them installed on any Linux desktop installation. You don't even need a GUI with any Linux distribution; for example, VMware's Virtual Appliance Marketplace has a whole bunch of ready-to-go Linux installations like that.
Try it in a VM or as a live CD.
Valgrind is a command-line tool, but some IDEs may have it integrated.
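For example, a deliberate leak like the one below is exactly what Valgrind catches from the terminal (a minimal sketch; the file name is made up):

// leaky.cpp - Valgrind demo: the allocation below is never freed.
// Build and check:
//   g++ -g leaky.cpp -o leaky
//   valgrind --leak-check=full ./leaky
int main() {
    int* p = new int[100];  // never deleted; Valgrind reports it as
    p[0] = 42;              // "definitely lost" (400 bytes on most systems)
    return 0;
}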
Under Linux you'll see that, most of the time, an IDE is quite unnecessary (not a troll).
You'll write your makefiles manually, ...
Hope you'll enjoy it.
Regards,
Learn one thing at a time.
If you want to learn to program first, try Python first. It works on Windows and Linux and you get results faster.
If you want to learn C++, get Visual C++ Express or Cygwin/GCC.
If you want to experiment with Linux, get a distribution of your choice (Linux Mint is a good introduction coming from Windows) and try it in a VM (VMware Player or VirtualBox).
Try easy projects, and only after that worry about debuggers.
If you try it in virtual machines (VirtualBox or VMware, for example) you can test as many options as you want before deciding which Linux distro you want to use. There are a lot, but from what you listed, my personal opinion is that Ubuntu is a lot easier than Fedora to start with. I've never used Red Hat so I can't tell, but it doesn't have the reputation of being a hard one (for experts).
Anyway, at least to start, I recommend installing it with a GUI (and keeping one afterwards too, unless it's a server...).
Regarding IDEs, you could try Eclipse and NetBeans. They both run on Windows and Linux, but I'm not a C/C++ programmer, so I don't know if they are good at that job. If you don't use an IDE, Vim is far from being the only option (Vim "addicts" :) will say it is the only productive one, but that's a personal choice, and the learning time is not very short). Personally I prefer a good IDE, or at least graphical editors, for programming. It's not that I don't like the power and speed of the terminal and the command line; I just prefer to use them for system administration or configuration, not for programming, where you stay on it for a long time.
I don't know about Valgrind, but the Eclipse and NetBeans IDEs have debuggers, of course.
Programming for Linux is a series of projects to learn. For the steps, you may refer to:
The Red Hat Certified System Administrator I & II student books, which may help you gain survival skills in Linux. When you really understand the fields covered by these courses, you will have the ability to find what to learn next.
Search Amazon with the keyword 'Linux Programming', choose a book, and start your journey.
Have a good time.

Which install system to pick when deploying to Windows and Linux?

My company is thinking of dumping InstallShield and moving to something else, mainly because of the poor experience we had with it, mostly on Linux.
Our product is a C++ application (binaries, shared libraries) targeted at Windows and Linux (Red Hat).
The installer itself isn't required to do anything special, just dump some binaries and shared libraries and sometimes execute an external process. Things like version upgrades through the installer aren't necessary; that is handled after the installer finishes.
I thought of suggesting using NSIS on Windows and RPM on Linux.
What are the recommended installer systems to use when deploying to Windows/Linux? Something that is cross platform to prevent maintaining two installers is a definite plus.
For Windows I would definitely use NSIS. It's very lightweight, easy to code, and very simple to understand. Using MSI would just be a killer: it generates a GUID for every file so you get upgrades for free and such, but truth be told, you never end up using any of that.
Regarding Linux, I would go for RPM and deb. They're probably the two biggest packaging systems, so you'll be targeting most Linux users. I've never built an RPM, but creating a deb package is fairly straightforward.
See also:
What to use for creating a quick and light setup file?
Packaging to use to deploy cross-platform?
And even:
Creating installers for complex cross-platform programs
There's a tool called BitRock Installer which can create installers for Windows, Linux and OS X.
However, I think that if you target Red Hat it would be better to provide native packages for that platform (that is, .rpm).
For C++ projects, I'd go with CMake/CPack, if you are also willing to change your build system. Great support, strongly cross-platform. CPack has various generators; NSIS is one of them.
Take a look at InstallJammer. It will handle both platforms from the same build project, and you can have the installer register the package with the RPM database as well if that's your requirement.
You may want to consider our tool BitRock InstallBuilder; it can generate installers for Windows and Linux from a single project file, and RPMs as well. Is your application based on Qt? Our clients include the makers of Qt, Nokia (previously Trolltech), who use it to package their Qt Creator product. We encourage you to give InstallBuilder a try and to contact our support with any questions or suggestions you may have.

Using Visual Studio to develop for C++ for Unix

Does anyone have battle stories to share trying to use Visual Studio to develop applications for Unix? And I'm not talking using .NET with a Mono or Wine virtual platform running beneath.
Our company has about 20 developers, all running Windows XP/Vista and developing primarily for Linux and Solaris. Until recently we all logged into a main Linux server and modified/built code the good old-fashioned way: Emacs, vi, dtpad, take your pick. Then someone said, "Hey, we're living in the Dark Ages; we should be using an IDE."
So we tried out a few and decided that Visual Studio was the only one that would meet our performance needs (yes, I'm sure that IDE X is a very nice IDE, but we chose VS).
The problem is: how do you set up your environment to have the files available locally to VS, but also available to a build server? We settled on writing a Visual Studio plugin: it writes our files locally and to the build server whenever we hit "Save", and we have a big fat "sync" button that we can push when our files change on the server side (for when we update to the latest files from our source control server).
The plugin also uses Visual Studio's external build system feature, which ultimately just sshes into the build server and calls our local "make" utility (which is Boost.Build v2; it has great dependency checking, but is really slow to start as a result, i.e. 30-60 seconds before a build begins). The results are piped back into Visual Studio so the developer can click on an error and be taken to the appropriate line of code (quite slick, actually). The build server uses GCC and cross-compiles all of our Solaris builds.
But even after we've done all this, I can't help but sigh whenever I start to write code in Visual Studio. I click a file, start typing, and VS chugs to catch up with me.
Is there anything more annoying than having to stop and wait for your tools? Are the benefits worth the frustration?
Thoughts, stories, help?
VS chugs to catch up with me.
Hmmm... your machine needs more memory and grunt. I've never had performance problems with mine.
I have about a decade's experience doing exactly what you're proposing, most of it in the finance industry, developing real-time systems for customers in the banking, stock exchange, and stock brokerage niches.
Before proceeding further, I need to confess that all this was done in VS6 + CVS, and of late, SVN.
Source Code Versioning
Developers have separate SourceSafe repositories so that they can store their work and check in packages of work at logical milestones. When they feel they want to do an integration test, we run a script that checks the work into SVN.
Once checked into SVN, a process kicks off that automatically generates the relevant makefiles to compile everything on the target machines for continuous integration.
We have another set of scripts that sync new stuff from SVN to the folders that VS looks after. There's a bit of a gap because VS can't automatically pick up new files; we usually handle that manually. This only happens regularly in the first few days of a project.
That's an overview of how we maintain code. I have to say, I've probably glossed over some details (let me know if you're interested).
Coding
From the coding aspect, we rely heavily on the preprocessor (i.e. #define, etc.) and flags in the makefile to shape the compilation process. For cross-platform portability, we use GCC. A few times we were forced to use aCC on HP-UX and some other compilers, but we did not have much grief. The only thing that is a constant pain is that we had to watch out for thread heap space differences across platforms; the compiler does not spare us from that.
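Concretely, that means guarding platform-specific code behind flags the makefile passes in; a minimal sketch (the PLATFORM_* flag names are made up for illustration):

// The makefile selects the platform, e.g.: CXXFLAGS += -DPLATFORM_LINUX
// (built-in macros such as _WIN32, __linux__ or __hpux work the same way).
#if defined(PLATFORM_LINUX)
static const char* const kPlatformName = "linux";
#elif defined(PLATFORM_HPUX)
static const char* const kPlatformName = "hp-ux";
#else
#error "No platform flag set in the makefile"
#endif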
Why?
The question is usually, "Why the h*ll would you even want such a complicated way of developing?". Our answer is usually another question: "Have you any clue how insane it is to debug a multi-threaded application by examining the core dump or using gdb?". Basically, the fact that we can trace/step through each line of code when debugging an obscure bug makes it all worth the effort!
Plus!... VS's IntelliSense feature makes it so easy to find the methods/attributes belonging to classes. I also heard that VS2008 has refactoring capabilities. (I've since shifted my focus to Java on Eclipse, which has both features.) You'd be more productive focusing on coding business logic rather than devoting energy to making your mind do stuff like remembering!
Also! ... We'd end up with a product that can run on both Windows and Linux!
Good luck!
I feel your pain. We have an application which is "cross-platform": a typical client/server application where the client needs to be able to run on Windows and Linux. Since our client base mostly uses Windows, we work using VS2008 (the debugger makes life a lot easier); however, we still need to perform Linux builds.
The major problem with this was that we were checking in code that we didn't know would build under gcc, which would more than likely break the CI stuff we had set up. So we installed MinGW on all our developers' machines, which allows us to test that the working copy will build under gcc before we commit it back to the repository.
We develop for Mac and PC. We just work locally in whatever IDE we prefer, mostly VS but also Xcode. When we feel our changes are stable enough for the build servers, we check them in. The two build servers (Mac and PC) look for source control check-ins, and each does a build. Build errors are emailed back to the team.
Editing files live on the build server sounds precarious to me. What happens if you request a build while another developer has edits that won't build?
I know this doesn't really answer your question, but you might want to consider setting up remote X sessions and just running something like KDevelop (which, by the way, is a very nice IDE) or even Eclipse, which is more mainstream and has a broader developer base. You could probably just use something like Xming as the X server on your Windows machines.
Wow, that sounds like a really strange use for Visual Studio. I'm very happy chugging away in vim. However, the one thing I love about Visual Studio is the debugger. It sounds like you are not even using it.
When I opened the question I thought you must be referring to developing portable applications in Visual Studio and then migrating them to Solaris. I have done this and had pleasant experiences with it.
Network shares.
Of course, then you have killer lag on the network, but at least there's only one copy of your files.
You don't want to hear what I did when I was developing on both platforms. But you're going to: drag-n-drop copy a few times a day. Local build and run, and periodically checking it out on Unix to make sure gcc was happy and that the unit tests were happy on that platform too. Not really a rapid turnaround cycle there.
#monjardin
The main reason we use it is the refactoring/search tools provided through Visual Assist X (by Whole Tomato), although there are a number of other nice-to-haves like IntelliSense. We are also investigating integrations with our other tools (AccuRev, Bugzilla and TotalView) to complete the environment.
#roo
Using multiple compilers sounds like a pain. We have the luxury of just sticking with gcc for all our platforms.
#josh
Yikes! That sounds like a great way to introduce errors! :-)
I've had good experience developing PlayStation 2 code in Visual Studio using gcc in Cygwin. If you've got Cygwin with gcc and glibc, it should be nearly identical to your target environments. The fact that you have to be portable across Solaris and Linux hints that Cygwin should work just fine.
Most of my programming experience is on Windows and I'm a big fan of Visual Studio (especially with ReSharper, if you happen to be doing C# coding). These days I've been writing an application for Linux in C++. After trying all the IDEs (NetBeans, KDevelop, Eclipse CDT, etc.), I found NetBeans to be the least crappy. For me, the absolute minimum requirements are that I be able to single-step through code and that I have intellisense, ideally with some refactoring functions as well. It's amazing to me how today's Linux IDEs are not even close to what Visual Studio 6 was over ten years ago. The biggest pain point right now is how slow and poorly implemented the intellisense in NetBeans is: it takes 2-3 seconds to populate on a fast machine with 8GB of RAM. Eclipse CDT's intellisense was even laggier. I'm sorry, but a 2-second wait for intellisense doesn't cut it.
So now I'm looking into using VS from Windows, even though my only build target is Linux...
Chris, you might want to look at the free build automation server CruiseControl, which integrates with all the main source control systems (svn, tfs, sourcesafe, etc.). Its whole purpose is to react to check-ins in a source control system. In general, you configure it so that any time anyone checks code in, a build is initiated and (ideally) unit tests are run. For some languages there are great plugins that do code analysis, measure unit test code coverage, etc. Notifications about successful/broken builds are sent back to the team.
Here's a post describing how it can be set up for C++: link (thoughtworks.org).
I'm just getting started with converting from a Linux-only simple config (NetBeans + SVN, with no build automation) to using Windows VS 2008 with a build-automation back-end that runs unit tests in addition to doing builds on Linux. I shudder at the amount of time it's going to take me to get all that configured, but the sooner the better, I guess.
In my ideal end state I'll be able to auto-generate the NetBeans project file from the VS project, so that when I need to debug something on Linux I can do so from that IDE. VS project files are XML-based, so that shouldn't be too hard.
If anyone has any pointers for any of this, I'd really appreciate it.
Thanks,
Christophe
You could have developers work in private branches (easier if you're using a DVCS). Then, when you want to checkin some code, you check it into your private branch on [windows|unix], update your sandbox on [unix|windows] and build/test before committing back to the main branch.
We are using a similar solution to what you described.
We have our code stored on the Windows side of the world, and UNIX (QNX 4.25 to be exact) has access through an NFS mount (thanks to UNIX Services for Windows). We ssh into UNIX to run make and pipe the output into VS. Accessing the code is fast; builds are a little slower than before, but our longest compile is currently under two minutes, not a big deal.
Using VS for UNIX development has been worth the effort to set it up, because we now have IntelliSense. Less typing = happy developer.
Check out Final Builder (http://www.finalbuilder.com/). Select a version control system (e.g. CVS or SVN; to be honest, CVS would suit this particular use case better by the sounds of it) and then set up build triggers in FinalBuilder so that check-ins cause a compile and the results are sent back to you.
You can set up rulesets in FinalBuilder that prevent you checking in / merging broken code into the baseline or certain branch folders but allow it to others (we don't allow broken commits to /baseline or /branches/*, but we have a /wip/ branching folder for devs who need to share potentially broken code or just want to be able to commit at the end of the day).
You can distribute FB over multiple "build servers" so that you don't wind up with 20 people trying to build on one box, or waiting for one box to process all the little bitty commits.
Our project has a Linux-based server with Mac and Win clients, all sharing a common codebase. This set up works ridiculously well for us.
I'm doing the exact same thing at work. The setup I use is VS for Windows development, with a Linux VM running under VirtualBox for local build / execute verification. VirtualBox has a facility where you can make a folder on the host OS (Windows 7 in my case) available as a mountable filesystem in the guest (Ubuntu LTS 12.04 in my case). That way, after I start a VS build, and it's saved the files, I can immediately start a make under Linux to verify it builds and runs OK there.
We use SVN for source control, and the final target is a Linux machine (it's a game server), so it uses the same makefile as my local Linux build. That way, if I add a file to the project or change a compiler option (usually adding or changing a -D), I make the modifications initially under VS and then immediately change the Linux makefile to reflect the same changes. Then when I commit, the build system (Bamboo) picks up the change and does its own verification build.
Hard-earned experience says this is an order of magnitude easier if you build like this from day one.
The first project I worked on started as Windows-only; I was hired to port it to Linux, since they wanted to switch to a homogeneous server environment and everything else was Linux. Retrofitting a Windows project into this sort of setup was a fairly major expenditure of effort.
Project number two was done as a "two-system build" right from day one. We wanted to retain the ability to use VS for development/debugging, since it is a very polished IDE, but we also had the requirement of a final deployment to Linux servers. As I alluded to above, when the project was built with this in mind right from the start, it was quite painless. The worst part was a single file, system_os.cpp, that contained the OS-specific routines, things like "get the current time since the Linux epoch start in milliseconds", etc.
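As a sketch of what such a file typically contains (the function name here is made up, but the #ifdef split is the technique described):

// system_os.cpp-style sketch: one translation unit hides the OS differences.
#include <cstdint>
#ifdef _WIN32
#include <windows.h>
std::uint64_t MillisecondsSinceEpoch() {  // hypothetical name
    FILETIME ft;
    GetSystemTimeAsFileTime(&ft);
    // FILETIME counts 100 ns ticks since 1601-01-01; shift to the Unix epoch.
    std::uint64_t ticks =
        (std::uint64_t(ft.dwHighDateTime) << 32) | ft.dwLowDateTime;
    return (ticks - 116444736000000000ULL) / 10000;
}
#else
#include <sys/time.h>
std::uint64_t MillisecondsSinceEpoch() {
    timeval tv;
    gettimeofday(&tv, 0);
    return std::uint64_t(tv.tv_sec) * 1000 + tv.tv_usec / 1000;
}
#endif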
At the risk of getting a little off-topic: we also created unit tests for this, and having unit tests for the OS-specific pieces gave us a great deal of confidence that they worked the same on both platforms.