IntelliJ IDEA: how to rebuild groups of modules?

In my IntelliJ IDEA project I've got a couple of modules. Some of them are separate webapps (WARs), some of them are libraries shared by the webapps (JARs).
Imagine for example modules war1 and war2, each dependent on module jar1.
I need to have all these modules in the project because if I break something in jar1 I want to know if war1 AND war2 compile.
Now, if I change something in jar1 (and see that something's not getting updated) I just use 'Rebuild project' - then everything gets rebuilt and this is fine.
But when I change something in war1 I just want to rebuild war1 (or sometimes war1 + jar1). The 'Make module' option does not always work the way I want because it does not seem to clean the output directory.
I put jar1 and war1 into a separate group but I also don't see a 'Rebuild group' feature.
The reason 'Rebuild project' is sometimes not a good option is that it takes a lot of time to rebuild ALL modules. I also don't see any 'Clean output directory' feature (if I had it, I could clean just one module and then make that module).
Thanks in advance for any hints.

There is an option to Rebuild the selected module (you can't rebuild a group of modules). Also note that Make is enough in 99% of cases, so you don't have to Rebuild. However, you may need to rebuild artifacts (Build | Build Artifacts); you can multi-select several artifacts and then rebuild the selection.

Related

Visual Studio 2015 - Pre build event to determine which projects to compile

Motivation
A PreBuild step that disables compilation of redundant projects, for a faster compilation cycle.
Background
I have a VS15 ALL solution that contains many projects.
I have a single project, PreBuild, that all the other projects are dependent on, meaning, this PreBuild compiles first.
In addition, we also have a PostBuild project that does some more work once the binaries are ready.
All projects are configured to build in Release mode (which is desired).
When a team member wants to release some binaries, he hits F7, Build Solution.
Now, the PreBuild activates a separate dedicated process that calculates which projects should be released. The nature of the calculation is irrelevant to this discussion.
Problem
Out of the many, many projects, it is often the case that only a few need to be released. However, once the PreBuild process is done, ALL the projects will compile, which is very time consuming.
Question
Is it possible, after a solution build has started, to change which projects get built and released?
Suggested unwanted approaches
A developer handpicks only the relevant projects and builds only those.
PreBuild Kill & Revive: once the desired projects are calculated, PreBuild kills the VS15 process and launches a cmd build of only the relevant projects.
Suggested approach
Change the ALL.sln file and remove the unwanted projects.
This would work if I changed that file before the build starts, but I'm not sure it would work if the change occurs while the build is already running.
The simplest way I can think of, while still keeping most of the current infrastructure in place: have a dedicated project which invokes the release build (by calculating dependencies and invoking msbuild) and configure VS so that just that project can be selected for a build. All from within your ALL.sln, so the rest of the features remain. Steps:
Get rid of the PreBuild/PostBuild projects. I assume the PostBuild you mention is also meant for the actual release builds; if not, just leave it there. Note that by not requiring all projects to depend on the PreBuild project you already get rid of one maintenance burden.
Add one single project which will do the release building, say ReleaseBuild. Such a name is also better than having PreBuild/PostBuild projects since it clearly states the intent of the project. A Makefile project is suitable, though technically it could be as simple as an msbuild file with just one Build target. Configure the build command line to do whatever is needed, i.e. figure out what to build and then build it. For the sake of an example, say you use PowerShell to do this: you would configure the build command line to be
Powershell -NoProfile -File BuildRelease.ps1 $(Platform)
and BuildRelease.ps1 contains something like
# CalculateMyProjectsForRelease is a placeholder for whatever logic returns the project paths to build
$projectsToRelease = CalculateMyProjectsForRelease
$platform = $Args[0]   # the platform passed in from the build command line via $(Platform)
$projectsToRelease | ForEach-Object { & msbuild $_ "/p:Configuration=Release;Platform=$platform" }
In Configuration Manager, add an extra configuration called Deploy or so. This will be used to select what to build: you probably already have Debug and Release configurations, and those stay in place, simply used to build everything. The idea is that this extra configuration takes care of building the actual release. This is fairly consistent with the standard way of working in VS and easy to discover and understand for newcomers. Using the checkboxes, make it so that when the Deploy configuration is selected only ReleaseBuild is built and none of the others, whereas when Debug or Release is selected the ReleaseBuild project is not built.
To build a release, select Deploy from the configuration drop down menu in the VS toolbar and hit F7 (or whatever way you use to invoke Build Solution). Any build errors/warnings will be parsed and shown as usual in the Error List.
This is also easy to extend: suppose you have a couple of release build variants, just add more configurations like DeployA, DeployB, DeployC and adjust the build command line for each.
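If you ever want to trigger the same release path outside the IDE (from a build server, for example), the Deploy solution configuration can also be invoked with msbuild directly. This is only a hedged sketch: it assumes the solution file is ALL.sln and that the solution platform is named Mixed Platforms; adjust both to match your Configuration Manager settings.
msbuild ALL.sln /p:Configuration=Deploy "/p:Platform=Mixed Platforms"
Because only the ReleaseBuild project is checked for the Deploy configuration, this runs the same BuildRelease.ps1 logic as pressing F7 in the IDE.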

Prevent BizTalk projects from invoking a full rebuild?

I'm on a BizTalk 2013 solution and I'm trying to grow into automated testing. However, when I try to run my tests after changing only the test project, or even just run the tests after changing nothing anywhere, I'm stuck building the same amount of projects that I build when I invoke a full rebuild on the project being tested. This eats up an enormous amount of time, and it's a death sentence for my ability to sell future investments into this type of thing.
Is this a known deficiency with BizTalk, or with its interaction with MSBuild? Is it a known pitfall that I can repair on my end?
EDIT: After reviewing the "possible duplicate" thread, I believe this question to be similar, but distinct. The explanation from the thread highlights the mechanics by which MSBuild determines that a rebuild is necessary, but MSBuild is widely-used technology across all projects in Visual Studio and can differ significantly by project type based on that project type's specific targets import. I've edited the question title to reflect that I want to learn how to prevent this for BizTalk solutions rather than simply asking why it's happening (although knowing why is always helpful).
So, what you're seeing is not a problem with BizTalk (because BizTalk is perfect and wonderful and never has any problems ever...:).
It's actually a behavior of Visual Studio. To note, BizTalk projects are just specialized C# projects.
The best workaround, which I do all the time, is to uncheck the Build and Deploy options for Projects I'm not actively working with in the Solution Configuration. If the Project is not checked for Build, it will not build even when you choose Rebuild Solution.
One possible solution would be to reference not the projects but the DLL files produced by those same projects, already compiled and built.
This way, when building your test project, it would be built against these existing assemblies and hence would not take the time to rebuild those.
You have to make sure however that these DLLs are updated whenever the project behind them also updates. You could do this by rebuilding them, whenever necessary, in a separate Visual Studio instance.
It takes some practice and thinking to make sure you are building against the latest version, but it WILL save you a lot of time.
I've noticed this as well. After turning on diagnostic output for MSBuild, it turned out that the project settings .user files were being modified after the .pdb files. I've tried several ways of resolving this, including changing the modification date on the .pdb file, setting the .user file to read-only, removing (renaming) the .user file, etc.
Unfortunately, the build task for BizTalk will overwrite/recreate/create a new .user file after every build, and I haven't come up with a way to convince MSBuild that it can just ignore the .user file being created anew. Because of that, I'd go with one of the other suggestions here.
Even creating an exclusive lock on the file so that MSBuild can't update it causes a rebuild, since then MSBuild thinks the build is dirty ("Project 'Schemas' is not up to date. Project dirty in MSBuild.")
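For reference, here is a hedged sketch (run from PowerShell; the solution file name is only an illustration) of how that diagnosis can be reproduced from the command line: build with diagnostic verbosity into a file log, then search the log for the lines where MSBuild explains which inputs it considers newer than the outputs.
# Build with a diagnostic file log (the solution name is a placeholder)
msbuild MyBizTalkSolution.sln /v:diag /fl "/flp:logfile=build.diag.log;verbosity=diagnostic"
# Pull out the lines that explain why projects are considered out of date
Select-String -Path build.diag.log -Pattern 'is newer than output', 'not up to date'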

How to recreate a deleted target?

I have deleted my application target and now all my Build options are gone. I cannot run my project because I am missing a target. How can I regenerate it?
You have two options.
The first is DarkDust's suggestion: restore from a backup or an SCM repository if you have them. If you have neither, you must admit you were begging for trouble.
The second is unfortunate but comes with a message of hope. Recreate the target from scratch. Select File > New > New Target from the main menu and select the appropriate target type (a Cocoa Mac OS X application, doc-based, or whatever). With the new target selected, click the Build Phases tab, expand the Compile Sources phase, and drag all your implementation files - .m (and .c and .mm if you have them) - into the list so they're compiled as part of this target. Expand the Link Binary with Libraries phase and add in any frameworks you use. Expand the Copy Bundle Resources phase and drag in your resources (including xibs, credits, InfoPlist.strings, your app icon, etc.). Don't forget to recreate any Copy Files build phases you might have set up manually (if you did, you'll already know how). That should do it. The message of hope I mentioned is that you're now familiar with what a target is and all it needs to build your product. It's actually a lot simpler than it appears.
If restoring from a backup or a repository is not an option, and your bundle has many resources, I'd recommend starting a new Xcode project from scratch and importing the source files and resources into it.
Create a new Xcode project of the same type and info as your project.
Delete this new project's ViewController and AppDelegate source files, copy your source files into the new project's folder, then import them into the Xcode project.
Add any frameworks you've used.
Import the resources (images, sounds, plists, etc) into the project.
It might take longer than recreating a target and adding things to it, but you're less likely to make mistakes along the way, and you'll ensure that everything is properly added to the target.
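As a quick sanity check once the new project is in place, you can list its targets and build configurations from a terminal; the project name below is only an example.
xcodebuild -list -project MyApp.xcodeproj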
If you have version control, then the following method works. I used another approach when there weren't many changes to lose: executing "git checkout ." in the terminal. The changes are reverted and your target is restored.
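If you only want to recover the project definition while keeping other local edits, you can limit the checkout to the project file itself. The path below is illustrative, but the project.pbxproj file inside the .xcodeproj bundle is where the targets are defined.
git checkout -- MyApp.xcodeproj/project.pbxproj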

Allowing developer-specific settings in VS2008 Native C++ projects

Is it possible to combine the following properties, and if so, how?
Store in our version control system some Visual Studio 2008 native C++ (.vcproj) project files for the developers on our team who use this IDE.
Allow some of those developers to tweak their projects (e.g. using debug version of third-party libraries instead of the usual ones).
Make sure these modifications are done in files that are not versioned.
In other words, I would like to allow developers to tweak some settings in their projects without risking that these changes are committed.
An 'optional VSPROP' file approach seems doomed to fail, as VS2008 refuses to load projects that refer to non-existent VSPROP files...
Any other suggestion? Is this possible with VS2010?
You may not be able to do this directly, but using a tool that generates the vcproj files, such as CMake, would let you. Script all your projects with CMake and conditionally include a config file (if present, for example) that developers can change in their own setup.
Branches could solve this problem: you create a branch, play with different versions of the third-party libraries, and merge the changes back to trunk if the results are good.
Well, as a preliminary solution you could put the project file into something like .hgignore or .gitignore after its initial commit.
This way changes to it can't be done accidentally.
At least that's how I handle .hgignore itself.
We use a versioned "common_configuration" folder, and a script which copies project files from this "common_configuration" folder to the "project" folder.
We have another script to copy the configuration back, so the developers need to take a conscious action to commit their local changes to the global version control system.
It partly answers your needs:
The upside: we have a way to keep a common configuration for everyone, and no accidental committing of local configuration.
The downside: blindly copying the files clobbers local changes. We live with it. We could write a more clever merge tool (using diff, or XML-specific manipulations), but we don't want to spend too much time on supporting the deployment tools.
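As an illustration only (the folder layout and the .vcproj pattern are assumptions, not the actual setup described above), the two scripts can be as small as a pair of PowerShell one-liners:
# Push the shared settings onto the working copy (run before building)
Copy-Item -Path .\common_configuration\*.vcproj -Destination .\project\ -Force
# Pull local changes back so they can be reviewed and committed (run deliberately)
Copy-Item -Path .\project\*.vcproj -Destination .\common_configuration\ -Force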

How to unit test WIX merge modules?

I am building merge modules with WiX. The batch files which call the WiX tools to generate the merge modules from *.wxs files are run by my daily build.
I am trying to figure out how I could automate the testing of these merge modules. Things I would like to test are whether the merge module installs the required files, whether the versions of the files are correct, etc.
One idea I have is to write a script (may be VB Script) to install the merge module at a temporary location and check if it has installed everything correctly. However, I am not sure if this is a correct way to do it.
Are there any standard ways of writing unit tests for merge modules? Any ideas around how to go about this are welcome.
When you test an installer, the primary goals are to verify that:
1. When installing the msi file, msiexec reports success (i.e. return code 0).
2. After running the installer, your application can be started and works as expected.
The first point should be easy enough to do, though if you want to keep the test automated you can only test the non-interactive install. Here is an example how to do that from a batch file or on the command line:
msiexec /i myinstaller.msi /passive || echo ERROR: non-zero return code!
The second point is a bit more complicated. I think the best way to achieve this is to build some integration tests into your application, and invoke those tests after the install:
"c:\program files\mystuff\app.exe" /selftest || echo ERROR: non-zero return code!
In your case you're testing a merge module rather than an entire installer. The same approach can be used. You just will have to do the additional work of building a "self test" application and its installer which consumes your merge module.
I've often thought about this but haven't come up with anything that I like. First, I would call this integration testing, not strictly unit testing. Second, the problem of "right files" and "right versions" is difficult to define.
I'm often tempted to say WiX/MSI is just data that defines what the installer is to do. It's declarative in nature and therefore, by definition, correct. It's tempting to want to create yet another set of data that cross-checks the implementation of the installer, but what exactly does that accomplish that the first data set didn't already represent? It's hard enough sometimes to maintain what files go into an application, let alone to maintain a second list of files.
I continue to think about this and wonder if there's an approach that would make sense but at this point I just do my normal MSI validation.
You could try to use scripts or another small console program that will do the job, just as you suggested.
With your build process you could also build a basic setup that just uses the merge module. Your script could just install this, run the other script or console app that will check if all the files are in place, that they have the correct version, that all the registry keys are installed, etc. After all the output is gathered your main script would just uninstall everything. You could also run the check program after uninstalling to be sure that everything is gone and that the uninstall works correctly. I would recommend this if, for example, you have custom actions set for install and uninstall.
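Here is a hedged PowerShell sketch of that cycle; the MSI name and the verification script are placeholders for whatever your build produces and whatever checks you decide to write.
# Silent install of the test setup that consumes the merge module, with a verbose log
Start-Process msiexec.exe -ArgumentList '/i','TestSetup.msi','/qn','/l*v','install.log' -Wait
# Check files, versions and registry keys (placeholder verification script)
.\Verify-Install.ps1
# Silent uninstall, then confirm everything was removed again
Start-Process msiexec.exe -ArgumentList '/x','TestSetup.msi','/qn','/l*v','uninstall.log' -Wait
.\Verify-Install.ps1 -ExpectAbsent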
Ideally this whole install / uninstall process should be done on a separate machine, or a virtual one, in order to avoid messing up the build server.
You'll have some work to do with all these scripts, but once you have them you'll be able to reuse them with little change for any future merge module projects or plain setup projects.
Hope this helps,
Adrian.