I use Visual Studio 2008, and I would like to know if it is possible to run scheduled builds with it.
I have several projects I work on, and every time I change one, it may impact the others.
I would like to run a batch build on all my projects every night, so that when I come in the morning I can see what compiled (and what did not).
Is there a way to schedule such a task in Visual Studio?
EDIT:
I probably should have mentioned that I need a lightweight solution, so TFS and the like are a bit too complicated for me to put in place.
A simple scheduled task can do the trick.
The best way is to set up a continuous integration server, e.g. CruiseControl.NET, Hudson, TeamCity, etc. There are endless debates here about which is the best one. These can be configured to take your source from source control and build it as scheduled jobs. Obviously, if you're using TFS for source control then you can use TFS to do this, as in Nawaz's answer.
Both @Rup and @Nawaz have the best solutions. If you're looking for something really low-tech, though, you could consider simply creating a Windows scheduled task that does a command-line build of your code on a nightly basis.
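If you go that route, a rough sketch of the batch file and the scheduled task might look like the following (the Visual Studio 2008 install path, solution paths, log locations, and schedule are placeholders you would adjust; devenv.com is assumed to be present from a default VS 2008 install):

rem nightly-build.cmd -- hypothetical nightly command-line build of several solutions
set DEVENV="C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\devenv.com"

rem Build each solution in Release and write a per-solution log to check in the morning.
%DEVENV% "C:\Projects\ProjectA\ProjectA.sln" /Build "Release" /Out "C:\Builds\ProjectA.log"
%DEVENV% "C:\Projects\ProjectB\ProjectB.sln" /Build "Release" /Out "C:\Builds\ProjectB.log"

rem Register the batch file as a nightly Windows scheduled task (here, 02:00 every day):
rem schtasks /Create /SC DAILY /ST 02:00 /TN "Nightly VS2008 Build" /TR "C:\Builds\nightly-build.cmd"

The per-solution log files then show what compiled and what failed when you come in the next morning.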
For that, you need TFS (Team Foundation Server).
See this: How To: Set Up a Scheduled Build in Visual Studio Team Foundation Server
On a project I am working on, I am maintaining some Feature Tests written in SpecFlow. Our team started using Visual Studio 2017 about a year ago, and we finally got around to doing some upkeep on our tests!
Our tests for the project I'm working on were originally written in SpecFlow 2.3.2, and were last updated in Visual Studio 2015.
The SpecFlowSingleFileGenerator is known not to work in VS 2017, so I spent the better part of yesterday changing our suite to use the MSBuildSingleFileGenerator instead, as detailed in this article in SpecFlow's official documentation.
Problem:
Locally, I can build my solution, including the Feature Test project just fine.
However, I keep getting the following error when I try to build the project on our build server:
[exec] C:\CheckoutDirectory\My Awesome Project\packages\SpecFlow.Tools.MsBuild.Generation.2.3.2\build\SpecFlow.Tools.MsBuild.Generation.targets(45,5):
error MSB4036: The "GenerateAll" task was not found.
Check the following:
1.) The name of the task in the project file is the same as the name of the task class.
2.) The task class is "public" and implements the Microsoft.Build.Framework.ITask interface.
3.) The task is correctly declared with <UsingTask> in the project file, or in the *.tasks files located in the "C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\MSBuild\15.0\Bin" directory. [C:\CheckoutDirectory\My Awesome Project\AwesomeProject.FeatureTest\AwesomeProject.FeatureTest.csproj]
I should point out that our team has no experience writing MSBuild tasks, as hitherto we haven't needed to; we use NAnt build scripts on TeamCity to manage our build work. It's clear that the error message would be helpful...if we knew literally anything about it.
Now, normally the correct answer would be: Google it. I did that, and this specific error has no pertinent results.
Additionally, this is blocking my team, since we need our build to work. I don't have the time to do the research and education necessary to properly understand how MSBuild works. That will have to come later.
Question:
Bearing in mind that SpecFlow has broken our process, and our team's lack of knowledge about the MSBuild system: what can I do to get around the "GenerateAll" task was not found error?
Secondary Question:
I'm also open to lateral thinking. Is there some way to hack either VS 2017 or SpecFlow so that the SpecFlowSingleFileGenerator and VS 2017 are "compatible" with each other? The objective here is NOT to avoid making changes, but to control them. I need a path towards transitioning from the old file generator to the MSBuild generation system.
Additional Information:
So, I did some digging, and I found a place where "GenerateAll" is being called in the SpecFlow.Tools.MsBuild.Generation.targets file:
<Target Name="UpdateFeatureFilesInProject"
        DependsOnTargets="BeforeUpdateFeatureFilesInProject"
        Inputs="@(SpecFlowFeatureFiles)"
        Outputs="@(SpecFlowFeatureFiles->'%(RelativeDir)\%(Filename).feature.cs')">
  <GenerateAll
      ShowTrace="$(ShowTrace)"
      BuildServerMode="$(BuildServerMode)"
      OverwriteReadOnlyFiles="$(OverwriteReadOnlyFiles)"
      ProjectPath="$(MSBuildProjectFullPath)"
      ForceGeneration="$(ForceGeneration)"
      VerboseOutput="$(VerboseOutput)"
      DebugTask="$(SpecFlow_DebugMSBuildTask)">
    <Output TaskParameter="GeneratedFiles" ItemName="SpecFlowGeneratedFiles" />
  </GenerateAll>
</Target>
I've confirmed that this targets file is being copied out to the build server, which makes the situation even more mysterious. The NuGet package appears to be pulled down faithfully, so I can't figure out why my local copy behaves differently from the copy on the build server.
I am not sure where you found this statement:
The SpecFlowSingleFileGenerator is known to not work on VS 2017
The SpecFlowSingleFileGenerator works in VS2015, VS2017 and VS2019. We see it as a legacy feature, but it's still there. As of a few weeks ago it is disabled by default, but you can enable it in the options.
It works for SpecFlow >= 2.3.2 and 2.4. For SpecFlow 3 you have to use the MSBuild integration. There are some problems with older versions of SpecFlow, but it can work with them too; it depends on your setup.
About your MSBuild error:
The MSBuild task for SpecFlow < 3.0 is in specflow.exe. Is it on your build server?
It is part of the SpecFlow NuGet packages. Normally you get this kind of error if MSBuild can't find the assembly where the task is.
For "debugging" problems with MSBuild, I can highly recommend to use the MSBuild Structured Log Viewer (http://msbuildlog.com/). With it, it makes it easy so see what is happening in your build.
We have an example of MSBuild code-behind generation with SpecFlow 2.3.2 here: https://github.com/techtalk/SpecFlow-Examples/tree/master/MSBuild/OldCSProj_SpecFlow232
You could compare your project with this example.
Full disclosure: I am one of the maintainers of SpecFlow.
The Question
Is it possible to build the Intellisense database for a solution (C++) at the command line?
The Context
I work on a fairly large C++ codebase. The code takes a while to compile so I set up a local nightly automated build that I can grab any time I want to start a new task. I would like to create the Intellisense database for the solutions of the codebase during this nightly build. We are using Visual Studio 2013.
According to this answer on MSDN, it is (sadly) not possible:
It is not possible to create this file without running VS, it is not a scenario the team designed/planned for.
Ryan
I'm developing a project in Visual Studio 2012 with some other developers. On my project, everyone is connected through TFS 2012.
I'm writing a lot of unit tests that test the classes in my project.
In a perfect world, I wish to:
Run all my unit tests every night. It would be nice if I could control the scheduling (e.g. run X tests on Sunday and Y tests on Monday).
Get a report every day of which tests passed/failed. It would be nice to have it emailed to me.
Browse run history: view the run history of a specific test.
Is that possible?
I've already looked at the "Run tests after build" option and found it impractical. It's bothersome that after every [Build -> Build Solution] it starts to run 100+ unit tests.
I would suggest you use something like TeamCity. It is a continuous integration server that will build and run your tests every time you check in.
Jenkins is a continuous build server, which sounds like it would meet your needs. You may have to install some plugins, but it can do what you are talking about.
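Whichever CI tool you pick, the nightly job ultimately just has to invoke the test runner from the command line. A minimal sketch, assuming a default Visual Studio 2012 install of vstest.console.exe and a placeholder path for the test assembly:

rem Hypothetical nightly test step; the CI server (or a plain scheduled task) runs this command.
"C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\CommonExtensions\Microsoft\TestWindow\vstest.console.exe" "C:\Builds\MyProject.Tests.dll" /Logger:trx

The /Logger:trx switch writes a results file that TeamCity or Jenkins can pick up for reporting and notifications, and keeping the job's build records gives you the per-test run history.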
"local continuous integration system" may not be the correct term, but what I'm hoping to find is an continuous integration system that can be configured to monitor changes to local files (C++ files in particular) and 1) try to compile the affected object files (stopping on first failure), and if successful and no new source file changes 2) link the affected binaries, and if successful and no new source file changes 3) run affected tests.
By monitoring changes to local files, I do not mean monitoring commits to a revision control system, but rather the state of local files as they are saved. Ideally the system would provide integrations into source editors so it could monitor changes in the editor that haven't even been saved to disk yet.
Ideally it would also provide a graphical indication (preferably on Windows 7) of current and recent status that quickly allows drilling into failures when desired.
The closest thing I found was nose, as described here, but that only covers running Python tests, not building C++ files.
The closest things to what you are looking for are CDash and the Boost test bench; I think a tool like the one you are looking for will never exist for C++, because compiling the project after every single-file edit is just a waste of time in a productive C++ workflow.
Continuous integration is a rising concept today, so you are not alone here.
Assuming you are developing on Windows with Microsoft Visual Studio, you may consider Microsoft's Visual Studio Team Foundation Server (TFS) (formerly Visual Studio Team System). That will give you source control AND build automation in one package, with great integration with Microsoft products, of course (I think there is a free version for MSDN subscribers).
If you are not keen on Microsoft products, or are just looking for build automation, I would recommend a great open-source continuous integration tool: Jenkins CI.
Good luck!
I would look at Jenkins CI - it is a good tool, works on any platform, and can be configured to do almost anything. I used it to run Python code that talked to a mobile phone, made calls and recorded those calls (and tested the "quality" of the call, although my project never got the £xxxx real quality software, as we were just showing a concept), and then Jenkins would produce graphs of "how well it worked".
You can also do the "chaining" you describe - it would discover that your source has changed and try to build it [generally this is done using make, so it would automatically stop at the first file that fails (although there could be hundreds of errors in that one file!)]. A successful compile and build then chains to running tests. I'm not entirely sure how you determine which tests are "relevant"; if your test cycle isn't enormous, I'd run them all!
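If you only want a crude local approximation of that chain on Windows, a polling loop that leans on MSBuild's incremental build and only runs the tests after a successful build is one option. This is a hedged sketch, not a real file watcher; all paths and names are placeholders, and it should be run from a Visual Studio command prompt so msbuild is on the PATH:

rem watch-build-test.cmd -- hypothetical poll/build/test loop
:loop
rem Incremental build: only projects whose inputs changed are recompiled.
msbuild "C:\src\MySolution.sln" /m /v:m /nologo
if %errorlevel% equ 0 (
    rem Build succeeded, so chain into running the test executable.
    "C:\src\Tests\bin\Debug\MyTests.exe"
)
rem Wait 30 seconds before polling again.
timeout /t 30 /nobreak >nul
goto loop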
I'm doing research on doing builds using TFS 2010 Team Build. At the moment we are using NAnt scripting to build .NET code as well as launching external processes to build VB6 projects. We are slowly converting our VB6 over to .NET but would like to gain the benefits of automating as much as possible. How can we launch an external process in Team Build to compile our VB6 projects? Is this possible? Any suggestions would be greatly appreciated.
Thanks,
Jim
Have a look at the InvokeProcess activity; it should be able to do what you want. For relatively simple tasks like compilation it should be enough. If you need to capture output in a more detailed way (e.g. publish test results from external tests), you may need to write your own custom activity.
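As a sketch of what the InvokeProcess activity would actually run for a VB6 compile (the VB6 install path, project path, and log path below are placeholders), the VB6 IDE can build a project from the command line:

rem Hypothetical VB6 command-line compile, invoked by InvokeProcess (or any other runner).
"C:\Program Files (x86)\Microsoft Visual Studio\VB98\VB6.exe" /make "C:\src\MyVb6Project.vbp" /out "C:\Builds\MyVb6Project-errors.log"

InvokeProcess exposes the process output and exit code, so the workflow can fail the build when the compile does not succeed.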
You can use the VB6 activity from the Community TFS Build Extensions.