How do I create a TFS2017 Build Task equivalent to Visual Studio's Web Deploy Publish method?

I am trying to complete Continuous Integration/Continuous Deployment automation for a web application project. I have been helped by a series of SO posts link1, link2, link3 and things are now running, except that the upload to the hosting server takes longer than it needs to; I currently upload all files instead of just the ones that changed.
When creating a TFS2017 Build (or Release) there are many Task options, including some from the marketplace. I'm referring to, in this case, the dialog for a Build as shown below:
I'm currently using a PowerShell script, which seems a little archaic and inefficient as noted above. Do any of the tasks available to us mimic the Visual Studio 2017 Web Deploy publish method, which runs quite nicely and quickly? If not, what can I use for an 'intelligent' upload process that checks whether or not a file must be uploaded?

Unfortunately, there is currently no build task that mimics the Visual Studio 2017 Web Deploy publish method.
Publishing through the VS IDE dynamically checks which files need to be uploaded.
A TFS build task or PowerShell script will not do this; it simply copies all the files you point it at. I'm afraid there is no built-in workaround for an 'intelligent' upload process that checks whether or not a file must be uploaded, since we don't know exactly how the VS IDE does this.
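That said, Visual Studio's Web Deploy publish is driven by the Web Deploy tool (msdeploy.exe), and msdeploy's sync verb only transfers files that differ between source and destination. If Web Deploy is installed on both the build agent and the hosting server, one thing worth trying is calling msdeploy directly from a Command Line or PowerShell task after the build. This is only a rough sketch; the paths, site name, and credentials below are placeholders:

    :: Placeholders: adjust the source folder, site name, server URL and credentials.
    :: Add -whatif first to preview which files would actually be transferred.
    "C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe" ^
      -verb:sync ^
      -source:contentPath="$(Build.ArtifactStagingDirectory)\MyWebApp" ^
      -dest:contentPath="Default Web Site/MyWebApp",computerName="https://myhost:8172/msdeploy.axd?site=Default Web Site",userName="deployUser",password="********",authType="Basic" ^
      -allowUntrusted

Because the sync is differential, unchanged files are skipped, which is the behavior you get from the IDE's publish.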

Related

Visual Studio 2017 2 users on one desktop

I have the following problem: I have a laptop with two different user accounts (UserA & UserB; they belong to the same person but have different privileges).
UserA: can build the solution and run tests locally (Everything is working as expected)
UserB: has the rights to publish to the specific network drive, but cannot build the project. The reason is that this Windows user does not have the correct proxy settings configured, which is why the NuGet packages cannot be downloaded.
I cannot change the proxy settings for UserB right now (it depends on internal processes, which might take a while).
However, I would have expected that when I build the solution with UserA, UserB would not need to build it again and could just publish it. This does not work: as soon as I want to publish, VS tries to rebuild the solution, which fails because some dependencies are missing and cannot be loaded.
Is there any solution to this problem? I tried to research it, but I was not really sure which keywords to search for.
Edit: I have now added the domain to my UserB account, but I still have some other problems. However, I found that this article is closely related to my issue: Unable to launch Visual Studio 2015 as a different user.
When I start Visual Studio from the command line with the arguments mentioned there, more things work; however, I am no longer able to connect to my database using Integrated Security=true in the connection string. At least I can build now. Deploying also works, but I get a 500 server error when I try to connect to the resource.
Edit2: I needed to add the domain to my user when opening Visual Studio with the command mentioned in the link above. This fixed my problem with connecting to the database.
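For reference, assuming the linked article's fix is the usual runas-based approach, the command that corresponds to "adding the domain to my user" would look roughly like this (the VS installation path and DOMAIN\UserB are placeholders):

    :: Launch devenv with UserB's domain credentials for network access only;
    :: the installation path and account name below are placeholders.
    runas /netonly /user:DOMAIN\UserB "C:\Program Files (x86)\Microsoft Visual Studio\2017\Professional\Common7\IDE\devenv.exe"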
Actually, I no longer need an answer to my initial question, but I will not delete it, because maybe someone can provide an answer for a person who has the same question later on.

Want to make a Job on Hudson C/C++

I hope to find someone here who has experience with Hudson and its features.
I have installed Hudson, which did not reveal any problems. Now I want to create a new job for a project I'm developing in C/C++.
In addition, I am working with Subversion (SVN), and this is where I run into the first error: Hudson does not find my SVN repository. It says that I need authentication. As far as I have learned, I can authenticate in Hudson, but that does not work.
Maybe one of you knows how to set up such a project.
These are the things the Hudson job should do:
Hudson should delete my project on my computer (locally).
Then Hudson should access my SVN and check out the project from there.
Hudson should then compile the whole thing (ideally with the Visual Studio 2008 C/C++ compiler). The compiler creates a *.exe file.
Hudson should then start the project from that *.exe file and run the program.
Last but not least, Hudson should inform the people working on the project via email, whether there was an error or everything went fine.
That is what I am hoping to get out of Hudson; otherwise it is not of much use to me. I know that I can do all of this via a batch file, but that is not my goal. I want Hudson to automate this so that my builds/tests can start daily at midnight.
Do you think my requirements for Hudson are too high?
I would be very grateful for your help, as I have been stuck for days.
Here is a "basic" Hudson job
Create a new free-style software project job.
Configure that job.
(Optional) Configure triggers, such as "timer", "SCM polling", or others.
(Optional) Under Source Code Management section, select your SCM source and configure your repositories and local workspace
Under Build section, select Add build step and select:
Execute Shell if on *nix
OR
Execute Windows Batch Command if on Windows
OR
Pick whatever build-step plugin you are using.
(If using either of the "execute" build steps) Write your build/make/compile command as you would from command line.
(If using another plugin build step) Configure the plugin options according to your requirements.
(Optional) Archive the artifacts of the build with Archive the artifacts under Post-build Actions
(Optional) Execute other post-build actions
(Optional) Send out an email
Now to address your specific scenario. First things first: your question is too broad and may get locked. Don't get discouraged if that happens; create a separate question for each item individually. I cannot cover all of these items in detail, but I will give you an overview.
The SCM part
Based on your previous question, No Credentials to try in Hudson, I am now guessing that you are not providing Hudson with an HTTP URL to your SVN server, but trying to give it your local workspace location... Please do the command-line check that I asked for in that question.
You need to provide it with a proper HTTP server URL. Hudson will check out the project from the SVN URL you provided, into what is called a Workspace. The location of the workspace can differ based on your Hudson configuration, but it is a folder inside the Hudson installation that is dedicated to the job. It can be referenced from within the job through the %WORKSPACE% environment variable.
There are ways to use a different workspace location, but that is outside the scope of this overview. The whole SCM part is also optional; you can rely on the existing file system, but this is not a good approach, and again, it is out of scope of this overview.
The Build step
After Hudson has checked out/updated the Workspace from your SVN, the build step comes next. Hudson can do Execute Windows Batch Command by default. It can also Invoke Ant by default. (It can also do Maven, but that is not applicable to your situation.)
To do other types of builds, you need a Build Wrapper plugin. In your particular case, the MSBuild plugin is probably what you want. I've never used MSBuild, so I cannot give you details. Again, if you have a specific question on how to use the MSBuild plugin, you should probably create a separate question with specific issues.
So, using either Execute Windows Batch Command or the MSBuild plugin, configure your build step.
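For example, if the job should drive the Visual Studio 2008 compiler, a plain Execute Windows Batch Command step could look roughly like this. This is only a sketch: the solution name and configuration are placeholders, and it assumes VS 2008 is installed on the Hudson machine (%VS90COMNTOOLS% is the environment variable set by the VS 2008 installer):

    :: Build the checked-out solution in the job's workspace using the VS 2008 command-line driver.
    :: MyProject.sln and "Release|Win32" are placeholders for your own solution and configuration.
    "%VS90COMNTOOLS%..\IDE\devenv.com" "%WORKSPACE%\MyProject.sln" /Build "Release|Win32"

If devenv returns a non-zero exit code, Hudson will mark the build as failed.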
Running the exe???
This is very vague. You want to start the .exe and then what? Will it quit, and do you need an exit code? Do you want to see it on the screen? Again, this is very broad and deserves a separate question (or read existing questions). If you just want to make a call to the .exe, you can configure a second Execute Windows Batch Command step and type call path\to\yourfile.exe there. But most likely you will not see it on screen. Read my answer here, Open Excel on Jenkins CI, for details on launching an .exe from Hudson/Jenkins so that it is visible on screen.
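If the program just needs to run and report success or failure, the second batch step could be as small as this (the relative path to the .exe is a placeholder for wherever your compiler puts it):

    :: Run the freshly built program from the workspace and fail the build on a non-zero exit code.
    call "%WORKSPACE%\Release\MyProject.exe"
    if %ERRORLEVEL% neq 0 exit /b %ERRORLEVEL%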
Email
If you want a simple email, Hudson Post-Build actions has a way to send an email. For better customization options, you would want Email-Ext plugin. Once again, if you need details on how to use the email-ext plugin, create a new question (after searching existing questions first), as this is too much to cover in one question.
Conclusion
Your requirements are not too high, but Hudson is not a magic tool that will do the work for you. You still need to configure every step of it. And unless you have a Maven-based project (which integrates very well with Hudson), a lot of actions will need to be done through Execute Windows Batch Command and scripting of your own.

What is the best way to set up your development environment for Sitecore

The general guidance appears to be to install Sitecore into one folder, e.g. D:\Websites\MyWebSite, and then create your Visual Studio project in a separate folder, e.g. C:\Projects\MyWebProject. You would then publish your custom code into the Sitecore folder from Visual Studio (this video explains what I'm describing, https://www.youtube.com/watch?v=i3Mwcphtz4w, around 13 minutes in).
I have the following questions:
Do people only store their Visual Studio project in source control and not the Sitecore code?
The publish option from VS into the Sitecore folder only has options for adding files or deleting anything not in the VS project. How would files removed from the VS project ever get deleted without doing it manually?
We use web-deploy to publish sites to staging and live environments. In this scenario would you publish from your VS project or would you set up a way to publish the Sitecore folder (if so how)?
Is this actually a good set up to have or do you do something different?
I did a lot of research on this when we started Sitecore development a couple of years ago. I remember reading a post from Sean Kearney that made a lot of sense to me: http://seankearney.com/post/Visual-Studio-Projects-and-Sitecore
We ended up using this approach for both large and small scale projects and it has been great. You will also want to look at a couple of other tools:
Team Development for Sitecore (TDS) from Hedgehog Development (http://www.hhogdev.com/products/team-development-for-sitecore/overview.aspx)
CopySauce from Igloo (http://www.igloo.com.au/blog/copysauce-igloos-sitecore-development-utility/)
SitecoreRocks for Visual Studio
So to answer your questions:
All of your code and some of the Sitecore items are stored in source control. The approach you want to take is to only store new Sitecore items (layouts, sublayouts, templates, etc.) that you create, along with any items you may need to customize. You do not need to store all of the Sitecore source, content or modules... just what you would need to reapply to get a fresh environment up to date. You can manage this manually, but a tool like TDS makes this MUCH easier.
We use TDS to manage the publish/deploy to each of our environments. TDS has configurable settings for handling items that have been deleted, including the ability to move it to the Sitecore recycle bin or simply remove it. You have to be careful with this but it does work.
We use a separate build environment to assemble and run deployments using TDS and Jenkins. Basically, all of the code is retrieved from the source control system to the Sitecore server and built using MSBuild and TDS. In most cases we use a web deploy directly to the Sitecore webroot, but for production we build TDS packages and then run them on each Content Delivery server.
We have used this setup for seven Sitecore projects so far and I am very happy with how it has worked out. We have questioned whether TDS is worth the license fee, but the answer always comes back as a yes. The alternative is not very appealing for our development staff, and the time savings far outweigh the costs.
Everything is stored in source control!... just not always in the same area as it resides on the web server. Storing the Sitecore folder in source control is a good idea, as there are changes you will make as you install modules, but you do NOT add the Sitecore folder as part of your solution/project; it should really just be there to pull from if need be, not something that is actively tracked/monitored.
Once Sitecore is installed, create a new project that resides in the website folder and only add things like the properties folder, layouts, XML and the other folders that you want. I don't even include the app_config in my project. Oh, and to be clear, it's probably best to just keep the Sitecore folder as a sort of reference folder in your source control, but not as part of your website trunk. We have it on the ignore list for the website folder in source control. However, that being said, keep in mind that you will NEED to have it in your website folder.
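As an illustration, assuming Subversion (adjust for whatever VCS you actually use), the ignore can be set as a property on the website folder:

    :: From the website folder of the working copy: ignore the sitecore folder,
    :: then commit only that property change (folder name matches the default install layout).
    svn propset svn:ignore "sitecore" .
    svn commit -m "Ignore the Sitecore folder" --depth empty .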
Technically speaking, the recommended approach is to install Sitecore on the server itself as a standalone empty instance, like using the installer with the client mode (not full), so that you get the framework for an empty site in place. Then you can create the deployment package/packages/whatever and it will all be your own code. You should never really have to mess with changing/removing the base Sitecore file system manually.
See above. Generally speaking, unless you have a reason to do so: install Sitecore as an empty instance... then manage your code/files via deployment and just leave the Sitecore folder files alone. You will have very little reason to ever touch them or the Sitecore folder itself outside of an upgrade.
Adding Sitecore itself to source control should be avoided, since you won't be deploying Sitecore as part of your implementation. For modifications to Sitecore itself, you would need a way of handling those inside your implementation, but the config patch system and other mechanisms provide the means for this.
Redundant files in the web site folder will only be a real problem in your development environment. When publishing to a demo environment or to a live environment, you will only publish the material that you actually want. And the deployment-based setup opens up the possibility of always starting from a clean Sitecore installation - as long as you include your Sitecore modifications as part of your implementation (which is not covered in the video). So there is little risk of this being a problem in real life, and the development method in the video makes eliminating this risk entirely possible.
The Sitecore installation should be handled outside of the deployment of your implementation.
It's a good setup, because the method in the video is the method Sitecore recommends for development, and it is also the method Sitecore teaches to developers in development courses. The most obvious advantages of this method are:
Clean separation between your web site implementation and the Sitecore installation. There is no risk of accidentally mangling the Sitecore installation, and there is no risk of forgetting unmanaged manual modifications to Sitecore that are needed to run your site. This separation is hard to accomplish if you're not using the method in the video.
By using publishing to deploy your implementation, you know that your implementation is deployable on top of a clean Sitecore installation - and works. This means when deploying to a production or demo server in the future, things will work the same and there will be no surprises. This is very hard to be confident about if you're not using the method in the video.
To test your implementation on a different version of Sitecore, you can just deploy to a clean installation of a different version. This is very hard to test if you're not using the method in the video.
There is sample source code for the video on GitHub, along with instructions on how to set up the development environment, including the publishing parts. This sample source directly and indirectly answers some of your questions.

Cruise control with C++

Is it possible to use the CruiseControl tool with a C++ (MinGW) project on Windows? I need to be able to download the latest sources from SVN, build them, and send reports by mail. The application uses an HTTP server (lighttpd) to run.
So the main question is: how do I use it for email notifications?
The problem is that I don't see any destination field in the email tag.
I am interested in sending email notifications after a build that is executed from a batch file.
E.g. in my config file I call a batch file which executes the build; after that I need to send an email notification. How can I do that?
Of course it is possible. There is Java for Windows, there are command-line SVN clients, you can invoke gmake or any other build system you are using along with Cygwin, and there is even support for Visual Studio projects if you need it. A lot of people use CruiseControl for C++ projects, so there is plenty of documentation, tutorials and examples available online.
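To give an idea of how the email part fits in: in CruiseControl's config.xml, the recipients are not an attribute of the email tag itself but child elements such as always and failure. A minimal sketch only; the server names, paths and interval are made up, and the exact attribute names should be checked against the docs for your CruiseControl version:

    <!-- config.xml sketch: hostnames, paths and the interval are placeholders. -->
    <cruisecontrol>
      <project name="myproject">
        <!-- watch SVN for modifications -->
        <modificationset quietperiod="60">
          <svn repositoryLocation="http://svn.example.com/myproject/trunk"/>
        </modificationset>
        <!-- run the existing batch file as the build step -->
        <schedule interval="3600">
          <exec command="C:\builds\myproject\build.bat"/>
        </schedule>
        <!-- recipients go in <always>/<failure> child elements, not a "destination" attribute -->
        <publishers>
          <email mailhost="smtp.example.com" returnaddress="cc@example.com"
                 buildresultsurl="http://buildserver:8080/buildresults/myproject">
            <always address="team@example.com"/>
            <failure address="dev@example.com"/>
          </email>
        </publishers>
      </project>
    </cruisecontrol>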
Perhaps not exactly what you're asking for, but is there anything preventing you from using Jenkins? The people I've talked to who maintain continuous integration for a living and have used both Jenkins and CruiseControl prefer Jenkins. Of course, the bonus with Jenkins is that it's free.
If you can create a script that checks out and builds your project from the command line (in Cygwin's bash, for example), then you can certainly integrate the build into CruiseControl or Jenkins.
I don't know much about CruiseControl, but we use Jenkins a lot, and even though it has bugs that need to be worked around, overall we find it extremely useful for CI and nightly build jobs.
Regarding the email aspect, Jenkins can be configured to watch the SVN log and when a build fails it can send an email to the people that committed changes since the last successful build. This functionality can be enabled with minimal configuration. There are add-ons that allow you to configure the content of the emails as well.

Is there an ideal way to move from Staging to Production for Coldfusion code?

I am trying to work out a good way to run a staging server and a production server for hosting multiple ColdFusion sites. Each site is essentially a fork of a repo, with site-specific changes made to each. I am looking for a good way to have this staging server move code (upon QA approval) to the production server.
One fanciful idea involved compiling each of the sites into EAR files to be run on the production server, but I cannot seem to wrap my head around ColdFusion archives, plus I cannot see any good way of automating this, especially the deployment part.
What I have done successfully before is use Subversion as a go-between for a site: once a site is QA'd, the code is committed and then the production server's working directory has an SVN update run, which triggers a code copy from the working directory to the actual live code. This worked fine, but it has many moving parts and still required some form of access to each server to run the commits and updates. Plus, this worked for an individual site; I think it would be a nightmare to set up and maintain this architecture for multiple sites.
Ideally I would want a group of developers to have FTP access with the ability to log into some control panel to mark a site for QA, and then have a QA person check the site and mark it as stable/production worthy, and then have someone see that a site is pending and click a button to deploy the updated site. (Any of those roles could be filled by the same person mind you)
Sorry if that last part wasn't so much the question, just a framework to understand my current thought process.
Agree with #Nathan Strutz that Ant is a good tool for this purpose. Some more thoughts.
You want a repeatable build process that minimizes opportunities for deltas. With that in mind:
SVN export a build.
Tag the build in SVN.
Turn that export into a .zip, something with an installer, etc... idea being one unit to validate with a set of repeatable deployment steps.
Send the build to QA.
If QA approves, deploy that build into production.
Move whole code bases over as a build, rather than just changed files. This way you know what's put into place in production is the same thing that was validated. Refactor code so that configuration data is not overwritten by a new build.
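As a concrete sketch of the export and tag steps above (the repository URL and tag name are made up):

    :: Export a clean copy of the code (no .svn folders) to become the build artifact
    svn export https://svn.example.com/mysite/trunk build\mysite

    :: Tag the exact revision that went into the build so it can be reproduced later
    svn copy https://svn.example.com/mysite/trunk https://svn.example.com/mysite/tags/qa-build-42 -m "Tag QA build 42"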
As for actual production deployment, I have not come across a tool to solve the multiple servers, different code bases challenge. So I think you're best served rolling your own.
As an aside, in your situation I would think through an approach that allows for a standardized codebase, with a mechanism (i.e. an API) that allows for the customization you're describing. Otherwise managing each site as a "custom" project is very painful.
Update
Learning Ant: Ant in Action [book].
On source control: for the situation you describe, I would maintain a core code base plus an overlay per site. Export the core, then export the site-specific code over it. This ensures that any core updates not overridden by site-specific changes make it into the build.
Call this combination a "build". Do builds with Ant. Maintain an Ant script - or, perhaps more flexibly, an Ant configuration file - per core and site combination. Track the version numbers of core and site as part of a given build.
If your software is stuffed inside an installer (NSIS, the Nullsoft installer, for instance) that should be part of the build. Otherwise you should generate a .zip file (.ear is a possibility as well, but I haven't seen anyone actually do this with CF). The point being: one file that encompasses the whole build.
This build file is what QA should validate. So validation includes deployment, configuration and functionality testing. See my answer for deployment on how this can flow.
Deployment:
If you want to automate deployment, QA should be involved as well to validate it. Meaning QA would deploy/install builds using the same process on their servers before doing a staging-to-production deployment.
To do this I would create something that tracks what server receives what build file and whatever credentials and connection information is necessary to make that happen. Most likely via FTP. Once transferred, the tool would then extract the build file / run the installer. This last piece is an area I would have to research as to how it's possible to let one server run commands such as extraction or installation remotely.
You should look into Ant as a migration tool. It allows you to package your build process with a simple XML file that you can run from the command line or from within Eclipse. Creating an automated build process is great because it documents the process as well as executes it the same way, every time.
Ant can handle zipping and unzipping, copying around, making backups if needed, working with your Subversion repository, transferring via FTP, compressing JavaScript and even calling a web address if you need to do something like flush the application memory or server cache once it's installed. You may be surprised by the things you can do with Ant.
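To make that concrete, here is a hedged sketch of what such a build.xml might look like; the project, property and target names, server, and URLs are all placeholders, and the ftp task needs the Apache Commons Net jar available to Ant:

    <!-- build.xml sketch: names, paths, server and URL are placeholders. -->
    <project name="mysite" default="deploy" basedir=".">
        <property name="dist.dir" value="dist"/>

        <target name="package">
            <!-- zip the site into a single build artifact -->
            <zip destfile="${dist.dir}/mysite.zip" basedir="wwwroot"/>
        </target>

        <target name="deploy" depends="package">
            <!-- push the artifact to the staging server over FTP -->
            <ftp server="staging.example.com" userid="${ftp.user}" password="${ftp.pass}"
                 remotedir="/sites/mysite" binary="yes">
                <fileset file="${dist.dir}/mysite.zip"/>
            </ftp>
            <!-- hit a URL afterwards to flush the application or server cache -->
            <get src="http://staging.example.com/admin/flushcache.cfm" dest="${dist.dir}/flush-response.html"/>
        </target>
    </project>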
To get started, I would recommend the Ant manual as your main resource, but look into existing Ant builds as a good starting point to get you going. I have one on RIAForge, for example, that does some interesting stuff and calls a Groovy script to do some more processing on my files during the build. If you search RIAForge for build.xml files, you will come up with a great variety of them, many of which are directly for ColdFusion projects.