Continuous Integration

Paul M. Duvall, Steve Matyas, Andrew Glover


Shows how the practice of Continuous Integration (CI) benefits software development by improving quality and reducing risk.


Mentioned in questions and answers.

I've been thinking about CI and automatic builds a lot lately and am interested in knowing if there are any best practices for setting up and maintaining a continuous integration environment. Do you keep all your CI related files checked in with your project source? How would you usually structure your CI and build files? Any tips are welcome!

If you haven't already, definitely check out the Continuous Integration book from the Martin Fowler series, by Duvall/Matyas/Glover. It covers all of the questions you ask in depth with solid examples.

I'm considering writing my own delivery code using PowerShell and/or C#, maybe shelling to NAnt or MSBuild.

  1. Why should I not go this way? Is it really such a hard endeavor compared to using NAnt or MSBuild?
  2. Any good, modern book that can help?
  3. Any better ideas?

Background (P.S. This is a religious issue for some. No insult intended):

A one-person shop, multiple exploratory projects. Like most of us: currently Windows and ASP.NET, considering mobile and cloud.

I've started dabbling with NAnt and have tried to follow Expert .NET Delivery Using NAnt and CruiseControl.NET. The whole issue of "delivery" was put on ice, and now it's time to "defrost" it. However, I'm not sure which way to go. From what I've learned:

NAnt is showing its age. It's clumsy: it's much harder to understand and maintain than a modern OO language such as C#. Even after following the book, it seems strange to work in an arcane environment where what you want executed is XML, and looping and inheritance are (as far as I remember from before the "ice age") hard to impossible.
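
For a taste of what that XML looks like, here is a minimal, hypothetical NAnt build file (project name, layout, and paths are made up for illustration, not taken from any real project):

    <?xml version="1.0"?>
    <!-- hedged sketch: a hypothetical NAnt build file; layout is assumed -->
    <project name="MyApp" default="build">
      <target name="clean">
        <delete dir="build" failonerror="false" />
      </target>
      <target name="build" depends="clean">
        <!-- compile all C# sources under src into a single library -->
        <csc target="library" output="build/MyApp.dll">
          <sources>
            <include name="src/**/*.cs" />
          </sources>
        </csc>
      </target>
    </project>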

MSBuild is MS-specific. I'm not even sure it supports non-MS environments, and Team Foundation Server is expensive.

Even so, they both somehow seem to provide value, because in my SO searches I haven't seen anybody using their own custom build software. Still, I don't understand why not use C# and simply call NAnt and/or MSBuild tasks as needed.
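
For what it's worth, driving MSBuild from C# is possible via the Microsoft.Build assemblies. A minimal sketch of the idea (the .csproj path is an assumption, and error handling and logging are omitted):

    // hedged sketch: building an MSBuild project from C# code
    // (requires a reference to Microsoft.Build, v4.0)
    using Microsoft.Build.Evaluation;

    class BuildDriver
    {
        static void Main()
        {
            // the project path here is purely illustrative
            var project = new Project(@"src\MyApp\MyApp.csproj");
            bool succeeded = project.Build();  // runs the default target
            System.Console.WriteLine(succeeded ? "Build succeeded" : "Build failed");
        }
    }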

SO - NAnt Vs. MSBuild

My advice is just the opposite - avoid MSBuild like the plague. NAnt makes it far, far easier to set up your build to do automatic testing, deploy to multiple production environments, integrate with CruiseControl for an entry environment, and integrate with source control. We've gone through so much pain with TFS/MSBuild (using TFSDeployer, custom PowerShell scripts, etc.) to get it to do what we were able to do with NAnt out of the box. Don't waste your time.

SO - NAnt vs. scripts:

There's much more to building a product than just compiling it. Tasks such as creating installers, updating version numbers, creating escrows, and distributing the final packages can be much easier because of what these tools (and their extensions) provide. While you could do all this with regular scripts, NAnt or MSBuild gives you a solid framework for doing it.

NAnt as a project is dead, or on life support (the last release was 0.86 beta 1, two years ago). It was basically cut short by the release of MSBuild. NAnt was nice, and MSBuild is OK I guess, but I feel more and more drawn to writing code in, well, a language, instead of some XML-based procedural stuff. With XML-based build frameworks the debugging experience is awful, and you end up cheating by embedding "scripts" in C#, which defeats the purpose of declaration over programming. Same sad story as XSLT, for that matter.

There are some good XML-free build frameworks out there.

I still use MSBuild since it's the format of csproj files, but for specific stuff I shun building logic in XML (no custom MSBuild tasks).

Don't use C# for a build script, because you don't want to have to compile it every time you change it.

If you plan to use PowerShell, then take a look at PSake.

If you are XML-friendly, then go with MSBuild or NAnt; those build scripts may be valuable to you. MSBuild has one advantage: Ctrl + F5 builds the script in Visual Studio.

I have been moving slowly to Rake, because it's nicer and Ruby is a programming language (which means you can do anything). I blogged about how nice it can be, but you have to translate the post or just look at the code. If you like that, then you may want to see the full script and its dependencies.

A good book about continuous integration is Continuous Integration: Improving Software Quality and Reducing Risk by Paul Duvall, Steve Matyas, and Andrew Glover.

I checked around, and couldn't find a good answer to this:

We have a multi-module Maven project which we want to continuously integrate. We thought of two strategies for handling this:

  • Have our continuous integration server (TeamCity, in this case, but I've used others before and they seem to have the same issue) point to the aggregator POM file, and just build everything at once
  • Have our continuous integration server point at each individual module

Is there a standard, preferred practice for this? I've checked Stack Overflow, Google, the Continuous Integration book, and did not find anything, but maybe I missed it.

Standard practice with Hudson, at least, is your first option. For one thing, in Maven your build may not work very well if all the projects are not in the reactor together. For another, trying to make them separate builds is going to put you in snapshot-management heck: if one in the middle changes and you try to build just it, Maven will go looking for its dependencies as snapshots, and what it gets will depend on the order in which the other projects build and on whether you publish snapshots.

If you have so many projects, or such unrelated projects, that building them all is a problem, then I suggest you need to consider disaggregation: make the parent a separate, released project, give each module (or each subgroup of modules) a trunk/tags/branches structure, and make them depend on releases, not snapshots.
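
A minimal sketch of the aggregator approach (group, artifact, and module names are illustrative):

    <!-- hedged sketch: an aggregator POM that keeps all modules in one reactor -->
    <project xmlns="http://maven.apache.org/POM/4.0.0">
      <modelVersion>4.0.0</modelVersion>
      <groupId>com.example</groupId>
      <artifactId>parent</artifactId>
      <version>1.0-SNAPSHOT</version>
      <packaging>pom</packaging>
      <modules>
        <module>core</module>
        <module>services</module>
        <module>web</module>
      </modules>
    </project>

Pointing the CI job at this POM builds every module in dependency order, so intermediate snapshots never have to leave the reactor.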

We are currently setting up the evaluation criteria for a trade study we will be conducting.

One of the criteria we selected is reliability (and/or robustness - are these the same?).

How do you assess that software is reliable without being able to afford much time evaluating it?

Edit: Along the lines of the response given by KenG, to narrow the focus of the question: You can choose among 50 existing software solutions. You need to assess how reliable they are, without being able to test them (at least initially). What tangible metrics or other criteria can you use to evaluate their reliability?

It depends on what type of software you're evaluating. A website's main (and maybe only) criteria for reliability might be its uptime. NASA will have a whole different definition for reliability of its software. Your definition will probably be somewhere in between.

If you don't have a lot of time to evaluate reliability, it is absolutely critical that you automate your measurement process. You can use continuous integration tools to make sure that you only ever have to manually find a bug once.

I recommend that you or someone in your company read Continuous Integration: Improving Software Quality and Reducing Risk. I think it will help lead you to your own definition of software reliability.

I learned nothing about testing during my 3 years of computer engineering. We were mostly just told that it was very important.

Anyway, I managed to get started with unit testing on my own thanks to a talk by Roy Osherove and his book The Art of Unit Testing. Very helpful, clear, and to the point.

The problem now is that, of course, a lot of code requires things like databases or web services. I can mock these to take them out of the equation, which is good for unit tests, but that leaves quite a bit of my code untested. Are there any good books or resources that are up to date, with very little fluff, which can help me get started with integration tests? Preferably with a focus on C#.

One book I read that helped me down the path of integration testing was Continuous Integration: Improving Software Quality and Reducing Risk.

This book does not focus primarily on .NET, but it is a valuable resource on the whole process of continuous integration.

With respect to integration testing (which is a part of continuous integration), the book covers the process of building up a database, goes over frameworks that help you seed data, and wraps that up nicely in a build script (the book uses NAnt).
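
As a rough illustration of that pattern (not taken from the book - the server, tool, and script names are assumptions), such a NAnt target might look like:

    <!-- hedged sketch: rebuild and seed a test database from a build script -->
    <target name="rebuild-db">
      <exec program="sqlcmd">
        <arg line="-S localhost -i sql/create_schema.sql" />
      </exec>
      <exec program="sqlcmd">
        <arg line="-S localhost -i sql/seed_data.sql" />
      </exec>
    </target>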

I want to use Maven to handle artifact generation for the different local and testing regions. I believe I can use different profiles but I am not certain.

In Maven, can I select different directories for the files used in packaging (such as application.properties)? How would I set that up?

The idea is to have the following folders for resources in my project:

  • local
  • build server
  • dev
  • sys
  • prod

Each folder would contain a different version of application.properties, the Spring file used to externalize otherwise hard-coded strings into variables. For local builds, note that our developers work on different operating systems; I want this to be seamless across OSes as well.

Key outcomes would be:

  • Control Maven lifecycle phases from inside the IDE (IntelliJ)
  • Not complicate phases and team processes
  • Keep things consistent for each developer
  • Make the different configurations per developer/region invisible when running a phase, e.g. install

Ideally I would have my project set up according to best practices (Duvall, Matyas, Glover).
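
A minimal sketch of the profile idea mentioned above, placed in the project's pom.xml (the directory names follow the folder list and are assumptions):

    <!-- hedged sketch: one resource directory per region, selected by profile -->
    <profiles>
      <profile>
        <id>dev</id>
        <build>
          <resources>
            <resource>
              <directory>src/main/resources/dev</directory>
            </resource>
          </resources>
        </build>
      </profile>
      <profile>
        <id>prod</id>
        <build>
          <resources>
            <resource>
              <directory>src/main/resources/prod</directory>
            </resource>
          </resources>
        </build>
      </profile>
    </profiles>

Running mvn install -P dev would then package the dev version of application.properties; a default profile can be activated per developer via activeProfiles in settings.xml, so the choice stays invisible when running a phase from the IDE.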

I currently have a CI environment set up using the following tools:

VCS - ClearCase (UCM enabled)
CI Server - Jenkins
Build Engine - MSBuild

Basically, Jenkins polls my UCM project's integration stream every 2 minutes and builds via an MSBuild script I wrote.

While in ClearCase it is not a best practice to have an individual stream per developer, good CI demands that private builds be run before committing code. On top of that, ideally I would have atomic commits, which ClearCase provides only in the form of a deliver to a stream.

Currently we are working directly on the integration stream, and sometimes our builds fail because Jenkins starts building before the developer finishes her check-ins.

My question is, how can I have a private work area (Sandbox) and atomic commits on ClearCase without creating a stream for each developer? Am I missing something?

Currently we are working directly on the integration stream, and sometimes our builds fail because Jenkins starts building before the developer finishes her check-ins

You can write your build script to detect whether a deliver is in progress.
A deliver is characterized by an activity named deliver.xxx: you can list its change set and see if any version in it is checked out. If so, a deliver is in progress.
If the most recent deliver has only checked-in versions, you can safely start your build.
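
A rough sketch of that check (the stream and PVOB names are assumptions, and the exact cleartool options may need adjusting for your site):

    # hedged sketch: skip the build if the latest deliver activity still has checkouts
    act=$(cleartool lsactivity -short -in stream:myproj_int@/vobs/mypvob | grep '^deliver\.' | tail -1)
    if cleartool describe -fmt "%[versions]Cp" "activity:$act@/vobs/mypvob" | grep -q CHECKEDOUT; then
      echo "deliver in progress - not building"
      exit 0
    fi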


Or:

How can I have a private work area (Sandbox) and atomic commits on ClearCase without creating a stream for each developer

A private area for Jenkins to use would be a snapshot view on each developer stream.
As the name suggests, a snapshot view takes a snapshot of the code, but you need to define a criterion telling Jenkins when it can build what the snapshot view has loaded.

What I have seen used is a shifting 'BUILD' label (a label you re-apply to the newly updated code, used by Jenkins in its snapshot view with a selection rule based on that label):
The developer moves the label when he/she thinks the current code is ready to be built, and a Jenkins job updates its snapshot view on the developer stream based on the versions referenced by that shifting 'BUILD' label.
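
A sketch of what the Jenkins snapshot view's config spec could look like with such a label (the VOB path is an assumption):

    # hedged sketch of a snapshot-view config spec keyed to the shifting label
    element * CHECKEDOUT
    element * BUILD
    load /vobs/myproj

The developer would then re-apply the label before triggering the job, for instance with cleartool mklabel -replace -recurse BUILD . from the stream's view (assuming the BUILD label type was already created with cleartool mklbtype).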

I am new to the subject of testing an app. I've spent the last few days surfing the Internet for useful tutorials. Honestly, I could only find some good videos giving me the big picture of how to do Coded UI Test automation, database unit testing, and unit testing with MS Visual Studio 2010. But there are still lots of questions on my mind.

For example, a Coded UI Test automation just records what I did while running and testing the application on my own. So what? It just records my actions; it is actually me who tests the application, traditionally. I know there must be some reason I'm not aware of! Can anyone please explain how an automated Coded UI Test helps me?

There is a similar question about database unit testing. I found a video tutorial on YouTube explaining an example of this: it simply checks whether a stored procedure works properly! Obviously, just by running my application (pressing F5) I can see whether an insert SP works or not. So again, what is the role of database unit testing?

I would appreciate it if anyone could give me an idea or any useful link. Thank you.

In relation to the note about the testing software recording your actions: this can be quite handy when trying to replicate a bug, particularly when you first start writing the tests.

As Eugene noted, when you get more than one coder things get more complicated. I would add that when the software has to interact with other components (e.g. a server, or other software packages) it gets very complicated very quickly, and it is not always a safe bet to assume the other component is perfect. The idea of automating your tests is that while you keep writing the software, you can keep testing everything done before without any extra work. For example, say I write a program that connects over Bluetooth and later add WiFi: I could reuse most (if not all) of the Bluetooth test cases against the WiFi. Or, in a UI example, imagine you add a new button and in the process accidentally break an old one. If you have 10 buttons and the old one has no relation to the new one, you might not bother manually re-testing it, but an automated test suite would pick up the breakage straight away.

If you need more justification for testing, I would highly suggest reading Continuous Integration, which demonstrates why you should test and what the benefits are, as well as giving examples of how to go about it. It also has some examples about DB testing!

I hope this helps, I am also new to testing but have learnt a lot in a short period.

I've been googling a lot lately, trying to find articles, books, or even the correct search terms on more 'agile' web application infrastructure/setups, but I don't think I'm currently searching for the right terms!

I'm looking for an overview of best practices that will take me through how I should set things up with regard to automating builds, automating deployment to staging and production, continuous integration, versioning, testing, etc.

I'm working on a pretty complex online store using .NET and have so far started getting to grips with using MSBuild to control my builds, with TeamCity running builds after commits to GitHub.

I have been working through the 'Inside MSBuild' book, which is pretty cool, and also a book on brownfield applications, which is actually equally useful for a fresh project.

So I'm getting to grips with the individual pieces but really want some concrete processes to follow.

Any help greatly appreciated, as I'm fed up with aimlessly googling!

Sam : )

You're on the right track with TeamCity in my opinion; we tried CruiseControl.NET first and found it required more XML-editing.

There is a book on Continuous Integration in .NET; I have not read it.

There is also Continuous Delivery: Reliable Software Releases through Build, Test, and Deployment Automation - I have not read that either, but Fowler-approved books are generally excellent. There's also an older book in the series on Continuous Integration.

If TeamCity is working for you, I'd suggest studying testing first. One of the major values in continuous integration is automated test-running. I can recommend a book on that: The Art of Unit Testing with Examples in .NET.

My personal opinion is that MSBuild scripts are best left to Visual Studio. If having TeamCity run solutions and NUnit/xUnit tests proves insufficient, you might take a look at NAnt. While it is XML-based, I find it easier to understand than MSBuild.

I've got

Expert .NET Delivery using NAnt & CruiseControl.net (Expert's Voice in .NET)

It's old, but covered everything I needed to get up and running last year.

My secondary questions are:

  1. How do we best utilize SCM in the build process?
  2. How are code files labeled and branched?
  3. Should we use the .csproj and .sln files for the build? How flexible are these when deploying to several environments? I know these are MSBuild files, but as we add new files, updating and maintaining these .csproj files in SCM can become a bottleneck.
  4. How is rollback done in case of failed builds, bugs QA missed in testing, etc.?
  5. Are there any good articles on the build process?

This is more a question about the process and less about the choice of automated build tools. Please share your build process. I would like to get an end-to-end view, from developers checking in to going live.

Continuous Integration: Improving Software Quality and Reducing Risk

http://www.amazon.com/gp/product/0321336380

The book's website is here:

http://www.integratebutton.com

I guess it's an off-topic question, but let me try to ask it. I've just started learning about Continuous Integration, and now I know that for Java there are tools like Checkstyle and PMD that let you enforce coding standards and report any lines of code that do not meet them.

Now I wonder if there are similar tools for PHP that I could incorporate into my Continuous Integration system. What if I want all my code to follow the PSR-1 and PSR-2 specifications? Are there any tools for PHP that can automate this process - check the whole code base, find files which do not meet the requirements, etc.?

Try PHP_CodeSniffer.

For example, to verify code for PSR-2 compliance, use phpcs --standard=PSR2 src.
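
A minimal sketch of wiring that into a CI step (the paths assume a Composer-based project layout):

    # hedged sketch: install PHP_CodeSniffer as a dev dependency and fail the
    # build on violations (phpcs exits non-zero when the standard is not met)
    composer require --dev squizlabs/php_codesniffer
    vendor/bin/phpcs --standard=PSR2 src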

Also, I recommend checking out this PHP package boilerplate. It has some basic CI setup with Travis CI and Code Climate that might be helpful to you.