Implementing Lean Software Development

Mary Poppendieck, Tom Poppendieck


A practical implementation and companion guide to the bestselling introduction to Lean Software Development.


Mentioned in questions and answers.

I'm a senior engineer working on a team with four others on a home-grown content management application that drives a large US pro sports web site. We embarked upon this project some two years ago and chose Java as our platform, though my question is not Java-specific. Since we started, there has been some churn in our ranks. Each one of us has a significant degree of latitude in deciding on implementation details, although important decisions are made by consensus.

Ours is a relatively young project, but we are already at a point where no single developer knows everything about the app. The primary reasons for that are our quick pace of development, most of which occurs in a crunch leading up to our sport's season opener, and the fact that our test coverage is essentially zero.

We all understand the theoretical benefits of TDD and agree in principle that the methodology would have improved our lives and code quality had we started out with it and stuck with it over the years. It never took hold, though, and now we're in charge of an untested codebase that still requires a lot of expansion, is actively used in production, and is relied upon by the corporate structure.

Faced with this situation, I see only two possible solutions: (1) retroactively write tests for the existing code, or (2) rewrite as much of the app as is practical while fanatically adhering to TDD principles. I perceive (1) as by and large impractical because we have a hellish dependency graph within the project. Almost none of our components can be tested in isolation; we don't know all the use cases; and the use cases will likely change during the testing push due to business requirements or as a reaction to unforeseen issues. For these reasons, we can't really be sure that our tests will turn out to be high quality once we're done. There's a risk of leading the team into a false sense of security whereby subtle bugs creep in without anyone noticing. Given the bleak prospects with regard to ROI, it would be hard for me or our team lead to justify this endeavor to management.
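
To make (1) concrete, about the only way we could test one of our components in isolation today would be to carve a seam around one of its dependencies and substitute a fake in the test. A minimal JUnit sketch of that idea, with all class names invented purely for illustration:

    // Hypothetical example -- none of these classes exist in our project under these names.
    // The idea: extract an interface for one collaborator and inject a fake in the test,
    // so the component can be exercised without the rest of the dependency graph.
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class ScheduleFormatterTest {

        // Interface extracted from an existing concrete dependency; production code
        // keeps using the real implementation.
        interface GameRepository {
            String nextOpponent(String teamId);
        }

        // Component under test, rewired to accept the dependency instead of
        // constructing the concrete class itself.
        static class ScheduleFormatter {
            private final GameRepository games;

            ScheduleFormatter(GameRepository games) {
                this.games = games;
            }

            String headline(String teamId) {
                return "Next up: " + games.nextOpponent(teamId);
            }
        }

        @Test
        public void headlineNamesTheNextOpponent() {
            // A hand-rolled fake stands in for the real repository: no database, no CMS.
            GameRepository fake = teamId -> "Chicago";
            ScheduleFormatter formatter = new ScheduleFormatter(fake);

            assertEquals("Next up: Chicago", formatter.headline("NYC"));
        }
    }

Doing that kind of surgery for every entangled component is exactly the effort I'm worried about, which is why (1) looks so unattractive.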

Option (2) is more attractive because we'll be following the test-first principle, producing code that's almost 100% covered right off the bat. Even if the initial effort results only in islands of covered code, this will give us a significant beachhead on the way to project-wide coverage and help decouple and isolate the various components.
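
Under (2), even a small new feature would start from a failing test. A rough sketch of what that looks like, with the class and its behavior invented for illustration:

    // Hypothetical test-first example; the class and its behavior are invented.
    // The tests below are written first and fail until SlugGenerator is implemented,
    // which keeps every line of the new code covered from the start.
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class SlugGeneratorTest {

        @Test
        public void lowercasesAndHyphenatesTitles() {
            SlugGenerator slugs = new SlugGenerator();
            assertEquals("season-opener-preview", slugs.toSlug("Season Opener Preview"));
        }

        @Test
        public void stripsCharactersThatAreNotUrlSafe() {
            SlugGenerator slugs = new SlugGenerator();
            assertEquals("qa-with-mike", slugs.toSlug("Q&A with Mike!"));
        }
    }

    // The simplest implementation that makes the tests pass; refactor once green.
    class SlugGenerator {
        String toSlug(String title) {
            return title.toLowerCase()
                        .replaceAll("[^a-z0-9 ]", "")
                        .trim()
                        .replaceAll("\\s+", "-");
        }
    }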

The downside in both cases is that our productivity on business features could either slow down significantly or evaporate entirely during any testing push. We cannot afford that during the business-driven crunch, although the crunch is followed by a relative lull that we could exploit for our purposes.

In addition to choosing the right approach (either (1), (2), or another as-yet-unknown solution), I need help answering the following question: how can my team ensure that our effort isn't wasted in the long run by unmaintained tests and/or a failure to write new ones as business requirements roll on? I'm open to a wide range of suggestions here, whether they involve carrots or sticks.

In any event, thanks for reading about this self-inflicted plight.

I think you need to flip around part of the question and its related arguments. How can you ensure that adhering to TDD will not result in wasted effort?

You can't, but the same is true for any development process.

I've always found it slightly paradoxical that we are challenged to prove the cash benefit of TDD and related agile disciplines, when at the same time traditional waterfall processes more often than not result in

  • missed deadlines
  • blown budgets
  • death marches
  • developer burn-out
  • buggy software
  • unsatisfied customers

TDD and other agile methodologies attempt to address these issues, but obviously introduce some new concerns.

In any case, I'd like to recommend a few books that may answer some of your questions in greater detail, starting with Implementing Lean Software Development by Mary and Tom Poppendieck.

I work in a fledgling software development arm at a large organization. For the past few years, I and a (very) select few have been churning out software products that have been reasonably successful and, I'm happy to report, very maintainable.

If any specific methodology was used, I'd say it was the core agile focus on delivering functional products in short iterations, as opposed to making the customer wait for long months.

Our development started with going out to the customer/user base and evaluating needs through observation, interviews, and a review of current tools, and building requirements from that. We would then deliver the core requirements as quickly as possible in iterations (roughly a month each) in which the customer/user had a functional product to use with the features included in that particular iteration.

I think that mega-strict TDD might be a bit of overkill for me personally, but I understand the value of proper unit testing and understand that it becomes more and more important as teams grow in size -- this is something I probably do not pay enough attention to now. A lot of functional testing is done, but I know that I need to implement a lot more unit testing than I do now, and as new developers come on board, I can't afford to let my bad habits become the norm.

So I ask all of you Stack Overflow users: what do you think are the most important/useful aspects of Agile/Scrum/XP/(insert your favorite methodology here)?

I'm in the fortunate position of being able to determine which processes/methods will be used going forward as the team begins to grow into a proper software team.

I've spent a lot of time reading up on the various methodologies and the rebuttals to them, and I guess what is most important to me is:

  • Short iterations providing functional products -- for our organization and customers, being able to show the sponsor/investor/user something that is "real" and that they can use hands-on has gone a long way, and has kept long-running projects from dying on the vine.
  • A good way to prioritize tasks -- this has been paramount to keeping the above working smoothly.
  • Peer Review
  • Unit Testing -- Can someone point me to a reference with good info/examples on unit testing that provides good utility without being overly time-consuming or tedious? (I'm not really sure I need a test that merely checks a class/interface exists before I code that interface, for example -- see the sketch after this list.)
  • Time estimates -- I guess I'm looking for the best way to "guess" at when something is going to be ready, versus whipping a team into getting something done by a deadline that was poorly estimated.
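
On the unit-testing point, what I have in mind is tests that assert behavior a caller actually depends on, rather than the mere existence of classes. A small hypothetical sketch (the class and its behavior are invented):

    // Hypothetical example of a "useful without being tedious" unit test:
    // it pins down behavior, including the edge case, rather than structure.
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class TitleTruncatorTest {

        private final TitleTruncator truncator = new TitleTruncator();

        @Test
        public void shortTitlesAreLeftAlone() {
            assertEquals("Sprint review", truncator.truncate("Sprint review", 20));
        }

        @Test
        public void longTitlesAreCutAtTheLimitWithAnEllipsis() {
            assertEquals("Quarterly plannin...",
                         truncator.truncate("Quarterly planning meeting notes", 20));
        }
    }

    // Minimal implementation so the sketch is self-contained.
    class TitleTruncator {
        String truncate(String title, int maxLength) {
            if (title.length() <= maxLength) {
                return title;
            }
            return title.substring(0, maxLength - 3) + "...";
        }
    }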

I know this question is nebulous, etc, but thanks for looking :D

TL;DR: What parts of Agile, XP, and Scrum do you folks think are "best" and contribute to healthy products? If you had to set forth a new software management process tomorrow, what would it include?

I assume that you already do agile pretty well at your company and are looking for more advanced stuff.

Technically, I would take a look at Continuous Delivery.

This technique aims to make it possible to ship software to production at any time of the day. Flickr, for instance, uses it ("In the last week there were 35 deploys of 331 changes by 18 people.").

In order to do that, you must have:

  • A solid test harness, from unit tests to functional ones (see the sketch after this list).
  • A good knowledge of your deployment infrastructure. You will have to avoid downtime.
  • Your deployment must be entirely automated.
  • and more.
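
As one small illustration, the functional end of that harness can include an automated smoke test that runs right after each deployment. A minimal sketch, assuming Java 11+ and a hypothetical /health endpoint on an invented staging URL:

    // Hypothetical post-deployment smoke test; the URL and endpoint are invented.
    // In a continuous delivery pipeline, a check like this runs automatically after
    // every deploy, so a broken release is caught before users see it.
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class SmokeTest {

        public static void main(String[] args) throws Exception {
            // The target URL would normally be passed in by the pipeline.
            String baseUrl = args.length > 0 ? args[0] : "https://staging.example.com";

            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(baseUrl + "/health"))
                    .GET()
                    .build();

            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());

            // A non-zero exit code fails the pipeline.
            if (response.statusCode() != 200) {
                System.err.println("Smoke test failed: HTTP " + response.statusCode());
                System.exit(1);
            }
            System.out.println("Smoke test passed: " + response.body());
        }
    }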

On the method side, I would take a look at Kanban and other Lean principles. They aim to reduce your time to market: from idea to production.

Have fun!