Test-driven Development

Kent Beck

Write clean code that works with the help of this groundbreaking software method. Example-driven teaching is the basis of Beck's step-by-step instruction that will have readers using TDD to further their projects.

If you could go back in time and tell yourself to read a specific book at the beginning of your career as a developer, which book would it be?

I expect this list to be varied and to cover a wide range of things.

To search: Use the search box in the upper-right corner. To search the answers of the current question, use inquestion:this. For example:

inquestion:this "Code Complete"

Applying UML and Patterns by Craig Larman.

The title of the book is slightly misleading; it does deal with UML and patterns, but it covers so much more. The subtitle of the book tells you a bit more: An Introduction to Object-Oriented Analysis and Design and Iterative Development.

Masters of Doom. As far as motivation and love for your profession go, it doesn't get any better than what's described in this book. A truly inspiring story!

Beginning C# 3.0: An Introduction to Object Oriented Programming

This is the book for those who want to understand the whys and hows of OOP using C# 3.0. You don't want to miss it.

Mastery: The Keys to Success and Long-Term Fulfillment, by George Leonard

It's about what mindsets are required to reach mastery in any skill, and why. It's just awesome, and an easy read too.

Pro Spring is a superb introduction to the world of Inversion of Control and Dependency Injection. If you're not aware of these practices and their implications, the balance of topics and technical detail in Pro Spring is excellent. It builds a great case and a consequent personal foundation.

Another book I'd suggest would be Robert Martin's Agile Software Development (ASD). Code smells, agile techniques, test driven dev, principles ... a well-written balance of many different programming facets.

More traditional classics would include the infamous GoF Design Patterns, Bertrand Meyer's Object-Oriented Software Construction, Booch's Object-Oriented Analysis and Design, Scott Meyers' "Effective C++" series, and a lesser-known book I enjoyed by Gunderloy, Coder to Developer.

And while books are nice ... don't forget radio!

... let me add one more thing. If you haven't already discovered Safari, take a look. It is more addictive than Stack Overflow :-) I've found that with my Google-style habits I need the more expensive subscription so I can look at any book at any time, but I'd recommend the trial to anyone even remotely interested.

(ah yes, a little Obj-C today, Cocoa tomorrow, patterns? SOA? What was that example in that cookbook? What did Steve say in the second edition? Should I buy this book? ... a subscription like this is great if you'd like some continuity and context to what you're googling ...)

Database System Concepts is one of the best books you can read on understanding good database design principles.

Algorithms in C++ was invaluable to me in learning Big O notation and the ins and outs of the various sort algorithms. This was published before Sedgewick decided he could make more money by dividing it into 5 different books.

C++ FAQs is an amazing book that really shows you what you should and shouldn't be doing in C++. The backward compatibility of C++ leaves a lot of landmines about and this book helps one carefully avoid them while at the same time being a good introduction into OO design and intent.

Here are two I haven't seen mentioned:
I wish I had read "Ruminations on C++" by Koenig and Moo much sooner. That was the book that made OO concepts really click for me.
And I recommend Michael Abrash's "Zen of Code Optimization" for anyone else planning on starting a programming career in the mid 90s.

Perfect Software: And Other Illusions about Testing

Perfect Software: And Other Illusions about Testing by Gerald M. Weinberg

ISBN-10: 0932633692

ISBN-13: 978-0932633699

Rapid Development by McConnell

The most influential programming book for me was Enough Rope to Shoot Yourself in the Foot by Allen Holub.

Oh well, how long ago it was.

I have a few good books that strongly influenced me that I've not seen on this list so far:

The Psychology of Everyday Things by Donald Norman. The general principles of design for other people. This may seem mostly relevant to UI, but if you think about it, it applies almost anywhere there is an interface that someone besides the original developer has to work with; e.g. an API, which should be designed so that other developers form the correct mental model and get appropriate feedback from the API itself.

The Art of Software Testing by Glen Myers. A good, general introduction to testing software; good for programmers to read to help them think like a tester, i.e. think of what may go wrong and prepare for it.

By the way, I realize the question was the "Single Most Influential Book" but the discussion seems to have changed to listing good books for developers to read so I hope I can be forgiven for listing two good books rather than just one.

C++ How to Program is good for beginners. It's an excellent, complete book of some 1500 pages.

Effective C++ and More Effective C++ by Scott Meyers.

Inside the C++ object model by Stanley Lippman

I bought this when I was a complete newbie, and it took me from only knowing that Java existed to being a reliable team member in a short time.

Not a programming book, but still a very important book every programmer should read:

Orbiting the Giant Hairball by Gordon MacKenzie

The Pragmatic Programmer was pretty good. However, one that really made an impact when I was starting out was:

Windows 95 System Programming Secrets"

I know - it sounds and looks a bit cheesy on the outside and has probably dated a bit - but this was an awesome explanation of the internals of Win95 based on the author's (Matt Pietrek) investigations using his own tools - the code for which came with the book. Bear in mind this was before the whole open source thing and Microsoft was still pretty cagey about releasing documentation of internals - let alone source. There was some quote in there like "If you are working through some problem and hit some sticking point then you need to stop and really look deeply into that piece and really understand how it works". I've found this to be pretty good advice - particularly these days when you often have the source for a library and can go take a look. It's also inspired me to enjoy diving into the internals of how systems work, something that has proven invaluable over the course of my career.

Oh, and I'd also throw in Essential .NET - a great internals explanation of .NET from Don Box.

I recently read Dreaming in Code and found it to be an interesting read. Perhaps more so since the day I started reading it Chandler 1.0 was released. Reading about the growing pains and mistakes of a project team of talented people trying to "change the world" gives you a lot to learn from. Also Scott brings up a lot of programmer lore and wisdom in between that's just an entertaining read.

Beautiful Code had one or two things that made me think differently, particularly the chapter on top down operator precedence.

K&R

@Juan: I know Juan, I know - but there are some things that can only be learned by actually getting down to the task at hand. Speaking in abstract ideals all day simply makes you into an academic. It's in the application of the abstract that we truly grok the reason for their existence. :P

@Keith: Great mention of "The Inmates are Running the Asylum" by Alan Cooper - an eye opener for certain, any developer that has worked with me since I read that book has heard me mention the ideas it espouses. +1

I found the The Algorithm Design Manual to be a very beneficial read. I also highly recommend Programming Pearls.

This one isn't really a book for the beginning programmer, but if you're looking for SOA design books, then SOA in Practice: The Art of Distributed System Design is for you.

For me it was Design Patterns Explained. It provided an 'Oh, that's how it works' moment for me in regards to design patterns, and has been very useful when teaching design patterns to others.

Code Craft by Pete Goodliffe is a good read!

The first book that made a real impact on me was Mastering Turbo Assembler by Tom Swan.

Other books that have had an impact were Just for Fun by Linus Torvalds and David Diamond and, of course, The Pragmatic Programmer by Andrew Hunt and David Thomas.

In addition to other people's suggestions, I'd recommend either acquiring a copy of SICP, or reading it online. It's one of the few books that I've read that I feel greatly increased my skill in designing software, particularly in creating good abstraction layers.

A book that is not directly related to programming, but is also a good read for programmers (IMO) is Concrete Mathematics. Most, if not all of the topics in it are useful for programmers to know about, and it does a better job of explaining things than any other math book I've read to date.

For me "Memory as a programming concept in C and C++" really opened my eyes to how memory management really works. If you're a C or C++ developer I consider it a must read. You will defiantly learn something or remember things you might have forgotten along the way.

http://www.amazon.com/Memory-Programming-Concept-C/dp/0521520436

Agile Software Development with Scrum by Ken Schwaber and Mike Beedle.

I used this book as the starting point to understanding Agile development.

Systemantics: How Systems Work and Especially How They Fail. Get it used cheap. But you might not get the humor until you've worked on a few failed projects.

The beauty of the book is the copyright year.

Probably the most profound takeaway "law" presented in the book:

The Fundamental Failure-Mode Theorem (F.F.T.): Complex systems usually operate in failure mode.

The idea being that there are failing parts in any given piece of software that are masked by failures in other parts or by validations in other parts. See the Therac-25 radiation therapy machine for a real-world example: its software flaws were masked by hardware failsafes, and when the hardware failsafes were removed, a software race condition that had gone undetected all those years resulted in the machine killing three people.

It seems most people have already touched on some very good books. One which really helped me out was Effective C#: 50 Ways to Improve Your C#. I'd be remiss if I didn't mention The Tao of Pooh. Philosophy books can be good for the soul, and the code.

Discrete Mathematics For Computer Scientists

Discrete Mathematics For Computer Scientists by J.K. Truss.

While this doesn't teach you programming, it teaches you the fundamental mathematics that every programmer should know. You may remember this stuff from university, but really, doing predicate logic will improve your programming skills, and you need to learn set theory if you want to program using collections.

There really is a lot of interesting information in here that can get you thinking about problems in different ways. It's handy to have, just to pick up once in a while to learn something new.

I saw a review of Software Factories: Assembling Applications with Patterns, Models, Frameworks, and Tools on a blog that also talked about XI-Factory; I read it, and I must say this book is a must-read. Although not specifically targeted at programmers, it explains very clearly what is happening in the programming world right now with Model-Driven Architecture and so on.

Solid Code: Optimizing the Software Development Life Cycle

Although the book is only 300 pages and favors Microsoft technologies, it still offers some good language-agnostic tidbits.

Managing Gigabytes is an instant classic for thinking about the heavy lifting of information.

My vote is "How to Think Like a Computer Scientist: Learning With Python" It's available both as a book and as a free e-book.

It really helped me to understand the basics of not just Python but programming in general. Although it uses Python to demonstrate concepts, they apply to most, if not all, programming languages. Also: IT'S FREE!

Object-Oriented Programming in Turbo C++. Not super popular, but it was the one that got me started, and was the first book that really helped me grok what an object was. Read this one waaaay back in high school. It sort of brings a tear to my eye...

My high school math teacher lent me a copy of Are Your Lights On?: How to Figure Out What the Problem Really Is, which I have re-read many times. It has been invaluable, as a developer, and in life generally.

I'm now reading Agile Software Development, Principles, Patterns and Practices. For those interested in XP and object-oriented design, this is a classic read.

Kernighan & Plauger's Elements of Programming Style. It illustrates the difference between gimmicky-clever and elegant-clever.

To get advanced in Prolog, I like these two books:

The Art of Prolog

The Craft of Prolog

They really open the mind to logic programming and recursion schemes.

Here's an excellent book that is not as widely applauded, but is full of deep insight: Agile Software Development: The Cooperative Game, by Alistair Cockburn.

What's so special about it? Well, clearly everyone has heard the term "Agile", and it seems most are believers these days. Whether you believe or not, though, there are some deep principles behind why the Agile movement exists. This book uncovers and articulates these principles in a precise, scientific way. Some of the principles are (btw, these are my words, not Alistair's):

  1. The hardest thing about team software development is getting everyone's brains to have the same understanding. We are building huge, elaborate, complex systems which are invisible in the tangible world. The better you are at getting more people's brains to share deeper understanding, the more effective your team will be at software development. This is the underlying reason that pair programming makes sense. Most people dismiss it (and I did too initially), but with this principle in mind I highly recommend that you give it another shot. You wind up with TWO people who deeply understand the subsystem you just built ... there aren't many other ways to get such a deep information transfer so quickly. It is like a Vulcan mind meld.
  2. You don't always need words to communicate deep understanding quickly. And a corollary: too many words, and you exceed the listener/reader's capacity, meaning the understanding transfer you're attempting does not happen. Consider that children learn how to speak language by being "immersed" and "absorbing". Not just language either ... he gives the example of some kids playing with trains on the floor. Along comes another kid who has never even SEEN a train before ... but by watching the other kids, he picks up the gist of the game and plays right along. This happens all the time between humans. This along with the corollary about too many words helps you see how misguided it was in the old "waterfall" days to try to write 700 page detailed requirements specifications.

There is so much more in there too. I'll shut up now, but I HIGHLY recommend this book!

The Back of the Napkin, by Dan Roam.

A great book about visual thinking techniques. There is also an expanded edition now. I can't speak to that version, as I do not own it yet.

Agile Software Development by Alistair Cockburn

Do users ever touch your code? If you're not doing solely back-end work, I recommend About Face: The Essentials of User Interface Design — now in its third edition (linked). I used to think my users were stupid because they didn't "get" my interfaces. I was, of course, wrong. About Face turned me around.

"Writing Solid Code: Microsoft's Techniques for Developing Bug-Free C Programs (Microsoft Programming Series)" by Steve MacGuire.

It's interesting what a large proportion of the books mentioned here are C/C++ books.

While not strictly a software development book, I would highly recommend that Don't Make me Think! be considered in this list.

As so many people have listed Head First Design Patterns, which I agree is a very good book, I would like to see whether as many people are aware of a title called Design Patterns Explained: A New Perspective on Object-Oriented Design.

This title deals with design patterns excellently. The first half of the book is very accessible, and the remaining chapters require only a firm grasp of the content already covered. The reason I feel the second half of the book is less accessible is that it covers patterns that I, as a young developer admittedly lacking in experience, have not used much.

This title also introduces the concepts behind design patterns, covering Christopher Alexander's initial work in architecture through to the GoF first documenting patterns in Smalltalk.

I think that anyone who enjoyed Head First Design Patterns but still finds the GoF very dry, should look into Design Patterns Explained as a much more readable (although not quite as comprehensive) alternative.

Even though I've never programmed a game, this book helped me understand a lot of things in a fun way.

How influential a book is often depends on the reader and where they were in their career when they read the book. I have to give a shout-out to Head First Design Patterns. Great book and the very creative way it's written should be used as an example for other tech book writers. I.e. it's written in order to facilitate learning and internalizing the concepts.

97 Things Every Programmer Should Know

This book pools together the collective experiences of some of the world's best programmers. It is a must read.

Extreme Programming Explained: Embrace Change by Kent Beck. While I don't advocate a hardcore XP-or-the-highway take on software development, I wish I had been introduced to the principles in this book much earlier in my career. Unit testing, refactoring, simplicity, continuous integration, cost/time/quality/scope - these changed the way I looked at development. Before Agile, it was all about the debugger and fear of change requests. After Agile, those demons did not loom as large.

One of my personal favorites is Hacker's Delight, because it was as much fun to read as it was educational.

I hope the second edition will be released soon!

You.Next(): Move Your Software Development Career to the Leadership Track by Michael C. Finley and Honza Fedák.

I've been around a while, so most books that I have found influential don't necessarily apply today. I do believe it is universally important to understand the platform that you are developing for (both hardware and OS). I also think it's important to learn from other people's mistakes. So two books I would recommend are:

Computing Calamities and In Search of Stupidity: Over Twenty Years of High Tech Marketing Disasters

Working Effectively with Legacy Code is a really amazing book that goes into great detail about how to properly unit test your code and what the true benefit of it is. It really opened my eyes.

What are the best practices for naming unit test classes and test methods?

This was discussed on SO before, at What are some popular naming conventions for Unit Tests?

I don't know if this is a very good approach, but currently in my testing projects, I have one-to-one mappings between each production class and a test class, e.g. Product and ProductTest.

In my test classes I then have methods with the names of the methods I am testing, an underscore, and then the situation and what I expect to happen, e.g. Save_ShouldThrowExceptionWithNullName().
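To make that concrete, here is a minimal JUnit sketch of that convention (the Product class is just a hypothetical stand-in, defined inline so the sketch is self-contained):

import org.junit.Test;

public class ProductTest {

    // Hypothetical stand-in for the production class under test;
    // save() rejects a null name.
    static class Product {
        private final String name;

        Product(String name) { this.name = name; }

        void save() {
            if (name == null) {
                throw new IllegalArgumentException("name must not be null");
            }
        }
    }

    // Naming: method under test, underscore, situation and expected outcome.
    @Test(expected = IllegalArgumentException.class)
    public void Save_ShouldThrowExceptionWithNullName() {
        new Product(null).save();
    }
}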

Kent Beck suggests:

  • One test fixture per 'unit' (class of your program). Test fixtures are classes themselves. The test fixture name should be:

    [name of your 'unit']Tests
    
  • Test cases (the test fixture methods) have names like:

    test[feature being tested]
    

For example, having the following class:

class Person {
    int calculateAge() { ... }

    // other methods and properties
}

A test fixture would be:

class PersonTests {

    void testAgeCalculationWithNoBirthDate() { ... }

    // or

    void testCalculateAge() { ... }
}

I saw many questions asking 'how' to unit test in a specific language, but no question asking 'what', 'why', and 'when'.

  • What is it?
  • What does it do for me?
  • Why should I use it?
  • When should I use it (also when not)?
  • What are some common pitfalls and misconceptions?

Libraries like NUnit, xUnit or JUnit are just mandatory if you want to develop your projects using the TDD approach popularized by Kent Beck:

You can read Introduction to Test Driven Development (TDD) or Kent Beck's book Test Driven Development: By Example.

Then, if you want to be sure your tests cover a "good" part of your code, you can use software like NCover, JCover, PartCover or whatever. They'll tell you the coverage percentage of your code. Depending on how adept you are at TDD, you'll know if you've practiced it well enough :)

I have been reading about Agile, XP methodologies and TDD.

I have been on projects which stated they needed to do TDD, but most of the tests were somehow integration tests, or during the course of the project TDD was forgotten in the effort to finish code faster.

So, as far as my case goes, I have written unit tests, but I find myself starting to write code first instead of writing a test. I feel there's a thought / design / paradigm change which is actually huge. So, though one really believes in TDD, you actually end up going back to the old style because of time pressure / project deliverables.

I have a few classes where I have pure unit-tested code, but I can't seem to continue with the process when mocks come into the picture. Also, I see at times the "isn't it too trivial to write a test for it" syndrome.

How do you guys think I should handle this?

When you are in a big mess of legacy code, I found Working Effectively with Legacy Code extremely useful. I think it will improve your motivation for TDD a lot, even though it is about writing unit tests before you make any changes to your old legacy code. And from the undertone of your question, it seems like this is the position you are in.

And of course, as many others have pointed out, discipline. After a while of forcing yourself, you will forget why you ever did it any other way.

Buy "Test Driven Development: By Example" by Kent Beck, and read it.

Then, write a failing unit test.
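For example, in the spirit of the multi-currency money example that opens the book, a first "red" test might look like this (a minimal sketch; the Dollar stub exists only so the test compiles and then fails):

import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class DollarTest {

    // Stub with just enough to compile: times() does nothing yet, so the
    // test below FAILS (red). Making it pass (green), then refactoring,
    // is the first TDD cycle.
    static class Dollar {
        int amount;

        Dollar(int amount) { this.amount = amount; }

        void times(int multiplier) { /* not implemented yet */ }
    }

    @Test
    public void multiplication() {
        Dollar five = new Dollar(5);
        five.times(2);
        assertEquals(10, five.amount);
    }
}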

Exact duplicate of many posts:

What is unit testing?
What Makes a Good Unit Test?
New to Unit Testing
Unit Testing - definitions
Learning Unit Testing
How to properly mock and unit test
Unit Testing: Beginner Questions
And many more ...
Also, Google for site:stackoverflow.com "how do you" unit-test

I have read some questions on unit testing, but I don't exactly know WHAT it is or how you do it. I was hoping someone could tell me the following:

  • What exactly IS unit testing? Is it built into code or run as separate programs? Or something else?
  • How do you do it?
  • When should it be done? Are there times or projects not to do it? Is everything unit-testable?

Thanks a lot for the help.

What exactly IS unit testing? Is it built into code or run as separate programs? Or something else?

From MSDN: The primary goal of unit testing is to take the smallest piece of testable software in the application, isolate it from the remainder of the code, and determine whether it behaves exactly as you expect.

Essentially, you are writing small bits of code to test the individual bits of your code. In the .NET world, you would run these small bits of code using something like NUnit or MbUnit, or even the built-in testing tools in Visual Studio. In Java you might use JUnit. Essentially the test runners will build your project, load and execute the unit tests, and then let you know if they pass or fail.
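For example, a minimal JUnit test might look like this (hypothetical names; the method under test is defined inline to keep the sketch self-contained):

import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class CalculatorTest {

    // The "smallest piece of testable software": a single method.
    static int add(int a, int b) {
        return a + b;
    }

    // The unit test isolates that piece and checks one expectation.
    // A runner (JUnit here, NUnit/MbUnit in .NET) reports pass or fail.
    @Test
    public void addReturnsSumOfArguments() {
        assertEquals(5, add(2, 3));
    }
}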

How do you do it?

Well, unit testing is easier said than done. It takes quite a bit of practice to get good at it. You need to structure your code in a way that makes it easy to unit test in order to make your tests effective.

When should it be done? Are there times or projects not to do it? Is everything unit-testable?

You should do it where it makes sense. Not everything is suited to unit testing. For example, UI code is very hard to unit test, and you often get little benefit from doing so. Business-layer code, however, is often very suitable for tests, and that is where most unit testing is focused.

Unit testing is a massive topic and to fully get an understanding of how it can best benefit you I'd recommend getting hold of a book on unit testing such as "Test Driven Development by Example" which will give you a good grasp on the concepts and how you can apply them to your code.

What book would you recommend to learn test driven development? Preferably language agnostic.

Growing Object-Oriented Software, Guided by Tests (Addison-Wesley). It is about mocking frameworks, JMock and Hamcrest in particular.

From description of the book:

Steve Freeman and Nat Pryce describe the processes they use, the design principles they strive to achieve, and some of the tools that help them get the job done. Through an extended worked example, you’ll learn how TDD works at multiple levels, using tests to drive the features and the object-oriented structure of the code, and using Mock Objects to discover and then describe relationships between objects. Along the way, the book systematically addresses challenges that development teams encounter with TDD--from integrating TDD into your processes to testing your most difficult features.

EDIT: I'm now reading Working Effectively with Legacy Code by Michael Feathers which is pretty good. From the description of the book:

  • Understanding the mechanics of software change: adding features,
    fixing bugs, improving design, optimizing performance
  • Getting legacy code into a test harness
  • Writing tests that protect you against introducing new problems
  • This book also includes a catalog of twenty-four dependency-breaking techniques that help you work with program elements in isolation and make safer changes.

I read it already; it is one of the best programming books I've ever read (I personally think it should be called Refactoring to Testability, since it describes the process of making your code testable). Because testable code is good code.

The Astels book is a solid introduction, Beck's book is good on the underlying concepts, Lasse Koskela has a newish one (Test Driven: TDD and Acceptance TDD for Java Developers). Osherove's book, as he says, is about Unit Testing, rather than TDD. I'm not sure that the Pragmatics' TDD book has aged as well as their original book.

Most everything is Java or C#, but you should be able to figure it out yourself.

For me, this is the best one:

Lots of people talk about writing tests for their code before they start writing their code. This practice is generally known as Test Driven Development or TDD for short. What benefits do I gain from writing software this way? How do I get started with this practice?

There are a lot of benefits:

  • You get immediate feedback on whether your code is working, so you can find bugs faster
  • By seeing the test go from red to green, you know that you have both a working regression test, and working code
  • You gain confidence to refactor existing code, which means you can clean up code without worrying what it might break
  • At the end you have a suite of regression tests that can be run during automated builds to give you greater confidence that your codebase is solid

The best way to start is to just start. There is a great book by Kent Beck all about Test Driven Development. Just start with new code, don't worry about old code... whenever you feel you need to refactor some code, write a test for the existing functionality, then refactor it and make sure the tests stay green. Also, read this great article.

I am a web-developer working in PHP. I have some limited experience with using Test Driven Development in C# desktop applications. In that case we used nUnit for the unit testing framework.

I would like to start using TDD in new projects but I'm really not sure where to begin.

What recommendations do you have for a PHP-based unit testing framework and what are some good resources for someone who is pretty new to the TDD concept?

I highly recommend Test-Driven Development by Kent Beck (ISBN-10: 0321146530). It wasn't written specifically for PHP, but the concepts are there and should be easily translatable to PHP.

Having recently discovered this method of development, I'm finding it a rather nice methodology. So, for my first project, I have a small DLL's worth of code (in C#.NET, for what it's worth), and I want to make a set of tests for this code, but I am a bit lost as to how and where to start.

I'm using NUnit, and VS 2008, any tips on what sort of classes to start with, what to write tests for, and any tips on generally how to go about moving code across to test based development would be greatly appreciated.

Working Effectively with Legacy Code is my bible when it comes to migrating code without tests into a unit-tested environment, and it also provides a lot of insight into what makes code easy to test and how to test it.

I also found Test Driven Development by Example and Pragmatic Unit Testing: in C# with NUnit to be a decent introduction to unit testing in that environment.

One simple approach to starting TDD is to write tests first from this day forward, and to make sure that whenever you need to touch your existing (un-unit-tested) code, you write passing tests that verify the existing behavior of the system before you change it, so that you can re-run those tests afterwards to increase your confidence that you haven't broken anything.
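As a sketch of that idea (hypothetical code), such a test pins down what the code does today so it can act as a safety net while you change it:

import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class LegacyFormatterTest {

    // Hypothetical legacy method we need to touch but dare not break.
    static String formatName(String first, String last) {
        return last.toUpperCase() + ", " + first;
    }

    // A "characterization" test: it asserts the behavior the system has
    // TODAY, so the same test can be re-run after the change.
    @Test
    public void formatsAsUppercaseLastCommaFirst() {
        assertEquals("DOE, John", formatName("John", "Doe"));
    }
}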

TDD is something that seems to be on everybody's lips these days, and I have tried some on my own but I don't think I'm getting the idea. I am getting a grip on how to write a unit test, but I don't understand exactly what my unit tests should test.

  1. If I have an action method that returns a list of data, what should I verify? Only that the view name is correct, or should I verify the data as well?
  2. If I should test the data as well, won't I be writing the same code twice? What is the use of testing the data, if I use the same method to retrieve the data I'm comparing to?
  3. Should I test the methods adding/editing my data too? How do I verify that a record has been added/edited/removed, in a correct way?

I know it's quite a lot of large questions, but I haven't become any wiser from reading articles on the internet, as they all seem to be concerned with how to test, and not with what.

As an example - I have (or, am going to write) a GuestbookController, with methods for viewing, adding, editing and removing posts. What do I need to test? How do I do it?

I think there's a bit of Shu-Ha-Ri here. You're asking a question that is hard to explain. It is only after practicing and struggling to apply TDD that you'll get the What. Until then we'll give you answers that don't make sense, telling you stuff in the spirit of Monads Are Burritos. That won't help you and we'll sound like idiots (monads are clearly lemon-chiffon pie).

I'd recommend getting Kent Beck's TDD book and working through it, and then just practicing. "There's no royal road to Ri."

I do write unit tests while writing APIs and core functionalities. But I want to be the cool fanboy who eats, sleeps and breathes TDD and BDD. What's the best way to get started with TDD/BDD the right way? Any books, resources, frameworks, best practices?

My environment is Java backend with Grails frontend, integrated with several external web services and databases.

A good place to start is reading blogs. Then buy the books of the people who are blogging. Some I would highly recommend:

"Uncle Bob" Martin and the guys at Object Mentor: http://blog.objectmentor.com/

P.S. get Bob's book Clean Code:

http://www.amazon.com/Clean-Code-Handbook-Software-Craftsmanship/dp/0132350882

My friend Tim Ottinger (former Object Mentor dude) http://agileinaflash.blogspot.com/ http://agileotter.blogspot.com/

J. B. Rainsberger: http://www.jbrains.ca/permalink/285

I felt the need to expand on this, as everyone else seems to just want to give you their opinion of TDD and not help you on your quest to become a Jedi-Ninja. The Michael Jordan of TDD is Kent Beck. He really did write the book on it:

http://www.amazon.com/Test-Driven-Development-Kent-Beck/dp/0321146530

he also blogs at:

http://www.threeriversinstitute.org/blog/?p=29

other "famous" supporters of TDD include:

All are great people to follow. You should also consider attending some conferences like Agile 2010, or Software Craftsmanship (this year they were held at the same time in Chicago)

I've been doing TDD for a couple of years, but lately I've started looking more into the BDD way of driving my design and development. The resources that helped me get started with BDD were first and foremost Dan North's blog (he's the 'founder' of BDD). Take a look at Introducing BDD. There's also an 'official' BDD wiki over at behaviour-driven.org with some good posts well worth reading.

The one thing that I found really hard when starting out with BDD (and still find a bit hard) is how to formulate those scenarios to make them suitable for BDD. Scott Bellware is a man well skilled in BDD (or Context-Specification, as he likes to call it), and his article Behavior-Driven Development in Code Magazine helped me a lot in understanding the BDD way of thinking and formulating user stories.

I would also recommend the TekPub screencast Behavior-driven Design with SpecFlow by Rob Conery. A great intro to BDD and to a tool (SpecFlow) very well suited for doing BDD in C#.

As for TDD resources, there are already a lot of good recommendations here. But I just want to point out a couple of books that I can really recommend:

I've heard that projects developed using TDD are easier to refactor because the practice yields a comprehensive set of unit tests, which will (hopefully) fail if any change has broken the code. All of the examples I've seen of this, however, deal with refactoring implementation - changing an algorithm with a more efficient one, for example.

I find that refactoring architecture is a lot more common in the early stages where the design is still being worked out. Interfaces change, new classes are added & deleted, even the behavior of a function could change slightly (I thought I needed it to do this, but it actually needs to do that), etc... But if each test case is tightly coupled to these unstable classes, wouldn't you have to be constantly rewriting your test cases each time you change a design?

Under what situations in TDD is it okay to alter and delete test cases? How can you be sure that altering the test cases doesn't break them? Plus, it seems that having to synchronize a comprehensive test suite with constantly changing code would be a pain. I understand that the unit test suite could help tremendously during maintenance, once the software is built, stable, and functioning, but that's late in the game, whereas TDD is supposed to help early on as well.

Lastly, would a good book on TDD and/or refactoring address these sort of issues? If so, which would you recommend?

I would recommend (as others have):

Is it appropriate to use the double type to store percentage values (for example a discount percentage in a shop application) or would it be better to use the decimal type?

Floating-point types (float and double) are particularly ill-suited to financial applications.

Financial calculations are almost always decimal, while floating-point types are almost always binary. Many common values that are easy to represent in decimal are impossible to represent in binary. For example, 0.2d = 0.00110011...b. See http://en.wikipedia.org/wiki/Binary_numeral_system#Fractions_in_binary for a good discussion.
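You can see the effect directly; here is a small Java demonstration (BigDecimal plays the role that decimal plays in C#):

import java.math.BigDecimal;

public class BinaryFractionDemo {
    public static void main(String[] args) {
        // 0.1 and 0.2 have no exact binary representation, so their
        // double sum is not exactly 0.3.
        System.out.println(0.1 + 0.2);        // prints 0.30000000000000004
        System.out.println(0.1 + 0.2 == 0.3); // prints false

        // Decimal arithmetic represents these values exactly.
        BigDecimal sum = new BigDecimal("0.1").add(new BigDecimal("0.2"));
        System.out.println(sum);                                       // prints 0.3
        System.out.println(sum.compareTo(new BigDecimal("0.3")) == 0); // prints true
    }
}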

It's also worth talking about how you're representing prices in your system. decimal is a good choice, but floating point is not, for reasons listed above. Because you believe in Object Oriented Programming, you're going to wrap that decimal in a new Money type, right? A nice treatment of money comes in Kent Beck's Test Driven Development by Example.

Perhaps you will consider representing percentages as an integer, and then dividing by 100 every time you use it. However, you are setting yourself up for bugs (oops, I forgot to divide) and future inflexibility (customer wants 1/10ths of a percent, so go fix every /100 to be /1000. Oops, missed one - bug.)

That leaves you with two good options, depending on your needs. One is decimal. It's great for whole percentages like 10%, but not for things like "1/3rd off today only!", as 1/3 cannot be represented exactly in decimal. You'd like it if buying 3 of something at 1/3rd off came out to a whole number, right?

Another is to use a Fraction type, which stores an integer numerator and denominator. This allows you to represent exact values for all rational numbers. Either implement your own Fraction type or pick one up from a library (search the internet).
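A minimal sketch of such a type (hypothetical; a real library class would also reduce by the GCD, guard against overflow, and implement equals/hashCode):

// Immutable Fraction sketch: exact arithmetic for rational discounts
// like 1/3, with no rounding anywhere.
public final class Fraction {
    private final long numerator;
    private final long denominator;

    public Fraction(long numerator, long denominator) {
        if (denominator == 0) {
            throw new IllegalArgumentException("denominator must not be zero");
        }
        this.numerator = numerator;
        this.denominator = denominator;
    }

    // (a/b) * (c/d) = (a*c)/(b*d); a price of 3 with a 1/3 discount
    // applied works out exactly, as 3/1 * 1/3 = 3/3.
    public Fraction times(Fraction other) {
        return new Fraction(numerator * other.numerator,
                            denominator * other.denominator);
    }

    // True when the fraction is a whole number, e.g. 3/3 or 6/3.
    public boolean isWhole() {
        return numerator % denominator == 0;
    }

    @Override
    public String toString() {
        return numerator + "/" + denominator;
    }
}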

Intent

I am looking for the following:

  • A solid unit testing methodology
    1. What am I missing from my approach?
    2. What am I doing wrong?
    3. What am I doing which is unnecessary?
  • A way to get as much as possible done automatically

Current environment

  • Eclipse as IDE
  • JUnit as a testing framework, integrated into Eclipse
  • Hamcrest as a "matchers" library, for better assertion readability
  • Google Guava for precondition validation

Current approach

Structure

  • One test class per class to test
  • Method testing grouped in static nested classes
  • Test method naming to specify behaviour tested + expected result
  • Expected exceptions specified by Java Annotation, not in method name

Methodology

  • Watch out for null values
  • Watch out for empty List<E>
  • Watch out for empty String
  • Watch out for empty arrays
  • Watch out for object state invariants altered by code (post-conditions)
  • Methods accept documented parameter types
  • Boundary checks (e.g. Integer.MAX_VALUE, etc...)
  • Documenting immutability through specific types (e.g. Google Guava ImmutableList<E>)
  • ... is there a list for this? Examples of nice-to-have testing lists:
    • Things to check in database projects (e.g. CRUD, connectivity, logging, ...)
    • Things to check in multithreaded code
    • Things to check for EJBs
    • ... ?

Sample code

This is a contrived example to show some techniques.


MyPath.java

import static com.google.common.base.Preconditions.checkArgument;
import static com.google.common.base.Preconditions.checkNotNull;
import java.util.Arrays;
import com.google.common.collect.ImmutableList;
public class MyPath {
  public static final MyPath ROOT = MyPath.ofComponents("ROOT");
  public static final String SEPARATOR = "/";
  public static MyPath ofComponents(String... components) {
    checkNotNull(components);
    checkArgument(components.length > 0);
    checkArgument(!Arrays.asList(components).contains(""));
    return new MyPath(components);
  }
  private final ImmutableList<String> components;
  private MyPath(String[] components) {
    this.components = ImmutableList.copyOf(components);
  }
  public ImmutableList<String> getComponents() {
    return components;
  }
  @Override
  public String toString() {
    StringBuilder stringBuilder = new StringBuilder();
    for (String pathComponent : components) {
      stringBuilder.append("/" + pathComponent);
    }
    return stringBuilder.toString();
  }
}

MyPathTests.java

import static org.hamcrest.Matchers.is;
import static org.hamcrest.collection.IsCollectionWithSize.hasSize;
import static org.hamcrest.collection.IsEmptyCollection.empty;
import static org.hamcrest.collection.IsIterableContainingInOrder.contains;
import static org.hamcrest.core.IsEqual.equalTo;
import static org.hamcrest.core.IsNot.not;
import static org.hamcrest.core.IsNull.notNullValue;
import static org.junit.Assert.assertThat;
import org.junit.Test;
import org.junit.experimental.runners.Enclosed;
import org.junit.runner.RunWith;
import com.google.common.base.Joiner;
@RunWith(Enclosed.class)
public class MyPathTests {
  public static class GetComponents {
    @Test
    public void componentsCorrespondToFactoryArguments() {
      String[] components = { "Test1", "Test2", "Test3" };
      MyPath myPath = MyPath.ofComponents(components);
      assertThat(myPath.getComponents(), contains(components));
    }
  }
  public static class OfComponents {
    @Test
    public void acceptsArrayOfComponents() {
      MyPath.ofComponents("Test1", "Test2", "Test3");
    }
    @Test
    public void acceptsSingleComponent() {
      MyPath.ofComponents("Test1");
    }
    @Test(expected = IllegalArgumentException.class)
    public void emptyStringVarArgsThrows() {
      MyPath.ofComponents(new String[] { });
    }
    @Test(expected = NullPointerException.class)
    public void nullStringVarArgsThrows() {
      MyPath.ofComponents((String[]) null);
    }
    @Test(expected = IllegalArgumentException.class)
    public void rejectsInterspersedEmptyComponents() {
      MyPath.ofComponents("Test1", "", "Test2");
    }
    @Test(expected = IllegalArgumentException.class)
    public void rejectsSingleEmptyComponent() {
      MyPath.ofComponents("");
    }
    @Test
    public void returnsNotNullValue() {
      assertThat(MyPath.ofComponents("Test"), is(notNullValue()));
    }
  }
  public static class Root {
    @Test
    public void hasComponents() {
      assertThat(MyPath.ROOT.getComponents(), is(not(empty())));
    }
    @Test
    public void hasExactlyOneComponent() {
      assertThat(MyPath.ROOT.getComponents(), hasSize(1));
    }
    @Test
    public void hasExactlyOneInboxComponent() {
      assertThat(MyPath.ROOT.getComponents(), contains("ROOT"));
    }
    @Test
    public void isNotNull() {
      assertThat(MyPath.ROOT, is(notNullValue()));
    }
    @Test
    public void toStringIsSlashSeparatedAbsolutePathToInbox() {
      assertThat(MyPath.ROOT.toString(), is(equalTo("/ROOT")));
    }
  }
  public static class ToString {
    @Test
    public void toStringIsSlashSeparatedPathOfComponents() {
      String[] components = { "Test1", "Test2", "Test3" };
      String expectedPath =
          MyPath.SEPARATOR + Joiner.on(MyPath.SEPARATOR).join(components);
      assertThat(MyPath.ofComponents(components).toString(),
          is(equalTo(expectedPath)));
    }
  }
  @Test
  public void testPathCreationFromComponents() {
    String[] pathComponentArguments = new String[] { "One", "Two", "Three" };
    MyPath myPath = MyPath.ofComponents(pathComponentArguments);
    assertThat(myPath.getComponents(), contains(pathComponentArguments));
  }
}

Question, phrased explicitly

  • Is there a list of techniques to use to build a unit test? Something much more advanced than my oversimplified list above (e.g. check nulls, check boundaries, check expected exceptions, etc.) perhaps available in a book to buy or a URL to visit?

  • Once I have a method that takes a certain type of parameters, can I get any Eclipse plug-in to generate a stub for my tests for me? Perhaps using a Java Annotation to specify metadata about the method and having the tool materialise the associated checks for me? (e.g. @MustBeLowerCase, @ShouldBeOfSize(n=3), ...)

I find it tedious and robot-like to have to remember all of these "QA tricks" and/or apply them, I find it error-prone to copy and paste and I find it not self-documenting when I code things as I do above. Admittedly, Hamcrest libraries go in the general direction of specialising types of tests (e.g. on String objects using RegEx, on File objects, etc) but obviously do not auto-generate any test stubs and do not reflect on the code and its properties and prepare a harness for me.

Help me make this better, please.

PS

Do not tell me that I am just presenting code which is a silly wrapper around the concept of creating a Path from a list of path steps supplied in a static factory method. This is a totally made-up example, but it shows a "few" cases of argument validation. If I included a much longer example, who would really read this post?

  1. Consider using ExpectedException instead of @Test(expected=...). If, for example, you expect a NullPointerException and your test throws that exception during setup (before calling the method under test), your test will still pass. With ExpectedException you declare the expectation immediately before the call to the method under test, so there is no chance of this. ExpectedException also allows you to test the exception message, which is helpful if there are two different IllegalArgumentExceptions that might be thrown and you need to check for the correct one (see the sketch after this list).

  2. Consider isolating your method under test from the setup and verify, this will ease in test review and maintenance. This is especially true when methods on the class under test are invoked as part of setup which can confuse which is the method under test. I use the following format:

    public void test() {
       //setup
       ...
    
       // test (usually only one line of code in this block)
       ...
    
       //verify
       ...
    }
    
  3. Books to look at: Clean Code, JUnit In Action, Test Driven Development By Example

    Clean Code has an excellent section on testing

  4. Most examples I have seen (including what Eclipse autogenerates) have the method under test in the title of the test. This facilitates review and maintenance. For example: testOfComponents_nullCase. Your example is the first I have seen that uses Enclosed to group methods by method under test, which is really nice. However, it adds some overhead, as @Before and @After do not get shared between enclosed test classes.

  5. I have not started using it, but Guava has a test library: guava-testlib. I have not had a chance to play with it, but it seems to have some cool stuff. For example, NullPointerTester, whose description reads:

  • A test utility that verifies that your methods throw {@link NullPointerException} or {@link UnsupportedOperationException} whenever any of their parameters are null. To use it, you must first provide valid default values for the parameter types used by the class.
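Here is the kind of sketch item 1 refers to, using JUnit 4's ExpectedException rule (the Widget class is a hypothetical example):

import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;

public class WidgetTest {

    // Hypothetical class under test.
    static class Widget {
        void setSize(int size) {
            if (size < 0) {
                throw new IllegalArgumentException("size must be non-negative");
            }
        }
    }

    @Rule
    public ExpectedException thrown = ExpectedException.none();

    @Test
    public void setSizeRejectsNegativeValues() {
        Widget widget = new Widget(); // an exception here would still FAIL the test

        // Expectation declared immediately before the call under test;
        // the message check disambiguates between similar exceptions.
        thrown.expect(IllegalArgumentException.class);
        thrown.expectMessage("size");
        widget.setSize(-1);
    }
}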

Review: I realize MyPathTests above was just an example, but since a constructive review might be helpful, here you go.

  1. In testing getComponents, test the empty list case as well. Also, use IsIterableContainingInOrder.

  2. In testing ofComponents, it seems that it would make sense to call getComponents or toString to validate that it properly handled the various non-error cases. There should be a test where no argument is passed to ofComponents. I see that this is done with ofComponents(new String[]{}), but why not just do ofComponents()? You also need a test where null is one of the values passed, ofComponents("blah", null, "blah2"), since this will throw an NPE.

  3. In testing ROOT, as has been pointed out before, I suggest calling ROOT.getComponents once and doing all three verifications on it. Also, IsIterableContainingInOrder does all three of not-empty, size and contains. The is in the tests is extraneous (although it reads linguistically) and I feel it is not worth having (IMHO).

  4. In testing toString, I feel it is very helpful to isolate the method under test. I would have written toStringIsSlashSeparatedPathOfComponents as follows. Notice that I do not use the constant from the class under test. This is because IMHO, ANY functional change to the class under test should cause the test to fail.

    @Test
    public void toStringIsSlashSeparatedPathOfComponents() {
       //setup
       String[] components = { "Test1", "Test2", "Test3" };
       String expectedPath = "/" + Joiner.on("/").join(components);
       MyPath path = MyPath.ofComponents(components);

       // test
       String value = path.toString();

       // verify
       assertThat(value, equalTo(expectedPath));
    }
    
    
  5. Enclosed will not run any unit test that is not in an inner class. Therefore testPathCreationFromComponents would not be run.

Finally, use Test Driven Development. This will ensure that your tests are passing for the right reason and will fail as expected.

Does anyone know of where to find unit testing guidelines and recommendations? I'd like to have something which addresses the following types of topics (for example):

  • Should tests be in the same project as application logic?
  • Should I have test classes to mirror my logic classes or should I have only as many test classes as I feel I need to have?
  • How should I name my test classes, methods, and projects (if they go in different projects)?
  • Should private, protected, and internal methods be tested, or just those that are publicly accessible?
  • Should unit and integration tests be separated?
  • Is there a good reason not to have 100% test coverage?

What am I not asking about that I should be?

An online resource would be best.

I would recommend Kent Beck's book on TDD.

Also, you need to go to Martin Fowler's site. He has a lot of good information about testing as well.

We are pretty big on TDD so I will answer the questions in that light.

Should tests be in the same project as application logic?

Typically we keep our tests in the same solution, but we break tests into separate DLLs/projects that mirror the DLLs/projects they are testing, while maintaining namespaces, with the tests being in a sub-namespace. Example: Common / Common.Tests

Should I have test classes to mirror my logic classes or should I have only as many test classes as I feel I need to have?

Yes, your tests should be created before any classes are created, and by definition you should only test a single unit in isolation. Therefore you should have a test class for each class in your solution.

How should I name my test classes, methods, and projects (if they go in different projects)

I like to emphasize that behavior is what is being tested so I typically name test classes after the SUT. For example if I had a User class I would name the test class like so:

public class UserBehavior

Methods should be named to describe the behavior that you expect.

public void ShouldBeAbleToSetUserFirstName()

Projects can be named however you want but usually you want it to be fairly obvious which project it is testing. See previous answer about project organization.

Should private, protected, and internal methods be tested, or just those that are publicly accessible?

Again, you want tests to assert expected behavior as if you were a third-party consumer of the objects being tested. If you test internal implementation details, then your tests will be brittle. You want your tests to give you the freedom to refactor without worrying about breaking existing functionality. If your tests know about implementation details, then you will have to change your tests whenever those details change.

Should unit and integration tests be separated?

Yes, unit tests need to be isolated from acceptance and integration tests. Separation of concerns applies to tests as well.

Is there a good reason not to have 100% test coverage?

I wouldn't get too hung up on the 100% code coverage thing. 100% code coverage tends to imply some level of quality in the tests, but that is a myth. You can have terrible tests and still get 100% coverage. I would instead rely on a good test-first mentality. If you always write a test before you write a line of code, then you will ensure 100% coverage, so it becomes a moot point.

In general if you focus on describing the full behavioral scope of the class then you will have nothing to worry about. If you make code coverage a metric then lazy programmers will simply do just enough to meet that mark and you will still have crappy tests. Instead rely heavily on peer reviews where the tests are reviewed as well.

I strongly recommend you read Test Driven Development: By Example and Test-Driven Development: A Practical Guide. That's too many questions for a single topic.

Where can I find good literature on unit testing? Book titles and links are welcome.

Update: Here is a list of books mentioned in answers below

xUnit Test Patterns: Refactoring Test Code

Growing Object-Oriented Software Guided by Tests

The Art Of Unit Testing

The real challenge of software testing is solving the puzzle of test design.

Testing Object-Oriented Systems: Models, Patterns, and Tools provides three dozen test design patterns applicable to unit test design. It also provides many design patterns for test automation. These patterns distill many hard-won best practices and research insights.

Pragmatic Unit Testing

Test Driven Development: By Example

I have been using Ruby exclusively for about a month and I really love it. However, I am having an incredibly hard time using, or even learning, TDD. My brain just doesn't function that way...

I really, really want to learn TDD, but to be honest I am a bit confused. All the articles that I find when Googling around are mostly specific to Rails, which is not interesting to me, because I want to learn how to do efficient testing for any Ruby application, from the simple one-file script to the complicated gem, not for web apps. Also, there are so many frameworks and so few tutorials to get started.

Could anybody give me any advice on how to learn TDD so that I can at least start to consider myself an aspiring rubyist?

It's tricky getting your head around TDD (and BDD), but The RSpec Book helped me a lot. Behaviour-driven development is not exactly the same thing as TDD, but it is close, and you have to think in a similar way.

I still recommend TDD by Example by Kent Beck. It's an easy read and gives you all the basics.

I have recently (in the last week) embarked on an experiment wherein I attempt to code a new feature in a project I'm working on using TDD principles. In the past, our approach has been a moderately-agile approach, but with no great rigour. Unit testing happens here and there when it's convenient. The main barrier to comprehensive test coverage is that our application has a complicated web of dependencies. I picked a feature that was convenient to wall off to try my experiment on; the details aren't important and probably commercially sensitive, suffice to say that it's a simple optimisation problem.

Thus far I have found that:

  • TDD for me seems to encourage rambling, non-obvious designs to take shape. The restriction that one must not write code without a test tends to block opportunities to factor out functionality into independent units. Thinking up and writing tests for that many features simultaneously is too difficult in practice
  • TDD tends to encourage the creation of 'God Objects' that do everything - because you've written lots of mocking classes for class x already, but few for class y, so it seems logical at the time that class x should also implement feature z instead of leaving it to class y.
  • Writing tests before you write code requires that you have a complete understanding of every intricacy of the problem before you solve it. This seems like a contradiction.
  • I haven't been able to get the team on-side to start using a mocking framework. This means that there is a proliferation of cruft created solely to test a particular feature. For every method tested, you'll tend to need a fake whose only job is to report that the class under test called whatever it's supposed to. I'm starting to find myself writing something resembling a DSL purely for instantiating the test data.
  • Despite the above concerns, TDD has produced a working design with few mysterious errors, unlike the development pattern I'm used to. Refactoring the sprawling mess that results however has required that I temporarily abandon the TDD and just get it done. I'm trusting that the tests will continue to enforce correctness in the method. Trying to TDD the refactoring exercise I feel will just proliferate more cruft.

The question then, is "Does anybody have any advice to reduce the impact of the concerns listed above?". I have no doubt that a mocking framework would be advantageous; however at present I'm already pushing my luck trying something that appears to merely produce rambling code.

edit #1:

Thank you all for your considered answers. I admit that I wrote my question after a few Friday-evening beers, so in places it's vague and doesn't really express the sentiments I intended. I'd like to emphasise that I do like the philosophy of TDD, and have found it moderately successful, but also surprising for the reasons I listed. I have the opportunity to sleep on it and look at the problem again with fresh eyes next week, so perhaps I'll be able to resolve my issues by muddling through. None of them are non-starters, however.

What concerns me more is that some of the team members are resistant to trying anything that you could call a 'technique' in favour of 'just getting it done'. I am concerned that the appearance of cruft will be taken as a black mark against the process, rather than evidence that it needs to be done completely (i.e. with a mocking framework and strong DI) for best results.

RE "TDD doesn't have to mean test-first": (womp, btreat)

The 'golden rule' in every text I've found on the issue is "Red, Green, Refactor" (a minimal example follows the list below). That is:

  • Write a test that MUST fail
  • Write code that passes the test
  • Refactor the code so that it passes the test in the neatest practical way
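To make the cycle concrete, here is one minimal pass through red/green/refactor - a hypothetical C# example (PriceCalculator is invented for illustration), using MSTest syntax:

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    // RED: this test is written first and MUST fail - at this point
    // PriceCalculator either doesn't exist or returns the wrong total.
    [TestClass]
    public class PriceCalculatorTests
    {
        [TestMethod]
        public void TwoItemsAtFivePoundsCostTen()
        {
            var calc = new PriceCalculator();
            Assert.AreEqual(10m, calc.Total(unitPrice: 5m, quantity: 2));
        }
    }

    // GREEN: the simplest code that makes the test pass.
    public class PriceCalculator
    {
        public decimal Total(decimal unitPrice, int quantity)
        {
            return unitPrice * quantity;
        }
    }

    // REFACTOR: with the bar green, rename, extract and tidy freely,
    // re-running the test after each small step.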

I am curious as to how one imagines doing Test-Driven Development without following the core principle of TDD as originally written. My colleague calls the halfway house (or a different and equally valid approach, depending on your perspective) "Test-Validated Development". In this case I think coining a new term - or possibly stealing it off somebody else and taking credit for it - is useful.

RE DSLs for test data: (Michael Venable)

I'm glad you said that. I do see the general form being increasingly useful across the scope of the project, as the application in question maintains a pretty complicated object graph and typically, testing it means running the application and trying things out in the GUI. (Not going to give the game away for commercial sensitivity reasons above, but it's fundamentally to do with optimisation of various metrics on a directed graph. However, there are lots of caveats and user-configurable widgets involved.)

Being able to set up a meaningful test case programmatically will help in all manner of situations, potentially not limited to unit testing.

RE God Objects:

I felt this way because one class seemed to be taking up most of the feature-set. Maybe this is fine, and it really is that important, but it raised a few eyebrows because it looked just like older code that wasn't developed along these lines, and appeared to violate SRP. I suppose it's inevitable that some classes will function primarily as seams between numerous different encapsulated bits of functionality and others will seam only a few. If it's going to be that way, I suppose what I need to do is purge as much of the logic as possible from this apparent God Object and recast its behaviour as a junction point between all the factored-out parts.

(to the moderators: I've added my responses to posts up here because the comment field isn't long enough to contain the detail I'd like.)

edit #2 (after about five months):

Well, I felt it might be nice to update with some more thoughts after mulling the issue over for a while.

I'm sorry to say I did end up abandoning the TDD approach. However, I feel there are some specific and justified reasons for this, and I'm ready to continue with it the next time I get an opportunity.

A consequence of the unapologetic refactoring mentality of TDD is that I was not greatly upset when, upon taking a brief look at my code, the lead dev declared that the vast majority of it was pointless and needed to go. While there is a twinge of regret at having to cast off a huge swathe of hard work, I saw exactly what he meant.

This situation had arisen because I took the rule of 'code to an interface' literally, but continued to write classes that tried to represent reality. Quite a long time ago I first made the statement:

Classes should not attempt to represent reality. The object model should only attempt to solve the problem at hand.

...which I have repeated as often as I can since; to myself and to anybody else who will listen.

The result of this behaviour was an object model of classes that performed a function, and a mirroring set of interfaces which repeated the functionality of the classes. Having had this pointed out to me, and after a brief but intense period of resistance, I saw the light and had no problem with deleting most of it.

That doesn't mean that I believe 'code to an interface' is bunk. What it does mean is that coding to an interface is primarily valuable when the interfaces represent real business functions, rather than the properties of some imagined perfect object model that looks like a miniature copy of real life, but doesn't consider its sole meaning in life to be answering the question you originally asked. The strength of TDD is that it can't produce models like this, except by chance. Since it starts with asking a question and only cares about getting an answer, your ego and prior knowledge of the system aren't involved.

I'm rambling now, so I'd do best to finish this and just state that I am all raring to go at trying TDD again, but have a better overview of the tools and tactics available and will do my best to decide how I want to go about it before jumping in. Perhaps I should transplant this waffle to a blog where it belongs, once I have something more to say about it.

To begin with, try reviewing your method against a reference book (e.g. Beck's book). When you're learning something, follow the rules without questioning them; a common mistake is to adapt a method prematurely, without understanding the implications of your changes.
E.g. (as Carl posted) the books I've read advocate writing one unit test at a time and watching it fail before filling in the implementation.

Once it passes, you need to "refactor". Small word, but big implications - it's the make-or-break step. You improve your design in a flurry of little steps. However, TDD is no substitute for experience, which stems from practice. So an experienced programmer with or without TDD may still end up producing better code than a novice with TDD - because he/she knows what to look out for. So how do you get there? You learn from the people who have been doing it a while.

  • I'd recommend Beck's TDD By Example book first. (Freeman and Pryce's GOOS book is good but you'd get better value from it once you've been doing TDD for a while.)
  • For tapping into the gurus' minds, check out Clean Code by Bob Martin. It gives you simple heuristics to evaluate your choices against. I'm at chapter 3; I've even set up a group-reading exercise at work. As the book says, clean code is equal measures of discipline, technique and "code-sense".

I'm looking for resources that provide an actual lesson plan or path to encourage and reinforce programming practices such as TDD and mocking. There are plenty of resources that show examples, but I'm looking for something that actually provides a progression that allows the concepts to be learned instead of forcing emulation.

My primary goal is speeding up the process for someone to understand the concepts behind TDD and actually be effective at implementing them. Are there any free resources like this?

Books are always a good resource. Even though they aren't free, they may well be worth the money, given the time you'd otherwise spend hunting for good free resources.

"Test driven development by example" by Kent Beck.

"Test Driven Development in Microsoft .NET" by James W. Newkirk and Alexei A. Vorontsov

please feel free to add to this list

I'm working with a Python development team who is experienced with programming in Python, but is just now trying to pick up TDD. Since I have some experience working with TDD myself, I've been asked to give a presentation on it. Mainly, I'm just wanting to see articles on this so that I can see how other people are teaching TDD and get some ideas for material to put in my presentation.

Preferably, I'd like the intro to be for Python, but any language will do as long as the examples are easy to read and the concepts transfer to Python easily.

Kent Beck's book gives some examples in Java and some in Python (to be honest, Kent doesn't strike me as a superstar in either language, judging from the example code in this book... but he definitely comes across as a superstar in TDD &c - as well he should, given that he basically invented it as well as extreme programming; see his wikipedia entry).

I am doing my first steps with TDD. The problem is (as probably with everyone starting with TDD), I never know very well what kind of unit tests to do when I start working in my projects.

Let's assume I want to write a Stack class with the following methods(I choose it as it's an easy example):

Stack<T>
 - Push(element : T)
 - Pop() : T
 - Peek() : T
 - Count : int
 - IsEmpty : boolean

How would you approach this? I never understood whether the idea is to test a few corner cases for each method of the Stack class, or to start by doing a few "use cases" with the class, like adding 10 elements and removing them. What is the idea? To write code that uses the Stack as close as possible to how I'll use it in my real code? Or just to make simple "add one element" unit tests where I test whether IsEmpty and Count were changed by adding that element?

How am I supposed to start with this?

EDIT

Here's my rough tests' implementation:

    [TestMethod]
    public void PushTests() {
        StackZ<string> stackz = new StackZ<string>();

        for (int i = 0; i < 5; ++i) {
            int oldCount = stackz.Count;
            stackz.Push(i.ToString());
            int newCount = stackz.Count;
            Assert.AreEqual(oldCount + 1, newCount);
            Assert.IsFalse(stackz.IsEmpty);
        }
    }

    [TestMethod, ExpectedException(typeof(InvalidOperationException))]
    public void PeekTestsWhenEmpty() {
        StackZ<double> stackz = new StackZ<double>();
        stackz.Peek();
    }

    [TestMethod]
    public void PeekTestsWhenNotEmpty() {
        StackZ<int> stackz = new StackZ<int>();
        stackz.Push(5);

        int firstPeekValue = stackz.Peek();

        for (int i = 0; i < 5; ++i) {
            Assert.AreEqual(firstPeekValue, stackz.Peek());
        }
    }

    [TestMethod, ExpectedException(typeof(InvalidOperationException))]
    public void PopTestsWhenEmpty() {
        StackZ<float> stackz = new StackZ<float>();
        stackz.Pop();
    }

    [TestMethod]
    public void PopTestsWhenNotEmpty() {
        StackZ<int> stackz = new StackZ<int>();

        for (int i = 0; i < 5; ++i) {
            stackz.Push(i);
        }

        for (int i = 4; i >= 0; --i) {
            int oldCount = stackz.Count;
            int popValue = stackz.Pop();
            Assert.AreEqual(i, popValue);
            int newCount = stackz.Count;
            Assert.AreEqual(oldCount, newCount + 1);
        }

        Assert.IsTrue(stackz.IsEmpty);
    }

Any corrections/ideas about it? Thanks

If you read Kent Beck's book on Test-Driven Development, you might have noticed an idea that recurs frequently throughout: you should write tests for what you are currently missing. As long as you do not need something, do not write tests for it and do not implement it.

As long as your implementation of the Stack class fits your needs, you do not need to implement it thoroughly. Under the hood, it can even return constants to you or do nothing.

Testing should not become overhead to your development, it should speed up your development instead, supporting you when you do not want to keep everything in your head.

The main advantage of TDD is that it makes you write code that is testable in a few lines, because usually you do not want to write 50 lines of code to test a method. You become more concerned with interfaces and the distribution of functionality among classes because, once again, you do not want to write 50 lines of code to test a method.

Having said that, I can tell you that it is neither interesting nor particularly useful to learn TDD by writing unit tests for super-useful interfaces that were refined through the suffering of several generations of developers. You will just not feel anything exciting. Instead, take any class from an application written by you and try to write tests for it. Refactoring it will give you much pleasure.

I've been hearing about test-driven development for a couple of years now, and I never really paid much attention to it on a practical level until recently, when I started getting more interested in .NET MVC. I've been playing around a lot with the MVC Storefront Sample, and I am realizing just how cool and helpful the test-driven approach can be. However, I've been programming using a "test last" approach for a long time now, and when it comes down to business, I can always best estimate my effort with the approach that I am most familiar with.

I'm guessing that learning how to use the test-driven approach is less like learning another programming language and more a change in how you approach laying the framework for, and planning the requirements of, an application. I don't think I could just pick up a book and start a project for one of my clients using TDD; I'm guessing my introduction to it needs to be more methodical.

What is the best way for me to shift my mind-set of planning to build an application so I can become effective with test-driven development in the shortest amount of time?

You need to practice.

You can start with test-first programming. Design the code just as you usually do, perhaps not in deep detail, and implement its tests first: start with a class that has no dependencies, see how it can be tested, and write down a list of the tests you can think of. Write the simplest test first, then write just enough code to make it pass. Cross that test off your list, pick the next one, write it, then write the code.
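As a sketch of what that first pass can look like in C# (MSTest; the RomanNumeral example is invented for illustration, not taken from the book):

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class RomanNumeralTests
    {
        // The test list, written down before any code:
        //   [x] 1 -> "I"
        //   [ ] 2 -> "II"
        //   [ ] 4 -> "IV"
        //   [ ] 0 -> throws
        [TestMethod]
        public void One_ConvertsToI()
        {
            Assert.AreEqual("I", RomanNumeral.From(1));
        }
    }

    // Just enough code to make the first test pass; the next test on
    // the list will force a real implementation.
    public static class RomanNumeral
    {
        public static string From(int value)
        {
            return "I";
        }
    }

The point is the rhythm: one failing test, the least code that passes it, cross it off the list, pick the next.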

When you have an idea for a new test or ask yourself a question about how the code behaves under certain condition, add a new test to your list.

I'd recommend you read Test Driven Development ; it's a very good introduction to TDD and also contains a lot of reference materials (called patterns).

Regarding the estimates, one thing to keep in mind is that although writing the code and the tests simultaneously takes slightly longer than just writing the code, you end up with code that works.

Some more advice, once you'll get started:

  • Add a new failing test before fixing any problem in the code.

  • Strive never to write any line of code without having a failing test - that's the ultimate goal.

What is a quality real world example of TDD in action? What small-to-medium open source projects in .net are out there that show off best practice in TDD and might work as a reference for doing TDD right?


I am looking more for examples of living, breathing projects that serve as good examples of TDD - something to supplement the books and references which explain and demonstrate the process in isolation.

Something that would be helpful for an aspiring developer who wants to level up from beginner/intermediate TDD practitioner...

If you are looking into this for personal usage of TDD, I don't think browsing other projects will really help you. Instead, you should dive in and just do it, and learn as you go.

To get started, check out this article about testing and how to get started, and this book about TDD by Kent Beck (very simple, easy to understand and insightful).

I'm the sole developer for an academic consortium headquartered at a university in the northeast. All of my development work involves internal tools, mostly in Java, so nothing that is released to the public. Right now, I feel like my development workflow is very "hobbyist" and is nothing like you would see at an experienced software development firm. I would be inclined to say that it doesn't really matter since I'm the only developer anyway, but it can't hurt to make some changes, if for no other reason than to make my job a little easier and get a few more technologies on my resume. Right now my workflow is something like this:

  • I do most of my development work in Eclipse on my laptop. Everything is saved locally on my laptop, and I don't use a VCS, nor do I really backup my code (except for occasionally emailing it to myself so I can see it on a different computer - yeah, I told you my development environment needs work).

  • When I'm done with a project and want to deploy it or if I just want to test it, I use the built-in Jar tool in Eclipse to make an executable .jar of my project. If I use external .jar libraries, I use the Fat-Jar plugin to include those .jars in my executable .jar.

  • After I create the .jar, I manually upload it to the server via SFTP and test it with something like java -jar MyProject.jar.

Oh yeah, did I mention that I don't unit test?

The most glaringly obvious problem that I'd like to fix first is my lack of source control. I like git because of its distributed nature, but it doesn't seem to integrate with Eclipse well, and I've heard that it doesn't work very well on Windows, which is my primary development OS. So I'm leaning toward SVN, which I do have some experience with. I do have my own personal server, and I think I'll use that for my source control, because I'd rather be my own admin than have to deal with university bureaucracy. I had some trouble setting up SVN once before, but I'll give it another shot. Maybe I'll also install something like Trac or Redmine for bug tracking, a todo list, etc.?

What about building and deployment? There has to be a better way than using Fat-Jar and manually uploading my jar to the server. I've heard about tools like Ant and Maven - do these apply to what I want to do? How can I get started using those?

I suppose I'd eventually like to integrate unit testing with JUnit too. Even though it probably should be, that is not my primary concern right now, because so far my applications aren't terribly complex. I'd really like to work on simplifying and streamlining my workflow right now, and then I'll ease into unit testing.

Sorry for the long question. I guess my question boils down to, for a sole developer, what tools and methodologies can/should I be using to not only make my job easier, but also just to expose myself to some technologies that would be expected requisite knowledge at a dedicated development house?


edit: Thanks for the great answers so far. I didn't mean to suggest that I wanted to make my workflow "enterprisey" just for the sake of doing it, but to make my job simpler and to get a few technologies under my belt that are typically used in enterprise development environments. That's all I meant by that.

Like others have said, you already clearly know what you need to do. A VCS is a must, CI or bug-tracking may be overkill (for a single developer a spreadsheet might suffice for bug-tracking).

One thing that might benefit you greatly is keeping an organized product backlog. In solo development, I find keeping focused on the high-priority features and avoiding feature creep to be one of my biggest challenges. Keeping a backlog helps immensely. It doesn't have to be much more than a prioritized list of features with some notes about the scope of each. At my workplace, we keep this info in Trac, but here again, a spreadsheet may be all you need.

And I want to put in a plug for unit testing, particularly Test Driven Development (TDD). Kent Beck's book is a good place to start. I find that TDD helps keep me honest and focused on what I really need to do, particularly on a single-developer project without QA. Sometimes it seems like the code writes itself.

I'm working on some code that includes database access. Does test-driven development include integration tests as well as the usual unit tests?

Thanks!

AFAIK, TDD originally didn't distinguish between unit tests and integration tests. It remains that an integration test is generally much more costly in terms of the resources you need to set up, which is why mocks were identified as a good practice even in early TDD literature.

From Test-Driven Development By Example ("Mock object" pattern) :

The solution is not to use a real database most of the time

Still, that shouldn't prevent you from writing a few other tests that verify that your production code plays well with the real database or other expensive resource in question, if needed:

What if the mock object doesn't behave like the real object? You can reduce this risk by having a set of tests for the Mock Object that can also be applied to the real object when it becomes available.
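One common way to act on that advice is a shared "contract" test fixture: the same assertions run against the in-memory fake and, when it becomes available, the real resource. A rough C# sketch with made-up types (Order, IOrderStore and friends are illustrative, not from any particular library):

    using System.Collections.Generic;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    public class Order { public int Id { get; set; } }

    public interface IOrderStore
    {
        void Save(Order order);
        Order Load(int id);
    }

    // Fast fake used by most unit tests.
    public class InMemoryOrderStore : IOrderStore
    {
        private readonly Dictionary<int, Order> _orders = new Dictionary<int, Order>();
        public void Save(Order order) { _orders[order.Id] = order; }
        public Order Load(int id) { return _orders[id]; }
    }

    // The contract: assertions every implementation must satisfy.
    public abstract class OrderStoreContractTests
    {
        protected abstract IOrderStore CreateStore();

        [TestMethod]
        public void SavedOrderCanBeLoadedBack()
        {
            var store = CreateStore();
            store.Save(new Order { Id = 1 });
            Assert.AreEqual(1, store.Load(1).Id);
        }
    }

    [TestClass]
    public class InMemoryOrderStoreTests : OrderStoreContractTests
    {
        protected override IOrderStore CreateStore() { return new InMemoryOrderStore(); }
    }

    // The slow integration variant would subclass the same contract and
    // return a store wired to the real database, e.g.:
    // [TestClass]
    // public class SqlOrderStoreTests : OrderStoreContractTests { ... }

The integration subclass runs rarely and slowly; the in-memory one runs on every build. Both must honour the same contract.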

All in all, I guess the whole integration vs unit test thing is orthogonal to TDD. In other words : having a small red/green/refactor feedback loop as your atomic building block doesn't determine which flavor of overall application development workflow you should pick or which other feedback loops should surround it - it could be acceptance driven as @lazyberezovsky explained, outside-in or inside-out, integration-centered or isolation-centered, etc, as long as you remain truthful to the test-first approach.

I’ve almost 6 years of experience in application development using .net technologies. Over the years I have improved as a better OO programmer but when I see code written by other guys (especially the likes of Jeffrey Richter, Peter Golde, Ayende Rahien, Jeremy Miller etc), I feel there is a generation gap between mine and their designs. I usually design my classes on the fly with some help from tools like ReSharper for refactoring and code organization.

So, my question is “what does it take to be a better OO programmer”. Is it

a) Experience

b) Books (reference please)

c) Process (tdd or uml)

d) patterns

e) anything else?

And how should one validate that a design is good, easy to understand, and maintainable? With so many buzzwords in the industry (dependency injection, IoC, MVC, MVP, etc.), where should one concentrate most in design? I feel abstraction is the key. What else?

To have your design reviewed by someone is quite important. Reviewing and maintaining legacy code helps you realize what makes software rot. Thinking is also very important; on the one hand, don't rush into implementing the first idea. On the other hand, don't try to think of everything at once. Do it iteratively.

Regular reading of books/articles, like Eric Evans's Domain-Driven Design, or learning new languages (Smalltalk, Self, Scala) that take a different approach to OO, helps you to really understand.

Software, and OO, is all about abstractions, responsibilities, dependencies and duplication (or lack of it). Keep them on your mind on your journey, and your learning will be steady.

It takes being a better programmer to be a better OO programmer.

OO has been evolving over the years, and it has a lot to do with changing paradigms and technologies like n-tier architecture, garbage collection, and Web Services - the kinds of things you've already seen. There are fundamental principles such as maintainability, reusability, low coupling, KISS, DRY, and Amdahl's law that you have to learn about, read about, experience, and apply yourself.

OO is not an end on its own, but rather a means to achieve programming solutions. Like games, sports, and arts, practices cannot be understood without principles; and principles cannot be understood without practices.

To be more specific, here are some of the skills that may make one a better programmer. Listen to the domain experts. Know how to write tests. Know how to design a GUI desktop software. Know how to persist data into database. Separate UI layer and logic layer. Know how to write a class that acts like a built-in class. Know how to write a graphical component that acts like a built-in component. Know how to design a client/server software. Know networking, security, concurrency, and reliability.

Design patterns, MVC, UML, Refactoring, TDD, etc. address many of the issues, often extending OO in creative ways. For example, to decouple UI layer dependencies from logic layer, an interface may be introduced to wrap the UI class. From pure object-oriented point of view, it may not make much sense, but it makes sense from the point of view of separation of UI layer and logic layer.

Finally, realizing the limitations of OO is important too. In modern application architecture, the purist data + logic view of OO doesn't always mesh very well. Data transfer object (Java, MS, Fowler) for example intentionally strips away logic part of the object to make it carry only the data. This way the object can turn itself into a binary data stream or XML/JSON. The logic part may be handled both at client and server side in some way.

Something that's worked for me is reading. I just had a light-bulb moment with David West's Object Thinking, which elaborates on Alan Kay's comment that 'the object revolution has yet to happen'. OO is different things to different people... couple that with the fact that your tools influence how you go about solving a problem. So learn multiple languages.

Object Thinking David West

Personally, I think understanding the philosophy, principles and values behind a practice, rather than mimicking the practice, helps a lot.

I have a server application and I was wondering where I should start if I want to start implementing TDD and using Moq.

What good books I could read on the subject, which aren't too "web-oriented"?

I have questions on the matter, like:

Should I mock every object I want to test, or only those which I can't implement, like text writers?

My server needs a lot of setup before it can actually do anything I want to test; should I just cram that into a [TestInitialize] function?

How should I chain my tests, if I want to test deeper functionality?

One of my favorite books on TDD is Test Driven Development By Example (Kent Beck). I also really liked a 4-part screen cast he did.

Episode 1: Starter Test (28 minutes)

In this episode we take the first test for the first feature our sample application and slice it up to provide more-frequent feedback.

Episode 2: Isolated Tests (23 minutes)

In this episode we ensure that tests don’t affect each other. Once the tests are isolated we implement several new operations.

Episode 3: Big Feature (25 minutes)

In this episode we take a large feature and slice it up to provide more-frequent feedback. At the end we clean the code to remove duplication and make the code easier to read.

Episode 4: Finishing (20 minutes)

In this episode we finish the functionality of the sample application and prepare it for use by others. Design decisions that were deferred earlier in development are now clearer. The series closes with a summary of lessons from all of the episodes.

I recommend two books: Test Driven Development by Example, by Kent Beck. It's an excellent book on TDD, which I particularly enjoy because he walks through an example, which is very useful in getting a sense for the rhythm and thought process. On the other hand, it's a bit light on mocking. For that I would read The Art of Unit Testing, by Roy Osherove. As the title suggests, it's not focused on TDD specifically, but rather on how to write good unit tests; he has a good coverage on mocks and stubs.

Regarding what you should mock, the idea of mocking is to allow you to isolate the class/function you are testing from the rest of the environment, so that you can test its behavior against a fake environment you control. In that frame, you should not be mocking the class, but rather things it depends upon.

A trivial example: if you had a class using a Logger, testing that the class "writes" to the logger would be very painful, and could involve things like checking whether the logger has written in a text file. This is not a good idea on lots of levels - starting with the fact that your class doesn't care about how the logger does its job specifically. In that case you would replace the Logger instance in your class with a Fake, mocked Logger, and you can then verify that your class is calling the Logger at appropriate times, without worrying about what the logger does, exactly.
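For instance, with Moq the logger example might look something like this (ILogger and OrderProcessor are made-up types for illustration, not Moq's API):

    using Moq;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    public interface ILogger { void Log(string message); }

    public class OrderProcessor
    {
        private readonly ILogger _logger;
        public OrderProcessor(ILogger logger) { _logger = logger; }

        public void Process(int orderId)
        {
            // ... the real work would happen here ...
            _logger.Log("Processed order " + orderId);
        }
    }

    [TestClass]
    public class OrderProcessorTests
    {
        [TestMethod]
        public void Process_LogsExactlyOnce()
        {
            var logger = new Mock<ILogger>();            // fake supplied by Moq
            var sut = new OrderProcessor(logger.Object); // inject the fake

            sut.Process(42);

            // Verify the interaction, not the logger's internals.
            logger.Verify(l => l.Log(It.IsAny<string>()), Times.Once());
        }
    }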

Regarding server initialization: a unit test typically runs in memory, with no dependencies on the environment, so if you are doing TDD, you should probably not have to do that. In general, too much (any?) initialization code in a unit test is a bad sign.

This suggests that you are looking more for acceptance tests / BDD-style tests. I recommend this recent article in MSDN Magazine on Behavior-Driven Development with SpecFlow and WatiN; it explains how you can develop in a test-first manner by developing together high-level tests which verify that the application is doing what the user wants (acceptance tests, where you would run your actual server and app), and that it's doing it by having small pieces of code that do what the developer intends (unit tests).

Hope this helps, and happy testing!

I am used to doing some refactorings by introducing compilation errors. For example, if I want to remove a field from my class and make it a parameter to some methods, I usually remove the field first, which causes a compilation error for the class. Then I introduce the parameter to my methods, which breaks callers. And so on. This usually gives me a sense of security. I haven't actually read any books (yet) about refactoring, but I used to think this is a relatively safe way of doing it. But I wonder, is it really safe? Or is it a bad way of doing things?

When you are ready to read books on the subject, I recommend Michael Feathers's "Working Effectively with Legacy Code". (Added by non-author: also Fowler's classic book "Refactoring"; the Refactoring web site may be useful as well.)

He talks about identifying the characteristics of the code you are working on before you make a change, and doing what he calls scratch refactoring: that is, refactoring to discover the characteristics of the code and then throwing the results away.

What you are doing is using the compiler as an auto-test. It will test that your code compiles, but not whether the behaviour has changed due to your refactoring, or whether there were any side effects.

Consider this:

class myClass {
     void megaMethod() 
     {
         int a, b, c, x, y, z;
         //lots of lines of code
         z = mySideEffect(x) + y;
         //lots more lines of code 
         a = b + c;
     }
}

You could refactor out the addition:

class myClass {
     void megaMethod() 
     {
         int a, b, c, x, y, z;
         //lots of lines of code
         z = addition(x, y);
         //lots more lines of code
         a = addition(b, c);  
     }

     int addition(int a, int b)
     {
          return mySideEffect(a) + b;
     }
}

and this would compile, but the second addition would be wrong, because it now invokes the side-effecting method on b. Further tests would be needed beyond compilation alone.

This sounds similar to an absolutely standard method used in Test-Driven Development: write the test referring to a nonexistent class, so that the first step to make the test pass is to add the class, then the methods, and so on. See Beck's book for exhaustive Java examples.

Your method for refactoring sounds dangerous because you don't have any tests for safety (or at least you don't mention that you have any). You might create compiling code that doesn't actually do what you want, or breaks other parts of your application.

I'd suggest you add a simple rule to your practice: make non-compiling changes only in unit test code. That way you are sure to have at least a local test for each modification, and you are recording the intent of the modification in your test before making it.

By the way, Eclipse makes this "fail, stub, write" method absurdly easy in Java: each nonexistent object is marked for you, and Ctrl-1 plus a menu choice tells Eclipse to write a (compilable) stub for you! I'd be interested to know if other languages and IDEs provide similar support.

Just as the title says. What ways do you use to test your own code so that it isn't a boring task? Do you use any tools? For my projects, I use a spreadsheet to list all the possible routines, i.e. from the basic CRUD through all the weird routines. I come up with about 10 routines.

I find about 2-3 bugs this way, sometimes major ones. And if I don't do this, the client reports another bug.

So tell me, what techniques do you use to test your own code in such a way that it doesn't bore you?

Edit:

I forgot to mention that I am working mainly on web-based apps, and my language is PHP with the CakePHP framework.

I used to think the same as you. When I first started programming, we had to work out what the output would be on paper and then do visual comparisons of the actual and expected output. Talk about tedious. A couple of years ago, I discovered Test Driven Development and xUnit and now I love tests.

Basically, in TDD, you have a framework designed to allow you to write tests and run them very easily. So, writing tests just becomes writing code. The process is:

  1. Just write enough to allow you to write a test. E.g. you're adding a method to a class, so you just write the method signature and any return statement needed to get it to compile.
  2. Then you write your first test and run the framework to see that it fails.
  3. Then you add code to/refactor your method to get the test to pass.
  4. Then you add the next test and see that it fails.
  5. Repeat 3 and 4 until you can't think of any more tests.
  6. You've finished.

That's one of the nice things about TDD: once your code passes every test you can think of, you know you're finished - without TDD, sometimes it's difficult to know when to stop. Where do your tests come from? They come from the spec. TDD often helps you realise that the spec is full of holes, as you think of test cases for things that weren't in it. You can get these questions answered before you start writing the code to deal with them.

Another nice thing is that when you discover a bug later, you can start reworking your code safe in the knowledge that all of the existing tests will prove your code still works for all the known cases, whilst the new tests you've written to recreate the bug will show you when you've fixed it.

You can add unit tests to existing code - just add them for the bits you're changing. As you keep coming back to it, the tests will get more and more coverage.

xUnit is the generic name for a bunch of frameworks that support different languages: JUnit for Java, NUnit for .NET, etc. There's probably already one for whatever language you use. You can even write your own framework. Read this book - it's excellent.

There are enough books on how to do unit testing.

Do you know any good books (or other good resources) on integration testing?

What I am particularly interested in is

  • Define scope (unit testing < integration testing < automated functional testing)
  • What is a good and bad integration test
  • Data access
  • Service layers
  • Configuration
  • Spring or other DI containers for integration testing
  • ...

?

I like "Test Driven Development: By Example" (Amazon link).

Kent Beck is a good writer and obviously knows what he is talking about.

After reading this post I kinda felt I was in the same position as the guy who asked the question. I love technology, and coming up with new ideas to solve real-world problems just gets my neurons horny, but the other part of the equation - actually getting things done (fast) - is normally a pain in the ass to accomplish, especially when I'm doing this for myself.

Sometimes I kinda feel plain bored with code; other times I spend more time moving the cursor in the text editor and staring at my code, trying to come up with a solution that is better than the one I already have. I heard this is a disease called perfectionism.

I've read in that same post (and also a few times here on SO) that TDD actually helps you stop coding like a girl, but I've never given TDD a chance - either because I'm too lazy to learn / set it up, or because I don't think I need it, since I can do all the tests I need inside my head.

  • Do you also believe that TDD actually helps to GTD?
  • What do I need to know about TDD?
  • What about alternatives to TDD?
  • What would be the best methodology to organize / develop a TDD web app?
  • What libraries should I use (if any) to make my life easier?

PS: I'm primarily (but not exclusively) working with PHP here.

Personally, I think TDD is at best overkill and at worst an impediment to the creative process of programming. Time that is spent laboriously writing unit tests for each as-yet-unwritten method/class would be better spent solving the original problem. That being said, I am a big fan of unit tests and believe wholeheartedly in them. If I have a particularly complex or troublesome piece of code, I'm more than happy to write 20 unit tests for a single method, but generally AFTER I have solved the problem. TDD, just like every other programming paradigm, is no silver bullet. If it suits you, use it; if not, keep looking.

But take my opinion with a grain of salt. A much more interesting one comes from Kent Beck in "How deep are your unit tests?".

I have read other related posts, but am still not quite sure how, or if it is possible to dynamically cast (interface to implementation) in Java. I am under the impression that I must use reflection to do so.

The particular project I am working on requires a usage of many instanceof checks, and it is — in my opinion — getting a bit out of hand, so would appreciate any ideas/solutions.

Below is a mini example I wrote up just to clarify exactly what I'm wanting to do. Let me know if you need more information:

Interface:

public interface IRobot {
    String getName();
}

Implementations:

public class RoboCop implements IRobot {
    String name = this.getClass()+this.getClass().getName();
    public RoboCop() {}
    public String getName() { return name; }
}

public class T1000 implements IRobot {
    String name = this.getClass()+this.getClass().getName();
    public T1000() {}
    public String getName() { return name; }
}

The class that handles the implementations:

import java.util.LinkedList;
import java.util.List;
public class RobotFactory {

    public static void main(String[] args) {
        new RobotFactory();
    }

    public RobotFactory() {
        List<IRobot> robots = new LinkedList<IRobot>();
        robots.add( new RoboCop() );
        robots.add( new T1000() );
        System.out.println("Test 1 - Do not cast, and call deploy(robot)");
        for(IRobot robot : robots) {
            deploy(robot);  // deploy(Object robot) will be called for each..
        }
        System.out.println("Test 2 - use instanceof");
        for(IRobot robot : robots) { // use instanceof, works but can get messy
            if(robot instanceof RoboCop) {
                deploy((RoboCop)robot);
            }
            if(robot instanceof T1000) {
                deploy((T1000)robot);
            }
        }
        System.out.println("Test 3 - dynamically cast using reflection?");
        for(IRobot robot : robots) {
            //deploy((<Dynamic cast based on robot's type>)robot);  // <-- How to do this?
        }
    }

    public void deploy(RoboCop robot) {
        System.out.println("A RoboCop has been received... preparing for deployment.");
        // preparing for deployment
    }

    public void deploy(T1000 robot) {
        System.out.println("A T1000 has been received... preparing for deployment.");
        // preparing for deployment
    }

    public void deploy(Object robot) {
        System.out.println("An unknown robot has been received... Deactivating Robot");
        // deactivate
    }
}

Output:

[RoboCop@42e816, T1000@9304b1]
Test 1 - Do not cast, and call deploy(robot)
An unknown robot has been received... Deactivating Robot
An unknown robot has been received... Deactivating Robot
Test 2 - use instanceof
A RoboCop has been received... preparing for deployment.
A T1000 has been received... preparing for deployment.
Test 3 - dynamically cast using reflection?

So, to sum up my question: how can I completely avoid having to use instanceof in this case? Thanks.

Kent Beck says in his book Test Driven Development: any time you're using run-time type-checking, polymorphism should help. Put the deploy() method in your interface and call it; you'll be able to treat all of your robots transparently.

Forget Reflection, you're just over thinking it. Remember your basic Object Oriented principles.
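Sketched out (in C# here; the Java version is a direct transliteration), the advice amounts to moving deploy() into the interface so each robot carries its own behaviour:

    using System;
    using System.Collections.Generic;

    public interface IRobot
    {
        string GetName();
        void Deploy(); // each robot knows how to deploy itself
    }

    public class RoboCop : IRobot
    {
        public string GetName() { return "RoboCop"; }
        public void Deploy()
        {
            Console.WriteLine("A RoboCop has been received... preparing for deployment.");
        }
    }

    public class T1000 : IRobot
    {
        public string GetName() { return "T1000"; }
        public void Deploy()
        {
            Console.WriteLine("A T1000 has been received... preparing for deployment.");
        }
    }

    public static class RobotFactory
    {
        public static void Main()
        {
            var robots = new List<IRobot> { new RoboCop(), new T1000() };

            // No instanceof checks, no casts: dynamic dispatch picks
            // the right Deploy for each element.
            foreach (IRobot robot in robots)
            {
                robot.Deploy();
            }
        }
    }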

I'm at a point in my freelance career where I've developed several web applications for small to medium sized businesses that support things such as project management, booking/reservations, and email management.

I like the work, but find that eventually my applications get to a point where the overhead for maintenance is very high. I look back at code I wrote 6 months ago and find I have to spend a while just relearning how I originally coded it before I can make a fix or add features. I do try to use frameworks (I've used Zend Framework before, and am considering Django for my next project).

What techniques or strategies do you use to plan out an application that is capable of handling a lot of users without breaking and still keeping the code clean enough to maintain easily? If anyone has any books or articles they could recommend, that would be greatly appreciated as well.

I'd honestly recommend looking at Martin Fowler's Patterns of Enterprise Application Architecture. It discusses a lot of ways to make your application more organized and maintainable. In addition, I would recommend using unit testing to give you a better comprehension of your code. Kent Beck's book on Test Driven Development is a great resource for learning how to address change in your code through unit tests.

I need recommendations on a good Unit Testing book for use with ASP.NET MVC. Based on books you have actually read and use (your bible), what do you recommend?

I like Kent Beck's "Test Driven Development: By Example" (amazon link) as an introduction to TDD, it's not specific to C# nor ASP.NET MVC.

It seems that all of the newer ASP.NET MVC books have at least one chapter on unit testing.

I have recently completed Steve Sanderson's book Pro ASP.NET MVC Framework and I thought the author placed a strong emphasis on unit testing. The book doesn't have a dedicated chapter on unit testing, but just about every chapter has relevant sections or call-outs/sidebars that deal with testing routing (inbound and outbound), controllers, repositories, model binders, etc. If I remember he uses the nUnit and Moq libraries in great detail. You can preview parts of his book on Google Books : Pro ASP.NET MVC Framework or order it from Apress (their eBooks are reasonably priced but password protected with your email address which is sort of a pain for me) or from Amazon.

I haven't seen any plans for a book solely on ASP.NET MVC unit testing, so you're going to probably have to stick to blogs or with whatever content you can find in the upcoming ASP.NET MVC books (like I said, it seems that all of them cover unit testing to varying degrees).

Some of the books that I know of:

Wrox: Beginning ASP.NET MVC 1.0 --- Has a sample chapter on testing for download here.

Manning: ASP.NET MVC in Action --- Doesn't have an explicit chapter on testing, but if you download the CodeCampServer reference application you will find a ton of unit, integration, and regression tests.

Wrox: Professional ASP.NET MVC 1.0 --- Has unit tests in the NerdDinner sample application and a dedicated chapter on testing. Testing Guru Roy Osherove (author of The Art of Unit Testing) reviews the NerdDinner tests here.

Packt: ASP.NET MVC 1.0 Quickly --- Has a chapter on unit testing and the author has a pretty good blog that talks about various ASP.NET MVC issues including testing.

Sams: ASP.NET MVC Framework Unleashed --- Browsing the Table of Contents for the book reveals a fair amount of content dedicated to testing (mocking, TDD, etc). You can check out the author's blog for sample content from the upcoming book and other ASP.NET MVC and TDD related posts.

I was a C++ developer (mostly ATL/COM stuff) until, as many of us, I switched to C# in 2001. I didn't do much C++ programming since then.

Do you have any tips on how to revive my C++ skills? What has changed in C++ in the last few years? Are there good books, articles or blogs covering the language? The problem is that most material I could find either targets people who are new to the language or those with a lot of experience.

Which C++ libraries are popular these days? I guess I will need to read on the STL because I didn't use it much. What else? Boost? ATL? WTL?

Pick up one of the C++ unit test frameworks out there (I suggest Google C++ Testing Framework, a.k.a. gtest). Pick a small project that you can start from scratch and try some TDD. TDD will encourage you to take small steps and to reflect on your code. Also, as you build your suite of unit tests, it gives you a base from which you can experiment with different techniques.

I'm teaching/helping a student to program.

I remember the following process always helped me when I started; it seems pretty intuitive, and I wonder if someone else has had a similar approach.

  1. Read the problem and understand it ( of course ) .
  2. Identify possible "functions" and variables.
  3. Write how would I do it step by step ( algorithm )
  4. Translate it into code, if there is something you cannot do, create a function that does it for you and keep moving.

With time and practice I seem to have forgotten how hard it was to get from a problem description to a coded solution, but by applying this method I managed to learn how to program.

So for a project description like:

A system has to calculate the price of an Item based on the following rules ( a description of the rules... client, discounts, availability etc.. etc.etc. )

My first step is to understand what the problem is.

Then identify the item, the rules, the variables, etc.

Pseudocode, something like:

function getPrice( itemPrice, quantity , clientAge, hourOfDay ) : int 
   if( hourOfDay > 18 ) then
      discount = 5%

   if( quantity > 10 ) then
      discount = 5%

   if( clientAge > 60 or < 18 ) then
      discount = 5%


        return item_price - discounts...
end

And then pass it to the programming language..

public class Problem1 {
    public int getPrice( int itemPrice, int quantity, int clientAge, int hourOfDay ) {
        int discount = 0;
        if( hourOfDay > 18 ) {
             // uh uh.. U don't know how to calculate percentage... 
             // create a function and move on.
            discount += percentOf( 5, itemPrice );
            .
            .
            .
            you get the idea..

        }
    }
    public int percentOf( int percent, int i ) {
             // .... 
    }
}
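If it helps to see the exercise taken to completion, here is one way the finished routine might look - sketched in C#, with the same made-up discount rules as the pseudocode above:

    // A completed version of the exercise; the rules are the
    // hypothetical ones from the problem description.
    public class PriceProblem
    {
        public decimal GetPrice(decimal itemPrice, int quantity, int clientAge, int hourOfDay)
        {
            decimal discount = 0m;

            if (hourOfDay > 18) discount += PercentOf(5, itemPrice);
            if (quantity > 10) discount += PercentOf(5, itemPrice);
            if (clientAge > 60 || clientAge < 18) discount += PercentOf(5, itemPrice);

            return itemPrice - discount;
        }

        // The helper we "didn't know how to write" and factored out
        // so we could keep moving.
        private static decimal PercentOf(decimal percent, decimal amount)
        {
            return amount * percent / 100m;
        }
    }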

Did you take a similar approach? Did someone teach you a similar approach, or did you discover it yourself (as I did :( )?

A good book for beginners looking for a process: Test Driven Development: By Example

I read the latest coding horror post, and one of the comments touched a nerve for me:

This is the type of situation that test driven design/refactoring are supposed to fix. If (big if) you have tests for the interfaces, rewriting the implementation is risk-free, because you will know whether you caught everything.

Now in theory I like the idea of test-driven development, but all the times I've tried to make it work, it hasn't gone particularly well; I get out of the habit, and the next thing I know, all the tests I had originally written not only don't pass, but are no longer a reflection of the design of the system.

It's all well and good if you've been handed a perfect design from on high, straight from the start (which in my experience never actually happens), but what if halfway through the production of a system you notice that there's a critical flaw in the design? Then it's no longer a simple matter of diving in and fixing "the bug", but you also have to rewrite all the tests. A fundamental assumption was wrong, and now you have to change it. Now test driven development is no longer a handy thing, but it just means that there's twice as much work to do everything.

I've tried to ask this question before, both of peers, and online, but I've never heard a very satisfactory answer. ... Oh wait.. what was the question?

How do you combine test driven development with a design that has to change to reflect a growing understanding of the problem space? How do you make the TDD practice work for you instead of against you?

Update: I still don't think I fully understand it all, so I can't really make a decision about which answer to accept. Most of my leaps in understanding have happened in the comments sections, not in the answers. Here's a collection of my favorites so far:

"Anyone who uses terms like "risk-free" in software development is indeed full of shit. But don't write off TDD just because some of its proponents are hyper-susceptible to hype. I find it helps me clarify my thinking before writing a chunk of code, helps me to reproduce bugs and fix them, and makes me more confident about refactoring things when they start to look ugly"

-Kristopher Johnson

"In that case, you rewrite the tests for just the portions of the interface that have changed, and consider yourself lucky to have good test coverage elsewhere that will tell you what other objects depend on it."

-rcoder

"In TDD, the reason to write the tests is to do design. The reason to make the tests automated is so that you can reuse them as the design and code evolve. When a test breaks, it means you've somehow violated an earlier design decision. Maybe that's a decision you want to change, but it's good to get that feedback as soon as possible."

-Kristopher Johnson

[about testing interfaces] "A test would insert some elements, check that the size corresponds to the number of elements inserted, check that contains() returns true for them but not for things that weren't inserted, checks that remove() works, etc. All of these tests would be identical for all implementations, and of course you would run the same code for each implementation and not copy it. So when the interface changes, you'd only have to adjust the test code once, not once for each implementation."

–Michael Borgwardt

I think you have some misconceptions about TDD. For a good explanation and example of what it is and how to use it, I recommend reading Kent Beck's Test-Driven Development: By Example.

Here are a few further comments that may help you understand what TDD is and why some people swear by it:

"How do you combine test driven development with a design that has to change to reflect a growing understanding of the problem space?"

  • TDD is a technique for exploring a problem space and creating and evolving a design that meets your needs. TDD is not something you do in addition to doing design; it is doing design.

"How do you make the TDD practice work for you instead of against you?"

  • TDD is not "twice as much work" as not doing TDD. Yes, you'll write a lot of tests, but that doesn't really take much time, and the effort isn't wasted. You have to test your code somehow, right? Running automated tests is a lot quicker than manually testing whenever you change something.

  • A lot of TDD tutorials present highly detailed tests of every method of every class. In real life, people don't do this. It is silly to write a test for every setter, every getter, and so on. The Beck book does a good job of showing how to use TDD to quickly design and implement something, slowing down to "baby steps" only when things get tricky. See How Deep Are Your Unit Tests for more on this point.

  • TDD is not about regression testing. TDD is about thinking before you write code. But having regression tests is a nice side benefit. They don't guarantee that code will never break, but they help a lot.

  • When you make changes that cause tests to break, that's not a bad thing; it's valuable feedback. Designs do change, and your tests aren't written in stone. If your design has changed so much that some tests are no longer valid, then just throw them away. Write the new tests you need to be confident about the new design.

I want to learn how to build “robust” software that is designed to test itself. In other words, how do I implement automated tests in my software (using Java, Groovy or C++)?

So I want to know where to learn this (books or websites) and which tools and libraries I will need for this?

I found The Art of Unit Testing by Roy Osherove to be very helpful in understanding the basics of unit testing, integration testing, TDD and so on. It's a bit tailored to .NET languages, but it also provides very good information on the ideas behind automated testing.

Look at the xUnit testing frameworks (cppUnit for C++, JUnit for Java) and check out the wonderful book xUnit Test Patterns: Refactoring Test Code.

And if you really want to get into it, check out test-driven development. A good introduction is Uncle Bob's The Three Laws of TDD and the bowling game kata (see also bowling game episode). A great book on the subject is Test Driven Development: By Example.

let me first explain what I'm aiming for with this question:

What kind of dev am I? I'm the guy who thinks about the problem, writes the code and then tests it by myself. I'm developing web apps mainly, but there are also projects which are UI-based (RCP/Swing apps). I run my app and click here, test this... You probably know this "style".

Well, I'm a guy who tries to improve himself with every line/project, and I want my code/apps to be tested programmatically. I write in code; I want tests in code.

So I started to use unit tests (JUnit 4) for some of my classes/functions. This works for backend stuff where no UI is involved - tbh, I find it hard to write most of the tests. If we're building a webapp, there are probably interactions with the session or something. I guess you get the point.

What I'm looking for are some resources, preferably with examples. Any good book advice would be welcome too. Don't get me wrong - I don't want only stuff for logic testing; I'm interested in ways to test my UI.

Maybe this is an important part too: I develop in Java (85% of the time) and PHP/Python (the rest).

Regards

Test Driven Development: By Example, by Kent Beck, is the seminal work and a practical guide.

Others I'd suggest are:

The RSpec Book represents the next generation - Behavior Driven Development. If you're interested in TDD, you'll be interested in BDD as well.

But more important than reading is to practice. Try to do it yourself, and see if you can find or create a local group that does Code Katas using TDD.

I am a trainee in the development sector. My boss says that I should be an agile programmer.

I went through the net and found some interesting things about agile programming. Being a newbie, how should I start with agile?
What should be my first step in agile programming?

At present I am pair programming. But it's not exactly pair programming, as I am just watching what my co-developer is doing. I also wish to be an agile developer.
Can you suggest a stepwise way for me?

I wish to develop myself and also my programming skills.

Doing this all by yourself is going to be very difficult. If you take the principles on the agilemanifesto site you'll see that at least 6 of the items deal with groups of people and teams. You're going to need some buy-in from your co-workers and boss.

I'd start with your pair partner. Ask for a turn occasionally. You could try something like, "Let's see if I understand this; can I try to add the next feature point?"

That being said, there's a good Gandhi quote: "Be the change you want to see in the world." There are a lot of actions you can take by yourself to raise the level of your game: writing tests, getting a continuous build working, setting achievable goals that have some basis in past experience, refactoring.

There are also tons of books that will be very helpful to someone getting started. There's probably someone at your site who would like to mentor you. If you show you're interested in continuing to learn someone would likely be able to help you. Talk to your boss too. If he wants something from you he should be able to at least point you in the direction of someone who could help.

I am on a team where I am trying to convince my teammates to adopt TDD (as I have seen it work in my previous team, and the setup is similar). Also, my personal belief is that, at least in the beginning, it really helps if both TDD and Pair Programming are done in conjunction. That way, two inexperienced (in TDD) developers can help each other, discuss what kind of tests to write and make good headway.

My manager, on the other hand, feels that if we introduce two new development practices in the team at once, there is a good chance that both might fail. So, he wants to be a little more conservative and introduce any one.

How do I convince him that both of these are complementary and not orthogonal? Or am I wrong?

I really want to get into TDD development but I have no point of reference where to start.

I think that by looking at code and seeing how they write tests and make classes testable, it'll be easier for me to digest and start using it myself.

Is anyone aware of any sample or small open source C# applications that include unit tests?

I highly recommend "Test Driven Development: By Example (Addison-Wesley Signature Series)" by Kent Beck.

Far, far better than any other resources I've found on the net or elsewhere. Well worth the $40 - $50.

I want to start working with TDD but I don't really know where to start. We code with .NET (C#/ASP.NET).

There's a good book called Test Driven Development in Microsoft .NET that you might check out. It is essentially the same as the classic Test Driven Development by Example, but with the Microsoft platform in mind.

I am looking for an easily digestible book to present to my boss/team.


Background info: More and more of our meetings at work involve my boss/team pondering how to implement more "best practices" around here. ("Here" = a very small application development shop. 4 developers)

The following things are items that my whole team agrees that we need:

  • Nightly builds
  • Decomposing "bugs" in our bug-tracker into smaller, more-specific items
  • Automated testing

The problem we face is how to get started.

I believe that if my shop could simply choose a clear and specific plan or set of rules, then everything else would fall into place. Right now we are stuck in discussions of fuzzy, feel-good ideas and nice-sounding buzzwords.

Please recommend to me your favorite book (or online resource) that contains clear, discrete, sequential steps for implementing a management scheme for guiding a TDD or Agile team/shop.

I realize that there are other paradigms besides TDD and Agile that would also address these concerns, but my own self-interests and biases point toward TDD and Agile so I would love to harness my team's desire for change and "nudge" it in that direction. Or feel free to slap me down if you vehemently disagree with my sentiments! I will take no offense. :)

As others have stated, I think these questions are answered best when respondents list only one book recommendation per answer.


Thank you all.

My favorite is Planning Extreme Programming.

EDIT: it provides a complete replacement for traditional project management, geared towards an XP/Agile team.

The danger is adopting modern development methods and then strangling them with archaic project-management and administration practices!

For your needs I recommend Test Driven Development: By Example (Kent Beck). It is clearly-written, more practical than theoretical, and prescribes time-tested recipes to adopt an agile, test-driven approach.

I am a student and I have developed a web application based on JSP. Now my professor has suggested that I should do some tests, like unit tests etc., for my web application. Can anybody suggest what other tests I can use to demonstrate the performance of my application? And also any good resource from where I can study how to do unit testing, as I have never done any testing before.

Thanks!

Selenium is a popular framework for client-side tests of web applications (i.e. automating client input on a web page). The site also has a lot of introductory material.
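For instance, a browser-driving test with Selenium's Java WebDriver API might look roughly like this (the URL, element IDs and expected text are invented placeholders for your own app):

import org.junit.Test;
import static org.junit.Assert.assertEquals;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class LoginPageTest {
    @Test
    public void validLoginShowsWelcomeMessage() {
        WebDriver driver = new FirefoxDriver();   // drives a real browser
        try {
            driver.get("http://localhost:8080/myapp/login.jsp");   // placeholder URL
            driver.findElement(By.id("username")).sendKeys("alice");
            driver.findElement(By.id("password")).sendKeys("secret");
            driver.findElement(By.id("submit")).click();
            assertEquals("Welcome, alice!",
                         driver.findElement(By.id("message")).getText());
        } finally {
            driver.quit();   // always close the browser, even on failure
        }
    }
}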

For testing the server-side stuff, good old JUnit will suffice - it's integrated into all major IDEs.

You should look into Kent Beck's Test-Driven Development. Although test-driven development is more than regular unit testing, this book will enlighten you (I bet) and you will write way better unit tests, too.

I have a HashMap whose data I would like to view in a JTable; however, I am having trouble getting the HashMap's number of columns and rows and the data to be displayed. I have a HashMap which takes an accountID as the key and a Student object as the value, where each student has data like name, ID, age, etc. However, referring to the JTable docs, it says I would need ints for the rows and columns and a multidimensional array of type Object. How can I do it? Can I change my HashMap into a multidimensional array?

--Edit: I have edited my question so it could be clearer. I am fairly new to Java and do not really get what some of you have posted, especially since the work I am doing is quite related to OO, and grasping OO concepts is my biggest challenge.

I have a DataStorage class; the registered user is added to the HashMap with his username (getUser()) as the key.

import java.util.*;

public class DataStorage 
{
    HashMap<String, Student> students = new HashMap<String, Student>();  
    HashMap<String, Staff> staffMembers = new HashMap<String, Staff>();  

    // Default constructor
    public DataStorage()
    {
    }

    public void addStaffMember(Staff aAcc) 
    {
        staffMembers.put(aAcc.getUser(), aAcc);
    }

    public void addStudentMember(Student aAcc)
    {
        students.put(aAcc.getUser(), aAcc);
    }

    public Staff getStaffMember(String user)
    {
        return staffMembers.get(user);
    }

    public Student getStudent(String user)
    {
        return students.get(user);
    }

    public int getStudentRows()
    {
        return students.size();
    }
}

/**** This is a student class which extends Account***/

public class Student extends Account {

    private String studentNRIC;
    private String diploma;
    private String gender;
    private double level;
    private int credits;
    private int age;
    private boolean partTime;
    private boolean havePc;
    private boolean haveChild;

    public Student(String n, String nr, String id, String dep, String user, String pass)
    {
        super(n, dep, user, pass, id);
        studentNRIC = nr;
    }

    public void setPartTime(boolean state)
    {
        partTime = state;
    }

    public boolean getPartTime()
    {
        return partTime;
    }

    public void setHavePc(boolean state)
    {
        havePc = state;
    }

    public boolean getHavePc()
    {
        return havePc;
    }

    public void setHaveChild(boolean state)
    {
        haveChild = state;
    }

    public boolean getHaveChild()
    {
        return haveChild;
    }
    public void setDiploma(String dip)
    {
        diploma = dip;
    }

    public String getDiploma()
    {
        return diploma;
    }

    public void setCredits(String cre)
    {
        credits = Integer.parseInt(cre);
    }

    public int getCredits()
    {
        return credits;
    }

    public void setGender(String g)
    {
        gender = g;
    }

    public String getGender()
    {
        return gender;
    }

    public void setAge(String a)
    {
        age = Integer.parseInt(a);
    }

    public int getAge()
    {
        return age;
    }
    public void setLevel(String lvl)
    {
        level = Double.parseDouble(lvl);
    }

    public double getLevel()
    {
        return level;
    }
    public void setStudentNRIC(String nr)
    {
        studentNRIC = nr;
    }

    public String getStudentNRIC()
    {
        return studentNRIC;
    }

}

/**** This is the Account superclass ****/

public class Account {

    private String name;
    private String department;
    private String username;
    private String password;
    private String accountID;
    public Account()
    {
    }   
    public Account(String nm,String dep,String user,String pass, String accID) 
    {
        name = nm;
        department = dep;
        username = user;
        password = pass;
        accountID = accID;

    }

    public void setName(String nm)
    {
        name = nm;
    }

    public String getName()
    {
        return name;
    }

    public void setDep(String d)
    {
        department = d;
    }

    public String getDep()
    {
        return department;
    }

    public void setUser(String u)
    {
        username = u;
    }
    public String getUser()
    {
        return username;
    }

    public void setPass(String p)
    {
        password = p;
    }

    public String getPass()
    {
        return password;
    }

    public void setAccID(String a)
    {
        accountID = a;
    }

    public String getAccID()
    {
        return accountID;
    }
}

Your DataStorage is like the StudentRegistration used in the sample code.

    // TIP: It can be handy to keep the students in some order in the Map
    //      (therefore using a sorted map).
    private SortedMap<String, Student> students = new TreeMap<String, Student>();

    // QUESTION: Why not use the argument name 'student'?
    public void addStudentMember(Student aAcc)
    {
        students.put(aAcc.getUser(), aAcc);
    }

    // Updated implementation
    public void addStudent(Student student)
    {
        students.put(student.getAccID(), student);
    }

    // QUESTION: Would the method name 'getNumberOfStudents' not be better?
    public int getStudentRows()

For me it is a little unclear why Student extends Account. The account identification - is that a unique ID throughout the whole system? Do staff (users) and students (users) all have it as their unique identification? Where, and by whom, are the IDs created? If not by the system itself, it can never be guaranteed that they are entered correctly into your system. Checking for uniqueness within your system helps, but who says someone else didn't (by accident) use someone else's unique ID? (How are the student and staff accounts created?) If these IDs are indeed unique, why not use them for placing the students into a SortedMap? And if the sorting is not important, why not just use a List of students?

Is the name parameter unique (by which you place the student in the Map)?
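To come back to the original JTable question: JTable's simplest constructor takes the row data as an Object[][] plus an array of column names, so one straightforward approach is to flatten the map's values into such an array. A rough sketch using a few of the Student getters shown above (for anything beyond a toy app, a custom AbstractTableModel would be the better route):

import java.util.Map;
import javax.swing.JTable;

public class StudentTableBuilder {
    public static JTable buildTable(Map<String, Student> students) {
        String[] columnNames = { "Username", "Name", "Age", "Gender" };
        Object[][] rowData = new Object[students.size()][columnNames.length];

        // One row per student, one column per chosen property.
        int row = 0;
        for (Student s : students.values()) {
            rowData[row][0] = s.getUser();
            rowData[row][1] = s.getName();
            rowData[row][2] = s.getAge();
            rowData[row][3] = s.getGender();
            row++;
        }
        return new JTable(rowData, columnNames);
    }
}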

Programming is more than learning a programming language. Once you understand the OO language Java, it is good to read some more general programming books. In your specific case I would say start with Domain Driven Design, and then continue with books like Test Driven Development, Refactoring to Patterns and Design Patterns.

I have this method signature: List<ITMData> Parse(string[] lines)

ITMData has 35 properties.

How would you effectively test such a parser?

Questions:

  • Should I load the whole file (May I use System.IO)?
  • Should I put a line from the file into a string constant?
  • Should I test one or more lines
  • Should I test each property of ITMData or should I test the whole object?
  • What about the naming of my test?

EDIT

I changed the method signature to ITMData Parse(string line).

Test Code:

[Subject(typeof(ITMFileParser))]
public class When_parsing_from_index_59_to_79
{
    private const string Line = ".........";
    private static ITMFileParser _parser;
    private static ITMData _data;

    private Establish context = () => { _parser = new ITMFileParser(); };

    private Because of = () => { _data = _parser.Parse(Line); };

    private It should_get_fldName = () => _data.FldName.ShouldBeEqualIgnoringCase("HUMMELDUMM");
}

EDIT 2

I am still not sure if I should test only one property per class. In my opinion this allows me to give more information in the specification - namely, that when I parse a single line from index 59 to index 79 I get fldName. If I test all properties within one class I lose this information. Am I overspecifying my tests?

My Tests now looks like this:

[Subject(typeof(ITMFileParser))]
public class When_parsing_single_line_from_ITM_file
{
    const string Line = "";

    static ITMFileParser _parser;
    static ITMData _data;

    Establish context = () => { _parser = new ITMFileParser(); };

    private Because of = () => { _data = _parser.Parse(Line); };

    It should_get_fld??? = () => _data.Fld???.ShouldEqual(???);
    It should_get_fld??? = () => _data.Fld???.ShouldEqual(???);
    It should_get_fld??? = () => _data.Fld???.ShouldEqual(???);
    It should_get_fld??? = () => _data.Fld???.ShouldEqual(???);
    It should_get_fld??? = () => _data.Fld???.ShouldEqual(???);
    It should_get_fld??? = () => _data.Fld???.ShouldEqual(???);
    It should_get_fld??? = () => _data.Fld???.ShouldEqual(???);
    ...

}

Regarding your newer questions:

Should I test each property of ITMData or should I test the whole object?

If you want to be on the safe side, you should probably have at least one test which checks that each property was matched.

What about the naming of my test?

There are quite a few discussions on this topic, such as this one. The general rule is that you would have multiple methods in your unit test class, each aimed at testing something specific. In your case, it might be things like:

public void Check_All_Properties_Parsed_Correctly(){.....}

public void Exception_Thrown_If_Lines_Is_Null(){.....}

public void Exception_Thrown_If_Lines_Is_Wrong_Length(){.....}

So, in other words, testing for the exact behaviour that you consider "correct" for the parser. Once this is done, you will feel much more at ease when making changes to the parser code, because you will have a comprehensive test suite to check that you didn't break anything. Remember to actually test often, and to keep your tests updated when you make changes! There's a fairly good guide about unit testing and Test Driven Development on MSDN.

In general, I think you can find answers to most of your questions by googling a bit. There are also several excellent books on Test Driven Development, which will drive home not only the how of TDD, but the why. If you are relatively programming language agnostic, I would recommend Kent Beck's Test Driven Development By Example, otherwise something like Test-Driven Development in Microsoft .NET. These should get you on the right track very quickly.

EDIT:

Am I overspecifying my tests?

In my opinion, yes. Specifically, I don't agree with your next line:

If I test all properties within one class I lose this information.

In what way do you lose information exactly? Let's say there are 2 ways to do this test, other than having a new class per test:

  1. Have different methods for each property. Your test methods could be called CheckPropertyX, CheckPropertyY, etc. When you run your tests, you will see exactly which fields passed and which fields failed. This clearly satisfies your requirements, although I would say it's still overkill. I would go with option 2:
  2. Have a few different methods, each testing one specific aspect. This is what I originally recommended, and I think what you are referring to. When one of the tests fails, you will only get information about the first thing that failed, per method, but if you coded your Assert nicely, you will know exactly which property is incorrect. Consider the following code:

Assert.AreEqual("test1", myObject.PropertyX, "Property X was incorrectly parsed"); Assert.AreEqual("test2", myObject.PropertyY, "Property Y was incorrectly parsed");

When one of those fails, you will know which line failed. When you have fixed the relevant error, and re-run your tests, you will see if any other properties have failed. This is generally the approach that most people take, because creating a class or even method per property results in too much code, and too much work to keep up to date.

I've been programming for years in plenty of languages and like to think I'm generally pretty good at it. However, I haven't ever written any automated testing: no unit tests, no TDD, no BDD, nothing.

I've tried to start writing proper test suites for my projects. I can see the theoretical value of being able to test all the code in a project automatically after making any changes. I can see how test frameworks like RSpec and Mocha should make setting up and running said tests reasonably easy, and I like the DSLs they provide for writing tests.

But I have never managed to write an actual unit test for any part of my code. Stuff I write never seems to be very testable in a way that's actually useful.

  • Functions don't seem to be very callable outside the context in which they're used. Many functions I write make HTTP-request calls, or database queries, or some other non-easily-testable call.
  • Some functions return strings of HTML. I can compare the HTML string against a hardcoded version of the same string, but that only seems to limit my ability to change that part of the code. Plus having loads of HTML in my test code is a mess.
  • I can pass mock/spy objects into a method, and make sure they get certain method calls, but as far as I can tell that's only testing implementation details of the method I'm "testing".

How would I go about getting started with proper BDD testing? (I'd preferably like to do this using Mocha and Node.js, but general advice on BDD is fine too.)

It looks like the main question you're asking is, "how do I write testable code"?

Being a fan of object oriented programming I know I'm biased, but in my experience it's far easier to test code that is written in an OO style. The reason for this is that unit tests are meant to test small, isolated components of a system, and well designed object oriented code (mostly) provides this.

I agree that functions are often linked to the context that they're in, making them difficult to test. I don't have a lot of experience with functional programming, but I know that context is often passed around in some sort of variable, making it difficult to separate concerns of functions.

With OO programming, I have successfully tested objects that wrap around HTTP requests, database queries, etc, by mocking the object that does the actual network request to return a known set of data. You then test that your wrapper object handles that data in the right way. You can also test for failures and unexpected data. Another way of doing this is by setting up a local server that you use instead of the normal endpoint, but this gives your test suite an external dependency, which should be avoided when possible.

When testing HTML, many people don't do this at all, due to the highly changeable nature of the view layer. However, there are some things that are really worth testing, but never the full string of HTML - as you've discovered, just a tiny change would mean that the whole test breaks. What are you really testing in that case, that two strings in separate parts of your code base are the same?

The best thing to do is to load the HTML string from your function/object into an HTML parser library; then you can normally use XPath or CSS selectors to check for tags with particular classes, IDs or other attributes, and check the number of elements that match certain requirements. RSpec has this built in (the have_tag() method), as do many testing libraries.
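The idea is the same in any language. As an illustration, here is roughly what such a structural assertion looks like in Java with the jsoup parser (the markup and selectors are made up):

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class RenderedHtmlTest {
    @Test
    public void userListRendersOneItemPerUser() {
        // Pretend this string came from the render function under test.
        String html = "<ul id='users'><li class='user'>alice</li>"
                    + "<li class='user'>bob</li></ul>";

        Document doc = Jsoup.parse(html);

        // Assert on structure, not on the exact string.
        assertEquals(2, doc.select("#users li.user").size());
        assertEquals("alice", doc.select("#users li.user").first().text());
    }
}

Small markup changes (whitespace, attribute order) no longer break the test; only changes to the asserted structure do.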

Something else you might like to look at is integration testing (e.g. Capybara, Selenium). This will load your web app with a JavaScript engine, so you can check for HTML elements and also JavaScript events.

On the whole mocking/stubbing thing, you generally only want to do this with objects that are dependencies of the object you're testing. Otherwise you can pretty much manipulate anything to assert as true!

As for resources on testing, I'd recommend looking at test-driven development books even if you don't plan to practice TDD. The main reason is that they throw you head first into testing. Here are a few:

  1. Kent Beck's book Test Driven Development: By Example
  2. Free ebook on TDD with PHP, Practical PHP Testing
  3. This website, Art of Unit Testing
  4. Slideshare - just search for unit testing or BDD and read as many as possible!
  5. David Chelimsky et al.: The RSpec Book

What is the best way to start Domain Driven Design?

What are the recommended resources?

EDIT:

I mean, I'd like to know how to start learning DDD (the same way as to start TDD by reading K. Beck).

There's a really big book available on domain driven design, which was brilliantly abridged and made available as a free download here:

http://www.infoq.com/minibooks/domain-driven-design-quickly

To start "doing" domain driven design, you just need to follow the points in this book. Share a language with the business, create objects that represent something that the business would recognise and so on.

It is more difficult to get in full swing on large existing applications (but not impossible) but if you are writing something new, that's a great opportunity to go at it 100%.

The definitive book on DDD is Domain-Driven Design: Tackling Complexity in the Heart of Software

However, it's a book that takes some gestation and is best backed up with practice and by observing how experienced DDD'ers think.
The site http://domaindrivendesign.org/ has some excellent resources, including example projects. I also find it useful to trawl the various open source code repositories such as GitHub, Codeplex and SourceForge for projects that use DDD.

In addition there is an excellent discussion forum where a lot of very experienced DDD'ers hang out.

Good luck on your DDD journey; it's a long road without a turn!

My personal advice is to forget the "DDD Quickly" book and go straight to the "Domain-Driven Design: Tackling Complexity in the Heart of Software" book by Eric Evans. I'd also suggest not reading the book in its original order: read the intro, then move to the Strategic Design section, and only then go back to the first part of the book. You'll discover that there's more to DDD than a collection of patterns.

However, after the book was published there has been some evolution in the DDD community (have a look at this video as a refresher). A new pattern, Domain Event, has been published, and many alternative supporting architectures have been discussed: CQRS and Event Sourcing above all.

When you've come up with an overall design / idea for how a part of a system should work, how do you decide where to start when doing TDD, or rather, how do you decide your first test to start with?

When faced with a list of tests to implement, you'll have categories:

  1. Trivial to test - you're sure you can get it done.
  2. Non-trivial, but you're reasonably confident of getting it done.
  3. Non-trivial and difficult - absolutely no clue how to get it done.

In such a scenario, pick one from the Category 2 bucket, because it will provide the most knowledge/learning per unit of invested time. Plus, it will get you rolling and boost your confidence to ramp up to the more difficult Category 3 tests.

I think I got this from TDD By Example - Kent Beck. Have a look if you have the time. Recommended.

I am writing a small app to teach myself ASP.NET MVC, and one of its features is the ability to search for books at Amazon (or other sites) and add them to a "bookshelf".

So I created an interface called IBookSearch (with a method DoSearch), and an implementation AmazonSearch that looks like this

public class AmazonSearch : IBookSearch
{
   public IEnumerable<Book> DoSearch(string searchTerms)
   {  
      var amazonResults = GetAmazonResults(searchTerms);
      XNamespace ns = "http://webservices.amazon.com/AWSECommerceService/2005-10-05";
      var books= from item in amazonResults.Elements(ns + "Items").Elements(ns + "Item")
                 select new Book
                 {
                      ASIN = GetValue(ns, item, "ASIN"),
                      Title = GetValue(ns, item, "Title"),
                      Author = GetValue(ns, item, "Author"),
                      DetailURL = GetValue(ns, item, "DetailPageURL")
                 };
      return books.ToList();
  }

  private static XElement GetAmazonResults(string searchTerms)
  { 
      const string AWSKey = "MY AWS KEY";
      string encodedTerms = HttpUtility.UrlPathEncode(searchTerms);
      string url = string.Format("<AMAZONSEARCHURL>{0}{1}",AWSKey, encodedTerms);
      return XElement.Load(url);
  }

  private static string GetValue(XNamespace ns, XElement item, string elementName)
  {
     //Get values inside an XElement
  }

}

Ideally I would like to have done this TDD-style, writing a test first and all. But I gotta confess I am having trouble getting my head around it.

I could create a FakeSearch that implements DoSearch() and returns some ad-hoc books, but I don't think that brings any value at the moment, does it? Maybe later, when I have some code that uses a list of books.

What else could I test first? The only test I can think of would be one that mocks the call to the cloud (at GetAmazonResults) and then checks that DoSearch can execute the Linq2XML select correctly and return the correct list. But it seems to me that this type of test can only be written after I have some code in place so I know what to mock.

Any advice on how you guys and girls would go about doing this test-first style?

It seems that your main issue here is knowing when to write mock code. I see your point: if you haven't written the code yet, how can you mock it?

I think the answer is that you want to start your TDD with very, very simple tests, as Kent Beck does in Test Driven Development. Start by writing a test that calls DoSearch and asserts that what you receive isn't null, and write some code to make that pass. Then write a test that asserts that you're retrieving the proper number of Books for a known search term, and write the code to make that pass. Eventually you'll get to a point where you need to receive actual, valid Book data to pass a test, and at that point, you'll have a portion of DoSearch written, and you can think about mocking it (or portions of it).

I'm trying to grasp test-driven development, and I'm wondering if these unit tests are fine. I have an interface which looks like this:

public interface IEntryRepository
{
    IEnumerable<Entry> FetchAll();
    Entry Fetch(int id);
    void Add(Entry entry);
    void Delete(Entry entry);
}

And then this class which implements that interface:

public class EntryRepository : IEntryRepository
{
    public List<Entry> Entries {get; set; }

    public EntryRepository()
    {
        Entries = new List<Entry>();
    }

    public IEnumerable<Entry> FetchAll()
    {
        throw new NotImplementedException();
    }

    public Entry Fetch(int id)
    {
        return Entries.SingleOrDefault(e => e.ID == id);
    }

    public void Add(Entry entry)
    {
        Entries.Add(entry);
    }

    public void Delete(Entry entry)
    {
        Entries.Remove(entry);
    }
}

These are the unit tests I have written so far; are they fine, or should I do something different? Should I be mocking the EntryRepository?

[TestClass]
public class EntryRepositoryTests
{
    private EntryRepository rep;

    public EntryRepositoryTests()
    {
        rep = new EntryRepository();
    }

    [TestMethod]
    public void TestAddEntry()
    {
        Entry e = new Entry { ID = 1, Date = DateTime.Now, Task = "Testing" };
        rep.Add(e);

        Assert.AreEqual(1, rep.Entries.Count, "Add entry failed");
    }

    [TestMethod]
    public void TestRemoveEntry()
    {
        Entry e = new Entry { ID = 1, Date = DateTime.Now, Task = "Testing" };
        rep.Add(e);

        rep.Delete(e);
        Assert.AreEqual(null, rep.Entries.SingleOrDefault(i => i.ID == 1), "Delete entry failed");
    }

    [TestMethod]
    public void TestFetchEntry()
    {
        Entry e = new Entry { ID = 2, Date = DateTime.Now, Task = "Testing" };
        rep.Add(e);

        Assert.AreEqual(2, rep.Fetch(2).ID, "Fetch entry failed");
    }
}

Thanks!

Here's some thoughts:

Positive

  • You're Unit Testing!
  • You're following the convention Arrange, Act, Assert

Negative

  • Where's the test to remove an entry when there's no entry?
  • Where's the test to fetch an entry when there's no entry?
  • What is supposed to happen when you add two entries and remove one? Which one should be left?
  • Should Entries be public? The fact that one of your asserts calls rep.Entries.SingleOrDefault suggests to me you're not constructing the class correctly.
  • Your test naming is a bit vague; typically a good pattern to follow is {MethodName}_{Context}_{ExpectedBehavior}, which removes the "test" redundancy.

As a beginner to TDD I found the book Test-Driven Development By Example to be a huge help. Second, Roy Osherove has some good Test Review video tutorials, check those out.

Just off the top of my head...

Your testing of add really only tests the framework:

  • You've got adding 1 item, that's good
  • what about adding LOTS of items (I mean, ridiculous amounts - for what value of n entries does the container add fail?)
  • what about adding no items? (null entry)
  • if you add items to the list, are they in a particular order? should they be?

likewise with your fetch:

  • what happens in your fetch(x) if x > rep.Count?
  • what happens if x < 0?
  • what happens if the rep is empty?
  • does fetch match performance requirements (what's its algorithmic complexity? is it within range when there's just one entry and when there's a ridiculously large number of entries)?

There's a good checklist in the book Pragmatic Unit Testing (good book, highly recommended)

  • Are the results right?
  • Are all the boundary conditions CORRECT?
    • Conforms to an expected format
    • Ordered correctly
    • In a reasonable range
    • Does it Reference any external dependencies?
    • Is the Cardinality correct? (right number of values)
    • Does it complete in the correct amount of Time (real or relative)?
  • Can you check inverse relationships?
  • Can you cross-check the results with another proven method?
  • Can you force error conditions?
  • Are performance characteristics within bounds?

I am reading Test Driven Development: By Example. All examples use Java and JUnit (I am on chapter 10). There is one test method that tests for equality of two objects. I have already overridden Equals on the class, but when I run my test it fails.

This is sample code

public class BaseX
{
    public string Test { get; set; }

    public override bool Equals(object obj)
    {
        return this.Test == ((BaseX)obj).Test;
    }

    public override string ToString()
    {
        return string.Format("Tyep: {0}, Test: {1}", this.GetType().Name, this.Test);
    }
}

public class A : BaseX
{

}

This is my test code

[Fact]
public void FunTest2()
{
    var b1 = new BaseX();
    var a1 = new A();

    b1.Test = "a";
    a1.Test = "a";

    Assert.Equal(a1, b1);
}

When I run the test, it fails with this message.

TDD1.UnitTest.UnitTest1.FunTest2 : Assert.Equal() Failure
Expected: Tyep: A, Test: a
Actual:   Tyep: BaseX, Test: a

I think Assert.Equal compares both the value and the type of the objects. So, I looked at the xUnit code and found that Assert.Equal calls IEqualityComparer.Equals. If I want to compare two objects with my overridden method, what method should I use?

Update:
I tested this on Windows 7, Visual Studio 11 Beta, xunit.net 1.9.0.1566 (got the files from NuGet).

Before comparing both objects using T's Equals method, xunit compares types:

// Same type?
if (!skipTypeCheck && x.GetType() != y.GetType())
    return false;

As I see it, you have two choices:

The simple choice

Assert.True(b1.Equals(a1));

It might be less expected than an Equal overload, but KISS...

The less simple choice

public class BaseXComparer : IEqualityComparer<BaseX>
{
    public bool Equals(BaseX x, BaseX y)
    {
        return x.Test.Equals(y.Test);
    }

    public int GetHashCode(BaseX obj)
    {
        return obj.Test.GetHashCode();
    }
}

And then:

Assert.Equal(a1, b1, new BaseXComparer());

In this case, consider this.

Until someone adds a new overload (shouldn't be tricky, as the inner implementation has a bool parameter for this) or an extension, I'd recommend using the simple method above.

Ideas on how to develop software that is capable of adapting to meet changing business requirements? Any patterns, architectures, etc. Possibly some anecdotal examples would be great. This is more of a survey than a concrete question. Thanks.

You will want to learn more about the entire Agile Development movement. Agile is just what it says: able to quickly adapt.

I have an application that returns data dependent on the time specified; I can specify days, months or years. The issue is that if I were to run the application today and ask it to return data from 1 month ago, and in 3 months' time I were to ask the application to return data from that date for the previous 1 month, the results will obviously be different. Due to the dynamic nature of this, I am finding it difficult to create unit tests, because I have to change the date depending on when I am running the tests. Does this symbolize bad design, or is this an exception case?

What you are doing is not a unit test. Unit tests should exercise a small "unit" of your code and your code only. By incorporating the system time, you are also testing the environment in which you are currently running. That is a system test. Unit tests are very effective tools to make sure that the code you wrote was written correctly, but it helps a lot if you write your code in a "testable" manner.

There are a few tricks that are easy to learn but difficult to master that will help you write testable code. They generally all follow the same pattern of creating what they call "seams" in your code and then injecting "stubs" or "mock objects" into those seams at test time.

The first important thing to figure out is where your seams go. This isn't that hard. Basically, any time you construct a new object, that's a good place for a seam. A prerequisite for this rule is that you have a pretty decent object-oriented design to begin with. (The guys over at the Google Testing Blog argue that you cannot unit test imperative code because you can't do dependency injection.) The other good place for a seam is any time you talk to an external data source, like the operating system, the file system, a database, the Internet, etc. This is what you are doing.

You need the system time. That's where your seam should go. I recommend you get a good book on this for a full treatment of all your options here, but here's one example of what you could do. There are at least 2 or 3 other ways to "inject your dependency" on the current system time. I'll use Python for pseudo-code, but it works in any OO-language:

import datetime

class MyClass(object):
    def _get_current_time(self):
        '''This is a test seam'''
        return datetime.datetime.now()

    def age(self):
        # Assumes self._birthday was set elsewhere (e.g. in __init__).
        return self._get_current_time() - self._birthday

Then in your test code, do this:

class FakeMyClass(MyClass):
    def __init__(self, test_time, *args, **kwargs):
        self._test_time = test_time
        MyClass.__init__(self, *args, **kwargs)

    def _get_current_time(self):
        return self._test_time

Now, if you test with FakeMyClass, you can inject whatever system time you want:

myclass = FakeMyClass(t)
self.assertEqual(myclass.age(), expected_age)

Again, this is a pretty big topic, so I recommend getting a good book.

I need good examples of JUnit tests for Java classes to use in training. Does anyone have suggestions of good examples?

From memory, I think Kent Beck's Test Driven Development walks through some good examples. It's probably a good book to refer to in testing training courses. http://www.amazon.co.uk/Test-Driven-Development-Addison-Wesley-Signature/dp/0321146530

I've heard this quite a bit when it comes to TDD: "You should always test before you code." Actually, I've never done full TDD, or probably haven't taken advantage of it, but how is it possible to test something that you haven't even written?

Can you give me a clear example of how to do this?

Read Test Driven Development: By Example for a thorough explanation.

A summary is: You don't write all of your tests before you write your code; you write one test, run it to be sure it fails (if it passes before you write the code, you have a bad test), then code enough to make it run. Now you know you have that functionality written and tested. At this point you refactor (if there is any refactoring to do), then move on to write the next test.

The advantage is that by adding small pieces of functionality with tests, you end up with both a full suite of tests and a slim, well-organized design. The claim is that this design will be better than what you probably would have written had you not used test-first design (and there is some evidence to back this up).

There are lots of questions on SO about TDD, and a lot of misconceptions. Where can I point people to when trying to answer questions?

There's a great book on this topic, Test-Driven Development By Example by Kent Beck, and in this book TDD looks like this (in my own words):

  1. Write a test;
  2. Make it compile;
  3. Make it pass;
  4. Remove redundancies.

This is something I know I should embrace in my coding projects but, due to lack of knowledge and my legendary laziness, I do not.

In fact, when I do write them, I feel overloaded and finally give up.

What I am looking for is a good book/tutorial on how to really write good tests - i.e. useful and covering a large spectrum of the code.

Regards

An excellent book that covers unit tests for legacy code in particular is Michael Feathers' "Working Effectively with Legacy Code":

Working Effectively with Legacy Code

If you are looking for a good book going over how to unit test, I would recommend Kent Beck's Test Driven Development: By Example. He is the person who really started the idea of unit testing, so regardless of language this would be a great book to read to get a good foundation.

Also, don't let the title discourage you. It does talk about TDD, but it's really just a good, easy overview of how to write effective unit tests and how they should affect your design, which is a key component of writing unit tests.

We have a group of a few developers and some business analysts. We as developers would like to start adding unit testing as part of our coding practices so that we can deliver maintainable and extensible code, especially since we will also be the ones supporting and enhancing the application in the future. But in this economic downturn we are struggling with the push to get started because we are challenged to just deliver solutions as fast as possible, with quality not being the top priority. What can we do or say to show that we will be able to deliver faster and with higher quality, as well as preparing for future enhancements.

Basically we just need to get over the learning curve of incorporating unit testing into our daily work, but we cannot do that now because it is viewed as an unnecessary overhead that would delay our projects that the business needs now.

We as developers want to provide the highest value to the business, especially quickly, but we know that we will also need to do this 6 months from now and we need to plan for that as well, and we believe that unit testing will help us greatly down the line.

EDIT: All awesome input, thank you. I personally know how to write unit tests, but I don't have the experience to say whether or not a given unit test is good. I have just ordered Test Driven Development: By Example and will take the initiative to get the ball rolling on incorporating unit testing in our group.

Start unit testing the functions or classes when you create them. Begin with simple classes/functions that do not have external dependencies (DB, file system).

Share your progress inside the team. Count the number of tests and display a big chart showing your progress (unless the management/analysts are very hostile against unit testing).

Read about TDD: "Test-Driven Development: By Example". Writing the tests first leads to code that can be easily tested. If you write the production code first, you may have a hard time putting it under test.

What are the best books to learn about JUnit, jMock and testing generally? Currently I'm reading Pragmatic Unit Testing in Java; I'm on chapter 6. It's good, but it gets complicated. Is there a book that works from the bottom up? From your experience, which helped you get the testing concept?

Test Driven Development by Kent Beck is the original. Read it - it's great. But the best way to learn is to practice. Check out the different katas (exercises) at the dojo site.

For me, the best thing that has helped me learn unit testing is reading the many blogs out there.

After that there are books such as Test Driven Development by Example by Kent Beck, xUnit Test Patterns, The Art of Unit Testing, etc.

Some books are for Java, others for C#... I don't really think it matters which language you read about TDD in, as it all helps in one way or another.

  • How to write a unit test framework?
  • Can anyone suggest some good reading?

I wish to work on the basic building blocks that we use as programmers, so I am thinking of developing a unit test framework for Java. I don't intend to write a framework that will replace JUnit; my intention is to gain some experience by doing a worthy project.

There are several books that describe how to build a unit test framework. One of those is Test-Driven Development: By Example (TDD) by Kent Beck. Another book you might look at is xUnit Test Patterns: Refactoring Test Code by Gerard Meszaros.

  • Why do you want to build your own unit test framework?
  • Which ones have you tried and what did you find that was missing?

If (as your comments suggest) your objective is to learn about the factors that go into making a good unit test framework by doing it yourself, then chapters 18-24 (Part II: The xUnit Example) of the TDD book show how it can be done in Python. Adapting that to Java would probably teach you quite a lot about Python, unit testing frameworks and possibly Java too.
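To give a feel for how small the core of such a framework can be, here is a toy xUnit-style runner in Java - my own bare sketch, not the book's Python version. It uses reflection to find and invoke every public method whose name starts with "test":

import java.lang.reflect.Method;

// A toy xUnit core: subclass it, name your test methods "test...",
// and call run() to invoke each one and report failures.
public class TinyTestCase {
    public void run() {
        for (Method method : getClass().getMethods()) {
            if (!method.getName().startsWith("test")) continue;
            try {
                method.invoke(this);
                System.out.println(method.getName() + " passed");
            } catch (Exception e) {
                // Reflection wraps the real failure; unwrap it for the report.
                System.out.println(method.getName() + " FAILED: " + e.getCause());
            }
        }
    }

    protected void assertTrue(boolean condition) {
        if (!condition) throw new AssertionError("assertion failed");
    }
}

A subclass like class MathTest extends TinyTestCase { public void testAddition() { assertTrue(1 + 1 == 2); } } then only needs new MathTest().run(). Real frameworks add setup/teardown, result collection and reporting on top of this skeleton.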

It will still be valuable to you to have some experience with some unit test framework so that you can compare what you produce with what others have produced. Who knows, you might have some fundamental insight that they've missed and you may improve things for everyone. (It isn't very likely, I'm sorry to say, but it is possible.)

Note that the TDD people are quite adamant that TDD does not work well with databases. That is a nuisance to me as my work is centred on DBMS development; it means I have to adapt the techniques usually espoused in the literature to accommodate the realities of 'testing whether the DBMS works does mean testing against a DBMS'. I believe that the primary reason for their concern is that setting up a database to a known state takes time, and therefore makes testing slower. I can understand that concern - it is a practical problem.

I'm new to unit tests for my own projects, so this is my first attempt to write a unit test from scratch. I'm using python, and the unittest module. The TodoList class being tested here is a wrapper for actual lists, with a few extra methods for stuff like saving to disc. It also defines a few methods for getting items by their ID in the list (which isn't the same as the list index).

Tests (I've cut out a few helper methods, and a good few tests, for the sake of not making people scroll forever):

class TodoListTests(unittest.TestCase):

    def setUp(self):
        self.testdata = open("./testdata.json", "r")
        self.testdata_text = self.testdata.read()
        self.testdata.close()

    def tearDown(self):
        try:
            os.remove("./todo.json")
        except OSError:
            # File not created, no need to delete.
            pass

    def create_todolist_and_safe_list(self):
        self.create_data_file()
        self.todolist = todolist.TodoList("./todo.json")
        self.list = json.loads(self.testdata_text)

    def create_data_file(self):
        datafile = open("./todo.json", "w")
        datafile.write(self.testdata_text)
        datafile.close()

    # Snip out a few more helper methods

    def test_loop(self):
        self.create_todolist_and_safe_list()
        test_list = []
        for item in self.todolist:
            test_list.append(item)

        self.assertEquals(test_list, self.list)


    def test_save(self):    
        self.create_todolist_and_safe_list()
        self.todolist.save()
        newfile_text = self.get_data_file_as_string()
        self.assertEquals(newfile_text, self.testdata_text)

    # Snip out the rest of the tests.

Full link to source

I think that you are going the right way, but I have some suggestions:

  • Move the self.testdata.close() from setUp() to the tearDown() function.
  • Surround the other open/read calls with try/finally blocks, so that a file that was opened successfully is always closed, even if reading it fails:

    f = open("./testdata.json", "r")
    try:
        data = f.read()
    finally:
        f.close()

  • Organize your test folders better. I suggest you create a folder named _tests, and inside this folder you should put the test modules (in your case you only have one module). Then, for each module, create a folder with the name of the module and put the files used by that module's tests inside this folder.

To know more about TDD and tests you should read the book Test Driven Development: By Example

I'm new to Unit Tests so I've been trying to code some examples to learn the right way to use them. I have an example project which uses Entity Framework to connect to a database.

I'm using an n-tier architecture composed by a data access layer which queries the database using EF, a business layer which invokes data access layer methods to query database and perform its business purpose with the data retrieved and a service layer which is composed of WCF services that simply invoke business layer objects.

Do I have to code unit tests for every single layer (data access, business layer, service layer)?

What would be the right way to code a unit test for a method that queries a database? The following code is an example of a method in my data access layer which performs a select on the database; what should its unit test look like?

public class DLEmployee
{

    private string _strErrorMessage = string.Empty;
    private bool _blnResult = true;

    public string strErrorMessage
    {
        get
        {
            return _strErrorMessage;
        }
    }
    public bool blnResult
    {
        get
        {
            return _blnResult;
        }
    }

    public Employee GetEmployee(int pintId)
    {
        Employee employee = null;
        _blnResult = true;
        _strErrorMessage = string.Empty;

        try
        {
            using (var context = new AdventureWorks2012Entities())
            {
                employee = context.Employees.Where(e => e.BusinessEntityID == pintId).FirstOrDefault();
            }
        }
        catch (Exception ex)
        {
            _strErrorMessage = ex.Message;
            _blnResult = false;

        }

        return employee;
    }
}

Here are my 2 cents based on Domain Driven Design principles:

  • Your business layer should not depend on the concrete data layer, it should just define some abstract interfaces that the data layer can implement (repositories).
  • You should definitely unit-test your business layer with a fake data layer that does not touch the file system.
  • You might create integration tests including your service and business layer with a fake data layer. There is no point in mocking out the business layer and checking what the service layer calls on the business layer (behavior testing); rather, check what state changes it makes on the business objects that are observable through the business layer.
  • You should create some end-to-end tests with a real data layer, the service and business layer and exercise some use-cases on the service layer.

If you have just started on unit testing, I advise you to read Kent Beck's Test Driven Development by Example and Gerard Meszaros' xUnit Test Patterns.

I'm reading through Test Driven Development: By Example and one of the examples is bugging me. In chapter 3 (Equality for all), the author creates an equals function in the Dollar class to compare two Dollar objects:

public boolean equals(Object object)
{
    Dollar dollar= (Dollar) object;
    return amount == dollar.amount;
}

Then, in the following chapter (4: Privacy), he makes amount a private member of the Dollar class:

private int amount;

and the tests pass. Shouldn't this cause a compiler error in the equals method? While the object can access its own amount member, isn't it restricted from accessing the other Dollar object's amount member?

// shouldn't dollar.amount no longer be accessible?
return amount == dollar.amount

Am I fundamentally misunderstanding private?

UPDATE I decided to go back and code along with the book manually and when I got to the next part (chapter 6 - Equality For All, Redux) where they push amount into a parent class and make it protected, I'm getting access problems:

public class Money
{
    protected int amount;
}

public class Dollar : Money
{
    public Dollar(int amount)
    {
        this.amount = amount;
    }
    // override object.Equals
    public override bool Equals(object obj)
    {
        Money dollar = (Money)obj;
        //"error CS1540: Cannot access protected member 'Money.amount'
        // via a qualifier of type 'Money'; the qualifier must be of 
        // type 'Dollar' (or derived from it)" on the next line:
        return amount == dollar.amount;
    }
}

Does this mean that protected IS instance-based in C#?

Yep, you're fundamentally misunderstanding private. Privacy is class-specific, not instance-specific.
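A tiny Java example of what class-level privacy means - the book's equals compiles precisely because both objects are Dollars:

public class Dollar {
    private int amount;

    public Dollar(int amount) {
        this.amount = amount;
    }

    public boolean equals(Object object) {
        Dollar dollar = (Dollar) object;
        // Legal: 'private' restricts access per class, not per instance,
        // so this Dollar may read another Dollar's private field.
        return amount == dollar.amount;
    }
}

C#'s private works the same way. The CS1540 error in your update is a different rule, about protected: C# only allows access to a protected member through a qualifier whose compile-time type is the accessing class (or a subclass), regardless of which instance it is. Java has a similar rule for protected, but relaxes it for classes in the same package, which is why the book's Java version still compiles.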

I'm really getting frustrated with learning how to properly develop software using TDD. It seems that everyone does it differently and in a different order. At this point, I'd just like to know what all the considerations are. This much is what I've come up with: I should use RSpec and Capybara. With that said, what are all the different types of tests I need to write to have a well-built and well-tested application? I'm looking for a list that comprises the area of my application being tested, the framework needed to test it, and any dependencies.

For example, it seems that people advise starting by unit testing your models, but when I watch tutorials on TDD it seems like they only write integration tests. Am I missing something?

End-to-end development of real-world applications with TDD is an underdocumented activity indeed. It's true that you'll mostly find schoolbook examples, katas and theoretical articles out there. However, a few books take a more comprehensive and practical approach to TDD - GOOS for instance (highly recommended), and, to a lesser extent, Beck's Test Driven Development by Example, although they don't address RoR specifically.

The approach described in GOOS starts with writing end-to-end acceptance tests (integration tests, which may amount to RSpec tests in your case), but within that loop you code as many TDD unit tests as you need to design your lower-level objects. When writing those you can basically start where you want - from the outer layers, the inner layers, or just the parts of your application that are most convenient to you. As long as you mock out any dependency, they'll remain unit tests anyway.

I think I am pretty good with C# syntax. What I am looking for now is some resources - books (preferably), websites, blogs - that deal with the best way to design object-oriented desktop applications and web applications, especially when it comes to data and databases.

Thanks

Martin Fowler's Enterprise Application Architecture is a great book for common patterns you'll see in a lot of client-server applications.

More of a book on thinking about object-oriented problems is Eric Evans' Domain-Driven Design: Tackling Complexity in the Heart of Software.

You are asking to drink from a firehose. Let me encourage you to write some small programs before you tackle big ones. However, here are a few books about design and a paper which argues that a lot of design can't be learned from books:

  • On System Design is a good short paper that articulates what a lot of experienced programmers think about the art of design.

  • Programming Pearls by Jon Bentley presents some lovely examples of design in the small. It's a fun read and includes many classic stories.

  • The Unix Programming Environment by Kernighan and Pike presents one of the great software-design philosophies of the 20th century. Still required reading after almost 25 years.

  • Software Tools in Pascal is narrower and deeper but will tell you a lot about the specifics of building software tools and the design philosophy.

  • Abstraction and Specification in Program Development by Barbara Liskov and John Guttag will teach you how to design individual modules so they can fit with other modules to form great libraries. It is out of print but your local University library may have it.

  • C Interfaces and Implementations presents a very well designed library that gives C programmers the abstractions found in much higher-level languages.

  • Finally, Test-Driven Development will teach you how to articulate and develop a design through the stuff that matters: what your software actually does.

I learned a lot from Composite/Structured Design by Glenford Myers, but it bears a little less directly on the topics you asked about. It talks primarily about good and bad ways modules can interdepend.

For a book on how to develop software I would recommend The Pragmatic Programmer. For design you may want to look at Interface Oriented Design. Code Complete is an "A to Z" reference on developing software. You might also want to consider the O'Reilly Head First books, especially Head First Object-Oriented Analysis and Design, as something a little easier to start with.

EDIT: I don't know how I forgot about Bob Martin, but you could also read any of the books that Object Mentor has on any of its lists. Here is their section on Software Design. In particular I'd recommend Agile Software Development: Principles, Patterns, and Practices (Amazon, but it's also the second book on the Object Mentor list).

I haven't been thrilled with any of the recent books, so much so that I'm seriously thinking about writing a new one. The "Head First" books generally have read to me like one step above the "For Dummies" books (to be fair, I haven't read that one).

I'm actually fond of Peter Coad's Java Design; you can get one cheaply used, it's no longer in print. Obviously, it's Java heavy, but the design part is good, and pretty lightweight.

Ivar Jacobson's Object Oriented Software Engineering is also very good (it introduced the idea of "use cases", among other things) and does appear to still be in print, but there are zillions of used copies around.

OK, I know what a unit test is, but I use them in some projects and not others... some clients don't know how it's done and follow one convention... blah blah.

So here I am asking: how EXACTLY is a unit test process created? I hear, and read, that you write the tests first, then the functionality, then tests for that functionality, and also use code coverage to identify any "slips" - code which has not been covered by tests.

So, lets use a simple example:

Requirement: "application must return the result of 2 numbers combined."

You and I know we would have a class, something like "Addition", and a method "Add" which returns an integer, like so:

public class Addition
{
   public int Add(int num1, int num2)
   {
      return num1 + num2;
   }
}

But even before writing this class, how do you write tests first? What is your process? What do you do? What would the process be when you have that spec doc and are going into development?

Many thanks,

The process you're referring to is called Test-Driven Development. The idea is simple and close to what you described: given functionality, you start writing code by writing a test for this functionality. In your Add example, before any working code is written you should have a simple test - a test that fails.

Failing Test

[Test]
public void TestAdd()
{
     var testedClass = new Addition();

     var result = testedClass.Add(1, 2);

     Assert.AreEqual(3, result);
}

This is a simple test for your Add method, stating your expectations of the soon-to-be working code. Since you don't have any code just yet, this test will naturally fail (as it is supposed to - which is good).
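
(Strictly speaking, for the test above to compile and go red rather than fail to build, the class needs to exist with a deliberately wrong body first - a minimal sketch:)

public class Addition
{
    public int Add(int num1, int num2)
    {
        return 0; // deliberately wrong: the test compiles, runs and fails (red)
    }
}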

Passing test

The next step is to write the most basic code that makes the test pass (naturally, the most basic code is return 3;, but for this simple example that level of detail is not necessary):

public int Add(int num1, int num2)
{
    return num1 + num2;
}

This works and the test passes. What you have at this point is basic proof that your method works the way you stated in your assumptions/expectations (the test).

However, you might notice that this is not a good test; it checks only one of many possible inputs. Not to mention, in some cases one test might not be enough, and even though you had initial requirements, testing might reveal that more is needed (for example argument validation or logging). This is the point where you go back to reviewing your requirements and writing tests, which leads us to...

Refactor

At this point you should refactor the code you just wrote - and I'm talking about both the unit test code and the tested implementation. Since the Add method is fairly simple and there's not much you can improve in terms of adding two numbers, you can focus on making the test better. For example:

  • add more test cases (or consider data driven testing)
  • make test name more descriptive
  • improve variables naming
  • extract magic numbers to constants

Like this:

[TestCase(0, 0, 0)]
[TestCase(1, 2, 3)]
[TestCase(1, 99, 100)]
public void Add_ReturnsSumOfTwoNumbers(int first, int second, int expectedSum)
{
     var testedClass = new Addition();

     var actualSum = testedClass.Add(first, second);

     Assert.That(actualSum, Is.EqualTo(expectedSum));
}

Refactoring is a topic worth its own book (and there are many), so I won't go into details. The process we just went through is often referred to as Red-Green-Refactor (red indicating a failing test, green a passing one), and it's part of TDD. Remember to rerun the tests one more time to make sure refactoring didn't accidentally break anything.

This is how the basic cycle goes: you start with requirements, write a failing test, write code to make the test pass, refactor both code and tests, review the requirements, and proceed with the next task.

Where to go from here?

A few useful resources that make a good natural follow-up once you know the idea behind TDD (even from as brief an explanation as this post):

Over the summer, I've been reading a lot about design. Two notable pieces that I read were Test Driven Development: By Example by Kent Beck, and another named Pattern-Oriented Analysis and Design: Composing Patterns to Design Software Systems by Yacoub.

These books take two very different approaches to designing software systems. My impression is that both of these techniques are at opposite ends of the spectrum. I come to this conclusion because:

  • In POAD, we design strictly from patterns. In TDD, we refactor to patterns on an as-needed basis.
  • In TDD, we keep models lightweight. In POAD, we rigorously develop models at different levels of granularity. We use code generation and round-tripping to keep the models and code consistent.
  • POAD seems more theoretical, and less proven to work. Many software developers practice TDD.

I realize that TDD is well proven to work, though I see a lot of reasoning behind using a pattern-oriented approach to design:

  • Patterns offer a higher level of abstraction than classes.
  • Patterns allow you to ignore details of higher granularity.
  • Trying to understand the system only in terms of code may become difficult; POAD, however, provides higher-level models that give traceability. Using these models in conjunction with the code can make the system easier to understand, faster.

So, given this, are these methodologies competing and mutually exclusive? I feel like I prefer POAD, though I certainly see TDD as a valuable design methodology. Is there a preferred approach?

I wouldn't consider those two approaches competing or mutually exclusive.

Even using POAD, you're still writing code. That code can still be Unit Tested. You can still write those tests before you write the code, watch them fail, write the code, and check for success.

It seems to me that they should be able to walk happily down the road to better code, hand in hand.

BACKGROUND

I'm writing a little game using Java, slick2d and other frameworks. Slick2d does not make it easy to write unit tests, but that's something I can't get around. One of the goals of the project was to have some test coverage but...

THE PROBLEM

Well... I wrote a 200-line test case, with 15 tests, and all for a class with only a single method.

I tested everything I could think of: invalid arguments, combinations of invalid arguments, swapping method calls and so on. I know I can't test everything, and I know I don't need to test code from libraries (Java API, slick2d API, logback API, etc.), but even in that case, I can get pretty crazy with tests, and I believe that I won't be able to finish it if I write 15 tests for every method I create. So...

THE QUESTION

Where does good TDD draw the line at writing tests? Exactly what should I test, and what can I safely ignore?

OBS: For those of you wondering, the single-method class for which I wrote 15 tests was loading some strings into an array, and its method would retrieve the string, given the line and file as argument.

OBS2: I'm not skeptical of unit testing at all. I actually want to incorporate tests in my project (wherever my API allows me) from the ground up. I just want to finish the project too, and not die writing tests all day long.

I would suggest the following book: http://www.amazon.com/dp/0321146530/?tag=stackoverfl08-20 from Amazon.
Besides the book recommendation: when you design your tests, you have a lot of work at the beginning, but at some point, for every new piece of code, most of your test logic will already be in place.
I would also suggest making sure you are focused on intrusion prevention as well (code that tests for SQL injection, buffer overflow and so on).
Another point to remember: when the one who wrote the code is the one who wrote the tests, you might want someone else to try to break it... not for everything, but at least for part of it.

Duplicate: http://stackoverflow.com/questions/135651/learning-unit-testing


I'm trying to develop some software for my research group to analyze and plot experimental data. I would like to make it pretty error-free. Would this be a situation for unit testing? If so, could you possibly point me to some good references for unit testing?

Pretty much any code is a good candidate for unit testing; it will help you document the intent of your code, and prove that your code works as intended. It will definitely help you find bugs before releasing your code to the rest of your group. You don't say what platform you're using, so I can't recommend a testing framework, but I can recommend Kent Beck's excellent Test Driven Development: By Example as a good general starting place.
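
The question doesn't name a platform, so purely as a flavor of what a unit test for analysis code looks like, here is a minimal sketch in C# with NUnit (the Statistics class and its Mean method are invented for the example):

public static class Statistics
{
    // hypothetical analysis routine: arithmetic mean of the samples
    public static double Mean(double[] samples)
    {
        double sum = 0.0;
        foreach (var x in samples) sum += x;
        return sum / samples.Length;
    }
}

[Test]
public void Mean_OfKnownSamples_MatchesHandComputedValue()
{
    double[] samples = { 1.0, 2.0, 3.0 };

    // 1 + 2 + 3 = 6; 6 / 3 = 2
    Assert.AreEqual(2.0, Statistics.Mean(samples), 1e-9);
}

The value of such tests for research code is documenting the expected numeric behavior against hand-computed cases, so a later "optimization" can't silently change the results.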

There's one question I can't find an answer to, concerning TDD with the outside-in approach:

I implement a new unit (A), write a test for it and this unit needs a dependency (B) that does not exist yet. In my test it's easy to mock this dependency, but what do I do in my production code?

Do I implement (B) first and let my tests for (A) fail meanwhile because I haven't gone on implementing it to make its tests pass yet?

Or do I complete (A) first and meanwhile let tests for (B) fail because it e.g. just returns "empty" objects instead of actually doing what its specification tells it to do?

Or should I let (B)'s tests temporarily check that it returns "empty" objects while I keep implementing (A) - although that's actually not what (B)'s specification is?

The fundamental strategy of TDD is to keep all your tests passing except for the one you're working on right now. Make (A)'s tests pass before you worry about (B).

The order in which you'd write tests and code for a class (A) and its complicated dependency (B) is

  • Write a test for (A). [Suite is red.]
  • Begin implementing enough of (A) to get the test you just wrote to pass. Discover that you need (B). [Suite is red.]
  • Mock (B). [Suite is red.]
  • Finish making the test of (A) that you just wrote pass. [Suite is green. Ahhh!] Refactor.
  • If you're not at a good stopping point with (A) yet, go back to the top and repeat until you are at a good stopping point with (A).
  • Write a test for (B) that requires (B) to do part or all of what the mock of (B) does. [Suite is red.]
  • Make the test you just wrote pass. [Suite is green. Ahhh!] Refactor.
  • If you haven't replicated all of what the mock of (B) does in tests and code of (B) yet, go back two steps and repeat until you've replicated all of what the mock of (B) does.

At this point you can choose to work some more on (A) or (B) or start something new.
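
To make steps 1-4 concrete, here is a compressed sketch in C# with a hand-rolled stand-in for (B) (all the names here are invented for illustration):

// (B)'s role, discovered while implementing (A):
public interface IExchangeRateSource
{
    decimal RateFor(string currency);
}

// The mock of (B): just enough behavior for (A)'s test to pass.
public class FixedRateSource : IExchangeRateSource
{
    public decimal RateFor(string currency) { return 2m; }
}

// (A), finished against the mock while the real (B) does not exist yet:
public class PriceConverter
{
    private readonly IExchangeRateSource rates;

    public PriceConverter(IExchangeRateSource rates) { this.rates = rates; }

    public decimal Convert(decimal amount, string currency)
    {
        return amount * rates.RateFor(currency);
    }
}

[Test]
public void Convert_MultipliesByTheRate()
{
    var converter = new PriceConverter(new FixedRateSource());

    Assert.AreEqual(10m, converter.Convert(5m, "CHF")); // green while (B) is still a mock
}

Later, a test for the real (B) requires it to do what FixedRateSource pretends to do, and the fake can eventually be retired.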

Although this strategy keeps your tests passing at all times, it does not ensure that your application does something useful right away. The way to ensure that your application eventually does something useful goes beyond TDD: start by writing an acceptance test (which runs against the entire application without mocks) and TDD until the acceptance test and your unit tests all pass.

Acceptance tests (or other integration tests) also ensure that you correctly replicate mocks in your tests of and code for mocked classes.

Note also that it's critical to keep track of requirements that you've thought of but not implemented yet, or that you've 'implemented' only in a mock and need to implement in tests of and code for the mocked dependency. That's why TDD By Example and other examples of how TDD is done talk so much about actual or mental to-do lists. In the case of a class (A) with a mocked dependency (B), after you write the mock you can either go back to working on (A) or implement in (B) what you just did with the mock. Either way, you have to keep track of what you chose not to do until you're ready to go back and do it.

Do you do test-first anyway? Or do you, in some cases, write some code first and then write your tests to make sure the code works? As for me, I prefer to create the class first. Sure, during class creation I think about its interface and how to test the class, but I don't write the testing code first. Do you write it first? Do you think you should always write test code first?

I'm not a purist in this matter (TDD involves more than just writing the tests first, it's also about initially writing very minimal, "hard coded" tests and refactoring them a lot -- see The Book by The Master himself).

I tend to test-first when I'm doing incremental development to add a feature to an existing module, and I insist on test-first when the incremental development I'm doing is to fix a bug (in the latter case I absolutely want a unit-test AND an integration-test that both reproduce the bug, before I fix the code that caused the bug).

I tend to be laxer when I'm doing "greenfield" development, especially if that's of an exploratory, "let's see what we can do here that's useful", nature -- which does happen, e.g. in data mining and the like -- you have a somewhat vague idea that there might be a useful signal buried in the data, some hypothesis about its possible nature and smart ways to [maybe] extract it -- the tests won't help until the exploration has progressed quite a bit.

And, once I start feeling happy with what I've got, and thus start writing tests, I don't necessarily have to redo the "exploratory" code from scratch (as I keep it clean and usable as I go, not too hard to do especially in Python, but also in R and other flexible languages).

I am preparing a presentation. My topic is innovative software engineering methodologies. Agile is a modern and innovative methodology, but is the answer just Agile? What are the other innovative and modern methodologies? Are test-driven development and behavior-driven development also innovative methodologies? And is eXtreme Programming a traditional methodology like waterfall?

I am not sure we can categorize these methodologies or frameworks as innovative, traditional or something else.

The choice of a methodology or framework depends completely on the product and customer needs. Any of them that meets the product requirements and makes your team efficient can be innovative in that scope.

Most software development today means developing complex products in complex environments. I totally agree that agile methodologies, extreme programming, TDD and BDD match that definition of developing complex products in complex environments very well; most agile methodologies grew out of exactly that kind of development.

Agile methodologies

The term agile is a really popular term among software development professionals. There are lots of agile methodologies and frameworks, like Scrum, Kanban or XP, which suggest methods that make us agile; the term agile covers all of them. Most of them address prediction, adaptation, transparency, inspection and empirical process control - problems faced during software development.

Extreme Programming

Extreme programming focuses on developing quality software and adapting to changing requirements and environments. Honestly, I really like XP. It does not only suggest development practices; it also offers guidance on customer management, cost management, etc. It is really basic, but hard to implement. I highly suggest reading the book Extreme Programming Explained by Kent Beck.

See Also :

Extreme Programming

Extreme Programming Explained, by Kent Beck

Scrum

Scrum is another framework for software development, based on empirical process control: transparency, inspection, and adaptation. It is really simple and defines a few roles and events for software development. The roles are Scrum Master, Product Owner and Development Team. The events are Sprint Planning, Daily Scrum, Sprint Review and Sprint Retrospective. I suggest reading the Scrum Guide for further information.

See Also

Scrum Guide

Test Driven Development

Test-driven development is a software development process. I would not say it is an agile methodology in itself; rather, it helps software development be agile. Test-driven development asks developers to test in the first phase, and it requires a mindset of thinking about the test before every piece of development. It is not only about writing unit tests.

See also

Test-driven development

Test-driven development by Martin Fowler

Test Driven Development: By Example, Kent Beck

Behavior-driven development

It is another software development process, and it emerged from test-driven development. It focuses on cross-team collaboration: development, management and the customer share tools and processes so they have the same understanding of the requirements. BDD suggests that business people, customers and technical teams should have the same understanding of the product. Customer requirements - let's say customer sentences - can then be tested automatically by the tools.

See also

Behavior-driven development

10 Tips for Writing Good User Stories

Cucumber.io

Summary

The term Agile by itself is incomplete without XP, Scrum, Kanban or some other methodology or framework; and any agile methodology or framework is incomplete without TDD, BDD or continuous integration. All of these must be supported by the company culture, the customer and the business people. Every stakeholder in the project should have a product-over-project mindset. Otherwise, agile methodologies may not be helpful.

As a last word, I highly suggest getting a good understanding of continuous integration.

See also

Continuous Integration

Products over projects

The Clean Coder: A Code of Conduct for Professional Programmers

I am reading this book, and there is a code example which seems confusing to me; the confusing part of the fragment is described below:

private Hashtable rates = new Hashtable();

void addRate(String from, String to, int rate) {
  rates.put(new Pair(from , to), new Integer(rate)); // Pair is a value object.
}

int rate(String from, String to) {
  Integer rate = (Integer) rates.get(new Pair(from, to)); // The confusing part.
  return rate.intValue();
}

Why does the author need to typecast when getting the value from the Hashtable if the value was already of type Integer?

As per the documentation, the default usage of Hashtable creates a hash map where both the key and the value are of type Object.

Although you are placing Integer values, when you read them back you have no guarantee of what the object is, thus you need to typecast it. Since what you are placing can be cast to an Integer, the cast does not fail.

An alternative which does not require casting would be to use the generic version (Hashtable<Pair, Integer>), whose get method returns Integer directly.

Something to note: you are reading this:

Product Details: Paperback, 240 pages. Publisher: Addison-Wesley Professional; 1 edition (November 18, 2002). Language: English. ISBN-10: 0321146530. ISBN-13: 978-0321146533.

For that date, that could be either Java 1.3 or, at the most, Java 1.4.

So that must be a really old book; the current Java Hashtable uses generics, and the get method returns the type parameter and not Object:

public V get(Object key)

so for newer Java versions no cast is needed...


Please use newer literature - technology changes really fast... we are almost at Java 8 and looking forward to 9.


Edit:

I've found what I was looking for:

In Java 4:

the Hashtable class get method returns an Object, and a cast is necessary...

but in Java 7, for example:

the Hashtable class get method returns type V, and a cast is NOT necessary...


I am teaching myself PHP as well as TDD (Using PHP and PHPUnit). I am working through the book Test-driven Development by Example by Kent Beck.

In Chapter 3, for example, he suggests this for the equality test:

public void testEquality() {
    assertTrue(new Dollar(5).equals(new Dollar(5)));
}

In the Dollar class he rewrites the equals method as such:

public boolean equals(Object object) {
   Dollar dollar = (Dollar) object;
   return amount == dollar.amount;
}

It probably does not help that I am new to PHP, but I am not sure how to translate that into PHP.

For the first function I tried:

public function testEquality(){
    $a = new Dollar(5);
    $this->assertTrue($a->equals($b = new Dollar(5)));
}

Is this the right track? As far as I know of PHP right now, objects have to be assigned to a variable, correct? Before that route I explored

$this->assertTrue(new Dollar(5)->equals(new Dollar(5))); 

which threw a syntax error. Surprising, since refactoring the earlier testMultiplication method with $this->assertEquals(new Dollar(10), $five->times(2)); passed.

As far as the equals method is concerned, that's completely foreign to me, and I just don't know where to start.

How can I correctly reconstruct the above in PHP? If I can get a few right I think I can handle the rest of the examples.

As a side question, does anyone know if this Money example has been approached in PHP and is the code out there?

Your code is the following

public boolean equals(Object object) {
   Dollar dollar = (Dollar) object;
   return amount == dollar.amount;
}

And it means the following (line by line):

  1. Define a new method called equals that returns either true or false and expects an object of type Object (in Java that means any object) as an argument.
  2. Convert the object of type Object to type Dollar (see the wiki on type conversion); now you are able to use the public methods and attributes of this object.
  3. Compare the attribute amount from the current context with the one that comes from dollar (which comes from object).

You can now translate the method line by line to PHP without the need of typecasting

public function equals($object)
{
    return $this->amount == $object->amount;
}

It is possible to use a different method declaration:

public function equals(Dollar $object)

This way you can be sure you only compare two objects of the same type.

The usage is similar to the Java one as well:

$a = new Dollar(5);
$b = new Dollar(5);
if ($a->equals($b))
    print "TRUE";
else
    print "FALSE";

To learn more about object comparison in PHP, you better read the manual

I am attempting to mock some code that uses reflection (code below). I have been advised to use NSubstitute, but I am struggling with how to implement this and get started.

At the moment my test stubs are simply like the one below; however, on the build server these obviously fail, as the DLLs are not present.

   [TestMethod]
    public void CanGetStudentXml()
    {
        var student = new ReadStudent();

        var results = student.GetStudentXml();

        Assert.AreNotEqual(string.Empty, results);
    }

Can anyone give me any pointers on how I should go about doing this? Do I need to create mock assemblies? If so, based on the one below, how would I achieve that?

Also, is NSubstitute the best for the job, or would Moq be better suited? Which would be the best mocking framework to use?

Sample code:

namespace MokPoc
{
using System.Reflection;
using System;
using System.Linq;

class Program
{
    static void Main(string[] args)
    {
        var students = new ReadStudent();
        var results = students.GetStudentXml();
        var contacts = students.GetTelephoneXml();
    }
}

public enum ReflectedAssembyType
{
    SimsProcessesTpPersonStudent,
    SimsProcessesTpPersonContact
}

internal class ReflectedAssemblyFactory
{
    public static ReflectedAssemblyBase GetReflectedAssembly(ReflectedAssembyType reflectedAssembyType)
    {
        ReflectedAssemblyBase value = null;

        switch (reflectedAssembyType)
        {
            case ReflectedAssembyType.SimsProcessesTpPersonStudent:
                value = new SimsProcessesTpPersonStudent("ThirdPartyProcesses.dll");
                break;
            case ReflectedAssembyType.SimsProcessesTpPersonContact:
                value = new SimsProcessesTpPersonContact("PersonContacts.dll");
                break;
        }
        return value;
    }
}

internal abstract class ReflectedAssemblyBase
{
    private string path = string.Empty;
    private string type = string.Empty;

    public string Path
    {
        get { return this.path; }
        set { this.path = value; }
    }
    public string Type
    {
        get { return this.type; }
        set { this.type = value; }
    }

    public object InvokeFunction(string name, object[] args)
    {
        var assemblyToLoad = Assembly.LoadFrom(this.path);
        var typeToLoad = assemblyToLoad.GetType(this.type);
        var methodToInvoke = typeToLoad.GetMethod(name, args.Select(o => o.GetType()).ToArray());
        object obj = Activator.CreateInstance(typeToLoad);
        return methodToInvoke.Invoke(obj, args);
    }
}

internal sealed class SimsProcessesTpPersonStudent : ReflectedAssemblyBase
{
    public SimsProcessesTpPersonStudent(string assembly)
    {
        this.Path = System.IO.Path.Combine(@"C:\Program Files\Zoosk", assembly);
        this.Type = "SIMS.Processes.TPPersonStudent";
    }
}
public class ReadStudent 
{
    public string GetStudentXml()
    {
        var contacts = ReflectedAssemblyFactory.GetReflectedAssembly(ReflectedAssembyType.SimsProcessesTpPersonStudent);
        return  (string)contacts.InvokeFunction("GetXmlStudents", new object[] { DateTime.Today });
    }

    public string GetTelephoneXml()
    {
        var contacts = ReflectedAssemblyFactory.GetReflectedAssembly(ReflectedAssembyType.SimsProcessesTpPersonContact);
        return (string)contacts.InvokeFunction("GetXmlTelephone", new object[] { DateTime.Today });
    }
}
}

I have refactored your code to understand what you are trying to test. It seems like you had a lot of classes doing something that could be the responsibility of one class; the heart of what you are trying to do is in GetStudentAttributes. I would create a test.dll with a class and a public method that returns some strings, and then run the actual method under test - in that case you are not using a stub or mock, but it is still a valid test to ensure your code works. You should also test GetTelephoneXml and GetStudentXml, but the only thing you are really testing there is that GetStudentAttributes is invoked with the appropriate parameters: when GetStudentXml is called, you invoke GetStudentAttributes with "ThirdPartyProcesses.dll" and "GetXmlStudents".

Depending on the framework you use, the testing solution will be different. With RhinoMocks you will have to make the methods virtual to allow the proxy to inherit and invoke your methods, but you can certainly test that the method was called and that the parameters are what you expect. I haven't used NSubstitute, so I'm not sure how to do it there, but if the framework is decent you should be able to test those calls and parameters.

One of the first things you should do in test-driven development is write the tests first, make sure they fail, make them pass, and refactor. When you try to retrofit tests onto existing code it can get really hard. There are some good resources out there about unit testing - this is a great book about it: http://www.amazon.com/Test-Driven-Development-By-Example/dp/0321146530 - but in my experience, when something is hard to test it usually tells you that your code is too complex or something can be improved; once the code is simplified or fixed, testing is usually not a problem.

Good luck and hope this helped a bit!

using System;
using System.IO;
using System.Linq;
using System.Reflection;

namespace MokPoc
{
    internal class Program
    {
           private static void Main(string[] args)
           {
                var students = new ReadStudentsService();
                string results = students.GetStudentXml();
                string contacts = students.GetTelephoneXml();
           }
    }


    public class ReadStudentsService
    {
        private const string ProgramFilesZooskDirectory = @"C:\Program Files\Zoosk";
        private const string SimsProcessesTppersonstudent = "SIMS.Processes.TPPersonStudent";

        public string GetStudentXml()
        {
            return GetStudentAttributes("ThirdPartyProcesses.dll", "GetXmlStudents");
        }

        public string GetTelephoneXml()
        {
            return GetStudentAttributes("ThirdPartyContacts.dll", "GetXmlTelephone");
        }

        public string GetStudentAttributes(string dllToUse, string methodToExecute)
        {
            var fullpath = Path.Combine(ProgramFilesZooskDirectory, dllToUse);
            var args = new object[] {DateTime.Today};
            var assemblyToLoad = Assembly.LoadFrom(fullpath);
            var typeToLoad = assemblyToLoad.GetType(SimsProcessesTppersonstudent);
            var methodToInvoke = typeToLoad.GetMethod(methodToExecute, args.Select(o => o.GetType()).ToArray());
            var obj = Activator.CreateInstance(typeToLoad);
            return (string) methodToInvoke.Invoke(obj, args);
        }
    }
}

I am very new to TDD. I am reading TDD By Example, and it says "never try to use the same constant to mean more than one thing", showing an example with a Plus() method.

In my opinion, there is no difference between Plus(1, 1), which uses the same constant value twice, and Plus(1, 2). I want to know the pros and cons of using the same constant value in a test method.

I think you misinterpret that statement. What the author (imho) is trying to convey is that the following code is a recipe for disaster.

const SomeRandomValue = 32;
...
// Plus testcase
Plus(SomeRandomValue, SomeRandomValue)
...
// Divide testcase
Divide(SomeRandomValue, SomeRandomValue)

You have two test cases reusing a non-descriptive constant. There is no obvious way to know that changing SomeRandomValue to 0 will make your test suite fail.

A better naming would be something like

const AdditionValue = 32;
const DivisorValue = 32;
...
// Plus testcase
Plus(AdditionValue, AdditionValue)
...
// Divide testcase
Divide(DivisorValue, DivisorValue)

where it should be obvious what the constants are used for. You should not get too hung up on the idea of code reuse when creating test cases.

Or to put it in other words:

I don't see anything wrong with reusing the DivisorValue constant in multiple test cases, but there is definitely something wrong with trying to shoehorn one value into a non-descriptive variable just in the name of code reuse.

I am beginning to study unit testing with "NUnit". I know that this type of testing is used to test classes, functions and the interaction between those functions.

In my case I develop "asp.net web applications".

  • How can I use this kind of testing to test my pages (as each page is considered a class, along with the methods used in it), and in which sequence? I have these layers:

    1. Interface layer(the .cs of each page).

    2. Data access layer(class for each entity)(DAL).

    3. Database layer (which contains connection to the database,open connection,close connection,....etc).

    4. Business layer(sometimes to make calculation or some separate logic).

  • How do I test the methods that make connections to the database?

  • How do I make sure that my testing is not a waste of time?

There are unit and integration tests. Unit testing is testing single components/classes/methods/functions and the interaction between them, but with only one real object (the system under test, or SUT) and test doubles. Test doubles can be divided into stubs and mocks. Stubs provide prepared test data to the SUT; that way you isolate the SUT from the environment, so you don't have to hit the database, web or WCF services and so on, and you have the same input data every time. Mocks are used to verify that the SUT works as expected: the SUT calls methods on the mock object without even knowing it is not a real object, and then you verify that the SUT works by asserting on the mock object. You can write stubs and mocks by hand or use one of many mocking frameworks. One of them is http://code.google.com/p/moq/

If you want to test interaction with a database, that's integration testing, and it is generally a lot harder. For integration testing you have to set up external resources in a well-known state.

Let's take your layers:

  1. You won't be able to unit test it. The page is too tightly coupled to the ASP.NET runtime. You should try not to have much code in the code-behind: just call some objects from your code-behind and test those objects. You can look at the MVC design pattern. If you must test your page, you should look at http://watin.org/. It automates your internet browser, clicks buttons on the page and verifies that the page displays the expected results.

  2. This is integration testing. You put data in the database, then read it back and compare results. Before or after each test you have to bring the test database to a well-known state so that tests are repeatable. My advice is to set up the database before the test runs rather than clean up after it; that way you will be able to check what's in the database when a test fails.

  3. I don't really know how that differs from that in point no. 2.

  4. And this is unit testing. Create the object in a test, call its methods and verify the results.

How to test methods that make connections to the database is addressed in point 2. How not to waste time? That will come with experience. I don't have general advice, other than: don't test properties that don't have any logic in them.

For great info about unit testing look here:

http://artofunittesting.com/

http://www.amazon.com/Test-Driven-Development-Kent-Beck/dp/0321146530

http://www.amazon.com/Growing-Object-Oriented-Software-Guided-Tests/dp/0321503627/ref=sr_1_2?ie=UTF8&s=books&qid=1306787051&sr=1-2

http://www.amazon.com/xUnit-Test-Patterns-Refactoring-Code/dp/0131495054/ref=sr_1_1?ie=UTF8&s=books&qid=1306787051&sr=1-1

Edit:

SUT, CUT - System (or Class) Under Test; that's what you test. Test doubles - the term comes from stunt doubles: they do the dangerous scenes in movies so that the real actors don't have to. Same here - test doubles replace real objects in tests so that you can isolate the SUT/CUT from the environment.

Let's look at this class


public class NotTestableParty
{
    public bool ShouldStartPreparing()
    {
        if (DateTime.Now.Date == new DateTime(2011, 12, 31))
        {
            Console.WriteLine("Prepare for party!");
            return true;
        }
        Console.WriteLine("Party is not today");
        return false;
    }
}

How will you test that this class does what it should on New Year's Eve? You would have to do it on New Year's Eve :)

Now look at the modified Party class. Example of a stub:

    public class Party
    {
        private IClock clock;

        public Party(IClock clock)
        {
            this.clock = clock;
        }

        public bool ShouldStartPreparing()
        {
            if (clock.IsNewYearsEve())
            {
                Console.WriteLine("Prepare for party!");
                return true;
            }
            Console.WriteLine("Party is not today");
            return false;
        }
    }

    public interface IClock
    {
        bool IsNewYearsEve();
    }

    public class AlwaysNewYearsEveClock : IClock
    {
        public bool IsNewYearsEve()
        {
            return true;
        }
    }

Now in the test you can pass the fake clock to the Party class:

        var party = new Party(new AlwaysNewYearsEveClock());
        Assert.That(party.ShouldStartPreparing(), Is.True);

And now you know whether your Party class works on New Year's Eve. AlwaysNewYearsEveClock is a stub.

Now look at this class:

    public class UntestableCalculator
    {
        private Logger log = new Logger();

        public decimal Divide(decimal x, decimal y)
        {
            if (y == 0m)
            {
                log.Log("Don't divide by 0");
            }

            return x / y;
        }
    }

    public class Logger
    {
        public void Log(string message)
        {
            // .. do some logging
        }
    }

How will you test that your class logs the message? Depending on where you log it, you would have to check the file, the database or some other place. That wouldn't be a unit test but an integration test. In order to unit test, you do this:

    public class TestableCalculator
    {
        private ILogger log;
        public TestableCalculator(ILogger logger)
        {
            log = logger;
        }
        public decimal Divide(decimal x, decimal y)
        {
            if (y == 0m)
            {
                log.Log("Don't divide by 0");
            }
            return x / y;
        }
    }

    public interface ILogger
    {
        void Log(string message);
    }
    public class FakeLogger : ILogger
    {
        public string LastLoggedMessage;
        public void Log(string message)
        {
            LastLoggedMessage = message;
        }
    }

And in the test you can:

var logger = new FakeLogger();
var calculator = new TestableCalculator(logger);
try
{
    calculator.Divide(10, 0);
}
catch (DivideByZeroException ex)
{
    Assert.That(logger.LastLoggedMessage, Is.EqualTo("Don't divide by 0"));
}

Here you assert on the fake logger. The fake logger is a mock object.
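
The same interaction test can also be written with a mocking framework instead of the hand-written FakeLogger - for example with Moq, which was linked earlier in this answer. A sketch, reusing the ILogger and TestableCalculator defined above:

using Moq;

var logger = new Mock<ILogger>();
var calculator = new TestableCalculator(logger.Object);

try
{
    calculator.Divide(10, 0);
}
catch (DivideByZeroException)
{
    // expected: dividing decimals by zero throws, after the message was logged
}

logger.Verify(l => l.Log("Don't divide by 0")); // the assertion lives on the mock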

I changed my testing environment to a base spec, with SQLite as the driver and in-memory storage.

function it_should_validate()
    {
        // Create the user
        $data = array('email' => 'test@gmail.com', 'password' => 'password', 'remember' => false);

        $this->validate($data)->shouldBe(true);
    }

How do I insert information into the testing database whenever I run the test? Right now the test fails because it always returns false.

What you're trying to do is integration testing.

PhpSpec is a tool for unit testing. It puts a big emphasis on testing in isolation. That means that the only object that's created is the object being tested; all its dependencies should be stubbed or mocked.

In other words there should be no database involved in specs you write.

You didn't show the full example of the class you're trying to spec, so it's hard to advise what your spec should look like.

Advised reading:

I'm building a Java EE web application using JSF, NetBeans and GlassFish. I just built a standard form login for my application. The problem is that now, every time I deploy the project (which is very frequent), it clears the authentication and I have to log in again.

I am new to Java EE so it is possible that this is a configuration problem, but from what I read this is normal behavior.

During the development cycle, what methods are there to handle this? I could disable the authentication during development but that just doesn't seem like a "good" solution.

Thanks

I had an epiphany about my problem. The simple answer is Test Driven Development.

I found myself developing by making a change to my application. Clean, Compile, and Deploy to Glassfish. I was finding this increasingly frustrating because of how slow that process is. In addition, I'm new to JavaEE development so I am working in small incremental steps.

The epiphany came when I saw and remembered reading about Test Driven Development from Kent Beck. I only made it to Chapter 6 but shelved it as other books took higher priority at the time. Now it's time to read it.

I highly recommend reading the book. Basically, the process is to build your unit tests first and use them to build the application.

Here is a link to the book I'm reading: http://www.amazon.com/Test-Driven-Development-Kent-Beck/dp/0321146530/ref=sr_1_1?ie=UTF8&qid=1324507172&sr=8-1

Is there a good book or online site discussing the use of CppUnit for a beginner?

Suppose I was writing a clone of the game 2048 (http://gabrielecirulli.github.io/2048/) and I want to write a test to verify that "the right thing" happens when the game is "won". Suppose that my game state is encapsulated in a class and that the state itself is private.

I suppose that I could write code to play the game, evaluate through the public interface when I'm about to win and then make the winning move; however, this seems like overkill. I would instead like to set a game state, make the winning move and verify that the object behaves as expected.

What is the recommended way of designing such a test? My current thought is that the test should either be a public member function of the class or that the test infrastructure should be friended by the class. Both of these seem distasteful.

Edit: In response to the first question: I'm assuming in this example that I don't have a method to set the game state and that there's no reason to write one; therefore it would be adding functionality just to write a test... one that then requires yet another member function to test: a get-game-state function. So then I'm writing at least two more public methods, and tests, just to write this one test. Worse, these are methods that essentially break encapsulation, such that if the internal details change I have to change these two methods and their tests for no other reason than to have a test. This seems more distasteful than friending a test function.

First, remember that Test-Driven Development is a design-oriented methodology. The primary goal of the tests is to influence the design of the SUT and its collaborators; everything else is just along for the ride.

Second, TDD emphasizes small steps. In his book, Test-Driven Development: By Example, Kent Beck says:

If you have to spend a hundred lines creating the objects for one single assertion, then something is wrong. Your objects are too big and need to be split. (p. 194)

This means you should listen to your intuition that writing the code necessary to win the game is overkill.

You also said:

I would instead like to set a game state, make the winning move and verify that the object behaves as expected.

Which is exactly what you should do.

Why? Because you're testing end-game scenarios. Most/all of the details that led to the end-game are irrelevant - you just want to make sure the program does "the right thing... when the game is won." As such, these are the only details that are relevant to your tests.

So what are these details that are relevant to your tests? To figure them out, it helps to discuss things with a colleague.


Q: How does the test configure the system to indicate the game has been won - without actually playing the game?

A: Tell something that the game has been won.


Q: What object would the test tell that the game has been won?

A: I don't know. But to keep things simple, let's say it's some object serving the role of "Referee".


By asking these questions, we've teased out some details of the design. Specifically, we've identified a role which can be represented in OOP by an interface.

What might this "Referee" role look like? Perhaps:

(pseudocode)

begin interface Referee
    method GameHasBeenWon returns boolean
end interface

The presence of an interface establishes a seam in the design, which allows tests to use test-doubles in place of production objects. Not only that, it allows the implementation of this functionality to change (e.g., a rule change affecting how a game is determined to be "won") without having to modify any of the surrounding code.
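
To make the seam concrete, here is a minimal sketch (in C# for illustration; Game, MessageForPlayer and the stub are all invented for the example, not part of your code):

public interface IReferee
{
    bool GameHasBeenWon();
}

// Test double: puts the system straight into the "won" state.
public class StubReferee : IReferee
{
    public bool GameHasBeenWon() { return true; }
}

// Hypothetical consumer of the role.
public class Game
{
    private readonly IReferee referee;

    public Game(IReferee referee) { this.referee = referee; }

    public string MessageForPlayer()
    {
        return referee.GameHasBeenWon() ? "You win!" : "Keep playing";
    }
}

[Test]
public void WonGame_ShowsWinningMessage()
{
    var game = new Game(new StubReferee()); // no need to actually play to 2048

    Assert.AreEqual("You win!", game.MessageForPlayer());
}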

This ties in directly with something else you mentioned:

I'm assuming in this example that I don't have a method to set the game state and that there's no reason to write one; therefore it would be adding additional functionality just to write a test...

A test is a consumer of your code. If it is difficult for a test to interact with your code, then it will be even more difficult for production code (having many more constraints) to interact with it. This is what is meant by "Listening to your tests".

Note that there are a lot of possible designs that can fall out of TDD. Every developer is going to have their own preferences which will influence the look and feel of the architecture. The main takeaway is that TDD helps break your program up into many small pieces, which is one of the core tenets of object oriented design.

Consider the following Python code from Kent Beck's book Test Driven Development Chapter 18 where he is building a framework for unit testing.

class TestCaseTest(TestCase):
    def testRunning(self):
        test = WasRun("testMethod")
        assert(not test.wasRun)
        test.run()  # Here run() is called once
        assert(test.wasRun)

TestCaseTest("testRunning").run()  # Here run() is called again

The implementation of the base class TestCase looks like the following:

class TestCase:
    def __init__(self, name):
        self.name = name

    def run(self):
        method = getattr(self, self.name)
        method()
  1. Why is the run() method called twice in the above code snippet?
  2. And who is calling the method testRunning(), and when? The code only defines the method; no one seems to be calling it.

P.S.: I come from a Java background and I am not very familiar with Python syntax.

I don't have access to the book you're talking about, so I'm going off the code snippet you posted.

It looks like run() methods on two different objects are being called -- one of class TestCaseTest, one of class WasRun (or whatever a function named WasRun returns).

As for who is calling testRunning(): .run() is called on the TestCaseTest object, and run() is inherited from the superclass TestCase. In the snippet you posted, run() does call it: getattr(self, self.name) fetches the method named "testRunning" from the object, and method() invokes it.
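
Since you mention a Java background: those two lines in TestCase.run are dynamic method lookup and invocation by name, which in a statically typed language you would write with reflection. A rough sketch of the equivalent (C# here; the class names mirror the book's):

using System;
using System.Reflection;

public class TestCase
{
    private readonly string name;

    public TestCase(string name) { this.name = name; }

    public void Run()
    {
        // the equivalent of Python's getattr(self, self.name) followed by method():
        MethodInfo method = GetType().GetMethod(name);
        method.Invoke(this, null); // for new TestCaseTest("TestRunning").Run(), this calls TestRunning()
    }
}

public class TestCaseTest : TestCase
{
    public TestCaseTest(string name) : base(name) { }

    public void TestRunning() { Console.WriteLine("TestRunning was invoked by Run()"); }
}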

Does Test-Driven Development require unit tests? I frequently find opinions that there is no TDD without unit tests. I'm unable to confirm this with respected sources like Wikipedia or the books I have access to.

From Wikipedia:

Test-driven development (TDD) is a software development process that relies on the repetition of a very short development cycle: first the developer writes an (initially failing) automated test case that defines a desired improvement or new function(...)

If unit tests are not required, does that mean creating integration tests is enough to follow TDD?

When you are developing with TDD, it is said that the best tests are atomic and isolated. This means they test something very specific without depending on other stuff in your project. Unit tests are used for exactly this, so I would say there is NO TDD without unit testing.

The idea in developing with TDD is to give yourself and the team certainty that all your code still works as it is supposed to. To achieve this you will need an integration server where the tests run every time you integrate. If you wrote your unit tests with a framework, this is easy to achieve.

I really recommend this book, it is short, easy to read, and really shows you the way:

Test Driven: Practical TDD and Acceptance TDD for Java Developers

If the integration tests are meaningful and provide useful (and quick) feedback, then sure. It's not about the purity of the testing paradigm, it's about the quick feedback loop and validation of the code being written. As long as you have that, you have TDD.

On a side note, I wouldn't refer to Wikipedia as a respected resource in cases like this. A source of reference material, sure. But if you have questions about TDD, I doubt there's a more respected resource than Kent Beck's book on the subject.

I am currently doing a uni assignment as a fresher so I'm not very experienced with coding.

The assignment is to allow the user to create a class diagram. Currently I have a secondary frame where the user inputs a UML design of the class in a JTextArea; this should then be passed through to my GUI, which takes what the user wrote and draws the string (drawString) onto the created class.

Currently I am getting a NullPointerException when trying to get the String from the JTextArea to drawString, and I don't understand this, because surely inputUML.getText() would be a String, so it should be able to be passed through?

This is the class that I am trying to pass the string into

package main;
import java.awt.*;
import javax.swing.*;
import classdesign.ClassCreation;



public class GroupCreateClass {
private double x;
private double y;
private double r;
private String message;
private String classdesign;


public GroupCreateClass(double x, double y, double r) {
    this.x = x;
    this.y = y;
    this.r = r;
}

public void draw(Graphics g) {
    makeRectangles(g);
    deleteBox(g);
    userdesignofclass(g);
}

public void makeRectangles(Graphics g) {

        g.drawRect((int)Math.round(x-r),(int)Math.round(y-r),
            (int)Math.round(325.0*r),(int)Math.round(350.0*r));

        g.drawRect((int)Math.round(x-r),(int)Math.round(y-r),
            (int)Math.round(325.0*r),(int)Math.round(19.5*r));

}

public void deleteBox(Graphics g) {

        g.fillRect((int)Math.round(x-r),(int)Math.round(y-r),
            (int)Math.round(19.5*r),(int)Math.round(19.5*r));
}

public void userdesignofclass(Graphics g){

    message = new String("Class");
    g.drawString(message,(int)Math.round(x+140),(int)Math.round(y+15));

    classdesign = (String.valueOf(inputUML.getText())); // This is the code that is giving me a nullpointerexception. I don't understand why, as surely inputUML.getText() should be a String...?
    g.drawString(classdesign,(int)Math.round(x+200), (int)Math.round(y+15));//
}


public double distanceTo(double x, double y) {
    return (Math.abs(this.x-x) + Math.abs(this.y-y));
}

public void update(double x, double y) {
    this.x = x;
    this.y = y;
}


}

This is the class that has the JTextArea. It is in another class and package, but the class that should be retrieving the string extends this class.

 package classdesign;
 import java.awt.*;

 import javax.swing.*;

 public class ClassCreation extends JFrame {

private JFrame frame;
private JLabel instructionlabel;
protected JTextArea inputUML; //This is the JTextArea containing the text that I am trying to pass through.
private JButton create;

public void initGUI(){

   frame = new JFrame();
   frame.setSize(325, 350);
   frame.setLocationRelativeTo(null);
   frame.setVisible(true);
   frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
   frame.setTitle("Class Design - Fill in Class UML");

   JPanel CreationPanel = new JPanel();
   CreationPanel.setLayout(new BorderLayout());

   inputUML = new JTextArea("Write UML here");
   inputUML.setLineWrap(true);
   inputUML.setWrapStyleWord(true);
   CreationPanel.add(new JScrollPane(inputUML),BorderLayout.CENTER);
   CreationPanel.add(inputUML,BorderLayout.CENTER);

   create = new JButton("Create Class");
   CreationPanel.add(create, BorderLayout.SOUTH);
   //create.addActionListener(this);


   frame.add(CreationPanel);
 }

 public Frame getFrame() {
       return frame;
   }



 }

So what I want to know, simply, is how I can fix that. Is there just one line where I'm going wrong, or do I have a massive logic problem in what I'm trying to achieve?

Thanks :)

The simple answer to your question is that it's quite possible you haven't instantiated inputUML [ostensibly by calling initGUI()] before calling userdesignofclass() [which is a terrible name for a method].

But your problems go far deeper than this. This use of inheritance is completely wrong: favor composition over inheritance. The use of magic numbers is not recommended. The variable names are poor (Java is case-sensitive, by the way; uml is not the same as UML)... Do yourself a favor and start browsing Head First Java and Head First Object-Oriented Analysis and Design. Also consider Test Driven Development by Example. I know you think you don't have the time, but it's much harder to unlearn bad habits than to take a little extra time at the beginning to build good ones.

I am new to Symfony. I have written a small app and now have to add unit tests. Here is my controller:

<?php
namespace myBundle\Controller;

use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpFoundation\RedirectResponse;

class IndexController extends AbstractController
{

    /**
     * @param \Symfony\Component\HttpFoundation\Request $request
     * @return \Symfony\Component\HttpFoundation\Response
     */
    public function indexAction(Request $request)
    {

        if ($this->getRequest()->isMethod('POST')) {
            // Do Something
        }

        // Do Something otherwise
    }
}

My test:

class IndexControllerTest extends \PHPUnit_Framework_TestCase
{
    protected $testController;

    public function setUp()
    {

        $this->testController =
            $this->getMockBuilder('myBundle\Controller\IndexController')
                ->disableOriginalConstructor()
                ->getMock();

    }

    public function testPostSaveActionWithBadRequest()
    {
        $expectation = 'Some text ';

        $response = $this->testController->indexAction(new Request);
        $this->assertInstanceOf(
            'Symfony\Component\HttpFoundation\JsonResponse',
            $response
        );
        $content = json_decode($response->getContent());
        $this->assertEquals($expectation, $content->error);
    }

}

When I run this test I get the following:

PHP Fatal error: Call to a member function get()

which is basically on the following line:

if ($this->getRequest()->isMethod('POST')) {

This tells me the container is null (I verified it by printing a dump of the container).

Any idea what I am missing here, or is there a way to provide the container as a dependency for that test?

I really appreciate all the help.

thanks FI

You're trying to mock the class you're supposed to test:

$this->testController =
    $this->getMockBuilder('myBundle\Controller\IndexController')
        ->disableOriginalConstructor()
        ->getMock();

You should actually instantiate the class you're testing, and mock or stub its collaborators.

However, in this particular scenario, instead of writing a unit test, write a functional test. There's a chapter on writing functional tests in the Symfony docs that'll help you.

Your controller uses lots of framework classes (classes that don't belong to you), and you shouldn't mock those either. That's why functional tests are better in this case. Also, make sure you move as much code as possible out of your controller, so you can properly unit test that part (and write as few functional tests as possible).

In the meantime read some books on unit testing (in the following order):

I am reading Test Driven Development: By Example. I am on chapter 13. Chapters 12 and 13 introduced a Plus operation on the Money object: a Money object can be added to another Money object.

The author added two classes (Bank and Sum) and one interface (IExpression) to the solution. This is the class diagram of the final solution.

Class diagram

Money stores an amount and a currency, for example 10 USD, 5 BAHT, 20 CHF. The Plus method returns a Sum object.

public class Money : IExpression
{
    private const string USD = "USD";
    private const string CHF = "CHF";

    public int Amount { get; protected set; }

    public string Currency { get; private set; }

    public Money(int value, string currency)
    {
        this.Amount = value;
        this.Currency = currency;
    }

    public static Money Dollar(int amount)
    {
        return new Money(amount, USD);
    }

    public static Money Franc(int amount)
    {
        return new Money(amount, CHF);
    }

    public Money Times(int times)
    {
        return new Money(this.Amount * times, this.Currency);
    }

    public IExpression Plus(Money money)
    {
        return new Sum(this, money);
    }

    public Money Reduce(string to)
    {
        return this;
    }

    public override bool Equals(object obj)
    {
        var money = obj as Money;

        if (money == null)
        {
            throw new ArgumentNullException("obj");
        }

        return this.Amount == money.Amount &&
            this.Currency == money.Currency;
    }

    public override string ToString()
    {
        return string.Format("Amount: {0} {1}", this.Amount, this.Currency);
    }
}

Sum stores two Money objects, which come from constructor arguments. It has one method, Reduce, that returns a new Money object (created by adding the amounts of the two objects).

public class Sum : IExpression
{
    public Money Augend { get; set; }

    public Money Addend { get; set; }

    public Sum(Money augend, Money addend)
    {
        this.Augend = augend;
        this.Addend = addend;
    }

    public Money Reduce(string to)
    {
        var amount = this.Augend.Amount + this.Addend.Amount;

        return new Money(amount, to);
    }
}

Bank has one method - Reduce. It just calls the Reduce method of the incoming IExpression argument.

public class Bank
{
    public Money Reduce(IExpression source, string to)
    {
        return source.Reduce(to);
    }
}

IExpression is an interface that is implemented by Money and Sum.

public interface IExpression
{
    Money Reduce(string to);
}

These are my questions.

  1. Bank does nothing good for the solution at this stage. Why do I need it?
  2. Why do I need Sum since I can create and return Money object inside Plus of the class Money (Like what the author did with Times)?
  3. What is the purpose of Bank and Sum? (Right now, they don't make any sense to me.)
  4. Reduce sounds strange to me as a method name. Do you think it is a good name? (Please suggest alternatives.)

Keep reading. Kent Beck is a very smart guy. He either has a very good reason why he created the example that way, and it will be clear later on, or it's a poor solution.

"Reduce" is a very good name if map-reduce is the ultimate goal.

I don't know how to use TDD in C++ projects, but I decided to use the "Google Mock Framework" for a start.

But I have one question:
When I finish testing, do I have to clean my code of TDD's macros, classes, etc.?
In other words, should the release version of my project include Google Mock?

P.S. What do you advise for learning TDD in practice? (Articles, books, etc.)

You can try this book: TDD By Example. It uses Java, but I think it will help :)

I have been reading Test Driven Development: By Example for a couple of weeks. Right now, I'm stuck on chapter 31 - Refactoring. My question is about Migrate Data on page 183: I don't understand this topic, and the example about TestSuite doesn't help me understand it either.

I want to know what "migrate data" means and how to use it in TDD.


[Update] content from the book
Migrate Data

How do you move from one representation to another? Temporarily duplicate the data.

How: Here is the internal-to-external version, where you change the representation internally and then change the externally visible interface:

  • Add an instance variable in the new format
  • Set the new format variable everywhere you set the old format
  • Use the new format variable everywhere you use the old format
  • Delete the old format
  • Change the external interface to reflect the new format

Sometimes, though, you want to change the API first. Then you should:

  • Add a parameter in the new format
  • Translate from the new format parameter to the old format internal representation
  • Delete the old format parameter
  • Replace uses of the old format with the new format
  • Delete the old format

Why: One to Many creates a data migration problem every time. Suppose we wanted to implement TestSuite using One to Many. We would start with:

def testSuite(self):
    suite = TestSuite()
    suite.add(WasRun("testMethod"))
    suite.run(self.result)
    assert("1 run, 0 failed" == self.result.summary())

Which is implemented (in the “One” part of One to Many) by:

class TestSuite:
    def add(self, test):
        self.test = test
    def run(self, result):
        self.test.run(result)

Now we begin duplicating data. First we initialize the collection of tests:

TestSuite
    def __init__(self):
        self.tests = []

Everywhere “test” is set, we add to the collection, too:

TestSuite
    def add(self, test):
        self.test = test
        self.tests.append(test)

Now we use the list of tests instead of the single test. For purposes of the current test cases this is a refactoring (it preserves semantics) because there is only ever one element in the collection.

TestSuite
    def run(self, result):
        for test in self.tests:
            test.run(result)

We delete the now-unused instance variable “test”:

TestSuite
    def add(self, test):
        self.tests.append(test)

You can also use stepwise data migration when moving between equivalent formats with different protocols, as in moving from Java's Vector/Enumeration to Collection/Iterator.
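To illustrate that last sentence, here is a sketch of the same temporarily-duplicate-the-data steps applied to a field being migrated from Vector/Enumeration to List/Iterator (Registry and its methods are hypothetical names, not from the book):

import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.Vector;

class Registry {
    private Vector items = new Vector();      // old format
    private List itemsList = new ArrayList(); // new format, duplicated

    void add(Object item) {
        items.addElement(item); // old format is still written...
        itemsList.add(item);    // ...and the new format alongside it
    }

    void printAll() {
        // Readers are switched from Enumeration to Iterator one by one;
        // once every reader uses itemsList, the Vector field is deleted.
        for (Iterator it = itemsList.iterator(); it.hasNext(); ) {
            System.out.println(it.next());
        }
    }
}

Each intermediate step compiles and keeps the tests green, exactly as in the TestSuite example above.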

I have some questions about using Microsoft::VisualStudio::CppUnitTestFramework and Test Explorer for native C++ projects in Visual Studio 2012.

  1. How can I run tests in a specific order? This MSDN article says:

    For more information, see Organizing C++ Tests.

    But there is no such information on MSDN, or I just can't find it. (All I found were some articles about ordered tests in managed projects.)

  2. This article says that when the code under test is a static library, I can access private methods from the tests. But I can't access them directly, and I have found nothing better than defining private/protected as public when testing, which I don't think is the right way.

  3. I have only recently started using unit tests and am not very familiar with them. What is the best framework to start with (for Windows + native C++)? Maybe Boost or googletest, or another one? My main IDE is Visual Studio, so I want good integration between the test framework and the IDE, like managing/running tests from the native Test Explorer, etc.

For the first question: you can't. You should write your unit tests in a way that they do not interfere with each other's results.

Second: the only way to do that is with reflection. Again, I do not recommend it. If you feel that there are private methods that should be tested, that is almost always a sign that you should divide your class into smaller classes.

Third: the Visual Studio test framework is fine; if you do not have any specific need to look for others, just stick with it.

And good luck with unit testing :) If you feel a little lost, let me recommend this book:

http://www.amazon.com/Test-Driven-Development-By-Example/dp/0321146530

It explains everything you need.

Over the last week or so I have been trying to learn how to unit test in Angular. But I have found myself asking a lot of questions online, and despite looking at the Angular docs, Jasmine docs, and online blogs, I am still feeling a little lost. Does anyone know some good resources for learning how to unit test? It doesn't have to be specific to Angular; I'm also interested in learning the theory and methods people use to unit test their projects.

http://www.amazon.com/dp/0321146530/?tag=stackoverfl08-20

If you've never done or are curious about TDD, this is a great book to carefully walk you through learning how and why to do it.

Where do we start using unit testing?
I have some doubts about where to start using unit testing.
I am doing unit testing with JUnit in RAD, and I am doing it after all the code is ready to deploy, or maybe after deployment. I am confused about why we do unit testing only when the code is almost ready to deploy.
My question is: when should we start unit testing?

I have some more questions.
What I am doing in my unit testing is this: I take one method from a class and create a test class for that method.
In that test class I give some input values to the method and expect the corresponding output values from the database.
So a single test covers the whole flow: taking input values -> passing them to the method -> calling the method on the original class -> connecting to the database -> fetching the value from the DB -> returning it to the test class.

If the test runs successfully, the JUnit console shows a green bar; otherwise a red bar, along with the cause of the error. But it doesn't generate any unit test report.

Now here is my question...
Am I doing unit testing correctly, given that a single unit test method exercises the whole code flow and checks the result?

The best time to start unit testing, if you haven't already, is now. The most effective use of unit tests is Test-Driven Development (TDD), in which you write the tests for each method as you implement it (write a test that fails, then implement the method to make it pass). However, it's not too late to add tests later on. JUnit tests solidify certain assumptions about the code that could change later; if an assumption changes, the test breaks, and you might save yourself from bugs that would otherwise have been really hard to detect.
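To see that red/green cycle in miniature, here is a sketch with JUnit 4 (Calculator is a hypothetical class, not something from your project):

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class CalculatorTest {
    // Written first: the bar stays red until the method below exists
    // and returns the right value, then turns green.
    @Test
    public void addsTwoNumbers() {
        assertEquals(4, new Calculator().add(2, 2));
    }
}

class Calculator {
    int add(int a, int b) {
        return a + b; // the simplest code that makes the test pass
    }
}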

I don't know of a report facility, but you can add a JUnit Ant task which will output the test results to your build console (or log, if Ant output is captured).

Your database tests sound like small integration tests rather than unit tests. That's good, but if the tests are too slow you might want to consider using mock objects (via a framework like JMock or EasyMock) to replace real DB connections with fake ones. This isolates the logic of the method from the state and behavior of the database and lets you run tests without worrying about whether the DB is running and stocked with test data.
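To make the mock idea concrete without committing to JMock or EasyMock syntax, here is a sketch using a hand-written fake; UserDao, UserService, and their methods are hypothetical names:

import org.junit.Test;
import static org.junit.Assert.assertEquals;

interface UserDao {
    String findName(int id);
}

class UserService {
    private final UserDao dao;

    UserService(UserDao dao) {
        this.dao = dao; // the dependency is injected, not looked up
    }

    String greeting(int id) {
        return "Hello, " + dao.findName(id);
    }
}

public class UserServiceTest {
    @Test
    public void greetsUserByName() {
        // An in-memory fake stands in for the real database access.
        UserDao fake = new UserDao() {
            public String findName(int id) {
                return "Alice";
            }
        };
        assertEquals("Hello, Alice", new UserService(fake).greeting(42));
    }
}

The test now runs in milliseconds and needs no running database; a separate, slower integration test can still exercise the real DAO against the real DB.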

Useful links :

http://en.wikipedia.org/wiki/Test-driven_development

http://www.jmock.org/

http://easymock.org/

http://ideoplex.com/id/25/ant-and-junit

http://ant.apache.org/manual/Tasks/junit.html

http://misko.hevery.com/code-reviewers-guide/ (Guide to writing unit-testable code)

[Edit - in response to comment]: About whether what you've done is correct unit testing, technically the answer is 'no'. As I said, it's more of an integration test, which is good, but it does too much and has too many dependencies to be considered a real unit test. Ideally, a unit test will test the logic in a method and not the dependencies. The guide to writing testable code mentioned above (as well as the associated blog of Misko Hevery) gives some good advice about how to write code with seams to insert mocks of dependencies. Michael Feathers goes into depth on the subject in his excellent book Working Effectively with Legacy Code.

The key factor is dependency injection: testable methods don't look up their dependencies - they receive them in the constructor or as arguments. If this requires too much refactoring, there are some tricks you can try, such as moving the code that looks up dependencies into protected or package-visible methods you can override; then you run the tests on a subclass that returns mock objects from those methods.
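That subclass-and-override trick (Feathers calls it "Subclass and Override Method") looks roughly like this; ReportGenerator and its methods are hypothetical names:

// The DB lookup is moved into a protected method so a test subclass
// can replace it without touching the production logic in generate().
class ReportGenerator {
    String generate(int id) {
        return "Report for " + lookupName(id);
    }

    protected String lookupName(int id) {
        // the real implementation would query the database here
        throw new UnsupportedOperationException("needs a live DB");
    }
}

class TestableReportGenerator extends ReportGenerator {
    @Override
    protected String lookupName(int id) {
        return "Bob"; // canned value instead of a DB round-trip
    }
}

The test then instantiates TestableReportGenerator, so generate() runs the real logic while the database call is stubbed out.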

You should really write your tests as early as possible, ideally before you write your implementation.

This is a book that I've found useful on the subject and may be worth a read... http://www.amazon.com/Test-Driven-Development-Addison-Wesley-Signature/dp/0321146530

To address your second point: what you are describing is an "integration test", i.e. it is not just testing your unit of code, but also your database connectivity, configuration, data, and the like.

Unit tests should only test the specific "part" of the code that you are concerned with, i.e. calling your method should just test that method. It should not be concerned with database connectivity and data access. You can achieve this by using "mock" objects as temporary replacements for your dependencies.

See: http://msdn.microsoft.com/en-us/library/aa730844.aspx

Whilst this document is from Microsoft and you're using Java, the same principles apply.

Hope this helps.