3/28/2009

Doing The Right Thing And Getting It Done

Been following my man Corey as he moves along on his Journeyman Tour (which I posted about on InfoQ at its onset); in fact, I had dinner with him tonight!

Here's a talk he did recently on "Practice", which I'm totally in agreement with. Check it.


Practice - Corey Haines from 8th Light on Vimeo.

Keep on keepin' on, Corey!

3/27/2009

Killing "Quality"


So, a guy walks into a bar...

No, just kidding. But seriously though...

You walk into a bar and order a "low quality" beer (tough economy these days). What does this mean to you? That it'll come out with a bug swimming in it? That it will be served heated up? That it will be skunked? Glass only half full?

No, of course not. You'd be appalled, outraged. Have every right to ask for your money back. This place has acted in bad faith, they've acted unacceptably. You leave angered and vow to never return.

Your wife wants you to drop $600 on a new Louis Vuitton bag (you gotta buy cheap beer and she's still asking for Louis?!?), because it's a "top quality" product. Does she describe it this way because she knows it has less likelihood of falling apart the minute she gets it, or of its colors smearing, or of it having holes in the bottom? Again, no. These things are assumed to be absent. In the rare case you do happen to find anything like this, you are sure to invoke "return to sender, no questions asked".

I could continue on this way throughout various types of products, but I suppose you get the point.

Well, this ain't the case with our industry (Software Development). So I ask ye: WTF, man?

We use the word "quality" interchangeably as an indicator of both "value presence" and "defect absence".

When someone says "high quality" they could very well be talking about the absence of defects or they could be talking about the presence of value. And by software standards, neither would be an inappropriate use of the word.

Not so in the bar, not with your beer. Why then for us? Again I ask, WTF?

I know, I know. Some of you are sitting there saying (correctly), "Hey, wait, agile pays attention to customer feedback to iterate towards a higher quality solution. We're talking about 'value presence' there." Or maybe you're saying, "Our company's website says 'highest quality product in the industry!' That's talking about 'value presence' too! That's what we mean when we say 'quality'."

Well, maybe so (and good for you, seriously). But I bet many of you still call your testers "the QA folk". "QA", as in Quality Assurance, as in "Those who ensure quality". But really, is their (primary) function that of ensuring 'value presence', or of ensuring 'defect absence'? Look, don't get me wrong, on a healthy agile team these wicked cool cats ARE involved in 'ensuring high value', but at the end of the day that isn't why you're calling them "QA'ers".

I've seen my share of "quality review" meetings, or fancy charts of "quality metrics", or annual budgets that talk of "quality goals" ... where in each of these cases, "quality" is talking about DEFECTS.

When was the last time someone described the benefits of TDD without mentioning, excitedly, "increased quality"? (And, yes indeed, I am certainly just as guilty as the next guy.)

So, why do I care? Why the rant?

Because it's an indication of the absurdly high level of tolerance our industry generally has for defects.

That we describe defect density with the same word that we use to describe product "value" (to our customer/consumer) is gross.

Heck, I'd go as far as to say it's part of the reason we have such a high tolerance in the first place. It's confusing and it's misleading. It's too easy to masquerade our defect problems as a "quality improvement initiative" or some other politically wimpy label.

When our customers get a bunch of "bugs in their beer" why aren't they appalled and outraged? Why aren't we? Why don't they storm out and vow to never come back? Why is it so damn okay to have defective software? For the third time, WTF?

Don't get me wrong, I'm not saying we should have an absolute "zero defect, period" attitude, that's simply unrealistic considering what we do. But we should have a low enough tolerance for defects that it makes no sense for us to use the word "quality" when talking about them.

"Quality" should be used as a measure of functional/aesthetic utility to our consumer, and not as a measure of defects. Really, it should just be assumed that defects are generally absent. This should just be implied in what it means to be a Professional.

So:

I hereby propose we as software professionals and businessmen stop using the word "quality" to mean a "measure of defects".

Well then, what word will we use? I polled the Twitterverse with this question and received lots of great responses, but the group's winning choice, courtesy of one Brian "Big Ball Of Mud" Foote, is this:
Shittiness.

Software with minimal defects has a "low degree of shittiness"; lots of defects means it has a "high degree of shittiness".

Of all the tweep responses though, overall my very favorite was from Lisa Crispin who said, "I have never liked measuring defects so it's hard to think about what to call it."

Nice, now that actually says it all. Something tells me that Lisa's company does not simply think of their testers as "the QA folk". There is hope!

3/22/2009

TDD: The Synergy


TDD. What does this stand for?

Test Driven Development.

Said another way: Development, driven by tests.

Notice, the acronym doesn't stand for test driven design. Not test driven analysis. Not test driven coding. Nope, not even test driven testing. It is, again, test driven development.

TDD is not a "task". It's not a thing we do in addition to doing design, or in addition to writing production code, or in addition to "writing" stories. Rather, TDD is the way we design, the way we write code, the way we define a story. It is the way we do development.

When done well, TDD exists at the root of nearly everything we do on our agile team - it is the team's minute-to-minute comforting heartbeat.

TDD means that "writing a story" equates to specifying in an automated acceptance/story test (most often using FitNesse, Selenium, Watir, or something similar) "what it will look like" for a story to be working once we've implemented it. How the new application feature monickered on the story card must behave for it to be DONE.
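To make that concrete, here's a sketch of a story test in plain Python. The story, the `ShoppingCart` class, and its whole API are invented for illustration (in practice this expectation would likely live in a FitNesse table or a Selenium script), but the idea is the same: the test states what DONE looks like before we've implemented anything.

```python
# Hypothetical story: "Returning customers get a 10% discount on orders over $100."
# The story test is written first; it specifies what DONE looks like.
# ShoppingCart is sketched inline only so this example runs on its own.

class ShoppingCart:
    def __init__(self, returning_customer=False):
        self.returning_customer = returning_customer
        self.items = []

    def add_item(self, price):
        self.items.append(price)

    def total(self):
        subtotal = sum(self.items)
        if self.returning_customer and subtotal > 100:
            return round(subtotal * 0.90, 2)
        return subtotal


def test_returning_customer_discount_story():
    cart = ShoppingCart(returning_customer=True)
    cart.add_item(80.00)
    cart.add_item(40.00)
    # The story is DONE when a $120 order rings up at $108.
    assert cart.total() == 108.00

def test_new_customer_pays_full_price():
    cart = ShoppingCart(returning_customer=False)
    cart.add_item(120.00)
    assert cart.total() == 120.00

test_returning_customer_discount_story()
test_new_customer_pays_full_price()
print("story tests pass")
```

Notice the tests read in the customer's language, not the programmer's - that's the "act of analysis" part.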

TDD means that "designing" equates to specifying in an automated unit/micro test (most often using a member of the xUnit family of tools) "what it will look like" for any particular object to work once we've implemented it. In other words, what the expected behavior of that object is.

TDD means that "coding" equates to letting that unit test tell us when we've written working production code. Possibly more importantly, it means that "coding" equates to letting those passing unit tests ensure that our object continues to work (satisfy its expected behavior) while we mercilessly refactor that object to keep it super clean and expressive.

And, in case you didn't notice in the above assertions, it means that our development is driven by tests. This means that the test is written first. The existing [failing] test is what drives us to conjure up production code. The test is the only reason for us to write a line of code. It's the only way we know what code to write - the test is our specification. Yes, we can write production code first and then try to write a unit test to verify its implementation, but, in addition to the mechanics of that likely being extremely difficult and counter-productive, that's simply not TDD.
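In xUnit terms, that rhythm might look like the sketch below. The `Stack` example is mine, not from any particular project: the test class is written first (red), then only enough production code to make it pass (green), after which we're free to refactor under the tests' protection.

```python
import unittest

# Step 1 (red): the test comes first. This IS the design act --
# it specifies how a Stack object must behave before any Stack exists.

class StackTest(unittest.TestCase):
    def test_new_stack_is_empty(self):
        self.assertTrue(Stack().is_empty())

    def test_pop_returns_the_last_item_pushed(self):
        stack = Stack()
        stack.push("a")
        stack.push("b")
        self.assertEqual(stack.pop(), "b")

    def test_pop_makes_a_one_item_stack_empty_again(self):
        stack = Stack()
        stack.push("a")
        stack.pop()
        self.assertTrue(stack.is_empty())


# Step 2 (green): write only enough production code to pass the tests.
class Stack:
    def __init__(self):
        self._items = []

    def is_empty(self):
        return not self._items

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()


if __name__ == "__main__":
    unittest.main(argv=["stack_tests"], exit=False, verbosity=0)
```

The test names double as a behavioral spec: read them top to bottom and you know what a Stack is, without opening the production code.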

Make no mistake about it. True TDD, at the acceptance (Story/Service) test level, is an act of analysis. And true TDD, at the micro (unit/object) test level, is an act of design. In a single statement: True TDD is much less an act of verification (as the "test" part of the name might accidentally and incorrectly imply) than it is an act of specification.

In summary, TDD provides the fundamental synergy between and foundation for all of our development "tasks" that enables us to truly build the right software and to build the software right. It ensures quality² - which, in the end, is the only true way for us to go fast and go fun.



** Upon further review, the authors are changing the catchy phrase "ensures quality²" to the more appropriate phrase of "ensures high quality and low shittiness".

3/10/2009

Rubber Duck Debugging




From an ethernal.org discussion:

"We called it the Rubber Duck method of debugging. It goes like this:

1) Beg, borrow, steal, buy, fabricate or otherwise obtain a rubber duck (bathtub variety)
2) Place rubber duck on desk and inform it you are just going to go over some code with it, if that's all right.
3) Explain to the duck what your code is supposed to do, and then go into detail and explain things line by line
4) At some point you will tell the duck what you are doing next and then realize that that is not in fact what you are actually doing. The duck will sit there serenely, happy in the knowledge that it has helped you on your way.

Works every time. Actually, if you don't have a rubber duck you could at a pinch ask a fellow programmer or engineer to sit in."


I must say, I do quite like this. I must also say, THIS IS WHY WE PAIR!

Having (hell, getting) to vet your thoughts out loud, in real-time, with another person is a fundamental benefit of pair programming, and one of the primary reasons overall productivity is increased, not decreased.

Forget asking "at a pinch", have him there all along!

Thanks for pointing me to this, Buffer, fun stuff!!!

3/02/2009

Nutshells: Why TDD?

So, ya need that techie-friendly quick-pick-list to explain to someone why your project simply cannot live another second without TDD? I know I did, so I created one. And I'd like to share it with you faithful readers.

Confidence & Productivity … Today
  • Immediate feedback, i.e. "cookies": green bars that give me full confidence in what I'm coding, or in other words, meeting requirements with fewer bugs

  • A more efficient design:
    • Clear “expectations” (tests) drive my mind quickly and smoothly to solutions (algorithms)
    • Good public interfaces – my tests make me “eat my own dog food”
    • Simplicity – I’ve coded only what was needed to pass the tests
  • A more effective design: in order to write software that is effectively tested in this manner, I’m naturally encouraged to ultimately write more cohesive and less coupled code
  • Test “protection”: “my tests” give me freedom to refactor mercilessly to create a more expressive, readable code-base while I’m coding
  • And then even more Test “protection”: “all the tests” allow the build to silently ensure that today’s collective changes across the group haven’t broken any windows

Confidence & Productivity … Tomorrow
  • Fact: addressing yesterday's bugs exponentially decreases today's productivity...unless:
    • yesterday's TDD decreases (if not nearly altogether removes) yesterday's bugs

  • Fact: most bugs occur upon changing an existing code-base...unless:
    • Today’s [automated] tests become your safety net tomorrow
    • Growing test suite allows build to be the perpetual “silent police”

  • Fact: easy to understand my design; hard to understand your design...unless:
    • Today’s [expectation-based] tests become your documentation tomorrow
    • My expressive code is easy for you to understand tomorrow

  • Fact: “code rot” (poorly factored code) slows development over time...but:
    • My tests force me to keep my design well-factored – forever!

  • Fact: “paper documentation” gets outdated (and lies!) over time...in contrast to:
    • Automated Storytests: up-to-date, runnable “what it does” documentation (analysis)
    • Automated Microtests: up-to-date, runnable “how it does it” documentation (design)
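Those last two bullets - tests as up-to-date, runnable documentation - are worth seeing in code. Here's a sketch in Python; the `slugify` function and its behavior are invented for illustration, but notice how the test names alone document what the code does, and unlike a wiki page they can never silently go stale (a stale "doc" here is a red bar):

```python
import re

def slugify(title):
    """Turn a post title into a URL-friendly slug."""
    slug = title.lower().strip()
    # Collapse every run of non-alphanumeric characters into one dash.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")


# Each test name states one expectation; together they are the
# "how it does it" documentation, and they run in the build.
def test_lowercases_the_title():
    assert slugify("Killing Quality") == "killing-quality"

def test_collapses_punctuation_and_spaces_into_single_dashes():
    assert slugify("TDD: The Synergy!") == "tdd-the-synergy"

def test_trims_leading_and_trailing_separators():
    assert slugify("  Why TDD?  ") == "why-tdd"

test_lowercases_the_title()
test_collapses_punctuation_and_spaces_into_single_dashes()
test_trims_leading_and_trailing_separators()
print("microtests pass")
```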

Of course, I'm not perfect, and feedback is king. So, please, pretty please with sugar on top, let me know how you think this list can be improved!

 