Is the SDLC a thing of the past?

16 08 2007

It’s days like today that I start to wonder if I’m getting too old for this business….

A thread started over at Sitepoint yesterday, innocently enough. It asked which of these steps of the System Development Life Cycle is the least important: Problem Definition, Software Specification, Algorithm Design, Coding (Implementation), Documenting, Testing, or Evaluation. Seems like an innocent enough question, right?

Coming from an “old school” background (my first couple of jobs were mainframe-based) and having had the SDLC beaten into my head over the years, I answered the question as best I could – all things considered, Coding is the least important step of the SDLC. If you define the problem and develop the specs and algorithms properly, coding becomes very easy. You then need to test thoroughly, document well (both for users and for the developers who have to maintain your code later), and then evaluate what got accomplished to ensure you solved the problem. All in all, coding is the least important step.

Then someone piped in that this lifecycle was too restrictive and “real men do it iteratively.” WTF?

OK, maybe I’m getting too old for this profession, but what am I missing here? I know the concepts of extreme programming and iterative development, but how are they really different from the “standard” SDLC? From my perspective, here’s the standard SDLC:

  1. Problem Definition
  2. Software specification/algorithm design
  3. Coding
  4. Testing
  5. Documentation
  6. Evaluation

Now here is the “iterative” format:

  1. Problem Definition
  2. Software specification/algorithm design
  3. Code functionality A
  4. Test functionality A
  5. Go back to step 3 for functionality B
  6. Rinse and Repeat for each piece of functionality defined
  7. Documentation
  8. Evaluation

The benefit of iterative development isn’t in problem definition and specifications. The time savings come in the coding and testing phases: because you’re only coding and testing one piece at a time, you can identify exactly where a problem is occurring. You’re not playing guessing games trying to determine which piece is going wrong.
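As a rough sketch of that one-piece-at-a-time rhythm (the functions here are made-up examples, not anything from the thread):

```python
# Iteration 1: code functionality A, then test it immediately.
def parse_price(text):
    """Convert a price string like "$4.50" into integer cents."""
    return round(float(text.lstrip("$")) * 100)

assert parse_price("$4.50") == 450  # A is verified before moving on

# Iteration 2: only now code functionality B, then test it.
def apply_discount(cents, percent):
    """Reduce a price in cents by the given percentage, rounding to a cent."""
    return round(cents * (100 - percent) / 100)

assert apply_discount(450, 10) == 405  # a failure here points straight at B
```

If the second assertion fails, you know the bug is in `apply_discount` – functionality A was already proven before B was ever written.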

It’s a philosophy I had been following for years before it became a popular methodology – it has always made sense to me. But it seems to me that there’s a new way of thinking which has permeated modern development:

  1. Define one small problem
  2. Design a solution for that problem
  3. Code for that problem
  4. Test that code
  5. Return to step one for a new problem
  6. Rinse and Repeat until all the problems are fixed.

Notice documentation isn’t in there – another trend I find disturbing. “All code should be self-documenting”… HA! Try visiting someone’s code from a year or two ago and tell me that’s the case. Not gonna happen.

But back to my point – there’s such a push to get a product out there that the basic premise of good development has gone away: figuring out the problem first. It seems it’s OK to put out something small, thinking “oh, we’ll add feature X later” – even if feature X may be the key feature the end user wants, or even the feature that makes the end result a viable and usable product.

Or am I just missing something here?




One response

27 11 2008

Excellent site!
