In “More Ruby-Influenced BDD in .NET”:http://codebetter.com/blogs/scott.bellware/archive/2007/06/19/164380.aspx, “Scott Bellware”:http://codebetter.com/blogs/scott.bellware/ writes:
bq. I’m not sure that C# will ultimately the language that will let BDD really shine on the CLR. Without some kind of DSL ability like what can be had from Ruby and Boo, it’s kinda hard to conceive of specification syntaxes that won’t be fraught with C# line noise.
Here is a thought I had while reading this: _Is C# ultimately the language we will use on the CLR, period?_ (VB guys, substitute VB.NET for C# if you want.)
Or, to put it another way, which has the longer lifetime: C# or the CLR platform? For instance, on the Java side of things, here are two slides from “Polyglot Programming”:http://www.sda-india.com/conferences/jax-india/sessions/Neal_Ford/Neal_Ford-Polyglot_Programming-slides.pdf, a presentation by “Neal Ford”:http://memeagora.blogspot.com:
bq. Another language will supplant Java as the primary way people code the Java Platform
bq. Start treating Java as the assembly language of the Java Platform
In the future, will C# be the primary way of programming the .NET platform?
I used to subscribe to the blog “Worse Than Failure”:http://worsethanfailure.com/ (formerly The Daily WTF), but I removed it when trimming the number of blogs in my reader. If you have not seen it, they post examples of code that is really badly written or designed. Most often there is also a funny story to go with the code. One of the most absurd posts I remember is “The Brillant Paula Bean”:http://worsethanfailure.com/Articles/The_Brillant_Paula_Bean.aspx.
It is a very short post, so go read it if you want. The even shorter summary: it tells the story of a company that brought in a contractor, Paula, to help with a Java project. She was apparently handed some work to do, and for a few months she reported good progress during the weekly status meetings. However, when the deadline drew closer she asked for some help to finish in time, and then it became clear that all she had produced was a couple of lines of code that could do nothing more than return a misspelled string.
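To give a feel for the scale of it, months of reported progress boiled down to something on the order of this (a minimal Java sketch of the kind of code the post describes, not necessarily her literal code):

```java
// A "bean" whose only behaviour is returning a hard-coded,
// misspelled string -- the sum total of months of "progress".
public class Paula {
    private String paula = "Brillant"; // the famous misspelling of "Brilliant"

    public String getPaula() {
        return paula;
    }
}
```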
I guess all posts at Worse Than Failure have to be taken with a grain of salt, but for now let us assume this one is actually true. Like I said, it is really absurd, but what exactly is most absurd here? While there is no question that Paula was not a very good programmer, can we really blame her for this mess? If the story is true, this is clearly an example of a “process problem”:http://silkandspinach.net/2006/04/02/people-problems-vs-process-problems/. Why did no one notice there were no commits from her? Why did no one review her code? How did she ever get hired? There are so many ways this mess could have been avoided. For instance, if Paula had been pair programming, her lack of skills would have been noticed on day one. With smaller tasks, it would have been clear much earlier that she was having problems.
Unfortunately, even if the story turns out to be untrue, I do not think it is that uncommon. Change a few of the extremes of the story (the ridiculously small amount of code produced, the quite long time period, the insanely large task she must have been working on to be given so much time) into more realistic ones, and I think most of us will recognise situations we have been in. Maybe not situations that ended up the same way, but definitely situations that had the potential to do so. Things such as not testing the code and application enough, not having code reviews, and giving a single programmer sole responsibility for too large a task all have the potential to end in disasters such as The Brillant Paula Bean. When they do, instead of just pointing fingers at the person who screwed up, ask how it happened. Find the root cause and eliminate it.