The Last Straw

I had always tried to be a "good" programmer - analysing, designing, coding and testing in separate, discrete steps. However, I always found myself subverting the process and getting something tangible in front of the Customer in order to obtain their feedback. I actually got in quite a bit of trouble for doing it once. In the end, though, one project turned out to be the Last Straw for my attempts to follow the conventional wisdom of the time.

A thread a couple of months ago in the XP Yahoo Group reminded me of how and why I came to be a part of the Agile development movement.

A post by Brian Nehl referred to two articles about the development of the Canadian Firearms Registry.
From June 1998 to December 2000, I worked on that project. The article in Baseline magazine is, in my opinion, quite balanced and accurate. Dwayne King, who was one of my colleagues, describes quite well the environment in which we worked. Some things that weren't said:
  • The program was backed at the very highest level of the government and would be implemented come hell or high water.
  • The estimated total cost of the system was based on the development cost of the system that was in place at the RCMP at the time. However, that system only captured the licensing information for people using Restricted firearms, which is only a fraction of the total number that were to be contained in the new system.
  • The application was developed using PowerBuilder 5.0 and Oracle 7.3, which were somewhat long in the tooth in 1998. However, this decision was made in order to minimize the technical risk, and was something with which I didn't have a problem.
The methodology used for the project was straight waterfall, although we were using OO techniques for the design and development (I believe they may have moved to RUP shortly after I left in the fall of 2000). The requirements for the system were drawn from the legislation and the associated set of regulations. The problem with this approach was that quite often we needed legal opinions from the Dept. of Justice over what was meant by certain parts of the legislation. Just as often, the same question would elicit different answers from different people, or would result in hours of discussion over a single, minute point.

Since there were so many major stakeholders for the system (14, I believe), there was a group that handled all of the requirements definition and management. They were essentially our Customer, but they acted as an insulating layer between the development team and the real Customers. Their 'product' was a big binder called the BPE - Business Process Engineering, and it contained page after page of decision matrices, some of which were even up to date. In retrospect, those matrices could have been great FIT tables, although they often weren't consistent.

From a contracting perspective, it had been a real dogfight among the big SIs to land the development contract. It was eventually won by EDS, with the testing contract being awarded to Systemhouse. The Baseline article suggests that EDS and Systemhouse worked together, but nothing could be further from the truth. There was a lot of bad blood between the companies over the bidding process, and there was a very adversarial relationship between the teams at the management level. It was downright petty at times, although the people in the trenches (i.e. developers like myself and the individual testers) got along fine.

When I joined the project, the development team consisted of approx. 25 PowerBuilder and Oracle PL/SQL developers. This eventually grew to about 35-40 people, although the exact number escapes me. Almost all of the developers were contractors - EDS subcontracted the development to the company through which I was contracting at the time - with only a few employees among them. There was a lot of talent on the team, with decades of combined experience at the analyst/designer level as well as at the developer level. It was, essentially, an all-star team, and the contract paid well. As Dwayne King suggests, the project was well within the technical capabilities of the team. With the benefit of Agile Hindsight, the team could have been considerably smaller, and still produced better results using an agile approach.

When I joined the project they were ramping up for an initial release to production in December of 1998. They had just undergone what they called the 'Alpha' release, which was a proof of concept intended to verify that the system would be workable and to expose any technical risks. There were some substantial performance problems in the application, but these simply required some code optimizations, which were eventually performed. I was also surprised to find out that the database schema had been determined before any other development had even started.

The development team itself was split up into groups that focused on particular areas of functionality, and were led by an architect/designer. Despite the fact that we were all located in the same place, there was a surprisingly low level of communication between these groups. This led to some significant issues:
  • Different approaches were taken to developing different parts of the same application in terms of class hierarchy, separation of logic between application layers, etc.;
  • Different coding standards were applied between the groups;
  • Different groups at times used the same database columns for different purposes;
  • Code integration was anything but continuous, occurring usually just before a release to the test group. This happened about every 6 months.
Needless to say, integration was a mess. The application would compile on individual workstations, but the whole thing rarely did. The last week before shipping the system to the test group was usually spent just getting the bloody thing to build, let alone perform our own testing.

Once those problems had been ironed out (or just hidden), the system was tossed over the wall to the testing group. They used the (frequently out of date) BPE document as the basis for their tests, and many inconsistencies quickly became evident. Sometimes the code was wrong, sometimes the requirements were wrong. In the end, it always became a battle.

So, here's a project that had the following issues:
  • The powers that be believed that they could "nail down" the requirements based on legislation and regulations that were still in a state of flux;
  • A group insulated the development team from anything resembling the Customer;
  • The initial estimate of the project was based on "a similar project" that was only similar in that it had information about firearms and ran on a computer;
  • The estimate was not made by anyone from the team that would have to implement the application;
  • The development team was split into multiple groups, each one working almost in isolation;
  • Code integration was infrequent;
  • There was no automated testing of any sort, and no formal unit tests;
  • Acceptance testing was manual, and performed after the fact by a group completely separate from either development or the Customer;
  • Iterative development was not practiced;
  • Releases were typically once every 8 months to a year.
And this system caught the eye of the Auditor General? What a shock.

For a while, I just figured that this was what life was like on large projects with large teams. I did get a reprieve in early 2000 when I was tasked to build a small satellite application that had very focused requirements and an accessible Customer. Once that was finished, though, I had had enough. I actively sought out a new contract, and in the fall of 2000 one came my way.

A funny thing happened during the interview for that contract. About halfway through, I realized that it was for a Java developer position! At that point, I had about 6 months of playing around with Java, but I certainly wasn't in any position to be marketed as a Java consultant. Fortunately for me, this was still during the tech boom here in Ottawa, and companies such as Nortel and Alcatel would hire anyone who had Java on their CV and could fog a mirror. That left a significant void in the contracting world for Java developers, and the manager who interviewed me explained that my O-O experience was what they were really after and I could learn Java while I was working! Uh... OK!

I started working with Jim Leask, a consultant from Sybase. Jim had a couple of years of Java under his belt then, which was pretty impressive considering that Java itself was barely out of diapers. Our mission was to create a framework that would be used by "an army of developers" to build a big, honking case management system for this government department (which is another story unto itself!). The goal was to simplify things such as data access, GUI building, etc. so that they didn't need expert developers to build this thing.

We started out by handling one of the most critical aspects of any system built for the Canadian federal government - bilingualism. We began doing some modeling in Rose, but after a day or so we decided to validate our designs in code. Since I was a Java newbie and Jim was an expert, we decided that I'd do the typing and he would look over my shoulder to guide me as I figured out Java. We threw some classes together and realized that something in our design wasn't quite right, so we switched to his machine and updated the design based on what the code told us. While there, we added a bit more to the design, then switched to the other machine and I again hammered away on the keyboard. This back and forth activity continued for a couple of hours, at which point Jim uttered the immortal words:

"This reminds me of that Extreme Programming thing I heard about."

Once the images of Mountain Dew-swilling snowboarders with laptops cleared away, I did a quick Internet search (I can't even remember what search engine, though I know it wasn't Google). I believe the first site we visited was either Ron Jeffries' XProgramming.com or the C2 Wiki. The next day we hit the bookstore - Jim bought XP Explained, and I bought XP Installed. The rest is history.

So, I find it interesting that my disgust over the classic waterfall practices and the inherent dysfunction of a team using them led me to the Agile Promised Land. I will never go back.