I’ve been reading about the apocalypse currently underway at Microsoft over IE8. The article by Joel Spolsky is worth the read because it highlights a constantly recurring problem.

The problem in a nutshell is whether to go down the “Standards” road, where you adhere to a very specific set of APIs and reject anything out of the ordinary, or to support Jon Postel’s robustness principle, or a variation on it, using very defensive coding.

Where I find this interesting is that over time I have designed various interfaces, APIs and module structures, and always had in the back of my mind:

What do I support in the next version?

The options are fairly straightforward: either break with the past and don’t support it, or try to make your application backwards compatible. The decision itself is not as straightforward. I agree that the middle ground doesn’t really exist, and that the outcome is not likely to favour either side. So am I a pragmatist or an idealist? Not sure. I’d like to be an idealist in my vision, but a pragmatist in my delivery. Let me explain…

So here’s the thing: I have never fully followed the robustness principle of “be conservative in what you do, be liberal in what you accept from others”. I have always preferred to be conservative both in what I do and in what I accept.
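The difference between the two stances can be sketched in a few lines. This is a minimal illustration, not anything from Postel’s RFCs: the field names and the reject-unknown policy are invented for the example.

```python
# "Conservative in what you accept": reject any message with unknown or
# missing fields. "Liberal": take what you understand, ignore the rest.
# EXPECTED_FIELDS and the messages are hypothetical.

EXPECTED_FIELDS = {"id", "amount", "currency"}

def accept_strict(message: dict) -> dict:
    unknown = set(message) - EXPECTED_FIELDS
    missing = EXPECTED_FIELDS - set(message)
    if unknown or missing:
        raise ValueError(
            f"rejected: unknown={sorted(unknown)}, missing={sorted(missing)}")
    return message

def accept_liberal(message: dict) -> dict:
    # Postel-style: keep the recognised fields, silently drop the rest;
    # any real error surfaces later, if at all.
    return {f: message.get(f) for f in EXPECTED_FIELDS}
```

The strict variant fails loudly at the boundary; the liberal one limps on with whatever it can salvage, which is exactly the trade-off the rest of this post circles around.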

When XML Schema came out and became a sensible tool for validating the input to my APIs and data-feeds, it seemed to support this idea. If something wasn’t right I could reject it and tell people where the error was. From an operational viewpoint this approach was well suited: when something went wrong, the erring component could be identified and a fix applied. The cynics might also gather that this was an easy way to figure out who screwed up in the supply chain.
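In practice you would hand the document to a real XML Schema validator (lxml’s `XMLSchema`, say) and get back errors with line numbers. A stand-in using only the standard library shows the operational point, that a rejection names the erring piece; the element names here are invented for illustration.

```python
# Sketch of schema-style input validation: check that the required
# elements are present and report exactly what is missing. The feed
# structure (customer, order/item, order/quantity) is hypothetical.
import xml.etree.ElementTree as ET

REQUIRED_PATHS = ["customer", "order/item", "order/quantity"]

def validate_feed(xml_text: str) -> list[str]:
    """Return a list of error messages; an empty list means the feed passed."""
    root = ET.fromstring(xml_text)
    return [f"missing element: {path}"
            for path in REQUIRED_PATHS
            if root.find(path) is None]
```

Because the error message names the missing element, whoever produced the feed can be pointed straight at their mistake, which is the supply-chain property the paragraph above is after.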

The problem usually came in “Version 2.0”. Try as you might, you cannot design an infinitely expandable API. In fact you shouldn’t. Design what you need using the relevant patterns and domain knowledge and you’ll have a decent version for “now”. During the next version, however, you might have to tweak some of the old parts; changes are not confined to being additive. So what happens? The Schema approach means old versions of the calls and data are rejected instantly, which means you have condemned your clients to follow your upgrade path.

But what of Version 2.0? Do we support a view where the old version is embedded in the new, and relax (no pun intended) the Schema? If we go down this road, then validating the input has little value: all we will know is that the data is acceptable, not which version it belongs to.

Do we support multiple versions side by side? Possible. Here I follow the path of the pragmatists, and end up in “DLL Hell”. To do this I need to know who is using which interface, and I potentially need to support multiple systems; depending on my release schedule, that could be a lot of systems.

My current approach is to follow the multiple-version option, but to validate explicitly against the version being used and reject if invalid. The main thing I’ve done to apply this is to ensure that the operational support and the structure of the application/API/other is actually thought about at the beginning of the project and not bolted on as an afterthought. Make friends with the operations guys, see where the business folks want to go, and make sure the programmers have access to the system, and your life will be a little easier.
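That approach can be sketched as a version-keyed dispatch: every request declares its version, each supported version keeps its own strict validator, and anything that fails the check for its declared version is rejected. The version numbers and field sets below are hypothetical, not from any real API.

```python
# Each supported version has its own strict contract; v2.0 tweaked the
# old one rather than just extending it, so the sets genuinely differ.
VALIDATORS = {
    "1.0": {"id", "amount"},
    "2.0": {"id", "amount", "currency"},
}

def handle(request: dict) -> str:
    version = request.get("version")
    expected = VALIDATORS.get(version)
    if expected is None:
        raise ValueError(f"unsupported version: {version}")
    body = {k for k in request if k != "version"}
    if body != expected:
        raise ValueError(
            f"invalid for version {version}: got {sorted(body)}, "
            f"expected {sorted(expected)}")
    return f"processed as {version}"
```

The point is that validation still has teeth: a v1.0 caller is held to the v1.0 contract, a v2.0 caller to the v2.0 one, and nothing slides through because it merely resembles some version.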

Would this work in the browser world? Essentially all the possible versions of the HTML spec are now wrapped into one blob called “HTML”. The room for error is huge. Cutting away all the differences and “non-standard” features would give you a nice, fast, small browser that would work with… no one. In this respect sticking to the standards will not make you any friends, especially as there is no way to force everyone to use a “standards”-compliant browser. Despite the power of the browser developers, the real driving force is the web developers: if everyone decided to use “XHTML 1.0”, for the sake of argument, then removing HTML 4.01 support would be an easy decision to make. The issue is that all the sites we depend on (banking, travel, search and so on) use “features” of our browsers to give us the experience we “want”, and introducing a new browser that breaks or refuses them will not meet a favourable reception.

So while there is a war among the developers, and trouble over the design too, the actual barriers lie in the people we’ve built for, and in their ability, or desire, to change.