Nick’s Flawed Vision of a Shared Integration Model

Of all the things you might say about Nick Malik, “thinks small” is not one of them. He takes on a significant percentage of the .NET developer community over the definition of Mort. He wants to get IT out of the applications business. He invents his own architecture TLA: SDA (aka Solution Domain Architecture). He’s a man on a mission, no doubt. And for the most part, I’m with him 110% on his ideas.

However, when he starts going on about a shared global integration model, I start to wonder if he has both hands on the steering wheel, as it were.

Nick’s been talking about this for over a year. As he points out, the SaaS integration layer is the new vendor lock-in. One of the attractions of SaaS is that you could – theoretically, anyway – switch SaaS application providers easily, which would drive said SaaS companies to constantly innovate. However, if the integration layers aren’t compatible, the cost to switch goes up dramatically, leaving the customer locked in to whatever SaaS company they initially bet on – even if that bet turns out to be bad.
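To make the switching cost concrete, here’s a quick sketch – the vendor shapes and field names are purely hypothetical, invented for illustration – of two SaaS providers exposing the same logical operation through incompatible integration layers. Every mapping like this has to be rewritten when you switch:

```typescript
// Hypothetical illustration only: two SaaS vendors exposing the same
// logical "create invoice" operation through incompatible integration
// layers. Every call site like this is a switching cost.

// Vendor A models invoices as a flat REST-style resource.
interface VendorAInvoice {
  customer_id: string;
  line_items: { sku: string; qty: number; unit_price_cents: number }[];
}

// Vendor B models the same concept as a nested document.
interface VendorBInvoiceRequest {
  header: { account: { externalRef: string } };
  lines: { item: { code: string }; quantity: number; price: number }[];
}

// Switching vendors means rewriting this mapping for every
// integration point in the enterprise – that's the lock-in.
function toVendorB(inv: VendorAInvoice): VendorBInvoiceRequest {
  return {
    header: { account: { externalRef: inv.customer_id } },
    lines: inv.line_items.map((li) => ({
      item: { code: li.sku },
      quantity: li.qty,
      price: li.unit_price_cents / 100, // A uses cents, B uses dollars
    })),
  };
}
```

Multiply that by every integration point in the enterprise and the “easy switching” story falls apart.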

OK, I’m with him so far. Not exactly breaking news here – we’ve seen the same integration issues inside the enterprise for decades. SaaS adds new wrinkles – an on-premises ERP vendor that goes belly-up can’t take your data with it or, worse, sell it to your competition, while a SaaS vendor can – but otherwise it sounds like the same old story to me.

However, where Nick loses me is when he recommends this solution:

“To overcome this conflict, it is imperative that we begin, now, to embark on a new approach. We need a single canonical mechanism for all enterprise app modules to integrate with each other. If done correctly, each enterprise will be able to pick and choose modules from different vendors and the integration will be smooth and relatively painless.”

Yeah, and if a frog had wings, it wouldn’t bump its ass when it hopped.1 There are so many things wrong with this approach, I’m not sure I can get them all into a single web post.

First off, it won’t, in fact, be done correctly – at least, not the first time. I realize everyone knocks MSFT for never getting an application right before version 3.0, but I believe it’s actually systemic to the industry. Whatever you think you know about the problem to be solved, it’s at best woefully incomplete and at worst wrong on all counts. So getting it right the first time is simply not possible. Getting it right the second time is very unlikely. It isn’t until the third time that you really start to get a handle on the problem you’re actually trying to solve. Getting an effort like this off the ground in the first place would be a Herculean task. Keeping it together thru a couple of bad spec revisions would be impossible.

Meanwhile, the vendors aren’t going to be sitting around twiddling their thumbs while the specs get done. We’ve seen efforts to unify multiple competing vendors around a single interoperable specification. By and large, those efforts (UNIX, CORBA, Java) have been failures. The technologies themselves haven’t been failures, but the idea that there was going to be “relatively painless” portability or interoperability among different vendors never really materialized. If it didn’t work for UNIX, CORBA or Java, what makes Nick think it will work for the significantly more complex concept of a shared global integration model? It’s not only more complex in terms of spec density, but also in the mind-boggling number of vendors in this space.

Nick is worried that either “we do this as a community or one vendor will do it and force it on the rest of us.” But if you look at how specifications evolve, retroactive recognition of de facto standards is the way the best standards get created. For example, I could argue that TCP was forced on us by the US Military, but I don’t hear anyone complaining. I realize there’s a big difference between having a vendor force a spec down our throats vs. a single big customer, but either way it’s not designed by committee. Besides, if we do get an enterprise integration standard forced on us, I don’t believe it will be the vendors doing the forcing. If I were a betting man, I’d bet on Wal-Mart. Business leverage trumps IT leverage, and Wal-Mart has more business leverage than anyone in this space these days.

BTW, would design-by-committee be an extreme example of BDUF (Big Design Up Front)? Do we really want to develop an integration model this critical using the same process that produced the XSD spec?

Finally, Nick thinks that this model will improve innovation, where I think it will have the exact opposite effect. Once you lay a standard in place, the way you innovate is to build proprietary extensions on top of that standard. However, by definition, those extensions aren’t going to be interoperable. If someone has a good idea, others will copy it and eventually it will become a de facto standard.
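To illustrate – with names I’m making up entirely – here’s what a proprietary extension on top of a canonical model looks like, and why the innovative bits die at every vendor boundary:

```typescript
// Hypothetical sketch: a "canonical" standard message and one vendor's
// proprietary extension on top of it.

// The lowest-common-denominator shape the standard defines.
interface CanonicalPurchaseOrder {
  orderId: string;
  supplierId: string;
  total: number;
}

// Vendor X innovates with a richer shape – useful, but by definition
// not part of the standard, so no other system can consume these fields.
interface VendorXPurchaseOrder extends CanonicalPurchaseOrder {
  demandForecast?: number[];  // proprietary: predicted reorder curve
  carbonFootprintKg?: number; // proprietary: sustainability data
}

// Interop code can only rely on the canonical subset; the innovative
// parts get silently dropped at the vendor boundary.
function toCanonical(po: VendorXPurchaseOrder): CanonicalPurchaseOrder {
  const { orderId, supplierId, total } = po;
  return { orderId, supplierId, total };
}
```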

A recent example of the process of de facto standardization is XMLHttpRequest. Microsoft created it in 1999 for IE 5, Mozilla copied it for their browser a couple of years later, followed by the other major browser vendors. Google innovated with it, Jesse James Garrett coined the term AJAX, everyone else started doing it, and then finally – nearly a decade later and still counting – a standards body is getting around to putting their stamp of approval on it.
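For the record, here’s roughly what that de facto standard looked like in practice before any spec existed (a minimal sketch; the endpoint URL is made up). The instantiation differed between IE 5/6 and everyone else, but the API surface – open, send, onreadystatechange – was copied essentially verbatim from vendor to vendor:

```typescript
// A minimal sketch of the classic cross-browser dance: the same API
// surface, copied from vendor to vendor long before any standards
// body blessed it.
function createXhr(): XMLHttpRequest {
  if (typeof XMLHttpRequest !== "undefined") {
    return new XMLHttpRequest(); // Mozilla, Safari, Opera, IE7+
  }
  // IE5/IE6 shipped it as an ActiveX control instead
  return new (window as any).ActiveXObject("Microsoft.XMLHTTP");
}

const xhr = createXhr();
xhr.open("GET", "/api/data", true); // async request to a made-up URL
xhr.onreadystatechange = () => {
  if (xhr.readyState === 4 && xhr.status === 200) {
    console.log(xhr.responseText);
  }
};
xhr.send();
```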

However, if you’re worried about painless integration and not having something forced on you by some vendor, then you’re not going to embrace those innovations – which means you won’t embrace any innovation. Well, there may be some innovation in the systems themselves that doesn’t affect the interface, but once that interface is cast in stone, the amount of innovation will go way down. How do vendors differentiate themselves? There are only two ways: price and innovation. Take away innovation with standardization, and you’re left with a race to rock-bottom prices and no incentive to actually improve the products.

I get where Nick is going with this. He looks around our enterprise and sees duplication of effort and massive resources being spent on hooking shit together. It sure would be nice to spend those resources on something more useful to the bottom line. But standardizing – or worse legislating – the problem out of existence isn’t going to work. What will? IMO, applying Nick’s ideas of Free Code to interop code. If we’re going to get IT out of the app business, can’t we get out of the integration business at the same time?


  1. It’s exceedingly rare that you get to quote Wayne’s World or Raising Arizona in a discussion about Enterprise Architecture, much less both at the same time. Savor it.

Caps 6, Penguins 5 (SO)

NHL.com Game Summary | NHL.com Game Recap

It wasn’t a pretty win, but I’ll take the two points all the same. Especially given the long history Washington has of losing to the Penguins – they had lost the previous six meetings coming into tonight. Japers’ roundup mirrors my own thoughts, though On Frozen Blog’s roundup was funnier – they made a drinking game out of the number of times the on-air announcers referenced Sid the Kid. Certainly, there’s no love lost for Crosby among Caps fans, but the amount of on-air time spent discussing an injured player bordered on ridiculous – Sid the Kid was mentioned 27 times by OFB’s count, plus five more in the post-game. Worst was probably Malkin’s first goal – he hadn’t even stopped celebrating and the announcer was already talking about Crosby.

Caps were pretty dreadful on special teams tonight. The Pens had three goals on eight power plays while the Caps went only one for six. However, the penalty kill came thru in overtime with the Caps down 5-on-3 for 1:07. The Caps won both defensive-zone faceoffs and blocked four shots – Quintin Laing had three of those blocks – and kept the Pens from registering a shot on goal for the entire power play. That was money. If they gave out game stars to unsung heroes, Laing would have gotten one.

For all the great young talent on the Caps, there’s got to be real concern about goaltending. I love Olie the Goalie, but he didn’t get it done tonight. The Penguins had a grand total of 15 shots (14 if you don’t count the one from beyond the blue line with one second left in overtime). 5 goals on 15 shots == a pretty crappy .667 save percentage. Malkin’s first goal was very impressive skating, but he didn’t so much shoot as throw the puck at the net. And letting Talbot open the scoring by stuffing the puck in at the post was weak sauce, as it were. I’m not so much worried about it for this season, but with Kolzig talking retirement amid his struggles, I’m not sure who the Caps have in the pipeline between the pipes.

It sure was fun getting to watch a Caps game in its entirety with my family. Patrick and Riley watched most of it. Patrick wanted to know who the bad guys were – he figured it out after I pointed out the Penguins were wearing black…like Darth Vader. 😄 Julie wanted to know how I’d handle it if Patrick grew up to be a professional hockey player but was drafted by Pittsburgh. My love for Paddy Boy far exceeds my hatred for the Penguins; that’s about the only scenario in which I could imagine rooting for them. My boy Patrick, however, protested and said he wanted to be a Capital and play with Alex the Great. Patrick will be 18 by the time Ovechkin’s contract is over. It could happen. Guess I gotta teach him to skate!