Morning Coffee 140

  • I only posted one Morning Coffee post last week. It wasn’t a lack of content; it was a lack of drive on my part. I had 20-30 items flagged in my news reader, but for some reason I couldn’t work up the interest in posting them. So some of these are a bit old.
  • I’m at the Language.NET Symposium this week, so look for lots of language blogging. I’ve already chatted with Tomáš Petříček and John Lam. If someone kicks Ted Neward’s ass because he hates Perl, I’ll try and liveblog it.
  • Speaking of Ted Neward, he discusses the question “Can Dynamic Languages Scale?” without devolving into a flame-fest. I agree 100% with his point about the difference between performance scaling and complexity scaling. Personally, I tend to err on the side of better complexity scaling, since buying hardware is easier than hiring developers.
  • Nick Malik responds to me calling his shared global integration vision flawed. He points to NGOSS/eTOM as an example of a shared iterative model that works. I know squat about that shared model, so I’ll refrain from commenting until I do a little homework on the telco industry.
  • Speaking of shared interop models, Microsoft is joining DataPortability.org. Dare Obasanjo and Marc Canter are skeptical, arguing that so far this effort is all hype and no substance. Reminds me a bit of AttentionTrust.org. But if DataPortability.org can get off the ground, maybe there’s hope for Nick’s vision (or vice versa).
  • Don Syme lists what’s new in the latest F# release. As I said, this release is pretty light on features. Hopefully, I’ll get some more details while I’m at the symposium this week.
  • Tomas Restrepo shows how to change your home folder in PowerShell. I need to do this.

Nick’s Flawed Vision of a Shared Integration Model

Of all the things you might say about Nick Malik, “thinks small” is not one of them. He takes on a significant percentage of the .NET developer community over the definition of Mort. He wants to get IT out of the applications business. He invents his own architecture TLA: SDA (aka Solution Domain Architecture). He’s a man on a mission, no doubt. And for the most part, I’m with him 110% on his ideas.

However, when he starts going on about a shared global integration model, I start to wonder if he has both hands on the steering wheel, as it were.

Nick’s been talking about this for over a year. As he points out, the SaaS integration layer is the new vendor lock-in. One of the attractions of SaaS is that you could – theoretically, anyway – switch SaaS application providers easily, which would drive said SaaS companies to constantly innovate. However, if the integration layers aren’t compatible, the cost to switch goes up dramatically, leaving the customer locked in to whatever SaaS company they initially bet on – even if that bet turns out to be bad.

OK, I’m with him so far. Not exactly breaking news here – we’ve seen the same integration issues inside the enterprise for decades. SaaS adds new wrinkles – at least an on-premises ERP vendor that goes belly-up can’t take your data with them or, worse, sell it to your competition – but otherwise it sounds like the same old story to me.

However, where Nick loses me is when he recommends this solution:

“To overcome this conflict, it is imperative that we begin, now, to embark on a new approach. We need a single canonical mechanism for all enterprise app modules to integrate with each other. If done correctly, each enterprise will be able to pick and choose modules from different vendors and the integration will be smooth and relatively painless.”

Yeah, and if a frog had wings, it wouldn’t bump its ass when it hopped.1 There are so many things wrong with this approach, I’m not sure I can get them all into a single web post.

First off, it won’t, in fact, be done correctly – at least, not the first time. I realize everyone knocks MSFT for never getting an application right before version 3.0, but I believe it’s actually systemic to the industry. Whatever you think you know about the problem to be solved, it’s at best woefully incomplete and at worst wrong on all counts. So getting it right the first time is simply not possible. Getting it right the second time is very unlikely. It isn’t until the third time that you really start to get a handle on the problem you’re really trying to solve. Getting an effort like this off the ground in the first place would be a Herculean task. Keeping it together through a couple of bad spec revisions would be impossible.

Meanwhile, the vendors aren’t going to sit around twiddling their thumbs waiting for the specs to be done. We’ve seen efforts to unify multiple competing vendors around a single interoperable specification before. By and large, those efforts (UNIX, CORBA, Java) have been failures. The technologies themselves haven’t been failures, but the idea that there was going to be “relatively painless” portability or interoperability among different vendors never really materialized. If it didn’t work for UNIX, CORBA or Java, what makes Nick think it will work for the significantly more complex concept of a shared global integration model? And it’s more complex not only in terms of spec density, but in the mind-boggling number of vendors in this space.

Nick is worried that either “we do this as a community or one vendor will do it and force it on the rest of us.” But if you look at how specifications evolve, retroactive recognition of de facto standards is the way the best standards get created. For example, I could argue that TCP was forced on us by the US military, but I don’t hear anyone complaining. I realize there’s a big difference between having a spec forced down our throat by a vendor vs. a single big customer, but either way it’s not designed by committee. Besides, if we do get an enterprise integration standard forced on us, I don’t believe it will be the vendors doing the forcing. If I were a betting man, I’d bet on Wal-Mart. Business leverage trumps IT leverage, and Wal-Mart has more business leverage than anyone in this space these days.

BTW, would design-by-committee be an extreme example of BDUF? Do we really want to develop this critical integration model using the same process that produced the XSD spec?

Finally, Nick thinks that this model will improve innovation, whereas I think it will have the exact opposite effect. Once you lay a standard in place, the way you innovate is to build proprietary extensions on top of that standard. However, by definition, those extensions aren’t going to be interoperable. If someone has a good idea, others will copy it and eventually it will become a de facto standard.

A recent example of this process of de facto standardization is XMLHttpRequest. Microsoft created it in 1999 for IE 5, Mozilla copied it for their browser a couple of years later, and the other major browser vendors followed. Google innovated with it, Jesse James Garrett coined the term AJAX, everyone else started doing it, and then finally – nearly a decade later and still counting – a standards body is getting around to putting its stamp of approval on it.
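For anyone who has only ever touched it through a library, the de facto XMLHttpRequest interface that got copied from browser to browser really is just a handful of members. Here’s a minimal sketch – TypeScript against the standard DOM typings, with a made-up URL and callback purely for illustration:

    // Fetch a resource with the bare XMLHttpRequest interface – essentially the
    // members IE 5 shipped in 1999 and the other browsers later copied.
    function fetchText(url: string, onDone: (body: string) => void): void {
      const xhr = new XMLHttpRequest();
      xhr.open("GET", url, true);            // verb, URL, async flag
      xhr.onreadystatechange = () => {
        // readyState 4 means the request is complete
        if (xhr.readyState === 4 && xhr.status === 200) {
          onDone(xhr.responseText);
        }
      };
      xhr.send(null);
    }

    // Hypothetical usage: grab a fragment and log it.
    fetchText("/fragments/latest.html", (body) => console.log(body));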

However, if you’re worried about painless integration and not having something forced on you by some vendor, then you’re not going to embrace these innovations – which means you won’t embrace any innovation. Well, there may be some innovation in the systems themselves that doesn’t affect the interface, but once that interface is cast in stone, the amount of innovation will go way down. How do vendors differentiate themselves? There are only two ways: price and innovation. Take away innovation with standardization, and you’re left with a race to rock-bottom prices and no incentive to actually improve the products.

I get where Nick is going with this. He looks around our enterprise and sees duplication of effort and massive resources being spent on hooking shit together. It sure would be nice to spend those resources on something more useful to the bottom line. But standardizing – or, worse, legislating – the problem out of existence isn’t going to work. What will? IMO, applying Nick’s ideas of Free Code to interop code. If we’re going to get IT out of the app business, can’t we get out of the integration business at the same time?


  1. It’s exceedingly rare that you get to quote Wayne’s World or Raising Arizona in a discussion about Enterprise Architecture, much less both at the same time. Savor it.

Caps 6, Penguins 5 (SO)

NHL.com Game Summary | NHL.com Game Recap

It wasn’t a pretty win, but I’ll take the two points all the same. Especially given the long history Washington has of losing to the Penguins – they had lost the past six meetings before tonight. Japers’ roundup mirrors my own thoughts, though On Frozen Blog’s roundup was funnier – they made a drinking game out of the number of times the on-air announcers referenced Sid the Kid. Certainly, there’s no love lost for Crosby among Caps fans, but the amount of on-air time spent discussing an injured player bordered on ridiculous – Sid the Kid was mentioned 27 times by OFB’s count, plus five more in the post-game. The worst was probably Malkin’s first goal – he hadn’t even stopped celebrating and the announcer was already talking about Crosby.

The Caps were pretty dreadful on special teams tonight. The Pens scored three goals on eight power plays, while the Caps went only one for six. However, the penalty kill came through in overtime with the Caps down 5-on-3 for 1:07. The Caps won both defensive zone faceoffs and blocked four shots – Quintin Laing had three of those blocks – and kept the Pens from registering a shot on goal for the entire power play. That was money. If they gave out game stars to unsung heroes, Laing would have gotten one.

For all the great young talent on the Caps, there’s got to be real concern about goaltending. I love Olie the Goalie, but he didn’t get it done tonight. The Penguins had a grand total of 15 shots (14 if you don’t count the one from beyond the blue line with one second left in overtime). 5 goals on 15 shots == a pretty crappy save percentage. Malkin’s first goal was very impressive skating, but he didn’t so much shoot as throw the puck at the net. And letting Talbot open the scoring by stuffing the puck in at the post was weak sauce, as it were. I’m not so much worried about it for this season, but with Kolzig talking retirement amid his struggles, I’m not sure who the Caps have in the pipeline between the pipes.

It sure was fun getting to watch a Caps game in its entirety with my family. Patrick and Riley watched most of it. Patrick wanted to know who the bad guys were – he figured it out after I pointed out the Penguins were wearing black…like Darth Vader. 😄 Julie wanted to know how I’d handle it if Patrick grew up to be a professional hockey player but was drafted by Pittsburgh. My love for Paddy Boy far exceeds my hatred for the Penguins, though that’s the only scenario in which I could imagine rooting for the Penguins. My boy Patrick, however, protested and said he wanted to be a Capital and play with Alex the Great. Patrick will be 18 by the time Ovechkin’s contract is over. It could happen. Guess I gotta teach him to skate!

Morning Coffee 139

  • Big news on the WGA strike front: the AMPTP reached a deal with the Directors Guild last week. Initial reaction from United Hollywood is mixed, but I’m hopeful this will at least get the AMPTP / WGA talks started again.
  • Speaking of new media, Xbox 360 Fanboy has a rundown of 45 short films from Sundance that are getting released on Xbox Live Marketplace. That’s pretty atypical content for XBLM – typically, new content there has come from “Hollywood Heavyweights”. I’m pretty excited to see them branch out content-wise.
  • Speaking of Xbox 360, seems they had a good year. Congrats!
  • Still speaking of Xbox 360, everyone gets a free copy of Undertow this week.
  • Scott Guthrie announces the availability of the .NET Framework Source Code. Shawn Burke has instructions for how to use it with VS08. So far, they’ve made the core base class libraries, ASP.NET, Windows Forms, WPF, ADO.NET and XML available. LINQ, WCF and WF are expected to become available “in the weeks and months ahead”.
  • Ted Neward wonders if Java is “Done” like the Patriots, or “Done” like the Dolphins. If you want my opinion (I’m guessing yes, since you’re reading my blog), definitely done like the Dolphins. OpenJDK was a desperation move to make Java “cool” again, but it won’t work. People who want an open source stack are using LAMP, and the language wonks who saw Java as mainstream Smalltalk have moved on to Ruby. The question is whether Sun buying MySQL will make Sun cool or MySQL uncool by association. I’m guessing the latter.
  • Speaking of Ted, he’s got a great post about the relevance of game programming to the mainstream or enterprise developer.
  • Speaking of game development, David Weller points to all the new XNA GS 2.0 content up on Creators Club Online.
  • There’s a new version (1.9.3.14) of F# out, but no announcement from Don regarding what’s new. I reviewed the release notes; it seems like this is primarily a bug-fix release with only very minor feature additions.
  • Speaking of F#, Don points to Greg Neverov’s implementation of Software Transactional Memory in F#. This immediately reminded me of Tim Sweeney’s Next Mainstream Programming Language talk. Tim suggested that such a language would need to support a combination of side-effect-free functional code and software transactional memory. F# looks more like that language all the time.
  • Still speaking of F#, Don Syme’s Expert F# book is out. I read the draft version – it rocks – but I’m still going to get my own real copy. You should too.
  • With their win Saturday, the Caps are back to .500 for the first time since late October. Since Thanksgiving, the Caps are 15-7-4. Only four teams in the league have a better record over that time span. We play one of them tonight – the Penguins – and it’s on Versus, so I’ll even get to see it. In HD no less.

365 and Counting

Steve Benen points out:

You may have noticed, on bumpers or t-shirts, the “1.20.09” slogan. It denotes, of course, Inauguration Day for Bush’s successor.

I just thought I’d mention that after seven painful years, the Bush presidency will end exactly one year from today. It’s obviously something to look forward to.

It’s an awkward period in Democratic politics right now — a contentious presidential primary, a frustrated Democratic Congress — but looking at the light at the end of the tunnel, and knowing it’s probably not a train, might serve as a morale booster.

From the morale perspective, it’s worth noting the voter turnout in the primaries so far. Yesterday’s Nevada Democratic caucuses were the third contest in a row to set turnout records. On the Republican side, Benen points out that McCain won South Carolina yesterday with 135,000 votes, but eight years ago he got nearly 100,000 more votes in a losing effort. By my calculation, Republican turnout in South Carolina was about 30% lower than it was in 2000.

Voter turnout in early primaries doesn’t make this election a sure thing by any means, but it sure is an encouraging sign.