Hating the Term Web 2.0

Now that I’m an Architect on the Edge (I’m thinking of putting that on my business card. Good idea or bad idea?) of course the first order of business is taking a closer look at “Web 2.0”. One thing leaps out at me right away – I hate the name “Web 2.0”.

First off, it’s a pure marketing buzzword. It was originally coined as a conference name. In a way, the fact that it has no underlying meaning is a good thing, because it gives people something to argue about: does it really exist or not? It’s like the word “multimedia” back when we were first putting CD-ROMs into computers. There used to be lots of discussion about whether one thing or another truly was “multimedia”. Now we don’t really worry about categorizing it, as the marketing buzz around the term is long gone.

Secondly, I think it’s wildly arrogant to claim we’re only on version 2.0. The Internet has been around for 36 years. So everything before mid-2004 was Web 1.0 or earlier? And people are already talking about Web 3.0. Come on, let’s get real. The technologies driving the current revolution have been percolating for more than one major version of the underlying technology.

Finally, what’s with the version number anyway? One of the core principles Tim O’Reilly outlined was the “End of the Software Release Cycle”. Why are we using a holdover from the software release cycle days to indicate the end of the software release cycle?

Don’t get me wrong, I strongly believe that there is dramatic change happening in this industry. The way I explain my new job is to consider that one of the most basic axioms of distributed computing has been overturned.

From day one, all the computing power has been focused in the center. At first, the machines on the edge had no power at all – they were just dumb terminals. Slowly but surely, those machines at the edge started to become powerful in their own right. However, it’s only in the last seven to ten years that commodity hardware that was pervasive on the edge grew powerful enough to power the center. And it’s only in the last two or three years that the connection between the center and the edge grew fast and pervasive enough to make that edge power relevant.

The rules have changed. The power has shifted from the center to the edge. And we’re only just beginning to see the effects.

Maybe we should call it WebNT? 😄

TechieWife Back Online

The hardware failure that caused DevHawk to be down in early December also took out my wife’s blog TechieWife. But she just started up a new TechieWife MSN Space. Subscribed (of course).

Architect on the Edge

So for the fourth time in seven months, I have a new manager. Way way way back at the end of June, I left marketing to be a solution architect. In doing so, I traded in Norman for John as my manager. Things were looking shiny, but then Microsoft had a major reorg back in September. These big reorgs often cause small ripples, like the director of Architecture Strategy deciding to move back to a product group. John was promoted to head the team and Gianpaolo was hired to take over John’s previous role as head of solution architecture. Finally, it seemed like things were calming down on the management front, so I was caught off-guard by what happened next.

Last Monday, John offered me a chance to switch roles. We have spun up an Edge Architecture team to focus on the architectural impact of the next generation of computing – what some call “Web 2.0” – and he thought I would be a perfect fit for the team. I agreed and jumped at the chance. So now I work for Michael Platt. Don’t let the infrastructure focus of Michael’s blog fool you – he’s incredibly deep on Edge Architecture and I am currently playing catch-up. The strangest thing so far is having to re-orient my thinking toward consumer-focused systems and away from the enterprise world where I’ve gotten the vast majority of my experience and where I have spent my entire Microsoft career to date.

In the immediate future, I’ve been dumped into the MIX06 planning process. We have other stuff going on in the MIX timeframe that I’ll get to soon enough. I’m also re-thinking my personal presence. As my career progresses, the moniker “DevHawk” seems more and more outdated. Is it?

CNET Picks Up Outlook Integration Sample News

I couldn’t blog about the Outlook Integration Sample last week because we were waiting to see if the mainstream media would pick up the news. CNET ran an article about the sample today. There’s also a link to the Customer Explorer case study that was the original inspiration for the Outlook Integration Sample and VSTO Outlook in the first place.

Evolving Language Fidelity

I haven’t seen much in the way of response to my Hi-Fi Models post, but I did come across this great article by Ted Neward on the history of the tumultuous marriage of objects and relational databases, primarily in the context of LINQ. From a Code is Model perspective, the following passage from the summary was the most interesting:

While Project LINQ doesn’t purport to be the “final answer” to all of the world’s object-relational mismatch problems, it does represent a significant shift in direction to solving the problem; instead of taking an approach that centers around code generation, or automated mapping based around metadata and type inference, both of which are exercises in slaving the relational model to an object-oriented one, Project LINQ instead chooses to elevate relations and queries as a first-class concept within language semantics and library-based extensions.
[Comparing LINQ and Its Contemporaries, Ted Neward]

When Ted says relations and queries are elevated to “first class concepts” within the language, it makes me think of Stuart’s comment about language fidelity. I’m not sure I would say C# 3.0 is at a higher level of abstraction than 2.0, but I would say that the inclusion of these new abstractions does improve the language’s fidelity. This fidelity improvement does come at the cost of complexity (TANSTAAFL), but compared to the current alternatives, I’m willing to pay that price.
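To make that “first class” idea concrete, here’s a minimal sketch of C# 3.0 query syntax. The Customer type and the in-memory customers list are hypothetical stand-ins (a real app would more likely query a LINQ to SQL DataContext), but the point is the same: the query is a compiler-checked language construct, not a string handed to the database at runtime.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical type used only for illustration.
class Customer
{
    public string Name { get; set; }
    public string City { get; set; }
}

class Program
{
    static void Main()
    {
        var customers = new List<Customer>
        {
            new Customer { Name = "Contoso",  City = "Seattle" },
            new Customer { Name = "Fabrikam", City = "Portland" }
        };

        // The query is part of the language itself: the compiler checks
        // the property names and types, instead of a SQL string failing
        // at runtime.
        var seattleNames = from c in customers
                           where c.City == "Seattle"
                           orderby c.Name
                           select c.Name;

        foreach (var name in seattleNames)
            Console.WriteLine(name);
    }
}
```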

The problem with increasing the language fidelity like this is dealing with the outdated code it leaves behind. You see this today with the addition of generics in the 2.0 CLR. How many hand-coded or generated strongly typed collections are floating around out there from the 1.x days? Lots. (As if 1.x was so long ago!) How much database access code is floating around out there today? An astronomical amount. Every app that touches a database or processes XML will be outdated with the arrival of C# 3.0 and VB 9.0. But the price of converting this outdated code to use the new abstractions probably won’t be worth the time or risk. That means you’re left with maintaining the outdated code while also writing any new functionality with the new language features.
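For a concrete sense of what that leftover code looks like, here’s a rough sketch contrasting the 1.x-era hand-coded strongly typed collection pattern with its generic replacement. The Order and OrderCollection types are hypothetical examples, not from any real codebase.

```csharp
using System;
using System.Collections;
using System.Collections.Generic;

// Typical 1.x-era pattern: a hand-written (or generated) strongly typed
// wrapper around CollectionBase for every element type in the app.
class OrderCollection : CollectionBase
{
    public void Add(Order order) { List.Add(order); }

    public Order this[int index]
    {
        get { return (Order)List[index]; }
        set { List[index] = value; }
    }
}

// Hypothetical element type used only for illustration.
class Order
{
    public int Id;
}

class Program
{
    static void Main()
    {
        // With 2.0 generics, the hand-written wrapper is unnecessary...
        List<Order> orders = new List<Order>();
        Order first = new Order();
        first.Id = 1;
        orders.Add(first);
        Console.WriteLine(orders.Count);

        // ...but the old OrderCollection classes still linger, and porting
        // them usually isn't worth the time or the risk.
        OrderCollection legacy = new OrderCollection();
        legacy.Add(first);
        Console.WriteLine(legacy[0].Id);
    }
}
```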

I wonder how DSLs will be impacted by this evolving language fidelity issue. On the one hand, the nature of DSLs is that they have much narrower usage (i.e. one domain) than something like generics or LINQ. On the other hand, I expect DSLs to evolve faster than mainstream general-purpose languages like C# can. So I’m thinking the impact will be about the same.