Evolving Language Fidelity

I haven’t seen much in the way of response to my Hi-Fi Models post, but I did come across this great article by Ted Neward on the history of the tumultuous marriage of objects and relational databases, primarily in the context of LINQ. From a Code is Model perspective, the following passage from the summary was the most interesting:

While Project LINQ doesn’t purport to be the “final answer” to all of the world’s object-relational mismatch problems, it does represent a significant shift in direction to solving the problem; instead of taking an approach that centers around code generation, or automated mapping based around metadata and type inference, both of which are exercises in slaving the relational model to an object-oriented one, Project LINQ instead chooses to elevate relations and queries as a first-class concept within language semantics and library-based extensions.
[Comparing LINQ and Its Contemporaries, Ted Neward]

When Ted says relations and queries are elevated to “first-class concepts” within the language, it makes me think of Stuart’s comment about language fidelity. I’m not sure I would say C# 3.0 is at a higher level of abstraction than 2.0, but I would say that the inclusion of these new abstractions does improve the language’s fidelity. This fidelity improvement does come at the cost of complexity (TANSTAAFL), but compared to the current alternatives, I’m willing to pay that price.
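
To make that concrete, here is roughly what a query expression looks like in the C# 3.0 previews (the details may well shift before release). The Customer type and the data are made up purely for illustration; the point is that the compiler, rather than a string handed to some data-access API, understands the query:

    using System.Linq;

    // Hypothetical type, purely for illustration.
    class Customer
    {
        public string Name;
        public string City;
    }

    class QuerySketch
    {
        static void Main()
        {
            var customers = new[]
            {
                new Customer { Name = "Contoso", City = "Seattle" },
                new Customer { Name = "Fabrikam", City = "Portland" }
            };

            // The query is part of the language, not an opaque string.
            // The compiler checks the property names and infers the result type.
            var seattleNames = from c in customers
                               where c.City == "Seattle"
                               select c.Name;

            foreach (var name in seattleNames)
                System.Console.WriteLine(name);
        }
    }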

The problem with increasing the language fidelity like this is dealing with the outdated code it leaves behind. You see this today with the addition of generics in the 2.0 CLR. How many hand-coded or generated strongly typed collections are floating around out there from the 1.x days? Lots. (As if 1.x was so long ago!) How much database access code is floating around out there today? An astronomical amount. Every app that touches a database or processes XML will be outdated with the arrival of C# 3.0 and VB 9.0. But the price of converting this outdated code to use the new abstractions probably won’t be worth the time or risk. That means you’re left with maintaining the outdated code while also writing any new functionality with the new language features.
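
For the generics case, the leftover code I mean looks something like the sketch below. The OrderCollection and Order types are invented for illustration, but the pattern should be familiar from just about any 1.x codebase:

    using System.Collections;
    using System.Collections.Generic;

    // The 1.x way: a hand-coded strongly typed wrapper over CollectionBase.
    // Multiply this by every type you ever needed to put in a list.
    class OrderCollection : CollectionBase
    {
        public void Add(Order order) { List.Add(order); }

        public Order this[int index]
        {
            get { return (Order)List[index]; }
        }
    }

    class Order { }

    class Example
    {
        static void Main()
        {
            // The 2.0 way: one generic type covers them all, with the same
            // static type safety and no hand-written wrapper to maintain.
            List<Order> orders = new List<Order>();
            orders.Add(new Order());
        }
    }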

I wonder how DSLs will be impacted by this evolving language fidelity issue? On the one hand, the nature of DSLs is that they have much narrower usage (i.e. one domain) than something like generics or LINQ. On the other hand, I expect DSLs to evolve faster than general mainstream languages like C# can. So I’m thinking the impact will be about the same.

Outlook Integration Sample

For the past few months, I’ve been heavily involved in a project that I wasn’t allowed to blog about. Last week, it went live on MSDN, so the gag is finally off.

About a year ago, word started to surface about something called Project Elixir, which aimed to integrate back-end CRM systems with Microsoft Outlook. Part of that effort resulted in the addition of Outlook Managed Add-ins to Visual Studio 2005 Tools for Office. However, the VSTO team’s primary deliverable was an add-in loader that enforced security, enabled unloading at shutdown and provided a better startup/shutdown developer experience than IDTExtensibility2. (Check out the VSTO Outlook Architecture document for more details.) While those are important fundamentals to get right, VSTO Outlook doesn’t provide much in the way of tools or guidance for building Outlook add-ins that leverage managed forms and controls or integrate with your back-end systems. That’s where the CRM Integration for Outlook sample comes in.

What we’ve built is a sample application that surfaces CRM-style data inside Outlook. Outlook is the natural home for your calendar and your personal contacts. Why not make it the natural home for your customer contacts, activities and opportunities as well? As part of the demo project, we’ve implemented:

  • Using Windows Forms for editing custom items. Check out this screenshot. The Activity form is a standard managed Windows Forms form, not an Outlook custom form.
  • Using a Windows Forms user control as a folder home page. Here’s a screenshot of the “CRM Today” page. Again, that’s a standard managed Windows Forms user control.
  • A framework for adding menu items and toolbars. In Outlook, the developer has to add custom toolbars and menus to each explorer and inspector window themselves. With our sample, we built a framework to handle that for you.
  • Using SQL Express as a local cache of CRM data. It turns out that for many scenarios, storing a copy of all the back-end data directly in Outlook is a bad idea. First, it increases the size of the user’s mailbox, requiring more storage on the Exchange server. Furthermore, any custom data in Outlook has to be synced twice – once from the back-end system to Outlook on the desktop, then from Outlook back to Exchange. By minimizing the amount of back-end data stored in Outlook proper, we reduce the mailbox size and sync bandwidth needs. In both the above screenshots, the displayed data is coming out of the local SQL Express instance, not Outlook.
  • Having two separate storage locations (Outlook & SQL Express) means having to sync between them. We’ve built a local sync engine that can sync both individual items between Outlook and SQL Express and collections of items between SQL Express and a given Outlook folder.
  • Finally, there are some utility classes to make it easier to deal with Outlook folders and items. Of primary note is the ItemAdapter class, which provides a pseudo base class for Outlook items (appointments, emails, tasks, etc.). Those items all have a set of similar properties and methods but don’t have a common base class, so they can’t be treated polymorphically. ItemAdapter uses runtime reflection to implement those common operations without needing to cast to the concrete Outlook item type (a rough sketch of the idea follows below).
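
This isn’t the sample’s actual ItemAdapter code, just a minimal sketch of the reflection trick, assuming the wrapped object is any late-bound Outlook item; the members shown are ones the common item types happen to share:

    using System.Reflection;

    // Sketch only: MailItem, AppointmentItem, TaskItem and friends all expose
    // Subject and Save but share no common base class, so we go through
    // late-bound reflection instead of casting to a concrete COM type.
    class ItemAdapterSketch
    {
        private readonly object item;   // any Outlook item (COM object)

        public ItemAdapterSketch(object outlookItem)
        {
            item = outlookItem;
        }

        public string Subject
        {
            get
            {
                return (string)item.GetType().InvokeMember(
                    "Subject", BindingFlags.GetProperty, null, item, null);
            }
            set
            {
                item.GetType().InvokeMember(
                    "Subject", BindingFlags.SetProperty, null, item,
                    new object[] { value });
            }
        }

        public void Save()
        {
            item.GetType().InvokeMember(
                "Save", BindingFlags.InvokeMethod, null, item, null);
        }
    }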

Check out the Architecture Design Guide, as well as the Outlook Customization Guide and the Local Sync Engine Guide, up on the Solution Architecture Center. You can also pick up the source code. Also, I spun up a GDN Workspace to host a discussion forum and to track bugs and requests.

Going forward, I’m going to be focusing on the remote data sync story for this scenario. Among other responsibilities, I “own” the Data pillar of our Connected Systems model so this dovetails nicely. You’ll note above that while we have a local sync engine in the sample, we don’t have any way to move the data back and forth between the local copy in SQL Express and the remote copy in the CRM back-end. We are working on some guidance around this right now, but we didn’t want to hold up publishing the rest of the sample.

Frankly, it’s been nice to be involved with something so technical after spending time on the marketing team. I’m pretty proud of the project and I look forward to your feedback.

Update: Removed the link to the running demo as it’s been taken off the download site for reasons I am not aware of. If you want the binary and you don’t know how to compile it, drop me a mail.

My New Boss is Blogging

Gianpaolo took over the Solution Architecture team a few months ago, and he rebooted his blog a couple of weeks ago. He’s also active on the aforementioned Architecture Forums. Subscribed.

Architecture Forums

Simon just emailed a bunch of internal architects about the new Architecture Forums on Microsoft.com. So far there’s a general architecture forum as well as one for modeling and tools, with more to come I would guess.

What other forums do you think we need?

Hi-Fi Models

I’m slowly but surely working through my holiday backlog of email and weblogs. Slowly being the operative word here.

Anyway, Stuart has a great post on the process by which we build models. And he’s not talking theoretically here; he’s working on a model for the designer definition file in the DSL toolkit. (Which is good news in and of itself, as hand-writing the XML dsldd file is a pain in the butt. Though until then, there’s the great Dm2Dd tool from Modelisoft.) The iterative process he describes certainly looks a lot like development, in the same way that C# development looks like C development: similar steps taken on different concepts. Additionally, he’s working bottom up – the output of the model will eventually be a working program (a designer in this case), which is the point I made in Abstraction Gap Leapfrog. There are existing abstractions that work now (i.e. the code generated from the existing dsldd file) and he’s trying to build something one level up from there.

I also like Stuart’s use of “fidelity” instead of my use of “complete”. Stuart uses it as an indication of how correct a given model is. That’s what I was implying when I said “complete”, but “fidelity” captures the idea much better. I could imagine both lo-fidelity and hi-fidelity models for a given domain, though you would presumably always want to use the highest fidelity model available. The difference would be the amount of custom code you have to write – the higher the model fidelity, the less code you write by hand. I would also expect a model’s fidelity to evolve over time, which introduces interesting questions regarding language evolution as well as the evolution of projects built with those languages.

I hope Stuart keeps blogging about this project.