The Information Revolution Is Just Getting Started

(Harry is on a secret mission in uncharted space this week, so instead of the daily Morning Coffee post, you get a series of autoposted essays. As this post is about Web 2.0, it’s obviously fairly old, from his previous role @ Microsoft.)

A friend of mine is doing some research into Internet topics, including Web 2.0. After reading dozens of articles each with a different definition, she asked me to sum up Web 2.0 in thirty seconds or less.

Web 2.0 is the latest evolution of our post-industrial society, driven primarily by ubiquitous access to Internet-connected computing devices.

Got it down to just one sentence and it only takes about fifteen seconds to say. The critical thing to notice about that statement is what it doesn’t include:

  • No mention of specific technology outside of “Internet” and “computing devices”. That means no acronym-laden techno-babble such as AJAX, REST, SOAP or XML.
  • No mention of a specific platform or vendor. That means no references to Microsoft, Google, IBM, Yahoo, Sun or Apple. Likewise, there’s no mention of open source software projects like Linux, Apache or Ruby on Rails.
  • No mention of Tim O’Reilly’s principles of Web 2.0. That means no web as platform, harnessing collective intelligence, or the end of the software release cycle.

This isn’t to say these technologies, platform vendors and principles aren’t important. They are. However, they aren’t the whole of what’s happening; they are only pieces of the bigger picture. Exploring them individually without understanding the larger context is like the blind men and the elephant.

I’ve recently been reading Alvin Toffler’s The Third Wave. It’s fascinating to read a book about the future that was written twenty-five years ago. His opinion is that the industrial age peaked in the mid-1950s and that the post-industrial age has been building steam ever since. Not coincidentally, in my opinion, the late fifties saw the first transistor-based computers as well as the earliest work on computer networking. It is because of this intertwined history that this post-industrial age is often called the Information Age.

While it’s been building for half a century, the Information Age is only just getting started when it comes to remaking society. Over the course of three centuries, the Industrial Age gave rise to societal concepts such as the nuclear family, the school system and the corporation. It created the role of the bureaucrat. It separated producers from consumers, giving rise to the idea of the market. It changed our view of the universe by precisely defining units of time and space. It got its energy from non-renewable sources, such as fossil fuels. In short, the Industrial Age completely remade the world. The Information Age will have equally far-reaching effects before it’s done. I believe Web 2.0 is the next step in this evolution.

Toffler identified six principles of the Industrial Age: Standardization, Specialization, Synchronization, Centralization, Maximization and Concentration. The relevance of each of these principles is dropping rapidly as we shift out of the Industrial Age. For example, weblogs represent a massive de-centralization of the news media. Online retailers like Amazon.com replaced the standardized shopping experience with a personalized one. Digital video recorders and online video sharing sites eliminate the synchronization of broadcast TV.

For each principle of the Industrial Age, there are examples of Web 2.0 companies working against it.

Morning Doughnuts 5

  • Joel Dehlin, a former Microsoft employee and the CIO of the LDS Church, is conducting a series of tech talks. The next one is being planned for the Bay Area. If you are interested, you can respond to his post here. The dates would be April 22 – 26, with a tentative agenda as follows:

    • Keynote
    • Infrastructure breakout
    • Development breakout
    • Interaction Design breakout
    • Community breakout
    • Building-to-building video breakout
  • Everything needs a 12-step program now. CNN has a 12-step program to cure your email addiction here. I started thinking about this after Harry’s post saying he had hit zero email bounce prior to going on his secret mission.

  • I read an interesting blog post on XNA and how it fits into Microsoft’s strategy in gaming. I am not sure I agree with all of the points, but I found the arguments compelling.

  • My BYU Cougars are now up to No. 21 in the AP Poll. I can’t think of a year when both the football and basketball teams have had such successful seasons.

  • Between today and tomorrow I will be finalizing my vision document for how I think monitoring should work in the Service-Oriented Infrastructure project I am on. As I was outlining my vision it really hit me how much there is to do.

Morning Doughnuts 4

  • According to Reuters, surgeons who play video games are more skilled. Remind me to ask the doctor if s/he owns an Xbox 360 the next time I am getting operated on.
  • I have reached the National Championship game in dynasty mode of NCAA Football 2007. The opponent of my BYU Cougars…why that would be Harry’s alma mater, the USC Trojans. Funny how that worked out.
  • Nicholas Allen writes in his blog about when you should use Indigo to write a channel and, more importantly, when you should not. As most of you know, Harry and I are doing quite a bit of work with WCF, so we are interested in this type of advice.
  • Our team has been thinking about how to manage a large number of services in an automated fashion. This would include deploying new services, monitoring them, automatically handling scaling, service discovery, and automated provisioning, to name a few possible capabilities. I almost think of it as the next version of UDDI, especially when it comes to provisioning. I think that as systems become more distributed, the ability to automatically manage them is going to be key to their success. I know that some thought has already gone into this area by people far smarter than I, but as I consider how to operate an infrastructure with thousands of services, it is apparent that there is an opportunity for us to design and implement a systems management framework that automates the majority of these tasks. I need to spend some time considering how the framework would work and documenting its capabilities.

Internal DSLs in PowerShell

(Harry is on a secret mission in uncharted space this week, so instead of the daily Morning Coffee post, you get a series of autoposted essays. This post combines both some leftover learnings about Ruby from Harry’s Web 2.0 days with his recent obsession with PowerShell.)

My first introduction to the idea of internal DSLs was an article on Ruby Rake by Martin Fowler. Rake is Ruby’s make/build utility. Like other build tools such as Ant and MSBuild, Rake is a dependency management system. Unlike Ant and MSBuild, Rake doesn’t use an XML-based language; it uses Ruby itself, which has huge benefits when you start writing custom tasks. In Ant or MSBuild, building a custom task requires an external environment (a batch file, script file or custom compiled task object). In Rake, since the build file is just Ruby, you can write imperative Ruby code in place.

Here’s the simple Rake sample from Fowler’s article:

task :codeGen do  
  # do the code generation
end

task :compile => :codeGen do  
  # do the compilation
end

task :dataLoad => :codeGen do  
  # load the test data
end

task :test => [:compile, :dataLoad] do  
  # run the tests
end

The task keyword takes three parameters: the task name, an array containing the task dependencies, and a script block containing the code to execute to complete the task. Ruby’s flexible syntax allows you to specify a task without any dependencies (:codeGen), with a single dependency (:compile => :codeGen), and with multiple dependencies (:test => [:compile, :dataLoad]).

So what would this look like if you used PowerShell instead of Ruby? How about this:

task codeGen {  
  # do the code generation
}

task compile codeGen {
  # do the compilation
}

task dataLoad codeGen {  
  # load the test data
}

task test compile,dataLoad {
  # run the tests
}

Not much different. PS uses braces for script blocks while Ruby uses do / end, but that’s just syntax. Since it lacks Ruby’s concept of symbols (strings that start with a colon), PS has to use strings instead. Otherwise, it’s almost identical. They even both use the # symbol for line comments.

There is one significant difference. For tasks with dependencies, Rake uses a hash table to package the task name and its dependencies. The => syntax in Ruby creates a hash table, and because the hash is the last argument to the method, you can leave off the surrounding braces. The key of this single-item hash table is the task name, while the value is an array of the task names this task depends on. Again, Ruby’s syntax is flexible, so if you have only a single dependency, you don’t need to surround it in square brackets.

In PowerShell, the hash table syntax isn’t quite so flexible; you have to surround it with @{ }. So using Rake’s syntax directly would result in something that looked like “task @{test = compile,dataLoad} {…}”, which is fairly ugly. You don’t need to specify the square brackets on the array, but having to add the @{ is a non-starter, especially since you wouldn’t have it on a task with no dependencies.
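
For reference, here’s what a genuine PowerShell hash table literal looks like. This little snippet is purely my illustration of the syntax, not part of the proposed DSL:

# PowerShell's hash table literal syntax (illustration only)
$spec = @{ test = "compile","dataLoad" }
$spec.Keys          # test
$spec["test"]       # compile, dataLoad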

So instead, I thought a better approach would be to use PS’s variable parameter support. Since all tasks have a name, the task function is defined simply as “function task ([string] $name)”. This basically says there’s a function called task with at least one parameter called $name. (All variables in PS start with a dollar sign.) Any parameters that are passed into the function that aren’t specified in the function signature are passed into the function in the $args variable.
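
To make the mechanics concrete, here’s a small illustration of how the extra arguments land in $args; show-args is just a throwaway helper of mine, not part of the task implementation:

# throwaway helper showing how extra arguments land in $args
function show-args([string] $name) {
  "name:   $name"
  "extras: $($args.count)"
}

show-args test compile,dataLoad { "run the tests" }
# name:   test
# extras: 2    (the dependency array and the script block)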

This approach does mean having to write logic in the function to validate the $args parameters. Originally, I specified all the parameters, so that it looked like this: “function global:task([string] $name, [string[]] $depends, [scriptblock] $taskDef)”. That didn’t work for tasks with no dependencies, since it tried to pass the script block in as the $depends parameter.

Here’s a sample implementation of the task function used above. It validates the $args input and builds a custom object that represents the task. (Note: the various PS* objects are in the System.Management.Automation namespace; I omitted the namespaces to keep the code readable.)

function task([string] $name) {
  # validate: a task needs a name and takes at most two extra arguments
  # (a dependency list and/or a script block)
  if (($args.length -gt 2) -or ([string]::IsNullOrEmpty($name))) {
    throw "task syntax: task name [<dependencies>] [<scriptblock>]"
  }

  # figure out which optional arguments were actually passed
  if ($args[0] -is [scriptblock]) {
    # no dependencies, just a script block
    $taskDef = $args[0]
  }
  elseif ($args[1] -is [scriptblock]) {
    # dependencies followed by a script block
    $depends = [object[]]$args[0]
    $taskDef = $args[1]
  }
  else {
    # dependencies only; if a script block isn't passed in, use an empty one
    $depends = [object[]]$args[0]
    $taskDef = {}
  }

  # build a custom object with Name and Dependencies note properties
  # and an ExecuteTask script method that runs the task's script block
  $task = new-object PSObject
  $nameProp = new-object PSNoteProperty Name,$name
  $task.psobject.members.add($nameProp)
  $dependsProp = new-object PSNoteProperty Dependencies,$depends
  $task.psobject.members.add($dependsProp)
  $taskMethod = new-object PSScriptMethod ExecuteTask,$taskDef
  $task.psobject.members.add($taskMethod)
  $task
}
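
Assuming the function above has been loaded, here’s a quick, purely illustrative usage example:

# define a task using the function above and poke at the resulting object
$t = task test compile,dataLoad { "running the tests" }
$t.Name             # test
$t.Dependencies     # compile, dataLoad
$t.ExecuteTask()    # running the tests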

Of course, you would need much more than this if you were going to build a real build system like Rake in PowerShell. For example, you’d need code to collect the tasks, order them in the correct dependency order, execute them, etc. Furthermore, Rake supports other types of operations, like file tasks and utilities that you’d need to build.
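
As a rough sketch of what the dependency-ordering piece might look like (my speculation, not anything from Rake), assume each object returned by the task function gets added to a $tasks hash table keyed by name; a naive runner could then recurse through the dependencies depth-first:

# speculative sketch of a naive task runner (no cycle detection)
$tasks = @{}

function add-task($task) {
  $tasks[$task.Name] = $task
}

function run-task([string] $name, [hashtable] $done = @{}) {
  if ($done.contains($name)) { return }      # already executed
  $task = $tasks[$name]
  if ($task.Dependencies) {
    foreach ($dep in $task.Dependencies) {   # run dependencies first
      run-task $dep $done
    }
  }
  $task.ExecuteTask()
  $done[$name] = $true
}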

However, the point of this post isn’t to rebuild Rake in PS, but to show how PS rivals Ruby as a language for building internal DSLs. On that front, I think PowerShell performs beautifully.

I’m looking forward to using PowerShell’s metaprogramming capabilities often in the future.

Morning Doughnuts 3

  • What does it take to be an architect? Skyscrapr.net attempts to answer this question by asking a bunch of architects.
  • I have started teaching my children about astronomy. I found an open source product called Stellarium that is excellent for learning about the celestial objects visible in your area.
  • A Methodology for SOA adoption? I read an interesting blog post on this subject from a couple of weeks ago. It’s not a long article, but the author makes some interesting points, including an outline for SOA adoption.
  • I finally picked up Gears of War on Friday. It really isn’t a game I can see myself playing much, although I can see why it’s popular. I guess the best and worst part of the game is having to use cover so you don’t die right away.
  • Windows Live Writer is a great tool! I use it to author the posts for my website, and this week I have been using it on these Morning Doughnuts posts. My favorite feature is that you can preview your post and see exactly how it will appear on your website. This has been particularly useful since DevHawk and my site look quite different.