Tool Driven Development

I’ve gotten some great feedback on my Code is Model post. But before I get to those comments, I wanted to talk a little bit about the role of tools. Tools are important in every industry, but in no industry are they as important or central as they are to information technology.

One of the fascinating differences between IT architecture and “real world” architecture is the feasibility of automated transformation. If I have a blueprint of a house, there is no tool to automatically transform it into a house – someone has to do the actual work of putting up the walls, running the plumbing, etc. But in IT architecture, I can transform blueprints – i.e. the code – into the finished application automatically. That speaks volumes about the importance of tools in information technology.

Without the tools to automate the transformation, all IT models – code or otherwise – are nothing more than documentation.

So if we are to include models as part of our development process, we need tools to execute the transformations between levels of abstraction. But where do those tools come from? In the case of abstractions with widespread horizontal appeal, they are likely to come from language vendors such as Microsoft. For example, abstractions like generics and query expressions are useful in nearly every project, so it makes sense to include them in mainstream languages like C#. However, one of the key points of Software Factories is that in order to continue raising the level of abstraction, we are going to need to narrow the domain to which those abstractions apply.
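To make that concrete, here's a minimal sketch (the scenario and names are mine): the same generic method and query expression work unchanged whether the data is order totals, log entries, or test scores, which is exactly what makes these abstractions horizontal.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class HorizontalAbstractions
{
    // Generics: one method, any element type, any domain.
    static T Largest<T>(IEnumerable<T> items) where T : IComparable<T>
    {
        return items.Max();
    }

    static void Main()
    {
        var orderTotals = new List<double> { 250.0, 75.0, 120.0 };

        // Query expressions: filtering and projection over any data source.
        var large = from total in orderTotals
                    where total > 100
                    select total;

        foreach (var v in large)
            Console.WriteLine(v);

        Console.WriteLine(Largest(orderTotals)); // 250
    }
}
```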

Developing language-based tools is currently quite expensive, and therefore makes economic sense only in broad horizontal domains, such as user interface construction and data access, where the necessary investments can be amortized over a large customer base. For this reason, commercial implementations of the Language Framework pattern are supplied almost exclusively by platform vendors.

This economic model is threatened by the trade off between scope and value for abstractions. As Jackson puts it, the value of an abstraction increases with its specificity to the problem domain. The higher the level of an abstraction, the narrower the domain to which it applies, but the more value it provides in solving problems in that domain.

We are now at a point in the evolution of the software industry where frameworks and tools must become more vertically focused, in order to realize further productivity gains. They must provide more value for smaller classes of problems. This breaks the current economic model because markets for vertically focused frameworks and tools are too small to support platform plays, and because the domain knowledge required to develop them is housed mainly in end user organizations that have not traditionally been software suppliers.
[Problems and Innovations by Jack Greenfield]

In other words, these domain-focused languages, frameworks and tools are not going to come from traditional language vendors like Microsoft. They are going to come from organizations that have deep experience in the given domain, many of whom will not be traditional software vendors. I think SIs and ISVs will adopt this approach before many end user organizations do, but eventually any organization that invests in building reusable frameworks is going to want to invest in building languages and tools to automate the usage of those frameworks.

I think the best way to start developing these tools is to incorporate their construction directly into the software development lifecycle. I liken this to Test Driven Development. If you use a TDD approach, at the end of your project you end up with two outputs – the project and the unit tests. Assuming the code you’re building now (with its associated unit tests) is designed to meet current requirements, you shouldn’t need to refactor it significantly until those requirements change sometime in the future. The tests you build today become a tool to help you improve your code in the future when the requirements change. From a tool development perspective, Test Driven Development is a specific instance of the more generic concept of Tool Driven Development, as tests are a specific type of tool.
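To ground the analogy, here's what the TDD half looks like as a minimal xUnit.net sketch; InvoiceCalculator is just a hypothetical example class. The test is the durable output: when requirements change and you refactor, it's the tool that catches regressions.

```csharp
using Xunit;

public class InvoiceCalculator
{
    public decimal Total(decimal subtotal, decimal taxRate)
    {
        return subtotal * (1 + taxRate);
    }
}

public class InvoiceCalculatorTests
{
    // This test survives the current iteration and guards future refactoring.
    [Fact]
    public void Total_AppliesTaxRate()
    {
        var calc = new InvoiceCalculator();
        Assert.Equal(110m, calc.Total(100m, 0.10m));
    }
}
```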

What if, instead of building code test first, you thought about building tools first?

The problem with tests is that they are tightly bound to the current project. They may help you improve this project’s code in the future, but they won’t help you develop other applications. I think we should also focus on building tools that make it easier to build the current project. The goal would be to be able to use these tools not only on future iterations of the current project, but on other similar projects as well.
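Here's a hedged sketch of what “tools first” might mean in practice: a tiny project-specific code generator. The model format and every name in it are hypothetical, but unlike a unit test, a generator like this carries over to the next similar project.

```csharp
using System;
using System.IO;
using System.Text;

class EntityGenerator
{
    static void Main(string[] args)
    {
        // Each model line looks like: Customer:Name,Email,Phone
        foreach (var line in File.ReadAllLines(args[0]))
        {
            var parts = line.Split(':');
            var name = parts[0].Trim();

            var code = new StringBuilder();
            code.AppendLine("public class " + name);
            code.AppendLine("{");
            foreach (var prop in parts[1].Split(','))
                code.AppendLine("    public string " + prop.Trim() + " { get; set; }");
            code.AppendLine("}");

            // Emit one C# class per entity in the model.
            File.WriteAllText(name + ".cs", code.ToString());
            Console.WriteLine("Generated " + name + ".cs");
        }
    }
}
```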

Of course, we don’t really have the infrastructure to do this effectively today. We need to make it easier to build, compose, and evolve tools over time. Supporting evolution will be key, since these tools will likely need to be refined as they are applied to different projects. Test Driven Development has xUnit style frameworks; Tool Driven Development needs the equivalent. The DSL Tools and Guidance Automation Toolkit are a good start, but aren’t enough to enable Tool Driven Development on their own.
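To give a feel for what that equivalent might offer, here's a hedged sketch of one possible starting point: a common tool interface plus a trivial composition mechanism, so small tools can be chained together and versioned alongside the projects that use them. All of these names are hypothetical.

```csharp
using System;
using System.Collections.Generic;

public interface IProjectTool
{
    string Name { get; }
    Version Version { get; }

    // Transform input artifacts (models, files) into output artifacts.
    IEnumerable<string> Run(IEnumerable<string> inputs);
}

public class ToolPipeline
{
    private readonly List<IProjectTool> tools = new List<IProjectTool>();

    // Compose tools in order, fluent-style: pipeline.Then(a).Then(b).
    public ToolPipeline Then(IProjectTool tool)
    {
        tools.Add(tool);
        return this;
    }

    public IEnumerable<string> Run(IEnumerable<string> inputs)
    {
        foreach (var tool in tools)
            inputs = tool.Run(inputs);
        return inputs;
    }
}
```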

Do you believe this Tool Driven approach can work? What do you think Tool Driven Development tools would look like? Please let me know.

Comments:

Yes, I fully agree. One big part of that, it seems to me, is integration with source code control. Often, tools that are part of the build process are installed onto machines via an installer. That approach really breaks down once you have rapid evolution of the tools, in sync with your project's development. I think making rapid changes in these small tools is important, but I also think that they will NOT be terribly backwards compatible, unlike huge languages like C#. So I expect that tool T1 in version 3.0 will not be able to work on the models defined for building version 2.8; rather, there would probably be a process to transform the model accordingly in version 2.9. (I am not sure whether I am being very clear with this... Generally, I believe the solution will be to make sure each version of a small tool can handle input data from the previous version, not to make sure each tool can handle ALL old file formats.)

So either those tools should be stored as binaries within source code control, so that if I check out a branch for an old version and build, it uses the old version of the tool. Or, probably even better, the build should compile the tool as one of its first steps with each full build and then use that dll/exe later on in the same build process. Good automation there would be incredibly helpful.

Also, this is very important for the VS IDE DSL editors created with the DSL Tools. Really, what should happen is that if I check out a specific branch of my source code, the corresponding version of my custom DSL editor tool is dynamically loaded into the IDE. The approach I see right now, a separate install for the DSL that works across all versions within my source code depot, is really not very usable, since it would require my tools to be incredibly backwards compatible, which I don't really think brings any value for very small tools that might not be used across too many programs (and probably only internally, too).
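A minimal sketch of the one-step migration scheme this comment describes, with all names hypothetical: each tool release ships an upgrade from the immediately previous schema version only, instead of staying backwards compatible with every old format.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class Model
{
    public Version SchemaVersion { get; set; }
    public Dictionary<string, string> Data = new Dictionary<string, string>();
}

public interface IModelMigration
{
    Version From { get; }
    Version To { get; }
    void Upgrade(Model model);
}

public static class ModelUpgrader
{
    // Walk the model forward one version at a time until it is current,
    // e.g. 2.8 -> 2.9 -> 3.0, rather than jumping 2.8 -> 3.0 directly.
    public static Model UpgradeTo(Model model, Version current, IEnumerable<IModelMigration> migrations)
    {
        while (model.SchemaVersion < current)
        {
            var step = migrations.First(m => m.From == model.SchemaVersion);
            step.Upgrade(model);
            model.SchemaVersion = step.To;
        }
        return model;
    }
}
```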
Although I would like to see a DSL for my domain, I don't think that tool-driven development is a natural extension of test-driven development. In test-driven development, the goal is to define some functionality, define the tests, then build and test. Both the code and the unit tests are directly related to the specific release of a specific system. In tool-driven development, the goal is essentially to codify domain expertise in some form of DSL and/or a tool to execute the transformations from model to code. This tool delivery is only indirectly related to the delivery of a specific product and specific release, and often imposes a cost that cannot be borne by the budget of a specific release. The tools are best left to a "functional" group of people who are responsible for developing, evolving, and sustaining the product or subsystem over multiple releases. They will build the domain expertise over time, and they are most likely to get extra money in their budget to develop a tool because it helps them meet their objectives of codifying domain expertise for current and future work. Jay Godse