Chronological Narrative
There's a common thread through many of my earlier papers. It has to do with how difficult it is for organizations to use XML (previously SGML) and related technologies to develop semantically rich documents, processes, and systems. The following overview is intended to help put the individual papers in context. Alternatively, a complete list of files is also available.
I was formally introduced to the Standard Generalized Markup Language in May, 1992, at an "SGML vendor show" hosted by Boeing in Seattle. At the time, I worked for a subsidiary of Boeing at the Department of Energy's Hanford site. DOE had recently decided to use SGML to exchange Scientific and Technical Information.
When asked to develop an SGML implementation plan, I only vaguely remembered that SGML was a CALS standard and, like many other individuals, initially assumed that we would just convert our WordPerfect 5.1 files to SGML before shipping them offsite.
I quickly figured out 1) that the conversion strategy was bogus, and 2) that widespread use of SGML at Hanford could be one of the most important improvements to the site's computing infrastructure since the widespread use of microcomputers, but not without significant cost. A couple of years earlier, we had upgraded the site from WordPerfect 4.2 to 5.1 at a cost of nearly $1M. I estimated a site-wide upgrade to SGML would cost in the neighborhood of $5M.
But 1992 marked the beginning of a period of instability at Hanford and all investments in the site's computing infrastructure were looked upon with suspicion. Confronted by impossible requests to justify SGML investments based on their projected cost savings, I began looking for other measures of value.
Measuring the Strategic Value of Information Technology Investments documents some of my findings. It weaves together approaches by Hammer and Champy, Paul Strassmann, and others to argue that organizational performance (what an organization accomplishes), and not cost savings, is the true measure of the value of information technology investments.
The next paper in the series, SGML: It's Not Just for Documents Anymore, extends these thoughts and talks about applying SGML to the implementation of performance management systems. Most performance management systems (financial systems, project management systems, etc.) are based on traditional database technologies, and SGML-based implementations would seem to involve significantly higher risk.
As I struggled to articulate my rationale for accepting the perceived risk, I began to focus on flexibility and adaptation as elements of value. One of my associates, Bart Preecs, introduced me to Kevin Kelly's book Out of Control: The Rise of Neo-Biological Civilization.
This led me to differentiate organic approaches to information management from more traditional engineering and mechanical transformation models. Each of these approaches differs in terms of the system development process and the way that information, itself, is organized. These differing approaches, in turn, provide different, and potentially complementary, measures of value. From this perspective, SGML is incredibly important because it allows communities to better balance engineered and organic measures of value.
For the 1995 HyTime Conference, I focused on the subject of risk management and how policy and politics (the more human, and thus more organic elements of information management) are a primary cause of project risk.
SGML, HyTime, and Organic Information Management Models repeats some of the "Organic SGML" thoughts of the previous paper, but deals most directly with the dangers of implicit policy making by information technologists and explores the political risks that are inherent within the SGML and HyTime standards. These risks result almost directly from the fact that many organizations lack the sophisticated political mechanisms necessary to define their information management policies and are therefore unlikely to effectively manage the risks associated with SGML and HyTime implementations.
The paper describes two approaches for dealing with these issues. The first is Paul Strassmann's idea of basing organizational governance on the separation of powers described in the U.S. Constitution. The second is a stakeholder-focused, values-based negotiation process called AtStake.
Tools for Implementing SGML-Based Information Systems: Viewers and Browsers, Text Retrieval Engines, and CD-ROMs was prepared for the SGML'95 conference to provide an overview of SGML delivery tools. It also focuses on some of the issues of policy and value that are involved when selecting delivery tools and designing Document Type Definitions (DTDs). It explores the behavioral value of metadata, describes the shift in cost-benefit ratios that is typical of SGML initiatives, and details how the potentially divergent stakeholder interests that exist at different points in the information lifecycle can drive different markup strategies and different returns on investment. This paper was updated and presented again at the SGML'96 conference.
For the January/February 1996 issue of The Gilbane Report, I repackaged many of these ideas. Organic Information Management - A Better Way To Reengineer Information Systems is based largely on the HyTime conference paper but also integrates key concepts from the other writings. It emphasizes some of the ways that these approaches can protect information investments in the face of certain technology changes. The Gilbane Report article also extended the argument in two important areas.
First, it made the case that SGML, unlike more common information technologies, is largely policy neutral, and that this leads to a catch-22 when trying to construct a business case. The business value of SGML results largely from the markup strategy that is chosen, yet it is often not possible to define that strategy (and thus to quantify its value) until after considerable analysis and design, by which point much of the investment has already been made. In addition, the competing drivers for engineered and organic measures of value often prevent SGML from being an "obvious solution" during this critical pre-analysis investment phase.
Second, it looked at the practice of reengineering, both as described by Hammer and Champy and as it is often applied in practice. It examined reengineering from an organic perspective, looking specifically at issues of decision making and governance, and suggested approaches for reducing the potentially disruptive consequences of reengineering initiatives.
Other than repeated references to SGML (and not XML, which didn't appear until later that year), the article seems to have weathered the years fairly well, anticipating such topics as metadata frameworks, semantic standardization, and ontologies.
At the SGML'96 conference I also presented a paper entitled SGML: Process, Policy, and Politics. It took a closer look at the very real problems created by SGML's policy neutrality. It made the argument that because we are all idiots (The Dilbert Principle), decisions tend to be dysfunctional unless a large number of people are involved. Accelerating change drives complexity, further increasing the value of communities and the multi-contextual thinking that is beyond the reach of individuals.
Most organizations, however, rely on hierarchies to allocate decision-making authority to select individuals, driving many of the political problems that exist. The collaborative analysis associated with SGML initiatives points to more effective decision-making models, and when the needs of multiple document stakeholders are integrated, the result is often a class of metadata and markup that can be described as transformational, because it describes the transformations associated with a document.
By making these transformations explicit, differentiating various behavioral contexts, and documenting the rationale behind decisions, it is possible to use this meta-knowledge to drive more intelligent human and automated behaviors.
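As a purely hypothetical sketch (the element and attribute names below are my own invention and are not drawn from any of the papers or DTDs mentioned here), markup of this transformational kind might record the target behavioral context, the transformation it drives, and the rationale behind the decision, directly alongside the content:

```xml
<!-- Hypothetical example only: all element and attribute names are
     invented for illustration and do not come from any actual DTD. -->
<procedure id="sample-work-procedure">
  <title>Sample Work Procedure</title>

  <!-- One explicit transformation: how this content behaves in a
       specific context (delivery to field crews as a checklist). -->
  <transformation context="field-delivery" output="checklist">
    <rationale>
      Field crews need terse, numbered action items; the narrative form
      required for formal review obscures the steps to be performed.
    </rationale>
  </transformation>

  <!-- A second transformation serving a different stakeholder group. -->
  <transformation context="regulatory-review" output="full-narrative">
    <rationale>
      Reviewers need the complete discussion, references, and approval
      history rather than an abbreviated checklist.
    </rationale>
  </transformation>

  <step>Confirm that the work package has been approved.</step>
  <step>Verify required training before starting the task.</step>
</procedure>
```

Because the contexts and the rationale are explicit in the markup rather than buried in a stylesheet or in someone's head, both people and automated processes can act on them.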
In 1997, I was asked to prepare an Introduction to Document Management tutorial for the Documation '97 conference. Armed with an abstract prepared by the consulting company that had previously committed to doing the tutorial and a long list of buzzwords, I was left with the challenge of making sense of the entire document management industry. When the dust settled, I had concluded that document management technologies could be segmented based on the evolving concept of the document: paper documents, automated paper documents, and electronic documents.
This tutorial was offered again at Documation '97 East. Other than the company name, little was changed. For Documation '98, however, I found that the concept of electronic documents didn't quite reflect changes in the marketplace. The result was a more detailed conceptual model that differentiated paper, automated, electronic, and active documents. As I write this (April 2000), I would probably add a fifth category that describes documents as engineered knowledge artifacts.
As time goes by, I find myself getting pulled deeper and deeper into the field of Knowledge Management. The line between semantically rich documents and knowledge artifacts is a very thin one. The use of values-based decision making processes to integrate the interests of multiple stakeholders can fairly accurately be described as knowledge creation. In most situations, performance and intelligent behavior are the same thing.
For a number of years, I have worked with Bo Newman on a variety of Knowledge Management topics, including:
- Alignment theory
- The use of organizational culture and value systems to effect changes to tacit knowledge
- The mechanics of conceptualization
- Techniques which can be used to speed the understanding of dynamic problem and solution spaces
For the Documation '98 Conference, we set out to document some of this work. A Framework for Characterizing Knowledge Management Methods, Practices, and Technologies introduced many of the key terms and concepts, organized them in a form that could be used to characterize tools, provided examples of how to use the framework, and explored potential application areas.
With increased interest in Knowledge Management, I was also asked to speak at the XML World Conference in September 1999. The result was Emerging Information and Management Strategies: It's a Question of Balance. It described the phenomenon of organizational imbalance and its associated costs; some of the economic and cultural factors driving interest in Knowledge Management; and the use of ontologies to formalize conceptual balance. It concluded by examining some of the limits of conceptualization, exploring the principles of disruptive innovation, and describing the use of isolation to incubate new conceptualizations and produce what can be called "meta-balance".
Continuing in the KM vein, I repackaged the KM Tool Characterization Framework paper as a tutorial for the Documation 2000 conference in Montreal. Characterizing Knowledge Management Tools refined and extended the base ontology presented in the original paper. It also introduced new worksheets that use a series of focus questions to simplify the characterization process. Because the underlying KM models are so generic, we found that it is very easy for evaluators to get drawn into detailed analysis of the internal operation (mechanics) of the tools being evaluated. The focus questions help constrain the analysis by targeting the policy (behavioral) impacts of the tools being characterized.
I had also decided to use the Documation 2000 conference as an opportunity to begin formalizing and documenting my work applying Knowledge Management and Knowledge Engineering principles to the design and development of markup languages. I had discussed the possibility of writing a journal article on the topic for a couple of years, but hadn't found the time to begin serious work on it. Knowledge Management and XML represents a first effort at inventorying and organizing some of my work in this area.
While preparing for the Documation 2000 conference, I was invited to sit on a systems integrators panel at the Dynamic Content 2000 conference in San Francisco. I decided to talk about Dynamic Integration, focusing on the inability of traditional engineering and integration approaches to deal with the highly dynamic problem and solution spaces that are typically encountered when developing web sites for dot-coms and implementing other dynamic content systems.
I was also asked to speak at one of the next day's Knowledge Management panels. The result was Integrating KM Tools into Your Existing IT Infrastructure, a presentation which explored the problems inherent in the term 'KM tools', introduced a procedural framework for making the policy decisions necessary to develop a coherent knowledge management strategy, and provided a list of technical integration principles.
Revised 2009.09.08
Copyright, The Sagebrush Group, 2001-2009