Organic Information
Management — A Better Way to
Reengineer Information Systems

By Kurt Conrad, The Sagebrush Group

Prologue

One of the biggest challenges facing managers implementing new IT strategies is the rapid, and sometimes revolutionary, pace of technological change. Designing and implementing a comprehensive information system today means investing in an infrastructure that may have a distressingly short life span. Unfortunately, many of today's reengineered systems assume a more stable environment than exists, try to control too much, and dictate inflexible processes and workflows. This combination can be disastrous. And the bigger they are...

In this issue, Kurt Conrad (who was the Gilbane Report's very first subscriber 3 years ago!) provides a compelling argument for implementing systems that can adapt to the constant change that is an inescapable part of reality rather than monolithic systems that are not designed to evolve in manageable ways. What's more, Kurt relates his arguments to both SGML and the financial derivatives market! As consultants, we often find ourselves convincing organizations that the mammoth structure they have painstakingly designed at high cost is just too risky to implement. We think Kurt's article should be required reading for anyone looking at investing in a reengineering exercise.


Executive Summary

Strategic Overview

  • Many of the most serious risks that IS managers face come not from information technologies but from the need to synchronize technology and business process changes in order to improve organizational performance and adaptation.

  • Technology and business systems that are designed around organic principles can help organizations respond to environmental changes more quickly and reduce the risks associated with information technology initiatives.

What is Organic Information Management and Why is it Important?

  • Organic Information Management involves the application of organic and biological concepts to the design of computer applications and the decision-making processes that direct investment choices.

  • Mechanical approaches often fail because efforts to control costs at the workgroup level usually result in suboptimized, stove-pipe solutions that raise overall information lifecycle costs.

  • The computerization of business processes can damage organizational adaptation, flexibility, and responsiveness to customers.

  • Increasingly obsolete measures of productivity and organizational performance result in dysfunctional decision-making and investment strategies.

Application of Organic Information Management Concepts

  • The financial derivatives market provides a good example of the problems which are caused by managers who blindly surrender their policy-making roles to technologists.

  • Politics is a critical success factor. Properly integrated, the divergent stakeholder interests that add conflict and threaten project success can be converted into forces of support.

  • A formalized governance process, with well-defined distributions of policy-making, execution, and enforcement powers can eliminate disruptive political maneuvering, reduce micro-management, and ensure that information management initiatives improve organizational performance.

  • Values-based decision-making simplifies some of the more difficult organizational learning and polarization issues that come with attempts to balance stakeholder interests and corporate objectives.

  • Organic designs shift the emphasis from monolithic, enterprise-level solutions to smaller projects and design teams, reducing risk and expediting learning.

  • Fear and a surplus of white-collar labor in many organizations cause reengineering to be inappropriately used as a cover for downsizing.

  • Reengineering's emphasis on radical change and speed conflicts with the organic organization that it seeks to create. These conflicts can be seen in the way that decision-making and associated political issues are addressed.

Conclusions

  • The need for organizational flexibility and resiliency is driving the demand for Organic Information Management (OIM) approaches.

  • Companies should take note of the 600 BC author Lao Tsu, who wrote that ... Intelligent control exerts influence without appearing to do so. Unintelligent control tries to influence by making a show of force.


Strategic Overview

When it was introduced, the Gilbane Report promised to help IS managers make well-informed short-term business decisions and to formulate long-range strategies that reflect the risks and capabilities of document system technology. Many of the most serious risks facing IS managers today stem not from information technologies but from the change process itself: the need to synchronize technology and business process changes in order to improve organizational performance and adaptation.

Accordingly, organic and biological metaphors are becoming a useful way to describe and understand the challenges facing organizations today. As Kevin Kelly writes in Out of Control: The Rise of Neo-Biological Civilization, ...our strongest terms of change are rooted in the organic: grow, develop, evolve, mutate, learn, metamorphose, adapt. Nature is the realm of ordered change.

Modern information technologies do not automatically improve organizational performance and many of the common approaches for managing complex organizational and technical changes are simply not reliable. As an example, the most popular technology for synchronizing radical process and technical change — reengineering — is far from foolproof. Numerous books and articles, and even Michael Hammer and James Champy — the developers of the reengineering concept — report staggering statistics about the number of failed reengineering projects and dissatisfied organizations.

But for all of its dangers, the concept of reengineering is compelling and reflects a very real need. Modern information technologies are changing the very nature of work and making traditional organizational designs suspect. Organizations that are shifting to open information and document systems cannot realize full returns on their technology investments unless they also update their information management practices to match the capabilities of the new technologies.

Even more dramatically, many believe that structural changes in the economy — downsizing and layoffs, increased global competition, condensing of management levels — are occurring because we are just now beginning to learn how to use computers to improve organizational performance and productivity. Some even believe that we are entering a period of economic instability that is likely to last from five to 25 years while this learning process takes place. In the face of these pressures, the biggest challenge facing organizations is how to step up the pace of change with the greatest benefits and smallest disruption.

Unfortunately, many information management approaches are averse and even hostile to change. A fixation on mechanical efficiency often results in overly rigid information management systems that take too long to develop and can hamper flexibility by hard-coding a snapshot of the organization's processes and automation needs. Increasingly, corporations need especially modular, flexible, and adaptable information systems to both enable and keep up with new demands and associated process changes.

Organic Information Management (OIM) looks to biological metaphors and takes a human-centered — not technology-centered — approach to information. OIM also pays special attention to the distribution of decision-making authority within a company and how different ways of dispersing that power can affect organizational adaptation. Technology and business systems that are designed around organic principles can help organizations respond to environmental changes more quickly and reduce the risks associated with information technology initiatives.


What is Organic Information Management
and Why is it Important?

Fundamentals of Organic Systems

Organic systems are not engineered, they evolve. Instead of being monolithic, organic systems consist of multiple small components. Decision-making is distributed among these components. While this appears chaotic, it is the only way that large systems can reliably adapt to changing circumstances.

To begin the shift toward an organic approach, companies have to experiment with approaches in a decentralized manner. Experimentation and variations from the norm are central to learning. Starting small and having a trial-and-error mentality can help organizations learn what works for them and safely explore alternatives to the old way of doing things.

Decentralized experimentation is necessary to help incubate new approaches. Experiments are to be encouraged and given safe haven. In a very real sense, the more mistakes the better — especially if they are small errors that are made early enough to have little impact on the organization as a whole.

The more flexible a system is, the less optimized it will be. Autonomy is given to smaller modules that can fill separate needs. Instability and tension are important and necessary, since the only stable organic system is a dead one.

Organic Information Management involves the application of these organic and biological concepts to the design of computer applications and the processes which manage their implementation. Because of the need to improve overall organizational performance, OIM takes a holistic approach to information and management and expands from a narrow technology focus to include the decision-making processes that direct investment choices and the handling of authority in automated systems.

Why Mechanical Approaches Fail

The most prevalent organizational design and information management approaches can be traced back to Frederick Taylor and the ideas that he developed nearly a century ago for automating factories. These industrial engineering approaches are based on the concepts of mechanical efficiency, minimization of transformation costs, and an emphasis on resource optimization. Productivity and value are measured primarily in terms of the inputs consumed — material, labor, dollars, time — and the outputs produced. Efficiency results from segmenting and controlling the discrete steps in manufacturing processes. Strategic value results from improving the efficiency of existing processes and outpacing the competition.

Not surprisingly, many of these same industrial engineering principles have been applied to information management. Historically, analysts have sought to model existing paperwork systems and develop computer analogs that would monitor the flow of information. Where possible, processing would be shifted from humans to machines, where errors could be controlled and minimized. Today, most discussions about information management are still ruled by financial considerations, with project decisions based on projected short-term savings. This fixation on operational efficiency and financial measures of value has led to a number of problems.

Sub-optimization and Stove-pipe Solutions

The goal of mechanical efficiency is almost impossible to fully realize. Information lifecycles and their associated production processes tend to be more expensive than the engineered ideal because maximizing performance in one arena often comes at the cost of lower performance overall. From a mechanistic viewpoint, sub-optimization occurs because computer applications almost always target a subset of the information lifecycle and have a narrow corporate focus. In most organizations, the resulting stove-pipes of technology and information match the organization's functional hierarchy and hamper the efficiency of processes which must cross those functional boundaries.

From an organic viewpoint, sub-optimization occurs because the needs of individuals and groups cannot be completely subordinated to system designs. Accordingly, it is probably not possible to completely eliminate redundant software systems, unsynchronized parallel information streams, and duplicated development and maintenance costs. At the same time, open information standards such as SGML (see sidebars: Realizing Organic Value through Organic DTDs and Using SGML to Balance Mechanical and Organic Approaches) can help reduce some of the costly problems with sub-optimization: the over-reliance on paper-based information exchange, data integrity problems, and the need to recreate, reconcile, or perform expensive, labor-intensive conversions of information assets.

In a way, the commercial software industry is responsible for many of the biggest problems. Proprietary data encoding schemes provide individual vendors with competitive advantage, but they also help create the pockets of information that block information compatibility and process integration. In addition, most commercial applications retain, at their core, a paper paradigm — where the computer functions like a big, powerful pen. They draw very fast, can handle very complex images, and allow printing instructions to be revised, but they do little to manage the true meaning of the information. By doing so, they emphasize personal productivity, not overall organizational performance. In contrast, most open information management standards are better able to balance local variations and customization with the need to reuse and share information.

Lack of Flexibility and Customer Responsiveness

Accelerating economic and political change is forcing many companies to adapt and change faster than their information infrastructures allow. Many of these organizations have begun to realize that software isn't very soft. "The system won't let me do that" is an increasingly common complaint. Strategic business initiatives are often delayed or hampered by trying to adapt or modify legacy information management systems that weren't designed for such changes. In short, you can't streamline smaller, isolated systems that cannot even talk to each other at the most basic level!

In his Harvard Business Review article "What Really Makes Factories Flexible?", David M. Upton provides some examples of this phenomenon. Upton studied the relationship between Computer-Integrated Manufacturing (CIM) and operational flexibility within the fine paper industry. He found that in most cases, CIM reduced flexibility. The CIM systems operated the plants more conservatively than human operators, slowed changeovers, and produced more paper breaks than their human counterparts. In addition, operator skills atrophied from lack of use after CIM systems were installed. The biggest detriment to flexibility, however, was traced to management behavior. By dumping the problem on technology and (directly or indirectly) technologists, the managers used CIM to dodge their responsibilities for defining flexibility objectives and managing the organization to meet those performance goals.

Dysfunctional Decision-making and Investment Strategies

Defining productivity solely in terms of physical inputs and outputs started to lose viability in the 1930s as quality — not quantity — became a significant measure of value. Since then, the information content of goods and services has steadily increased. In the 1970s, with an even more noticeable shift from a manufacturing-based economy to a service-based economy, Taylor's productivity model lost almost all validity.

Accepted productivity models became invalid, in large part, because their methods of measuring value, such as return on assets (ROA) and return on investment (ROI), are based on the assumption that capital is the scarcest — and thus most valuable — input. Today, capital is a commodity item, crossing international borders at the speed of light. Human capital and knowledge are now the most valuable inputs. Individuals who can mobilize the productive capacities of others, protect the organization's future, and direct the efficient use of capital are scarcer than capital resources themselves, even though they are virtually invisible in productivity calculations.

Without productivity measures that reflect the changing value of information and human capital, however, organizations often find themselves trapped in dysfunctional decision-making processes that allocate resources poorly and make bad investment decisions.

For example, American productivity appeared to fall as white collar workers became an increasing portion of the labor force. By using the standard productivity models, many organizations counted the reduction of clerical and secretarial staff as a cost savings, even if their work was just shifted to higher paid managerial and professional personnel.

In The Business Value of Computers, Paul Strassman states that invalid productivity measures can force information management investments to be justified only in strictly financial terms, where — as with all other capital investments — the projected costs and benefits need to be estimated with some confidence. When applied to IT investments, however, these justification processes become counterproductive and prevent needed change. Even worse, requiring reliable savings estimates doesn't reduce the risk of projects as much as it biases projects towards the automation of routine clerical operations and away from those that also involve the upgrading of business processes.

In short, Strassman found that strategies to manage information technologies through project authorizations failed to improve organizational performance, increased risk — as long-overdue improvements were combined into larger and riskier projects — and failed to control costs, because initial purchase and implementation had become a shrinking percentage of lifecycle costs.


Application of Organic Information
Management Concepts

Decision-making, Policy, and Politics

Derivatives: A parable for information managers

The financial derivatives market has generated much attention. Kodak, Orange County, Procter & Gamble, Metallgesellschaft, and other high-profile organizations have suffered major losses or gone bankrupt as a result of speculating on derivatives.

These problems result, in large part, from complexity. Derivatives have evolved so quickly and become so complex that few can explain or claim to understand exactly how many of these investments will behave in the market. Understanding — and thus much of the decision-making — has shifted to a small group of technicians. As a result many senior managers have effectively surrendered their policy-making roles.

After first singing the praises of the new breed of financial analysts dubbed rocket scientists, organizations that choose to trade in derivatives have started to realize that financial decisions that were previously designed and implemented by specialists need to be monitored more closely from the very top of the organization.

A similar situation exists in the information technology arena. There too, senior managers have largely turned decision-making over to technologists. By selecting and implementing technologies in a relative vacuum, these technologists also implicitly define corporate policies. This is unfortunate, because the stakes of bad information technology investments are just as high, if not higher, than bad financial investments.

Unlike poor financial investments, however, the impact of poor information management policies and investments isn't as observable or as measurable. The harm done by inappropriate solutions can remain unrealized for years or even decades, hampering organizational performance and competitiveness long after the key participants are gone. CIO turnover, outsourcing, lost opportunities, and downsizing are just a few of the results of poor information investment strategies.

Converting Political Resistance into Support

Information professionals tend to look upon politics with disdain — the enemy of rational decision-making. Politics has a negative connotation. Political forces are dangerous, precisely because few organizations have adopted methods for identifying and addressing critical political issues and communicating the resulting choices in the form of well-defined information policies.

As a result, projects are put at risk. DeMarco and Lister observe in Peopleware: Productive Projects and Teams that projects which require no real technical innovation (such as accounts receivable systems) are failing. Technology is not to blame in the vast majority of these failed projects. Instead, politics is the perceived culprit.

Strassman echoes similar frustrations in The Politics of Information Management, recounting his experiences from the late 1950s: ...I was never sure how my technology plans would turn out because of the problems that emerged every time I started installing well conceived and formally approved computerization programs. It always came as a surprise that well-laid plans could go awry for no apparently good reason except for something I kept dismissing as mere company politics. Today, Strassman has concluded that Managing information systems is primarily a matter of politics and only secondarily a matter of technology.

When viewed from an organic perspective, politics is a critical success factor, perhaps even the singular critical success factor. Uncontrolled, the political forces which result from divergent stakeholder interests introduce uncertainty and can disrupt the success of important efforts. Properly integrated, those same stakeholder interests can be converted from forces of resistance to forces of support. Such alignment is not only an important mechanism for ensuring that information management initiatives provide sufficient organizational value to justify their cost, but it also reduces the need to base investment decisions solely on anticipated cost savings.

Formalizing Information Governance and Policy-making

Just as western culture has seen a steady progression of governance models, Strassman traces the evolution of computing as a political history, where power has shifted from centralized to decentralized institutions. Mainframe theocracies gave way to minicomputer hierarchies, which in turn were displaced by the microcomputer revolution and today's age of cooperative alliances.

Strassman argues that organizations today are poised to enter an age of information democracy, where decision-making is formally distributed throughout the organization using the federalist principles of Jefferson, Madison, and Adams to balance power between a central authority and a number of constituent political units. Specifically, Strassman suggests the establishment of a formalized governance process, which is built around an information management policy board that has sole authority over policy decisions. This board functions like a legislature to define policies involving standards, technologies, and implementation practices. Executive powers are split among a variety of enterprise, process, business, application, and local units that have operational responsibilities and are in the best position to decide how to apply information technology to improve business processes. An auditing organization is vested with judicial powers. In all cases, the representatives of the governance processes are all managers, because all management is information management.

Perhaps the most sensitive issues involve conflict resolution and enforcement. Formalizing information management policies is of no value if they can be ignored, misinterpreted, or used for political gain. Although clear lines of accountability will resolve many of the conflicts that exist today, swift and competent interpretations are necessary to develop the precedents on which future policy changes will be based.

Reliable enforcement can also help to eliminate micro-management. Without an ability to trust that policy objectives will be fulfilled, managers often start directing the execution of work activities and fixating on minute operational details — in hopes of controlling performance and results.

According to Strassman, such a formalized governance process promises to put into place a framework by which individuals and organizational units can cooperate while minimizing the debilitating effects of contentious politics and formalize the information policies that are needed for improved organizational performance. Computer staff can safely deliver superior results only if they are executing an unambiguous information policy. Fuzzy objectives and ill-defined means lead to confused execution. Under such conditions, it may be a matter of luck by blundering into success if what is delivered matches what management alleges that they always wanted.

The Use of Values to Integrate Divergent Stakeholder Interests

Policy is the codification of an organization's desired balance among competing values. Thus the process of defining policies is a political process that pits the values, interests, and desires of groups and individuals against each other.

Political conflict often results from the polarizing effects of group decision-making. As Strassman points out, Politics is the art of the feasible, and one of the primary challenges in group decision-making is to identify and define a shared vision for the future. Joe Beck, of Eastern Kentucky University, has concluded that values, not facts, form the basis for most decision-making and that if agreement can be reached on the values that should drive a decision, consensus can usually be found.

Beck has developed a predictability model called VIE to translate values, interests, and expectations into a range of acceptable options, effectively defining the conceptual space for politically viable solutions. Each individual brings with them a set of unacceptable solutions. When those are combined at a group level, the resulting range of tolerance is usually much smaller than for any given individual. When VIE analysis is applied to a diverse group of stakeholders in a facilitated setting, however, the resulting range of tolerance can be much broader than would be expected in political models that promote polarization. Beck's facilitation model provides a collaborative framework for learning and negotiating a shared consensus of how individual values should be integrated and balanced to define mutually-desired end states.

After achieving alignment of the stakeholders around a shared vision, the group can begin to identify mutual drivers and constraints, which are marked as key issues to be considered during the development of future strategies. This is similar to marking land mines in order to navigate around them. Dispute resolution occurs by focusing on mutual long term agreements rather than fixating on areas of disagreement.

Beck developed this approach to address the political issues associated with the creation and implementation of environmental policy, responding to the inability of traditional strategic planning to converge external and internal stakeholder input. He has tested the approach with over 100 groups during the past few years on high-conflict-potential projects and successfully aligned diverse and contentious groups of stakeholders dealing with a wide array of issues. These successes include the definition of new missions for international governmental agencies, the development of enterprise-level information policies, and the structuring of treaty negotiations between antagonistic governments to resolve sensitive national security issues.

When used at the project level, VIE analysis integrates the diverse views of multiple stakeholders into the strategic planning activity, but leaves decisions to the true decision makers — not planners or technologists. This ensures that projects better reflect perceived requirements, have supportive constituencies, and facilitate the learning and sharing of knowledge necessary for wise decisions.

Application Design

To be effective, computer applications need to reflect the messy ways that people really produce, deliver, and use information to effect change. Thomas H. Davenport writes in the Harvard Business Review that ...a human-centered approach assumes information is complex, ever-expanding, and impossible to control completely. The natural world is a more apt metaphor for the information age than architecture. From this holistic perspective, all information doesn't have to be common; some disorder and even redundancy may be desirable.

Unlike traditional information management, which is based on the concepts of mechanical systems, Organic Information Management is designed around biological concepts, such as learning, adaptation, competition, and chaos. Organic designs are concerned less with mechanical efficiency and more with how the application helps organizations, groups, and individuals learn to respond to environmental changes.

Organic designs assume that the tools and techniques used today will be replaced at an ever accelerating pace, evolving and adapting through time. They abandon the idea that analysts can identify and formalize every possible relationship between pieces of information. Instead, they distribute decision-making to assure maximum flexibility to individuals and workgroups, within the limits of enterprise and institutional requirements for information to be accessible and usable throughout its lifecycle.

Organic designs reflect the way that people actually produce, deliver, and use information to effect change. They allow individual workgroups to work with their stakeholders to structure their information holdings based on process-specific requirements and provide a framework for individuals and groups to describe and document relationships as they are uncovered (or learned).

Avoid Comprehensive, Ideal Solutions

Monolithic software engineering projects, especially as reflected in the data dictionary and enterprise architecture planning schools of thought, are close to death. Relational database nirvana — all-encompassing relational database architectures — will not be reached. The main reason is time. Most organizations do not have time to wait. By the time comprehensive database architectures are designed, there's a good chance that the organization's understanding of its needs, and the organization itself, will have changed. By the time the system is implemented, the organization may not even exist.

Risk increases dramatically with the size of the effort and can be minimized by keeping projects and design teams small. While this tends to promote sub-optimization and violate many lifecycle information management principles, chances are that the solutions would have had mistakes regardless of how well-designed they were. The idea here is not to avoid mistakes, but to make as many mistakes as quickly as possible. This facilitates the learning and understanding which are necessary for the evolution of successful solutions. Also, a series of small mistakes is usually far less disruptive than large, expensive ones.

Monolithic systems are also unlikely to meet the needs of all users or be culturally acceptable. Davenport states that Some managers have always been distrustful of the information systems approaches of their companies, largely because they didn't understand them. In many cases, they were right to feel uneasy. As the diverse company experiences suggest, grand IT schemes that don't match what rank-and-file users want simply won't work.

Rely on Relationships

Organic designs do not abandon engineering altogether, but shift the emphasis from comprehensively engineered systems to well-engineered components that can be integrated in a number of useful ways. This integration is performed by the users of the system — not the system designer.

Integrating information management components as needed through time reduces risk and improves organizational flexibility. Risk is decreased because the system designer is not challenged to identify all possible uses for the information objects and every needed software behavior during development. Likewise, component engineering is smaller in scope and shorter in duration than system engineering, and that also lessens risk.

Organizational flexibility is improved when humans do what they do best: interpret situations, weigh alternatives, and apply information and tools to meet task objectives. Organic designs do not lock individuals into predefined behaviors, but give them the information tools that they need to solve unanticipated business problems and preserve their value to the organization. Both individual and organizational flexibility are enhanced when ...the system [is] designed from the outset to help workers make better decisions rather than to cut them out of the decision-making process. [Upton]

Allowing individuals and groups to define their own interfaces and link them to important data elements is another attribute of organic designs. Allowing individuals to preserve and document the relationships that they find useful also begins the transition to knowledge-based systems, where both the system and the human become more knowledgeable the more they interact. Shifting from formalized structures to user-defined relationships helps to preserve investments, especially when the information technology initiatives have strategic value and are intended to create destabilizing change, either to the way that the business operates or the way that people think.

Diagnose Before Prescribing

Like organisms, organizations get sick when they can't adapt to change. In The Stress of Life, Hans Selye used the phrase General Adaptation Syndrome to describe an organism's response to stress. By improving an organization's ability to adapt, and thus its well-being, computers can have therapeutic effects. Both mechanical and organic approaches to information management can have value, but both can also be misapplied. A specific therapy should vary with the condition of the patient and the cause of the malady. Thus, differentiating organic and mechanical goals, requirements, and deficiencies is a critical diagnostic process.

In most cases, an effective computer system balances mechanized and organic elements. Numerous information management components will need to be designed and engineered very carefully for an organic approach to work, but OIM-based systems are characterized by intentionally limiting such engineering to the component and sub-system levels. This leaves much of the final integration to be done by humans in the form of defining relationships between components or executing some of the behaviors that would otherwise be automated. Such an approach is likely to be more expensive than fully automated solutions when comparing isolated transactions, but it preserves flexibility and resiliency and is thus almost certain to lower long-term costs by protecting the value of information technology investments.

Reengineering

Reengineering is the Jekyll and Hyde of process improvements. On one hand, it is very popular. Nearly 2 million copies of the book Reengineering the Corporation have been sold worldwide. Thousands of organizations have active reengineering programs. On the other hand, horror stories abound and criticisms are easy to find. By focusing on its organic aspects, it is possible to get a better understanding of what reengineering is all about and why so many individuals and organizations have trouble applying its concepts.

Reengineering has strong organic foundations

As described by Champy, reengineering is all about building a fundamentally better place to work. A reengineered organization should support profound questioning of organizational purpose and procedures and encourage radical experimentation. Reengineering is based on trust and a belief in an expansive human potential — that as jobs become more complex, people will be able to do the work. Jobs will become more rewarding and take better advantage of the pluralistic thinking of multi-skilled individuals. A reengineered organization is built around multiple goals and integrates the diverse thinking of a variety of technical experts and stakeholders. Decision-making is distributed, and authority isn't derived from position in the org chart but from how individuals contribute to the success of the organization.

Reengineering is very much an organic process. So much so that the primary challenge for an organization starting a reengineering program is one of being. That is, to understand its fundamental purpose and the realities of its position within its environment — improving organizational focus through shared vision and goals. Reengineering also acknowledges the idea that change changes change. New levels of efficiency, processes, and technologies will often change the very nature of what the business is all about and eliminate the fragmentation found in stove-pipe organizations by resynthesizing work along process flows.

But in practice, it is usually something quite different

In practice, reengineering is usually something very different from the organic approach described above. Organizations do a variety of things and call them reengineering. In many cases, it is simply a cover for attacking the productive capacity of the organization, replacing voluntary agreements that are mutually beneficial with exercises of power and forced self-sacrifice.

Meg Wheatley describes this phenomenon in Leadership and the New Science: Reengineering is the supernova of our old approaches to organizational change, the last gasp of efforts that have consistently failed. What is reengineering but another attempt, usually by people at the top, to impose new structures over old — to take one set of rigid rules and guidelines and impose them on the rest of the organization? It's a mechanical view of organizations and people — that you can `design' a perfect solution and then the machine will comply with this new set of instructions.... Reengineering doesn't change what needs to be changed most: the way that people at all levels relate to the enterprise. We need to be asking: Has the organization's capacity to change increased and improved?... Or have we just created a new structure that will atrophy as the environment shifts?

Even Champy is appalled by the way that reengineering is being used as a cover for destructive downsizing, having seen numerous organizations reduce the number of workers without changing the fundamentals of the work process. Organizations that reengineer by simply downsizing (leaving the remaining people to figure out how to cope) do little but reduce their functional capacity and create organizations that are populated by the walking wounded. In a January 1996 speech, Champy even expressed a fear that a generation has been taught to grow by cutting.

Why has Hammer and Champy's reengineering concept caused so much trouble?

The human factor was often overlooked.

Reengineering promises cost savings and smaller and more efficient organizations, but many confuse cause and effect. A more productive organization that has aligned its business processes with technology improvements usually can do more with less. On the other hand, simply cutting budgets and staff does not, in and of itself, cause business processes to be changed and new technologies to be implemented. In most cases, such cuts undermine change and innovation. A period of investment is usually required before returns can be realized.

New competitive pressures, including white-collar layoffs that were virtually unheard of before 1992, are causing stress, fear, and even depression. In most cases the fears are quite rational, but many companies and managers are unresponsive to the challenges facing them, choosing instead to deny the future and failing to act in ways that will protect the organization's future. As the popularity of the Dilbert comic strip testifies, most people work in organizations that they perceive as clueless and dysfunctional. The fight/flight response is very well documented. It is a common human tendency to destroy threats which can't be avoided or fixed.

Hammer and Champy mislabeled the concept.

Organizations aren't engineered — they evolve. They have more in common with tribal cultures than bricks and mortar because they are built of humans and human relationships, with all the historical baggage, learning, and experience that humans bring to every activity. Treating people like machine parts is not viable, especially if you are seeking to mobilize their creative and adaptive capacities.

But a book titled Renurturing the Corporation probably wouldn't have sold as well. Engineering metaphors are, understandably, more culturally acceptable to the intended audience. It is also understandable that a radical, totalitarian approach appeals to executives who can tell that power is slipping from their fingers, but in ways that they don't understand. It is as if the lesson of the Soviet Union's failure with command-and-control governance has been forgotten.

Even more fundamentally, this quest for mechanical efficiency in human systems is based on idealized views of what science and engineering are all about. As Thomas S. Kuhn and Henry Petroski write — in The Structure of Scientific Revolutions and To Engineer is Human: The Role of Failure in Successful Design, respectively — change in both disciplines is anything but straightforward. The evolution of scientific thought is not a smooth progression of new ideas, but is paced by competition, territoriality, perceptions, and paradigms. Fundamental change in engineering results from learning from mistakes, some of which have disastrous consequences.

Reengineering lacks a process model and reliable mechanisms for executing the change process.

In effect, reengineering advocates design by anecdote and prescription without diagnosis. Hammer and Champy provide extensive examples of what they have observed to be best practices, but many of these practices are contradictory. Without an understanding of the fundamental dynamics and mechanics of these practices, organizations have no real way of determining which of the contradictory approaches are best suited to their own situations. For the most part, Hammer and Champy do not provide any analysis of why these best practices work, or when and how they should be applied.

This is not a small problem. If organizations are candidates for reengineering because they are dysfunctional and their ability to adapt and act rationally in the face of new challenges has been inadequate, that points to a failure of the organization's fundamental decision-making and governance mechanisms. Such an organization cannot be expected to successfully pick and choose from a list of best practices. In fact, their history implies the likelihood of their selecting inappropriate solutions. Accordingly, a formalized method of analysis and execution is needed to help organizations determine how to apply the concepts of reengineering and develop healthy adaptation mechanisms.

Reengineering emphasizes radical change and speed.

Reengineering is focused on the goal of making big change happen quickly. Revolutions are never painless. They are also risky and hard to control. In its own way, forced radical change reinforces the tendencies of organizations to avoid deliberative analysis and planning and plays on feelings of fear and hostility towards the status quo. Organizations often get into trouble by making big changes just to impress others. Reengineering just gives that behavior a neat name.

Despite the goal of building a better place to work — and the many organic elements of that objective — the desire for rapid change causes Champy to dismiss organic approaches to decision-making during the reengineering process. He contends that because organic approaches move incrementally, they adapt far too slowly and thus, top-down (non-organic) approaches are necessary to drive the needed changes.

This raises the question: which is faster at effecting change — consolidated or distributed decision-making? This is not a small topic, to say the least. But an argument can be made that the principal barrier to rapid change is that the hierarchical, command-and-control structure of most organizations is averse to distributed decision-making. At some point the reengineered organization is to have shifted from a command-and-control management system to one which is focused on mobilizing and enabling and where accountability and control are dropped deeper into the organization. When and how are the more responsive governance mechanisms needed for Champy's vision supposed to be implemented, and why aren't they used during the transition process?

The confused way that reengineering handles organizational politics mirrors this conflict. Champy believes that the larger the scale of the change, the more likely it is to succeed. In his view, the fragmentation which exists in most organizations gives everyone a perspective from which to argue against change, and organizational antibodies will attack and probably kill the smaller, incremental changes. Not surprisingly, he finds the strength and convictions of these divergent stakeholder interests to be stronger and more pronounced when dealing with intelligent people and smart organizations (e.g., universities, hospitals, research, engineering, and technical organizations).

In effect, rapid and radical change is used as a weapon to force stakeholder interests to be surrendered instead of addressed through a deliberative process. Not surprisingly, this doesn't work. And accordingly, one of reengineering's best practices is to induce genuine debate that will result in disagreements. Many organizations that have skipped the debate stage have found their reengineering efforts ignored and treated as irrelevant, just as other technology and business initiatives that fail to integrate and align themselves with the organization's various stakeholders usually find themselves without a sufficient following and are soon crushed by company politics.

Rapid radicalism is not only dangerous, but it is not the only way to realize rapid results. As Beck's experience applying his VIE model documents, consensus can be quickly formed by replacing the detailed analysis of current processes with detailed analysis of stakeholder values and interests. It is also much safer.


Sidebar: Realizing Organic Value
through Organic DTDs

The Standard Generalized Markup Language (SGML) is well suited to supporting the human-centered approach to information management that Davenport advocates. SGML is a user-driven standard and the industry has evolved with a strong emphasis on usability and interchange. SGML was designed to support multiple representations, allowing interfaces to be customized for different sets of users. The standard is flexible enough to allow individual organizations to determine how stringent or forgiving the conformance testing should be and the formalized structures can not only act as interface specifications but provide a conceptual framework—speeding learning and focusing attention on content.

By making retrieval easier and more cost effective, rich metadata vocabularies can enhance the way that people interact with documents to improve collaboration, learning, decision-making, innovation, and the acquisition and development of knowledge. Perhaps even more important, however, are the conventions that have evolved for the modeling of information (document analysis) and their formalization as Document Type Definitions (DTDs). Davenport places particular emphasis on having individuals design their own information environments, because that participation directly influences the willingness to use the resulting conventions.

The need for participation brings with it a whole new set of issues. First, non-technologists must be kept engaged in the modeling process. As compared with the use of relational database technology, SGML makes it easier to construct and interact with complex hierarchical information models, like those found in dictionary entries.

Second, Davenport observed that The more a company knows and cares about its core business area, the less likely employees will be to agree on a common definition of it. SGML makes it possible to "assume transience of solutions," assume multiple meanings of terms, and build point-specific structures. To support these additional requirements, organic DTDs are likely to differ from interchange DTDs in a number of ways:

  • Organic DTDs are built with the expectation that they will change as the organization learns and adapts to new environments.

  • Organic DTDs avoid the use of standardized generic identifiers. Different workgroups may need to retain their own vocabularies. The architectural forms construct, which was developed for the HyTime standard, allows interpretation and processing to be attached to attributes instead of element names. Using architectural forms, workgroup-specific vocabularies can be mapped back to corporate and industry standards, while retaining local flexibility (see the sketch following this list).

  • Organic DTDs support a wide variety of compatible structures. Just as receptor sites accept a variety of compatible chemicals, organic DTDs contain a mixture of standard and non-standard structures. The standard structures are designed to bond with standard systems and non-standard structures are tuned to exploit localized systems. Such an approach is critical to "maximize the fringes" and promote experimentation.

  • Organic DTDs allow a wide variety of metadata models to be developed and applied. As the quality of the metadata largely determines the usefulness of the information, information objects effectively compete for attention based on the quality of their metadata. Metadata requirements are expected to change through time, and this evolutionary process, too, reflects organizational learning. High-quality metadata models support the shift from information management to knowledge management.

  • Organic DTDs rely on self-organization. Instead of organizing information through the centralized design of complex structures, the decentralized interaction of information providers, consumers, and software systems create and maintain order. Hypermedia links are a primary mechanism for organizing and integrating large bodies of information, allowing DTDs to become smaller and easier to develop. Comparatively simple chunks of information are used as shells to hang HyTime links. Like metadata models, sets of links (webs) will compete for resources based on their utility.
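To make the architectural forms idea mentioned above more concrete, the following sketch shows how two workgroup vocabularies might be mapped to a single corporate element type through a fixed attribute. The element and attribute names (labnote, fieldmemo, corpform, note) are invented for illustration; a production implementation would follow the architectural form conventions defined in the HyTime standard itself.

  <!-- Hypothetical DTD fragments: each workgroup keeps its own element
       name, but a fixed attribute maps it to the corporate form "note". -->
  <!ELEMENT labnote   - - (#PCDATA)>
  <!ATTLIST labnote
            corpform  NAME   #FIXED "note"  -- corporate vocabulary --
            author    CDATA  #IMPLIED>

  <!ELEMENT fieldmemo - - (#PCDATA)>
  <!ATTLIST fieldmemo
            corpform  NAME   #FIXED "note">

A corporate-level process can then key its behavior off corpform="note" and treat labnote and fieldmemo identically, while each workgroup continues to author in its own vocabulary.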

SGML also allows — and perhaps encourages — local variations. DTDs can be customized within an individual instance. This has the potential of being very chaotic, and is much in keeping with the organic models of decentralized autonomy. Together with the HyTime standard, however, SGML provides mechanisms to embrace the chaos. Because the variations need to be explicitly defined, receiving systems can more easily identify them and determine whether an individual instance contains enough of the required elements to be usable.
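As a simplified illustration of instance-level customization, the fragment below assumes a shared corporate DTD (the hypothetical corp-report.dtd) that exposes a parameter-entity hook (%local.misc;) in its content models; the internal declaration subset of a single document then extends the shared model with a workgroup-specific element. All names are invented for this sketch.

  <!DOCTYPE report SYSTEM "corp-report.dtd" [
    <!-- Declarations here are read before the external DTD, so this
         instance can redefine the hook and add its own element.     -->
    <!ENTITY % local.misc "| fieldobs">
    <!ELEMENT fieldobs - - (#PCDATA)>
  ]>
  <report>
    <title>Quarterly Inspection</title>
    <fieldobs>A locally defined observation, declared explicitly enough
    for a receiving system to recognize and either use or ignore.</fieldobs>
  </report>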


Sidebar: Using SGML to Balance
Mechanical and Organic Approaches

The ability to both standardize and customize is likely to become increasingly important as workgroups and organizations are pressured to quickly adapt to changing circumstances and are driven to pursue divergent information management strategies. By supporting both mechanical and organic approaches, SGML and its companion standards (especially HyTime) provide a framework for developing shared solutions that balance competing drivers and help ensure organizational health in the face of uncertainty. SGML allows organizations and workgroups to emphasize either engineering or organic approaches to realize immediate benefits but does not preclude evolving to a more balanced approach in the future.

The challenge to system designers is to provide a framework where local experimentation and variety do not become barriers to information exchange and where investment decisions can be made at a tactical level and still meet strategic requirements. SGML is both flexible enough to meet local requirements and formal enough to support enterprise-wide exchange; these characteristics are important precisely because they allow local, tactical solutions to be integrated and shared within a common information management architecture.

Improving Mechanical Efficiency

The classic view of SGML has been as an interchange standard. This use of SGML promotes mechanical efficiency, minimizes data transformations, emphasizes resource optimization, integrates and augments a wide variety of information technologies, and protects information from technology changes. SGML can reduce overall information lifecycle costs by formally defining information structures that meet the needs of a wider variety of information producers and consumers, thus reducing the barriers and costs associated with sub-optimization.

Formalized structures can function as interface specifications between dissimilar systems. Formal validation processes allow individual sets of information to be tested for conformance. Taken to its extreme, a formalized structure can act as a common denominator throughout the information lifecycle, allowing the same information objects to be used for creation, intermediate processing, and archival. SGML can reduce the need for duplicate systems and costly data conversions and make it easier to retrieve, recycle, re-purpose, and reformat information.

Rules-based formatting reduces cost. Lifecycle labor costs can be dramatically reduced as authoring and editing for appearance disappear. Consistency and quality are often improved, as are cycle times. Rules-based formatting dramatically improves flexibility and accounts for much of the reusability and repackaging described above.

The dynamics of an engineering-oriented SGML project are to pick a document type, stake out as much of the information lifecycle as possible, and try to get everyone to agree on a single acceptable structure. On the systems side, information producers and consumers will use the resulting SGML DTD as an interface specification and build or adapt information technologies to understand, produce, and interact with conforming documents. Because of the magnitude of the associated investments, these interchange DTDs are meant to be stable artifacts that are used as fixed points of reference for years — or even decades.
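A minimal sketch of what such an interchange DTD fragment might look like follows. The element names are illustrative only; the point is that every producing and consuming system treats the declared structure as a fixed interface specification against which individual instances can be validated.

  <!-- Hypothetical interchange DTD fragment shared by all systems. -->
  <!ELEMENT manual   - - (title, chapter+)>
  <!ELEMENT chapter  - - (title, para+)>
  <!ELEMENT title    - - (#PCDATA)>
  <!ELEMENT para     - - (#PCDATA)>
  <!ATTLIST manual
            docid    CDATA  #REQUIRED  -- stable reference number --
            version  CDATA  #IMPLIED>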

Stakeholder Interests and Metadata Requirements

But SGML is an incredibly challenging information management standard. At a fundamental level, the challenges associated with implementing SGML are equivalent to the challenges associated with learning how to use computers and information technologies to improve organizational performance. This is because SGML is one of the better examples of revolutionary computer technologies that were not designed to conform to the paper paradigms of the past but to meet the more complex requirements of information-based economies.

SGML's flexibility allows organizations to do things that could not be done as easily any other way — to make big gains. That same flexibility also gives organizations the ability to do things that are not in their best interests — to make big mistakes. Many leading-edge SGML projects are perceived as being so important to their corporations' success that the companies won't even discuss them, believing that the mere conceptualization of an SGML application can have strategic value.

Understanding metadata (data about data) is at the core of understanding these challenges. Information, by itself, is not terribly valuable anymore. There is simply too much of it. Metadata, by contrast, is increasing in importance because it provides the hooks needed by computers to determine how to process the data and the handles needed by humans to help identify which pieces of information are relevant to their interests.

What is metadata? The SGML tags within a document instance are metadata. They describe the role of each element within the context of the document's structure. Attributes are metadata, as they further describe important characteristics of the data within the SGML instance. Titles, authors, publication dates, and index numbers are metadata, as are annotations, bookmarks, and other navigational aids.
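The distinction is easier to see in a small instance fragment. The element names, attributes, and values below are invented for illustration: the text between the tags is the data, while the tags and attributes that surround it are metadata.

  <article docid="12345" pubdate="1996-02">
    <title>Organic Information Management</title>
    <author>Kurt Conrad</author>
    <para status="draft">The body text is the data; the markup that
    identifies its role and characteristics is data about that data.</para>
  </article>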

TV Guide is one of the best examples of metadata and its increasing importance. With the exception of the horoscopes and advertisements, TV Guide is almost entirely metadata, and not so long ago, Wired magazine reported that TV Guide makes more money than the four major networks combined.

When SGML is used to develop a vendor and processing-neutral markup language, the resulting Document Type Definition (DTD) is a formalized framework for capturing and storing metadata. DTD development is often described as a contact sport because of the conflict that results when a varied group of information producers and consumers attempt to develop common definitions. This is because a DTD (or metadata framework) represents a negotiated balance between the divergent stakeholder interests that exist at different points in the information lifecycle. In most cases, those interests diverge precisely on the types of metadata that should be stored and managed.

It is not uncommon, for example, for authors and editors to desire a simple markup language that is easy to use. Information consumers, on the other hand, usually desire richer, more complex sets of metadata. Rather than settling for a DTD that reflects only the generic structures of the document (e.g., chapter and title), they prefer tags that capture the meaning of the data (e.g., purpose, scope, rationale, part number, voltage, person, software package, company). Rich metadata allows documents to function more like databases and can have important benefits when using retrieval tools that support context-sensitive searches.
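
One hypothetical way to see the difference is to mark up the same sentence both ways (element names invented for illustration): generic structure tells a tool where the text lives, while semantic markup lets it answer questions, so the document starts to behave like a database.

```python
# The same content marked up two ways (element names invented for illustration).
import xml.etree.ElementTree as ET

GENERIC = ET.fromstring(
    "<chapter><title>Cooling Pump</title>"
    "<para>The P-101 pump motor runs at 480 volts.</para></chapter>"
)

SEMANTIC = ET.fromstring(
    "<chapter><title>Cooling Pump</title>"
    "<para>The <partnum>P-101</partnum> pump motor runs at "
    "<voltage unit='V'>480</voltage> volts.</para></chapter>"
)

# Against generic markup, the best a tool can do is full-text search.
print([p.text for p in GENERIC.iter("para") if "volts" in p.text])

# Against semantic markup, a tool can answer a directed question:
# "what is the rated voltage of part P-101?"
for para in SEMANTIC.iter("para"):
    part = para.find("partnum")
    volt = para.find("voltage")
    if part is not None and volt is not None:
        print(f"{part.text}: {volt.text} {volt.get('unit')}")
```

The richer markup is harder to author, which is exactly the tension between producers and consumers described above.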

Disruption of the Status Quo and its Policy Implications

For most organizations, the transition from proprietary, page-based document production and management architectures to SGML will be one of the most destabilizing efforts ever attempted. The adoption of these standards means significant changes to tools, processes, responsibilities, and even the way people think about information. The transition to an SGML-based document management process generally shifts the cost burden upstream and shifts the realization of benefits downstream. Often, the magnitude of these changes becomes even greater if the organization seeks to maximize the returns on its SGML investments and align production processes with the new paradigms.

Taken together, divergent metadata requirements, shifts in cost-benefit profiles, and process changes are not technology issues; they represent policy choices that a given organization needs to make. In most cases, the most compelling reasons for adopting SGML are not to improve the efficiency of existing operations but to realize long-term policy goals. For many organizations, however, the policy implications of SGML are its most challenging aspects. Identifying and balancing competing policy objectives is often difficult and chaotic.

SGML effectively forces organizations to formalize an information policy, often for the first time. This is not easy. It also goes a long way towards explaining why it is so difficult to develop a business case for SGML. The potential benefits of using SGML are well known and fairly well documented, but many of the benefits are tied to the specific metadata that the organization chooses to implement—not the standard itself.

It is precisely because the SGML standard leaves so much to be decided by implementing companies that a large number of economic, organizational, and technical factors influence both the design choices and investment decisions required to implement SGML. Some of SGML's potential benefits are concerned primarily with mechanical efficiency, while others deal with human interaction and performance. The choices that an organization or project team makes when balancing these competing measures of value have tremendous impact on how (and even whether) potential and intended benefits are fully realized.

For many organizations, real value cannot be easily measured in strictly financial terms. They require the richer, more expensive metadata that can be difficult to justify using only calculated costs and benefits. At the same time, these organic measures of value can be central to the SGML implementation effort and a major source of strategic value. For example, as the information density of business transactions continues to increase, organizations that deliver richer, more useful information products to their customers are likely to realize competitive advantages relative to competitors that focus their investment strategies on cost savings.

Managing Information Performance

In many respects, SGML represents the derivatives of the information technology marketplace. SGML is a very complex technology whose implementation is dominated by difficult policy and associated political issues. Like derivatives, SGML is a strategic technology that allows organizations to make big gains and big mistakes. And many managers don't really understand all that computer stuff, having effectively abdicated their decision-making role to one or more experts: in-house technology managers, developers, vendors' sales reps, consultants, or even the popular computing magazines. When confronted with decisions about using SGML, will managers stop to educate themselves on the policy issues, or will they just hand over virtually all learning and decision-making to someone else? Will their organizations make wise investments? How will they know?

Senior managers have begun to learn that they need to provide oversight and define corporate policies for the use of derivatives to reduce and manage the risk associated with these investments. They also need to provide oversight of and policy-making guidance for their information investments — particularly those involving SGML — to ensure that these investments are aligned with corporate performance objectives.


Conclusions

An increasing understanding of biological systems is influencing the development of IT strategies to improve organizational performance. Specifically, the need for organizational flexibility and resiliency is driving the demand for Organic Information Management (OIM) approaches.

While the engineering of information management lifecycles can provide tremendous value, a narrow focus on mechanical efficiency can be limiting when trying to nurture organizational capacities for adaption and change. The goal of reducing an organization and the individuals who comprise it to a finite set of entities and relationships is idealized, at best. Accordingly, many change-management initiatives are hampered by an inability to differentiate and balance organic and engineered measures of value.

OIM can reduce the risk of project and organizational failure, in large part because politics — not technology — is the greatest source of uncertainty, and thus, risk. The main issue isn't so much whether it is possible to drive change, but what the condition of the organization will be after the change process. From that standpoint, it is much safer to mesh technology with the existing political reality than to try to change that reality through force.

While Hammer and Champy push reengineering as a way to get organizations to address the hard questions of organizational purpose and value, their strategy of motivating people through fear to accept large and risky changes is not the answer. Instead, the integration of stakeholder values and interests into a shared vision for the future can not only effect change, but increase trust and reduce the potential for the change process (and outcomes) to harm the fabric of the organization.

Politics is a dirty word in most organizations, yet politics is a natural part of human interactions, and efforts to ignore or dismiss it are a recipe for disaster. The only real choice lies in the nature of the decision-making mechanisms that serve as the context for political forces to be played out. Humankind has tried a variety of governance models: dictatorships, feudal states, confederations, and democracies. It should not be surprising that humanity's creative potential is better mobilized by decision-making frameworks based on freedom, trust, and the nurturing of experimentation.

Perhaps the ultimate example of the role of decision-making in organic information management systems is the Internet. The statistics are immense, the chaos is unimaginable, and the attention is becoming frenzied. No one owns it. No one controls it. But as TCP/IP and the World Wide Web demonstrate, it has become an important (perhaps even dominant) source of international computing standards.

The desire for rapid change should not be a justification for dictatorial decision-making and organizational design processes. Beck has demonstrated that values-based decision-making can achieve results very rapidly. Organizational stability and success are more likely to result from clearly articulating who can make what decisions, and under what circumstances, than from the grand designs of a minority.

As Lao Tzu wrote in the Tao Teh Jing in 600 BC:

Intelligent control appears as uncontrol or freedom.

And for that reason it is genuinely intelligent control.

Unintelligent control appears as external domination.

And for that reason it is really unintelligent control.

Intelligent control exerts influence without appearing to do so.

Unintelligent control tries to influence by making a show of force.


Stemma

Copyright, The Sagebrush Group, 2000-2009.

This document originally appeared in The Gilbane Report, Vol 3, No. 6, January/February 1996.

It was converted to HTML and some wordings were slightly revised in April 2000.