There has been a dramatic recent shift in sentiment about the best model for developing software systems: away from the tradition of preparing a detailed requirements specification as the first phase of the development cycle, and towards a less rigid, adaptive, evolutionary approach.
The ongoing goal of software engineering is to ensure that a system meets its primary aims in terms of the quality criteria of functionality, flexibility, performance and reliability. Achieving rigorous standards of performance and reliability has never been the problem for developers; rather, it has been their inability to capture a rigorous set of user requirements capable of delivering long-lasting, optimal outcomes.
This is similar to the problem of using rigorous deductive logic to draw conclusions from a set of axioms, but reaching a wrong conclusion because the axioms themselves are incorrect or incomplete.
Time and again this Achilles heel of software development emerges, particularly when a project is large, complex and operates within a dynamic environment. Systemic failure has become the norm, and the litany of collapsed projects keeps growing, particularly in the domains of government and business that demand the planning and delivery of complex customer services such as health, education, infrastructure and communications.
A vast literature has accumulated on this endemic problem: how best to capture the enduring requirements of a system. It is the elephant in the room at almost every CIO seminar and conference.
A number of techniques have been applied over the past fifty years, each hopeful of delivering the magic silver bullet: functional, data, entity-relationship, process and object-oriented analysis, applied at various levels of sophistication. Each manages to capture a particular facet or dimension of user aspirations, but never the whole set.
Libraries of tools and methods also cover all phases of the traditional software development cycle (requirements analysis, design, coding, testing and implementation) as well as project and quality management, but still the problem remains.
Organisations attempt to deal with the problem in a number of ways.
The first is to buy off-the-shelf, pre-packaged software, in the hope that it is flexible enough to be easily tailored and adapted to the organisation's requirements. But this solution only works if a reasonable functional match exists in the first place, and if the level of built-in flexibility is sufficient to avoid costly reworking over time, beyond ad hoc version updates.
The second is to link together multiple functional components like a Lego set. But this also only works if the components are available, can be adapted independently by the customer, and fit together without the need for complex middleware.
In past decades, these approaches often worked adequately for standard systems such as accounting, inventory, sales, maintenance, CAD, HR, job scheduling, project control, office systems etc. But even these became obsolete or unmanageable over time as protocols changed, customer expectations increased, technological change accelerated and the enterprise’s products and services evolved.
Perhaps in our efforts to tame the elephant, we have focussed on the wrong problem.
In the 21st century we live in a vastly different world of web services and service-oriented architectures (SOAs), cloud and mobile computing, and enterprises that must continually adapt to a bewildering mix of competitive and economic pressures, almost on a daily basis.
On the other hand, we have proof that immensely complex systems can be built, remain viable and continue to deliver real value over time: vast communication systems such as the Internet and World Wide Web; reliable operating systems such as Unix, Linux and Symbian; social networks such as Facebook and Myspace; powerful scripting languages such as JavaScript; ever-improving search engines such as Google and browsers such as Safari; easy-to-use SQL databases; and an increasing number of flexible online e-business applications from the new utilities such as Amazon.
These are cooperative, innovative works in progress, tested through many iterations of scenarios and prototypes before emerging in beta form, all developed in close consultation between developers and their user communities. And they continue to adapt daily as community needs evolve.
These are examples of the new emerging class of evolutionary adaptable systems.
The major driver for the emergence of this radical evolutionary paradigm is the accelerating rate of social, technological and economic change, particularly over the past twenty years. In almost all cases this acceleration means that long lead times for systems development are now untenable and almost certain to lead to obsolescence or outright failure, certainly before an adequate return on investment is achieved.
It is rapidly becoming recognised that any realistic requirements engineering methodology must incorporate an evolutionary approach, combined with an efficient mechanism, such as Agile design and programming techniques, for converting evolving functional and process requirements incrementally into a usable system. This enables the enterprise to adapt to the continuing dynamics of social, business and technological change by continuously spawning new functions or incremental amendments, without disrupting its core processes.
The same change imperative applies to small systems as well as large ones; the risks inherent in smaller systems have simply been less obvious and less critical in the past. In fact, any significant system build that hopes to meet its users' aspirations of long-term support and value contribution must adopt an evolutionary approach.
Evolutionary development carries risks, just as traditional development does: the risk that managers misread the environmental signals, as in the case of GM's disastrous planning decisions, and continue supporting ineffective reporting systems; or that updates and changes become so pervasive that the system grows unwieldy and opaque, as in Microsoft's early Vista system.
But the risk impacts of not following the evolutionary canon are far greater. Wrong management decisions can be quickly turned around by agile methods if they are recognised in time. Building individual inappropriate functions can waste resources and cause annoying disruption, but it does not cause catastrophic project collapse or massive system redesign, with its attendant time delays and budget overruns.
In the future, the trend towards applying evolutionary techniques to software development will become embedded in IT best practice, particularly as it is coupled with the parallel trend towards autonomic management of enterprises interacting with the human and physical world in real time.
The record of systems development to date is appalling, but not for lack of innovation, effort or professional skill. It is because we have found it difficult to come to terms with a constantly evolving world impacting our built environment. We have ignored the fundamental principle that systems must continually adapt to changing environments if they are to survive.
This is as good a silver bullet as the IT industry is likely to get.
Evolution has been the universal driver of all systems, biological, social and now economic and computational, since the universe began, and we ignore its wisdom at our peril.