Tuesday, November 17, 2009

Future Enterprise- Evolution of Cloud 2.0

A major shift in IT business models will emerge from the next incarnation of Cloud Computing- Cloud 2.0.

The Cloud is a metaphor for shared infrastructure, software and data within the web. The original cloud concept emerged in the sixties, not long after the commercial genesis of computing, with the advent of the Service Bureau. This allowed smaller companies to share in the benefits of the golden computer age by running their applications on large service provider mainframes. Access was provided by punched card readers and later by remote computer terminals with printer output.

Minis and desktop personal computers then dominated during the seventies and eighties and the original service bureau industry faded away. In the nineties the Internet and Web evolved, allowing online remote services to return; this time based on the client-server model linked to in-house PCs.

Cloud computing is the next evolutionary step in shared computer processing, using virtualised information services delivered on demand over the Internet, circumscribed by SLAs and user-based pricing. The major advantages, as in the sixties, are lower costs and fewer technical maintenance and upgrade problems. As with the original service bureaus, the computing infrastructure and much of the application software is based on reliable services delivered via remote data centers; this time accessible via a web browser.
In addition, the Cloud is evolving to deliver personalised, intelligent and mobile applications; for example streaming SaaS- Software as a Service- using virtual assistants to organise data-mined information. AI-empowered mobile applications might include sharing time-critical market information, planning meetings, responding to voice commands or analysing traffic patterns to determine the speediest or most fuel-efficient route for an individual.
Most of the major service and software providers such as IBM, EDS, Apple, Google, Amazon, Yahoo, Microsoft and eBay have now established significant and expanding cloud services, providing access to their proprietary databases through Web APIs. Cloud service categories now cover a large range of standard applications including-
Social- social networks, video and photo sites, virtual worlds, and multi-player gaming
Business- office and workflow, customer relationship and sales, workforce, supply chain, financial and booking applications
Utilities- Skype, PayPal, peer-to-peer networking
Plus numerous statistical, user-generated, media, science, geographic and cultural services
But the next evolutionary phase of the Cloud will offer much more- in particular data linking. This will promote the sharing of datasets across diverse domains and between business, research and group partners, bringing the full semantic power of the Web into play and changing the face of business forever.

Different APIs rely on different ID and access mechanisms as well as data in specific formats. Therefore APIs have tended to slice the web into separate sources and silos, restricting its full potential.

Tim Berners-Lee's recent publication of the Linked Data principles for connecting structured data on the web provides a blueprint for connecting information from different sources into a single global data repository; accessible by generic data browsers and standard database and query languages.

This allows not only web documents but also real-world entities to be identified, using the RDF- Resource Description Framework- schema and the Web Ontology Language to define mappings between related domains. The web of linked data will therefore immeasurably expand the classic document web, creating a global data network capable of spanning and weaving multiple data sources.

An increasing number of data providers have now begun to implement these Linked Data principles, leading to the creation of an open global data space containing billions of links and coordinated by the World Wide Web Consortium.
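
As a rough illustration of the linked data idea, the minimal Python sketch below shows how two small datasets can be joined through a shared URI and queried across the join with SPARQL. The rdflib library, the namespaces and the data are assumptions made purely for the example, not part of any particular provider's service.

```python
# A minimal sketch of the Linked Data idea using the Python rdflib library
# (an illustrative choice): two small "datasets" are merged into one RDF
# graph and queried with SPARQL across the join.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import FOAF, RDF

EX = Namespace("http://example.org/")   # hypothetical namespace for the example

g = Graph()

# Dataset one: a person, identified by a URI rather than a local key
alice = URIRef("http://example.org/people/alice")
g.add((alice, RDF.type, FOAF.Person))
g.add((alice, FOAF.name, Literal("Alice")))

# Dataset two: an order that points at the same URI, linking the two domains
order = URIRef("http://example.org/orders/1001")
g.add((order, EX.placedBy, alice))
g.add((order, EX.total, Literal(250)))

# A SPARQL query spanning both sources through the shared identifier
results = g.query("""
    PREFIX foaf: <http://xmlns.com/foaf/0.1/>
    PREFIX ex:   <http://example.org/>
    SELECT ?name ?total WHERE {
        ?order ex:placedBy ?person ;
               ex:total    ?total .
        ?person foaf:name  ?name .
    }
""")
for name, total in results:
    print(name, total)
```

The point is that the identifier, not a local key, does the linking- the same pattern that allows independently published datasets on the web to be woven into one global data space.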

Future Trends

The trendlines are now becoming clear. The web is advancing towards a multi-dimensional medium for seamlessly discovering, publishing and linking documents and data, leveraging semantic intelligence and mobile capabilities.

Individual supplier services will obviously continue to expand, but enterprises will increasingly access common data clouds as well as most utility services, which will in the longer term become open source or common global property.

Cloud spaces will continue to blend and split, fragmenting and reforming in unlimited combinations and permutations. They will share data as media organisations already do amongst themselves and with countless news aggregators such as Google. The dividing lines between public and private ownership of application IP will also become fuzzy, with most applications and algorithms converting over time to generic forms- as many critical medical drugs now have.

Software and system suppliers will increasingly need to differentiate their products as focussed value-added services, targeted at specific enterprises and industries, as IBM and others are currently doing in forging partnerships for their new customised Smart Planet infrastructure business models.

Service applications will therefore be differentiated primarily by the level of value they contribute to the enterprise. Enterprises in turn will become more strategically porous, linking their core processes and management decisions more organically with their partner service providers and the chameleon Cloud 2.0.

Thursday, October 15, 2009

Future Enterprise- Network Science

Network science will be a critical enabler of advanced enterprise management in the 21st century.

Major advances are already being made in applying the principles of network science to social, technological and business systems and it will be vital for the future enterprise to weave sophisticated network optimisation principles into all aspects of its operations.

Network science essentially involves analysing and managing the properties and dynamics of interconnected complex systems such as social groups, the Web, power grids, supply chains, markets, ecosystems and the brain.

Such networked systems are based largely on scale-free topologies. This is the natural architecture most relevant to the world around us and is modelled on structures with a relatively small number of hub nodes, each with a large number of connections, and a much larger number of nodes with relatively few links- a degree distribution that broadly obeys a mathematical power law.

Knowledge of network topology and dynamics allows for optimisation and prediction of the behaviour of complex system processes and is becoming increasingly vital in managing major business activities, via information systems that control vast numbers of interlinked transactions, resources, agents and events.
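
As a back-of-the-envelope illustration, the Python sketch below generates a preferential-attachment graph and shows the hallmark of a scale-free topology: a handful of heavily connected hubs alongside a long tail of sparsely connected nodes. The networkx library and the graph sizes are assumptions chosen purely for the example.

```python
# Sketch: generate a preferential-attachment graph with networkx (assumed
# dependency) and check that a few hubs carry most of the connections,
# as the power-law description above suggests.
import networkx as nx
from collections import Counter

G = nx.barabasi_albert_graph(n=1000, m=2, seed=42)   # illustrative sizes

degrees = sorted((d for _, d in G.degree()), reverse=True)
print("Top 5 hub degrees:", degrees[:5])
print("Median degree:", degrees[len(degrees) // 2])

# Rough degree histogram: many low-degree nodes, few highly connected hubs
hist = Counter(d for _, d in G.degree())
for degree in sorted(hist)[:10]:
    print(f"degree {degree}: {hist[degree]} nodes")
```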

The failure of a single node in a tightly coupled network such as a power grid or market system may force the failure of other nodes, resulting in cascades of failures and eventually triggering a catastrophic breakdown of the whole system, as in the recent collapse of the global economy.
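
A toy simulation can make the cascade effect concrete. The sketch below is an illustrative model only- the tolerance rule and parameters are assumptions, not a description of any real grid or market- in which removing a single hub causes neighbouring nodes that lose too many of their connections to fail in turn.

```python
# Toy cascade sketch (illustrative assumptions throughout): each node can
# tolerate losing at most `tolerance` of its original neighbours; removing
# one hub can then propagate failure through the network.
import networkx as nx

def cascade(G, initial_failure, tolerance=0.4):
    original_degree = dict(G.degree())
    failed = {initial_failure}
    frontier = {initial_failure}
    while frontier:
        next_frontier = set()
        for node in G.nodes():
            if node in failed:
                continue
            lost = sum(1 for nbr in G.neighbors(node) if nbr in failed)
            if original_degree[node] and lost / original_degree[node] > tolerance:
                next_frontier.add(node)
        failed |= next_frontier
        frontier = next_frontier
    return failed

G = nx.barabasi_albert_graph(200, 2, seed=1)
hub = max(G.degree(), key=lambda nd: nd[1])[0]        # most connected node
print("Nodes lost after hub failure:", len(cascade(G, hub)))
```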

Examples of potential applications of network science principles include-

Economies and Markets- reducing the risk of global failure by making economic networks more robust; closely monitoring market signals and adjusting the topology of nodes and links to dampen runaway feedback loops and conflicts between local interests and global efficiency.


Ecologies and Biodiversity- improving the sustainability of ecological systems in a period of global warming with the capacity to provide timely warning of species and resource collapse. The network model provides a powerful representation of ecological interactions among species and highlights global ecological interdependencies, which can then be re-modelled to manage risk.

Business and Finance- improving the capacity to make quality decisions regarding markets and product development, to avoid the future collapse of companies such as General Motors and Lehman Brothers. In these instances, poor decision-making was amplified by the systemic risk of runaway cascading financial asset dependencies, due to overloaded coupling strengths between nodes and indeterminate feedback loops in the myriad interconnected customer and supply networks.

The relevance of network science for the future enterprise is therefore threefold-

Firstly, many of the systems involved in business may be modelled in the future by scale-free networks, such as supply chain, investment, infrastructure, production and customer systems; with nodes representing suppliers, assets, products, consumers and customer groups.

Secondly, an organisation’s systems may be modelled by networks, with nodes representing process and activity decisions and the links represented by the dynamic flows of information feeding them.

Thirdly, the architecture of the enterprise itself may be viewed as a network of control flows between decision-makers and operational agents. As processes become more complex and time critical they will be increasingly automated, but the architecture- the information and decision-making structures and channels, will still need to be continuously optimised.

Future Trends

As forecast in previous posts, the enterprise of the future will be driven by networked architectures- patterns of linked decision processes- constantly morphing, reforming and adapting to the continuous flux of a changing global environment.
Today’s traditional hierarchical or even flat management models will be incapable of supporting tomorrow’s vastly more complex and competitive techno-social environment.

Such techno-social systems- technological layers operating within the larger social and physical environment that drives process application and development- will need a more integrated, adaptive and intelligent management framework, underpinned by network science.

Most real world transportation, manufacturing, computing and power infrastructure networks will be linked and monitored by sensors and tags embedded in largely autonomous networked societies; constantly adapting to global evolutionary dynamics.

Network science algorithms will be developed to monitor and engineer optimal decision topologies, critical thresholds and non-linear outcomes. These will combine with AI technologies to manage complex enterprise operational and management processes.

These algorithms will apply adaptive defence mechanisms, often providing counterintuitive approaches to the engineering and control of complex techno-social systems. Such techniques will be based on the manipulation of key nodes, links and pathways to induce intentional network behavioural changes- mitigating for example potentially catastrophic outcomes.

This will represent the new Network Science Management Paradigm of the 21st century.

Monday, October 12, 2009

Future Enterprise- The Networked Enterprise

The enterprise of the future will be driven by a networked architecture- patterns of linked decision processes; constantly morphing, reforming and adapting to a continuously changing social and business environment.

The traditional hierarchical management model of the 20th century will be incapable of supporting the vastly more complex and competitive 21st century paradigm of technological and social evolution.

Tomorrow's enterprise can be most effectively represented as a decision network model, with decisions as nodes and information flows as the links between them. This model represents an extremely powerful mechanism for understanding and optimising the adaptive enterprise of the 21st century- linked to but extending far beyond current simplistic process models.

Although process and object representations are a necessary and logical intermediary step in the evolution of enterprise system modeling and management, they fail to represent the underlying decision complexity of the real world and therefore fail to realise the true potential of a dynamic enterprise.

The core of the Networked Architecture will be the Decision Model, incorporating engineering methods based on decision pathways, with the capacity to dynamically route information and intelligence resources to critical decision-making agents in the enterprise.

This will not only involve the deployment of computing and information resources to adaptive decision nodes, but also facilitate the direct targeting of intelligence and problem-solving capacity, enabling critical decision outcomes to be implemented in optimal time frames.
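
In concrete terms, a decision network of this kind can be modelled as a weighted directed graph. The Python sketch below- node names and weights are purely hypothetical- treats decisions as nodes, information flows as weighted links, and routes information to a critical decision along the quickest pathway.

```python
# Sketch of the decision-network idea: decisions as nodes, information flows
# as weighted directed links (weights here stand for latency), and routing
# of intelligence to a critical decision along the quickest pathway.
# All node names and weights are hypothetical.
import networkx as nx

decisions = nx.DiGraph()
decisions.add_weighted_edges_from([
    ("market_feed",     "price_decision",   1.0),
    ("market_feed",     "risk_assessment",  0.5),
    ("risk_assessment", "price_decision",   0.2),
    ("price_decision",  "order_release",    0.3),
])

# Fastest pathway from raw information to the critical decision node
path = nx.shortest_path(decisions, "market_feed", "order_release", weight="weight")
print("Information routed via:", " -> ".join(path))
```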

The latest 'Smart Planet' paradigm, in which the infrastructure and processes of the planet- whether manufacturing supply chains, electricity grids, water networks or traffic flows, are being re-engineered to optimise performance and achieve greener outcomes, will be the major driver for the networked enterprise of the future. The Smart Planet will demand that decisions be made more rigorously, efficiently, adaptively and therefore largely autonomously.

While SOAs focus on basic services, their capacity to implement complex decision processes is far from optimal. Current business intelligence and data warehouse software represents a halfway house towards this goal. But predictive techniques utilising AI will be the next stage, layered on current data mining and pattern recognition software and supported by a new generation of network-oriented database management systems.

Although the more far-sighted businesses are becoming aware of the need for such flexible small world network linkages, the support provided by today’s rigid organisational management architectures and philosophies has lagged well behind.

Tomorrow’s enterprise management must be far more pro-active and sentient in relation to environmental and structural change, avoiding being caught passively flat-footed in a bewildering flux of global evolution and competitive pressures.

Future Enterprise- The Intelligent Enterprise

The enterprise of the future will increasingly depend on a wide range of rigorous artificial intelligence systems, algorithms and techniques to facilitate its operation at all levels of e-commerce management.

As described in The Adaptable Enterprise blog, major decisions incorporating sophisticated levels of intelligent problem-solving will be increasingly applied autonomously within real time constraints to achieve the level of adaptability required to survive in an ever changing and uncertain global environment.

In addition, web services will draw on the advances already made by the semantic web, combined with the intelligent Web 4.0.

A number of artificial intelligence techniques and algorithms are rapidly reaching maturity and will be an essential component of the Intelligent Enterprise Architecture of the future.

Current techniques include-

Genetic algorithms- achieve solution discovery and optimisation modelled on the evolutionary natural selection process- based on the genetic operators of crossover, replication and mutation and measured against a 'fitness function'. This technique is widely applied to solve complex design and optimisation problems (see the sketch following this list).

Bayesian networks- graphical models representing multivariate probability networks- providing inference and learning based on cumulative evidence- widely used in medical diagnosis.

Fuzzy Logic- based on natural non-binary methods of decision-making- assigns a degree of truth or set membership between 0 and 1, rather than a strict binary value.

Swarm Intelligence- combines multiple cooperating components to achieve group intelligent behaviour.
Neural networks- pattern discrimination techniques modelled on neuron connections, allowing information inputs to be weighted and an activation threshold established.
Expert Systems- rule based inference techniques targeted at specific problem areas.
Intelligent Agents- designed to be adaptive to the web's dynamic environment. An agent is designed to pursue a goal and learn by experience; agents can also act collaboratively in groups, achieving higher levels of intelligence and making increasingly complex decisions autonomously.
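
To make the first of these techniques concrete, the short Python sketch below is a minimal genetic algorithm- a toy 'all ones' fitness function and arbitrary population sizes, chosen purely for illustration- showing selection, crossover and mutation iterating toward a fitter solution.

```python
# Minimal genetic algorithm sketch (illustrative, not a production library):
# evolve bit-strings toward a simple fitness function using selection,
# crossover and mutation, as described in the list above.
import random

TARGET_LENGTH = 20

def fitness(individual):
    # Toy fitness: count of 1-bits (the "all ones" problem)
    return sum(individual)

def crossover(a, b):
    point = random.randrange(1, TARGET_LENGTH)
    return a[:point] + b[point:]

def mutate(individual, rate=0.02):
    return [bit ^ 1 if random.random() < rate else bit for bit in individual]

population = [[random.randint(0, 1) for _ in range(TARGET_LENGTH)]
              for _ in range(50)]

for generation in range(100):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                      # selection: keep the fittest
    offspring = [mutate(crossover(random.choice(parents), random.choice(parents)))
                 for _ in range(40)]
    population = parents + offspring               # replication plus new offspring
    if fitness(population[0]) == TARGET_LENGTH:
        break

best = max(population, key=fitness)
print("Best fitness after", generation + 1, "generations:", fitness(best))
```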

Future Trends

The above techniques will continue to be enhanced and packaged in different combinations to provide immensely powerful problem solving capability over time. The technology is slowly being applied discretely within business intelligence, data mining and planning functions of enterprise systems.

However, AI has yet to realise its full potential within the enterprise through targeted, autonomous application to decision-making.
When this happens over the next decade, the quality of decision-making should improve significantly, with a concomitant reduction in operational and management risk.

Sunday, September 13, 2009

Social Computing

In the evolutionary computing stakes, social computing is the crucial next step following the rise of personal computing.

The term Social Computing has many connotations. It has come into prominence over the past few years due to the growth of dozens of hugely successful social networks such as Facebook and more recently Twitter; providing a global way of keeping in touch and exchanging information between friends and acquaintances.

In the process of linking with others, a network of social relationships is established. Social computing is in essence a way of codifying and exploring these relationships between people and agents in social spaces- crowds, communities, cities, markets etc.

Social computing applications focus on web-supported online communities such as social networks, wikis, blogs and virtual worlds, providing feedback on interactive social comment, entertainment, scientific and medical advances and business services. Social computing also supports techniques for collective forecasting and decision-making, utilising the combined power of groups and communities to solve difficult problems such as those associated with major disasters and conflict. In addition, it is increasingly applied to help analyse how changing technologies and policies affect political, social and cultural behaviour.

A set of techniques and algorithms are now being developed based on network, cognitive and evolutionary theory, which will have major ramifications for the enterprise of the future. For example, business strategies and competitive markets have been increasingly characterised by turbulence, uncertainty and complexity. Consequently there is a need to model such markets and strategies as dynamic, evolutionary processes; that is, as complex adaptive systems.

In addition, data mining and simulation are applied to study social networks. Data mining can uncover patterns such as an organisation's network structure, its properties and the relationships between suppliers and customers. Agent-based social simulation- understanding social phenomena through models of autonomous agents- has also grown tremendously in recent decades. Researchers use this approach to study a wide range of social and economic issues, including social beliefs and norms, resource allocation, traffic patterns, social cooperation, stock market dynamics and organisational decision-making.

There are as yet no effective, widely accepted methods for modelling complex systems, especially those involving human behaviour and social organisations, but collaborative agent-based artificial life is currently the most promising approach.

Using agent-based modelling, an enterprise can construct a virtual competitive market that gives business strategists a way of investigating a range of realistic scenarios. For example, agent models can account for interactions between irrational and rational investors in stock market bubbles.
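
A minimal sketch of this kind of agent-based market model is shown below. The agent rules, weights and parameters are invented for illustration, but they capture the basic interplay between fundamentalist and trend-chasing behaviour that can push prices away from underlying value.

```python
# Toy agent-based market sketch (all parameters hypothetical): fundamentalist
# agents trade toward a fixed value while momentum agents chase recent trends,
# and their interaction can push the price away from fundamentals.
import random

FUNDAMENTAL_VALUE = 100.0
price_history = [100.0]

def fundamentalist(prices):
    # Buys below fundamental value, sells above it
    return FUNDAMENTAL_VALUE - prices[-1]

def momentum_trader(prices):
    # Chases the most recent price movement
    return (prices[-1] - prices[-2]) * 2 if len(prices) > 1 else 0.0

agents = [fundamentalist] * 10 + [momentum_trader] * 20

for step in range(200):
    demand = sum(agent(price_history) for agent in agents)
    noise = random.gauss(0, 0.5)
    new_price = max(1.0, price_history[-1] + 0.01 * demand + noise)
    price_history.append(new_price)

print("Final price:", round(price_history[-1], 2),
      "vs fundamental value:", FUNDAMENTAL_VALUE)
```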

The social network paradigm in turn has created the concept of the Social Fabric, which mediates the interactions of the network’s agents and information flows in much the same way as spacetime mediates particle interactions and exchanges. It therefore allows us to explore the dynamic social and cultural aspects of the world in which we live or in which an organisation exists.

Cultural Algorithms, or CAs, are one of several approaches to modelling the social fabric and applying social intelligence to optimisation problems, based on particle or agent swarming. They are therefore a class of computational model derived from observing the process of cultural evolution in nature. Embedding an activity or problem in a social fabric can improve its performance or solution outcome, enabling the system to find a better solution than the original over a number of population iterations.
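
The flavour of this approach can be conveyed in a few lines. The sketch below is a simplified, cultural-algorithm-flavoured illustration rather than a faithful implementation: a population of candidate solutions evolves while a shared 'belief space' records the best region found so far and biases where new candidates are generated.

```python
# Very small cultural-algorithm-flavoured sketch (an illustration, not a
# faithful implementation): candidate solutions evolve while a shared
# "belief space" records the best region found and biases new candidates.
import random

def objective(x):
    # Toy objective to minimise
    return (x - 3.0) ** 2

population = [random.uniform(-10, 10) for _ in range(30)]
belief_centre = 0.0          # shared cultural knowledge: best-known region

for iteration in range(50):
    population.sort(key=objective)
    belief_centre = sum(population[:5]) / 5          # accepted individuals update beliefs
    # Influence step: new individuals are drawn near the belief centre
    population = population[:5] + [random.gauss(belief_centre, 1.0) for _ in range(25)]

best = min(population, key=objective)
print("Best solution found:", round(best, 3))
```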

Many organisations have adopted Web 2.0 tools to stimulate innovation and productivity. They are also beginning to embrace social networks as a way of more effectively marketing services and tracking customer behaviour.
But applying social computing to improve the quality of decision-making, process optimisation and prediction scenarios is not yet on the horizon for most.

Social computing will become an integral component of the strategic and operational management of the future enterprise, at the same time transforming the web into a truly collaborative and social platform.

Tuesday, August 4, 2009

Complex Event Processing- The Smart Enterprise

Complex Event Processing, or CEP, is a technology in transition- a precursor to more sophisticated autonomous decision processing emerging within the future enterprise.

CEP systems collect data from numerous sources and raw events within an environment, such as a physical emergency situation or a company's operations, and use algorithms and rules to determine in real time the interconnected trends and patterns that create complex scenarios. The results of this analysis are then channelled to the appropriate decision-maker for action.
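
In miniature, the CEP pattern looks something like the Python sketch below. The event types, window size and rule are hypothetical, but the shape- raw events streaming in, a rule correlating them over a window, and derived complex events routed onward- is the essence of the approach.

```python
# Sketch of the complex event processing idea (event names and thresholds
# are hypothetical): raw events stream in, a simple rule correlates them
# over a sliding window, and derived "complex" events are routed onward.
from collections import deque

WINDOW = 5           # number of recent events to correlate over
THRESHOLD = 3        # rule: three failures within the window is an incident

recent = deque(maxlen=WINDOW)

def on_complex_event(description):
    # Stand-in for routing the result to the appropriate decision-maker
    print("ALERT:", description)

def process(event):
    recent.append(event)
    failures = [e for e in recent if e["type"] == "payment_failed"]
    if len(failures) >= THRESHOLD:
        on_complex_event(f"{len(failures)} payment failures in last {len(recent)} events")

stream = [
    {"type": "payment_ok"}, {"type": "payment_failed"}, {"type": "payment_failed"},
    {"type": "page_view"},  {"type": "payment_failed"}, {"type": "payment_ok"},
]
for event in stream:
    process(event)
```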

This process should therefore be recognised as the beginning of the emergence of the ‘Smart Enterprise’. In time most major decisions within the enterprise will be associated with CEP events.

We are now also entering the era of the ‘Smart Planet' revolution. This is IBM’s mantra, but also that of Cisco, Google, Microsoft, Oracle, SAP, GE and every other major information services player. Adaptive and responsive techniques, largely autonomously managed, are beginning to be applied to the optimisation of the design, maintenance and operation of infrastructure and business processes. These include electricity and communication grids, healthcare, financial, transport, investment, building, engineering, emergency response and supply chain systems.

But in order for this revolution to occur, the enterprise must also evolve to be equivalently ‘smart’. Smart infrastructure without smart enterprise management won't compute.

Collecting the raw data for CEP will inevitably create information overload for the enterprise, as sciences such as astronomy, biology and particle physics, which generate massive datasets, have already discovered. Traditional relational databases and SOA architectures are not optimised for real-time event processing, particularly as much of the data will be unstructured and garnered from heterogeneous sources such as web pages, videos, RSS feeds, market intelligence, statistical data, electronic devices and instrumentation, control systems and sensors.

In addition, CEP overlaps with the business intelligence domain. The latest CEP toolsets provide high-level modelling tools, AI and query languages that let users implement business logic while processing event streams.

However, no matter how much filtering, pattern matching and analytic processing is applied to CEP data, human decision-making will still be a significant bottleneck. The future smart enterprise must have the flexibility to focus and deploy its cooperative intelligence in real time and autonomously, at all levels of the organisation in response to opportunities and competitive pressures in the marketplace.

The level of complexity of CEP and decision-making will continually and rapidly increase over time in response to the changing social and technological environment. The resulting complexity of networks of interactions involving customers, supply chains, services, markets and logistics will make it impossible for humans to respond effectively. It will become just too complex and time-consuming even for dedicated teams of humans to manage.

Real-time integration of disparate data and applications is a key challenge facing the future enterprise. Conventional approaches such as building a data warehouse to consolidate all data sources are expensive, slow and highly intrusive. A number of innovative CEP platforms are being developed in this sector based on enterprise information streaming models. These provide a virtual unified view of the data stream without first transferring it to a central repository.
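
A minimal sketch of such a streamed, virtual unified view is shown below. The sources and records are invented for the example, but the idea is that each source is exposed lazily and merged on the fly into one time-ordered stream, without first being copied into a central repository.

```python
# Sketch of a streamed, virtual unified view (all sources invented for the
# example): rather than loading everything into a warehouse first, each
# source is exposed as a lazy, time-ordered stream and merged on the fly.
import heapq

def crm_events():
    yield (1, {"source": "crm", "customer": "C1", "event": "complaint"})
    yield (4, {"source": "crm", "customer": "C2", "event": "upgrade"})

def sensor_events():
    yield (2, {"source": "sensor", "device": "D7", "event": "overheat"})
    yield (3, {"source": "sensor", "device": "D7", "event": "recovered"})

# heapq.merge lazily combines the already time-ordered streams into one
# time-ordered view, without populating a central store.
for timestamp, event in heapq.merge(crm_events(), sensor_events(),
                                    key=lambda pair: pair[0]):
    print(timestamp, event)
```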

The future smart enterprise will also be required to be proactive rather than reactive in order to optimise its response to a fast-changing environment. That is, it will be required to actively search for solutions and be knowledge-driven. This will place further pressure on the need for real-time, quality decision-making.

In the near future humans will be partners in the decision processes powered by CEP and smart algorithms, but over time their input, as for airline pilots and fast train drivers, will be largely symbolic.

Thursday, April 23, 2009

Evolutionary Systems Development

There has been a dramatic recent shift in sentiment in relation to the choice of the best model for developing software systems. The shift has marked a change from the tradition of preparing a detailed requirements specification as the first phase in the development cycle, to a less rigid, adaptive and evolutionary approach.

The ongoing goal of software engineering is to ensure that a system meets its primary aims in terms of the quality criteria of functionality, flexibility, performance and reliability. Achieving rigorous standards of performance and reliability has never been the problem for developers; rather it has been their inability to capture a rigorous set of user requirements capable of delivering long lasting optimal outcomes.

This is similar to the problem of using rigorous deductive logic to draw conclusions from a set of axioms, but reaching a wrong conclusion because the axioms themselves are incorrect or incomplete.

Time and again this Achilles heel of software development emerges- particularly when a project is large, complex and operates within a dynamic environment. Systemic failure is more often the norm and the litany of collapsed projects keeps growing; particularly in government and business domains that demand the planning and delivery of complex customer services such as health, education, infrastructure and communications.

A vast literature has accumulated on this endemic problem: how best to capture the enduring requirements of a system. It is the elephant in the room at almost every CIO seminar and conference.

A number of techniques have been applied over the past fifty years, each hopeful of delivering the magic silver bullet- including functional, data, entity-relationship, process and object-oriented analysis, applied at various levels of sophistication. Each manages to capture a particular facet or dimension of user aspirations- but never the whole set.

Libraries of tools and methods also cover all phases of the traditional software development cycle- requirements analysis, design, coding, testing, implementation, as well as project and quality management- but still the problem remains.

Organisations attempt to deal with the problem in a number of ways.
First by buying off-the-shelf, pre-packaged software, hopefully flexible enough to be easily tailored and adapted to an organisation’s requirements. But this solution only works if a reasonable functional match exists in the first place and if the level of built-in flexibility is sufficient to avoid costly re-working over time, beyond ad hoc version updates.

The second way is by linking together multiple functional components like a Lego set. But this also only works if the components are available and can be adapted independently by the customer and if they fit together without the need for complex middleware.

In past decades, these approaches often worked adequately for standard systems such as accounting, inventory, sales, maintenance, CAD, HR, job scheduling, project control, office systems etc. But even these became obsolete or unmanageable over time as protocols changed, customer expectations increased, technological change accelerated and the enterprise’s products and services evolved.

Perhaps in our efforts to tame the elephant, we have focussed on the wrong problem.

In the 21st century we live in a vastly different world of web services and SOAs, cloud and mobile computing, and enterprises which must continually adapt to a bewildering mix of competitive and economic pressures, almost on a daily basis.

On the other hand we have proof that immensely complex systems can be built, remain viable and continue to deliver real value over time- vast communication systems such as the Internet and World Wide Web, reliable operating systems such as Unix, Linux and Symbian, social networks such as Facebook and MySpace, families of powerful programming and scripting languages such as Java and JavaScript, ever-improving search engines such as Google, browsers such as Safari, easy-to-use SQL-based databases and an increasing number of flexible online e-business applications from the new utilities such as Amazon.

These are cooperative innovative works in progress, which have been tested through many iterations by scenarios and prototypes, before emerging in beta form; all developed in close consultation between developers and their user communities. And they continue to adapt as community needs evolve on a daily basis.

These are examples of the new emerging class of evolutionary adaptable systems.

The major driver for the emergence of this radical evolutionary paradigm is the accelerating rate of social, technological and economic change, particularly over the past twenty years. In almost all cases this acceleration will mean that long lead times for systems development are now untenable and almost certain to lead to obsolescence or outright failure- certainly before an adequate ROI is achieved.

It is rapidly becoming recognised that any realistic requirements engineering methodology must incorporate an evolutionary approach, combined with an efficient mechanism, such as Agile programming and design techniques, for converting evolving functional and process requirements incrementally into a usable system. This enables the enterprise to adapt to the continuing dynamics of social, business and technological change, by continuously spawning new functions or incremental amendments, without disrupting its core processes.

The same change imperative applies to small as well as large systems. The risks inherent in smaller systems in the past just haven’t been as obvious or critical. In fact any significant system build that hopes to meet its user aspirations of long-term support and value contribution, must adopt an evolutionary approach.

Risks also exist in evolutionary development, as for traditional systems- the risk that managers misread the environmental signals, as in the case of GM's disastrous planning decisions, and continue supporting ineffective reporting systems; or that the updates and changes become so pervasive that the system becomes unwieldy and opaque, as in Microsoft's early Vista system.

But the risk impacts from not following the evolutionary canon are far greater. Wrong management decisions can be quickly turned around by agile methods, if recognised in time. Building individual inappropriate functions can waste resources and cause annoying disruption, but it doesn't cause catastrophic project collapse, massive system redesign delays and budget overruns.

In the future, the trend towards applying evolutionary techniques to software development will become embedded in IT best practice, particularly as this will be coupled with the parallel trend towards autonomic management of enterprises; interacting with the human and physical world on a real-time basis.

The record of systems development to date is appalling, but not for lack of innovation, effort or professional skill. It is because we have found it difficult to come to terms with a constantly evolving world impacting our built environment. We have ignored the fundamental principle that systems must continually adapt to changing environments if they are to survive.

This is as good a silver bullet as the IT industry is likely to get.
Evolution has been the universal driver of all systems- biological, social and now economic and computational- since the universe began, and we ignore its wisdom at our peril.