Thursday, January 7, 2010

Future Enterprise- Convergence of X-Reality

First there was Virtual Reality- the creation of simulated games, objects and avatars, with narratives embedded in online virtual worlds such as Second Life and World of Warcraft, which between them count some 15 million subscribers.

Then came Augmented Reality- created by integrating or mixing real objects and natural spaces with layers of related computer-generated data, images and designs, enabling real and virtual scenarios to be seamlessly combined. Basic forms of AR technology are already being used to gain a more immediate and accurate sense of context in practical applications such as engine repair, wiring assembly, architectural design and remote surgery.

But now emerging from the evolution of cyberspace is Cross- or X-Reality, with the boundaries between the real and the virtual extended yet again and becoming increasingly blurred in the process.

X-Reality environments essentially fuse two technologies- sensor networks and virtual worlds- bringing real-world and realtime information into fully immersive virtual worlds and vice versa.

In hindsight it can be seen that Virtual and Augmented Realities are early phases in an ongoing evolutionary transition towards the acceptance of virtual forms as part of everyday human cognition. In the process we have crossed the threshold into a new space, extending human perception and interaction; linking ubiquitous sensory and actuator networks based on low cost microelectronic wireless technologies to create mixed realities.

The game is now on. By 2030, X-Reality will usher in an era of vastly extended reality indistinguishable from the present world, which has itself evolved over the entire period of life's existence. In other words the world is evolving its own electronic nervous system via a dense mesh of sensory networks, eventually connecting and encompassing every object- living and non-living- on the planet. Such sensor networks help integrate physical reality into virtual computing platforms, generating the ability to react to real-world events in automated fashion. This is creating a revolutionary relationship between human society and the Web, and an urgent need to understand how our behaviour and future processes will become irreversibly shaped by cyberspace.

Cross reality environments can therefore serve as an essential bridge across sensor networks and Web based virtual worlds. The Web is already beginning to host an immersive 3D sensory environment that combines elements of social and virtual worlds with increasingly dense geographical mapping applications, allowing the monitoring and planning of natural and urban ecosystems- particularly their capacity to cope with climate change.

X-Reality will be implemented through the integration of key design technologies, including-

Synchronously Shared Information- users will require open access to realtime data feeds and collection of information for analysis via centralised virtual command centres. Eventually control will devolve to decentralised self-organising and autonomous management systems working in partnership with users.

Complex Realtime Visualisation - users must be able to easily and flexibly visualise complex data, often delivered in 3D form. This will involve a high level of interactivity and collaboration, applying sensor-driven animation and the application of intelligent agents or avatars.

Ubiquitous Sensor Portals- I/O devices designed for rich two-way cross-reality experiences, which can stream virtual and remote phenomena into the user's physical space, for example via video feeds and images uploaded from cameras. But this process can also extend into the past, allowing realtime access to historical data streams, vital for trendline analysis in business and the sciences.

Smart Phones- these will increasingly provide an intuitive interface that facilitates group collaboration in an ad hoc manner, via gesture as well as touch. Physical movement for outdoor users requires extreme mobility. Allowing augmented reality on smart phones that can query sensor networks and connect with shared online worlds paves the way for immersive mobile X-Reality.

Complex Event Processing- CEP- sensor networks will be particularly valuable in the future for generating data that tracks complex phenomena in the real world, detectable by high-level pattern matching and logic inference techniques. Applications include monitoring building and infrastructure maintenance, manufacturing and supply chain operations via RFIDs, as well as environmental emergencies such as fire and pollution risks. In addition, CEP systems will help make sense of conflict zones, ecosystem health, field operative performance and traffic flows and events.
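
The pattern-matching core of CEP can be sketched in a few lines. The sensor ids, temperature threshold and the "three consecutive readings" rule below are purely hypothetical choices for illustration, not part of any particular CEP product:

```python
from collections import deque

def detect_pattern(events, window=3, threshold=60.0):
    """Flag a fire risk when `window` consecutive temperature
    readings from the same sensor all exceed `threshold`."""
    recent = {}   # sensor id -> deque of its last `window` readings
    alerts = []
    for sensor_id, temp in events:
        buf = recent.setdefault(sensor_id, deque(maxlen=window))
        buf.append(temp)
        if len(buf) == window and all(t > threshold for t in buf):
            alerts.append(sensor_id)
    return alerts

# hypothetical interleaved event stream from two sensors
stream = [("s1", 55.0), ("s1", 62.1), ("s2", 40.0),
          ("s1", 63.5), ("s1", 61.2)]
print(detect_pattern(stream))
```

Real CEP engines add temporal windows, joins and logical inference over many event types, but the principle is the same: a high-level pattern declared over a low-level stream.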

By 2030 most of our lives will be totally immersed in this shared reality. It will also redefine how we manage the vast and growing repository of digital information on the web- linking art, entertainment, work, science and daily life routines such as shopping, gaming and travel.

The Future Enterprise will be equally enmeshed- dependent on the management of its marketing, production and logistical operations and services via the medium of X-Reality.

Sunday, January 3, 2010

Future Enterprise- Adaptive Business Intelligence

The concept of adaptability is rapidly gaining in popularity in business. Adaptability has already been introduced into everything from automatic car transmissions to sentient search engines to running shoes capable of adapting to the preferences of each unique user over time, to business management.

Adaptive business intelligence is a new discipline which combines three components- prediction, adaptation and optimisation. It can be defined as the discipline of using prediction and optimisation techniques to create self-learning decision systems.

Managers work in a dynamic and ever-changing economic and social environment and therefore require constant decision support across two linked timeframes- what is the best decision to make now, and how will that decision need to change in the future?

The general goal of most current business intelligence systems is to access data from a variety of sources, to transform it into information and knowledge via sophisticated analytic and statistical tools and provide a graphical interface to present the results in a user friendly way. However this doesn’t guarantee the right or best decision outcomes.

Today most business managers realise that a gap still exists between having the right information and making the right decision. Good decision-making also involves constantly improving future recommendations- adapting to changes in the marketplace and improving the quality of decision outcomes over time. This involves a shift towards predictive performance management- moving beyond simple metrics to a form of artificial intelligence based software analysis and learning such as evolutionary algorithms.

Future Trends

The future of business intelligence therefore lies in the development of systems that can autonomously and continuously improve decision-making within a changing business environment, rather than tools that just produce more detailed reports based on current static standards of quality and performance.

It must incorporate techniques that build autonomous learning, with feedback loops that generate prediction and optimisation scenarios to recommend high-quality decision outcomes; but also with an in-built capacity to continuously improve future recommendations.
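
A minimal sketch of such a feedback loop, assuming a simple exponential-smoothing forecaster whose smoothing constant adapts to its own recent errors (the demand figures and step sizes are illustrative only):

```python
def adaptive_forecast(demand, alpha=0.3, step=0.05):
    """One-step-ahead exponential smoothing whose smoothing
    constant adapts: alpha rises when errors are growing
    (respond faster) and falls when they shrink (dampen noise)."""
    forecast = demand[0]
    prev_err = 0.0
    history = []
    for actual in demand[1:]:
        err = abs(actual - forecast)
        # feedback loop: adjust responsiveness from observed error
        if err > prev_err:
            alpha = min(0.9, alpha + step)
        else:
            alpha = max(0.1, alpha - step)
        forecast = forecast + alpha * (actual - forecast)
        history.append(round(forecast, 2))
        prev_err = err
    return history

forecasts = adaptive_forecast([100, 120, 110])
print(forecasts)
```

The point is not the particular smoothing rule but the shape of the system: prediction, comparison against outcomes, and self-adjustment, all inside one loop.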

Such an evolutionary paradigm will be essential in an increasingly competitive and complex business environment. It is regressive to continue to rely on software support systems that repeatedly produce sub-optimal demand forecasts, workflows or planning schedules.

The future of business intelligence lies in systems that can guide and deliver increasingly smart decisions in a volatile and uncertain environment.

Tuesday, November 17, 2009

Future Enterprise- Evolution of Cloud 2.0

A major shift in IT business models will emerge from the next incarnation of Cloud Computing- Cloud 2.0.

The Cloud is a metaphor for shared infrastructure, software and data within the web. The original cloud concept emerged in the sixties, not long after the commercial genesis of computing, with the advent of the Service Bureau. This allowed smaller companies to share in the benefits of the golden computer age by running their applications on large service provider mainframes. Access was provided by punched card readers and later by remote computer terminals with printer output.

Minis and desktop personal computers then dominated during the seventies and eighties and the original service bureau industry faded away. In the nineties the Internet and Web evolved, allowing online remote services to return; this time based on the client-server model linked to in-house PCs.

Cloud computing is the next evolutionary step in shared computer processing, using virtualised information services delivered on demand over the Internet, circumscribed by SLAs and user-based pricing. The major advantages, as in the sixties, are lower costs and fewer technical maintenance and upgrade problems. As with the original service bureaus, the computing infrastructure and much of the application software is based on reliable services delivered via remote data centers- this time accessible via a web browser.

In addition the Cloud is evolving to deliver personalised intelligent and mobile applications; for example streaming SaaS- Software as a Service- using virtual assistants to organise data-mined information. AI empowered mobile applications might include sharing time-critical market information, planning meetings, responding to voice commands or analysing traffic patterns to determine the speediest or most fuel-efficient route for an individual.

Most of the major service and software providers such as IBM, EDS, Apple, Google, Amazon, Yahoo, Microsoft and eBay have now established significant and expanding cloud services, providing access to their proprietary databases through Web APIs. Cloud service categories now cover a large range of standard applications, including-

Social- social networks, video and photo sites, virtual worlds, and multiplayer gaming

Business- office and workflow, customer relationship and sales, workforce, supply chain, financial and booking applications

Utilities- Skype, PayPal, peer-to-peer networking

Plus numerous statistical, user-generated, media, science, geographic and cultural services
But the next evolutionary phase of the Cloud will offer much more- in particular data linking. This will promote the sharing of datasets across diverse domains and between business, research and group partners, bringing the full semantic power of the Web into play and changing the face of business forever.

Different APIs rely on different ID and access mechanisms as well as data in specific formats. Therefore APIs have tended to slice the web into separate sources and silos, restricting its full potential.

Tim Berners-Lee's recent publication of Linked Data Principles for connecting structured data on the web provides a future blueprint for connecting information from different sources into a single global data repository, accessible by generic data browsers and standard database and query languages.

This not only allows web documents to be identified, but also real-world entities, using the Resource Description Framework- RDF- schema and the Web Ontology Language to define mappings between related domains. The web of linked data will therefore immeasurably expand the classic document web, creating a global data network capable of spanning and weaving multiple data sources.
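
The underlying triple model can be illustrated without any real RDF library- here as a hypothetical in-memory store of (subject, predicate, object) facts, with None acting as a query wildcard (the `ex:` entities are invented for the example):

```python
# Minimal RDF-style triple store: every fact is a
# (subject, predicate, object) triple, and a query is a
# triple pattern in which None matches anything.
triples = [
    ("ex:acme",   "rdf:type",     "ex:Company"),
    ("ex:acme",   "ex:locatedIn", "ex:London"),
    ("ex:widget", "ex:madeBy",    "ex:acme"),
]

def match(pattern, store):
    s, p, o = pattern
    return [t for t in store
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# which facts point at ex:acme as their object?
print(match((None, None, "ex:acme"), triples))
```

Because every source publishes the same (subject, predicate, object) shape, stores from different providers can simply be concatenated and queried together- the essence of linking data across silos.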

An increasing number of data providers have now begun to implement these Linked Data principles, leading to the creation of an open global data space containing billions of links and coordinated by the World Wide Web Consortium.

Future Trends

The trendlines are now becoming clear. The web is advancing towards a multi-dimensional medium for seamlessly discovering, publishing and linking documents and data, leveraging semantic intelligence and mobile capabilities.

Individual supplier services will obviously continue to build, but enterprises will increasingly access common data clouds as well as most utility services, which will in the longer term become open source or common global property.

Cloud spaces will continue to blend and split, fragmenting and reforming in unlimited combinations and permutations. They will share data as media organisations already do amongst themselves and with countless news aggregators such as Google. The dividing lines between public and private ownership of application IP will also become fuzzy, with most applications and algorithms converting over time to generic forms- as many critical medical drugs now have.

Software and system suppliers will need to increasingly differentiate their products as focussed value-added services targeted to specific enterprises and industries, as IBM and others are currently doing in forging partnerships for their new customised Smart Planet infrastructure business models.

Service applications will therefore be differentiated primarily by the level of value they contribute to the enterprise. Enterprises in turn will become more strategically porous, linking their core processes and management decisions more organically with their partner service providers and the chameleon Cloud 2.0.

Thursday, October 15, 2009

Future Enterprise- Network Science

Network science will be a critical enabler of advanced enterprise management in the 21st century.

Major advances are already being made in applying the principles of network science to social, technological and business systems and it will be vital for the future enterprise to weave sophisticated network optimisation principles into all aspects of its operations.

Network science essentially involves analysing and managing the properties and dynamics of interconnected complex systems such as social groups, the Web, power grids, supply chains, markets, ecosystems and the brain.

Such networked systems are based largely on scale-free topologies. This is the natural architecture most relevant to the world around us, modelled on structures with a relatively small number of hubs, each with a large number of connections, and a much larger number of nodes each with relatively few links- a degree distribution broadly obeying a mathematical power law.
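
A sketch of how such a hub-dominated topology arises, using the well-known preferential attachment process (node counts, seed and parameters are arbitrary illustrations):

```python
import random

def preferential_attachment(n, m=2, seed=1):
    """Grow a graph where each new node links to m existing
    nodes chosen in proportion to their current degree- the
    classic recipe for a scale-free, power-law-like network."""
    random.seed(seed)
    targets = [0, 1]        # start from a single edge
    edges = [(0, 1)]
    # `targets` repeats each node once per link it holds, so a
    # uniform draw from it is degree-proportional selection
    for new in range(2, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(random.choice(targets))
        for old in chosen:
            edges.append((new, old))
            targets.extend([new, old])
    return edges

edges = preferential_attachment(200)
degree = {}
for a, b in edges:
    degree[a] = degree.get(a, 0) + 1
    degree[b] = degree.get(b, 0) + 1
# a few hubs collect far more links than the typical node
print(max(degree.values()), sorted(degree.values())[100])
```

Printing the maximum degree next to the median degree shows the signature of scale-free structure: a handful of heavily connected hubs against a long tail of sparsely connected nodes.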

Knowledge of network topology and dynamics allows for optimisation and prediction of the behaviour of complex system processes and is becoming increasingly vital in managing major business activities, via information systems that control vast numbers of interlinked transactions, resources, agents and events.

In a tightly coupled network such as a power grid or market system, the failure of a single node may force the failure of other nodes, resulting in cascades of failures and eventually triggering a catastrophic breakdown of the whole system, as in the recent collapse of the global economy.
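
A toy sketch of such a cascade, assuming a hypothetical four-node chain in which a failed node sheds its load evenly onto its still-working neighbours:

```python
def cascade(capacity, load, links, failed):
    """Propagate failures: a failed node's load is shed onto
    its surviving neighbours; any neighbour pushed over its
    capacity fails in turn."""
    queue = list(failed)
    down = set(failed)
    while queue:
        node = queue.pop()
        neighbours = [b for a, b in links if a == node and b not in down] + \
                     [a for a, b in links if b == node and a not in down]
        if not neighbours:
            continue
        share = load[node] / len(neighbours)
        for n in neighbours:
            load[n] += share
            if load[n] > capacity[n] and n not in down:
                down.add(n)
                queue.append(n)
    return down

# four nodes in a line; node 0's failure overloads the whole chain
capacity = {0: 10, 1: 12, 2: 12, 3: 30}
load     = {0: 10, 1: 8,  2: 8,  3: 8}
links    = [(0, 1), (1, 2), (2, 3)]
result = cascade(capacity, load, links, failed=[0])
print(sorted(result))
```

Even though node 3 has ample spare capacity, the accumulated load arriving through the chain eventually overwhelms it- the hallmark of cascading failure in tightly coupled systems.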

Examples of potential applications of network science principles include-

Economies and Markets- reducing the risk of global failure by ensuring economic networks are more robust; by closely monitoring market signals and adjusting the topology of nodes and links to reduce the risk of runaway feedback loops and conflicts between local interests and global efficiency.


Ecologies and Biodiversity- improving the sustainability of ecological systems in a period of global warming with the capacity to provide timely warning of species and resource collapse. The network model provides a powerful representation of ecological interactions among species and highlights global ecological interdependencies, which can then be re-modelled to manage risk.

Business and Finance- improving the capacity to make quality decisions regarding markets and product development, to avoid the future collapse of companies such as General Motors and Lehman Brothers. In these instances, poor decision-making was amplified by the systemic risk of runaway cascading financial asset dependencies, due to overloaded coupling strengths between nodes and indeterminate feedback loops in the myriad interconnected customer and supply networks.

The relevance of network science for the future enterprise is therefore threefold-

Firstly, many of the systems involved in business may be modelled in the future by scale-free networks, such as supply chain, investment, infrastructure, production and customer systems; with nodes representing suppliers, assets, products, consumers and customer groups.

Secondly, an organisation’s systems may be modelled by networks, with nodes representing process and activity decisions and the links represented by the dynamic flows of information feeding them.

Thirdly, the architecture of the enterprise itself may be viewed as a network of control flows between decision-makers and operational agents. As processes become more complex and time critical they will be increasingly automated, but the architecture- the information and decision-making structures and channels, will still need to be continuously optimised.

Future Trends

As forecast in previous posts, the enterprise of the future will be driven by networked architectures- patterns of linked decision processes- constantly morphing, reforming and adapting to a continuous flux of a changing global environment.
Today’s traditional hierarchical or even flat management models will be incapable of supporting tomorrow’s vastly more complex and competitive techno-social environment.

Such techno-social systems- technological layers operating within the larger social and physical environment that drives process application and development- will need a more integrated, adaptive and intelligent framework for achieving sound management capability, underpinned by network science.

Most real world transportation, manufacturing, computing and power infrastructure networks will be linked and monitored by sensors and tags embedded in largely autonomous networked societies; constantly adapting to global evolutionary dynamics.

Network science algorithms will be developed to monitor and engineer optimal decision topologies, critical thresholds and non-linear outcomes. These will combine with AI technologies to manage complex enterprise operational and management processes.

These algorithms will apply adaptive defence mechanisms, often providing counterintuitive approaches to the engineering and control of complex techno-social systems. Such techniques will be based on the manipulation of key nodes, links and pathways to induce intentional network behavioural changes- mitigating for example potentially catastrophic outcomes.

This will represent the new Network Science Management Paradigm of the 21st century.

Monday, October 12, 2009

Future Enterprise- The Networked Enterprise

The enterprise of the future will be driven by a networked architecture- patterns of linked decision processes; constantly morphing, reforming and adapting to a continuously changing social and business environment.

The traditional hierarchical management model of the 20th century will be incapable of supporting the vastly more complex and competitive 21st century paradigm of technological and social evolution.

Tomorrow's enterprise can be most effectively represented as a decision network model with decisions as nodes and information flows linking the relationships between them. This model represents an extremely powerful mechanism for understanding and optimising the adaptive enterprise of the 21st century- linked to but extending far beyond current simplistic process models.

Although process and object representations are a necessary and logical intermediary step in the evolution of enterprise system modeling and management, they fail to represent the underlying decision complexity of the real world and therefore fail to realise the true potential of a dynamic enterprise.

The core of the Networked Architecture will be the Decision Model, incorporating engineering methods based on decision pathways, with the capacity to dynamically route information and intelligence resources to critical decision-making agents in the enterprise.

This will not only involve the deployment of computing and information resources to adaptive decision nodes, but facilitate direct targeting of intelligence and problem solving capacity, enabling critical decision outcomes to be implemented in optimal time frames.

The latest 'Smart Planet' paradigm, in which the infrastructure and processes of the planet- whether manufacturing supply chains, electricity grids, water networks or traffic flows, are being re-engineered to optimise performance and achieve greener outcomes, will be the major driver for the networked enterprise of the future. The Smart Planet will demand that decisions be made more rigorously, efficiently, adaptively and therefore largely autonomously.

While SOAs- service-oriented architectures- focus on basic services, their capacity to implement complex decision processes is far from optimal. Current business intelligence and data warehouse software represents a halfway house towards this goal. But predictive techniques utilising AI will be the next stage, layered on current data mining and pattern recognition software and supported by a new generation of network-oriented database management systems.

Although the more far-sighted businesses are becoming aware of the need for such flexible small world network linkages, the support provided by today’s rigid organisational management architectures and philosophies has lagged well behind.

Tomorrow’s enterprise management must be far more pro-active and sentient in relation to environmental and structural change, avoiding being caught passively flat-footed in a bewildering flux of global evolution and competitive pressures.

Future Enterprise- The Intelligent Enterprise

The enterprise of the future will increasingly depend on a wide range of rigorous artificial intelligence systems, algorithms and techniques to facilitate its operation at all levels of e-commerce management.

As described in The Adaptable Enterprise blog, major decisions incorporating sophisticated levels of intelligent problem-solving will be increasingly applied autonomously within real time constraints to achieve the level of adaptability required to survive in an ever changing and uncertain global environment.

In addition web services will draw on the advances already made by the semantic web combined with the intelligent web 4.0.

A number of artificial intelligence techniques and algorithms are rapidly reaching maturity and will be essential components of the Intelligent Enterprise Architecture of the future.

Current techniques include-

Genetic Algorithms- achieve solution discovery and optimisation modelled on the evolutionary natural selection process- based on the genetic operators of crossover, replication and mutation and measured against a 'fitness function'. This technique is widely applied to solve complex design and optimisation problems.

Bayesian Networks- graphical models representing multivariate probability distributions- providing inference and learning based on cumulative evidence- widely used in medical diagnosis.

Fuzzy Logic- based on natural non-binary methods of decision-making- assigns a degree of truth between zero and one rather than a strict true or false value.

Swarm Intelligence- combines multiple cooperating components to achieve group intelligent behaviour.

Neural Networks- pattern discrimination techniques modelled on neuron connections- allows information inputs to be weighted and an activation threshold established.

Expert Systems- rule-based inference techniques targeted at specific problem areas.

Intelligent Agents- designed to adapt to the web's dynamic environment- an agent is designed to perform a goal and learn by experience- agents can also act collaboratively in groups, achieving higher levels of intelligence and making increasingly complex decisions autonomously.
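
As an illustration of the first technique, a minimal genetic algorithm sketch- replication of the fittest, single-point crossover and bit-flip mutation, measured against a trivial 'count the 1-bits' fitness function (all parameters are arbitrary):

```python
import random

def genetic_search(fitness, length=20, pop_size=30, generations=60, seed=7):
    """Minimal genetic algorithm: selection, single-point
    crossover and mutation, measured against `fitness`."""
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]         # replicate the fittest
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, length)   # single-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(length)        # occasional bit-flip mutation
            child[i] ^= random.random() < 0.1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# OneMax: fitness is simply the number of 1-bits in the chromosome
best = genetic_search(fitness=sum)
print(sum(best))
```

In a real design or scheduling problem the chromosome would encode candidate parameters and the fitness function would score a simulated outcome, but the evolutionary loop is identical.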

Future Trends

The above techniques will continue to be enhanced and packaged in different combinations to provide immensely powerful problem solving capability over time. The technology is slowly being applied discretely within business intelligence, data mining and planning functions of enterprise systems.

However AI is yet to realise its full potential within the enterprise by being applied to decision-making in a targeted autonomous fashion.
When this happens over the next decade, the quality of decision-making is likely to improve significantly, with a concomitant reduction in operational and management risk.

Sunday, September 13, 2009

Social Computing

Social computing is the crucial next step following the rise of personal computing in the evolutionary computing stakes.

The term Social Computing has many connotations. It has come into prominence over the past few years due to the growth of dozens of hugely successful social networks such as Facebook and more recently Twitter; providing a global way of keeping in touch and exchanging information between friends and acquaintances.

In the process of linking with others, a network of social relationships is established. Social computing is in essence a way of codifying and exploring these relationships between people and agents in social spaces- crowds, communities, cities, markets etc.

Social computing applications focus on web-supported online communities such as social networks, wikis, blogs and virtual worlds, providing feedback on interactive social comment, entertainment, scientific and medical advances and business services. It also supports techniques for collective forecasting and decision-making, utilising the combined power of groups and communities to solve difficult problems such as those associated with major disasters and conflict. In addition it is increasingly applied to help analyse how changing technologies and policies affect political, social and cultural behaviour.

A set of techniques and algorithms are now being developed based on network, cognitive and evolutionary theory, which will have major ramifications for the enterprise of the future. For example, business strategies and competitive markets have been increasingly characterised by turbulence, uncertainty and complexity. Consequently there is a need to model such markets and strategies as dynamic, evolutionary processes; that is, as complex adaptive systems.

In addition, data mining and simulation are applied to study social networks. Data mining can uncover patterns such as an organisation’s network structure, properties and relationships between suppliers and customers. Agent based social simulation or understanding social phenomena on the basis of models of autonomous agents has also grown tremendously in recent decades. Researchers use this approach to study a wide range of social and economic issues, including social beliefs and norms, resource allocation, traffic patterns, social cooperation, stock market dynamics and organisational decision-making.

There are as yet no effective, widely accepted methods for modelling complex systems, especially those involving human behaviour and social organisations, but collaborative agent-based artificial life is currently the most promising approach.

Using agent-based modelling, an enterprise can construct a virtual competitive market that allows business strategists a way of investigating a range of realistic scenarios. For example agent models can account for interactions between irrational and rational investors in stock market bubbles.
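
A toy sketch of such an agent-based market, assuming just two stylised agent types- 'fundamentalists' who pull the price back towards fundamental value and 'chartists' who extrapolate the latest move (all coefficients and the initial shock are illustrative):

```python
def simulate_market(steps=50, value=100.0, start=100.0):
    """Toy agent-based market: fundamentalist demand pulls the
    price towards `value`, while chartist demand chases the
    most recent price move, producing overshoot and correction."""
    prices = [start, start + 5.0]   # an initial upward shock
    for _ in range(steps):
        trend = prices[-1] - prices[-2]
        fundamentalist = 0.2 * (value - prices[-1])  # mean reversion
        chartist = 0.6 * trend                       # trend chasing
        prices.append(prices[-1] + fundamentalist + chartist)
    return prices

prices = simulate_market()
```

Even these two interacting rules reproduce a miniature bubble and crash: trend-chasers amplify the shock above fundamental value before mean reversion drags the price back, overshooting below it on the way.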

The social network paradigm in turn has created the concept of the Social Fabric, which mediates the interactions of the network’s agents and information flows in much the same way as spacetime mediates particle interactions and exchanges. It therefore allows us to explore the dynamic social and cultural aspects of the world in which we live or in which an organisation exists.

Cultural Algorithms- CAs- are one of several approaches to modelling the social fabric and applying social intelligence to solve optimisation problems, based on particle or agent swarming. They are a class of computational models derived from observing the process of cultural evolution in nature. Embedding an activity or problem in a social fabric can improve its performance or solution outcome, enabling the system to find a better solution than the original over a number of population iterations.
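
A minimal sketch of the agent-swarming idea these approaches build on- a basic particle swarm in which the shared best position acts as a rudimentary social fabric guiding every agent (the objective function and all parameters are arbitrary illustrations, not a full cultural algorithm with its belief space):

```python
import random

def particle_swarm(f, dim=2, n=20, iters=100, seed=3):
    """Minimal particle swarm: each particle is drawn towards
    its own best-known position and the swarm's shared best."""
    random.seed(seed)
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]              # personal bests
    gbest = min(pbest, key=f)[:]             # socially shared best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

# minimise the sphere function; the optimum is at the origin
sphere = lambda p: sum(x * x for x in p)
best = particle_swarm(sphere)
```

No individual particle knows the landscape, yet the population converges on the optimum- the socially shared information improving every agent's search, which is the property cultural algorithms extend with an explicit belief space.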

Many organisations have adopted Web 2.0 tools to stimulate innovation and productivity. They are also beginning to embrace social networks as a way of more effectively marketing services and tracking customer behaviour.
But applying social computing to improve the quality of decision-making, process optimisation and prediction scenarios is not yet on the horizon for most.

In the future social computing will be an integral component of the strategic and operational management of the future enterprise, at the same time transforming the web into a truly collaborative and social platform.