Thursday, January 7, 2010

Future Enterprise - Convergence of X-Reality

First there was Virtual Reality - the creation of simulated games, objects and avatars, and of narratives embedded in online virtual worlds such as Second Life and World of Warcraft, the latter with 15 million subscribers.

Then came Augmented Reality - created by integrating or mixing real objects and natural spaces with layers of related computer-generated data, images and designs, enabling real and virtual scenarios to be seamlessly combined. Basic forms of AR technology are already being used to give users a more immediate and accurate sense of context in practical applications such as engine repair, wiring assembly, architectural design and remote surgery.

But now emerging from the evolution of cyberspace is Cross Reality, or X-Reality, in which the boundaries between the real and the virtual are extended yet again and become increasingly blurred in the process.

X-Reality environments essentially fuse two technologies - sensor networks and virtual worlds - bringing real-world, realtime information into fully immersive virtual worlds and vice versa.

In hindsight, Virtual and Augmented Reality can be seen as early phases in an ongoing evolutionary transition towards the acceptance of virtual forms as part of everyday human cognition. In the process we have crossed the threshold into a new space that extends human perception and interaction, linking ubiquitous sensor and actuator networks based on low-cost microelectronic wireless technologies to create mixed realities.

The game is now on. By 2030, X-Reality will usher in an era of vastly extended reality, indistinguishable from the present world that has evolved over the period of life's existence. In other words, the world is growing its own electronic nervous system: a dense mesh of sensor networks, eventually connecting and encompassing every object on the planet, living and non-living. Such sensor networks help integrate physical reality into virtual computing platforms, generating the ability to react to real-world events in automated fashion. This is creating a revolutionary relationship between human society and the Web, and with it an urgent need to understand how our behaviour and future processes will become irreversibly shaped by cyberspace.

Cross-reality environments can therefore serve as an essential bridge between sensor networks and Web-based virtual worlds. The Web is already beginning to host an immersive 3D sensory environment that combines elements of social and virtual worlds with increasingly dense geographical mapping applications, allowing natural and urban ecosystems to be monitored and planned - in particular, their capacity to cope with climate change.

X-Reality will be implemented through the integration of several key design technologies, including:

Synchronously Shared Information - users will require open access to realtime data feeds, and the collection of information for analysis via centralised virtual command centres. Eventually control will devolve to decentralised, self-organising and autonomous management systems working in partnership with users.
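The idea of synchronously shared feeds can be illustrated with a minimal sketch: a simple in-process publish/subscribe hub in which every subscriber sees each published reading as it arrives. The class and data names here are hypothetical, not part of any particular platform.

```python
class FeedHub:
    """Minimal publish/subscribe hub: every subscriber receives
    each published sensor reading synchronously, so all users
    share the same realtime view of the data."""

    def __init__(self):
        self.subscribers = []  # callbacks to notify on each reading

    def subscribe(self, callback):
        """Register a callback to receive all future readings."""
        self.subscribers.append(callback)

    def publish(self, reading):
        """Deliver one reading to every subscriber."""
        for callback in self.subscribers:
            callback(reading)
```

In a real deployment the hub would of course be a distributed message broker rather than a single object, but the contract - one published reading, synchronously visible to all parties - is the same.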

Complex Realtime Visualisation - users must be able to visualise complex data, often delivered in 3D form, easily and flexibly. This will involve a high level of interactivity and collaboration, applying sensor-driven animation and intelligent agents or avatars.

Ubiquitous Sensor Portals - I/O devices designed for rich two-way cross-reality experiences, which can stream virtual and remote phenomena into the user's physical space, for example via video feeds and images uploaded from cameras. The process can also extend into the past, allowing realtime access to historical data streams - vital for trendline analysis in business and the sciences.
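A sensor portal's two roles - surfacing the live reading and replaying stored history - can be sketched as follows. This is an illustrative toy, with hypothetical names and in-memory storage standing in for a real sensor gateway.

```python
class SensorPortal:
    """Toy sensor portal: buffers readings from a physical sensor
    network, exposes the latest one for live rendering in a virtual
    space, and replays stored history for trendline analysis."""

    def __init__(self):
        self.history = []  # list of (timestamp, reading) pairs

    def push(self, timestamp, reading):
        """Receive one reading from the physical sensor network."""
        self.history.append((timestamp, reading))

    def live(self):
        """Most recent reading, or None if nothing has arrived yet."""
        return self.history[-1] if self.history else None

    def replay(self, start, end):
        """Historical stream between two timestamps, inclusive."""
        return [pair for pair in self.history if start <= pair[0] <= end]
```

The same interface serves both directions of the portal: `live()` feeds the virtual world's present, while `replay()` gives it access to the past.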

Smart Phones - these will increasingly provide an intuitive interface that facilitates group collaboration in an ad hoc manner, via gesture as well as touch, while offering the extreme mobility that outdoor users require. Augmented reality on smart phones that can query sensor networks and connect with shared online worlds paves the way for immersive mobile X-Reality.

Complex Event Processing (CEP) - sensor networks will be particularly valuable in the future for generating data that tracks complex phenomena in the real world, detectable by high-level pattern matching and logic inference techniques. Applications include monitoring building and infrastructure maintenance, manufacturing and supply chain operations via RFID, and environmental emergencies such as fire and pollution risks. In addition, CEP systems will help make sense of conflict zones, ecosystem health, field operative performance, and traffic flows and events.
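The essence of CEP - inferring a high-level event from a pattern of low-level readings - can be shown with a small sketch. Here a hypothetical fire-risk rule fires when one sensor reports a temperature over a threshold several times within a sliding time window; the thresholds and data format are illustrative assumptions, not a real CEP engine's API.

```python
from collections import deque

def detect_fire_risk(readings, threshold=60.0, window=30, min_hits=3):
    """Scan (timestamp, sensor_id, value) tuples in time order and
    raise an alert when a sensor exceeds `threshold` at least
    `min_hits` times within any `window`-second span."""
    recent = {}   # sensor_id -> deque of over-threshold timestamps
    alerts = []
    for ts, sensor_id, value in readings:
        if value < threshold:
            continue                       # low-level event: ignore
        hits = recent.setdefault(sensor_id, deque())
        hits.append(ts)
        while hits and ts - hits[0] > window:
            hits.popleft()                 # expire events outside the window
        if len(hits) >= min_hits:
            alerts.append((ts, sensor_id)) # high-level event: fire risk
            hits.clear()                   # reset so the alert fires once
    return alerts
```

Production CEP engines express such rules declaratively and handle out-of-order events, but the pattern-over-a-window structure is the same.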

By 2030 most of our lives will be immersed in this shared reality. It will also redefine how we manage the vast and growing repository of digital information on the Web - linking art, entertainment, work, science and daily life routines such as shopping, gaming and travel.

The Future Enterprise will be equally enmeshed, dependent on managing its marketing, production and logistical operations and services through the medium of X-Reality.


mr. burlingame said...

David - very cool. Only comment would be that sensor data may not be streamed so much as it is "squirted", which is a big distinction in terms of the performance and TCO of a sensor network in this scenario. DASH7 provides a good view as to how this will work with AR.

David Hunter Tow said...

Apologies for missing the comment - greatly appreciate the point you make. Am reading up on DASH7 now.