Thursday, November 22, 2012

Future Enterprise- The Future of Services


The Director of the Future Enterprise Research Centre, David Hunter Tow, predicts that the current explosion of new services will trigger the biggest treasure hunt in the history of computing technology.
The Services Sector is currently in turmoil with thousands of startup companies cashing in on new opportunities to re-engineer traditional ways of doing business- and this is just the beginning.

Every process is currently being transformed into a new service- not just in traditional service sectors such as retail, media, education, healthcare, tourism and finance, but also in industry areas such as manufacturing, with made-to-order 3D printing techniques; medical processing, offering personalised DNA sequencing and diagnostics instantly on an iPhone chip; and solar energy and water purification systems cheaply available for domestic use in developing countries.
There’s not one established service process that’s not being seriously disrupted by smaller more agile independent players, leaving the lumbering giants that once dominated commerce in the 20th century stumbling blindly in their wake.

All major service sectors are currently being carved up, their key functions hived off and new innovations successfully introduced in competition with those of the original gatekeepers, who continue to guard their crumbling IP parapets while new knowledge is generated by the terabyte.
What is catalysing this frenzy and where is it heading?
A number of convergent factors are involved in this 21st century phenomenon: breakthroughs in new technology- mobile computing, augmented reality, artificial intelligence and information analysis- now cheaply available; massive social change in the form of the deregulation of knowledge generation and access, spurred on by the social media and global Web; and an unstoppable surge in the creative potential of a new generation steeped since birth in the cyber revolution; all combined with the very low cost of market entry for innovative entrepreneurs.

Of course online retail and marketing started the ball rolling in the nineties and has never looked back. Traditional main street bricks and mortar retail has been fighting a ferocious rearguard action, but by and large it’s been a losing battle ever since. The smart retailers hedged their bets by combining new boutiques and online websites, but overall, new leaders in the revolution such as Amazon, eBay, Apple and Google as well as thousands of smaller specialists, just kept upping the ante with greater global choice, faster service delivery and deeper discounts.
Then came the new wave of targeted retail service apps nibbling away at the leaders- companies like Foursquare peddling Deals of the Day- and now over 130,000 Android and 300,000 Apple apps covering self-checkout, barcode scanning, loyalty programs, coupon and discount offers, retail location discovery, best buys, augmented reality advertising, customer reviews, consumer preferences and recommendations and mobile platform payment functions- all making the online and store shopping experience easier and more exciting. And for developers, the social media channels to promote them inexpensively.

In the meantime the traditional Media and Advertising industries were hurting. Gone were the salad days of broadsheets generating the golden streams of classified advertising revenue from job, real estate and used car advertisements; paying for the packed newsrooms and inflated expense accounts in five star global hot spots.  
In their place came one-stop outsourced stories and editorials with duplicated online headlines.

Again a frenzy of online experimentation began. But pay walls had only limited success, even for heavyweights such as the New York Times and Wall Street Journal. Classified and banner advertising continued to haemorrhage, migrating to web sites at greatly reduced prices, while the social and alternative media, led by a myriad of young independent operators, grabbed the best headlines and news stories via cheap phones.
As a result the print operators such as News Ltd are only pale shadows of their former selves and have quietly retreated to the more glamorous world of cable television and film. But this is only a temporary reprieve, as low budget independent film and documentary makers gain ground on YouTube and in arthouse cinema seats, shooting with low cost video cameras while chasing the more interesting reality footage; all supported by the citizen journalists and freelance bloggers desperate for a voice in the brave new cyber world.

As a result of this revolution the power of traditional media services has seriously waned and is likely to have largely disappeared within a few decades, replaced by countless personalised web channels and DVD and gaming startups, controlled by myriad smaller, more energised groups and individuals.
At the same time the Advertising industry is in a monumental bind- caught in the headlight glare of change, trying to find the magic brand formula for clients by mixing and matching the traditional and the burgeoning new media. But apart from reverting to Google and Facebook, it is not having a lot of success with either, unable to capitalise effectively on the thousands of creative local specialists and the cornucopia of apps.

Over time traditional advertising will therefore become less significant to major brands as it transitions to an infotainment format, with thousands of independent product sites and apps providing instant comparative advice to consumers without the retrospin of big business.

Education
But the big revolution in services- the game changer of the 21st century, will come from easy global access via mobile online learning to high quality inexpensive education. This, according to educators, will turn every mobile phone into a knowledge portal and return education to the golden age of sharing ideas among communities of scholars, releasing them from a boring classroom environment with second rate lecturers more interested in their next overseas conference schedule.

Interestingly the revolution is being led from the inside by some of the biggest and most hallowed institutions- Harvard, Stanford, MIT and Yale. Suddenly global tertiary level courseware, and soon secondary level as well, is available at very low cost from these prestige US universities through massive online platforms such as Coursera; while third party reference sites across the Web such as Wikipedia, Google, Microsoft and Facebook, plus a host of talented independent specialists, will provide countless training services by integrating and coordinating domain related knowledge.
This is the next phase in the democratisation of the world’s storehouse of information, driven by the need to realise the potential of the vast under-educated populations of Africa, Asia and the Middle East that have missed out on the planet’s opportunities. It will allow anyone with a mobile phone or tablet to access the same level of knowledge regardless of location, income or the availability of local training resources. 

Over the coming decades therefore the services of learning and education will undergo a profound shift, from the traditional classroom/face to face method of knowledge transfer to a much more abstract model, where teaching will be largely separated from its current physical infrastructure, such as classrooms and campuses.

It will also be linked to the Cyber Revolution- transforming the world’s knowledge base into a vibrant multimedia forum- using the latest 3D, virtual reality and gaming technologies- all delivered by smart mobile and embedded multi-media kinetic devices linked to the Intelligent Web.

Healthcare
Now the medical and healthcare services sector is also ripe for revolution. Phone apps are increasingly available to act as remote monitors for home based medical and health support- the remote diagnosis of life threatening conditions, and algorithms to calculate correct drug dosages and interventions for acute illnesses such as diabetes, malaria and HIV.

This revolution has been driven to a large extent by the healthcare needs of half the planet’s population that still live in dire poverty, unable to afford traditional life-saving hospital support or medication.
These and many other diagnostic and treatment services are now putting patients at the centre of the management of their own healthcare with the help of trained volunteers, bypassing the bottlenecks involved in the traditional delivery of medical services by scarce qualified practitioners.

Future services will also be based on the accessibility of whole-of-life eHealth records across both the developed and developing world, eventually allowing the creation of online global health records from pre-birth to death, providing personalised remote support services delivered on an iPhone or community personal computer. Within a decade, health records will include the sequencing of an individual’s genome as a vital diagnostic service at a cost of a few dollars.

A number of other technological breakthroughs will mark the expansion of new healthcare services within the next few decades, including: stem cell therapies to repair human tissue and organs, reversing heart disease for example; prevention of cancers and neurodegenerative diseases such as breast cancer and Alzheimer's; sensory repair such as early retinal and corneal implants; prosthetics including neuron-controlled limbs; brain/nervous system interfaces, overcoming spinal paralysis using brain signals; and interactive humanoid robots to provide human companionship and physical support.

All will be available as relatively inexpensive services once the enabling technologies have been approved. Why? Because if the major healthcare companies don't provide them at an affordable price, entrepreneurial groups will, as occurred with the generic drug revolution in developing countries when Big Pharma refused to drop their prices.

Finance/Banking
The Finance and Banking industry has been a sitting duck for radical change for a long time- getting bigger and more obese with minimal outside competition – seeking to control ever more functions in a frenzy of greed- from investment, transaction processing, payments, exotic specialist derivatives, consumer credit, foreign exchange, mortgage provision, money transfers, advice on mergers, trading and anything else that shows a hint of making easy money and inflating their balance sheets. 

There have been numerous exposures of the underlying level of corruption within the finance and banking industries, to the point of defrauding their own customers and incurring horrendous trading losses by rogue dealers through sloppy oversight, in the process threatening bankruptcy for themselves and their clients and culminating in the GFC.  But it didn’t deter them for long and despite some fresh regulations and a massive infusion of taxpayer dollars, their insatiable greed continued to explode.
But if government regulators have failed to rein them in, a number of agile competitors offering cheaper, safer and more convenient services may do the job for them.

The major looming battle is between the traditional finance industry and the global technology giants such as Apple, Google and PayPal- using their skills at creating innovative software to provide payment and credit card services via wireless apps that allow mobile phones to store loyalty and credit card information, make payments and transfer money. Technology-poor African countries such as Kenya have taken the lead in these services of convenience and already provide perfectly viable phone money transfer services via text, bypassing expensive banking services.

Now the bloated goliaths are fighting back with their own brand apps. Banks are also using contactless near field technology to convert smartphones into mobile credit and payment devices. But it may already be too late, as the genie has escaped the bottle and the smart entrepreneurs realise the banking emperor has no clothes, except Wal-Mart hand-me-downs.
It is likely that banks in their present form will cease to exist within the decade, effectively disembowelled by smarter consumer and business service providers. They may become primarily back office transaction processors and routine mortgage providers, with a veneer of deal making, offering a line of credit for smaller uncomplicated businesses. All other functions will become the province of highly skilled specialists.

And so the frenzy of creativity and service disruption will continue in all areas of commerce and industry, as the current generation of software engineers and innovators becomes acutely aware that the rules that propped up the old corporate structures are obsolete.
The old software guard that controlled the boundaries of commerce so tightly is also increasingly ripe for the picking, because the original rules governing old sectors such as retail, media, manufacturing, banking, pharmaceuticals, photography, music and publishing just don't hold up anymore. Stuck in the quicksand of the past, they don't reflect the changing social currents of the new era. Therefore the software and systems houses that propped them up and contributed to their stranglehold, such as SAP, Oracle, IBM, HP, Yahoo and Cisco, are also becoming irrelevant, weighed down by their own legacy technologies and now being systematically cannibalised by more agile and visionary players.

Take any industry. Who buys the software that was developed for it in the 80's or even 90's? Very few, except the dinosaurs locked in by exorbitant long term maintenance agreements- and they are now paying a very high price for their conformity, unable to adapt or switch systems before being swallowed by the next wave of innovation. Their present systems just don't reflect the changing ways of doing business or the social norms that the new generation of consumers has come to expect.
The technology keeps shifting and each time it moves it exposes the soft underbelly of the existing services and providers. Those like IBM that have survived have had to radically remodel their businesses. In the case of IBM – from hardware to software to services and now to ‘smart planet’ systems.

The next generation of providers- Google, Apple, Amazon and Facebook, along with thousands of smaller service developers- has already moved in on these crumbling bastions. But even this new order is in turn being held to account by a host of smaller creative startups. And no matter how often the established leaders of any systems generation try to reinforce their monopolies by swallowing the smaller, more agile enterprises, they are constantly outflanked by the tide of new knowledge and innovation.
And so the dance goes on – faster and faster and the treasure trove of potentially lucrative services keeps growing.

Saturday, October 6, 2012

Future Enterprise- The Future of Big Data


The Director of The Future Enterprise Research Centre- David Hunter Tow, predicts that Big Data has the potential to unlock greater value for the enterprise and society, but in the process will radically disrupt traditional organisational functions at all levels- particularly between IT departments and decision makers.
The connotation of the term ‘Big Data’ is at best extremely fuzzy and at worst highly misleading. It implicitly promises major benefits in direct relationship to the quantity of data corralled by an organisation. But Big Data is also hedged with constraints, contingencies and uncertainties, requiring the solution of a number of associated problems before it can translate into significant enterprise benefits.
To unlock real value in the future will require-

The design of more responsive enterprise and knowledge architectures based on a network model, allowing for the delivery of realtime adaptive decision responses;
A closer relationship between the business and its social environment, enabling the enterprise to better understand the Big Picture;
The introduction of common data standards and the streaming of seamlessly integrated multiple data types;
A quantum leap in intelligence through the application of more powerful artificial intelligence based analytic, strategic and predictive modelling tools;
An upgrade in the quality and security of current storage and processing infrastructure beyond current cloud architectures.

Networked Architectures
A more flexible and responsive knowledge architecture for the future enterprise must reflect the reality of increasingly complex decision making, allowing far more agile reaction within the fast changing competitive environment of the 21st century.

This will involve the introduction of networked models at both the enterprise and information level, with nodes represented by decisions and information flows linking the relationships between them; eventually capable of autonomous adaptation within a constantly evolving social, technological and business environment.

This model, based on optimised decision pathways- with the capacity to dynamically route information and intelligence resources, supported by autonomous agent software, to the appropriate decision makers- will be the core driver of Big Data architectures. It must be capable of integrating and analysing streams of information from multiple sources in real time, channelling computing and information resources directly to relevant decision nodes and enabling critical decisions to be implemented in optimal time frames.
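The routing idea at the heart of such a model can be sketched as a simple publish/subscribe network, in which each decision node declares the information streams it depends on and the network channels new data directly to the relevant nodes. The following Python sketch is purely illustrative- the class names and topics are invented for this example, not drawn from any particular product:

```python
from collections import defaultdict

class DecisionNode:
    """A node in the decision network, subscribing to the data topics it needs."""
    def __init__(self, name, topics):
        self.name = name
        self.topics = set(topics)
        self.inbox = []

    def receive(self, topic, payload):
        self.inbox.append((topic, payload))

class DecisionNetwork:
    """Routes incoming information streams directly to the decision nodes
    that subscribe to them, rather than through a central hierarchy."""
    def __init__(self):
        self.routes = defaultdict(list)   # topic -> subscribed nodes

    def add_node(self, node):
        for topic in node.topics:
            self.routes[topic].append(node)

    def publish(self, topic, payload):
        for node in self.routes[topic]:
            node.receive(topic, payload)

# Hypothetical example: pricing and logistics decisions fed by realtime streams
net = DecisionNetwork()
pricing = DecisionNode("pricing", ["competitor_prices", "demand"])
logistics = DecisionNode("logistics", ["demand", "fleet_status"])
net.add_node(pricing)
net.add_node(logistics)

net.publish("demand", {"region": "EU", "index": 1.4})
net.publish("fleet_status", {"trucks_available": 12})
```

In a real implementation the nodes would of course trigger analytic agents rather than simply queue messages, but the principle- information routed by decision relevance rather than by organisational hierarchy- is the same.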

 
The Big Picture

The capacity of the enterprise to mesh with its physical and social environment will become increasingly vital for its survival in the future. Without such a grounded relationship, poor decision making based on an inwardly focussed mindset will continue to drive many large enterprises to bankruptcy.

It will also be insufficient to plan just one, two or five years ahead. Although near-term sales and cost forecasts are important, understanding the bigger shifts likely to impact all businesses in a future dominated by climate change, geopolitics and globalisation will be more essential to survival- allowing a better balance of creative planning, adaptive resilience and risk avoidance.

In fact enterprises- particularly the biggest- have a poor history of seeing the big picture. The larger the enterprise, the more likely it is to believe in its own invincibility in the marketplace.

In more recent times both Ford and GM virtually went bankrupt and had to be bailed out by the public purse because they would not or could not see the obvious shift in consumer sentiment to smaller cars with lower fuel usage. Then there were AIG, Lehman Brothers, Citibank and Fannie Mae, which also thought they were too big to fail. And now the giants Kodak and Sony, among many others, are struggling.

In all the above cases, enterprise management ignored the signals coming loud and clear from their environments via consumers and customers, through a combination of ignorance and arrogance. In the meantime other more agile companies such as Microsoft picked up the trend towards desktop computing and exploited the opportunities left by IBM. But then Microsoft almost lost the plot to Google by not seeing the emerging power of the Internet as the dominant driver of information in today's society.

So despite the use of the latest business intelligence software busily scavenging for patterns in Big Data generated from past customer and financial data, standard industry forecasts and some glitzy dashboard software, typical BI analysis without the guidance of the big picture will be virtually useless as a pathway to an uncertain future.

Big Data now has the potential to open the door to this broader and more inclusive vision, which has rarely been a priority for most enterprises in the past. But that will now change. Survival, particularly for larger enterprises, will be based on interpreting the bigger picture's impact. If managed effectively, Big Data will be the catalyst that provides a hedge against this myopia- but only if management's mindset becomes more flexible and humble.

The Intelligent Enterprise
Big Data will also trigger the need for the enterprise to become much smarter, utilising the latest artificial intelligence and statistical techniques such as evolutionary algorithms, neural networks and Bayesian logic. The latest 'Smart Planet' paradigm shift- in which the infrastructure, business and environmental processes of the planet are being re-engineered to optimise performance and achieve more sustainable outcomes- will also be a major driver for the networked smarter enterprise of the future.

The Smart Planet imperative will demand that decisions relating to society’s survival and wellbeing be made more rigorously, efficiently, adaptively and therefore autonomously.
But a Smart Planet revolution without a Smarter Enterprise mindshift won’t compute.

Part of that mindshift will be a far greater emphasis on the future. The past is still a poor predictor of what's to come. To improve the quality of decision-making, serious realtime forecasting will need to be boosted. Such systems will rely on information collected from numerous internal and external sources within an environment, using artificial intelligence algorithms and rules to analyse vital signals and interconnected trends and patterns that generate complex outcomes. The results of this analysis will then be channelled autonomously to the appropriate decision-makers for action.

But despite a range of mathematical improvements in our foresight and modelling methods, developed in tandem with a broader understanding of scientific and social principles, the corporate capacity to forecast effectively has been sadly lacking when data doesn’t follow obvious trends or when the signals of emerging change are faint.

Most forecasting textbooks traditionally list a number of well-developed techniques based around time series projections, regression analysis, Delphi and scenario-based expert opinion, as well as artificial neural networks and simulation modelling. But these have usually failed to predict the future in times of abrupt change within the broader physical, social and economic environment- such as the recent global financial crisis or the Arab democratic revolution.
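A minimal example shows the limitation. Single exponential smoothing- one of the standard textbook projections- blends each new observation into a running forecast; it tracks a steady series tolerably well but lags badly after an abrupt shock. The figures below are invented purely for illustration:

```python
def exponential_smoothing(series, alpha=0.5):
    """Classic single exponential smoothing: each step blends the latest
    observation with the previous forecast, weighted by alpha."""
    forecast = series[0]
    for x in series[1:]:
        forecast = alpha * x + (1 - alpha) * forecast
    return forecast

# A steady series: the projection tracks it closely
print(exponential_smoothing([100, 102, 101, 103, 102]))  # → 102.0

# The same series ending in an abrupt crash: the forecast lags far behind
print(exponential_smoothing([100, 102, 101, 103, 40]))   # → 71.0
```

The projection never anticipates the break- it can only average it in after the fact, which is precisely the weakness described above.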
The next phase in this evolution will be models powerful enough not just to deliver predictions but to accurately prioritise the resources needed to manage those predictions, develop project plans for their implementation and then track the results to check on their effectiveness- in other words, to apply textbook automatic feedback control principles much more rigorously. After a bridge or power grid has been built, its maintenance needs to be permanently and autonomously managed to prevent future catastrophic failures or escalating rebuild costs, tracked by the latest generation of intelligent sensors.
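The feedback principle itself is elementary. The sketch below, using invented strain-gauge readings, compares each sensor value against a setpoint and emits a corrective action whenever it drifts out of tolerance- the bare bones of the monitoring loop described above:

```python
def monitor(readings, setpoint, tolerance):
    """Textbook feedback control in miniature: compare each sensor reading
    against the setpoint and emit a corrective action when it drifts."""
    actions = []
    for t, value in enumerate(readings):
        error = value - setpoint
        if abs(error) > tolerance:
            actions.append((t, "reduce" if error > 0 else "increase"))
    return actions

# Hypothetical strain-gauge readings from a bridge sensor
print(monitor([1.0, 1.1, 1.6, 0.9, 0.2], setpoint=1.0, tolerance=0.3))
# → [(2, 'reduce'), (4, 'increase')]
```

A production system would of course act on the deviations continuously rather than report them in batch, but the sense-compare-correct loop is the essence of the feedback control being advocated.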

This common sense methodology of feedback and continuous monitoring of outcomes has been sadly lacking in many enterprises but will now be essential if business and society is to survive the onslaught of massive future shock. It will involve scanning for emerging problems, aggregating data streams from millions of internet-connected sensor systems and monitoring the pulse of the global environment- not just at the business level but also at the political, technological and environmental flashpoints.

Processes based on Big Data therefore need to be recognised as the beginning of our civilisation's survival fightback: applying adaptive and responsive techniques based on massive datasets- largely autonomously, because the datasets are so big and complex that manual methods will fail- to the optimisation of the design, maintenance and operation of every process and application on the planet.

The Cloud Solution

Collecting and storing the tsunami of data unleashed by Big Data is a major stumbling block to the above goal, already creating unforeseen problems for the average enterprise in the form of exponentially exploding datasets- as the science communities of astronomers, biologists, cosmologists and particle physicists have already discovered.

Traditional relational SQL databases, SOA architectures and even batch-oriented Hadoop platforms are not optimised for such massive real time processing, particularly as much of the data in the future will be unstructured and garnered from heterogeneous sources such as web pages, videos, RSS feeds, market intelligence, statistical data, electronic devices, instrumentation, control systems and sensors.

But just in time, Cloud processing management has emerged, offering an alternative solution which few large organisations will be able to resist. Now they will have the seductive choice of offloading the complete data management side of their operations to third parties in return for economies of scale and flexibility. The tradeoff is partial loss of control, but over time, providing security, backup and service levels are maintained at a rigorous standard, the organisation should benefit by being able to improve its focus on the core critical aspects of its operations. Only time and verification will tell if this tradeoff can deliver on its promise.
The IT department will become virtually invisible to decision-makers, its primary task being to select the appropriate tools to implement enterprise strategies.

Cloud computing will eventually offer a complete managed haven of services for Big Data- software, security, processing, storage, hardware and infrastructure. All are now in the offing. But in the near future, Knowledge as a Service is likely to presage the greatest change within the Future Enterprise ecosystem.
Real-time integration of disparate data and application methodologies is a key challenge here, with the current conventional multi-staged approach being: build a data warehouse to consolidate storage, aggregate the information sources, select a BI tool and then process user queries. But this approach is already proving expensive, slow and error prone.

A number of innovative platforms are being developed in this sector based on enterprise information streaming models. These provide a virtual unified view of the data stream without first transferring it to a central repository and also point the way to the next step of fully autonomous tool selection and decision support.
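The streaming idea can be illustrated in a few lines of Python: rather than loading every source into a warehouse first, sorted event streams are merged lazily into one virtual, time-ordered view. The sources and records here are invented for the sketch:

```python
import heapq

def unified_view(*streams):
    """Merge several time-stamped event streams into one virtual view,
    without first copying them into a central repository.
    Each stream must yield (timestamp, source, record) tuples in time order."""
    yield from heapq.merge(*streams)

# Hypothetical sources: a web analytics feed and a sensor feed
web = iter([(1, "web", "page_view"), (4, "web", "checkout")])
sensors = iter([(2, "sensor", "temp=21C"), (3, "sensor", "temp=22C")])

for event in unified_view(web, sensors):
    print(event)
```

Because the merge is lazy, each event is pulled from its source only when the consumer asks for it- the unified view exists virtually, which is exactly the property the streaming platforms exploit.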

Future Shock

It is now clear that the global environment is placing enormous pressure on all organisations, not just from a competitive perspective but from the need to upgrade ethical and sustainability standards. This will continue at an accelerating pace. The changing technological environment in particular is already disrupting entire service industries- e-commerce especially: retailing, banking, trading and supply. Now a second wave of service industries- manufacturing, healthcare, education, media, advertising, legal, hospitality and travel- is being turned upside down by the revolution.

In this revolution Big Data is acting as a major catalyst, offering the glittering prize of untold added value- but it will generate this cornucopia only if it is also agile and precisely targeted, meeting the specific needs of multiple domains. Specialised decision-making in finance, biology, medicine, cosmology, pharmaceuticals, government, media and legal applications will require different classes of algorithmic support. And even domain analytic specialists may soon be obsolete, as expert domain algorithms generated from the ever expanding cumulative knowledge of the Web begin to dominate the decision process. Critically however, such algorithms will need to be continuously verified and adapted within a shifting social and business environment.

Because of the rate of innovation and subsequent disruption, service-based systems will therefore need to be self-adaptive- applying intelligent algorithms to support new options as well as the growth of collaborative ventures involving multiple stakeholders, as commonly occur in industries such as hospitality, travel and real estate.

New technological innovations such as smartphones and tablets are also increasingly filling mobile gaps and shortfalls in existing services. For example, by starting to displace traditional credit card/banking in the lucrative payments market and enabling the personalisation of healthcare and educational services in remote areas.

Such upgrades in the service sector imply the use of increasingly pervasive Big Datasets with low access latencies. Response timeframes are critical, with cumbersome reporting and query tools far too slow for today's end user needs. So the days of manual intervention in the decision process are drawing to a close, as global markets generate decisions involving hundreds of variables that are required instantly.

So the stage is set. The filtering, pattern matching and super-intelligent analytic processing required to make sense of the overload of Big Data will mean that human intervention in the decision process will inevitably become a significant bottleneck.

But the future smart enterprise must have the flexibility to focus and deploy its cooperative intelligence autonomously, at all levels of the organisation. This will be a proactive response to new opportunities and competitive pressures in the marketplace.

The volume and complexity of decision-making will continually and rapidly increase over time in response to the changing social, geopolitical and technological environment. The resulting network interactions involving customers, supply chains, services, markets and logistics will eventually make it impossible for humans to compete. It will become just too complex and time-consuming even for dedicated teams of humans to manage- just as it is impossible today to manually control complex trading, production and marketing operations, chemical plants or space missions.

The IT centre will rapidly transform into tomorrow's Knowledge Technology Centre- the KT Centre. This will place further pressure on the need for real-time, high quality decision-making.

By 2030 humans will become partners in enterprise decision processes powered by intelligent algorithms based on realtime knowledge outcomes plus research encapsulated in the Intelligent Web. But over time their input, as for airline pilots and fast train drivers today, will be largely symbolic.

Big Data therefore will have provided a major catalyst for an extreme makeover of the future enterprise, the business environment, society and the planet.

 






Sunday, May 6, 2012

Future Enterprise- The Future of Algorithms

Algorithms are taking over the world – at least the computational part of it- and that could be a good thing.
In a real sense the rise of algorithms is a sign of human intellectual maturity- of our capacity as a society to manage technology and science at a sophisticated level. It represents the coming together of our mastery of computational science with the capacity to abstract the key essence of a process- to generalise and commoditise it.
The ubiquity of algorithms is in fact the next logical step in our technological evolution as a species and perhaps marks our evolution towards super-species status.

Algorithms translate a process into instructions that a computing machine can understand, based on a mathematical, statistical and logical framework. They are usually designed to minimise and rigorise the computing steps involved in a process or formula, and therefore to maximise its solution efficiency in terms of computing resources, while at the same time improving its accuracy and verifiability.

Algorithms come in all shapes and sizes and have been around a long time- well before the official computer age. Euclid applied them over two millennia ago in formalising his geometry; they were developed further by Indian mathematicians, documented in the 9th century by the Muslim scholar al-Khwarizmi- from whose name the word derives- and later used by Newton in formalising his theories of the forces of nature.
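Euclid's greatest-common-divisor procedure is still the standard first example of an algorithm, and it translates directly into a few lines of modern code:

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace the pair (a, b) with
    (b, a mod b) until the remainder is zero; the survivor is the GCD."""
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # → 21
```

Two thousand years on, the same handful of steps runs unchanged on any Turing-equivalent machine- a neat demonstration of how abstracting the essence of a process makes it a commodity.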
In the future almost every process or method will be converted to an algorithm for computational processing and solution, as long as it can be defined as a series of mathematical and logical statements, ideally capable of being run on a Turing machine. A Turing machine is a mathematical model of a general computing machine, invented by Alan Turing, which underpins our current computing devices. Turing machines come in a variety of flavours, including deterministic, quantum, probabilistic and non-deterministic, all of which can be applied to solve different classes of problems.
But regardless, any computation, even one based on alternate logical models such as cellular automata or recursive programming languages, can also in theory be performed on a Turing machine. The brain however, because of its enormous non-linear problem-solving capacity, has been proposed by some researchers as a super-Turing machine, though the jury is still out as to whether it falls into a different computational class from the standard Turing model.
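The deterministic Turing machine itself can be sketched in a few lines (an illustrative toy, not from the original text; the rule format is an assumption of this sketch). The machine below reads a binary tape, flips every bit, and halts when it reaches a blank cell:

```python
def run_turing_machine(tape, rules, state="start", blank="_", max_steps=10_000):
    """Simulate a deterministic Turing machine.
    rules maps (state, symbol) -> (new_state, write_symbol, head_move),
    with head_move in {-1, 0, +1}. Returns the final tape as a string."""
    cells = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# A two-state machine that inverts each bit, then halts at the first blank.
flip_rules = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}
print(run_turing_machine("1011", flip_rules))  # 0100
```

Despite its simplicity, this table-driven model is computationally universal: any algorithm expressible as such state-transition rules can in principle be run on it.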

Many algorithms incorporating powerful standard mathematical and statistical techniques- error correction, matrix processing, random number generation, Fourier analysis, ranking, sorting, Mandelbrot set generation and so on- were originally coded as computer routines in languages dating from the 50s and 60s such as Fortran, Algol, Lisp, Cobol and PL/1, with C++ following later. Common algorithms were subsequently incorporated in mathematical libraries such as Mathematica, making them easier to access and apply.

They have now infiltrated every application and industry on the planet, applied for example to streamline and formalise operations in manufacturing, production, logistics and engineering. They cover standard operational control methods such as linear programming, process control and optimisation, simulation, queuing theory, scheduling and packing, critical path analysis, project management and quality control.
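A tiny example gives the flavour of this class of operational algorithm (a generic textbook sketch, not drawn from the original text): in scheduling theory, running jobs in order of shortest processing time provably minimises the average completion time across all jobs:

```python
def average_completion_time(durations):
    """Average time at which each job finishes, running in the given order."""
    elapsed, total = 0, 0
    for d in durations:
        elapsed += d        # this job finishes after all earlier ones
        total += elapsed
    return total / len(durations)

def spt_schedule(durations):
    """Shortest-processing-time-first: optimal for average completion time."""
    return sorted(durations)

jobs = [7, 2, 5, 1]  # processing times in arbitrary units
print(average_completion_time(jobs))                # 11.25 in arrival order
print(average_completion_time(spt_schedule(jobs)))  # 6.75 under SPT
```

The improvement comes purely from reordering: short jobs no longer queue behind long ones, so the same work yields lower average waiting, which is exactly the kind of efficiency gain these operational algorithms deliver at industrial scale.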

Engineers and scientists increasingly link them to AI techniques such as Bayesian and Neural networks, Fuzzy logic and Evolutionary programming, to optimise processes and solve complex research problems.
But over time, following the flow of computerisation, the ubiquitous algorithm has extended into every field of human endeavour including- business and finance, information technology and communication, robotics, design and graphics, medicine and biology, ecosystems and the environment and astronomy and cosmology; in the process applying data mining, knowledge discovery and prediction and forecasting techniques to larger and larger datasets.

Indeed, whenever new technologies emerge or mature, algorithms inevitably follow, designed to do the heavy computational lifting, allowing developers to focus on the more creative aspects.

Other algorithmic applications now cover whole sub-fields of knowledge such as- game theory, machine learning, adaptive organisation, strategic decision-making, econometrics, bioinformatics, network analysis and optimisation, resource allocation, planning, supply chain management and traffic flow logistics.

In addition, more and more applications are being drawn into the vortex of the algorithm which were once the province of professional experts including- heart and brain wave analysis, genome and protein structure research, quantum particle modelling, formal mathematical proofs, air traffic and transport system control, weather forecasting, automatic vehicle driving, financial engineering, stock market trading and encryption analysis.

A number of such areas also involve high risk to human life, such as heavy machine operation, automatic vehicle and traffic control and critical decisions relating to infrastructure management such as dams, power plants, grids, rolling stock, bridge and road construction and container loading.
The Web of course is the new playground for algorithms and these can also have far reaching impacts.
For example in 2010 the Dow Jones Industrial Average dropped some 900 points in a matter of minutes in what is now known as the Flash Crash. For a few minutes several trading algorithms were locked in a death dance, in much the same manner as two closely bound neutron stars before implosion, triggering a massive collapse in the value of the US stock market. It was a wake-up call: in any complex system involving multiple feedback loops, unforeseen combinations of computational events will take place sooner or later.
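The destabilising effect of coupled feedback loops can be sketched in a few lines (a deliberately crude toy model, not a reconstruction of the actual Flash Crash; all parameters are invented). Two momentum-following traders each sell in proportion to the price drop the other just caused, so a small shock is amplified into a runaway fall:

```python
def simulate(steps=30, shock=-1.0, gain=0.6):
    """Two momentum traders each sell in proportion to the last downward move.
    With combined gain above 1.0 (here 0.6 + 0.6), the loop is self-reinforcing."""
    price, last_move = 100.0, shock
    history = [price]
    for _ in range(steps):
        sell_a = gain * max(0.0, -last_move)  # trader A chases the fall
        sell_b = gain * max(0.0, -last_move)  # trader B reacts identically
        last_move = -(sell_a + sell_b)        # their combined selling moves the price
        price += last_move
        history.append(price)
    return history

prices = simulate()
print(prices[-1] < 0)  # True: the self-reinforcing loop drives the price through the floor
```

Each trader's rule is individually harmless; the instability emerges only from the coupling, which is precisely why such interactions are so hard to foresee in testing.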

Even today’s news headlines are shaped by algorithms. Not only is it normal for Internet users to select feeds relating to the personalised content they prefer, perhaps on a feel-good basis, but also stories are selected and curated by search-engine algorithms to suit categories of advertisers. This raises the issue of algorithms being applied to create different bubble realities that may not reflect the priorities of society as a whole- such as global warming, democracy at risk or a critical food shortage.


A major dimension of the impact of algorithms is the issue of job obsolescence. It is not just the unskilled jobs of shop assistants, office admin and factory workers, marketing and research assistants that are at risk, but middle-class, white-collar occupations, from para-legals to journalists to news readers. As algorithms become smarter and more pervasive this trend will extend up the food chain to many higher level management categories, where strategic rather than operational decision-making is primary.

And so we come to the millions of smartphone apps which are now available to support us in every aspect of our daily activities, but can also lead us to the dark side of a big brother society, where through the pervasive monitoring of location, shopping transactions and social connections, every individual’s life and timeline can be tracked and analysed using algorithms, with everyone eventually becoming a person of interest in the global society.

Social networks trade personal information to generate revenues, while the individual loses their right to privacy but doesn’t receive any compensation. Certainly the area of apps governing personal and lifestyle choice is now being invaded by ubiquitous algorithms in the form of Recommendation Systems. Much of the information garnered from social networks is filtered and personalised to guide lifestyle and entertainment; selecting an exercise regime, a relationship, online book or author, and restaurant or movie choice based on past experience and behavioural profiles. And already a third of US shoppers use the internet to make a buying decision.

These subliminal recommender systems represent the beginning of an always-on individual omnipresence, tracking your car on a GPS or recognising your face in a photograph, now combined with an AI generated virtual assistant such as Siri. More recent algorithms also have the potential to combine information to infer further hidden aspects of lifestyle.

But the real problem with such Recommender systems is their poor record at forecasting, particularly in areas of complex human behaviour and desire. In addition, the inner logic governing their Delphic predictions is generally opaque, meaning that guesswork is conveniently covered up while decision-making becomes dumbed down.
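A minimal user-based collaborative filter shows how such recommenders work under the hood, and why their reasoning stays hidden from the person being profiled (a generic sketch; the users, films and ratings are invented):

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity over the items both users have rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    return dot / (sqrt(sum(u[i] ** 2 for i in shared)) *
                  sqrt(sum(v[i] ** 2 for i in shared)))

def recommend(target, ratings):
    """Score unseen items by similarity-weighted ratings of other users."""
    scores = {}
    for user, their in ratings.items():
        if user == target:
            continue
        sim = cosine(ratings[target], their)
        for item, r in their.items():
            if item not in ratings[target]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return max(scores, key=scores.get) if scores else None

ratings = {
    "ann":   {"film_a": 5, "film_b": 4},
    "bob":   {"film_a": 5, "film_b": 4, "film_c": 5},
    "carol": {"film_a": 1, "film_b": 5, "film_d": 5},
}
print(recommend("ann", ratings))  # film_c: bob's tastes match ann's most closely
```

Note that the output is a single item with no explanation attached: the similarity weights that produced it are internal to the algorithm, which is the opacity the text describes.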
Enterprises such as banks, insurance companies, retail outlets and Government agencies compete to build algorithms to feed insatiable databases of personal profiles; constantly analysed for hidden consumer patterns to discover who is most likely to default on a loan, buy a book, listen to a song or watch a movie. Or who is most likely to build a bomb?
The rise of the algorithms embedded in our lives could not have occurred without the surge in the inter-connected, online existence we lead today. We are increasingly part of the web and its numerous sub-networks, constantly in a state of flux.
A supermarket chain can access detailed data not only from its millions of loyalty cards but also from every transaction in every branch. A small improvement in efficiency can save millions of dollars. The mushrooming processing power of computers means that the data collected can be stored and churned continuously in the hunt for persons of interest. So who is to stop them if consumer groups aren’t vigilant?

This is not too much of a nuisance when choosing a book or a movie, but it can be a serious problem if applied to credit rating assessment or authorisation of healthcare insurance. If an algorithm is charged with predicting whether an individual is likely to need medical care, how might that affect their quality of life? Is a computer program better able to calculate kidney transplant survival statistics and decide who should receive a donor organ? Algorithms are now available to diagnose cancer and determine the optimum heart management procedure using the latest worldwide research. Can human doctors compete in the longer term and will algorithms be better at applying game theory to determine the ethical outcomes of who should live or die?

The ethics of data mining is not limited to privacy or medical issues. Should the public have more control over the application of algorithms that guide killer drones towards human targets? Eventually computer-controlled drones will rule the skies, potentially deciding on targets independently of humans as their AI selection algorithms improve. But if an innocent civilian is mistaken for the target, or coordinates are accidentally scrambled, can the algorithm be corrected in time to avoid collateral damage?

So algorithms must have built-in adaptation strategies to stay relevant, like every other artifact or life form on the planet. If not, they could become hidden time bombs. They will require ultra-rigorous testing and maintenance over time, because they can become obsolete like any process governed by a changing environment- witness the Y2K computer bug and the automatic trading anomalies. If used for prediction and trend forecasting they are particularly risky. If the environment changes outside the original design parameters, then the algorithm must also be immediately adapted; otherwise prediction models and simulators, such as the proposed FuturICT global social observatory, might deliver devastatingly misleading forecasts.

As mentioned, a number of artificial intelligent techniques depend on algorithms for their core implementation including: Genetic algorithms, Bayesian networks, Fuzzy logic, Swarm intelligence, Neural networks and Intelligent agents.

The future of business intelligence lies in systems that can guide and deliver increasingly smart decisions in a volatile and uncertain environment. Such decisions incorporating sophisticated levels of intelligent problem-solving will increasingly be applied autonomously and within real time constraints to achieve the level of adaptability required to survive, particularly now within an environment of global warming.

In this new adaptive world the algorithm is therefore a two-edged sword. On the one hand it can create the most efficient path to implementing a process. But on the other, if it is inflexible and incapable of adapting- for example continuing to mandate the manufacture of large fossil-fuel-burning vehicles- it can lead to collapse, as the troubles of Ford and GM demonstrated.
Good decision-making is therefore dependent on a process of adapting to changes in the marketplace which involves a shift towards predictive performance management; moving beyond simple extrapolation metrics to a form of artificial intelligence based software analysis and learning, such as offered by evolutionary algorithms.
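The evolutionary algorithms mentioned above can be sketched compactly (an illustrative toy, not from the original text; the problem solved here is the standard "one-max" benchmark of maximising the number of 1-bits in a string):

```python
import random

def evolve(fitness, length=20, pop_size=40, generations=60, mutation=0.02, seed=1):
    """Minimal genetic algorithm over bit strings: tournament selection,
    one-point crossover and per-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # Binary tournament: the fitter of two random individuals survives.
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, length)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [bit ^ 1 if rng.random() < mutation else bit  # mutate
                     for bit in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve(sum)  # one-max: fitness is simply the count of 1-bits
print(sum(best))    # close to the maximum of 20 after 60 generations
```

Swapping in a business-relevant fitness function (forecast error, cost, risk) is all that changes; the adaptive machinery of selection, recombination and mutation stays the same, which is what makes this family of algorithms attractive for the shifting environments the text describes.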

Life depends on adaptive algorithms as well- assessing the distance to a food source encoded in the dance of a bee, determining the meaning of speech or acoustic sounds, discriminating between friend and foe, a bird navigating by the polarisation angle of the sun, or a bat avoiding collisions through split-second acoustic calculations.
These algorithms have taken millions of years to evolve and they keep evolving as the animal adapts in relation to its environment.
But here’s the problem for man-made algorithms. Very few have been designed with the capacity to evolve without direct human intervention, which may come too late as in the case of an obsolete vaccine or inadequately encrypted file.

The rate of change impacting enterprise environments in the future will continue to accelerate, forcing the rate of decision making to increase in response autonomously, with minimal human intervention. This has already occurred in advanced control, communication and manufacturing systems and is becoming increasingly common at the operational level in e-business procurement, enterprise resource planning, financial management and marketing applications, all of which are dependent on a large number of algorithms.

Dynamic decision support architectures will be required to support this momentum and be capable of drawing seamlessly on external as well as internal sources of knowledge to facilitate focussed decision capability.
Algorithms will need to evolve to underpin such architectures and act as a bulwark in this uncertain world, eventually driving it without human intervention; but only if they are self-verifying within the parameters of their human and computational environment.

Thursday, January 12, 2012

Future Enterprise- The Future of The Internet

David Hunter Tow – Director of the Future Enterprise Research Centre, forecasts that within the next decade the Internet and Web may be at risk of splitting into a number of separate entities- fragmenting under technological, national, business and social pressures.

In its place may emerge a network of networks – continuously morphing- linking and fragmenting, with no central dominant domain backbone; instead a disconnected, random structure of networks with information channeled through uncoordinated switching stations and content hubs, controlled by a range of geopolitical, social and enterprise interests.

For authoritarian states such as China, North Korea, Iran and Syria as well as criminal cartels, this will facilitate the expansion of their operations, allowing them to circumvent exposure of illegal activities in much the same way as the current Darknet network.

Darknet- the alternate network of virtual channels that currently operates beneath the backbone of the Internet has long been a place for clandestine operations, by both criminal and state networks. It is also used as a tool by cyber authorities to provide evidence of DDoS, port scanning, worms and other malware; also allowing dissidents from repressive regimes to remain in touch with the outside world, providing protection to whistle blowers and hosting pirated movie and music sites- out of reach of traditional search engines.

Autocratic governments are also maintaining increasingly tight censorship over politically sensitive sites via controlled points of entry to their cyber fiefdoms, even to the extent of distorting current and historical events. Both China and Iran now have plans to establish their own Internet infrastructure to further strengthen the control and censorship of their populations, and no doubt other authoritarian states will follow. But this power won't be limited to dictator-run states. The threat of Internet censorship via the proposed SOPA- Stop Online Piracy Act- legislation in the US, and now the exposure of the NSA's pervasive cyber-spying program, confirms the threat facing online privacy and freedom even within democratic nations, and has motivated opposition by citizens and companies concerned about the risks of storing personal and confidential corporate data in US Clouds.

At the same time white hat hacker and pro-privacy groups are launching local wireless meshnets without any centralised control, as well as their own communication satellites linked to a grid of tracking stations, in order to avoid such Government surveillance and interference, as discussed at the recent Chaos Communication Congress in Berlin.

But Apple, Facebook, Google, Amazon as well as Cable and Internet TV companies have already begun to fragment the web to support their own Walled Garden strategies of quarantining and manipulating membership data, applications, entertainment, search results and identities. Facebook membership data cannot be transferred to other social sites. Adobe’s Flash software as well as a number of developer applications were banned by Apple, which means the iPhone browser cannot display a large portion of the Internet. Likewise Amazon’s Kindle will only display books on sale or for rent by the company. Google fails to protect email privacy or adequately attribute search results to original sources.

Such social sites have become closed silos, similar in many respects to those of authoritarian states such as China.

The more this type of restricted, proprietary architecture gains traction on the Web the more it will become fragmented and the easier it will be for criminal groups to exploit, placing the open and egalitarian charter of the future Internet at risk.

But there are compelling reasons why such closed-silo strategies and gross invasions of citizen privacy, introduced by Governments and mega Web companies, are likely to eventually collapse.

As outlined in previous blogs, physics ordains that information flows cannot be constrained and will eventually spread by pathways of least resistance, driven by consumer demand, competitive pressure and technological advances. In addition, biological ecosystems with limited genetic variation are the most vulnerable to extinction. Companies within the cyber ecosphere are equally vulnerable- more susceptible to competition and rapid changes in their technological and social environments if open access to innovative ideas and information flows is restricted. And balkanisation of the Internet is very bad for business- particularly US business, as companies retreat from using vulnerable Cloud and Social Media services.

The emergence of the Semantic Web is also a catalyst for greater openness, facilitating the interpretation, linking and application of knowledge stored in millions of discrete databases across the Web. This is a vital advance in fostering greater transparency, flexibility and autonomy within the Cybersphere.

But the battle for web control and Internet supremacy is only just beginning, not only between the US and China but also involving all other nations in the newly emerging multi-polar world. The US still maintains the controlling votes in ICANN- the body that manages the Internet's domain name system- despite many attempts to democratise its governance.

But now the US will be forced to flex up and stop playing the role of alpha male in an increasingly equal and diverse information world.

By its obsession with maintaining technological dominance of critical assets such as the Web, particularly in a time of global warming, with an urgent need to effectively manage global resources for all populations, the US is ironically accelerating the rise of alternate Internets and Webs.

China is charging ahead with alternate communication networks, as in most areas of new technology. After all, its search engine- Baidu- already has 500 million users, almost as many as Google worldwide. Baidu works hand in glove with the Communist Party as the ultimate arbiter of reality for its users, committed to working within the Government's paranoid censorship parameters and constrained by a massive firewall staffed by an estimated 50,000 Internet police. But with 200 million bloggers producing trillions of words a day, as well as subscribers to RenRen and Sina Weibo- the equivalents of Facebook and Twitter- it's becoming an increasingly tough call, even for a totalitarian government.

So now the momentum is building for a multi-Internet infrastructure as governments of all colours attempt to impose their will and dominate the evolution of the pre-eminent artefact of our civilisation, which may hold the key to the planet’s survival.

In the short term China cannot replicate the mega optic fibre cable, satellite and server networks of the present Internet, but it can deploy a mesh of alternate wireless channels linking its own network assets to other friendly systems, for example in Africa, South America, Iran and Russia; at the same time constructing a topology complete with their own domain servers. In addition, it will develop its own knowledge hubs while leveraging the existing core public assets such as the priceless science, engineering, social and economic databases of the current Web.

The new US net neutrality rules, recently introduced to prevent balkanisation, prohibit broadband providers from engaging in anti-competitive behaviour by blocking content or slowing access to sites and applications, as Comcast attempted to do in 2007 with the BitTorrent peer-to-peer protocol. But the rules are already under heavy fire.

But as the pressure to bypass the new rules to allow a multi-speed Internet has increased, so too have the tensions been building between the major Social Web, Broadband and Cloud providers- Google, Apple, Facebook, Cisco, Verizon, Amazon, VMware etc. Cloud vendors have been erecting a new set of proprietary firewalls, with VMware the exception, adopting an open architecture to encourage developers to leverage and extend its technology.

The more such closed architectures with differing operational and security standards gain traction, however, the higher the risk that the CloudSphere will eventually become fragmented, less productive and more vulnerable to hacking.

Meanwhile, despite its financial problems, the EU plans to spend billions on boosting broadband speeds to increase productivity and competitiveness. The European Commission will spend 9 billion euros to roll out super-fast broadband infrastructure and services across the European Union, aiming to create a single market for digital public services- including e-health, intelligent energy and cyber security applications- for half its population by 2020, assisting utility companies, construction cooperatives, public authorities and rural users.

New Internet Architecture options are also on the horizon, with a number of innovations in train, forecast to improve the Web’s flexibility while avoiding fragmentation. But these could be put in jeopardy by the US’s intransigence over ceding control.

For example the National Science Foundation has established the Future Internet Architecture program- Nebula- to better secure Internet identity verification, data safety, mobile access and cloud computing. Google is also developing new Web architecture to improve search effectiveness.

At a recent Internet Conference run by the European Paradiso Group, a number of advanced options were discussed including- Internet routing algorithms with quantum options to provide more efficient and secure routing paths; flexible spectrum allocation; a smart Internet environment enabled by networked sensors; a content and context aware Web combined with self-organising and self-adaptive capabilities to provide more autonomy and optimisation.

In addition, the proposed Named Data Networking (NDN) architecture shifts the communication emphasis from today's focus on resource addresses, servers and hosts to one oriented to content and context. By identifying data objects instead of just locations, NDN makes data the primary Internet focus. While the current Internet secures the channel or path between two communication points, adding data encryption as an extra, NDN builds security and trust into the content itself.
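The shift from securing channels to securing named content can be sketched as follows (a conceptual toy only, not the actual NDN protocol: real NDN uses public-key signatures and hierarchical names, whereas this sketch substitutes a shared-secret HMAC and invented names for brevity). The point is that the signature is bound to the named data itself, so any cache can serve it and any consumer can verify it, regardless of the path it travelled:

```python
import hashlib
import hmac

SECRET = b"publisher-signing-key"  # stands in for the publisher's real key pair

def publish(store, name, content):
    """Bind a signature to (name, content) so any cache can serve it verifiably."""
    sig = hmac.new(SECRET, name.encode() + content, hashlib.sha256).hexdigest()
    store[name] = (content, sig)

def fetch(store, name):
    """Retrieve by content name and verify integrity before trusting the data."""
    content, sig = store[name]
    expected = hmac.new(SECRET, name.encode() + content, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("content failed verification")
    return content

cache = {}
publish(cache, "/news/2012/11/22/article", b"story text")
print(fetch(cache, "/news/2012/11/22/article"))  # b'story text'
```

If an intermediary tampers with the stored bytes, verification fails at fetch time: trust travels with the data rather than with the connection that delivered it.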

These and other advances will result in the emergence of Internet Mark 3.0, following its early incarnation as a simple packet data transfer system and its transformation into a pervasive information search powerhouse over the last decade.

But Internet 3.0 will only emerge if fragmentation of its infrastructure, and the ensuing chaos, are avoided.

Internet Mark 3.0 will offer complex, multidimensional and ultra-efficient processing, together with the dissemination of real-time services and decision-making based on content and context- not just physical addresses.

Such capability will drive societal transformation at hyper speed, catalysing - urbanisation, mobility, vastly improved health and education services and all forms of virtual reality, as well as the beginning of a truly symbiotic Web-Human partnership in complex decision-making.

The Future of the Web has been discussed in a number of previous blogs by the author.

In summary-

By 2015 Web 2.0- The Social Web- will have developed into a complex multimedia interweaving of ideas, knowledge and social commentary, connecting over three billion people on the planet.

By 2025, Web 3.0- The Semantic Web- will have made many important contributions to new knowledge through network science, logical inference and artificial intelligence. It will be powered by a seamless computational mesh, enveloping and connecting human and artificial life, and will encompass all facets of our social and business lives- always on and available to manage every need.

By 2035, Web 4.0- the Intelligent Web- will be ubiquitous- able to interact with the repository of all available knowledge of human civilisation- past and present, digitally coded and archived for automatic retrieval and analysis. Human intelligence will have co-joined with advanced forms of artificial intelligence, creating a higher or meta-level of knowledge processing. This will be essential for supporting the complex decision-making and problem solving capacity required for civilisation's future survival and progress.

Also by 2035 the last of the enterprise walled gardens will break down and leak like stone walls surrounding an ancient town. Techniques and technologies across the spectrum of knowledge will continue to spread, expand and link in new ways as they always have, bypassing temporary impediments, because that is the physical reality of information and knowledge.

The future Internet will inevitably follow these laws- becoming more open and flexible, using common protocols as enterprises and consumers demand greater flexibility. As an increasing number of data providers implement Tim Berners-Lee's Linked Data principles, it will drive the creation of an open global Infosphere containing billions of links, coordinated by the World Wide Web Consortium.

This will offer a blueprint for connecting information from different sources into a single global data repository, with the Global Commons and Public Domain models playing an increasingly important democratic role.

Most importantly the Web will be equally available to and controlled by all nations, under the auspices of a specially constituted UN body, devolving forever away from US control.

But this can only happen if the underlying structural integrity of the Internet and Web is preserved. If managed as a global cooperative project it will result in enormous benefits for the whole of humanity. But if the Future Internet splits and fragments along geopolitical and competitive lines, as its current evolution suggests, then much of its potential benefit for our civilisation and planet will dissipate.

The next evolutionary phase of this pre-eminent human-engineered organism of the 21st century will be critical.

Tuesday, September 13, 2011

Future Enterprise- Cyberwars

The world's top 2000 companies, along with Governments everywhere, are under serious cyber attack and it is likely to get much worse.
Cybercrime is a generic term for the illegal incursion and disruption at the national, enterprise and community level, of both cyber and physical assets. Cyber assets include the key information and knowledge resources, including the data, policies, reports, IP, algorithms and applications, programs and operational procedures, that a modern society in the 21st century relies on to operate and manage its business.
Physical assets include an increasing number of everyday objects and services controlled by computers and increasingly connected to the Internet including- infrastructure, manufacturing and production machinery, industrial control and communication centres, security systems, medical devices, electricity grids and meters, vehicles and transport systems as well as billions of consumer and industrial electronic devices.
Cybercrime is a relatively new phenomenon but because of its recent scale and game-changing implications for both government and industry it is rapidly becoming the dominant risk theme of the 21st century.
The opportunity for cyber attacks grows daily as corporations and governments continue to amass information about individuals in complex networks across the Web and at the same time new generations of cyber activists, some motivated purely by money and others by the desire to expose and destabilise corporations and governments, continue to hack into organisational secrets.
No enterprise, no matter how small or benign, will be safe from attack in the future, with an estimated 250,000 site breaches reported in the last few years including- EMC's RSA Security unit, the Public Broadcaster PBS, Sony's PlayStation network, Apple administration password database, the International Monetary Fund, South Korea's largest banks, the Spanish Police, US Senate, Texas Police Department, the CIA, Turkish and Malaysian governments, Google's Gmail, the Nokia forum site and Citibank's Credit Card accounts.
In the latest Norton Cybercrime Report, it was reported that breaches of various types claimed 431 million adult victims last year, with 73% of adults in the US alone incurring estimated financial losses of $US140 billion. As a criminal activity, cyber incursion is now almost as lucrative as the illegal drug trade. The total cost last year, including lost productivity and direct cash losses resulting from cyber attacks associated with viruses, malware and identity theft is estimated at $US 388 billion.
A report by the security firm McAfee listed a range of cybercrime technologies deployed including- denial of service attacks, malware, spam, phishing, social engineering on social sites, mobile phone viruses, botnets and SMS Trojan messages. More recently, hacking drones- remote-controlled aerial vehicles that automatically detect and compromise wireless networks by locating a weak spot in a corporate internet connection- have been developed. To make matters worse, the first flaws in the advanced encryption standard used for internet banking, financial transactions and secure Government transmissions have been discovered.
But most worrying, security experts from McAfee have now discovered the biggest series of cyber attacks to date, involving infiltration of the networks of 72 organisations around the world including- the UN, the governments of the US, Taiwan, India, South Korea, Vietnam and Canada, ASEAN, the International Olympic committee and an array of companies from defence contractors to high-tech enterprises including Google- with most of the victims unaware of the breaches.
This represents a massive loss of economic advantage- possibly the biggest transfer of IP wealth in history. Currently every company in every industry of significant size, with valuable IP, contracts or trade secrets is potentially under attack and this will inevitably extend to smaller organisations such as strategic hi-tech start-ups in the future. At the national level it involves exposure of sensitive state secrets including- policy intentions and decisions covering all levels and functions of Government such as trade, defence and industry policy.
The stakes are huge; a challenge to economies and global markets. From both an enterprise and State perspective therefore this is an intolerable situation; but because it has exploded at such speed, the response to date has largely been fragmented and ineffective.
But this is about much more than ruthless criminal intent to pillage credit cards, steal trade data or bring down unpopular sites. On a global scale, cybercrime has the potential to morph into full blown Cyberwar!
The main players in this game of cat and mouse currently include three broad groups, each with different motivations, although overlapping to a degree.
First- the State sponsored hackers: China, Iran, Russia, Estonia, Indonesia, North Korea, Syria and Israel- the latter recently upping the cyberwar stakes with the Stuxnet attack on Iran's nuclear facilities. At the same time dictatorial regimes across the world, from Syria to Saudi Arabia, have introduced extreme punitive measures to monitor and control access by dissidents, particularly during the Arab Spring. They have often coerced US and European technology companies to assist them, including Siemens- in the cross-hairs for helping the autocratic Government of Bahrain track down dissidents.
Second- the White hats- independent freelance hacker groups such as Anonymous/LulzSec. Their aim according to their manifesto is to expose the corruption and greed inherent in the play-books of big business and rogue regimes powered by hyper-capitalism and intent on plundering the natural resources of the planet. They also support whistle-blower groups such as WikiLeaks and social activist groups in general.
Third- the Black hats- with much more clearly defined goals, from overtly criminal to destructive and anarchistic. They are marshalling their attacks primarily on the Midas riches of credit card and financial databases across the globe, at the same time as China and Russia are hacking other Government’s IP, email and trade secrets.
Cyber hackers now make up a complex substratum of social crime, composed of ad hoc combinations of hackers and security experts, each with a fiercely competitive agenda. Fragmentation is already extending to inter-cyber warfare between these rapidly evolving networks, which at the same time overlap with global terrorist groups.
The world's superpowers have already begun to introduce new cyber-policies in a desperate bid to protect their intellectual property, infrastructure and financial assets, as well as to control the flow of information within their populations- but the effort is already bogged down.
The European Convention on Cybercrime is moving at glacial speed because EU governments are reluctant to share sovereign IT information with other powers, even if friendly. The new US Cyber Manifesto has also been stymied. The policy aims to support open access to the Internet while at the same time pursuing a policy of aggressive physical deterrence against any foreign powers such as China and Iran or organisations like WikiLeaks, which attempt to penetrate US computer systems. But this policy is meeting resistance from vested US business interests on issues of regulatory control and government surveillance of business system security.
China on the other hand appears to be going for the jugular. It has established The State Internet Information Office with the express purpose of regulating and controlling its vast Internet population and had even considered building an alternative Internet to sidestep the US controlled ICANN.
Cybercrime may also be made a lot easier by the ubiquitous application of Cloud technology in the future. Most major corporations and government agencies will be using at least one Cloud to store and process their operational data, leased from Google, Cisco, IBM, Amazon, Microsoft, HP etc. Already several of these clouds, including Amazon's, have been breached and others have had outages. Gaining access to data from a dozen major information sources would be a lot easier than penetrating thousands of individual databases.
Even though most Cloud installations incorporate security software easily able to ward off rudimentary distributed denial-of-service and hacker attacks, future cyberagent technologies will be much more effective because of superior forensic intelligence.
So the race is on to co-opt the most advanced cyber technology both to gain advantage, but also for prevention. Present day cybercrime technologies however will appear largely primitive within the next few years. The emphasis will shift to the application of much more sophisticated Cyberagent software technology.
The first generation of software agents appeared in the nineties and was used to trawl the Web, applying basic search procedures to locate information resources such as online shopping or travel sites and locating the best prices.
The second generation emerged around five years later. These programs were smarter, incorporating artificial intelligence that enabled them to make decisions more autonomously to meet their operational goals. They were deployed mainly in simulations of interactive population behaviour and interaction in a variety of environments- shopping malls, supply chains as well as disaster and conflict areas. In addition, they possessed superior negotiation and decision logic skills, using Game theory and semantic inferencing techniques.
But the third generation agents will be something else again. These will be based on complementary combinations of advanced AI techniques such as- 'evolutionary algorithms', that allow them to constantly improve their skills; 'neural networks' for superior pattern recognition and learning; 'Bayesian logic' for powerful inferencing capability; 'ant foraging' to help find the most efficient paths through complex network environments; and 'swarm' technology, allowing individual agent intelligence to be amplified by working cooperatively in large groups.
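As a concrete illustration of the first of these techniques, the sketch below shows a minimal evolutionary algorithm in Python- a toy 'OneMax' problem in which bit-string genomes evolve toward higher fitness through selection, crossover and mutation. The problem, parameters and fitness function are illustrative only, not drawn from any actual cyberagent system.

```python
import random

def evolve(fitness, genome_len=8, pop_size=30, generations=50,
           mutation_rate=0.1, seed=0):
    """Minimal evolutionary algorithm over bit-string genomes."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Rank by fitness and keep the fitter half as parents (elitism)
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, genome_len)   # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(genome_len):          # point mutation
                if rng.random() < mutation_rate:
                    child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Toy goal: maximise the number of 1-bits ("OneMax")
best = evolve(fitness=sum)
print(best, sum(best))
```

The same selection-variation loop generalises to any fitness measure- which is precisely what makes the technique attractive for agents that must improve their own search strategies over time.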
They will increasingly also be capable of tapping into the enormous computational intelligence of the Web, including the public databases of mathematical and scientific algorithms, eventually allowing their intelligence to be amplified a hundredfold over previous agent capabilities.
Such agent swarms will also be equipped behaviourally and cognitively to focus on their missions with laser or Zen-like concentration, to the exclusion of everything else, until they have chased down their quarry; whether corporate strategic plans, government covert secrets or nuclear missile blueprints.
This Uber-level of intelligence will transform agent swarms into formidable cyber strike forces, able to operate under deep cover or in sleeper mode- masquerading as harmless chunks of code until a cell is activated to attack- and to replicate rapidly if additional forces are required.
Although this might sound like science fiction, the AI techniques involved, such as evolutionary algorithms, neural networks and swarm architectures have been in common use in business and industry for over ten years. The capacity to harness them in cyber strike force mode is only a matter of time.
But all parties are now beginning to understand that the nature of conflict and the balance of world power are shifting with lightning speed, overnight obsoleting traditional notions of war and economic dominance in a globalised cyber-world. Future conflicts will not be about destroying an enemy armed with billion dollar hi-tech armaments such as tanks, jets and warships, but will be played out largely in cyberspace.
What value a sophisticated weapons system if it can be disabled by an elite cyber hacker with a Stuxnet-type virus?
What value armies of highly trained soldiers if their command and control centres can be disabled with a few keyboard strokes and a swarm of smart software agents?
What value the trillions of dollars spent on containing Al-Qaeda if the economic and logistical systems supporting the attack can be thrown into disarray by a powerful artificial intelligence algorithm?
But the CEOs of major corporations and military commanders of the major powers are still coming to terms with the mind-blowing ramifications of Cyberwar. Not only would their systems soon be obsolete but so would their command structures.
Adding to the pressure is the impact of global warming and the overuse of the planet’s finite natural resources. Cyberwars are more likely to flourish in times of food and critical resource shortages, with countries and enterprises desperate for inside knowledge to secure access to critical supply information. That time is not far off, with estimates of critical food shortages and rising prices as early as 2013, and a follow-on spike in global conflict highly likely.
One thing is certain. From now on Cyberspace will be the new corporate and state battleground and Cybercrime the main risk protagonist.
The threat of all-out cyber war is now an urgent issue that transcends lines between individual enterprises or governments. Unless a global cyber security framework, binding both the private and public sectors, can be engineered, a world of disorder will rapidly emerge- a turbulent world, where change has ceased to be beneficial and becomes ultimately destructive.

Friday, July 1, 2011

Future Enterprise- The Knowledge Universe

David Hunter Tow- Director of the Future Enterprise Research Centre contends that the dynamics and evolution of the Knowledge Universe are governed by the laws of physics just as the objects in our physical galaxy and universe.

Our Milky Way is a large barred spiral galaxy approximately 100,000 light years across, containing a pantheon of amazing cosmic objects: at least 200 billion suns and double that number of planets- some just like Earth; black holes, including a massive one at its centre equal in power to 4 million suns; numerous dying or dead stars- burnt-out white and brown dwarfs, neutron stars and remnants of supernovas; trillions of asteroids and meteorites; and vast clouds of hydrogen gas and other molecules giving birth to new stars.

The dynamic links between these galactic entities are primarily a function of the all-pervasive force of gravity, which warps spacetime, creating black holes and initiating the birth and death of stars.

This incredible menagerie therefore does not function as a collection of separate objects, but constitutes a gigantic and complex network in a constant state of evolution, emitting radiation from the longest microwave and infrared to the shortest and most energetic x-ray and gamma wavelengths. In turn it is influenced by the other 100-200 billion galaxies that exist in our universe, which in turn may be influenced by other universes or causal patches in a multiverse.

And our small planet, harbouring perhaps the most advanced life form in the universe, is directly or indirectly influenced by all of them.

Our planet’s emerging Knowledge Universe is analogous to this gigantic network of linked galactic objects; a boundless array of information and knowledge objects connected within the networks of the Internet and Web and controlled by its own physical laws.

Information and knowledge objects evolve in a similar way to stars, planets and black holes by adapting to the laws of physics and information within their environments.

They may be loosely classified in terms of a dozen major categories including-

Knowledge Repositories- databases, data warehouses, data centres and modern-day Clouds; Knowledge Processors and Generators- including a vast array of enterprises, web and social sites, specialist software developers as well as community, social, cultural and scientific groups and institutions. These utilise a range of powerful computing devices increasingly linked to the Internet as well as human minds, interconnected via the Web in the form of a powerful computational intelligence.

In addition there exist a plethora of Knowledge Aggregators, Interpreters and Distributors- news feed publishers in both printed and electronic forms, modern day encyclopedia creators such as Wikipedia, and compilers of mathematical, biological, environmental, economic, financial and demographic statistics; utilising networks of all types- wired and wireless, channelling knowledge between and within objects across the Web.

These and many other knowledge object classes and sub-classes constitute a vast network of networks, constantly combining and morphing in unlimited combinations.

A Cloud for example not only stores information, but may process and transmit it as a service. Likewise a social or gaming network may function as a utility, applying database technology such as SQL and many other software tools; but also may manage its knowledge by storing member details and applications via an internal or external Cloud, distributing services via mobile media devices to its members, advertisers and other processing agents. In turn ubiquitous mobile devices - smart phones and tablets, increasingly perform heavy duty processing and provide significant internal storage as well as wireless transmission connected to other networks.

All these objects have a role to play in the knowledge universe menagerie. And in doing so they’re involved in an evolutionary dance of cosmic proportions. But the thing is, this dance is never going to stop and is accelerating in both volume and complexity.

It is estimated that by 2015 the amount of information- measured in petabytes- will quadruple, generated by vast volumes of video transmission as well as countless new applications from the business, social and science research worlds.

Knowledge objects are also similar to and interwoven with the cosmic physical forces of the galaxy as they are born, grow, merge, morph, split, regenerate and die, based on the adaptive pressures of their environments. And more and more end up residing in the free public domain.

The evolution of each object is therefore a function of all other knowledge objects in its galaxy, following its own information laws controlled by physical principles and constraints. These include, for example, the laws of thermodynamics and entropy, which define the limits of computation and the conversion of data into knowledge, and Shannon's laws, which set limits on information channel capacity and transmission.
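Shannon's channel capacity limit, for instance, is directly computable. The sketch below applies the Shannon-Hartley formula C = B·log2(1 + S/N) to an illustrative 3 kHz channel with a 30 dB signal-to-noise ratio; the figures are examples only.

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley law: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative figures: a 3 kHz voice channel with a 30 dB signal-to-noise ratio
snr = 10 ** (30 / 10)              # 30 dB -> linear ratio of 1000
print(round(shannon_capacity(3000, snr)))   # ~30,000 bits/s
```

No physical channel can carry information faster than this bound, however clever the encoding- which is what makes it a genuinely physical constraint on knowledge flows.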

The laws of physics also include those governing information and knowledge flows, such as the Action Principle- which defines the shortest and least energy-intensive path between objects.

The Least Action Principle postulates that any dynamical process, whether the trajectory of a light ray or orbit of a planet, follows a path of least resistance or one which minimises the 'action' or overall energy expended.

Physicist Richard Feynman showed that quantum theory also incorporates a version of the Action Principle, which underlies a vast range of processes from physics to linguistics, communication and biology. The evidence suggests a deep connection between this energy-minimisation principle and self-organising systems, including light waves, information flows and natural system topographies such as the flow of a river.
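Fermat's principle- the optical special case of least action- can be demonstrated numerically. The sketch below searches for the crossing point that minimises the travel time of a light ray passing between two media, and checks that the minimum reproduces Snell's law; the geometry and speeds are illustrative.

```python
import math

def travel_time(x, v1=1.0, v2=0.75, h1=1.0, h2=1.0, d=2.0):
    """Time from (0, h1) to (d, -h2), crossing the interface y=0 at (x, 0)."""
    return math.hypot(x, h1) / v1 + math.hypot(d - x, h2) / v2

# Brute-force search for the time-minimising crossing point
xs = [i / 10000 * 2.0 for i in range(10001)]
x_best = min(xs, key=travel_time)

# At the minimum, Snell's law holds: sin(theta1)/v1 == sin(theta2)/v2
sin1 = x_best / math.hypot(x_best, 1.0)
sin2 = (2.0 - x_best) / math.hypot(2.0 - x_best, 1.0)
print(round(sin1 / 1.0, 3), round(sin2 / 0.75, 3))  # nearly equal
```

The ray "chooses" the refracted path not by calculation but because every other path costs more time- the same minimisation logic the post attributes to information flows.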

Information and knowledge is now flowing seamlessly to every corner of the planet and its populations, mediated by the Internet and Web, reaching even the poorest communities in developing countries via cheap PCs, wireless phones and an increasing variety of other mobile devices.

Trying to block or bypass this flow is a pointless exercise and a sure way to hasten an enterprise’s demise. Essential knowledge may be temporarily locked up, for example, by patents- which protect IP but in 80% of cases are never applied, serving instead as a competitive blocking strategy for large enterprises. In the process this may deprive poorer populations of essential products such as life-saving drugs.

But regardless, eventually patents run out or are obsoleted by more advanced technologies. This is happening at an increasing rate in all fields - graphene-based electronics, superconducting materials, genetic-based therapies, green technologies, AI and quantum based computing methods.

Enterprise walled gardens therefore eventually break down or leak like a stone wall surrounding an ancient town, as the technology’s lifetime expires and new developments, opportunities and entrepreneurs emerge. Techniques and technologies across the spectrum of knowledge will continue to spread, expand and link in new ways as they always have, bypassing temporary impediments, because that is the physical reality of information and knowledge.

There are many examples of the recent spread and linking of knowledge objects in galactic orbits within the Knowledge universe including-

The Education Galaxy-

The transfer of knowledge is the basis of the education process, and a global flow of free educational material and resources is now available online, including open access courseware. Free courseware is already offered by a number of prestigious tertiary institutions- including the Massachusetts Institute of Technology, Yale and Harvard- as well as by free knowledge reference sites such as Wikipedia. This will accelerate, becoming pervasive in the near future and making it much cheaper and easier for educational resources to reach previously illiterate societies and communities, instead of being monopolised by traditional institutions such as universities, particularly as a generational shift takes place.

The Knowledge Universe driven by The Action Principle will by 2040 finally allow the developing world to achieve equal status with the developed world in terms of access to knowledge, training and the realisation of human potential.

The Social Galaxy-

It is predicted there will be thousands of social networks within the Knowledge Universe over the next twenty years, beyond the scores that exist today- Facebook, LinkedIn, Google Plus, Badoo, Ning, Academia, Craigslist, Foursquare, Plaxo, Yelp, WiserEarth, Meetup, Meebo, Friendster etc- each catering to the needs of specialised groups.

In the near future these will be seamlessly connected by new applications such as Diaspora, avoiding the walled garden effect and allowing individuals to roam at will across the social universe unimpeded.

The Media Galaxy-

In the Media arena the die has been cast. The older print companies are desperately trying to reposition in the face of the online revolution, but by 2015 most print media will be forced to radically adapt towards an online multimedia model. Newspapers are already in turmoil, with advertising revenues collapsing as traditional classified streams dry up due to online competition.

Traditional news, both local and global, is rapidly being reduced to a stream of headlines with minimal analysis. Special editions and feature articles will continue in reduced quantity, but online short-burst information- text, video and audio streams will become increasingly popular, distributed via multimedia platforms such as the new generation smart phones, tablets and eBooks, already in common use.

By 2020- traditional free to air television channels will also have largely disappeared, along with many cable channels, with television advertising similarly caught in the headlight glare of change. The switch will be to web channels covering every topic- personalised to individual taste- viewable anywhere, anytime and watched primarily on mobile media screens. The personalised channel will be ubiquitous, with news, information, music and video filtered and customised to suit every personal taste.

All print media including magazines and books will also have followed newspapers to a multimedia model distributed over the Web for flexible viewing. The same already applies to music and video. The power of traditional publishers and creative gatekeepers is now being challenged as online stores such as Amazon, Apple and Google and many smaller companies allow any author, song writer or video producer to self-publish globally and cheaply.

The Cloud Galaxy-

The Cloud is a metaphor for shared infrastructure, software and data storage within the web.

Clouds already support a large range of knowledge environments including- social, cultural, business, energy, financial, office, retail, manufacturing, supply chain, booking, engineering, gaming, music, photo, video, media, communications and scientific applications.

Most of the major service and software providers- including IBM, EDS, Apple, Google, Amazon, Yahoo, Microsoft and eBay- still adopt a walled garden approach, providing access to proprietary databases through proprietary Web Application Programming Interfaces (APIs).

APIs rely on different ID and access mechanisms, as well as data in specific formats- to support music, video, particle collider or human genome information, for example. APIs have therefore tended to slice the web into separate sources and silos, restricting its full potential.

However, in the future Clouds will become more generic and open, using common protocols as enterprises demand greater flexibility. The next evolutionary phase will offer much more- in particular Data Linking. This will promote the sharing of datasets across diverse domains and between business, research and group partners, bringing the full semantic power of the Web into play and changing the face of business forever.

Tim Berners-Lee’s recent publication of the Linked Data Principles for connecting structured data on the web provides a future blueprint for connecting information from different sources into a single global data space, accessible by generic data browsers and standard database and query languages. An increasing number of data providers have now begun to implement these Linked Data principles, leading to the creation of an open global data space containing billions of links, coordinated by the World Wide Web Consortium.
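The core idea of Linked Data- facts published as subject-predicate-object triples, joined across datasets by shared URIs- can be sketched in a few lines. The URIs and facts below are illustrative placeholders, not real endpoints.

```python
# Two independently published datasets, each a set of
# (subject, predicate, object) triples - the RDF data model in miniature.
dataset_a = {
    ("http://example.org/city/Berlin", "label", "Berlin"),
    ("http://example.org/city/Berlin", "country", "http://example.org/country/DE"),
}
dataset_b = {
    ("http://example.org/country/DE", "label", "Germany"),
    ("http://example.org/country/DE", "population", 83000000),
}

# Merging is just set union - the shared URI is the link between datasets
graph = dataset_a | dataset_b

def query(graph, subject, predicate):
    """Follow one predicate from a subject node in the merged graph."""
    return [o for s, p, o in graph if s == subject and p == predicate]

# Traverse from one dataset's facts into the other's via the shared URI
country = query(graph, "http://example.org/city/Berlin", "country")[0]
print(query(graph, country, "label"))   # ['Germany']
```

Because the join key is a globally unique URI rather than a private database ID, any third party can publish further triples about the same subjects and have them link up automatically- the essence of the "single global data space".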

And so the trendlines are now becoming clear. The Web is advancing as a multi-dimensional medium for the discovery, generation and linking of knowledge in all its forms, leveraging semantic and artificial intelligence. Individual supplier services will obviously continue to multiply, but enterprises will increasingly demand access to open source data clouds as well as most utility services.

Cloud spaces will continue to blend and split, fragment and reform in unlimited combinations and permutations. They will share data, as media organisations already do amongst themselves and with countless news aggregators. The dividing lines between public and private ownership of application IP will also become fuzzy, with most applications and algorithms over time converting to generic forms- as critical software tools such as Linux, Java and SQL already have.

The Global Commons and Public Domain models therefore will play an increasingly important role. They represent a free sharing knowledge marketplace accessible for the global benefit, where everyone wins as value-added services proliferate. Alternate knowledge and social hubs such as the thousands of Wikipedia lookalikes, controlled by consumer groups, will start to compete with and displace the power of the media and Uber-web enterprises such as Google, which will be forced to cede part of its global knowledge control in its own survival self-interest.

The Web will be controlled by all nations via the global commons, in conjunction with a specially constituted body such as the present ICANN, devolving away from US control.

Many companies have tried to go against the evolutionary flow in the past and paid the price – including GM and Ford which continued to produce large gas-guzzling vehicles. They survived the low carbon/electric vehicle revolution only because of taxpayer largesse.

IBM was another that attempted to force the market to accept its large mainframes- against the trend towards small desktop computers and later the internet. IBM almost died but recovered just in time by embracing software and services, and is now leveraging its Smart Planet strategy.

Microsoft has until recently continued to promote desktop computing against the trend to internet and mobile computing and has been caught flat-footed. It may survive as it belatedly adapts its office software to the Internet, but not in its previous dominant position.

Nokia was king of mobile phones but failed to see the shift to smarter phones and applications. It has now been forced to merge to survive, with a low likelihood of ever returning to its glory days.

Oracle, Apple and Facebook are busy building walled gardens. Although looking dominant today their longer term survival will also be in jeopardy if they continue their retro strategy against the flow.

The latest 'Smart Planet' paradigm, in which the infrastructure and processes of the planet- whether manufacturing, supply chains, electricity grids, water pipelines or traffic flows, are being re-engineered to optimise performance and achieve greener, more sustainable outcomes, will be the major driver for the enterprise of the future.

The Smart Planet will also demand that decisions be made more rigorously, efficiently, adaptively and therefore largely autonomously, within a radically new networked architecture.

This will be a major disruptive paradigm for many traditional IT companies which will be forced to redesign their applications and services from the ground up. Those that are too slow will be overtaken by the new generation of nimble system developers, not weighed down by legacy systems. The larger software enterprises in particular will struggle to keep up with the constant flow of knowledge and innovation required to survive, after comfortably dominating their market segment for years, as the cycles of change get shorter and shorter.

The flow of information and knowledge according to physical principles will continue at an accelerating rate, but still many companies will try to continue to swim against the flow to their eventual cost.

Within two decades today’s Internet and Web itself will have split into many alternate distributed but connected network descendants, eventually criss-crossing the knowledge universe and supporting autonomously managed worlds with different processing efficiency and reliability requirements.

Software and system developers and suppliers will need to differentiate their products increasingly as focussed value-added services, targeted to specific enterprises and industries. Service applications will therefore be differentiated primarily by the level of value they contribute to the enterprise- not their generic capability.

Enterprises in turn will need to be very agile, not only because of the exponential rise in the diversity and volume of knowledge, but also its potential for interweaving and creating opportunities in countless applications. They will therefore need to keep acutely tuned to the signals from their environment to survive.

As the Knowledge Universe expands and complexifies as a network of networks, with the spread of information and knowledge according to the laws of physics, enterprises will have only one avenue of escape. That is to continually innovate to generate new knowledge in the form of new products and services before the next wave of science and technology innovation overtakes them; just as electric cars, digital photography and smart phones have already obliterated whole sectors of industry in the blink of an eye.

No enterprise can escape this remorseless race. Better to join it rather than putting up a wall which will inevitably crumble.

They will need to run very hard just to survive- just like the Red Queen.