Thursday, November 22, 2012

Future Enterprise- The Future of Services


The Director of the Future Enterprise Research Centre, David Hunter Tow, predicts that the current explosion of new services will trigger the biggest treasure hunt in the history of computing technology.
The Services Sector is currently in turmoil with thousands of startup companies cashing in on new opportunities to re-engineer traditional ways of doing business- and this is just the beginning.

Every process is currently being transformed into a new service- not just in traditional service sectors such as retail, media, education, healthcare, tourism and finance, but also in industries such as manufacturing, with made-to-order 3D printing techniques; medical processing, offering personalised DNA sequencing and diagnostics instantly on an iPhone chip; and inexpensive solar energy and water purification systems for domestic use in developing countries.
There’s not one established service process that’s not being seriously disrupted by smaller more agile independent players, leaving the lumbering giants that once dominated commerce in the 20th century stumbling blindly in their wake.

All major service sectors are currently being carved up, their key functions hived off and new innovations successfully introduced in competition with those of the original gatekeepers, who continue to guard their crumbling IP parapets while new knowledge is generated by the terabyte.
What is catalysing this frenzy and where is it heading?
A number of convergent factors are involved in this 21st century phenomenon: breakthroughs in new technology- mobile computing, augmented reality, artificial intelligence and information analysis- now cheaply available; massive social change in the form of the deregulation of knowledge generation and access, spurred on by social media and the global Web; and an unstoppable surge in the creative potential of a new generation steeped since birth in the cyber revolution; all combined with the very low cost of market entry for innovative entrepreneurs.

Of course online retail and marketing started the ball rolling in the nineties and has never looked back. Traditional main street bricks and mortar retail has been fighting a ferocious rearguard action, but by and large it’s been a losing battle ever since. The smart retailers hedged their bets by combining new boutiques and online websites, but overall, new leaders in the revolution such as Amazon, eBay, Apple and Google as well as thousands of smaller specialists, just kept upping the ante with greater global choice, faster service delivery and deeper discounts.
Then came the new wave of targeted retail service apps nibbling away at the leaders- companies like Foursquare peddling Deals of the Day- and now over 130,000 Android and 300,000 Apple apps covering self-checkout, barcode scanning, loyalty programs, coupon and discount offers, retail location discovery, best buys, augmented reality advertising, customer reviews, consumer preferences and recommendations, and mobile payment functions- all making the online and in-store shopping experience easier and more exciting. And developers have social media to promote them inexpensively.

In the meantime the traditional Media and Advertising industries were hurting. Gone were the salad days of broadsheets generating golden streams of classified advertising revenue from job, real estate and used car advertisements, paying for packed newsrooms and inflated expense accounts in five star global hot spots.
In their place came one-stop outsourced stories and editorials with duplicated online headlines.

Again a frenzy of online experimentation began. But paywalls had only limited success, even for heavyweights such as the New York Times and Wall Street Journal. Classified and banner advertising continued to haemorrhage, migrating to websites at greatly reduced prices, while the social and alternative media, led by myriad young independent operators, grabbed the best headlines and news stories via cheap phones.
As a result the print operators such as News Ltd are only pale shadows of their former selves and have quietly retreated to the more glamorous world of cable television and film. But this is only a temporary reprieve, as low budget independent film and documentary makers gain ground on YouTube and in arthouse cinema seats, shooting with low cost video cameras while chasing the more interesting reality footage; all supported by citizen journalists and freelance bloggers desperate for a voice in the brave new cyber world.

As a result of this revolution the power of traditional media services has seriously waned and is likely to have largely disappeared within a few decades, replaced by countless personalised web channels and DVD and gaming startups, controlled by myriad smaller, more energised groups and individuals.
At the same time the Advertising industry is in a monumental bind- caught in the headlight glare of change, trying to find the magic brand formula for clients by mixing and matching traditional and burgeoning new media- but apart from reverting to Google and Facebook, not having much success with either, unable to capitalise effectively on the thousands of creative local specialists and the cornucopia of apps.

Over time traditional advertising will therefore become less significant to major brands as it transitions to an infotainment format, with thousands of independent product sites and apps providing instant comparative advice to consumers without the retrospin of big business.

Education
But the big revolution in services- the game changer of the 21st century- will come from easy global access to high quality, inexpensive education via mobile online learning. This, according to educators, will turn every mobile phone into a knowledge portal and return education to the golden age of sharing ideas among communities of scholars, releasing them from a boring classroom environment with second-rate lecturers more interested in their next overseas conference schedule.

Interestingly the revolution is being led from the inside by some of the biggest and most hallowed institutions- Harvard, Stanford, MIT and Yale. Suddenly global tertiary-level courseware, and soon secondary-level as well, is available at very low cost from these prestige US universities through massive online platforms such as Coursera; while third party reference sites across the Web such as Wikipedia, Google, Microsoft and Facebook, plus a host of talented independent specialists, will provide countless training services by integrating and coordinating domain-related knowledge.
This is the next phase in the democratisation of the world’s storehouse of information, driven by the need to realise the potential of the vast under-educated populations of Africa, Asia and the Middle East that have missed out on the planet’s opportunities. It will allow anyone with a mobile phone or tablet to access the same level of knowledge regardless of location, income or the availability of local training resources. 

Over the coming decades therefore the services of learning and education will undergo a profound shift, from the traditional classroom/face to face method of knowledge transfer to a much more abstract model, where teaching will be largely separated from its current physical infrastructure, such as classrooms and campuses.

It will also be linked to the Cyber Revolution- transforming the world’s knowledge base into a vibrant multimedia forum- using the latest 3D, virtual reality and gaming technologies- all delivered by smart mobile and embedded multi-media kinetic devices linked to the Intelligent Web.

Healthcare
Now the medical and healthcare services sector is also ripe for revolution. Phone apps are increasingly available to act as remote monitors for home based medical and health support- the remote diagnosis of life threatening conditions, and algorithms to calculate correct drug dosages and interventions for acute illnesses such as diabetes, malaria and HIV.

This revolution has been driven to a large extent by the healthcare needs of half the planet’s population that still live in dire poverty, unable to afford traditional life-saving hospital support or medication.
These and many other diagnostic and treatment services are now putting patients at the centre of the management of their own healthcare with the help of trained volunteers, bypassing the bottlenecks involved in the traditional delivery of medical services by scarce qualified practitioners.

Future services will also be based on the accessibility of whole-of-life eHealth records across both the developed and developing world, eventually allowing the creation of online global health records from pre-birth to death, providing personalised remote support services delivered on an iPhone or community personal computer. Within a decade, health records will include the sequencing of an individual’s genome as a vital diagnostic service at a cost of a few dollars.

A number of other technological breakthroughs will mark the expansion of new healthcare services within the next few decades including- stem cell therapies to repair human tissue and organs, reversing heart disease for example; prevention of cancers and neurodegenerative diseases such as breast cancer and Alzheimer's; sensory repair such as early retinal and corneal implants; prosthetics including neuron-controlled limbs; brain/nervous system interfaces, overcoming spinal paralysis using brain signals; and interactive humanoid robots to provide human companionship and physical support.

All will be available as relatively inexpensive services once the enabling technologies have been approved. Why? Because if the major healthcare companies don't provide them at an affordable price, entrepreneurial groups will, as occurred with the generic drug revolution in developing countries when Big Pharma refused to drop their prices.

Finance/Banking
The Finance and Banking industry has been a sitting duck for radical change for a long time- getting bigger and more obese with minimal outside competition – seeking to control ever more functions in a frenzy of greed- from investment, transaction processing, payments, exotic specialist derivatives, consumer credit, foreign exchange, mortgage provision, money transfers, advice on mergers, trading and anything else that shows a hint of making easy money and inflating their balance sheets. 

There have been numerous exposures of the underlying level of corruption within the finance and banking industries, to the point of defrauding their own customers and incurring horrendous trading losses by rogue dealers through sloppy oversight, in the process threatening bankruptcy for themselves and their clients and culminating in the GFC. But it didn't deter them for long, and despite some fresh regulations and a massive infusion of taxpayer dollars, their insatiable greed continued to explode.
But if government regulators have failed to rein them in, a number of agile competitors offering cheaper, safer and more convenient services may do the job for them.

The major looming battle is between the traditional finance industry and the global technology giants such as Apple, Google and PayPal- using their skills at creating innovative software to provide payment and credit card services, with wireless apps that allow mobile phones to store loyalty and credit card information, make payments and transfer money. Technology-poor African countries such as Kenya have taken the lead in these services of convenience and already provide perfectly viable phone money transfer services via text, bypassing expensive banking services.

Now the bloated goliaths are fighting back with their own brand apps. Banks are also using contactless near field technology to convert smartphones into mobile credit and payment devices. But it may already be too late, as the genie has escaped the bottle and the smart entrepreneurs realise the banking emperor has no clothes, except Wal-Mart hand-me-downs.
It is likely that banks in their present form will cease to exist within the decade, effectively disembowelled by smarter consumer and business service providers. They may become primarily back office transaction processors and routine mortgage providers, with a veneer of deal-making, offering a line of credit for smaller uncomplicated businesses. All other functions will become the province of highly skilled specialists.

And so the frenzy of creativity and service disruption will continue in all areas of commerce and industry, as the current generation of software engineers and innovators becomes acutely aware that the rules that propped up the old corporate structures are obsolete.
The old software guard that controlled the boundaries of commerce so tightly is also increasingly ripe for the picking, because the original rules governing old sectors such as retail, media, manufacturing, banking, pharmaceuticals, photography, music and publishing just don't hold up anymore. Stuck in the quicksand of the past, they don't reflect the changing social currents of the new era. Therefore the software and systems houses that propped them up and contributed to their stranglehold- SAP, Oracle, IBM, HP, Yahoo, Cisco and the rest- are also becoming irrelevant, weighed down by their own legacy technologies and now being systematically cannibalised by more agile and visionary players.

Take any industry. Who buys the software that was developed for it in the '80s or even '90s? Very few, except the dinosaurs locked in by exorbitant long term maintenance agreements, and they are now paying a very high price for their conformity- unable to adapt or switch systems before being swallowed by the next wave of innovation. Their present systems just don't reflect the changing ways of doing business or the social norms that the new generation of consumers want and have come to expect.
The technology keeps shifting and each time it moves it exposes the soft underbelly of the existing services and providers. Those like IBM that have survived have had to radically remodel their businesses. In the case of IBM – from hardware to software to services and now to ‘smart planet’ systems.

The next generation of providers- Google, Apple, Amazon and Facebook, along with thousands of smaller service developers- have already moved in on these crumbling bastions. But even this new order is in turn being held to account by a host of smaller creative startups. And no matter how often the established leaders of any systems generation try to reinforce their monopolies by swallowing the smaller more agile enterprises, they are constantly outflanked by the tide of new knowledge and innovation.
And so the dance goes on – faster and faster and the treasure trove of potentially lucrative services keeps growing.

Saturday, October 6, 2012

Future Enterprise- The Future of Big Data


The Director of The Future Enterprise Research Centre- David Hunter Tow, predicts that Big Data has the potential to unlock greater value for the enterprise and society, but in the process will radically disrupt traditional organisational functions at all levels- particularly between IT departments and decision makers.
The connotation of the term ‘Big Data’ is at best extremely fuzzy and at worst highly misleading. It implicitly promises major benefits in direct relationship to the quantity of data corralled by an organisation. But Big Data is also hedged with constraints, contingencies and uncertainties, requiring the solution of a number of associated problems before it can translate into significant enterprise benefits.
To unlock real value in the future will require:

The design of more responsive enterprise and knowledge architectures based on a network model, allowing the delivery of realtime adaptive decision responses;

A closer relationship between the business and its social environment, enabling the enterprise to better understand the Big Picture;

The introduction of common data standards and the seamless streaming of integrated multiple data types;

A quantum leap in intelligence through the application of more powerful artificial intelligence based analytic, strategic and predictive modelling tools;

An upgrade to the quality and security of current storage and processing infrastructure, beyond current cloud architectures.

Networked Architectures
A more flexible and responsive knowledge architecture for the future enterprise must reflect the reality of increasingly complex decision making, allowing far more agile reaction within the fast changing competitive environment of the 21st century.

This will involve the introduction of networked models at both the enterprise and information level, with nodes represented by decisions and information flows linking the relationships between them; eventually capable of autonomous adaptation within a constantly evolving social, technological, business environment.

This model, based on optimised decision pathways, with the capacity to dynamically route information and intelligence resources, supported by autonomous agent software, to appropriate decision makers, will be the core driver of Big Data architectures. It must be capable of integrating and analysing streams of information from multiple sources in real time, channelling computing and information resources directly to relevant decision nodes and enabling critical decisions to be implemented in optimal time frames.
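As a rough illustration only, the sketch below models such a networked decision architecture as a weighted graph- decision points as nodes, information flows as edges- using Python's networkx library. All node names and latency weights are invented for the example.

```python
# Minimal sketch of a networked decision model: decisions are nodes,
# information flows are weighted edges, and data is routed to a decision
# node along the lowest-latency path. Names and weights are illustrative.
import networkx as nx

enterprise = nx.DiGraph()
# Edge weights stand in for information latency between nodes (e.g. seconds).
enterprise.add_weighted_edges_from([
    ("sensor_feed", "ops_analytics", 1),
    ("market_feed", "ops_analytics", 2),
    ("ops_analytics", "pricing_decision", 1),
    ("ops_analytics", "inventory_decision", 3),
    ("market_feed", "pricing_decision", 5),
])

def route(source, decision_node):
    """Return the lowest-latency path from an information source to a decision node."""
    return nx.shortest_path(enterprise, source, decision_node, weight="weight")

print(route("market_feed", "pricing_decision"))
# ['market_feed', 'ops_analytics', 'pricing_decision']- cheaper than the direct link
```

A production architecture would of course update the weights in realtime and attach agent software to each node, as described above; the routing step itself remains a shortest-path search.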

 
The Big Picture

The capacity of the enterprise to mesh with its physical and social environment will become increasingly vital for its survival in the future. Without such a grounded relationship, poor decision making based on an inwardly focussed mindset will continue to drive many large enterprises to bankruptcy.

It will also be insufficient to plan just one, two or five years ahead. Although near-term sales and cost forecasts are important, understanding the bigger shifts likely to impact all businesses in a future dominated by climate change, geopolitics and globalisation will be more essential to survival- allowing a better balance of creative planning, adaptive resilience and risk avoidance.

In fact enterprises- particularly the biggest- have a poor history of seeing the big picture. The larger the enterprise, the more likely it is to believe in its own invincibility in the marketplace.

In more recent times both Ford and GM virtually went bankrupt and had to be bailed out by the public purse because they would not or could not see the obvious shift in consumer sentiment to smaller cars with lower fuel usage. And then there were AIG, Lehman Brothers, Citibank and Fannie Mae, which also thought they were too big to fail. And now giants such as Kodak and Sony, and many others, are struggling.

In all the above cases, enterprise management ignored the signals coming loud and clear from their environments via consumers and customers, through a combination of ignorance and arrogance. In the meantime other more agile companies such as Microsoft picked up the trend towards desktop computing and exploited the opportunities left by IBM. But then Microsoft almost lost the plot to Google by not seeing the emerging power of the Internet as the dominant driver of information in today's society.

So despite the latest business intelligence software busily scavenging for patterns in Big Data generated from past customer and financial data and standard industry forecasts, plus some glitzy dashboard software, typical BI analysis without the guidance of the big picture will be virtually useless as a pathway to an uncertain future.

Big Data now has the potential to open the door to this broader and more inclusive vision, which has rarely been a priority for most enterprises in the past. But that will now change. Survival, particularly for larger enterprises, will be based on interpreting the bigger picture's impact. If managed effectively, Big Data will be the catalyst that provides a hedge against this myopia- but only if management's mindset becomes more flexible and humble.

The Intelligent Enterprise
Big Data will also trigger the need to become much smarter, utilizing the latest artificial intelligence and statistical techniques such as evolutionary algorithms, neural networks and Bayesian logic to achieve a smarter enterprise. The latest 'Smart Planet' paradigm shift, in which the infrastructure, business and environmental processes of the planet are being re-engineered to optimise performance and achieve more sustainable outcomes, will also be a major driver for the networked smarter enterprise of the future.

The Smart Planet imperative will demand that decisions relating to society’s survival and wellbeing be made more rigorously, efficiently, adaptively and therefore autonomously.
But a Smart Planet revolution without a Smarter Enterprise mindshift won’t compute.

 Part of that mindshift will be a far greater emphasis on the future. The past is still a poor predictor of what’s to come. To improve the quality of decision-making, serious realtime forecasting will need to be boosted. Such systems will rely on information collected from numerous internal and external sources within an environment, using artificial intelligence algorithms and rules to analyse vital signals and interconnected trends and patterns that generate complex outcomes. The results of this analysis will then be channelled autonomously to the appropriate decision-makers for action.
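A minimal sketch of this kind of realtime signal analysis- with invented readings and thresholds- is an exponentially weighted moving average that adapts its baseline as data streams in and flags readings that deviate sharply, ready to be routed to the relevant decision-maker:

```python
# Toy realtime signal monitor: an exponentially weighted moving average (EWMA)
# tracks each metric's baseline, and readings far outside it are flagged.
# Pure standard library; the data and parameters are illustrative only.

def monitor(stream, alpha=0.2, threshold=3.0):
    mean, var = None, 0.0
    for x in stream:
        if mean is None:
            mean = x                     # seed the baseline with the first reading
            continue
        deviation = x - mean
        std = var ** 0.5
        if std > 0 and abs(deviation) > threshold * std:
            yield ("ALERT", x)           # hand off to the appropriate decision node
        mean += alpha * deviation        # adapt the baseline
        var = (1 - alpha) * (var + alpha * deviation ** 2)

readings = [10, 10.5, 10.2, 10.4, 10.3, 10.1, 45, 10.2, 10.4]
print(list(monitor(readings)))           # [('ALERT', 45)]
```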

But despite a range of mathematical improvements in our foresight and modelling methods, developed in tandem with a broader understanding of scientific and social principles, the corporate capacity to forecast effectively has been sadly lacking when data doesn’t follow obvious trends or when the signals of emerging change are faint.

Most forecasting textbooks traditionally list a number of well-developed techniques based around time series projections, regression analysis, Delphi and scenario expert options, as well as artificial neural networks and simulation modelling. But these have usually failed to predict the future in times of abrupt change within the broader physical, social and economic environment, such as the recent extreme disasters of the global financial crisis or the Arab democratic revolution.
The next phase in this evolution will be models powerful enough not just to deliver predictions but to accurately prioritise the resources needed to manage those predictions, develop project plans for their implementation and then track the results to check their effectiveness- in other words, to apply textbook automatic feedback control principles much more rigorously. After a bridge or power grid has been built, its maintenance needs to be permanently and autonomously managed, tracked by the latest generation of intelligent sensors, to prevent future catastrophic failures or escalating rebuild costs.
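The sketch below illustrates that feedback principle in miniature, with all figures invented: a simple wear model is corrected by each sensor observation, and maintenance is scheduled when the corrected estimate crosses a limit.

```python
# Hedged sketch of textbook feedback control applied to asset upkeep: compare
# each sensor observation against the model's prediction, feed the error back
# to correct the model, and act when a limit is crossed. Numbers are invented.

def maintain(observations, wear_rate=1.0, gain=0.5, limit=5.0):
    predicted_wear, actions = 0.0, []
    for observed in observations:
        predicted_wear += wear_rate          # model: wear grows each period
        error = observed - predicted_wear    # the feedback signal
        predicted_wear += gain * error       # correct the model with the measurement
        if predicted_wear > limit:
            actions.append("schedule maintenance")
            predicted_wear = 0.0             # assume maintenance resets wear
        else:
            actions.append("ok")
    return actions

print(maintain([1.1, 2.3, 3.2, 4.9, 6.5, 1.0]))
# ['ok', 'ok', 'ok', 'ok', 'schedule maintenance', 'ok']
```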

This common sense methodology of feedback and continuous monitoring of outcomes has been sadly lacking in many enterprises but will now be essential if business and society is to survive the onslaught of massive future shock. It will involve scanning for emerging problems, aggregating data streams from millions of internet-connected sensor systems and monitoring the pulse of the global environment- not just at the business level but also at the political, technological and environmental flashpoints.

Processes based on Big Data therefore need to be recognised as the beginning of our civilisation's survival fightback: applying adaptive and responsive techniques, largely autonomously- because the datasets are so big and complex that manual methods will fail- to the optimisation of the design, maintenance and operation of every process and application on the planet.

The Cloud Solution

Collecting and storing the tsunami of data resulting from Big Data overload is a major stumbling block to the above goal, already creating unforeseen problems for the average enterprise as its datasets grow exponentially; as the science communities- astronomers, biologists, cosmologists and particle physicists- have already discovered.

Traditional relational SQL databases, SOA architectures and even batch-oriented Hadoop clusters are not optimised for such massive real time processing, particularly as much of the data in the future will be unstructured and garnered from heterogeneous sources such as web pages, videos, RSS feeds, market intelligence, statistical data, electronic devices, instrumentation, control systems and sensors.

But just in time, Cloud processing management has emerged, offering an alternative solution which few large organisations will be able to resist. Now they will have the seductive choice of offloading the complete data management side of their operations to third parties in return for economies of scale and flexibility. The tradeoff is partial loss of control, but over time, providing security, backup and service levels are maintained at a rigorous standard, the organisation should benefit by being able to improve its focus on the critical core aspects of its operations. Only time and verification will tell if this tradeoff can deliver on its promise.
The IT department will be virtually invisible to decision-makers, its primary task being to select the appropriate tools to implement enterprise strategies.

Cloud computing will eventually offer a complete managed haven of services for Big Data- software, security, processing, storage, hardware and infrastructure. All are now in the offing. But in the near future, Knowledge as a Service is likely to presage the greatest change within the Future Enterprise ecosystem.
Real-time integration of disparate data and application methodologies is a key challenge here, with the conventional multi-staged approach being: build a data warehouse to consolidate storage, aggregate the information sources, select a BI tool and then process user queries. But this is already proving expensive, slow and error prone.

A number of innovative platforms are being developed in this sector based on enterprise information streaming models. These provide a virtual unified view of the data stream without first transferring it to a central repository and also point the way to the next step of fully autonomous tool selection and decision support.
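As a toy illustration of such a streaming model- with invented feeds and timestamps- Python's heapq.merge can present several already-ordered streams as one unified, time-ordered view, without first copying them into a central repository:

```python
# Sketch of the streaming alternative to the warehouse-first pipeline: several
# live feeds are lazily interleaved into one time-ordered virtual view as they
# arrive, with no central store. Feed contents are illustrative only.
import heapq

sales_feed  = [(1, "sale", 250), (4, "sale", 120), (7, "sale", 600)]
sensor_feed = [(2, "temp", 21.5), (5, "temp", 22.1)]
news_feed   = [(3, "headline", "supplier strike"), (6, "headline", "rate cut")]

# heapq.merge walks the already-sorted streams by timestamp without buffering
# them all, so a query can run over the unified view in near real time.
unified_view = heapq.merge(sales_feed, sensor_feed, news_feed, key=lambda e: e[0])

for timestamp, kind, payload in unified_view:
    print(timestamp, kind, payload)
```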

Future Shock

It is now clear that the global environment is placing enormous pressure on all organisations, not just from a competitive perspective but from the need to upgrade ethical and sustainability standards. This will continue at an accelerating pace. The changing technological environment in particular has already disrupted entire service industries- e-commerce first: retailing, banking, trading and supply. Now a second wave of service industries- manufacturing, healthcare, education, media, advertising, legal, hospitality and travel- is being turned upside down by the revolution.

In this revolution Big Data is acting as a major catalyst, offering the glittering prize of untold added value- but it will deliver this cornucopia only if it is also agile and precisely targeted, meeting the specific needs of multiple domains. Specialised decision-making in finance, biology, medicine, cosmology, pharmaceuticals, government, media and legal applications will require different classes of algorithmic support. And even domain analytic specialists may soon be obsolete, as expert domain algorithms generated from the ever expanding cumulative knowledge of the Web begin to dominate the decision process. Critically however, such algorithms will need to be continuously verified and adapted within a shifting social and business environment.

Because of the rate of innovation and subsequent disruption, service-based systems will therefore need to be self-adaptive; applying intelligent algorithms to support new options as well as the growth of collaborative ventures involving multiple stakeholders, such as commonly occur in service industries such as Hospitality, Travel and Real Estate.

New technological innovations such as smartphones and tablets are also increasingly filling mobile gaps and shortfalls in existing services. For example, by starting to displace traditional credit card/banking in the lucrative payments market and enabling the personalisation of healthcare and educational services in remote areas.

Such upgrades in the service sector imply the use of increasingly pervasive Big Datasets with low access latencies. Response timeframes are critical, with cumbersome reporting and query tools way too slow for today's end user needs. So the days of manual intervention in the decision process are drawing to a close, as global markets demand decisions involving hundreds of variables, delivered instantly.

So the stage is set. The filtering, pattern matching and super-intelligent analytic processing required to make sense of the overload of Big Data will mean that human intervention in the decision process will inevitably become a significant bottleneck.

But the future smart enterprise must have the flexibility to focus and deploy its cooperative intelligence autonomously, at all levels of the organisation. This will be a proactive response to new opportunities and competitive pressures in the marketplace.

The volume and complexity of decision-making will increase continually and rapidly over time in response to the changing social, geopolitical and technological environment. The resulting network interactions involving customers, supply chains, services, markets and logistics will eventually become just too complex and time-consuming even for dedicated teams of humans to manage, just as it is impossible today to manually control complex trading, production and marketing operations, chemical plants or space missions.

The IT centre will rapidly transform into tomorrow’s Knowledge Technology Centre- KT Centre. This will place further pressure on the need for real-time high quality decision-making.

By 2030 humans will become partners in enterprise decision processes powered by intelligent algorithms based on realtime knowledge outcomes plus research encapsulated in the Intelligent Web. But over time their input, as for airline pilots and fast train drivers today, will be largely symbolic.

Big Data therefore will have provided a major catalyst for an extreme makeover of the future enterprise, the business environment, for society and the planet.

 






Sunday, May 6, 2012

Future Enterprise- The Future of Algorithms

Algorithms are taking over the world – at least the computational part of it- and that could be a good thing.
In a real sense the rise of algorithms is a sign of human intellectual maturity in terms of our capacity as a society to manage technology and science at a sophisticated level; representing the convergence of our mastery of computational science with the capacity to abstract the key essence of a process- to generalise and commoditise it.
The ubiquity of algorithms is in fact the next logical step in our technological evolution as a species and perhaps marks our evolution towards super-species status.

Algorithms translate a process into instructions that a computing machine can understand, based on a mathematical, statistical and logical framework. They are usually developed to minimise and rigorise the computing steps involved in a process or formula and therefore maximise its solution efficiency in terms of computing resources, while at the same time improving its accuracy and verifiability.

Algorithms come in all shapes and sizes and have been around a long time- well before the official computer age. Early versions were developed by Indian mathematicians and documented by the 9th century Muslim scholar al-Khwarizmi, from whose name the word derives; Euclid had applied them in antiquity to formalise his geometry, and Newton later used them in formalising his theories of the forces of nature.
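Euclid's method for finding the greatest common divisor remains the classic example- a terminating sequence of well-defined steps that shrinks the problem on every pass, shown here in Python:

```python
# Euclid's algorithm for the greatest common divisor- one of the oldest known
# algorithms: repeatedly replace the pair (a, b) with (b, a mod b) until b is 0.
def gcd(a, b):
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # 21
```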
In the future almost every process or method will be converted to an algorithm for computational processing and solution, as long as it can be defined as a series of mathematical and logical statements, ideally capable of being run on a Turing machine. A Turing machine is a mathematical model of a general computing machine, invented by Alan Turing, which underpins our current computing. Turing machines come in a variety of flavours, including deterministic, quantum, probabilistic and non-deterministic, which can be applied to solve different classes of problems.
But regardless, any computation, even those based on alternate logical models such as cellular automata or recursive programming languages, can also theoretically be performed on a Turing machine. The brain however, because of its enormous non-linear problem-solving capacity, has been proposed as a super-Turing machine- but the jury is still out as to whether it falls in a different computational class to the standard Turing model.
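For the curious, a deterministic Turing machine can be simulated in a few lines. The toy machine below, with an invented rule table, simply inverts a binary string, but the same state-symbol-action scheme underlies any computation:

```python
# Minimal deterministic Turing machine simulator. Each rule maps a
# (state, symbol) pair to (next state, symbol to write, head movement).
def run(tape, rules, state="start"):
    cells, head = dict(enumerate(tape)), 0
    while state != "halt":
        symbol = cells.get(head, "_")              # "_" marks a blank cell
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

invert = {                                         # invented example machine
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run("1011", invert))  # 0100
```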

Many algorithms incorporating powerful standard mathematical and statistical techniques- error correction, matrix processing, random number generation, Fourier analysis, ranking, sorting, Mandelbrot set generation and so on- were originally coded as computational routines in languages dating from the 50s and 60s, including Fortran, Algol, Lisp, Cobol and PL/1, and later in C++. Common algorithms were subsequently incorporated in mathematical libraries such as Mathematica, making them easier to access and apply.

They have now infiltrated every application and industry on the planet, applied for example to streamline and rigorise operations in manufacturing, production, logistics and engineering. They cover standard operational control methods such as linear programming, process control and optimisation, simulation, queuing, scheduling and packing theory, critical path analysis, project management and quality control.

Engineers and scientists increasingly link them to AI techniques such as Bayesian and Neural networks, Fuzzy logic and Evolutionary programming, to optimise processes and solve complex research problems.
But over time, following the flow of computerisation, the ubiquitous algorithm has extended into every field of human endeavour including- business and finance, information technology and communication, robotics, design and graphics, medicine and biology, ecosystems and the environment and astronomy and cosmology; in the process applying data mining, knowledge discovery and prediction and forecasting techniques to larger and larger datasets.

Indeed, whenever new technologies emerge or mature, algorithms inevitably follow, designed to do the heavy computational lifting, allowing developers to focus on the more creative aspects.

Other algorithmic applications now cover whole sub-fields of knowledge such as- game theory, machine learning, adaptive organisation, strategic decision-making, econometrics, bioinformatics, network analysis and optimisation, resource allocation, planning, supply chain management and traffic flow logistics.

In addition, more and more applications are being drawn into the vortex of the algorithm which were once the province of professional experts including- heart and brain wave analysis, genome and protein structure research, quantum particle modelling, formal mathematical proofs, air traffic and transport system control, weather forecasting, automatic vehicle driving, financial engineering, stock market trading and encryption analysis.

A number of such areas also involve high risk to human life, such as heavy machine operation, automatic vehicle and traffic control and critical decisions relating to infrastructure management such as dams, power plants, grids, rolling stock, bridge and road construction and container loading.
The Web of course is the new playground for algorithms and these can also have far reaching impacts.
For example in 2010, the Dow Jones Industrial Average dropped 900 points in a matter of minutes in what is now known as the Flash Crash. It appears that for a few minutes several algorithms were locked in a death dance, in much the same manner as two closely bound neutron stars before implosion, triggering a massive collapse in the value of the US stock market. It was a wake-up call: in any complex system involving multiple feedback loops, unforeseen combinations of computational events will take place sooner or later.

Even today’s news headlines are shaped by algorithms. Not only is it normal for Internet users to select feeds relating to the personalised content they prefer, perhaps on a feel-good basis, but also stories are selected and curated by search-engine algorithms to suit categories of advertisers. This raises the issue of algorithms being applied to create different bubble realities that may not reflect the priorities of society as a whole- such as global warming, democracy at risk or a critical food shortage.


A major dimension of the impact of algorithms is the issue of job obsolescence. It is not just the unskilled jobs of shop assistants, office admin and factory workers, marketing and research assistants that are at risk, but middle-class, white-collar occupations, from para-legals to journalists to news readers. As algorithms become smarter and more pervasive this trend will extend up the food chain to many higher level management categories, where strategic rather than operational decision-making is primary.

And so we come to the millions of smartphone apps now available to support us in every aspect of our daily activities- but which can also lead to the dark side of a big brother society where, through the pervasive monitoring of location, shopping transactions and social connections, every individual's life and timeline can be tracked and analysed using algorithms, with everyone eventually becoming a person of interest in the global society.

Social networks trade personal information to generate revenues, while the individual loses their right to privacy without receiving any compensation. Certainly the area of apps governing personal and lifestyle choice is now being invaded by ubiquitous algorithms in the form of recommendation systems. Much of the information garnered from social networks is filtered and personalised to guide lifestyle and entertainment- selecting an exercise regime, a relationship, an online book or author, a restaurant or a movie based on past experience and behavioural profiles. Already a third of US shoppers use the internet to make a buying decision.
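A toy version of such a recommender- with an invented ratings matrix- shows the basic mechanics: cosine similarity over past ratings finds the most similar user, whose unseen favourites become the recommendations:

```python
# Toy user-based recommender: cosine similarity picks "people like you" and
# suggests the highest-rated item they have seen and you haven't.
# The ratings matrix is invented for illustration.
import math

ratings = {
    "alice": {"thriller": 5, "comedy": 1, "documentary": 4},
    "bob":   {"thriller": 4, "comedy": 2, "documentary": 5, "romance": 4},
    "carol": {"comedy": 5, "romance": 4},
}

def cosine(u, v):
    dot = sum(u[k] * v[k] for k in set(u) & set(v))
    norm = (math.sqrt(sum(x * x for x in u.values()))
            * math.sqrt(sum(x * x for x in v.values())))
    return dot / norm if norm else 0.0

def recommend(user):
    others = [u for u in ratings if u != user]
    nearest = max(others, key=lambda u: cosine(ratings[user], ratings[u]))
    unseen = {k: v for k, v in ratings[nearest].items() if k not in ratings[user]}
    return max(unseen, key=unseen.get) if unseen else None

print(recommend("alice"))  # 'romance', via bob- alice's nearest neighbour
```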

These subliminal recommender systems represent the beginning of an always-on individual omnipresence- tracking your car by GPS or recognising your face in a photograph, now combined with AI-driven virtual assistants such as Siri. More recent algorithms also have the potential to combine information to infer further hidden aspects of lifestyle.

But the real problem with such recommender systems is their poor record at forecasting, particularly in areas of complex human behaviour and desires. In addition, the inner logic governing their Delphic predictions is generally hidden and opaque, meaning that guesswork is conveniently covered up while decision-making becomes dumbed down.
Enterprises such as banks, insurance companies, retail outlets and government agencies compete to build algorithms to feed insatiable databases of personal profiles, constantly analysed for hidden consumer patterns to discover who is most likely to default on a loan, buy a book, listen to a song or watch a movie. Or who is most likely to build a bomb?
The rise of the algorithms embedded in our lives could not have occurred without the surge in the inter-connected, online existence we lead today. We are increasingly part of the web and its numerous sub-networks, constantly in a state of flux.
A supermarket chain can access detailed data not only from its millions of loyalty cards but also from every transaction in every branch. A small improvement in efficiency can save millions of dollars. The mushrooming processing power of computers means that the data collected can be stored and churned continuously in the hunt for persons of interest. So who is to stop them if consumer groups aren’t vigilant?

This is not too much of a nuisance when choosing a book or a movie, but it can be a serious problem if applied to credit rating assessment or authorisation of healthcare insurance. If an algorithm is charged with predicting whether an individual is likely to need medical care, how might that affect their quality of life? Is a computer program better able to calculate kidney transplant survival statistics and decide who should receive a donor organ? Algorithms are now available to diagnose cancer and determine the optimum heart management procedure using the latest worldwide research. Can human doctors compete in the longer term and will algorithms be better at applying game theory to determine the ethical outcomes of who should live or die?

The ethics of data mining is not limited to privacy or medical issues. Should the public have more control over the application of algorithms that guide killer drones towards human targets? Eventually computer-controlled drones will rule the skies, potentially deciding on targets independently of humans as their AI selection algorithms improve. But if an innocent civilian is mistaken for the target or coordinates are accidentally scrambled, can the algorithm be corrected in time to avoid collateral damage?

So algorithms must have built-in adaptation strategies to stay relevant, like every other artifact or life form on the planet. If not, they could become hidden time bombs. They will require ultra-rigorous testing and maintenance over time, because they can become obsolete like any process governed by a changing environment- witness the Y2K computer bug and the automatic trading anomaly. If used for prediction and trend forecasting they will be particularly risky to humans. If the environment changes outside the original design parameters, then the algorithm must also be immediately adapted; otherwise prediction models and simulators such as the proposed FuturICT global social observatory might deliver devastatingly misleading forecasts.

As mentioned, a number of artificial intelligent techniques depend on algorithms for their core implementation including: Genetic algorithms, Bayesian networks, Fuzzy logic, Swarm intelligence, Neural networks and Intelligent agents.

The future of business intelligence lies in systems that can guide and deliver increasingly smart decisions in a volatile and uncertain environment. Such decisions incorporating sophisticated levels of intelligent problem-solving will increasingly be applied autonomously and within real time constraints to achieve the level of adaptability required to survive, particularly now within an environment of global warming.

In this new adaptive world the algorithm is therefore a two-edged sword. On the one hand it can create the most efficient path to implementing a process. But on the other, if it is inflexible and incapable of adapting- for example choosing to continue to manufacture large fossil fuel burning vehicles- it can lead to collapse, as in the case of Ford and GM.
Good decision-making is therefore dependent on a process of adapting to changes in the marketplace, which involves a shift towards predictive performance management: moving beyond simple extrapolation metrics to artificial intelligence based software analysis and learning, such as that offered by evolutionary algorithms.
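A minimal sketch of that evolutionary approach, with invented data and settings: a population of candidate forecast parameters is repeatedly selected and mutated on fitness, so the model keeps adapting rather than relying on a one-off extrapolation.

```python
# Hedged sketch of an evolutionary algorithm fitting a linear demand forecast:
# survivors of each generation are mutated, and fitness is the (negative)
# squared error against observations. All figures are invented.
import random

random.seed(1)
observed = [3.0, 5.1, 6.9, 9.2, 11.0]              # e.g. demand per period

def fitness(params):
    base, growth = params
    return -sum((base + growth * t - y) ** 2 for t, y in enumerate(observed))

population = [(random.uniform(0, 5), random.uniform(0, 5)) for _ in range(30)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]                     # selection
    population = survivors + [
        (b + random.gauss(0, 0.1), g + random.gauss(0, 0.1))   # mutation
        for b, g in random.choices(survivors, k=20)
    ]

print(max(population, key=fitness))  # approaches base of about 3, growth of about 2
```

Re-running the loop as new observations arrive is what keeps such a model adaptive; a frozen one degrades exactly as described above.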

Life depends on adaptive algorithms as well- assessing the distance to a food source encoded in the dance of a bee, determining the meaning of speech or acoustic sounds, discriminating between friend and foe, a bird navigating by the polarisation angle of the sun, or a bat avoiding collisions through split-second acoustic calculations.
These algorithms have taken millions of years to evolve and they keep evolving as the animal adapts in relation to its environment.
But here’s the problem for man-made algorithms. Very few have been designed with the capacity to evolve without direct human intervention, which may come too late as in the case of an obsolete vaccine or inadequately encrypted file.

The rate of change impacting enterprise environments in the future will continue to accelerate, forcing the rate of decision making to increase in response autonomously, with minimal human intervention. This has already occurred in advanced control, communication and manufacturing systems and is becoming increasingly common at the operational level in e-business procurement, enterprise resource planning, financial management and marketing applications, all of which are dependent on a large number of algorithms.

Dynamic decision support architectures will be required to support this momentum and be capable of drawing seamlessly on external as well as internal sources of knowledge to facilitate focussed decision capability.
Algorithms will need to evolve to underpin such architectures and act as a bulwark in this uncertain world, eventually driving it without human intervention; but only if they are self-verifying within the parameters of their human and computational environment.

Thursday, January 12, 2012

Future Enterprise- The Future of The Internet

David Hunter Tow – Director of the Future Enterprise Research Centre, forecasts that within the next decade the Internet and Web may be at risk of splitting into a number of separate entities- fragmenting under technological, national, business and social pressures.

In its place may emerge a network of networks – continuously morphing- linking and fragmenting, with no central dominant domain backbone; instead a disconnected, random structure of networks with information channeled through uncoordinated switching stations and content hubs, controlled by a range of geopolitical, social and enterprise interests.

For authoritarian states such as China, North Korea, Iran and Syria as well as criminal cartels, this will facilitate the expansion of their operations, allowing them to circumvent exposure of illegal activities in much the same way as the current Darknet network.

Darknet- the alternate network of virtual channels that currently operates beneath the backbone of the Internet- has long been a place for clandestine operations by both criminal and state networks. It is also used as a tool by cyber authorities to gather evidence of DDoS attacks, port scanning, worms and other malware; it also allows dissidents from repressive regimes to remain in touch with the outside world, provides protection to whistleblowers, and hosts pirated movie and music sites- out of reach of traditional search engines.

Autocratic governments are also maintaining increasingly tight censorship over politically sensitive sites via controlled points of entry to their cyber fiefdoms, even to the extent of distorting current and historical events. Both China and Iran now have plans to establish their own Internet infrastructure to further strengthen the control and censorship of their populations, and no doubt other authoritarian states will follow. But this power won't be limited to dictator-run states. The increasing threat of Internet censorship via the proposed SOPA- the Stop Online Piracy Act- in the US, and now the exposure of the NSA's pervasive cyberspy program, confirms the threat facing online privacy and freedom even within democratic nations, and has motivated opposition by citizens and companies concerned about the risks of storing personal and confidential corporate data in US Clouds.

At the same time white hat hacker and pro-privacy groups are launching local wireless meshnets without any centralised control, as well as their own communication satellites linked to a grid of tracking stations, in order to avoid such government surveillance and interference, as discussed at the recent Chaos Communication Congress in Berlin.

But Apple, Facebook, Google and Amazon, as well as cable and Internet TV companies, have already begun to fragment the web to support their own Walled Garden strategies of quarantining and manipulating membership data, applications, entertainment, search results and identities. Facebook membership data cannot be transferred to other social sites. Adobe's Flash software, as well as a number of developer applications, was banned by Apple, which means the iPhone browser cannot display a large portion of the Internet. Likewise Amazon's Kindle will only display books on sale or for rent by the company. Google fails to protect email privacy or adequately attribute search results to original sources.

Such social sites have become closed silos, similar in many respects to those of authoritarian states such as China.

The more this type of restricted, proprietary architecture gains traction on the Web the more it will become fragmented and the easier it will be for criminal groups to exploit, placing the open and egalitarian charter of the future Internet at risk.

But there are compelling reasons why such closed silo strategies and the gross invasion of citizen privacy, introduced by governments and mega Web companies, are likely to eventually collapse.

As outlined in previous blogs, physics ordains that information flows cannot be constrained and will eventually spread by pathways of least resistance, driven by consumer demand, competitive pressure and technological advances. In addition, biological ecosystems with limited genetic variation are the most vulnerable to extinction. Companies within the cyber ecosphere are equally vulnerable- more susceptible to competition and rapid changes in their technological and social environments if open access to innovative ideas and information flows is restricted. And balkanisation of the Internet is very bad for business- particularly US business, as companies retreat from using vulnerable Cloud and social media services.

The emergence of the Semantic Web is also a catalyst for greater openness, facilitating the interpretation, linking and application of knowledge stored in millions of discrete databases across the Web. This is a vital advance in fostering greater transparency, flexibility and autonomy within the Cybersphere.

But the battle for web control and Internet supremacy is only just beginning, not only between the US and China but also involving all other nations in the newly emerging multi-polar world. The US still maintains the controlling votes in ICANN- the domain name management body- despite many attempts to democratise its management.

But now the US will be forced to loosen its grip and stop playing the role of alpha male in an increasingly equal and diverse information world.

By its obsession with maintaining technological dominance of critical assets such as the Web- particularly in a time of global warming, with an urgent need to effectively manage global resources for all populations- the US is ironically accelerating the rise of alternate Internets and Webs.

China is charging ahead with alternate communication networks, as in most areas of new technology. After all, its search engine- Baidu- already has 500 million users, almost as many as Google worldwide. Baidu works hand in glove with the Chinese Communist Party and is the ultimate arbiter of reality for its users, committed to working within the government's paranoid censorship parameters, constrained by a massive firewall and 50,000 Internet police. But with 200 million bloggers producing trillions of words a day, as well as subscribers to RenRen and Sina Weibo- the equivalents of Facebook and Twitter- it's becoming an increasingly tough call, even for a totalitarian government.

So now the momentum is building for a multi-Internet infrastructure as governments of all colours attempt to impose their will and dominate the evolution of the pre-eminent artefact of our civilisation, which may hold the key to the planet’s survival.

In the short term China cannot replicate the mega optic fibre cable, satellite and server networks of the present Internet, but it can deploy a mesh of alternate wireless channels linking its own network assets to other friendly systems, for example in Africa, South America, Iran and Russia; at the same time constructing a topology complete with their own domain servers. In addition, it will develop its own knowledge hubs while leveraging the existing core public assets such as the priceless science, engineering, social and economic databases of the current Web.

The new US Net Neutrality rules recently introduced to prevent balkanisation are already under heavy fire, with broadband providers prevented from engaging in anti-competitive behaviour by blocking content or slowing access to sites and applications, as Comcast attempted to do in 2007 with the BitTorrent "peer-to-peer" protocol.

But as the pressure to bypass the new rules to allow a multi-speed Internet has increased, so too have the tensions been building between the major Social Web, Broadband and Cloud providers- Google, Apple, Facebook, Cisco, Verizon, Amazon, VMware etc. Cloud vendors have been erecting a new set of proprietary firewalls, with VMware the exception, adopting an open architecture to encourage developers to leverage and extend its technology.

The more such closed architectures with differing operational and security standards gain traction however, the higher the risk that the CloudSphere will eventually become fragmented, less productive and more vulnerable to hacking.

Meanwhile, despite its financial problems, the EU plans to spend billions on boosting broadband speeds to increase productivity and competitiveness. The European Commission will spend 9 billion euros to roll out super-fast broadband infrastructure and services across the European Union, aiming to create a single market for digital public services for half its population by 2020- including e-health, intelligent energy and cyber security applications- assisting utility companies, construction cooperatives, public authorities and rural users.

New Internet Architecture options are also on the horizon, with a number of innovations in train, forecast to improve the Web’s flexibility while avoiding fragmentation. But these could be put in jeopardy by the US’s intransigence over ceding control.

For example the National Science Foundation has established the Future Internet Architecture program- Nebula- to better secure the Internet: ID verification, data safety, mobile access and cloud computing. Google is also setting up a new Web architecture to improve search effectiveness.

At a recent Internet Conference run by the European Paradiso Group, a number of advanced options were discussed including- Internet routing algorithms with quantum options to provide more efficient and secure routing paths; flexible spectrum allocation; a smart Internet environment enabled by networked sensors; a content and context aware Web combined with self-organising and self-adaptive capabilities to provide more autonomy and optimisation.

In addition, the proposed Named Data Networking (NDN) architecture shifts the communication emphasis from today's focus on resource addresses, servers and hosts to one oriented to content and context. By identifying data objects instead of just locations, NDN makes the data itself the primary Internet focus. While the current Internet secures the channel or path between two communication points, adding data encryption as an extra, NDN builds security and trust into the content itself.
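A rough sketch of that shift in emphasis- using an HMAC as a simple stand-in for NDN's public-key signatures, and invented content names: the data packet carries its own name, content and signature, so any cache or peer can serve it and any consumer can verify it, regardless of the delivery path.

```python
# Sketch of content-centric security: the named data object is signed by its
# producer, so trust travels with the packet rather than the channel.
# The HMAC and names are illustrative; real NDN uses public-key signatures.
import hashlib, hmac

PRODUCER_KEY = b"producer-secret"   # stands in for the producer's signing key

def publish(name, content):
    signature = hmac.new(PRODUCER_KEY, name.encode() + content,
                         hashlib.sha256).hexdigest()
    return {"name": name, "content": content, "signature": signature}

def verify(packet):
    expected = hmac.new(PRODUCER_KEY, packet["name"].encode() + packet["content"],
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, packet["signature"])

packet = publish("/weather/sydney/2012-11-22", b"22C, light rain")
print(verify(packet))   # True, no matter which host delivered the packet
```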

These and other advances will result in the emergence of Internet Mark 3.0, following its early incarnation as a simple packet data transfer system and its transformation into a pervasive information search powerhouse over the last decade.

But Internet Mark 3.0 will only emerge if fragmentation of its infrastructure, and the ensuing chaos, is avoided.

Internet Mark 3.0 will offer complex, multidimensional and ultra-efficient processing and the dissemination of realtime multi-services and decision-making based on content and context- not just resource locations.

Such capability will drive societal transformation at hyper speed, catalysing - urbanisation, mobility, vastly improved health and education services and all forms of virtual reality, as well as the beginning of a truly symbiotic Web-Human partnership in complex decision-making.

The Future of the Web has been discussed in a number of previous blogs by the author.

In summary-

By 2015 Web 2.0- The Social Web- will have developed into a complex multimedia interweaving of ideas, knowledge and social commentary, connecting over three billion people on the planet.

By 2025, Web 3.0- The Semantic Web- will have made many important contributions to new knowledge through network science, logical inference and artificial intelligence. It will be powered by a seamless computational mesh, enveloping and connecting human and artificial life, and will encompass all facets of our social and business lives- always on and available to manage every need.

By 2035, Web 4.0- the Intelligent Web- will be ubiquitous- able to interact with the repository of all available knowledge of human civilisation- past and present, digitally coded and archived for automatic retrieval and analysis. Human intelligence will have co-joined with advanced forms of artificial intelligence, creating a higher or meta-level of knowledge processing. This will be essential for supporting the complex decision-making and problem solving capacity required for civilisation's future survival and progress.

Also by 2035 the last of the enterprise walled gardens will break down and leak like stone walls surrounding an ancient town. Techniques and technologies across the spectrum of knowledge will continue to spread, expand and link in new ways as they always have, bypassing temporary impediments, because that is the physical reality of information and knowledge.

The future Internet will inevitably follow these laws- becoming more open and flexible, using common protocols as enterprises and consumers demand greater flexibility. As an increasing number of data providers implement Tim Berners-Lee's Linked Data principles, the Web will transform into an open global Infosphere containing billions of links, coordinated by the World Wide Web Consortium.
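A minimal Linked Data sketch using the Python rdflib library, with illustrative URIs: facts are published as subject-predicate-object triples against shared identifiers, so independently published graphs merge into one queryable data space.

```python
# Toy Linked Data example: two independently published RDF graphs share URIs,
# so they can be merged into a single graph. URIs are illustrative only.
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/")

g1 = Graph()
g1.add((EX.FutureEnterprise, EX.directedBy, EX.DavidHunterTow))
g1.add((EX.FutureEnterprise, EX.topic, Literal("The Future of the Internet")))

g2 = Graph()   # a second source, published elsewhere, reusing the same URIs
g2.add((EX.DavidHunterTow, EX.writesAbout, EX.BigData))

merged = g1 + g2                 # union of the two triple sets
print(merged.serialize(format="turtle"))
```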

This will offer a blueprint for connecting information from different sources into a single global data repository, with the Global Commons and Public Domain models playing an increasingly important democratic role.

Most importantly the Web will be equally available to and controlled by all nations, under the auspices of a specially constituted UN body, devolving forever away from US control.

But this can only happen if the underlying structural integrity of the Internet and Web is preserved. If managed as a global cooperative project it will result in enormous benefits for the whole of humanity. But if the Future Internet splits and fragments along geopolitical and competitive lines, as its current evolution suggests, then much of its potential benefit for our civilisation and planet will dissipate.

The next evolutionary phase of this pre-eminent human-engineered organism of the 21st century will be critical.