Saturday, October 6, 2012

Future Enterprise: The Future of Big Data


The Director of the Future Enterprise Research Centre, David Hunter Tow, predicts that Big Data has the potential to unlock greater value for the enterprise and society, but in the process will radically disrupt traditional organisational functions at all levels, particularly the relationship between IT departments and decision makers.
The connotation of the term ‘Big Data’ is at best extremely fuzzy and at worst highly misleading. It implicitly promises major benefits in direct relationship to the quantity of data corralled by an organisation. But Big Data is also hedged with constraints, contingencies and uncertainties, requiring the solution of a number of associated problems before it can translate into significant enterprise benefits.
Unlocking real value in the future will require:

The design of more responsive enterprise and knowledge architectures based on a network model, allowing for the delivery of realtime adaptive decision responses;

A closer relationship between the business and its social environment, enabling the enterprise to better understand the Big Picture;

The introduction of common data standards and the streaming of seamlessly integrated multiple data types;

A quantum leap in intelligence through the application of more powerful artificial intelligence-based analytic, strategic and predictive modelling tools;

An upgrade to the quality and security of current storage and processing infrastructure, beyond current cloud architectures.

Networked Architectures
A more flexible and responsive knowledge architecture for the future enterprise must reflect the reality of increasingly complex decision making, allowing far more agile reaction within the fast-changing competitive environment of the 21st century.

This will involve the introduction of networked models at both the enterprise and information levels, with nodes representing decisions and information flows linking the relationships between them; a network eventually capable of autonomous adaptation within a constantly evolving social, technological and business environment.

This model, based on optimised decision pathways, with the capacity to dynamically route information and intelligence resources, supported by autonomous agent software, to the appropriate decision makers, will be the core driver of Big Data architectures. It must be capable of integrating and analysing streams of information from multiple sources in real time, channelling computing and information resources directly to relevant decision nodes and enabling critical decisions to be implemented in optimal time frames.
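As an illustration only, the Python sketch below models decisions as nodes that subscribe to classes of signal, with a router channelling each information flow to the decision points that depend on it. The class names, topics and signals are all hypothetical; this is a minimal sketch of the routing idea, not a production architecture.

```python
# A minimal sketch of a networked decision model using a simple
# publish/subscribe routing scheme. All names are illustrative.

from collections import defaultdict
from dataclasses import dataclass, field


@dataclass
class DecisionNode:
    """One decision point in the enterprise network."""
    name: str
    topics: set                          # signal classes this decision depends on
    inbox: list = field(default_factory=list)

    def decide(self):
        # Placeholder decision logic: in practice this would invoke
        # analytic or predictive models over the accumulated signals.
        return f"{self.name}: acting on {len(self.inbox)} signal(s)"


class DecisionNetwork:
    """Routes incoming information flows to the relevant decision nodes."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def add_node(self, node: DecisionNode):
        for topic in node.topics:
            self.subscribers[topic].append(node)

    def publish(self, topic: str, signal: str):
        # Dynamic routing: each signal reaches only the decisions
        # whose pathways depend on it.
        for node in self.subscribers[topic]:
            node.inbox.append(signal)


net = DecisionNetwork()
pricing = DecisionNode("pricing", {"competitor-price", "demand"})
supply = DecisionNode("supply", {"demand", "logistics"})
net.add_node(pricing)
net.add_node(supply)
net.publish("demand", "regional demand spike detected")
net.publish("competitor-price", "rival discount announced")
print(pricing.decide())  # pricing: acting on 2 signal(s)
print(supply.decide())   # supply: acting on 1 signal(s)
```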

 
The Big Picture

The capacity of the enterprise to mesh with its physical and social environment will become increasingly vital for its survival in the future. Without such a grounded relationship, poor decision making based on an inwardly focussed mindset will continue to drive many large enterprises to bankruptcy.

It will also be insufficient to plan just one, two or five years ahead. Although near-term sales and cost forecasts are important, understanding the bigger shifts likely to impact all businesses in a future dominated by climate change, geopolitics and globalisation will be more essential to survival, allowing a better balance of creative planning, adaptive resilience and risk avoidance.

In fact enterprises, particularly the biggest, have a poor history of seeing the big picture. The larger the enterprise, the more likely it is to believe in its own invincibility in the marketplace.

In more recent times both Ford and GM virtually went bankrupt and had to be bailed out by the public purse because they would not or could not see the obvious shift in consumer sentiment towards smaller cars with lower fuel usage. Then there were AIG, Lehman Brothers, Citibank and Fannie Mae, which also thought they were too big to fail. And now giants such as Kodak and Sony, among many others, are struggling.

In all the above cases, enterprise management ignored the signals coming loud and clear from their environments via consumers and customers, through a combination of ignorance and arrogance. In the meantime, more agile companies such as Microsoft picked up the trend towards desktop computing and exploited the opportunities left by IBM. But then Microsoft almost lost the plot to Google by not seeing the emerging power of the Internet as the dominant driver of information in today's society.

So despite the latest business intelligence software busily scavenging for patterns in Big Data drawn from past customer and financial records and standard industry forecasts, plus some glitzy dashboard software, typical BI analysis without the guidance of the big picture will be virtually useless as a pathway to an uncertain future.

Big Data now has the potential to open the door to this broader and more inclusive vision, which has rarely been a priority for most enterprises in the past. But that will now change. Survival, particularly for larger enterprises, will depend on interpreting the bigger picture's impact. If managed effectively, Big Data will be the catalyst that provides a hedge against this myopia, but only if management's mindset becomes more flexible and humble.

The Intelligent Enterprise
Big Data will also trigger the need for the enterprise to become much smarter, utilising the latest artificial intelligence and statistical techniques such as evolutionary algorithms, neural networks and Bayesian logic. The latest 'Smart Planet' paradigm shift, in which the infrastructure, business and environmental processes of the planet are being re-engineered to optimise performance and achieve more sustainable outcomes, will also be a major driver for the networked smarter enterprise of the future.
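To make one of the named techniques concrete, here is a toy evolutionary algorithm in Python that evolves a single business parameter against an invented fitness function. It is a minimal sketch, assuming a made-up profit curve, not an enterprise-grade optimiser.

```python
# A toy evolutionary algorithm: select the fittest candidates,
# mutate them, repeat. The fitness function is illustrative only.

import random

def fitness(price):
    # Hypothetical profit curve: revenue rises then falls with price.
    return price * max(0.0, 100.0 - price)

def evolve(generations=50, pop_size=20, mutation=2.0):
    population = [random.uniform(0, 100) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half of the population.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        # Mutation: each survivor spawns a slightly perturbed child.
        children = [p + random.gauss(0, mutation) for p in survivors]
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
print(f"best price ~ {best:.1f}")  # converges near the optimum of 50
```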

The Smart Planet imperative will demand that decisions relating to society’s survival and wellbeing be made more rigorously, efficiently, adaptively and therefore autonomously.
But a Smart Planet revolution without a Smarter Enterprise mindshift won’t compute.

Part of that mindshift will be a far greater emphasis on the future. The past is still a poor predictor of what's to come. To improve the quality of decision-making, serious realtime forecasting capability will need to be boosted. Such systems will rely on information collected from numerous internal and external sources, using artificial intelligence algorithms and rules to analyse the vital signals and interconnected trends and patterns that generate complex outcomes. The results of this analysis will then be channelled autonomously to the appropriate decision-makers for action.
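As a purely illustrative sketch of such a system, the fragment below matches incoming signal readings against simple rules and channels the resulting alerts to the relevant decision-makers. All sources, metrics, thresholds and role names are invented for the example.

```python
# Rule-based signal analysis: signals arrive as (source, metric, value)
# readings and are matched against rules that route alerts to owners.

SIGNALS = [
    ("sensor-grid", "load", 0.93),
    ("market-feed", "volatility", 0.41),
    ("web-trend", "sentiment", -0.6),
]

# Each rule: (metric, predicate, decision-maker to notify, message)
RULES = [
    ("load", lambda v: v > 0.9, "operations", "grid load approaching limit"),
    ("sentiment", lambda v: v < -0.5, "marketing", "negative sentiment trend"),
]

def analyse(signals, rules):
    # Match every incoming signal against every rule and channel
    # the resulting alerts to the appropriate decision-maker.
    alerts = []
    for source, metric, value in signals:
        for rule_metric, predicate, owner, message in rules:
            if metric == rule_metric and predicate(value):
                alerts.append((owner, f"{message} (from {source}: {value})"))
    return alerts

for owner, alert in analyse(SIGNALS, RULES):
    print(f"-> {owner}: {alert}")
```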

But despite a range of mathematical improvements in our foresight and modelling methods, developed in tandem with a broader understanding of scientific and social principles, the corporate capacity to forecast effectively has been sadly lacking when data doesn’t follow obvious trends or when the signals of emerging change are faint.

Most forecasting textbooks traditionally list a number of well-developed techniques based around time series projections, regression analysis, Delphi and expert scenario methods, as well as artificial neural networks and simulation modelling. But these have usually failed to predict the future in times of abrupt change within the broader physical, social and economic environment, such as the recent extreme shocks of the global financial crisis or the Arab democratic revolutions.
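A small worked example makes the failure mode concrete: a least-squares trend fitted to stable history (the standard time-series projection named above) extrapolates smoothly past an abrupt structural break it has no way of seeing. The figures are invented.

```python
# Why trend projection fails at abrupt change: an ordinary
# least-squares line fitted to steady history cannot anticipate
# a structural break in the next period.

history = [100, 103, 106, 110, 113, 116]   # six periods of steady growth
actual_next = 70                           # an abrupt shock, e.g. a crisis

n = len(history)
xs = range(n)
mean_x = sum(xs) / n
mean_y = sum(history) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

forecast = intercept + slope * n           # projection for the next period
print(f"forecast: {forecast:.1f}, actual: {actual_next}")
# forecast: ~119.4, actual: 70 -- the projection misses the break entirely.
```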
The next phase in this evolution will be models powerful enough not just to deliver predictions but to accurately prioritise the resources needed to manage those predictions, develop project plans for their implementation and then track the results to check their effectiveness; in other words, to apply textbook automatic feedback control principles much more rigorously. After a bridge or power grid has been built, its maintenance needs to be permanently and autonomously managed, tracked by the latest generation of intelligent sensors, to prevent future catastrophic failures or escalating rebuild costs.
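The following minimal sketch illustrates that textbook feedback-control principle applied to sensor-tracked asset maintenance. The condition readings, setpoint and gain are invented, and a simple proportional loop stands in for a real controller.

```python
# A proportional feedback loop: compare a sensed asset condition
# with a target, and apply corrective maintenance proportional to
# the deviation. All values are illustrative.

def feedback_step(reading, setpoint, gain=0.5):
    """One iteration of a proportional control loop."""
    error = setpoint - reading          # how far condition has drifted
    correction = gain * error           # maintenance effort to apply
    return correction

condition = 1.0                         # 1.0 = as-built condition
for month in range(1, 7):
    condition -= 0.08                   # wear degrades the asset
    effort = feedback_step(condition, setpoint=0.9)
    if effort > 0:
        condition += effort             # corrective maintenance applied
    print(f"month {month}: condition {condition:.2f}")

# The loop stabilises the asset near (just below) the setpoint;
# proportional control alone leaves a small steady-state error.
```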

This common sense methodology of feedback and continuous monitoring of outcomes has been sadly lacking in many enterprises, but will now be essential if business and society are to survive the onslaught of massive future shock. It will involve scanning for emerging problems, aggregating data streams from millions of internet-connected sensor systems and monitoring the pulse of the global environment, not just at the business level but also at the political, technological and environmental flashpoints.

Processes based on Big Data therefore need to be recognised as the beginning of our civilisation's survival fightback: applying adaptive and responsive techniques based on massive datasets, largely autonomously, because they are so big and complex that manual methods will fail, to the optimisation of the design, maintenance and operation of every process and application on the planet.

The Cloud Solution

Collecting and storing the tsunami of data resulting from Big Data overload is a major stumbling block to the above goal, already creating unforeseen problems for the average enterprise by generating exponentially exploding datasets, as the science communities of astronomers, biologists, cosmologists and particle physicists have already discovered.

Traditional relational SQL databases and SOA architectures, and even batch-oriented Hadoop clusters, are not optimised for such massive real-time processing, particularly as much of the data in the future will be unstructured, garnered from heterogeneous sources such as web pages, videos, RSS feeds, market intelligence, statistical data, electronic devices, instrumentation, control systems and sensors.

But just in time, Cloud processing management has emerged, offering an alternative solution which few large organisations will be able to resist. Now they will have the seductive choice of offloading the complete data management side of their operations to third parties in return for economies of scale and flexibility. The tradeoff is partial loss of control, but over time, providing security, backup and service levels are maintained at a rigorous standard, the organisation should benefit by being able to improve its focus on the core critical aspects of its operations. Only time and verification will tell if this tradeoff can deliver on its promise.
The IT department will become virtually invisible to decision-makers, its primary task being to select the appropriate tools to implement enterprise strategies.

Cloud computing will eventually offer a complete managed haven of services for Big Data: software, security, processing, storage, hardware and infrastructure. All are now in the offing. But in the near future, Knowledge as a Service is likely to presage the greatest change within the Future Enterprise ecosystem.
Real-time integration of disparate data and application methodologies is a key challenge here, with the current conventional multi-stage approach being: build a data warehouse to consolidate storage, aggregate the information sources, select a BI tool and then process user queries. But this is already proving expensive, slow and error-prone.

A number of innovative platforms are being developed in this sector based on enterprise information streaming models. These provide a virtual unified view of the data stream without first transferring it to a central repository and also point the way to the next step of fully autonomous tool selection and decision support.
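As a hedged illustration of that streaming model, the sketch below exposes several live feeds as one virtual unified stream and runs a query directly over it, with no central repository. Generators stand in for real feeds; every name here is hypothetical.

```python
# A virtual unified view over live data streams: queries filter the
# merged stream directly instead of landing data in a warehouse first.

import itertools

def sales_feed():
    for v in [120, 95, 140]:
        yield {"source": "sales", "value": v}

def sensor_feed():
    for v in [0.7, 0.9]:
        yield {"source": "sensors", "value": v}

def unified_view(*feeds):
    # Merge records from every feed into one logical stream.
    for record in itertools.chain(*feeds):
        yield record

def query(stream, source):
    # A query is just a filter applied to the live unified stream.
    return (r["value"] for r in stream if r["source"] == source)

print(list(query(unified_view(sales_feed(), sensor_feed()), "sales")))
# [120, 95, 140]
```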

Future Shock

It is now clear that the global environment is placing enormous pressure on all organisations, not just from a competitive perspective but from the need to upgrade ethical and sustainability standards; and this will continue at an accelerating pace. The changing technological environment in particular is already disrupting entire service industries, e-commerce first of all: retailing, banking, trading and supply. Now a second wave of service industries, manufacturing, healthcare, education, media, advertising, legal, hospitality and travel, is being turned upside down by the revolution.

In this revolution Big Data is acting as a major catalyst, offering the glittering prize of untold added value, but it will deliver this cornucopia only if it is also agile and precisely targeted, meeting the specific needs of multiple domains. Specialised decision-making in financial, biological, medical, cosmological, pharmaceutical, government, media and legal applications will require different classes of algorithmic support. And even domain analytics specialists may soon be obsolete as expert domain algorithms, generated from the ever expanding cumulative knowledge of the Web, begin to dominate the decision process. Critically, however, such algorithms will need to be continuously verified and adapted within a shifting social and business environment.

Because of the rate of innovation and subsequent disruption, service-based systems will therefore need to be self-adaptive, applying intelligent algorithms to support new options as well as the growth of collaborative ventures involving multiple stakeholders, as commonly occur in industries such as hospitality, travel and real estate.

New technological innovations such as smartphones and tablets are also increasingly filling mobile gaps and shortfalls in existing services, for example by starting to displace traditional credit cards and banking in the lucrative payments market, and by enabling the personalisation of healthcare and educational services in remote areas.

Such upgrades in the service sector imply the use of increasingly pervasive Big Datasets with low access latencies. Response timeframes are critical, with cumbersome reporting and query tools far too slow for today's end-user needs. So the days of manual intervention in the decision process are drawing to a close, as global markets demand decisions involving hundreds of variables, delivered instantly.

So the stage is set. The filtering, pattern matching and super-intelligent analytic processing required to make sense of the overload of Big Data will mean that human intervention in the decision process inevitably becomes a significant bottleneck.

But the future smart enterprise must have the flexibility to focus and deploy its cooperative intelligence autonomously, at all levels of the organisation. This will be a proactive response to new opportunities and competitive pressures in the marketplace.

The volume and complexity of decision-making will continue to increase rapidly over time in response to the changing social, geopolitical and technological environment. The resulting network interactions involving customers, supply chains, services, markets and logistics will eventually make it impossible for humans to compete. It will become just too complex and time-consuming even for dedicated teams of humans to manage, just as it is already impossible to manually control complex trading, production and marketing operations, chemical plants or space missions today.

The IT centre will rapidly transform into tomorrow's Knowledge Technology Centre, or KT Centre. This will place further pressure on the need for real-time, high-quality decision-making.

By 2030 humans will become partners in enterprise decision processes powered by intelligent algorithms based on realtime knowledge outcomes plus research encapsulated in the Intelligent Web. But over time their input, as for airline pilots and fast train drivers today, will be largely symbolic.

Big Data will therefore have provided a major catalyst for an extreme makeover of the future enterprise, the business environment, society and the planet.

 



