There’s this idea out there that technology develops in more or less a straight line. Phones get more powerful, storage becomes easier, performance gets faster. There might be big jumps from time to time as companies discover new ways of approaching a problem, but progress is more or less forward.
The classic technological axiom Moore’s Law is an example of this. (In case you’ve forgotten, Moore’s Law in its popular form posits that processor performance will double roughly every 18 months; Gordon Moore’s original observation was that transistor counts double roughly every two years.) No matter what else happens, we’ll keep building faster and smaller processors.
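To get a feel for how quickly that popular form compounds, here is a minimal sketch; the 18-month doubling period is the only assumption:

```python
def relative_performance(months, doubling_period=18):
    """Performance multiplier after `months`, assuming one doubling
    every `doubling_period` months (Moore's Law, popular form)."""
    return 2 ** (months / doubling_period)

# After 6 years (72 months) that is 4 doublings, i.e. a 16x speedup.
print(relative_performance(72))  # 16.0
```

At that rate, a decade buys roughly a hundredfold improvement, which is why the linear track feels so relentless.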
Two routes to progress
I would argue, however, that there are two kinds of technological development. Moore’s Law is part of the first kind, the linear kind (exponential growth that plots as a straight line on a logarithmic scale). But there’s a second pattern of development that’s much more interesting to observe over time. Technology can develop in a cyclical pattern:
- Web browsers. When the internet became broadly available in the mid-1990s, there were multiple browsers: Internet Explorer, Netscape, Opera. Market share consolidated around one major browser, Internet Explorer, before breaking out again into the four or five major options you see today – Internet Explorer, Chrome, Safari, Firefox, and browsers built specifically for mobile devices.
- Servers. Back in the day, mainframes were the hardware homes that managed a business’s operations. Then we decentralized dramatically by shifting to PCs. Then we re-centralized, first with client-server systems and later in the cloud.
- Databases. When I started working with business software in the mid-’80s, we began each application by programming the underlying flat-file database. But when relational databases debuted, coding your own DB became gauche and unnecessary. From there, we moved to multi-dimensional DBs for analytics processing. With the promise of in-memory data processing, however, it’s fair game to program databases from scratch again. As we become more Big Data-savvy, it is even OK to go back to flat-file storage if it allows massively parallel processing of data. In 25 years, we’ve come completely full circle on database building.
- Privacy. In the Stone Age, communities of people knew everything there was to know about everybody else. There were no secrets, because secrets didn’t help the group in its quest for survival. Humanity spent the following millennia, and the past couple of centuries in particular, fighting to achieve new degrees of personal freedom and privacy. And suddenly, within the last decade, we’ve gone completely in the other direction, largely due to social media and the advent of our mobile “virtual ever-presence”.
There are ongoing trends in both development tracks, the linear and the cyclical. On the cyclical side, I think we’ll see many of the applications that moved to software in the last 20 years come back to hardware. Servers are already being designed specifically for analytics-heavy tasks (due in part to the advent of Big Data), and demand for servers that can handle the load is exploding. Instead of designing solutions to make up performance gaps purely in software, it will make more sense to rig the processor for specific uses. For example, Intel recently launched its own version of Big Data encryption for Hadoop directly on its Xeon chips.
Bringing the cloud back to Earth
This may sound counterintuitive, but I believe we’re about to see a movement away from the cloud as the desired platform for high-performance analytics. More and more companies are competing on analytics in their business areas, and as a result, the platform closest to home will offer the best options for tuning towards high-performance analytics. At VLDB 2012, one listener asked me, “If all users need 100% of processing capacity at the same time, does virtualization in the cloud make sense?” The answer is no – the underlying system designed to support, balance, and elasticize workloads becomes unnecessary overhead.
In the future, companies will need 100% of the processing power they can get their hands on at all times for algorithmic or exploratory analytics. It won’t make sense to use the cloud anymore.
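A toy model of that argument (all numbers assumed, including the hypothetical 10% virtualization overhead): pooling capacity pays off when tenants peak at different times, but when every tenant demands 100% at once, the virtualization layer is pure loss.

```python
OVERHEAD = 0.10  # assumed fraction of capacity consumed by the virtualization layer

def capacity_per_tenant(machines, tenants, simultaneous_demand):
    """Machines' worth of capacity each active tenant gets, where
    `simultaneous_demand` is the fraction of tenants peaking at once."""
    active = max(1, round(tenants * simultaneous_demand))
    return machines * (1 - OVERHEAD) / active

# 10 machines shared by 10 tenants:
print(capacity_per_tenant(10, 10, 0.5))  # staggered peaks: ~1.8 machines each
print(capacity_per_tenant(10, 10, 1.0))  # everyone at once: ~0.9, worse than a dedicated box
```

When peaks stagger, each active tenant gets nearly twice a dedicated machine; when all peak together, each gets less than one, so the overhead buys nothing.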
Looking out the window to check the weather
From a linear perspective, there’s a trend of businesses needing more and more information to make educated decisions, and the amount of external information used for those decisions is growing accordingly. Social data and open data play a big part of business decision-making, and the solutions within an organization have to be ready to accommodate that.
In this context, it is notable that internal and external data grow at different paces. The data we hold inside a company will likely grow at a pace close to the growth rate of the company itself. From a numbers perspective, very few companies sustain growth above 100% for long; a big company will be considered very successful at yearly growth rates of 20-30%.
According to IDC’s newest estimate (2011), there are 1.8 zettabytes of digital data in the world, projected to grow into 7.9 zettabytes by 2015. (A zettabyte is 10²¹ or 1,000,000,000,000,000,000,000 bytes, 1 trillion gigabytes, or 1 quadrillion megabytes.) The pace of growth is driven by external data, rather than internal data.
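As a quick sanity check on those figures, the compound annual growth rate implied by going from 1.8 ZB in 2011 to 7.9 ZB in 2015 works out to roughly 45% per year, well above the 20-30% at which even a very successful large company grows:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by growing from `start` to `end`."""
    return (end / start) ** (1 / years) - 1

rate = cagr(1.8, 7.9, 2015 - 2011)
print(f"{rate:.1%}")  # roughly 45% per year
```

That gap between ~45% total data growth and 20-30% internal growth is what drives the tipping point described next.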
Therefore, we are steadily approaching a tipping point where the majority of data relevant for competitive analytics will come from outside our organizations. Since we will then have to compete on data that doesn’t “belong” to us, this is a total paradigm shift in the way business decisions are made. Data you have today may be gone or made unavailable by its owner tomorrow, and we may never have full influence on, or knowledge of, its quality. But it’s still pertinent to the decision you have to make now, and it’s critical to analyze that data to uncover the useful bits of information that can make your decision more informed.
What do you think are the other big cyclical and linear technology development trends, and where are they going?
Curious to know more?
Check out our product pages or go to Dr. Morton's lab
Dr. Morten Middelfart
Founder and Chairman of Social Quant
I've been working professionally in the software industry since I was 14 years old, and my passion for computers has never stopped growing. Today, I'm deeply involved in educational activities that advocate my research within business intelligence and analytics. By the time I was 25, I had established Morton Systems, my first business intelligence and analytics c..