TechTarget recently published a Big Data article entitled "The Evolution of the 'Big Data' Concept." What sets this article apart is that it covers the history of Big Data rather than its current state and how to leverage it.

The term “big data” has been around for decades. A Quora posting provides an example of its usage dating back to 1987. Almost 10 years later, in 1996, Silicon Graphics International Corp.’s chief scientist, John Mashey, gave a talk called “Big data and the next wave of ‘infrastress.’”

“[Infrastress is] stress on the infrastructure of computing,” he said in a 1999 interview with Government Computer News. “It’s what happens when technologies move at different speeds and put stress on the parts that aren’t moving so fast.”

In his presentation, Mashey explained that CPUs, memory and disk space were advancing faster than other aspects of computing, such as bandwidth and file systems. This disparity can create bottlenecks and instability and force businesses to find workarounds, he said.

At the time, Mashey typically used big data to refer to the growth in data volume, pointing to a relatively new data source known as the Internet, and discussed its impact on storage systems.

Aside from the internet (primarily e-commerce), the author also notes that sensors are another factor in Big Data evolving to where it is today. The internet, combined with sensors and scanners, caused data to be generated at astronomical rates.

I like the point the author makes that it is not about data growth, but about being able to use and transform data. The ultimate challenge of Big Data is integrating multiple rapidly growing data sources for “enhanced insight discovery, decision-making and process automation”.
