The history of Big Data analytics can be traced back to the early days of computing, when organizations first began using computers to store and analyze large amounts of data.

However, it was not until the late 1990s and early 2000s that Big Data analytics really began to take off, as organizations increasingly turned to computers to help them make sense of the rapidly growing volumes of data being generated by their businesses.

Today, Big Data analytics has become an essential tool for organizations of all sizes across a wide range of industries. By harnessing the power of Big Data, organizations are able to gain insights into their customers, their businesses, and the world around them that were simply not possible before.

The milestones that led to today's big data revolution span from statistical analysis in the 1600s, to the first programmable computers in the 1940s, to the internet, Hadoop, IoT, AI and more.

As the field of Big Data analytics continues to evolve, we can expect to see even more amazing and transformative applications of this technology in the years to come.

The concept of big data has been around for years; most organizations now understand that if they capture all the data that streams into their businesses (potentially in real time), they can apply analytics and get significant value from it.

This is particularly true when using sophisticated techniques like artificial intelligence. But even in the 1950s, decades before anyone uttered the term “big data,” businesses were using basic analytics (essentially, numbers in tables that were examined by hand) to uncover insights and trends.

Among the biggest benefits of big data analytics are speed and efficiency. Just a few years ago, businesses gathered data, ran analytics and unearthed insights that could inform future decisions.

Today, businesses can collect data in real time and analyze big data to make immediate, better-informed decisions. The ability to work faster – and stay agile – gives organizations a competitive edge they didn’t have before.

Big data has revolutionized the modern business environment in recent years. A mixture of structured, semistructured and unstructured data, big data is a collection of information that organizations can mine for business purposes through machine learning, predictive modeling, and other advanced data analytics applications.

The bedrock of big data

This was a foundational period in which people began turning to statistics and analysis to make sense of the world around them.

1663

John Graunt introduces statistical data analysis with his study of the bubonic plague. The London haberdasher published one of the first collections of public health records when he recorded death rates and their variations during outbreaks of the plague in England.

1865

Richard Millar Devens coins the term “business intelligence.” As we understand it today, business intelligence is the process of analyzing data, and then using it to deliver actionable information. In his “Cyclopædia of Commercial and Business Anecdotes,” Devens described how a banker used information from his environment to turn a profit.

1884

Herman Hollerith invents the punch card tabulating machine, marking the beginning of automated data processing. The tabulating device Hollerith developed was used to process data from the 1890 U.S. Census. In 1911, his Tabulating Machine Company merged with others to form the Computing-Tabulating-Recording Company, which would eventually become IBM.

1926

Nikola Tesla predicts humans will one day have access to large swaths of data via an instrument that can be carried “in [one’s] vest pocket.” Tesla managed to anticipate our modern affinity for smartphones and other handheld devices based on his understanding of wireless technology: “When wireless is perfectly applied, the whole earth will be converted into a huge brain, which in fact it is, all things being particles of a real and rhythmic whole. We shall be able to communicate with one another instantly, irrespective of distance.”

1928

Fritz Pfleumer invents a way to store information on tape. Pfleumer’s process of coating strips of paper with magnetizable metal powder eventually led him to create magnetic tape, which formed the foundation for video cassettes, movie reels and more.

1943

The U.K. builds one of the first electronic data processing machines to decipher Nazi codes during WWII. The Colossus, as it was called, performed Boolean and counting operations to analyze large volumes of intercepted data.

1959

Arthur Samuel, a programmer at IBM and pioneer of artificial intelligence, coins the term machine learning (ML).

1965

The U.S. government plans the first data center buildings to store millions of tax returns and fingerprint sets on magnetic tape.

1969

The Advanced Research Projects Agency Network (ARPANET), the first wide area network with distributed control and, later, the TCP/IP protocols, is created. It formed the foundation of today’s internet.

1989

Tim Berners-Lee and Robert Cailliau propose the World Wide Web and develop HTML, URLs and HTTP while working at CERN. The internet age of widespread, easy access to data begins.

1996

Digital data storage becomes more cost-effective than storing information on paper for the first time, as reported by R.J.T. Morris and B.J. Truskowski in their 2003 IBM Systems Journal paper, “The Evolution of Storage Systems.”

1997

The domain google.com is registered a year before the search engine launches, beginning Google’s climb to dominance and its development of numerous technological innovations, including in machine learning, big data and analytics.

1998

Carlo Strozzi develops NoSQL, a lightweight open source relational database that does not use SQL as its query language. The name was later adopted for a broad class of databases that store and retrieve data modeled differently from the traditional tabular methods of relational databases.

1999

The first edition of the influential study How Much Information?, by Hal R. Varian and Peter Lyman (published in 2000), attempts to quantify the amount of digital information in the world based on data from 1999.

2001

Doug Laney of analyst firm Gartner coins the 3Vs (volume, variety and velocity), defining the dimensions and properties of big data. The Vs encapsulate the true definition of big data and usher in a new period where big data can be viewed as a dominant feature of the 21st century. Additional Vs — such as veracity, value and variability — have since been added to the list.

2005

Computer scientists Doug Cutting and Mike Cafarella create Hadoop, the open source framework used to store and process large data sets; Yahoo later builds an engineering team around the project, which becomes Apache Hadoop.

2006

Amazon Web Services (AWS) starts offering web-based computing infrastructure services, now known as cloud computing. Currently, AWS dominates the cloud services industry with roughly one-third of the global market share.

2008

The world’s CPUs process over 9.57 zettabytes (or 9.57 trillion gigabytes) of data, roughly 12 gigabytes per worker per day. Global production of new information hits an estimated 14.7 exabytes.

2009

Gartner reports business intelligence as the top priority for CIOs. As companies face a period of economic volatility and uncertainty due to the Great Recession, squeezing value out of data becomes paramount.

2011

McKinsey reports that by 2018 the U.S. will face a shortage of analytics talent: between 140,000 and 190,000 people with deep analytical skills, plus a further 1.5 million analysts and managers with the ability to make accurate data-driven decisions.

Also, Facebook launches the Open Compute Project to share specifications for energy-efficient data centers. The initiative’s goal is to deliver a 38% increase in energy efficiency at a 24% lower cost.

2012

The Obama administration announces the Big Data Research and Development Initiative with a $200 million commitment, citing a need to improve the ability to extract valuable insights from data, accelerate growth in STEM (science, technology, engineering, and mathematics), enhance national security and transform learning. The acronym has since become STEAM, with an A added for the arts.

2013

The global market for big data reaches $10 billion.

2014

For the first time, more mobile devices access the internet than desktop computers in the U.S. The rest of the world follows suit two years later, in 2016.

2016

Ninety percent of the world’s data has been created in the past two years alone, and IBM reports that 2.5 quintillion bytes of data are created every day (that’s a number with 18 zeroes).

2017

IDC forecasts that the big data analytics market will reach $203 billion by 2020.

2022

Allied Market Research reports the big data and business analytics market hit $193.14 billion in 2022, and estimates it will grow to $420.98 billion by 2027 at a compound annual growth rate of 10.9%.