Many factors contributed to the emergence of today's big data ecosystem, but there is general consensus that big data came about because of a range of hardware and software advances that simply allowed big data to exist.
A conventional definition of big data is as follows: data sets sufficiently large and complex that they defy easy iterative management or management by hand. Big data sets are often identified as data sets that cannot fit into a single conventional database system because their analysis demands too much work from the servers handling the data.
With that in mind, a major part of what created big data is the idea we know as Moore's Law: the doubling of transistors on a circuit roughly every two years, yielding ever smaller hardware and data storage devices (as well as more powerful microprocessors). In conjunction with Moore's Law, and probably because of it, the computing capacity of accessible software systems kept increasing, to the point where even personal computers could handle much larger amounts of data, and business and vanguard systems began to handle data sizes inconceivable only a few years before. Personal systems moved from kilobytes to megabytes, and then to gigabytes, in a process transparent to consumers. Vanguard systems moved from gigabytes to terabytes and petabytes, and on toward orders of magnitude like zettabytes, in ways far less visible to the average citizen.
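The doubling described by Moore's Law compounds quickly, which is why capacities jumped by whole orders of magnitude within a couple of decades. A minimal sketch of that arithmetic, with the starting transistor count and time span chosen purely for illustration:

```python
# Hypothetical illustration of Moore's Law scaling: a count that doubles
# roughly every two years. The starting figure (1 million transistors)
# and the 20-year span are assumptions chosen for the example.

def moores_law_projection(initial_count: int, years: int,
                          doubling_period: float = 2.0) -> int:
    """Project a count forward, assuming one doubling per `doubling_period` years."""
    return int(initial_count * 2 ** (years / doubling_period))

# A chip with ~1 million transistors, projected 20 years out,
# undergoes 10 doublings: a factor of 1,024.
projected = moores_law_projection(1_000_000, 20)
print(projected)  # → 1024000000
```

Ten doublings multiply the starting figure by about a thousand, which mirrors the kilobyte-to-megabyte-to-gigabyte jumps the text describes: each unit step is itself roughly a factor of a thousand.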
Another advance accommodating big data was a change in the way handlers processed data sets. Rather than processing data linearly through conventional relational database designs, handlers began using tools like Apache Hadoop and related cluster-management components to distribute work and eliminate bottlenecks in data processing.
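The key idea behind tools like Hadoop is the map/reduce model: split a large data set into chunks, process the chunks in parallel, then merge the partial results. A minimal single-machine sketch of that model in Python (this illustrates the processing pattern, not the Hadoop API itself; the function names and the four-worker pool are assumptions for the example):

```python
# A minimal map/reduce-style word count, sketching the processing model that
# tools like Apache Hadoop popularized: instead of scanning one large data
# set linearly, the work is split into chunks, mapped in parallel across
# workers, and the partial results are then reduced into one answer.

from collections import Counter
from functools import reduce
from multiprocessing import Pool

def map_chunk(chunk: list[str]) -> Counter:
    """Map step: count words within one chunk of input lines."""
    return Counter(word for line in chunk for word in line.split())

def reduce_counts(a: Counter, b: Counter) -> Counter:
    """Reduce step: merge partial counts produced by independent workers."""
    return a + b

def word_count(lines: list[str], workers: int = 4) -> Counter:
    # Split the input into roughly equal chunks, one unit of work per worker.
    size = max(1, len(lines) // workers)
    chunks = [lines[i:i + size] for i in range(0, len(lines), size)]
    with Pool(workers) as pool:
        partials = pool.map(map_chunk, chunks)   # the parallel "map" phase
    return reduce(reduce_counts, partials, Counter())  # the "reduce" phase

if __name__ == "__main__":
    data = ["big data big systems", "data centers store data"]
    print(word_count(data))
```

Because each chunk is processed independently, the same pattern scales from processes on one machine to nodes in a cluster, which is what removes the single-server bottleneck the text describes.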
The result is the big data world that we live in, where massive data sets are stored and maintained in data centers, and increasingly accessed by a wide range of technologies for a wide range of uses. From commerce to ecology, from public planning to medicine, big data is becoming more and more accessible. Meanwhile, government agencies and other large organizations are still pushing the boundaries of big data sizes and implementing even more advanced solutions.