By: William Alexander
Still remember the days when your computer had just 512MB of RAM, or even less? Computer memory has grown over the years, both internal and external, and storage technologies have grown with it. Storage capacity has climbed from kilobytes through megabytes, gigabytes, terabytes, petabytes, exabytes and zettabytes to yottabytes. Given the rate at which the world's data is growing, and with the rise of the "Internet of Things", it is set to grow even further.
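The unit ladder above is easy to pin down with a few lines of code. This is just an illustrative sketch using the decimal (SI) convention of 1000 bytes per step; binary (1024-based) units such as kibibytes follow the same pattern with a different base.

```python
# The storage-unit ladder from kilobyte to yottabyte,
# using decimal (SI) multiples: each unit is 1000x the previous.
UNITS = ["kilobyte", "megabyte", "gigabyte", "terabyte",
         "petabyte", "exabyte", "zettabyte", "yottabyte"]

def unit_in_bytes(name):
    """Return the size of the named unit in bytes (SI convention)."""
    return 1000 ** (UNITS.index(name) + 1)

for u in UNITS:
    print(f"1 {u} = {unit_in_bytes(u):,} bytes")
```

Running this makes the scale of the growth concrete: a yottabyte is a trillion terabytes.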
In fact, Gartner describes the growth of data along three dimensions, the so-called "3Vs": Volume, Velocity and Variety. In other words, data is growing in high volume, at high speed, and in many different forms. Although the data is already huge and growing every minute, the software needed to manage, search, visualize and mine it is lagging behind. Many computer scientists argue that today's software, such as relational databases, can only cope with data up to the exabyte scale. Anything bigger than that, in other words Big Data, requires new and better software applications.
Mobile devices, remote sensors, software logs, cameras, microphones, RFID readers, wireless sensors, social networks and the like are generating this Big Data. It is difficult to work with using existing RDBMS and data-visualization technologies; instead, massively parallel software running on hundreds of servers is needed to process such data, whether to find business trends, assess the quality of research, determine real-time traffic conditions or prevent diseases.
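The "massively parallel software" idea can be sketched in miniature with the classic map/reduce word count: split the data into chunks, count words in each chunk on a separate worker, then merge the partial counts. This is only a toy sketch of the pattern (the sample chunks are made up, and real systems such as Hadoop distribute the work across machines, not local processes).

```python
# Minimal map/reduce-style word count over chunks of text,
# run in parallel with a local process pool.
from collections import Counter
from multiprocessing import Pool

def count_words(chunk):
    """Map step: word counts for one chunk of text."""
    return Counter(chunk.split())

def merge_counts(partials):
    """Reduce step: combine per-chunk counts into one total."""
    total = Counter()
    for p in partials:
        total.update(p)
    return total

if __name__ == "__main__":
    # Hypothetical sample input; in practice each chunk would be
    # a file block or log segment on a different server.
    chunks = ["big data big trends", "sensor logs big data"]
    with Pool(2) as pool:
        partials = pool.map(count_words, chunks)
    print(merge_counts(partials).most_common(3))
```

The same map and reduce functions scale from two local processes to hundreds of servers; only the layer that distributes the chunks changes.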
As a developer, it is time for you to imagine, understand and equip yourself with Big Data tools and technologies, and to closely follow leaders such as IBM, Microsoft, Google and the like to keep up to date on Big Data developments.