Hadoop

Hadoop is a framework for running applications on large clusters of commodity hardware. The framework transparently provides applications with both reliability and data motion. Hadoop implements a computational paradigm named MapReduce, in which an application is divided into many small fragments of work, each of which may be executed or re-executed on any node in the cluster. In addition, it provides a distributed file system (HDFS) that stores data on the compute nodes, providing very high aggregate bandwidth across the cluster. Both MapReduce and the distributed file system are designed so that node failures are handled automatically by the framework.
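To make the paradigm concrete, here is a minimal sketch of the map/reduce idea in plain Python (no Hadoop required; the function names `mapper`, `reducer`, and `map_reduce` are illustrative, not part of any Hadoop API). Each mapper emits key/value pairs, the framework groups them by key (the "shuffle"), and a reducer combines each group:

```python
from collections import defaultdict

def mapper(line):
    # Emit a (word, 1) pair for every word in one line of input.
    for word in line.split():
        yield word.lower(), 1

def reducer(word, counts):
    # Combine all the counts emitted for a single word.
    return word, sum(counts)

def map_reduce(lines):
    # Shuffle phase: group mapper output by key, as the framework
    # does between the map and reduce phases.
    groups = defaultdict(list)
    for line in lines:
        for key, value in mapper(line):
            groups[key].append(value)
    return dict(reducer(k, v) for k, v in sorted(groups.items()))

print(map_reduce(["Hadoop runs on commodity hardware",
                  "Hadoop handles node failures"]))
```

In a real cluster, the mapper and reducer run in parallel on many nodes, and any fragment that fails is simply re-executed elsewhere; the word-count logic itself would be unchanged.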

Good article – covering a lot of Hadoop terminology.

http://www.zdnet.com/hadoop-how-it-became-big-datas-lynchpin-and-where-its-going-next-7000016894/?s_cid=e539&ttag=e539
