BigData / Hadoop basics
What is Apache Hadoop?
Hadoop is an open-source framework that facilitates the distributed processing of large data sets across clusters of computers using simple programming models.
Hadoop provides a reliable, scalable approach to distributed computing.
It is designed to scale up from a single machine (server) to thousands of machines, each offering its own local computation and storage. Rather than relying on hardware for high availability, the Hadoop software library is designed to detect and handle failures at the application layer, so it can deliver a highly available service on top of a cluster of computers/servers, each of which may be prone to failure.
Hadoop itself is written in Java.
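The "simple programming model" mentioned above is MapReduce: a map phase emits key/value pairs and a reduce phase aggregates them per key. As a rough sketch only, here is the classic word-count example in plain Java (no Hadoop APIs; class and method names are illustrative). Hadoop would distribute the two phases across a cluster, while this version runs both in a single JVM:

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

// Minimal illustration of the map/reduce idea that Hadoop popularized.
public class WordCountSketch {
    public static Map<String, Long> wordCount(String text) {
        return Arrays.stream(text.toLowerCase().split("\\W+"))
                .filter(w -> !w.isEmpty())
                // "map" phase: emit each word as a key;
                // "reduce" phase: sum the occurrences per key
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts = wordCount("to be or not to be");
        System.out.println(counts.get("to")); // prints 2
        System.out.println(counts.get("be")); // prints 2
    }
}
```

In real Hadoop jobs the same logic is expressed as Mapper and Reducer classes, and the framework handles splitting the input, shuffling keys to reducers, and retrying failed tasks.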