

BigData / Hadoop basics

What is Apache Hadoop?

Hadoop is an open-source framework that facilitates the distributed processing of large data sets across clusters of computers using simple programming models.

Hadoop provides a reliable, scalable platform for distributed computing.

It is designed to scale up from a single machine (server) to thousands of machines, each offering its own local computation and storage. The Hadoop software library is designed to detect and handle failures at the application layer, which allows it to deliver a highly available service on top of a cluster of computers/servers, each of which may be prone to failure.

Hadoop is written in Java.
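To make the "simple programming models" idea concrete, here is a minimal sketch of the classic word-count MapReduce job written against the org.apache.hadoop.mapreduce API: the mapper emits a (word, 1) pair for every word it sees, and the reducer sums the counts for each word. The input and output paths are taken from the command line and are purely illustrative.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: splits each input line into words and emits (word, 1)
  public static class TokenizerMapper
       extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer: sums all counts emitted for the same word
  public static class IntSumReducer
       extends Reducer<Text, IntWritable, Text, IntWritable> {

    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);   // local pre-aggregation on each mapper
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));    // illustrative input path
    FileOutputFormat.setOutputPath(job, new Path(args[1]));  // illustrative output path
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

The framework takes care of splitting the input across the cluster, running mappers near the data, shuffling the intermediate pairs, and re-running tasks that fail, so the programmer only writes the map and reduce functions shown above.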



