What are the Hadoop ecosystems?


Assignment:

Part 1: 180 words, critical response to the following discussion forum topic. APA formatting with reference.

Initial posting: What are the two core components of Hadoop?

There are basically three important core components of Hadoop:

1. MapReduce - A software programming model for processing large sets of data in parallel

2. HDFS - The Java-based distributed file system that can store all kinds of data without prior organization.

3. YARN - A resource management framework for scheduling and handling resource requests from distributed applications.

For computational processing, i.e. MapReduce: MapReduce is the data processing layer of Hadoop. It is a software framework for easily writing applications that process the vast amounts of structured and unstructured data stored in the Hadoop Distributed File System (HDFS). It processes huge amounts of data in parallel by dividing the submitted job into a set of independent tasks (sub-jobs).

In Hadoop, MapReduce works by breaking the processing into two phases: Map and Reduce. Map is the first phase of processing, where we specify all of the complex logic, business rules, and costly code. Reduce is the second phase of processing, where we specify light-weight processing such as aggregation and summation.
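To make the two phases concrete, below is a minimal sketch of the classic word-count job written against the Hadoop MapReduce Java API: the Map phase emits a (word, 1) pair for every token, and the Reduce phase sums the counts per word. The input and output paths are assumed to be HDFS directories supplied on the command line.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: tokenize each input line and emit a (word, 1) pair per token.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: light-weight aggregation, summing the counts for each word.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    // args[0] = HDFS input directory, args[1] = HDFS output directory
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

A job like this is typically packaged into a jar and submitted with something like hadoop jar wordcount.jar WordCount /input /output, where the two paths are HDFS directories.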

For storage, i.e. HDFS: HDFS is an acronym for Hadoop Distributed File System, whose basic purpose is storage. It also follows the master-slave pattern: in HDFS the NameNode acts as the master, storing the metadata about the DataNodes and their blocks, while the DataNodes act as slaves, storing the actual data on their local disks in parallel.
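Below is a minimal sketch of how a client talks to this master-slave layer through the Hadoop FileSystem Java API; the NameNode URI and file path are placeholders, not values from this posting.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteExample {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // fs.defaultFS points at the NameNode; this URI is a placeholder.
    conf.set("fs.defaultFS", "hdfs://namenode-host:8020");

    FileSystem fs = FileSystem.get(conf);
    Path file = new Path("/user/demo/hello.txt");

    // Writing: the client asks the NameNode where to place blocks,
    // then streams the bytes to DataNodes, which hold the actual data.
    try (FSDataOutputStream out = fs.create(file, true)) {
      out.writeUTF("Hello, HDFS!");
    }

    // Reading: the NameNode returns block locations, and the bytes
    // are fetched from the DataNodes that store them.
    try (FSDataInputStream in = fs.open(file)) {
      System.out.println(in.readUTF());
    }

    fs.close();
  }
}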

YARN: used for resource allocation. YARN is the resource management framework in Hadoop; it provides resource management and allows multiple data processing engines, such as real-time streaming, data science, and batch processing, to handle data stored on a single platform.
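As a small illustration of YARN's resource-management role, the sketch below uses the YarnClient API to ask the ResourceManager which NodeManagers are running and how much memory and how many vcores each one offers. It assumes a yarn-site.xml on the classpath and a reasonably recent Hadoop release (getMemorySize() replaced the older getMemory() around Hadoop 2.8).

import java.util.List;

import org.apache.hadoop.yarn.api.records.NodeReport;
import org.apache.hadoop.yarn.api.records.NodeState;
import org.apache.hadoop.yarn.client.api.YarnClient;
import org.apache.hadoop.yarn.conf.YarnConfiguration;

public class ClusterNodesExample {
  public static void main(String[] args) throws Exception {
    // Connects to the ResourceManager configured in yarn-site.xml.
    YarnConfiguration conf = new YarnConfiguration();
    YarnClient yarnClient = YarnClient.createYarnClient();
    yarnClient.init(conf);
    yarnClient.start();

    // Ask the ResourceManager for the running NodeManagers and the
    // resources (memory, vcores) each one contributes to the cluster.
    List<NodeReport> nodes = yarnClient.getNodeReports(NodeState.RUNNING);
    for (NodeReport node : nodes) {
      System.out.printf("%s: %d MB, %d vcores%n",
          node.getNodeId(),
          node.getCapability().getMemorySize(),
          node.getCapability().getVirtualCores());
    }

    yarnClient.stop();
  }
}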

Part 2: 180 words, critical response to the following discussion forum topic. APA formatting with reference.

What are the Hadoop ecosystems and what kinds of ecosystems exist?

The Hadoop ecosystem is a vast set of software bundles, categorized as belonging to a distributed filesystem ecosystem or a distributed programming ecosystem, that can interact with each other and with non-Hadoop software bundles as well (Roman, n.d.). I will not list all of the software bundles here, just enough to give you an idea of what types of software bundles make up the Hadoop ecosystem.

Distributed Filesystems:

• Apache HDFS (Hadoop Distributed File System) stores large, complex files across clusters and is often run with other programs such as ZooKeeper, YARN, Weave, etc.

• Red Hat GlusterFS is Red Hat's scale-out filesystem for network servers, described as an alternative to Hadoop's HDFS.

• Quantcast File System (QFS) is built for large-scale batch processing and MapReduce workloads and is considered an alternative to Apache Hadoop HDFS. This DFS uses striping instead of full multiple replication to save storage capacity.

• Ceph File System works well with large amounts of object, block, or file storage, much like Hadoop.

• Lustre File System is a distributed filesystem for environments that need high performance and availability over large networks, with storage devices communicating over the SCSI protocol. Hadoop 2.5 supports Lustre.

Distributed Programming:

• Apache Ignite provides distributed computing over large-scale data and supports a wide variety of data access models, including key-value, some SQL, MapReduce-style processing, etc.

• Apache MapReduce processes large data sets in parallel across distributed clusters, with YARN as the resource manager.

• Apache Pig executes data processing in parallel on Hadoop, using Hadoop HDFS and MapReduce. The main concern of Apache Pig is data flow, and it uses its own language, called Pig Latin.

• JAQL supports JSON documents, XML, CSV data, and SQL data.

NoSQL Databases:

• Apache HBase is derived from Google Bigtable and is used as the database for Hadoop. It is column-oriented and works well with MapReduce; a short Java client sketch follows this list.

• Apache Cassandra is also influenced by Google Bigtable and the Google File System, can run with or without HDFS, and also has some of the features of Amazon's Dynamo (Cassandra itself was originally developed at Facebook).
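As referenced above, here is a short sketch of HBase's column-oriented model using the HBase Java client API. The table name "users", column family "info", and qualifier "name" are hypothetical, and the table is assumed to already exist; hbase-site.xml (with the ZooKeeper quorum) is assumed to be on the classpath.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseColumnExample {
  public static void main(String[] args) throws Exception {
    // Reads hbase-site.xml from the classpath for ZooKeeper quorum, etc.
    Configuration conf = HBaseConfiguration.create();

    try (Connection connection = ConnectionFactory.createConnection(conf);
         Table table = connection.getTable(TableName.valueOf("users"))) {

      // Write one cell: row key "row1", column family "info", qualifier "name".
      Put put = new Put(Bytes.toBytes("row1"));
      put.addColumn(Bytes.toBytes("info"), Bytes.toBytes("name"), Bytes.toBytes("Alice"));
      table.put(put);

      // Read the cell back by row key.
      Result result = table.get(new Get(Bytes.toBytes("row1")));
      byte[] value = result.getValue(Bytes.toBytes("info"), Bytes.toBytes("name"));
      System.out.println(Bytes.toString(value));
    }
  }
}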

SQL-on-Hadoop:

• Apache Hive provides an SQL-like language, although it is not SQL-92 compliant. It uses HiveQL for data summarization, query, and analysis.
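Below is a brief sketch of running HiveQL from Java through the HiveServer2 JDBC interface. The connection URL, credentials, and the "employees" table are assumptions for illustration, and the hive-jdbc driver jar must be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveQlExample {
  public static void main(String[] args) throws Exception {
    // HiveServer2 JDBC URL; host, port, database, and credentials are placeholders.
    String url = "jdbc:hive2://localhost:10000/default";

    try (Connection con = DriverManager.getConnection(url, "hive", "");
         Statement stmt = con.createStatement()) {

      // HiveQL looks like SQL but is compiled into distributed jobs
      // (MapReduce, Tez, or Spark) over data stored in HDFS.
      ResultSet rs = stmt.executeQuery(
          "SELECT department, COUNT(*) AS headcount " +
          "FROM employees GROUP BY department");

      while (rs.next()) {
        System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
      }
    }
  }
}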
