Hadoop Interview Questions & Answers
Table of Contents
- Which Java version is compatible with Hadoop 2.x?
- Which Java version is compatible with Hadoop 3.x?
- How is fault tolerance handled in Hadoop 2.x?
- How is fault tolerance handled in Hadoop 3.x?
- In Hadoop 2.x, what do we need for NameNode recovery?
- What is the storage overhead in Hadoop 2.x compared to Hadoop 3.x?
- Which command was added in Hadoop 3.x to print the java.library.path?
- To improve scalability and reliability in Hadoop 3, the YARN Timeline Service has been enhanced with?
- Hadoop can run on?
- How to control the HDFS replication factor?
- What is the default size of the distributed cache?
- Which NoSQL database is scalable with Hadoop?
- Which command is used to check inconsistencies in Hadoop?
- In how many modes can Pig run?
- Which compression technique is Snappy most similar to?
- What is the slowest compression technique in Hadoop?
- Which compression technique compresses data about 10% to 15% better than the other available techniques?
- Which company developed Pig?
- How to read data in Pig?
- What kind of language is Pig?
- Where do we use Hive?
- How does Facebook tackle Big Data on Hadoop?
- What is used for machine learning on Hadoop?
- In Hadoop, what is the ‘put’ command used for?
- Spark uses Hadoop in how many ways?
- Which of the following is not a way to deploy Spark?
- Which component is on top of Spark Core?
Which Java version is compatible with Hadoop 2.x?
- a) Java 6
- b) Java 7
- c) Java 8
- d) Java 5
Answer - b) Java 7
Which Java version is compatible with Hadoop 3.x?
- a) Java 6
- b) Java 7
- c) Java 8
- d) Java 5
Answer - c) Java 8
How is fault tolerance handled in Hadoop 2.x?
- a) Replication
- b) Erasure Coding
- c) Data Handling
- d) Flow control
Answer - a) Replication
How is fault tolerance handled in Hadoop 3.x?
- a) Replication
- b) Erasure Coding
- c) Data Handling
- d) Flow control
Answer - b) Erasure Coding
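For context, Hadoop 3.x applies erasure coding per directory rather than per cluster. Below is a minimal sketch, assuming a Hadoop 3.x client, that fs.defaultFS points at an HDFS cluster, and that the directory path is a placeholder; RS-6-3-1024k is one of the built-in Reed-Solomon policies. The same thing can be done from the shell with the hdfs ec subcommand.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DistributedFileSystem;

public class ErasureCodingExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Erasure coding is an HDFS-specific feature, so the HDFS implementation is needed.
        DistributedFileSystem dfs = (DistributedFileSystem) fs;

        // Apply a Reed-Solomon policy (6 data + 3 parity blocks) to a directory (path is hypothetical).
        // Files written under this directory are striped instead of being replicated 3x.
        dfs.setErasureCodingPolicy(new Path("/warehouse/cold-data"), "RS-6-3-1024k");

        dfs.close();
    }
}
```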
In Hadoop 2.x, what do we need for NameNode recovery?
- a) Erasure Coding
- b) Java 8
- c) Manual Intervention
- d) Data Handling
Answer - c) Manual Intervention
What is the storage overhead in Hadoop 2.x compared to Hadoop 3.x?
- a) 100%
- b) 200%
- c) 300%
- d) 400%
Answer - b) 200%
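The 200% figure follows from simple arithmetic: with the default replication factor of 3, every block is stored three times, while a Reed-Solomon 6+3 erasure coding scheme (as in Hadoop 3.x) stores only 1.5x the data. A quick worked comparison:

```latex
% Hadoop 2.x: three full copies of every block of D bytes (replication factor 3)
\text{overhead}_{\text{replication}} = \frac{3D - D}{D} = 200\%

% Hadoop 3.x: RS(6,3) erasure coding stores 6 data blocks plus 3 parity blocks
\text{overhead}_{\text{erasure coding}} = \frac{9 - 6}{6} = 50\%
```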
Which command was added in Hadoop 3.x to print the java.library.path?
- a) hadoop jnipath
- b) hadoop inipath
- c) hadoop knipath
- d) hadoop ijnipath
Answer - a) hadoop jnipath
To improve scalability and reliability in Hadoop 3, the YARN Timeline Service has been enhanced with?
- a) ATS V1
- b) ATS V2
- c) ATS V3
- d) ATS V4
Answer - b) ATS V2
Hadoop can run on?
- a) Windows
- b) Linux
- c) Mac
- d) Cross-Platform
Answer - d) Cross-Platform
How to control the HDFS replication factor?
- a) yarn-site.xml
- b) hdfs-site.xml
- c) cross-site.xml
- d) into-site.xml
Answer - b) hdfs-site.xml
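For reference, dfs.replication is the property behind this setting. The sketch below, assuming a reachable HDFS cluster and a hypothetical file path, shows the property being overridden by a Java client and the replication factor of an existing file being changed.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReplicationFactorExample {
    public static void main(String[] args) throws Exception {
        // dfs.replication normally lives in hdfs-site.xml, e.g.
        //   <property><name>dfs.replication</name><value>2</value></property>
        // A client can also override it for the files it creates:
        Configuration conf = new Configuration();
        conf.set("dfs.replication", "2");

        FileSystem fs = FileSystem.get(conf);
        // Change the replication factor of an existing file (path is hypothetical).
        fs.setReplication(new Path("/data/sample.txt"), (short) 2);
        fs.close();
    }
}
```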
What is the default size of the distributed cache?
- a) 10 GB
- b) 20 GB
- c) 16 GB
- d) 32 GB
Answer - a) 10 GB
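The 10 GB mentioned above is the default local cache size; in practice what matters more is how files get into the cache. A minimal sketch, assuming the new MapReduce API and a hypothetical HDFS file:

```java
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class DistributedCacheExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "cache-demo");

        // Ship a small lookup file to every task node via the distributed cache.
        // Tasks can then read it from their local working directory.
        job.addCacheFile(new URI("hdfs:///config/lookup.txt"));

        // ... set the mapper/reducer, input and output paths, then submit the job.
    }
}
```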
Which NoSQL database is scalable with Hadoop?
- a) MongoDB
- b) HBase
- c) Cassandra
- d) MySQL
Answer - b) HBase
Which command is used to check inconsistencies in Hadoop?
- a) fsck
- b) fsk
- c) fskc
- d) fks
Answer - a) fsck
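The usual form is `hdfs fsck /`, which walks the namespace and reports missing, corrupt, and under-replicated blocks. If you need to run it from Java rather than the shell, one low-tech option is to shell out to the same command; a sketch, assuming the hdfs client is on the PATH:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class FsckExample {
    public static void main(String[] args) throws Exception {
        // Run "hdfs fsck /" and print its report.
        Process p = new ProcessBuilder("hdfs", "fsck", "/")
                .redirectErrorStream(true)
                .start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);
            }
        }
        System.exit(p.waitFor());
    }
}
```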
In how many modes can Pig run?
- a) 1
- b) 2
- c) 3
- d) 4
Answer - b) 2
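The two modes are local mode and MapReduce mode. A minimal sketch using Pig's embedded Java API (PigServer); the input file and schema are hypothetical, and the same snippet also shows the LOAD operator that a later question asks about.

```java
import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

public class PigModesExample {
    public static void main(String[] args) throws Exception {
        // ExecType.LOCAL runs against the local filesystem in a single JVM;
        // ExecType.MAPREDUCE submits the same script to a Hadoop cluster.
        PigServer pig = new PigServer(ExecType.LOCAL);

        // LOAD is how Pig reads data (file name and schema are placeholders).
        pig.registerQuery("logs = LOAD 'access_log.txt' USING PigStorage(' ') AS (ip:chararray, url:chararray);");
        pig.registerQuery("by_ip = GROUP logs BY ip;");
        pig.registerQuery("counts = FOREACH by_ip GENERATE group, COUNT(logs);");

        // Materialize the result to an output directory.
        pig.store("counts", "ip_counts");
        pig.shutdown();
    }
}
```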
Which compression technique is Snappy most similar to?
- a) GZip
- b) Bzip2
- c) LZO
- d) None of the above
Answer - c) LZO
What is the slowest compression technique in Hadoop?
- a) GZip
- b) Bzip2
- c) LZO
- d) None of the above
Answer - b) Bzip2
Which compression technique compresses data about 10% to 15% better than the other available techniques?
- a) Bzip2
- b) LZip
- c) GZip
- d) LZO
Answer - a) Bzip2
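For a sense of how these codecs are used in practice, here is a small sketch that writes a bzip2-compressed file through Hadoop's codec API; swapping in GzipCodec works the same way. The file names are illustrative.

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.compress.BZip2Codec;
import org.apache.hadoop.io.compress.CompressionOutputStream;

public class CompressionExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // BZip2 gives the best compression ratio but is the slowest of the common codecs.
        BZip2Codec codec = new BZip2Codec();
        codec.setConf(conf);

        try (FileInputStream in = new FileInputStream("input.txt");
             CompressionOutputStream out =
                     codec.createOutputStream(new FileOutputStream("input.txt.bz2"))) {
            // Stream the input through the compressor in 4 KB chunks.
            IOUtils.copyBytes(in, out, 4096);
        }
    }
}
```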
Which company developed Pig?
- a) Wipro
- b) Google
- c) Yahoo
- d) Microsoft
Answer - c) Yahoo
How to read data in Pig?
- a) Load
- b) Read
- c) Write
- d) Perform
Answer - a) Load
What kind of language is Pig?
- a) High-Level Language
- b) Low-level Language
- c) Mid-Level Language
- d) All of the Above
Answer - a) High-Level Language
Where do we use Hive?
- a) Hadoop Query Engine
- b) Hadoop SQL Server Interface
- c) Both a) & b)
- d) None of the Above
Answer - c) Both a) & b)
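Hive exposes that SQL-like interface over JDBC through HiveServer2. A minimal sketch, assuming the Hive JDBC driver is on the classpath, a HiveServer2 instance at a placeholder host and port, and an existing table named page_views:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveQueryExample {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC URL; host, port, database, and credentials are placeholders.
        String url = "jdbc:hive2://hive-host:10000/default";

        try (Connection conn = DriverManager.getConnection(url, "hadoop", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                     "SELECT page, COUNT(*) FROM page_views GROUP BY page")) {
            while (rs.next()) {
                System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
            }
        }
    }
}
```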
How does Facebook tackle Big Data on Hadoop?
- a) Raw Data
- b) Project
- c) Prism
- d) Rojectbid
Answer - c) Prism
What is used for machine learning on Hadoop?
- a) HDFS
- b) Pig
- c) Mahout
- d) CBase
Answer - c) Mahout
In Hadoop, what is the ‘put’ command used for?
- a) To copy files or directories from the local system to HDFS
- b) To copy files from HDFS to a local machine
- c) To copy files from HDFS to a local system
- d) To copy from HDFS to HDFS
Answer - a) To copy files or directories from the local system to HDFS
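The shell form is `hadoop fs -put <local> <hdfs>`. The programmatic equivalent through the FileSystem API is a one-liner; the paths below are hypothetical:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class PutExample {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());

        // Same effect as: hadoop fs -put /tmp/sales.csv /data/raw/
        fs.copyFromLocalFile(new Path("/tmp/sales.csv"), new Path("/data/raw/"));

        fs.close();
    }
}
```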
Spark uses Hadoop in how many ways?
- a) 3
- b) 2
- c) 4
- d) 1
Answer - b) 2
Which of the following is not a way to deploy Spark?
- a) Spark in MapReduce
- b) YARN
- c) HDFS
- d) Spark SQL
Answer - d) Spark SQL
Which component is on top of Spark Core?
- a) Spark SQL
- b) RDDs
- c) YARN
- d) All of the Above
Answer - a) Spark SQL
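To tie the last few questions together: Spark typically uses Hadoop in two ways, storage (HDFS) and cluster management (YARN), and Spark SQL sits on top of Spark Core. A minimal sketch in Java, assuming Spark is on the classpath; the HDFS path is a placeholder, and master("local[*]") is used only so the snippet runs without a cluster (master("yarn") would run it on Hadoop).

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SparkSqlExample {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("spark-sql-demo")
                .master("local[*]")   // use "yarn" to run on a Hadoop cluster
                .getOrCreate();

        // Spark reading data stored in Hadoop (HDFS): the storage integration.
        Dataset<Row> sales = spark.read()
                .option("header", "true")
                .csv("hdfs:///data/raw/sales.csv");

        // Spark SQL running on top of Spark Core.
        sales.createOrReplaceTempView("sales");
        spark.sql("SELECT COUNT(*) AS cnt FROM sales").show();

        spark.stop();
    }
}
```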