How To Create a Hadoop Cluster in Just 10 Minutes?
A Hadoop cluster is used to solve Big Data problems. Hadoop is a product of Apache and is very popular in the Big Data world; Facebook uses it to store and process its users' data.
In 2012, Facebook revealed that it was generating around 500+ terabytes of data every day, of which 2.7 billion were likes and around 300 million were photos uploaded per day. Another interesting fact: Facebook was scanning around 105 terabytes of data every half hour.
So let's see how to build a Hadoop cluster to solve a Big Data problem.
To create this Hadoop cluster we need some basic things first:
- OS (we are using RedHat 8)
- Hadoop, latest version
- JDK (Java Development Kit)
- More than one OS instance (you can use VMs or the cloud)
That's it. Let's do this:
- First, download Hadoop and the JDK (download and install commands are sketched after this list)
- Now install both with the RedHat command "rpm -i <file>" (hadoop and jdk)
- Now configure Hadoop's core-site.xml and hdfs-site.xml files to create the cluster (sample configs follow below)
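A minimal sketch of the download-and-install step, assuming the Hadoop 1.x RPM (which matches the "rpm -i" command above and the port-50070 web UI used later); the exact file names and the JDK version are examples, and the JDK RPM itself has to be fetched from Oracle's download page:

```
# Hadoop 1.2.1 RPM from the Apache archive (version is an example)
wget https://archive.apache.org/dist/hadoop/core/hadoop-1.2.1/hadoop-1.2.1-1.x86_64.rpm

# install the JDK first -- Hadoop needs Java on every node
rpm -i jdk-8u171-linux-x64.rpm

# --force works around the file conflicts this old RPM reports on RedHat 8
rpm -i hadoop-1.2.1-1.x86_64.rpm --force

# verify both installs
java -version
hadoop version
```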
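And here is one way to configure the two files on each node; a sketch assuming Hadoop 1.x config paths (/etc/hadoop), port 9001, a metadata directory /nn on the master and a data directory /dn on the slave (all of these names are my own choices, not fixed):

```
# --- on the master (NameNode) ---
mkdir /nn
cat > /etc/hadoop/hdfs-site.xml <<'EOF'
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>/nn</value>
  </property>
</configuration>
EOF
cat > /etc/hadoop/core-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- 0.0.0.0 means: listen on all interfaces -->
    <value>hdfs://0.0.0.0:9001</value>
  </property>
</configuration>
EOF
hadoop namenode -format          # one-time: initialize the NameNode metadata
hadoop-daemon.sh start namenode

# --- on each slave (DataNode); replace <master-ip> with the master's real IP ---
mkdir /dn
cat > /etc/hadoop/hdfs-site.xml <<'EOF'
<configuration>
  <property>
    <name>dfs.data.dir</name>
    <value>/dn</value>
  </property>
</configuration>
EOF
cat > /etc/hadoop/core-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://<master-ip>:9001</value>
  </property>
</configuration>
EOF
hadoop-daemon.sh start datanode
```

Run "hadoop dfsadmin -report" on the master to confirm the DataNodes have joined the cluster.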
After that, configure a client in the same way, upload some file, and use the tcpdump command to watch the upload happen.
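On the client, only core-site.xml is needed, pointing at the NameNode; a sketch assuming the same port 9001 as above, with mydata.txt as a placeholder file name:

```
# client side: tell the Hadoop client where the NameNode lives
cat > /etc/hadoop/core-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://<master-ip>:9001</value>
  </property>
</configuration>
EOF

# upload a file into HDFS and confirm it landed
hadoop fs -put mydata.txt /
hadoop fs -ls /
```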
Now we can see the file on the web UI; for this, open http://<namenode-ip>:50070 in a browser (50070 is the NameNode's web port in Hadoop 1.x).
Now we check the uploading of the file on our slave node. For this we use "tcpdump -i enp0s3 -n -X" on the slave and check that the client's IP shows up in the captured packets, which proves the file data is flowing from the client straight to the DataNode.
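A slightly refined version of that capture; enp0s3 is just this machine's interface name, and 50010 (Hadoop 1.x's default DataNode data-transfer port) is an assumption that narrows the capture to HDFS traffic:

```
# run on a slave node while the client uploads;
# adjust the interface name (check with: ip addr) and the port if yours differ
tcpdump -i enp0s3 port 50010 -n -X
```

Packets with the client's IP as the source confirm that HDFS clients write file blocks directly to the DataNodes, not through the NameNode.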
If you have more than one slave node, then even if you stop one of the slave nodes, the Hadoop client keeps uploading the file, because HDFS creates replicas of each file block across the DataNodes.
To prove this, we uploaded a big file; each block shown in the pic is a replicated piece of the file.
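One way to run this test yourself; the file name is again the placeholder from the upload step:

```
# on one slave: stop its DataNode mid-upload to simulate a failure
hadoop-daemon.sh stop datanode

# on the client or master: inspect block placement and replication
hadoop fsck /mydata.txt -files -blocks -locations
```

fsck lists every block along with the DataNodes holding its replicas, so you can see the surviving copies after the stopped node drops out.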
You can perform this practical on your own, just create more than one instance; I did this same setup on AWS instances.
Thank you