
Automating LVM with a Python Script

Python is a very flexible language for writing scripts. Here I have built a Python script to automate common LVM tasks, which are hectic to perform manually one command at a time. It can be considered a simple CLI tool for LVM. The script is available on GitHub: https://github.com/sankeerth-bussa/lvm-python-script
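The core idea of such a tool is to build LVM command lines and hand them to the shell. A minimal sketch is shown below; it assumes the lvm2 tools (pvcreate, vgcreate, lvcreate) are installed, and the function names and sizes here are illustrative, not the actual script from the GitHub repo.

```python
import subprocess

def pvcreate_cmd(devices):
    """Build the pvcreate command for one or more block devices."""
    return ["pvcreate"] + list(devices)

def vgcreate_cmd(vg_name, devices):
    """Build the vgcreate command that groups PVs into a volume group."""
    return ["vgcreate", vg_name] + list(devices)

def lvcreate_cmd(vg_name, lv_name, size):
    """Build the lvcreate command; `size` is an LVM size string, e.g. '25G'."""
    return ["lvcreate", "--size", size, "--name", lv_name, vg_name]

def run(cmd):
    """Execute a command; LVM operations require root privileges."""
    return subprocess.run(cmd, check=True, capture_output=True, text=True)

# Typical usage (requires root and the attached disks):
#   run(pvcreate_cmd(["/dev/sdb", "/dev/sdc"]))
#   run(vgcreate_cmd("hvg", ["/dev/sdb", "/dev/sdc"]))
#   run(lvcreate_cmd("hvg", "hlv", "25G"))
```

Separating command construction from execution keeps the builders easy to test without touching real disks.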

Configuring an HTTPD Server on a Docker Container, Setting up a Python Interpreter, and Running Python Code on a Docker Container

  #arthbylw #vimaldaga #righteducation #educationredefine #rightmentor #worldrecordholder #ARTH #linuxworld #makingindiafutureready #righeudcation #docker #webserver #elasticity #lvm

Integrating LVM with Hadoop and providing Elasticity to DataNode Storage

[Image: total disks available in the node. LVM is performed on sdb and sdc.]

Step 1: Create a Physical Volume (PV) on each of sdb and sdc.
Step 2: Create a Volume Group (VG) from the PVs created earlier.
Step 3: Create a Logical Volume (LV) from the Volume Group.
Step 4: Format the LV so that it can be used.
Step 5: Check the Hadoop cluster report before mounting the LV.
Step 6: Mount the LV; it can be used only after it is mounted.
Step 7: The cluster report now shows an increase in storage, contributed by the LV.
Step 8: Extend the LV without any interruption to the cluster.
Step 9: The cluster report again shows an increase in the storage size.
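Step 8 above works without interruption because, with ext4, resize2fs can grow a mounted filesystem online, so the DataNode keeps serving data while the LV grows. A minimal sketch of that step, assuming an ext4 filesystem on the LV (paths and sizes are illustrative):

```python
import subprocess

def lvextend_cmd(lv_path, extra):
    """lvextend command to grow the LV by `extra`, e.g. '+3G'."""
    return ["lvextend", "--size", extra, lv_path]

def resize2fs_cmd(lv_path):
    """resize2fs grows the ext4 filesystem to fill the extended LV,
    even while it is mounted."""
    return ["resize2fs", lv_path]

def extend_online(lv_path, extra):
    """Grow the LV, then the filesystem, with no unmount in between."""
    for cmd in (lvextend_cmd(lv_path, extra), resize2fs_cmd(lv_path)):
        subprocess.run(cmd, check=True)

# Typical usage (requires root and an existing LV):
#   extend_online("/dev/hvg/hlv", "+3G")
```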

Elasticity

[Image: devices attached to the system. /dev/sdb and /dev/sdc are the devices on which LVM is to be performed.]

Step 1: Create a physical volume on sdb.
Step 2: Create a physical volume on sdc.
Step 3: Create a volume group named hvg from sdb and sdc.
Step 4: Create a 25 GB logical volume named hlv.
Step 5: Format the logical volume hlv.
Step 6: Mount the hlv volume on the directory dn.
Step 7: Check the Hadoop report before mounting the logical volume.
Step 8: Check the Hadoop report after mounting the logical volume.
Step 9: Extend the logical volume by 3 GB.
Step 10: Check the Hadoop report after increasing the logical volume.
Step 11: Check the Hadoop report after reducing the logical volume by 8 GB.
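Unlike extending, reducing the logical volume (the last step above) cannot be done online: the ext4 filesystem must be unmounted and shrunk before the LV itself is reduced, or data is lost. A sketch of the safe ordering, assuming ext4 on the LV (the paths, mount point, and target size are illustrative):

```python
import subprocess

def shrink_cmds(lv_path, mount_point, new_size):
    """Ordered commands to shrink an ext4 LV to `new_size`, e.g. '20G'.
    The filesystem MUST be shrunk before the LV is reduced."""
    return [
        ["umount", mount_point],           # take the filesystem offline
        ["e2fsck", "-f", lv_path],         # forced check, required by resize2fs
        ["resize2fs", lv_path, new_size],  # shrink the filesystem first
        ["lvreduce", "--size", new_size, "--yes", lv_path],  # then the LV
        ["mount", lv_path, mount_point],   # bring it back online
    ]

def shrink(lv_path, mount_point, new_size):
    for cmd in shrink_cmds(lv_path, mount_point, new_size):
        subprocess.run(cmd, check=True)

# Typical usage (requires root; the DataNode is briefly offline):
#   shrink("/dev/hvg/hlv", "/dn", "20G")
```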

Allocating a specific amount of storage to a slave in the cluster

Attaching an EBS volume to the OS

First we have to create an additional EBS volume and attach it to our OS. This volume has to be partitioned, formatted, and finally mounted on a directory. That directory is then mentioned in the hdfs-site.xml file.

Partitioning the new volume:
fdisk /dev/xvdf

Formatting the partition:
mkfs.ext4 /dev/xvdf1

Mounting the partition on a directory:
mount /dev/xvdf1 /datanode

Configuring the hdfs-site.xml file in the slave node of the Hadoop cluster:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
<property>
<name>dfs.data.dir</name>
<value>/datanode</value>
</property>
</configuration>

Here /datanode is the directory on which the volume is mounted.

Configuring the core-site.xml file in the slave node of the Hadoop cluster:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configura
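For context, a slave's core-site.xml typically needs only the NameNode address. A sketch of such a file follows; the hostname and port below are placeholders, not values from this cluster:

```
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
<property>
<!-- hdfs://<namenode-host>:<port>; both values below are placeholders -->
<name>fs.default.name</name>
<value>hdfs://namenode:9001</value>
</property>
</configuration>
```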