Developers commonly create private virtual Hadoop and Spark cluster environments for testing, simulation, and other learning purposes. With the advances in cloud technology, preparing a virtual machine for a cluster installation is extremely easy and takes only a few minutes.
However, there is a known issue when VMware Tools is used to prepare a virtual machine: the shared folder feature does not function if the guest operating system is a minimal installation of CentOS or RedHat 7.x.
The shared folder(s) will show up under the directory “/mnt/hgfs” when a GUI is installed on the virtual machine along with VMware Tools. Otherwise, shared folders do not show up.
The objective of this post is to show you how to get shared folders working in this situation.
- CentOS 7.x
- RHEL 7.x
- VMware Workstation 10 & 12
- Activate VMware Tools
- Mount the VMware Tools installer
- Extract VMware Tools to a temporary location
- Install VMware Tools
- Unmount and clean up
- Verify that VMware Tools is installed, activated, and reboot safe
- Activate shared folder
- Verify shared folder
Click on “VM” from the menu bar, then select “Reinstall VMware Tools…”.
$ sudo mkdir /mnt/cdrom
$ sudo mount /dev/cdrom /mnt/cdrom
$ sudo tar zxvf /mnt/cdrom/VMwareTools-.tar.gz -C /tmp/
$ cd /tmp/vmware-tools-distrib
$ sudo ./vmware-install.pl
If the installer complains about missing compilers or kernel headers, install the build prerequisites, update the system, and rerun the installer:

$ sudo yum groupinstall "Development Tools"
$ sudo yum install net-tools kernel-headers kernel-devel gcc perl
$ sudo yum update
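Note that `yum update` may pull in a newer kernel; the installer can only build the hgfs module against headers that match the kernel you are actually running, so reboot into the new kernel before rerunning it if that happens. A small sketch of the version check (the `kernel_matches` helper is hypothetical, for illustration only):

```shell
# Compare the running kernel to the installed kernel-devel version.
# kernel_matches is a hypothetical helper, not a standard command.
kernel_matches() {
  # $1 = running kernel (uname -r), $2 = kernel-devel version
  [ "$1" = "$2" ]
}

running=$(uname -r)
# On a real CentOS/RHEL 7 box you would obtain the second value with:
#   rpm -q --qf '%{VERSION}-%{RELEASE}.%{ARCH}\n' kernel-devel
devel="$running"   # placeholder so this sketch is self-contained

if kernel_matches "$running" "$devel"; then
  echo "kernel-devel matches the running kernel"
else
  echo "mismatch: reboot into the new kernel or install matching kernel-devel"
fi
```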
$ sudo umount /mnt/cdrom
$ sudo rm -rf /tmp/vmware-tools-distrib
$ sudo systemctl status vmware-tools
$ sudo systemctl enable vmware-tools
$ sudo /usr/bin/vmhgfs-fuse .host:/ /mnt/hgfs -o subtype=vmhgfs-fuse,allow_other
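To make the mount survive a reboot, you can add an `/etc/fstab` entry instead of running the command by hand each time (this assumes the `vmhgfs-fuse` helper installed above):

```
# /etc/fstab — mount VMware shared folders at boot
.host:/    /mnt/hgfs    fuse.vmhgfs-fuse    defaults,allow_other    0    0
```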
$ ls -la /mnt/hgfs
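If you want to script the verification, a minimal check is to look for a `vmhgfs-fuse` entry in the mount table; the sketch below reads `/proc/mounts` by default but accepts a file argument so the logic can also be exercised against a sample file:

```shell
# Check whether a VMware shared-folder mount is active.
hgfs_mounted() {
  # $1 = mounts table to inspect (defaults to /proc/mounts)
  grep -q 'vmhgfs-fuse' "${1:-/proc/mounts}"
}

if hgfs_mounted; then
  echo "shared folders are mounted under /mnt/hgfs"
else
  echo "no vmhgfs-fuse mount found"
fi
```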