During open surgery, a surgeon relies not only on a detailed view of the organ being operated on and on the ability to feel its fine details, but also, and heavily, on the combination of these two senses. In laparoscopic surgery, haptic feedback provides surgeons with information on the interaction forces between instrument and tissue. Many telerobotics studies related to laparoscopy have attempted to reproduce this haptic feedback, yet cutaneous feedback remains largely restricted or absent in haptic feedback-based minimally invasive approaches. We argue that fine-grained information about the instrument tip is needed in laparoscopic surgery and that it can be conveyed via cutaneous feedback. We propose an exoskeleton haptic hand wearable consisting of five 4 × 4 arrays of miniaturized fingertip actuators, 80 in total, to convey cutaneous feedback. The wearable is modular, lightweight, Bluetooth- and WiFi-enabled, and has a maximum power consumption of 830 mW. Software was developed to demonstrate rapid tactile actuation of edges, allowing the user to feel object contours through cutaneous feedback. Initial tests were carried out in 2D, with objects displayed on a flat monitor. In a second phase, the exoskeleton glove was further developed to let the user feel 3D virtual objects in a virtual reality (VR) environment presented through a VR headset. Both 2D and 3D objects were tested with our novel untethered haptic hand wearable. Our results show that users recognize cutaneous actuation from a single tap with 92.22% accuracy. The wearable has an average latency of 46.5 ms, well below the 600 ms delay considered tolerable by surgeons in teleoperation. We therefore propose our untethered hand wearable as a way to enhance multimodal perception in minimally invasive surgery, allowing surgeons to naturally feel the immediate environment of the instrument.
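As a rough illustration of the edge-rendering idea described above, the sketch below rasterizes a 2D contour onto a hypothetical 4 × 4 fingertip actuator grid and packs the result into a compact 16-bit payload suitable for a wireless link. The function names, cell size, and payload format are our assumptions for illustration only, not the authors' firmware or communication protocol.

```python
# Illustrative sketch only: maps a 2D contour onto a hypothetical 4x4 fingertip
# actuator grid, mirroring the edge-rendering idea described in the abstract.
# The frame layout, cell size, and bitmask payload are assumptions, not the
# authors' firmware or wireless protocol.

import numpy as np

GRID = 4  # each fingertip module is assumed to be a 4x4 array (5 modules, 80 actuators total)

def contour_to_frame(contour_xy, finger_origin, cell_size):
    """Rasterize contour points falling under one fingertip into a 4x4 on/off frame."""
    frame = np.zeros((GRID, GRID), dtype=np.uint8)
    for x, y in contour_xy:
        col = int((x - finger_origin[0]) // cell_size)
        row = int((y - finger_origin[1]) // cell_size)
        if 0 <= row < GRID and 0 <= col < GRID:
            frame[row, col] = 1  # actuate this taxel to render the edge
    return frame

def frame_to_bitmask(frame):
    """Pack the 4x4 frame into a 16-bit mask, one bit per actuator."""
    bits = 0
    for i, v in enumerate(frame.flatten()):
        bits |= int(v) << i
    return bits

if __name__ == "__main__":
    # A diagonal edge crossing a 10 mm x 10 mm fingertip patch (2.5 mm cells).
    edge = [(1.0, 1.0), (3.5, 3.5), (6.0, 6.0), (8.5, 8.5)]
    frame = contour_to_frame(edge, finger_origin=(0.0, 0.0), cell_size=2.5)
    print(frame)
    print(f"payload: 0x{frame_to_bitmask(frame):04x}")
```

A per-fingertip 16-bit frame like this keeps each wireless update small, which is one plausible way to stay well under the sub-600 ms latency budget mentioned above.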