Touchscreens are becoming an increasingly attractive interaction technology in our daily lives, and they are quickly replacing many conventional user interface controls. The ability to continuously modify and reconfigure on-screen content is a valuable feature in any device, especially in mobile devices such as smartphones and tablets, where every inch matters. Perhaps the most inviting aspect of touchscreens is their ability to detect gestures and recognize human activities. Unlike externally static interfaces with a dedicated input device, such as a keypad with discrete, well-defined keys, most touch-sensitive displays are embodied as a flat, stiff, and rigid screen surface. As a result, touch-sensitive displays conform neither to ergonomic rules and standards nor to physiological and psychological models of afferent information-flow processing. This, in turn, means that these systems diminish perceptual and intuitive haptic feedback, which hinders and sometimes limits user interaction. This paper defines a Haptic User Interface Enhancement System (UIES) that transforms the conventionally flat and stiff touchscreen surface into a haptically adaptive interaction hub, one that not only provides generic vibrotactile stimulation for confirmational haptic feedback but also guides the user through onscreen User Interface controls via kinetic feedback cues, which include components of forces and torques applied dynamically to the fingertips at the point of contact.