Haptic wearables as sensory replacement, sensory augmentation and trainer – a review

Haptic wearables can act as a sensory replacement for total impairments. This section
covers haptic applications involving missing upper and lower limbs followed by vision
and auditory loss.

Upper-limb prosthetics

Prosthetic hands have achieved remarkable mechatronic capabilities (e.g. Revolutionizing
Prosthetics and Otto Bock); however, up to 39 % of amputees wearing myoelectrically
controlled prostheses do not use them regularly, or at all, due to a lack of tactile
sensory feedback [23]–[26]. Prosthetic users currently obtain grasp information through visual observation
(77 %), listening (67 %), and residual limb sensations (57 %) [27]. Haptics for total impairment aims to restore the missing tactile or proprioceptive information
vital to prosthetic grasp and thereby prolong sustained prosthesis use [28]–[31]. A major challenge is orchestrating spatial and temporal stimulation patterns and
energy demands such that they give rise to congruent neuronal representations of vibration,
contact, force, pressure, slip, or muscle impedance during long-term use.

Haptic feedback for upper-limb prostheses restores the sense of touch by relaying
force, pressure, and slip measurements to the user. Force and pressure feedback are
commonly used in tactile devices to relay information about grip force. This information
is typically transmitted mechanically, such as through skin tapping [32]–[35], or through electro- or vibro-stimulation [35]–[38] (Fig. 2 (left)). Patterson et al. [33] translated grip pressure from an object to hydraulic pressure in a cuff around the
upper arm. By comparing combinations of pressure, vibration, and vision feedback,
they found that pressure feedback resulted in the highest grasp performance. Rombokas
et al. [39] found that vibrotactile feedback applied to the upper arm in force-motion tasks improved
virtual manipulation performance for able-bodied and prosthetic users.

Fig. 2. Haptic wearables for upper-limb prostheses. (left) Mechanical and vibroelectric haptic
device for relaying pressure and vibration. Image from [35] used with permission from IEEE. (right) Compact wearable device for contact, pressure,
vibration, shear, and temperature for amputees who underwent targeted nerve reinnervation
surgery. Image from [47] used with permission from IEEE

Slip, the shear force between the prosthesis and the held object, is pivotal for determining
grasp stability and minimum grasp force [40]–[44]. Slip and force feedback in combination allow manipulation of a virtual object with
lower forces than force feedback alone [45]. Slip-speed feedback, implemented as electrotactile stimulation on the skin, increases
the success rate in stopping slip and regulates the user’s grip reaction time [46]. Kim et al. [47] built a tactile device for amputees after targeted nerve reinnervation surgery (Fig. 2 (right)). The device relays contact, pressure, vibration, and shear through a mechanically actuated
tactor in contact with an 8 mm diameter patch of skin. Damian et al. [48] developed a wearable haptic device that relays slip speed through a series of tactors
that sweep across the skin, and grip force through frequency-encoded tapping on the
skin.
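Frequency-encoded force feedback of this kind can be sketched as a monotonic mapping from measured grip force to tactor tapping frequency. The ranges below are illustrative assumptions, not values reported in the cited work:

```python
def force_to_tap_frequency(force_n, f_min_hz=1.0, f_max_hz=20.0, force_max_n=30.0):
    """Encode grip force (N) as a skin-tapping frequency (Hz).

    Linear, clamped mapping: light contact gives slow tapping, saturating
    at force_max_n. All range values here are hypothetical.
    """
    clamped = max(0.0, min(force_n, force_max_n))
    return f_min_hz + (f_max_hz - f_min_hz) * clamped / force_max_n
```

Any monotonic mapping would preserve the essential property that harder grasps always feel "faster"; clamping keeps the tactor within its mechanical operating range.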

While many skin sites have been explored for tactile stimulation [49]–[52], fingertips are an attractive location due to their high density of mechanoreceptors
and the congruency of grasp sensation with the lost hand. Sites closest to the lost
limb are preferred for the exploitation of redundant afferent terminals [35], [48], [53]. Other locations whose skin sensation is used relatively little in everyday life, such
as the arm or back, have a lower density of mechanoreceptors but do not interfere with
manipulative tasks [33], [52], [54]. However, the location of skin stimulation may matter less than
other factors such as learning rates [55].

Artificial motion proprioception allows prosthesis users to reach targets more accurately
and reduces visual attention during manipulation [56], [57]. Witteveen et al. [58] used an array of eight vibrotactors on the arm to represent eight discrete positions
in closing a prosthetic hand during grasping. Vibrotactile feedback was found superior
to no feedback in grasp success and duration during virtual object grasping tasks.
Bark et al. [6] introduced a wearable haptic device that uses rotational skin stretch to display proprioceptive
limb motion. Users were able to discriminate rotational displacements of stretch within
6 degrees of the total range of motion. Artificial impedance feedback can help
prosthesis users adapt the interaction of their prosthesis to a variety of environments.
Blank et al. [59] showed that human users provided with position and force feedback are able to evaluate
the effects of prosthesis impedance, and that its adjustability improves the users’ performance
in minimizing contact forces with a moving object. In addition, vibrotactile [60] and skin-stretch [61] feedback have been used to give users the ability to regulate environment interaction
forces.
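Discrete position coding of the kind used by Witteveen et al. amounts to quantizing the hand’s state onto a small tactor array. A minimal sketch, assuming a normalized aperture signal (the eight-tactor count follows the study; the mapping details are hypothetical):

```python
def aperture_to_tactor(aperture, n_tactors=8):
    """Quantize normalized hand aperture (0.0 = fully closed, 1.0 = fully
    open) to the index of one of n_tactors discrete-position vibrotactors."""
    clamped = min(max(aperture, 0.0), 1.0)
    # Uniform quantization; the top edge (aperture == 1.0) maps to the last tactor.
    return min(int(clamped * n_tactors), n_tactors - 1)
```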

These investigations show clear benefits of wearable haptic feedback for upper-limb
prosthetics by restoring lost force, pressure, slip, and proprioception sensations.
Current studies have primarily focused on restoring a single sensation, such as slip,
while restoring multiple sensations simultaneously could endow users with more stable
grasp and higher dexterity in real-life manipulation scenarios. A major challenge
is miniaturizing bulky multi-function haptic wearables to a size where the benefits
of the wearable device outweigh the discomfort and inconvenience of complex devices,
which have thus far limited long-term user compliance.

Lower-limb prosthetics

While a variety of lower-limb prostheses exist, relatively few provide sensory feedback
compared to upper-limb prosthetics [62]. The absence of feedback can lead to abnormalities in gait coordination,
deficient balance, and prolonged rehabilitation [63]–[65]. To relay ground-to-prosthesis contact force information, Fan et al. [66] developed a tactile system consisting of a cuff of four silicone pneumatic balloons
placed around the thigh that respond monotonically to pressure patterns recorded by
force sensors in the insole of the user. Six healthy subjects were able to differentiate
inflation patterns and direction of pressure stimuli, recognize three force levels,
and discriminate gait movements with 99.0 %, 94.8 %, 94.4 %, and 95.8 % accuracy, respectively.
Crea et al. [67] mapped the force recorded in the insole to vibrotactile feedback on the thigh skin,
providing information about gait-phase transitions. They demonstrated that the spatial
and temporal relationships between time-discrete vibrotactile feedback and gait-phase
transitions can be learned. In a study on twenty-four transtibial prosthesis users,
Rusaw et al. [68] conveyed body motion through vibratory feedback proportional to signals from force
sensors placed under the prosthetic foot. Vibratory feedback improved postural stability
and reduced response time for avoiding falls. Proprioceptive feedback in lower-limb
prostheses was investigated by Buma et al. [69] using a spatial electrotactile display of the prosthetic knee angle during gait.
Subjects wore electrodes on the medial side of the thigh just above the knee, and
the results showed that intermittent stimulation reduced habituation after 15 minutes.
Finally, Sharma et al. [70] investigated limb-motion responses to vibration stimuli applied to the thigh,
and showed that the average response time was 0.8 s and response accuracy was greater
than 90 %.
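The insole-to-thigh mapping used in systems like that of Fan et al. can be sketched as a proportional, clamped transfer from each insole force sensor to its corresponding pneumatic balloon. The pressure and force ranges below are assumptions for illustration, not parameters from the study:

```python
def insole_to_balloon_pressures(sensor_forces_n, p_max_kpa=40.0, f_max_n=400.0):
    """Map insole force-sensor readings (N) to cuff-balloon pressures (kPa).

    One balloon per sensor; the mapping is monotonic (proportional) and
    clamped, so stronger foot loading always inflates a balloon more and
    the spatial pressure pattern underfoot is preserved around the thigh.
    """
    return [p_max_kpa * min(max(f, 0.0), f_max_n) / f_max_n
            for f in sensor_forces_n]
```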

Most studies involving wearable haptics for lower-limb prosthetics have extracted
various gait characteristics, such as foot pressure patterns or gait phase detection,
from force-sensing insoles and then mapped these characteristics to prosthetic users
via haptic feedback. While these initial studies are promising, future research should
focus on restoring missing proprioceptive sensations at the ankle and knee joints
in combination with foot pressure patterns.

Vision aid for the blind

Engineers and scientists have long sought to enable visual substitution for the blind.
In a seminal study, Bach-y-Rita et al. [71] used a 20 × 20 array of tactors embedded in a dental chair to stimulate the skin
of the back of blind subjects, giving them a sense of “vision” through tactile substitution.
Research built on these initial efforts has resulted in a host of haptic wearables
as vision aids for the blind (see survey articles [72], [73]).

Although the waist has low tactile acuity, it is a natural location for haptic feedback
because it moves relatively little during ambulation. McDaniel et al. [74] developed a tactile belt of 7 equidistantly spaced tactors around the waist to alert
a blind user to another person’s presence. Results showed that the belt could convey
another person’s direction via vibration location and distance via
vibration duration. Karcher et al. [75] used a tactile belt consisting of 30 equidistantly spaced tactors in combination
with a digital compass to display the direction of magnetic north by continually vibrating
the tactor most closely aligned with magnetic north. Johnson and Higgins
[76] used a tactile belt with two attached web cameras to convert visual information to
a two-dimensional tactile depth map. Sensed objects triggered belt vibrations in the
object’s direction, with closer objects causing higher vibration frequencies. Several
studies have used tactile belts with GPS sensing for outdoor navigation by vibrating
tactors in the direction of required movement to reach an intended waypoint or final
destination [77]–[79].
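The belt schemes above share two simple mappings: direction selects which tactor vibrates, and distance modulates the stimulus. A sketch under assumed parameters (tactor counts and ranges are illustrative):

```python
def nearest_tactor(bearing_deg, n_tactors=7):
    """Index of the belt tactor closest to a target bearing, with tactor 0
    at the front (0 deg) and tactors equally spaced around the waist."""
    spacing = 360.0 / n_tactors
    return round((bearing_deg % 360.0) / spacing) % n_tactors

def distance_to_duration_s(distance_m, d_max_m=5.0, t_min_s=0.1, t_max_s=1.0):
    """Encode distance as vibration duration: closer targets give longer
    pulses, linearly, clamped at d_max_m (hypothetical ranges)."""
    clamped = min(max(distance_m, 0.0), d_max_m)
    return t_max_s - (t_max_s - t_min_s) * clamped / d_max_m
```

The Johnson and Higgins variant would instead map smaller distances to higher vibration frequencies; the same clamped-linear structure applies.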

The high density of mechanoreceptors in the hands and fingers makes these good locations
for haptic feedback. Amemiya et al. [80] attached vibrotactors to 3 fingers of each hand (Fig. 3) for guidance and navigation for the blind. Meers et al. [81] used electrostimulation gloves to relay tactile stimulation proportional to the distance
to objects in the environment. Blindfolded subjects were able to report obstacle locations,
avoid them, and walk to predefined destinations while navigating through outdoor locations
including a car parking lot and a college campus. Koo et al. [82] developed a soft, flexible fingertip tactile display with 20 electroactive polymer actuators
for Braille and for displaying visual information through the skin. Shah et al. [83] created a cylindrical handheld tactile device with 4 ultrasonic sensors pointing
front, left, right, and below the device held in front of the user. A 4 × 4 array
of vibrotactors embedded in the handle aligned with the fingers grasping the device,
with 4 tactors for each finger, excluding the thumb. Visual information from the ultrasonic
sensors mapped to the tactors and enabled blindfolded subjects to navigate to a predefined
location while avoiding obstacles. Ito et al. [84] created a handheld device tethered via a metal wire to the user’s belt. Users point
the device in the direction of intended navigation, and when ultrasonic sensors detect
objects, the wire tightens, pulling the hand toward the belt. When objects are far
away, the wire loosens, allowing the hand to extend. Gallo et al. [85] equipped a white cane with tactile vibrators for distance feedback and a spinning
inertia wheel to augment the contact sensation.

Fig. 3. Wearable finger vibrotactors can be used to encode Braille characters and for guidance
and navigation for the blind. Image from [80] used with permission from IEEE

Other locations targeted for haptic feedback as vision aids include the tongue, mouth,
torso, head, and feet. Bach-y-Rita et al. [49] developed a tongue stimulator composed of a 7 × 7 array of electrotactile elements. Users
recognized tactile stimulation patterns including circles, squares, and triangles,
which could potentially be used for blind navigation. Tang and Beebe [86] designed an oral tactile mouthpiece which stimulates the roof of the mouth via a
7 × 7 electrotactile display. The device delivers basic navigation direction cues
including move left, right, forward, or backward. Jones et al. [87] used a 4 × 4 array of vibrotactors along the lower back to guide subjects through
a grid of cones outside in a field. Mann et al. [88] retrofitted a helmet with a Kinect camera and a vibrotactile array around the forehead
to display visual information haptically for blind navigation. Finally,
tactors have been embedded in insoles and used to give direction cues for navigation
and to communicate an elevated risk of falling [89], [90] (Fig. 4).

Fig. 4. Vibration insoles can assist in navigation for the blind. Image from [89] used with permission from IEEE

There is a clear tradeoff between user comfort and density of feedback information
when deciding on the location to apply haptic feedback as a vision aid. While applying
tactile sensations to the waist or sole of the foot may be natural locations given
that most people already wear belts and shoe insoles, stimulating high-density mechanoreceptor
areas such as the mouth and fingertips enables higher resolution feedback that may
more realistically convey visual information. A key emphasis moving forward should
be identifying the most critical visual information for the blind and mapping this
in an intuitive way to the users. Given that human response to visual information
tends to be application specific, such as responding to non-verbal communication cues
versus changing gait patterns to avoid an identified obstacle during navigation, haptic
feedback strategies may also need to be application-specific instead of attempting
to generalize all visual information.

Auditory aid for the deaf

To hold conversations, the hearing impaired typically rely on visual or tactile cues,
such as fingerspelling, lip reading, or Tadoma. Alternatively, tactile vocoders perform
a frequency analysis of incoming auditory signals and display the spectral information
as stimulation on the skin of the hearing impaired [91], [92]. Saunders et al. [93] presented an abdomen belt of electrotactile stimulators encoding speech frequencies
for speech recognition in profoundly deaf children (hearing loss of greater than 90 dB
at 250 Hz). Improvement in speech production and intelligibility
was observed after a 4-month exploratory study. Boothroyd et al. [94] showed that intonation can be more easily recognized using mechanical strokes on
the skin, implemented as an array of eight solenoids actuated according to the pitch
extracted from a microphone or accelerometer. A comparison between multichannel vibrotactile
and electrotactile stimulation for relaying sound frequency is presented in [95]. The two tactile display devices differed in stimulation modality (vibrotactile,
electrotactile), location of stimulation (forearm, abdomen), and voice processing
(with and without noise suppression). Results showed that both devices provide benefits
beyond lipreading alone. Bernstein et al. [96] compared three vibrotactile vocoders on the forearm in normal-hearing and hearing-impaired
subjects and found that greater resolution in the second-formant region and linear
output scaling led to significant improvements in sentence lipreading with vocoders.
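The vocoder principle described above — band-pass energies mapped to stimulator amplitudes — can be sketched as follows. Channel count, band edges, frame length, and sample rate are illustrative assumptions, not parameters of any cited device:

```python
import math

def vocoder_channel_levels(frame, fs_hz=8000, n_channels=8,
                           f_lo_hz=100.0, f_hi_hz=4000.0):
    """Tactile-vocoder front end: estimate per-band energy of one audio
    frame via a direct DFT, one log-spaced band per tactor channel,
    normalized to [0, 1] for stimulator amplitude."""
    n = len(frame)
    # Logarithmically spaced band edges from f_lo_hz to f_hi_hz.
    ratio = (f_hi_hz / f_lo_hz) ** (1.0 / n_channels)
    edges = [f_lo_hz * ratio ** i for i in range(n_channels + 1)]

    def bin_magnitude(k):
        """Magnitude of DFT bin k of the frame."""
        re = sum(frame[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(frame[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        return math.hypot(re, im)

    energies = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        k_lo, k_hi = int(lo * n / fs_hz), int(hi * n / fs_hz)
        # Sum bin magnitudes in the band; guarantee at least one bin per band.
        energies.append(sum(bin_magnitude(k)
                            for k in range(k_lo, max(k_hi, k_lo + 1))))
    peak = max(energies)
    return [e / peak for e in energies] if peak > 0 else energies
```

A real device would run this continuously on short overlapping frames and drive one tactor per channel; noise suppression, as compared in [95], would filter the frame before this stage.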

Apart from speech recognition, it is also difficult for the hearing impaired to discriminate
environmental sounds. Reed et al. [97] demonstrated that normal-hearing and profoundly deaf subjects equipped with a wearable
spectral tactual aid were able to identify two bits of information in four 10-item
sets of sounds. Furthermore, because it is difficult for the hearing impaired to control
voice pitch, it is challenging for them to maintain a stable tone while speaking or
singing. Sakajiri et al. [98] developed a device of 64 piezoelectric vibrators arranged in rows of displacing pins
that contact the user’s finger. The pins push onto the skin, displaying the difference
between the user’s and the target pitch. Two hearing-impaired subjects with knowledge of and practice
in music tested the device’s capability to aid their singing. The tactile display system
reduced the average musical-interval deviation to 117.5 cents (the cent is a logarithmic
unit of measure used for musical intervals), which is comparable to that of normal-hearing
children.
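As a concrete illustration of the cent measure, the deviation between a sung frequency and a target pitch is 1200 · log₂(f/f_target):

```python
import math

def interval_cents(f_hz, f_target_hz):
    """Deviation of a sung pitch from its target, in cents
    (1200 cents = one octave, 100 cents = one equal-tempered semitone)."""
    return 1200.0 * math.log2(f_hz / f_target_hz)
```

For example, a singer producing 470 Hz against a 440 Hz target is about 114 cents sharp, roughly the average deviation reported above.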

The inherent complexity of language and subject-to-subject differences raise serious
challenges in developing highly effective haptic displays as auditory replacements.
It may be more realistic for haptic feedback to supplement existing auditory activities,
such as augmenting lipreading to resolve ambiguous lip-read messages [96], [99]. Further research should integrate more sensed auditory modalities into wearable
haptic technology, such as audio frequencies, voice aspiration, and temporal
patterns. Work to optimize voice-signal filters to match subject-specific
impairments could bring further benefits through haptic displays.