Ultrasound technology has continued to be miniaturized at a rapid pace for the past several decades. Recently, handheld smartphone-sized ultrasound systems have emerged and are enabling point-of-care imaging in austere environments and resource-poor settings. With further miniaturization, one can imagine that wearable smartwatch-sized imaging systems may soon be possible. What new opportunities can you imagine with wearable imaging? My research group has been pondering this question for a while, and we have been working on an unexpected application: using ultrasound imaging to sense muscle activity and volitionally control robotic devices.
Since antiquity, humans have sought to replace limbs lost to injury with articulated prosthetic devices. One of the earliest designs for an articulated mechanical prosthetic hand dates from the Second Punic War (218–201 BC). However, robust and intuitive volitional control of prosthetic hands has been a long-standing challenge that has yet to be adequately solved. Even though significant research investments have led to the development of sophisticated mechatronic hands with multiple degrees of freedom, a large proportion of amputees eventually abandon these devices, often citing limited functionality as a major factor.
A major barrier to improving functionality has been the challenge of inferring the intent of the amputee user and deriving appropriate control signals. Inferring the user’s intent has primarily been limited to noninvasive sensing of the electrical activity of muscles in the residual limb or more invasive sensing of electrical activity in the brain. Commercial myoelectric prosthetic hands utilize two skin-surface electrodes to record electrical activity from the flexor and extensor muscles of the residual stump. To select among multiple grips with just these two control signals, users often have to perform a sequence of non-intuitive maneuvers to choose from pre-programmed grips in a menu. This rather unnatural control mechanism significantly limits the potential functionality of these devices for activities of daily living.
Recently, systems with multiple electrodes that utilize pattern recognition algorithms to classify the intended grasp end-state from recorded signals have shown promise. However, the ability of amputees to translate end-state classification to intuitive real-time control with multiple degrees of freedom continues to be limited.
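To give a flavor of what end-state classification means here, the sketch below classifies a multi-channel activity feature vector against per-grasp templates using a simple nearest-centroid rule. This is a minimal illustration only: the channel count, grasp labels, and all signal values are synthetic placeholders, and real systems use far richer features and classifiers.

```python
import numpy as np

# Synthetic training data: one template (centroid) of mean activity
# features from 8 sensing channels per pre-trained grasp class.
# All values are illustrative placeholders, not real recorded signals.
rng = np.random.default_rng(0)
grasps = ["rest", "power", "pinch", "point"]
centroids = {g: rng.uniform(0.0, 1.0, size=8) for g in grasps}

def classify_grasp(feature_vec, centroids):
    """Return the grasp label whose template is nearest in Euclidean distance."""
    return min(centroids, key=lambda g: np.linalg.norm(feature_vec - centroids[g]))

# A new frame of features close to the "pinch" template is labeled "pinch".
frame = centroids["pinch"] + rng.normal(0.0, 0.01, size=8)
print(classify_grasp(frame, centroids))  # → pinch
```

The limitation the paragraph above points to is visible even in this toy: the classifier outputs a discrete end-state label per frame, not a continuous trajectory, which is part of why translating classification into fluid real-time control remains hard.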
To address these limitations, invasive strategies, such as implanted myoelectric sensors, are being pursued. Another approach, known as targeted muscle reinnervation, involves surgically transferring the residual peripheral nerves from the amputated limb to different intact muscle targets that can function as a biological amplifier of the motor nerve signal. While these invasive strategies hold exciting promise, there continues to be a need for better noninvasive sensing.
Recently, our research group has demonstrated that ultrasound imaging can be used to resolve the activity of the various muscle compartments in the residual forearm. When amputees imagine volitionally controlling their phantom limb, the innervated residual muscles in the stump contract, and this mechanical contraction can be visualized clearly on ultrasound. Indeed, one of the major strengths of ultrasound is the exquisite ability to quantify even minute tissue motion. Contractions of both superficial and deep-seated functional muscle compartments can be spatially resolved, enabling high specificity in differentiating between different intended movements.
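Quantifying minute tissue motion from ultrasound is commonly done with speckle-tracking-style block matching between successive frames. Here is a deliberately minimal one-dimensional sketch, assuming synthetic echo lines and an integer-lag normalized cross-correlation search; real implementations work in 2-D with sub-sample interpolation.

```python
import numpy as np

# Synthetic "echo lines": a speckle-like reference signal and the same
# line shifted by 3 samples, standing in for two successive frames in
# which tissue has moved. Values are made up for illustration.
rng = np.random.default_rng(1)
ref = rng.normal(size=256)
cur = np.roll(ref, 3)

def estimate_shift(ref, cur, max_lag):
    """Find the integer lag that maximizes normalized cross-correlation."""
    a = ref[max_lag: len(ref) - max_lag]
    a0 = a - a.mean()
    best_lag, best_score = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        b = cur[max_lag + lag: len(cur) - max_lag + lag]
        b0 = b - b.mean()
        score = np.dot(a0, b0) / (np.linalg.norm(a0) * np.linalg.norm(b0))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

print(estimate_shift(ref, cur, max_lag=5))  # → 3
```

Because speckle patterns decorrelate slowly under small motions, the correlation peak is sharp, which is one reason ultrasound is so sensitive to small displacements.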
Our research has shown that this approach, which we term sonomyography, can exceed the grasp classification accuracy of state-of-the-art myoelectric pattern recognition and, crucially, enables intuitive proportional position control by using the mechanical deformation of muscles as the control signal. In studies with transradial amputees, we have demonstrated the ability to generate robust control signals and intuitive position-based proportional control across multiple degrees of freedom with very little training, typically just a few minutes.
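The idea of position-based proportional control can be sketched very simply: a normalized deformation signal is mapped to a continuous joint position rather than to a discrete grip label. The example below assumes a hypothetical single-channel deformation trace, hand-picked rest and maximum-contraction calibration levels, and an illustrative exponential-smoothing constant; none of these reflect our actual system parameters.

```python
import numpy as np

def deformation_to_position(signal, rest_level, max_level, alpha=0.3):
    """Map a muscle-deformation signal to a normalized joint position
    in [0, 1], with simple exponential smoothing to reduce jitter.
    rest_level/max_level come from a brief per-user calibration."""
    pos, out = 0.0, []
    for s in signal:
        target = np.clip((s - rest_level) / (max_level - rest_level), 0.0, 1.0)
        pos = alpha * target + (1 - alpha) * pos  # smooth toward target
        out.append(pos)
    return out

# Hypothetical deformation trace: rest, then half, then full contraction.
trace = [0.1, 0.1, 0.5, 0.5, 0.9, 0.9]
positions = deformation_to_position(trace, rest_level=0.1, max_level=0.9)
print(round(positions[-1], 2))  # → 0.63 (still converging toward 1.0)
```

The key contrast with the classification sketch earlier is that the output here is a graded position the user can hold anywhere in between, which is what makes the control feel proportional and intuitive.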
We are now working on miniaturizing this technology into a low-power wearable system with compact electronics that can be incorporated into a prosthetic socket, and on developing prototype systems that can be tested in clinical trials. The feedback we have received so far from our amputee subjects and clinicians indicates that this ultrasound technology can overcome many of the current challenges in the field, and potentially improve the functionality and quality of life of amputee users.
Now, if noninvasive ultrasound neuromodulation could also be used to provide haptic and sensory feedback in a closed-loop ultrasound-based sensing and stimulation system, we would be a step closer to restoring sensorimotor functionality to amputee users, and a grand challenge in the field of neuroprosthetics may be within reach. That will, of course, require some more research.
I was attracted to ultrasound research as a graduate student because of the nearly limitless possibilities of ultrasound technology beyond traditional imaging applications. As wearable sensors revolutionize healthcare, perhaps wearable ultrasound may have a role to play. One can only imagine what other novel applications may be enabled as the technology continues to be miniaturized. I think it is an exciting time to be an ultrasound researcher.
What new opportunities can you imagine with wearable imaging? Are you working on something using miniaturized ultrasound? Comment below or let us know on Twitter: @AIUM_Ultrasound.
Siddhartha Sikdar, PhD, is a Professor in the Bioengineering Department in the Volgenau School of Engineering at George Mason University.