Is this project an undergraduate, graduate, or faculty project?
Undergraduate
Project Type
group
Authors' Class Standing
All, Senior
Lead Presenter's Name
Adam J. Berlier
Faculty Mentor Name
Eduardo A. Divo
Abstract
Traditional remotely operated vehicles (ROVs) require extensive setup and unnatural control systems. By integrating wearable devices as a control system, operators gain mobility and situational awareness to execute additional tasks. Analysis is conducted to determine whether wearable devices connected through the Internet of Things (IoT) allow for a more natural control system. A gesture-recognition armband worn around the operator's forearm reads surface electromyography (sEMG) signals produced by the forearm muscles to recognize hand gestures. An Augmented Reality (AR) headset overlays supplemental information on a heads-up display (HUD). IoT enables each component of the system to transmit and receive data over a network. The AR headset serves as the central processing unit, processing sEMG signals and transmitting the corresponding commands to an ROV. The ROV acts on the received commands and transmits data describing its actions and environment back to the headset for display. A library of electrical signals corresponding to the hand gestures defined in US Army Publication TC 3-21.60 is developed as a control set of commands. Signal processing and machine learning methods are implemented to reduce cross-talk and interference in the weak sEMG signals for accurate gesture recognition. Results provide insight into the effectiveness of neuromuscular control compared to human-to-human instruction, and into how wearable control systems can increase operator situational awareness.
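As an illustration of the gesture-recognition stage described above, the following is a minimal Python sketch of an sEMG classification pipeline. It assumes an 8-channel armband sampled at 200 Hz, a Butterworth band-pass filter as the signal-processing step, and a support-vector machine as the learning method; these parameters, the feature choices, and names such as classify_gesture are illustrative assumptions, not the project's actual implementation.

# Minimal sketch of an sEMG gesture-recognition pipeline (assumptions noted above).
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC

FS = 200          # sampling rate in Hz (assumed)
N_CHANNELS = 8    # armband electrode count (assumed)

def bandpass(window, low=20.0, high=95.0, order=4):
    """Band-pass filter each sEMG channel to suppress noise and motion artifacts."""
    b, a = butter(order, [low / (FS / 2), high / (FS / 2)], btype="band")
    return filtfilt(b, a, window, axis=0)

def features(window):
    """Simple time-domain features per channel: mean absolute value and RMS."""
    mav = np.mean(np.abs(window), axis=0)
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    return np.concatenate([mav, rms])

def train(windows, labels):
    """Fit a classifier on labeled sEMG windows (labels drawn from the gesture set)."""
    X = np.array([features(bandpass(w)) for w in windows])
    clf = SVC(kernel="rbf")
    clf.fit(X, labels)
    return clf

def classify_gesture(clf, window):
    """Map a raw sEMG window to a gesture command to be relayed to the ROV."""
    return clf.predict([features(bandpass(window))])[0]

In a full system, the predicted gesture label would be translated into an ROV command and transmitted over the network, with telemetry returned for the HUD; those components are outside the scope of this sketch.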
Did this research project receive funding support (Spark, SURF, Research Abroad, Student Internal Grants, Collaborative, Climbing, or Ignite Grants) from the Office of Undergraduate Research?
Yes, Ignite Grant
Integration of Augmented Reality and Neuromuscular Control Systems for Remote Vehicle Operations