Is this project an undergraduate, graduate, or faculty project?
Undergraduate
Project Type
group
Authors' Class Standing
Charles Pollock - Senior
Jeremiah Lantzer - Senior
Marcus Isnard - Senior
Abdullah Alosaimi - Senior
Lead Presenter's Name
Jeremiah Lantzer
Faculty Mentor Name
Dr. Eduardo Divo
Abstract
Developing heterogeneous vehicle operations is the next step in advancing how this technology can be utilized. Vehicles that operate in similar environments cooperate autonomously more easily than vehicles designed for different environments. This project focuses on leveraging the unique abilities of two vehicles with different operating environments and merging them into one system controlled by a single user. The system pairs one ground vehicle with one quadcopter, which work in tandem. The user's spatial awareness and maneuverability are achieved through a Heads-Up Display along with EMG sensors that recognize hand gestures. Because the EMG sensors replace a physical controller, the user's hands remain free should they need to complete other tasks. Upon completion of this project in the coming weeks, each vehicle will be independently controllable by the user, with sensor information sent back to the Heads-Up Display. The system will also allow both vehicles to work in tandem: the user controls one vehicle while the second vehicle reacts to the movements of the first.
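The control scheme described above (EMG gestures mapped to vehicle commands, with the second vehicle reacting to the first) could be sketched as follows. This is a minimal illustration only: the gesture labels, command names, gains, and the `follower_step` helper are all hypothetical assumptions, not the project's actual implementation.

```python
# Hypothetical sketch: classified EMG gestures -> vehicle commands, plus a
# simple "follower" behavior for the second vehicle. All names, gestures,
# and thresholds are illustrative assumptions.
from dataclasses import dataclass

# Gesture labels an EMG classifier might emit (assumed set)
GESTURE_COMMANDS = {
    "fist": "stop",
    "wave_left": "turn_left",
    "wave_right": "turn_right",
    "fingers_spread": "forward",
}

@dataclass
class Pose:
    x: float
    y: float

def gesture_to_command(gesture: str) -> str:
    """Translate a recognized hand gesture into a vehicle command."""
    return GESTURE_COMMANDS.get(gesture, "hold")

def follower_step(leader: Pose, follower: Pose,
                  gain: float = 0.5, standoff: float = 1.0) -> Pose:
    """Move the follower a fraction of the way toward the leader,
    holding position once it is within the standoff distance."""
    dx, dy = leader.x - follower.x, leader.y - follower.y
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= standoff:
        return follower  # close enough; hold position
    step = gain * (dist - standoff) / dist
    return Pose(follower.x + dx * step, follower.y + dy * step)

if __name__ == "__main__":
    print(gesture_to_command("fist"))                # -> stop
    print(follower_step(Pose(10.0, 0.0), Pose(0.0, 0.0)))
```

In this sketch the follower closes only part of the gap each step and keeps a standoff distance, so it trails the leader smoothly rather than colliding with it.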
Did this research project receive funding support (Spark, SURF, Research Abroad, Student Internal Grants, Collaborative, Climbing, or Ignite Grants) from the Office of Undergraduate Research?
Yes, Ignite Grant
Cross Platform Training via Augmented Reality and Neuromuscular Control Systems