Date of Award

Summer 2001

Document Type

Thesis - Open Access

Degree Name

Master of Science in Human Factors & Systems

Department

Human Factors and Systems

Committee Chair

John A. Wise, Ph.D.

Committee Member

Dennis A. Vincenzi, Ph.D.

Committee Member

Andrew J. Kornecki, Ph.D.

Abstract

Humans interact with their environment by obtaining information from multiple sensory modalities, which combine to facilitate manipulation of, and interaction with, objects and the environment. Human-computer interaction mirrors this environmental interaction, except that feedback from the tactile channel is absent. Most computer operation is completed visually because the primary feedback humans currently receive from computers is through the eyes. This strong dependence on the visual modality can cause visual fatigue and fixation on displays, resulting in errors and a decrease in performance. Distributing tasks and information across sensory modalities may mitigate this problem. This study added tactile feedback to the human-computer interface through vibration of a mouse, to more accurately reflect humans' multi-sensory interaction with their environment. The investigation used time off target to measure performance in a pursuit-tracking task. The independent variables were type of feedback with two levels (tactile feedback vs. no tactile feedback) and target speed with three levels (slow, medium, and fast). Tactile feedback improved pursuit-tracking performance by 6%. Significant main effects were found for both the speed and feedback factors, but no significant interaction between speed and feedback was obtained. This improvement in performance is consistent with previous research and lends further support to the advantages multimodal feedback may offer man-machine interfaces.
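The 2 × 3 factorial design described above (feedback × target speed, with time off target as the dependent measure) would typically be analyzed with a balanced two-way ANOVA. As a minimal sketch, the function below computes the F ratios for the two main effects and their interaction from scratch; the cell counts, participant numbers, and data values are hypothetical illustrations, not the thesis data.

```python
import random
from statistics import mean

def two_way_anova(data):
    """Balanced two-way ANOVA with replication.

    data[i][j] is a list of n replicate scores for level i of factor A
    (feedback) and level j of factor B (speed). Returns the F ratios for
    both main effects and the interaction.
    """
    a, b = len(data), len(data[0])
    n = len(data[0][0])
    scores = [x for row in data for cell in row for x in cell]
    grand = mean(scores)

    # Sums of squares for total, each factor, cells, interaction, and error
    ss_total = sum((x - grand) ** 2 for x in scores)
    ss_a = b * n * sum((mean([x for cell in row for x in cell]) - grand) ** 2
                       for row in data)
    ss_b = a * n * sum((mean([x for i in range(a) for x in data[i][j]]) - grand) ** 2
                       for j in range(b))
    ss_cells = n * sum((mean(cell) - grand) ** 2 for row in data for cell in row)
    ss_ab = ss_cells - ss_a - ss_b
    ss_err = ss_total - ss_cells

    # Degrees of freedom and mean-square error
    df_a, df_b = a - 1, b - 1
    df_ab, df_err = (a - 1) * (b - 1), a * b * (n - 1)
    ms_err = ss_err / df_err

    return {
        "F_feedback": (ss_a / df_a) / ms_err,
        "F_speed": (ss_b / df_b) / ms_err,
        "F_interaction": (ss_ab / df_ab) / ms_err,
    }

# Hypothetical data: time off target (s) for 5 simulated participants per
# cell; faster targets yield more time off target, and the tactile rows
# are scaled down 6% to echo the reported improvement.
random.seed(1)
base_by_speed = [2.0, 3.0, 4.5]  # slow, medium, fast
data = [
    [[b * 0.94 + random.gauss(0, 0.3) for _ in range(5)] for b in base_by_speed],  # tactile
    [[b + random.gauss(0, 0.3) for _ in range(5)] for b in base_by_speed],         # no tactile
]
result = two_way_anova(data)
```

Each F ratio would then be compared against the critical F value for its degrees of freedom (or a p-value) to judge significance, matching the pattern of results reported in the abstract: two significant main effects and no significant interaction.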
