Robotics and distortions modulate adaptation
In the neural engineering field of human-machine interaction, one exciting new goal is to modulate neural control through interactions with virtual reality and robotic interfaces, which can substantially advance conventional training practices in areas such as surgery, piloting, military remote control, music and sports performance, and neurorehabilitation. Models of human motor control emphasize the importance of movement errors in driving adaptation in the central nervous system. As such, naturally occurring movement errors may be artificially distorted in haptic-graphic virtual reality environments. This thesis examined two studies of new classes of error augmentation in virtual reality environments and included the development of a novel robotic interface that made one of these studies possible. The purpose was to determine whether adaptation patterns observed in haptic-graphic interactions would be more amenable to training with the aid of error augmentation.

The first study indicated that subjects who received error augmentation performed better by the end of training, reaching targets more quickly, more accurately, and with more continuous movements. The second study presented a methodology for developing and integrating a robot into a haptic-graphic interface; benefits included increased access for collaboration between research labs and decreased developmental overhead. The third study demonstrated that subjects who received error augmentation were capable of confining movement variability within a region while performing redundant tasks. Results indicated that error during training is permissible so long as it does not affect task performance. These results may be of particular interest for facilitating the rehabilitation of people with motor deficits, such as patients with stroke or traumatic brain injury, and may be widely applicable in any training environment where distortions can be applied.
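As a purely illustrative aside, the idea of artificially distorting naturally occurring movement errors can be sketched with a simple gain-based scheme, in which the displayed cursor exaggerates the true deviation from the target. The function name, the gain value, and the gain-based form itself are assumptions for illustration, not the specific augmentation used in the thesis:

```python
import numpy as np

def augment_error(hand_pos, target_pos, gain=2.0):
    """Return a displayed cursor position with visually amplified error.

    The cursor is drawn `gain` times the actual hand-to-target error
    away from the target (gain-based error augmentation; the value 2.0
    is a hypothetical example, not taken from the thesis).
    """
    error = np.asarray(hand_pos, dtype=float) - np.asarray(target_pos, dtype=float)
    return np.asarray(target_pos, dtype=float) + gain * error

# Example: the hand is 1 cm to the right of the target, so with a gain
# of 2.0 the cursor is displayed 2 cm to the right of the target.
displayed = augment_error([0.01, 0.0], [0.0, 0.0])
```

With a gain of 1.0 the display is veridical; gains above 1.0 enlarge the error the subject sees, which is the mechanism such training environments use to accelerate adaptation.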