Sensorimotor transformations translate sensory information about the intrinsic properties of objects (i.e., size, shape, orientation) into motor commands for appropriate hand-object interaction. Hence, the direct result of sensorimotor transformation for a reach-to-grasp action is hand kinematics (hand shaping) fitting the object size. We assembled and evaluated a sensor-based glove to measure finger flexion during the reaching of differently sized cylinders. Once the proper functioning of the tool was verified, we employed the glove in two studies on grasping with and without vision. The first study aimed to causally draw a functional map of PMC for visually based grasping. Specifically, online TMS was applied over a grid covering the whole precentral gyrus while subjects grasped three differently sized cylinders. Output from our sensor glove was analyzed with a hypothesis-independent approach using classification algorithms. Results from the classifiers convincingly suggested a multifocal representation of visually based grasping in human PMC involving the ventral PMC and, for the first time in humans, the supplementary motor area. The second study aimed to establish whether gaze direction modulates hand shaping during haptically based reaching as it does during visually based reaching. Participants haptically explored and then grasped an object of three possible sizes aligned with the body midline while looking either in the direction of the object or laterally to it. Results showed that gaze direction asymmetrically affected finger flexion during haptically based reaching. Despite this asymmetrical effect, the investigation provided evidence for retinotopic coding of haptically explored objects.
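The hypothesis-independent analysis mentioned above (classifiers decoding object size from glove output) can be sketched as follows. This is a minimal illustrative example, not the thesis's actual pipeline: the sensor count, trial counts, flexion values, and the choice of a linear SVM with cross-validation are all assumptions made here for demonstration.

```python
# Hypothetical sketch: decoding grasped-object size from per-trial
# finger-flexion features of a 5-sensor glove, using a linear SVM
# with cross-validation. All data below are synthetic.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic dataset: 3 object sizes x 60 trials, 5 flexion sensors.
# Assumption for illustration: larger objects evoke larger mean flexion.
n_trials, n_sensors = 60, 5
X = np.vstack([
    rng.normal(loc=mean_flexion, scale=0.5, size=(n_trials, n_sensors))
    for mean_flexion in (1.0, 1.5, 2.0)  # small, medium, large cylinder
])
y = np.repeat([0, 1, 2], n_trials)  # object-size labels

# Standardize features, then classify; 5-fold cross-validated accuracy
# well above the 1/3 chance level indicates size information is present
# in the flexion pattern.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean decoding accuracy: {scores.mean():.2f} (chance = 0.33)")
```

In a hypothesis-independent design like this, above-chance decoding accuracy, rather than a pre-specified kinematic parameter, serves as the evidence that a given condition (e.g., a TMS site) carries or disrupts grasp-related information.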
|Title:||Neuropsychological and behavioral studies on object grasping in humans with and without vision|
|Publication date:||2021|
|Appears in collections:||07.13 Doctoral Thesis|