umu.se Publications
1 - 11 of 11
  • 1.
    Bowman, Miles C
    Centre for Neuroscience Studies and Department of Psychology, Queen’s University, Kingston, ON K7L 3N6, Canada.
    Johansson, Roland S
    Umeå University, Faculty of Medicine, Department of Integrative Medical Biology (IMB), Physiology.
    Flanagan, John Randall
    Centre for Neuroscience Studies and Department of Psychology, Queen’s University, Kingston, ON K7L 3N6, Canada.
    Eye-hand coordination in a sequential target contact task. 2009. In: Experimental Brain Research, ISSN 0014-4819, E-ISSN 1432-1106, Vol. 195, no 2, p. 273-283. Article in journal (Refereed)
    Abstract [en]

    Most object manipulation tasks involve a series of actions demarcated by mechanical contact events, and gaze is typically directed to the locations of these events as the task unfolds. Here, we examined the timing of gaze shifts relative to hand movements in a task in which participants used a handle to contact sequentially five virtual objects located in a horizontal plane. This task was performed both with and without visual feedback of the handle position. We were primarily interested in whether gaze shifts, which in our task shifted from a given object to the next about 100 ms after contact, were predictive or were triggered by tactile feedback related to contact. To examine this issue, we included occasional catch contacts where forces simulating contact between the handle and object were removed. In most cases, removing force did not alter the timing of gaze shifts irrespective of whether or not vision of handle position was present. However, in about 30% of the catch contacts, gaze shifts were delayed. This percentage corresponded to the fraction of contacts with force feedback in which gaze shifted more than 130 ms after contact. We conclude that gaze shifts are predictively controlled but timed so that the hand actions around the time of contact are captured in central vision. Furthermore, a mismatch between the expected and actual tactile information related to the contact can lead to a reorganization of gaze behavior for gaze shifts executed more than 130 ms after a contact event.

  • 2.
    Flanagan, J Randall
    Bittner, Jennifer P
    Johansson, Roland S
    Umeå University, Faculty of Medicine, Department of Integrative Medical Biology (IMB), Physiology.
    Experience can change distinct size-weight priors engaged in lifting objects and judging their weights. 2008. In: Current Biology, ISSN 0960-9822, E-ISSN 1879-0445, Vol. 18, no 22, p. 1742-1747. Article in journal (Refereed)
    Abstract [en]

    The expectation that object weight increases with size guides the control of manipulatory actions [1-6] and also influences weight perception. Thus, the size-weight illusion, whereby people perceive the smaller of two equally weighted objects to be heavier, is thought to arise because weight is judged relative to expected weight that, for a given family of objects, increases with size [2, 7]. Here, we show that the fundamental expectation that weight increases with size can be altered by experience and neither is hard-wired nor becomes crystallized during development. We demonstrate that multiday practice in lifting a set of blocks whose color and texture are the same and whose weights vary inversely with volume gradually attenuates and ultimately inverts the size-weight illusion tested with similar blocks. We also show that in contrast to this gradual change in the size-weight illusion, the sensorimotor system rapidly learns to predict the inverted object weights, as revealed by lift forces. Thus, our results indicate that distinct adaptive size-weight maps, or priors, underlie weight predictions made in lifting objects and in judging their weights. We suggest that size-weight priors that influence weight perception change slowly because they are based on entire families of objects. Size-weight priors supporting action are more flexible, and adapt more rapidly, because they are tuned to specific objects and their current state.

  • 3.
    Flanagan, J Randall
    Bowman, Miles C
    Johansson, Roland
    Umeå University, Faculty of Medicine, Department of Integrative Medical Biology (IMB), Physiology.
    Control strategies in object manipulation tasks. 2006. In: Current Opinion in Neurobiology, ISSN 0959-4388, Vol. 16, no 6, p. 650-659. Article in journal (Other academic)
    Abstract [en]

    The remarkable manipulative skill of the human hand is not the result of rapid sensorimotor processes, nor of fast or powerful effector mechanisms. Rather, the secret lies in the way manual tasks are organized and controlled by the nervous system. At the heart of this organization is prediction. Successful manipulation requires the ability both to predict the motor commands required to grasp, lift, and move objects and to predict the sensory events that arise as a consequence of these commands.

  • 4.
    Flanagan, J Randall
    Department of Psychology and Centre for Neuroscience Studies, Queen’s University, Kingston, Ontario, Canada.
    Merritt, Kyle
    Department of Psychology and Centre for Neuroscience Studies, Queen’s University, Kingston, Ontario, Canada.
    Johansson, Roland S
    Umeå University, Faculty of Medicine, Department of Integrative Medical Biology (IMB), Physiology.
    Predictive mechanisms and object representations used in object manipulation. 2009. In: Sensorimotor Control of Grasping: Physiology and Pathophysiology, Cambridge: Cambridge University Press, 2009, p. 161-177. Chapter in book (Other (popular science, discussion, etc.))
    Abstract [en]

    Skilled object manipulation requires the ability to estimate, in advance, the motor commands needed to achieve desired sensory outcomes and the ability to predict the sensory consequences of the motor commands. Because the mapping between motor commands and sensory outcomes depends on the physical properties of grasped objects, the motor system may store and access internal models of objects in order to estimate motor commands and predict sensory consequences. In this chapter, we outline evidence for internal models and discuss their role in object manipulation tasks. We also consider the relationship between internal models of objects employed by the sensorimotor system and representations of the same objects used by the perceptual system to make judgments about objects.

  • 5.
    Flanagan, J Randall
    Terao, Yasuo
    Johansson, Roland S
    Umeå University, Faculty of Medicine, Department of Integrative Medical Biology (IMB), Physiology.
    Gaze behavior when reaching to remembered targets. 2008. In: Journal of Neurophysiology, ISSN 0022-3077, E-ISSN 1522-1598, Vol. 100, no 3, p. 1533-1543. Article in journal (Refereed)
    Abstract [en]

    People naturally direct their gaze to visible hand movement goals. Doing so improves reach accuracy through use of signals related to gaze position and visual feedback of the hand. Here, we studied where people naturally look when acting on remembered target locations. Four targets were presented on a screen, in peripheral vision, while participants fixated a central cross (encoding phase). Four seconds later, participants used a pen to mark the remembered locations while free to look wherever they wished (recall phase). Visual references, including the screen and the cross, were present throughout. During recall, participants neither looked at the marked locations nor prevented eye movements. Instead, gaze behavior was erratic and consisted of gaze shifts loosely coupled in time and space with hand movements. To examine whether eye and hand movements during encoding affected gaze behavior during recall, in additional encoding conditions, participants marked the visible targets with either free gaze or with central cross fixation, or just looked at the targets. All encoding conditions yielded similar erratic gaze behavior during recall. Furthermore, encoding mode did not influence recall performance, suggesting that participants, during recall, did not exploit sensorimotor memories related to hand and gaze movements during encoding. Finally, we recorded a similar loose coupling between hand and eye movements during an object manipulation task performed in darkness after participants had viewed the task environment. We conclude that acting on remembered versus visible targets can engage fundamentally different control strategies, with gaze largely decoupled from movement goals during memory-guided actions.

  • 6.
    Johansson, Roland
    Umeå University, Faculty of Medicine, Department of Integrative Medical Biology (IMB), Physiology.
    Flanagan, JR
    Queen’s University, Kingston, ON, Canada.
    Tactile sensory control of object manipulation in humans. 2008. In: The senses, a comprehensive reference: Somatosensation, Vol 6 / [ed] Esther Gardner, Jon H Kaas, Amsterdam: Elsevier, 2008, 1, Vol. 6, p. 67-86. Chapter in book (Other (popular science, discussion, etc.))
    Abstract [en]

    Dexterous object manipulation is a hallmark of human skill. The versatility of the human hands in manipulation tasks depends on both the anatomical structure of the hands and the neural machinery that controls them. Research during the last 20 years has led to important advances in our understanding of the sensorimotor control mechanisms that underlie dexterous object manipulation. This article focuses on the sensorimotor control of fingertip actions with special emphasis on the role of tactile sensory mechanisms. It highlights the importance of sensory predictions, especially related to mechanical contact events around which manipulation tasks are organized, and analyzes how such predictions are influenced by tactile afferent signals recorded in single neurons in awake humans.

  • 7.
    Johansson, Roland S
    Umeå University, Faculty of Medicine, Department of Integrative Medical Biology (IMB), Physiology.
    Flanagan, J Randall
    Umeå University, Faculty of Medicine, Department of Integrative Medical Biology (IMB), Physiology.
    Coding and use of tactile signals from the fingertips in object manipulation tasks. 2009. In: Nature Reviews Neuroscience, ISSN 1471-003X, E-ISSN 1471-0048, Vol. 10, no 5, p. 345-359. Article in journal (Refereed)
    Abstract [en]

    During object manipulation tasks, the brain selects and implements action-phase controllers that use sensory predictions and afferent signals to tailor motor output to the physical properties of the objects involved. Analysis of signals in tactile afferent neurons and central processes in humans reveals how contact events are encoded and used to monitor and update task performance.

  • 8.
    Johansson, Roland S
    Umeå University, Faculty of Medicine, Department of Integrative Medical Biology (IMB), Physiology.
    Flanagan, J Randall
    Umeå University, Faculty of Medicine, Department of Odontology.
    Sensorimotor control of manipulation. 2009. In: Encyclopedia of Neuroscience, Elsevier, 2009, 8, p. 593-604. Chapter in book (Other (popular science, discussion, etc.))
  • 9.
    Johansson, Roland S
    Umeå University, Faculty of Medicine, Department of Integrative Medical Biology (IMB), Physiology.
    Flanagan, JR
    Sensory control of object manipulation. 2009. In: Sensorimotor control of grasping: Physiology and pathophysiology / [ed] Dennis A. Nowak, Joachim Hermsdörfer, Cambridge: Cambridge University Press, 2009, p. 141-160. Chapter in book (Other (popular science, discussion, etc.))
    Abstract [en]

    Series of action phases characterize natural object manipulation tasks, where each phase is responsible for satisfying a task subgoal. Subgoal attainment typically corresponds to distinct mechanical contact events, involving either the making or breaking of contact between the digits and an object, or between a held object and another object. Subgoals are realized by the brain selecting and sequentially implementing suitable action-phase controllers that use sensory predictions and afferent signals in specific ways to tailor the motor output in anticipation of requirements imposed by objects' physical properties. This chapter discusses the use of tactile and visual sensory information in this context. It highlights the importance of sensory predictions, especially related to the discrete and distinct sensory events associated with contact events linked to subgoal completion, and considers how sensory signals influence and interact with such predictions in the control of manipulation tasks.

  • 10.
    Rotman, Gerben
    Troje, Nikolaus F
    Johansson, Roland S
    Umeå University, Faculty of Medicine, Department of Integrative Medical Biology (IMB), Physiology.
    Flanagan, J Randall
    Eye movements when observing predictable and unpredictable actions. 2006. In: Journal of Neurophysiology, ISSN 0022-3077, Vol. 96, no 3, p. 1358-1369. Article in journal (Refereed)
    Abstract [en]

    We previously showed that, when observers watch an actor performing a predictable block-stacking task, the coordination between the observer's gaze and the actor's hand is similar to the coordination between the actor's gaze and hand. Both the observer and the actor direct gaze to forthcoming grasp and block landing sites and shift their gaze to the next grasp or landing site at around the time the hand contacts the block or the block contacts the landing site. Here we compare observers' gaze behavior in a block manipulation task when the observers did and when they did not know, in advance, which of two blocks the actor would pick up first. In both cases, observers managed to fixate the target ahead of the actor's hand and showed proactive gaze behavior. However, these target fixations occurred later, relative to the actor's movement, when observers did not know the target block in advance. In perceptual tests, in which observers watched animations of the actor reaching partway to the target and had to guess which block was the target, we found that the time at which observers were able to correctly do so was very similar to the time at which they would make saccades to the target block. Overall, our results indicate that observers use gaze in a fashion that is appropriate for hand movement planning and control. This in turn suggests that they implement representations of the manual actions required in the task and representations that direct task-specific eye movements.

  • 11.
    Sailer, Uta
    Umeå University, Faculty of Medicine, Department of Integrative Medical Biology (IMB), Physiology.
    Flanagan, J Randall
    Johansson, Roland
    Umeå University, Faculty of Medicine, Department of Integrative Medical Biology (IMB), Physiology.
    Eye-hand coordination during learning of a novel visuomotor task. 2005. In: Journal of Neuroscience, ISSN 1529-2401, Vol. 25, no 39, p. 8833-8842. Article in journal (Refereed)
    Abstract [en]

    We investigated how gaze behavior and eye-hand coordination change when subjects learned a challenging visuomotor task that required acquisition of a novel mapping between bimanual actions and their visual sensory consequences. By applying isometric forces and torques to a rigid tool held freely between the two hands, subjects learned to control a cursor on a computer screen to hit successively displayed targets as quickly as possible. The learning occurred in stages that could be distinguished by changes in performance (target-hit rate) as well as by gaze behavior and eye-hand coordination. In a first exploratory stage, the hit rate was consistently low, the cursor position varied widely, and gaze typically pursued the cursor. In a second skill acquisition stage, the hit rate improved rapidly, and gaze fixations began to predictively mark desired cursor positions, indicating that subjects started to program spatially congruent eye and hand motor commands. In a third skill refinement stage, performance continued to improve gradually, and gaze shifted directly toward the target. We suggest that during the exploratory stage, the learner attempts to establish basic mapping rules between manual actions and eye-movement commands. In this process, subjects may establish correlations between hand motor commands and their visual sensory consequences, primarily in fovea-anchored, gaze-centered coordinates, and correlations between recent hand motor commands and eye motor commands. The established mapping rules are then implemented and refined in the skill acquisition and refinement stages.
