
Modelling Human Motion : From Human Perception to Robot Design / edited by Nicoletta Noceti, Alessandra Sciutti, Francesco Rea

Contributor(s): Noceti, Nicoletta | Sciutti, Alessandra | Rea, Francesco
Resource type: Book (Online)
Language: English
Series: Springer eBook Collection
Publisher: Cham : Springer International Publishing : Imprint: Springer, 2020
Edition: 1st ed. 2020
Description: 1 online resource (IX, 354 p., 100 illus., 89 illus. in color)
ISBN:
  • 9783030467326
Additional physical formats: Also available as print editions: 9783030467319 | 9783030467333 | 9783030467340
DOI: 10.1007/978-3-030-46732-6
Contents: 1. Modelling Human Motion: A Task at the Crossroads of Neuroscience, Computer Vision and Robotics -- 2. The Neurophysiology of Action Perception -- 3. Beyond Automatic Motor Mapping: New Insights into Top-down Modulations on Action Perception -- 4. The Visual Perception of Biological Motion in Adults -- 5. The Development of Action Perception -- 6. The Importance of the Affective Component of Movement in Action Understanding.
Summary: The new frontiers of robotics research foresee future scenarios where artificial agents will leave the laboratory to progressively take part in the activities of our daily life. This will require robots to have very sophisticated perceptual and action skills in many intelligence-demanding applications, with particular reference to the ability to interact seamlessly with humans. It will be crucial for the next generation of robots to understand their human partners and, at the same time, to be intuitively understood by them. In this context, a deep understanding of human motion is essential for robotics applications, where the ability to detect, represent and recognize human dynamics and the capability to generate appropriate movements in response set the scene for higher-level tasks. This book provides a comprehensive overview of this challenging research field, closing the loop between perception and action, and between human studies and robotics. The book is organized in three main parts. The first part focuses on human motion perception, with contributions analyzing the neural substrates of human action understanding, how perception is influenced by motor control, and how it develops over time and is exploited in social contexts.
The second part considers motion perception from the computational perspective, surveying cutting-edge solutions from the Computer Vision and Machine Learning research fields that address higher-level perceptual tasks. Finally, the third part takes into account the implications for robotics, with chapters on how motor control is achieved in the latest generation of artificial agents and how such technologies have been exploited to favor human-robot interaction. This book considers the complete human-robot cycle, from an examination of how humans perceive motion and act in the world, to models for motion perception and control in artificial agents. In this respect, the book provides insights into the perception and action loop in humans and machines, joining together aspects that are often addressed in independent investigations. As a consequence, this book positions itself at the intersection of such different disciplines as Robotics, Neuroscience, Cognitive Science, Psychology, Computer Vision, and Machine Learning. By bridging these research domains, the book offers a common reference point for researchers interested in human motion for different applications and from different standpoints, spanning Neuroscience, Human Motor Control, Robotics, Human-Robot Interaction, Computer Vision and Machine Learning. The chapter 'The Importance of the Affective Component of Movement in Action Understanding' is available open access under a CC BY 4.0 license.
PPN: 1726027597
Package identifier: ZDB-2-SCS | ZDB-2-SEB | ZDB-2-SXCS
No physical items for this record
