Publication:
Affordance-based altruistic robotic architecture for human–robot collaboration

dc.contributor.author: Imre, M.
dc.contributor.author: Öztop, Erhan
dc.contributor.author: Nagai, Y.
dc.contributor.author: Ugur, E.
dc.contributor.department: Computer Science
dc.contributor.ozuauthor: ÖZTOP, Erhan
dc.date.accessioned: 2020-07-06T09:44:31Z
dc.date.available: 2020-07-06T09:44:31Z
dc.date.issued: 2019-08
dc.description.abstract: This article proposes a computational model for altruistic behavior, shows its implementation on a physical robot, and presents the results of human–robot interaction experiments conducted with the implemented system. Inspired by the sensorimotor mechanisms of the primate brain, object affordances are utilized for both intention estimation and action execution, in particular to generate altruistic behavior. At the core of the model is the notion that sensorimotor systems developed for movement generation can also process the visual stimuli generated by the actions of others, infer the goals behind them, and take the actions needed to help achieve those goals, potentially leading to the emergence of altruistic behavior. We therefore argue that altruistic behavior is not necessarily a consequence of deliberate cognitive processing but may emerge through basic sensorimotor processes such as error minimization, that is, minimizing the difference between observed and expected outcomes. In the model, affordances also play a key role by constraining the set of actions an observed actor might be engaged in, enabling fast and accurate intention inference. The model components are implemented on an upper-body humanoid robot, and a set of experiments validates their workings, such as affordance extraction and task execution. Significantly, extensive experiments with naive subjects assess how human partners interact with the robot running our altruistic model. Our results indicate that the proposed computational model can explain emergent altruistic behavior in reference to its biological counterpart and, moreover, that human partners engage with and exploit this behavior when it is implemented on an anthropomorphic robot.
dc.description.sponsorship: European Union's Horizon 2020 research and innovation program ; JST CREST "Cognitive Mirroring: Assisting people with developmental disorders by means of self-understanding and social sharing of cognitive processes" ; Bogazici Research Fund (BAP) project IMAGINE-COG++
dc.description.version: Publisher version
dc.identifier.doi: 10.1177/1059712318824697
dc.identifier.endpage: 241
dc.identifier.issn: 1059-7123
dc.identifier.issue: 4
dc.identifier.scopus: 2-s2.0-85061196719
dc.identifier.startpage: 223
dc.identifier.uri: http://hdl.handle.net/10679/6705
dc.identifier.uri: https://doi.org/10.1177/1059712318824697
dc.identifier.volume: 27
dc.identifier.wos: 000475457000001
dc.language.iso: eng
dc.peerreviewed: yes
dc.publicationstatus: Published
dc.publisher: Sage
dc.relation.ispartof: Adaptive Behavior
dc.relation.publicationcategory: International Refereed Journal
dc.rights: openAccess
dc.subject.keywords: Altruistic behavior
dc.subject.keywords: Computational modeling
dc.subject.keywords: Brain-inspired robotics
dc.subject.keywords: Affordances
dc.subject.keywords: Human-robot interaction
dc.subject.keywords: Goal inference
dc.title: Affordance-based altruistic robotic architecture for human–robot collaboration
dc.type: article
dspace.entity.type: Publication
relation.isOrgUnitOfPublication: 85662e71-2a61-492a-b407-df4d38ab90d7
relation.isOrgUnitOfPublication.latestForDiscovery: 85662e71-2a61-492a-b407-df4d38ab90d7
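The abstract describes a two-step mechanism: affordances constrain the set of actions an observed actor might be performing (enabling fast intention inference), and helping behavior emerges from minimizing the error between the inferred goal state and the observed state. The following is only an illustrative sketch of that idea, not the paper's implementation; the affordance table, object names, and scoring are hypothetical placeholders.

```python
# Illustrative sketch (not the paper's implementation): affordance-constrained
# intention inference followed by a helping action that reduces the prediction
# error between the inferred goal state and the current state.

# Hypothetical affordance table: object -> {afforded action: predicted effect}
AFFORDANCES = {
    "cup": {"grasp": "cup_in_hand", "push": "cup_moved"},
    "box": {"open": "box_open", "push": "box_moved"},
}

def infer_intention(observed_object, observed_motion):
    """Constrain candidate goals to actions the object affords, then pick
    the candidate most consistent with the observed motion."""
    candidates = AFFORDANCES[observed_object]
    # Toy consistency score: 1 if the motion label matches the action, else 0.
    scored = {act: (1.0 if act == observed_motion else 0.0) for act in candidates}
    best_action = max(scored, key=scored.get)
    return candidates[best_action]  # inferred goal = predicted effect

def altruistic_action(inferred_goal, current_state):
    """Help only when a prediction error remains, i.e. the inferred goal
    state is not yet part of the observed world state."""
    if inferred_goal not in current_state:
        return f"help_achieve({inferred_goal})"
    return "no_action"
```

For example, observing a partner reaching to open a box yields `infer_intention("box", "open") == "box_open"`, and since `"box_open"` is absent from the current state, the robot selects a helping action; once the goal state is observed, the prediction error vanishes and no action is taken.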

Files

Original bundle

Name: Affordance-based altruistic robotic architecture for human–robot collaboration.pdf
Size: 4.1 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 1.45 KB
Format: Item-specific license agreed upon to submission

Collections