Show simple item record

dc.contributor.author	Kırtay, M.
dc.contributor.author	Öztop, Erhan
dc.contributor.author	Kuhlen, A. K.
dc.contributor.author	Asada, M.
dc.contributor.author	Hafner, V. V.
dc.date.accessioned	2023-08-11T10:46:33Z
dc.date.available	2023-08-11T10:46:33Z
dc.date.issued	2022
dc.identifier.isbn	978-166541311-4
dc.identifier.uri	http://hdl.handle.net/10679/8635
dc.identifier.uri	https://ieeexplore.ieee.org/document/9962212
dc.description.abstract	This study presents a robot trust model based on cognitive load that uses multimodal cues in a learning setting to assess the trustworthiness of heterogeneous interaction partners. As a test-bed, we designed an interactive task in which a small humanoid robot, Nao, is asked to perform a sequential audio-visual pattern recall task while minimizing its cognitive load by receiving help from its interaction partner, either a robot, Pepper, or a human. The partner displayed one of three guiding strategies: reliable, unreliable, or random. The robot is equipped with two cognitive modules: a multimodal auto-associative memory and an internal reward module. The former represents the multimodal cognitive processing of the robot and allows a 'cognitive load' or 'cost' to be assigned to the processing that takes place, while the latter converts the cognitive processing cost into an internal reward signal that drives cost-based behavior learning. Here, the robot asks for help from its interaction partner when its action leads to a high cognitive load. The robot then receives an action suggestion from the partner and follows it. After performing interactive experiments with each partner, the robot uses the cognitive load incurred during the interaction to assess the trustworthiness of the partners, i.e., it associates high trustworthiness with low cognitive load. We then give the robot a free choice to select the trustworthy interaction partner for the next task. Our results show that, overall, the robot selects partners with reliable guiding strategies. Moreover, the robot's ability to identify a trustworthy partner was unaffected by whether the partner was a human or a robot.	en_US
dc.description.sponsorship	Deutsche Forschungsgemeinschaft ; Osaka University
dc.language.iso	eng	en_US
dc.publisher	IEEE	en_US
dc.relation.ispartof	2022 IEEE International Conference on Development and Learning (ICDL)
dc.rights	restrictedAccess
dc.title	Forming robot trust in heterogeneous agents during a multimodal interactive game	en_US
dc.type	Conference paper	en_US
dc.publicationstatus	Published	en_US
dc.contributor.department	Özyeğin University
dc.contributor.authorID	(ORCID 0000-0002-3051-6038 & YÖK ID 45227) Öztop, Erhan
dc.contributor.ozuauthor	Öztop, Erhan
dc.identifier.startpage	307	en_US
dc.identifier.endpage	313	en_US
dc.identifier.doi	10.1109/ICDL53763.2022.9962212	en_US
dc.subject.keywords	Heterogeneous interaction	en_US
dc.subject.keywords	Internal reward	en_US
dc.subject.keywords	Multimodal integration	en_US
dc.subject.keywords	Trust	en_US
dc.identifier.scopus	SCOPUS:2-s2.0-85138728513
dc.relation.publicationcategory	Conference Paper - International - Institutional Academic Staff
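
Note on the mechanism described in the abstract: the robot's cognitive processing cost is converted into an internal reward signal, and a partner's trustworthiness is associated with the low cognitive load accumulated while following that partner's suggestions. The Python sketch below illustrates only this bookkeeping step; it is not the authors' implementation, and the class and function names, the linear cost-to-reward mapping, and the example load values are assumptions made for illustration.

# Minimal sketch (assumed, not from the paper): internal reward from cognitive load,
# and partner trust ranking based on the average load observed per partner.
from collections import defaultdict
from statistics import mean


def internal_reward(cognitive_load: float) -> float:
    """Map a cognitive processing cost to an internal reward (higher load -> lower reward)."""
    return -cognitive_load


class TrustTracker:
    """Accumulate per-partner cognitive loads and rank partners by inferred trust."""

    def __init__(self) -> None:
        self.loads = defaultdict(list)  # partner name -> list of observed cognitive loads

    def record(self, partner: str, cognitive_load: float) -> None:
        self.loads[partner].append(cognitive_load)

    def trust(self, partner: str) -> float:
        # Trust modeled as mean internal reward, i.e. low average load implies high trust.
        return mean(internal_reward(load) for load in self.loads[partner])

    def most_trusted(self) -> str:
        return max(self.loads, key=self.trust)


if __name__ == "__main__":
    tracker = TrustTracker()
    # Illustrative numbers only: a reliable partner tends to keep the robot's load low.
    for load in (0.2, 0.3, 0.25):
        tracker.record("reliable_partner", load)
    for load in (0.7, 0.9, 0.8):
        tracker.record("random_partner", load)
    print(tracker.most_trusted())  # -> reliable_partner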


Files in this item


There are no files associated with this item.

This item appears in the following Collection(s)
