
Tilburg University

Playful exploration of a robot’s gesture production and recognition abilities

de Wit, Jan; Willemsen, Bram; de Haas, Mirjam; Wolfert, Pieter; Vogt, Paul; Krahmer, Emiel

Published in:

Workshop on Gesture & Technology, Warwick 2018

Publication date:

2018

Document Version

Publisher's PDF, also known as Version of Record

Link to publication in Tilburg University Research Portal

Citation for published version (APA):

de Wit, J., Willemsen, B., de Haas, M., Wolfert, P., Vogt, P., & Krahmer, E. (2018). Playful exploration of a robot’s gesture production and recognition abilities. In Workshop on Gesture & Technology, Warwick 2018.

General rights

Copyright and moral rights for the publications made accessible in the public portal are retained by the authors and/or other copyright owners, and it is a condition of accessing publications that users recognise and abide by the legal requirements associated with these rights.

• Users may download and print one copy of any publication from the public portal for the purpose of private study or research.

• You may not further distribute the material or use it for any profit-making activity or commercial gain

• You may freely distribute the URL identifying the publication in the public portal

Take down policy

If you believe that this document breaches copyright, please contact us providing details, and we will remove access to the work immediately and investigate your claim.


Playful exploration of a robot’s gesture production and recognition abilities

Jan de Wit*, Bram Willemsen*, Mirjam de Haas*, Pieter Wolfert*, Paul Vogt and Emiel Krahmer

Tilburg center for Cognition and Communication (TiCC), Tilburg University, Tilburg, the Netherlands

There is an increasing interest in the use of humanoid robots as a platform for presenting (educational) content. The robot’s ability to communicate non-verbally can increase understanding between human and robot, and can help to maintain an engaging interaction. For example, in the context of the L2TOR project [1], we have seen that a robot performing iconic gestures when teaching children a second language helps long-term memorization of new words [3].

To gather and make publicly available a dataset of Kinect recordings of a diverse group of participants performing iconic gestures, and to learn more about the comprehensibility of these recorded gestures when translated to a humanoid robot, we propose an exploratory study in which participants play ten rounds of a gesture guessing game with a NAO robot. First, the participant performs an iconic gesture depicting an object (from a predetermined set). Then, the robot performs a gesture (that it has “learned” from the Kinect recording of a previous participant), and the participant has to guess which object it depicts. The set-up of the experiment is shown in Figure 1.

The system consists of several components, which are outlined in Figure 2. For the clustering and recognition steps, we attempt to extract the gist (essence) of a gesture, inspired by [2]. Because participants effectively rate the robot’s gestures by guessing, we expect to discover which of the recorded gestures remain comprehensible when performed by the robot, taking into account its physical limitations.

The proposed study will take place at the NEMO science museum in Amsterdam.
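As an illustration, one session of the guessing game described above could be sketched as follows. This is a hypothetical outline only: the `robot`, `kinect`, and `gesture_bank` interfaces are placeholder assumptions, not the actual study software running on the NAO and Kinect.

```python
import random

# Illustrative target objects only; the study uses a predetermined set.
OBJECTS = ["umbrella", "scissors", "guitar"]

def play_session(robot, kinect, gesture_bank, rounds=10):
    """Run `rounds` of the game: the participant gestures, then guesses."""
    score = 0
    for _ in range(rounds):
        # 1. Participant depicts a target object; the Kinect records it,
        #    growing the publicly available gesture dataset.
        target = random.choice(OBJECTS)
        recording = kinect.record_gesture(prompt=target)
        gesture_bank.store(target, recording)

        # 2. The robot replays a gesture "learned" from the Kinect
        #    recording of a previous participant.
        label, motion = gesture_bank.sample_previous()
        robot.perform(motion)

        # 3. The participant guesses which object the robot depicted;
        #    correct guesses indicate the gesture survived translation
        #    to the robot's body.
        guess = robot.ask_guess(options=OBJECTS)
        score += (guess == label)
    return score
```

The per-round score directly operationalizes comprehensibility: gestures that are guessed correctly despite the robot's physical limitations are the ones that remain iconic after translation.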


Figure 2: Proposed system design for generating and recognizing gestures.
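For illustration only, a heavily simplified version of the recognition step could look like the sketch below. The fixed-length resampling and nearest-neighbor matching here are our assumptions for the sake of a concrete example; they are not the gist-extraction method of [2] nor the system's actual implementation.

```python
# Sketch: reduce a recorded joint trajectory to a fixed-length "gist"
# by resampling it to a few keyframes, then recognize a new gesture by
# nearest-neighbor distance to stored, labelled gists.

def gist(trajectory, keyframes=8):
    """Resample a list of (x, y, z) joint positions to `keyframes` points."""
    n = len(trajectory)
    idx = [round(i * (n - 1) / (keyframes - 1)) for i in range(keyframes)]
    return [trajectory[i] for i in idx]

def distance(g1, g2):
    """Summed Euclidean distance between two equally long gists."""
    return sum(
        ((x1 - x2) ** 2 + (y1 - y2) ** 2 + (z1 - z2) ** 2) ** 0.5
        for (x1, y1, z1), (x2, y2, z2) in zip(g1, g2)
    )

def recognize(sample, labelled_gists):
    """Return the label whose stored gist is closest to the sample."""
    g = gist(sample)
    return min(labelled_gists, key=lambda label: distance(g, labelled_gists[label]))
```

A real pipeline would work with full Kinect skeletons (many joints per frame) and would need to handle differences in tempo and body size, but the same reduce-then-compare structure applies.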

References:

1. Tony Belpaeme, James Kennedy, Paul Baxter, Paul Vogt, Emiel J. Krahmer, Stefan Kopp, Kirsten Bergmann, Paul Leseman, Aylin C. Küntay, Tilbe Göksun, Amit K. Pandey, Rodolphe Gelin, Petra Koudelkova, and Tommy Deblieck. 2015. L2TOR - Second Language Tutoring using Social Robots. In 1st Int. Workshop on Educational Robotics at the Int. Conf. on Social Robotics. Retrieved June 16, 2017 from https://ilk.uvt.nl/~pvogt/publications/wonder2015.pdf

2. Maria Eugenia Cabrera and Juan Pablo Wachs. 2017. A Human-Centered Approach to One-Shot Gesture Learning. Frontiers in Robotics and AI 4: 8. https://doi.org/10.3389/frobt.2017.00008

3. Jan de Wit, Thorsten Schodde, Bram Willemsen, Kirsten Bergmann, Mirjam de Haas, Stefan Kopp, Emiel Krahmer, and Paul Vogt. 2018. The Effect of a Robot’s Gestures and Adaptive Tutoring on Children’s Acquisition of Second Language Vocabularies. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction (HRI '18).
