TNA project : Open-Domain Conversations with Real and Virtual Robots



Acronym : 86-Open-Domain Conversations with Real and Virtual Robots-Wilcock

Project Lead : Graham Wilcock From : University of Helsinki

Dates : from 1st December 2013 to 14th December 2013

Description :



Motivation and objectives :
This project is a sister project of "Constructive Conversations - Modelling Interaction Space, Gesturing, and Gazing in natural interactive encounters with humans and robots", whose objectives this project also pursues. In addition, this new project adds a further objective: to develop virtual-reality-based methods for interacting with remote robots using the 3D Internet. One motivation for this new objective is that robots are currently expensive resources, typically available only to researchers in technology faculties, yet research on human-robot interaction (HRI) urgently needs contributions from researchers in the humanities, social sciences, and behavioural sciences. Access to robot interactions through virtual reality will make it possible for a much wider range of researchers to participate in HRI, including smaller projects that could not economically justify acquiring their own robots.

The existing motivation and objectives of the sister project are as follows. Natural language is used to exchange information, and the effective transfer of information is often taken as the main criterion for the success of an interaction. Especially in the context of automatic services, the delivery of reliable and relevant information is an important goal for the design of such systems. Recently, however, one of the challenges for designing interactive systems has been identified as relating to the social aspects of interaction: how to engage the partner in the interaction and keep their interest up, so that the speaker can either deliver the message they intend to deliver, or provide rapport and affection so as to create a mutual bond and an understanding relationship.

Our interest in studying conversational engagement goes back to intelligent systems and interaction technology, where engagement describes the user's willingness to take part, and their involvement, in the interaction with the automatic interactive system. If it is possible to measure the interlocutors' engagement level, it is easier to adjust the system's conversational strategies accordingly. One way to measure engagement is to study the interlocutors' paralinguistic signalling of their interest, level of understanding, and focus of attention. For instance, with motion tracker technology, issues such as the participants' distance from each other can be measured; combined with research questions about the participants' engagement in the interaction, this novel technology can provide important and useful objective information about the interaction space and the participants' control over the space around them. This research will not only enhance our understanding of comfortable communication between humans or between humans and virtual agents, but will also allow us to build models for the automatic recognition and management of issues related to appropriate and smooth communication.
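
As an illustration of the kind of objective measure mentioned above, the following minimal Python sketch computes the interpersonal distance between two participants from tracked 3D head positions. The data format and values are hypothetical; they stand in for whatever skeleton stream the motion tracker (for example a Kinect) actually provides.

    import math

    def interpersonal_distance(head_a, head_b):
        # Euclidean distance between two tracked head positions (metres).
        # head_a, head_b: (x, y, z) tuples in the tracker's coordinate frame.
        return math.dist(head_a, head_b)

    # Hypothetical frames from a motion tracker: one (x, y, z) head
    # position per participant per frame, in metres.
    frames = [
        ((0.10, 1.60, 2.00), (1.20, 1.55, 2.10)),
        ((0.12, 1.60, 1.95), (1.10, 1.55, 2.05)),
    ]

    for t, (a, b) in enumerate(frames):
        print(f"frame {t}: distance = {interpersonal_distance(a, b):.2f} m")

A per-frame distance series like this could then be related to annotations of engagement, for example to study how participants manage the space between them.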

Teams :
The 3I (Intelligent Interactive Informatics) Group is a research group in the Institute of Behavioural Sciences at the University of Helsinki. Our research focuses on intelligent interaction and information systems, and our areas of expertise include multimodal interaction, interaction management, paralinguistic communication, corpus collection and annotation, machine learning techniques, clustering and classification of linguistic data, and learning and interaction. We currently work on multimodal corpus analysis (top-down human annotations and observations as well as bottom-up signal analysis) of naturally flowing human-human conversations and first-encounter interactions, focussing especially on eye-gaze, face, and hand gestures and their use in signalling turn-taking and feedback, in order to develop models for interaction techniques and strategies.

Dates :
starting date : 01 December, 2013
ending date : 14 December, 2013

Facilities descriptions :
http://visionair-browser.g-scop.grenoble-inp.fr/visionair/Browser/Catalogs/3DICC.HU.html

Recordings & Results :
The aim of the project is to make progress towards spoken dialogue interaction with real and virtual robots using virtual reality and the 3D Internet. In previous research (WikiTalk) we developed methods for open-domain spoken conversations with a real robot (NAO), using Wikipedia as a knowledge source. Real robots are expensive and available only in engineering labs, but research on human-robot interaction (HRI) needs contributions from researchers in the humanities and behavioural sciences. Access to robot interactions through virtual reality will enable a wider range of researchers to participate in HRI, including smaller projects that could not justify acquiring their own robots.
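
WikiTalk's actual implementation is not reproduced here; the following minimal Python sketch only illustrates the general idea of using Wikipedia as an open-domain knowledge source, fetching a short extract that a spoken dialogue system could read out. It uses Wikipedia's public REST summary endpoint; the example topic, User-Agent string, and error handling are illustrative assumptions.

    import json
    import urllib.error
    import urllib.parse
    import urllib.request

    SUMMARY_URL = "https://{lang}.wikipedia.org/api/rest_v1/page/summary/{title}"

    def wikipedia_extract(topic, lang="en"):
        # Fetch a short plain-text extract for a topic from Wikipedia's
        # public REST summary endpoint; return None on any failure.
        title = urllib.parse.quote(topic.replace(" ", "_"))
        request = urllib.request.Request(
            SUMMARY_URL.format(lang=lang, title=title),
            headers={"User-Agent": "wikitalk-sketch/0.1 (illustrative example)"},
        )
        try:
            with urllib.request.urlopen(request, timeout=10) as response:
                data = json.load(response)
        except (urllib.error.URLError, json.JSONDecodeError):
            return None
        return data.get("extract")

    if __name__ == "__main__":
        # Illustrative topic; a dialogue manager would pick the topic from
        # the user's utterance or from links in the current article.
        print(wikipedia_extract("Nao (robot)"))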

Conclusions :
To learn about VirCA, I installed the new version (released 12.11.2013). I compiled prototype code for a Wikipedia Browser and added the component to the system. I investigated speech recognition with a virtual robot by combining a Voice Recognizer component with a virtual KUKA industrial robot CyberDevice (all of these components were created by MTA SZTAKI). To work on a virtual humanoid robot, I installed a prototype virtual NAO robot CyberDevice created by MTA SZTAKI, and was able to display the virtual NAO robot in the VirCA 3D virtual space on my laptop screen. We discussed how to enable the virtual humanoid robot to talk, to listen, and to perform gestures under the control of dialogue manager software; these objectives can be achieved with further development. During the visit I saw demonstrations of remote interactions using the 3D Internet, including a virtual NAO robot performing gestures controlled by motion capture with Kinect. I presented my own research ideas on open-domain listening in WikiTalk at a one-day workshop on Future Internet Science and Engineering held at MTA SZTAKI.
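
The kind of control loop discussed above (a dialogue manager driving a robot's talking, listening, and gesturing) could look roughly like the following Python sketch. The Robot and Recognizer classes are hypothetical placeholders, not the actual VirCA, CyberDevice, or NAO APIs, and the gesture names and phrasing are invented for illustration.

    class Robot:
        # Placeholder for a real or virtual robot interface.
        def say(self, text):
            print(f"[robot says] {text}")
        def gesture(self, name):
            print(f"[robot gestures] {name}")

    class Recognizer:
        # Placeholder for a speech recognizer component (keyboard input here).
        def listen(self):
            return input("user> ").strip().lower()

    def dialogue_loop(robot, recognizer, knowledge_source):
        # Alternate listening and speaking, pairing gestures with responses.
        robot.say("Hello! What topic would you like to hear about?")
        robot.gesture("wave")
        while True:
            utterance = recognizer.listen()
            if utterance in ("", "stop", "goodbye"):
                robot.say("Goodbye!")
                robot.gesture("bow")
                break
            extract = knowledge_source(utterance)
            robot.gesture("beat")  # accompany speech with a beat gesture
            robot.say(extract or f"Sorry, I found nothing about {utterance}.")

    if __name__ == "__main__":
        # knowledge_source could be the wikipedia_extract() sketch shown earlier.
        dialogue_loop(Robot(), Recognizer(),
                      lambda topic: f"Here is something about {topic}.")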







Project funded by the European Commission under grant agreement 262044