2020-02981 - The role of rapport in human-conversational agent interaction: Modeling conversation to improve task performance in human-agent interaction

Contract type : Fixed-term contract

Renewable contract : Yes

Level of qualifications required : Graduate degree or equivalent

Function : Temporary scientific engineer


The objective of this project is to build embodied conversational agents (also known as ECAs, virtual humans, chatbots, or multimodal dialogue systems) that can engage their users through language and nonverbal behavior, producing both social talk and task talk, where the social talk serves to improve task performance. To achieve this objective, we model human-human conversation, integrate the models into ECAs, and then evaluate their performance.

The engineer chosen for this project will work with a strong multidisciplinary team of doctoral students, postdocs, and other researchers to develop a state-of-the-art Embodied Conversational Agent system that can engage with people. It will be demonstrated on a large screen at Inria (and perhaps in other venues), and will also be used for human-computer interaction experiments on the web and on mobile devices.

Inria offers a supportive environment for engineers: they are located in a particular research group, but also benefit from the support of the entire group of engineers at Inria Paris who carry out research and development.


The engineer chosen for this project should have a broad range of skills at the intersection of Dialogue Systems, Cognitive Architectures, Embodied Conversational Agents, and Animated Agents. These skills should include several of the following:
  • Implementing modules for speech recognition, intention recognition, dialogue management, and natural language generation. 
  • Interfacing the dialogue system modules with competence modules such as intelligent tutors or recommendation systems.
  • Working with architectures that include nonverbal behavior recognition as well as speech recognition, and with architectures that include an animated agent implemented in Unity for nonverbal behavior generation. 
  • Integrating results from deep reinforcement learning on natural language and nonverbal behavior corpora into the dialogue system pipeline. 
  • Exploring the utility of different genres of architectures, such as Cognitive Architectures, with short- and long-term memory, and user models.
  • Developing and maintaining a multi-user version of the Embodied Conversational Agent that can run in a browser.
  • Developing and maintaining a multi-user version of the Embodied Conversational Agent that can run on a smart phone. 
For more information on the project, potential candidates should look at the SARA (Socially-Aware Robot Assistant) website at <http://articulab.hcii.cs.cmu.edu/projects/sara/> and read some of the publications associated with the project at <http://articulab.hcii.cs.cmu.edu/publications/>.
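As a rough illustration of the module pipeline mentioned above (intention recognition, dialogue management, natural language generation), here is a minimal sketch in Python. All names and behaviors here are hypothetical stand-ins, not the project's actual architecture: the real system would plug trained models into each stage rather than the keyword matching and template lookup used below.

```python
from dataclasses import dataclass


@dataclass
class DialogueState:
    """Minimal dialogue state: last recognized intent and turn count."""
    last_intent: str = "none"
    turns: int = 0


def recognize_intent(utterance: str) -> str:
    """Stub intent recognizer: keyword matching stands in for a trained model."""
    text = utterance.lower()
    if "hello" in text or "hi" in text:
        return "greet"
    if "recommend" in text:
        return "request_recommendation"
    return "other"


def manage_dialogue(state: DialogueState, intent: str) -> str:
    """Stub dialogue manager: maps an intent to a system action and updates state."""
    state.last_intent = intent
    state.turns += 1
    return {
        "greet": "greet_back",
        "request_recommendation": "offer_recommendation",
    }.get(intent, "clarify")


def generate_response(action: str) -> str:
    """Stub NLG: template lookup in place of a learned generator."""
    templates = {
        "greet_back": "Hello! How can I help you today?",
        "offer_recommendation": "I can suggest a few options for you.",
        "clarify": "Could you tell me a bit more about what you need?",
    }
    return templates[action]


def respond(state: DialogueState, utterance: str) -> str:
    """One turn through the pipeline: recognize -> manage -> generate."""
    intent = recognize_intent(utterance)
    action = manage_dialogue(state, intent)
    return generate_response(action)
```

In a full ECA, each stage would also consume and produce nonverbal-behavior annotations alongside the text, and a competence module (such as a recommender) would be consulted by the dialogue manager before an action is chosen.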



Technical Skills: Solid competence in deep learning applied to dialogue systems, advanced competence in programming in languages such as Python and C++, and experience with tools such as TensorFlow and PyTorch.

Languages: French, English

Relational Skills: Ability to work in a team and collaborate with others from different disciplines and backgrounds; ability to manage other team members.

Benefits package

  • Subsidized meals
  • Partial reimbursement of public transport costs
  • Leave: 7 weeks of annual leave + 10 extra days off due to RTT (statutory reduction in working hours) + possibility of exceptional leave (sick children, moving home, etc.)
  • Possibility of teleworking (after 6 months of employment) and flexible organization of working hours
  • Professional equipment available (videoconferencing, loan of computer equipment, etc.)
  • Social, cultural and sports events and activities
  • Access to vocational training