Imagine talking to a computer the same way we talk to other humans! The Robotics & Avatars Lab investigates interaction between humans and intelligent computational agents by monitoring human-human communication, both verbal (language) and non-verbal (facial expressions, gestures, eye gaze), and using these findings to develop artificial intelligence algorithms that imitate human-human communication.
These technologies allow us to hold conversations with computers embodied as robots or virtual humans that walk up to us, make eye contact, use facial expressions and start a conversation. The Robotics & Avatars Lab addresses questions such as: “What makes human communication natural?”, “Should an embodied agent look like a fellow human?”, “How do people respond to these humanlike agents?” and “Can robots and avatars learn through conversations?”
These robots and avatars can be applied in journalism (for instance as a virtual personal newsreader), in healthcare (as caredroids in nursing homes, where they serve as conversational partners), as intelligent training systems (in maintenance, healthcare or aerospace) or as intelligent tutoring systems (in a range of educational settings).