Erica: Man Made (Guardian video documentary)

Ishiguro-sensei and I were recently featured in a Guardian video documentary about ERICA.

The Guardian: Documentaries - Erica: Man Made


Robot's Delight - A Lyrical Exposition on Learning by Imitation from Human-Human Interaction

 Best Video Award, HRI 2017

Our latest video won the Best Video Award at the 2017 ACM/IEEE International Conference on Human-Robot Interaction in Vienna! In the form of a musical tribute to The Sugarhill Gang’s 1979 hit “Rapper’s Delight”, this video features Robovie and ERICA rapping in English to outline our recent research into learning-by-imitation of human-human conversational interaction.

In one study, we asked participants to role-play interactions between a shopkeeper and customer in a camera shop. We captured their motion and speech data with sensors, and we applied unsupervised learning techniques to reproduce the shopkeeper's behaviors with Robovie.
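The papers below describe the full method; purely as a toy illustration of the general idea (not our actual pipeline), the sketch below clusters recorded customer states with a plain k-means and replays the most common shopkeeper action for each cluster. All data, names, and parameters here are invented for the example.

```python
import numpy as np
from collections import Counter

def kmeans(X, init_idx, iters=20):
    """Plain k-means, standing in for the unsupervised clustering step."""
    centers = X[list(init_idx)].astype(float)
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0)
                            for j in range(len(centers))])
    return centers, labels

# Toy "sensor" data: each row is a customer state (x, y in the shop),
# paired with the shopkeeper action a human performed in that situation.
states = np.array([[0.1, 0.2], [0.2, 0.1], [5.0, 5.1], [5.2, 4.9]])
actions = ["greet at door", "greet at door",
           "explain camera at counter", "explain camera at counter"]

# Initialize with two well-separated seed points for this toy data.
centers, labels = kmeans(states, init_idx=[0, 2])

# The most common action in each cluster becomes the robot's behavior.
cluster_action = {
    j: Counter(a for a, l in zip(actions, labels) if l == j).most_common(1)[0][0]
    for j in set(labels.tolist())
}

def respond(customer_state):
    """Pick the learned behavior for the nearest customer-state cluster."""
    j = int(((centers - customer_state) ** 2).sum(axis=1).argmin())
    return cluster_action[j]
```

For example, `respond(np.array([0.0, 0.0]))` returns the greeting behavior learned from the frames recorded near the door.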

In a second study, we applied this technique to a stationary android, ERICA. In this case, since spatial cues were unavailable, we needed to develop a new technique for identifying topic patterns based on modeling the interaction structure.

The Extended Abstract can be found here:

Dylan F. Glas, Malcolm Doering, Phoebe Liu, Takayuki Kanda, and Hiroshi Ishiguro, Robot's Delight - A Lyrical Exposition on Learning by Imitation from Human-Human Interaction, in Companion Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2017), p. 408, Vienna, Austria, March 2017. (Video submission)
Extended Abstract pdf

The related research can be found in the following journal paper, as well as other work currently under review:

Phoebe Liu, Dylan F. Glas, Takayuki Kanda, and Hiroshi Ishiguro, Data-Driven HRI: Learning Social Behaviors by Example from Human-Human Interaction, in IEEE Transactions on Robotics, Vol. 32, No. 4, pp. 988-1008, 2016.
DOI: 10.1109/TRO.2016.2588880
Authors' Preprint pdf


Erica Demonstration at Miraikan, August 2015

Photo source: Kyodo News

On August 3, 2015, we unveiled our new android, Erica! Her name stands for "ERATO Intelligent Conversational Android". At a press conference and an open symposium at "Miraikan", Japan's national science museum in Tokyo, members of the press and the public came on stage, where Erica answered their questions about her interests, hobbies, dreams, and so on. Prof. Hiroshi Ishiguro of Osaka University and Prof. Tatsuya Kawahara of Kyoto University presented the vision and goals of the ERATO Ishiguro Symbiotic Human-Robot Interaction Project, and the other core members of the ERATO team presented various elements of the android system and their research objectives in this five-year project.

For this demonstration, Erica used our "ATRacker" pedestrian tracking system, based on laser range finders, in conjunction with a microphone array for sound source localization, in order to identify who was talking to her at any given time. Although many androids are teleoperated, Erica was developed to be fully autonomous; the only human input was selecting among her interaction modes, e.g. "Q+A mode", "Listening mode", and "Idle mode", based on the schedule of events in the symposium.
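The fusion step can be illustrated with a small sketch. This is not the ATRacker code, just a hypothetical example of the matching idea: compare the microphone array's sound-source azimuth against the bearing of each laser-tracked person, and pick the closest match within a tolerance.

```python
import math

def identify_speaker(tracked_people, sound_azimuth_deg, tolerance_deg=15.0):
    """Return the id of the tracked person whose bearing from the robot
    best matches the sound-source direction, or None if nobody is close.

    tracked_people: dict of id -> (x, y) in the robot's frame, with the
    x-axis pointing straight ahead (azimuth 0 degrees).
    """
    best_id, best_err = None, tolerance_deg
    for pid, (x, y) in tracked_people.items():
        bearing = math.degrees(math.atan2(y, x))
        # Angular error, wrapped into [-180, 180) so 350 vs 10 counts as 20.
        err = abs((bearing - sound_azimuth_deg + 180) % 360 - 180)
        if err < best_err:
            best_id, best_err = pid, err
    return best_id
```

With two tracked people at bearings of roughly 3 and -45 degrees and a sound source at -45 degrees, the function picks the second person; it returns None when no bearing falls within the tolerance.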

The core android control software, including gaze control, execution and blending of explicit gestures and facial expressions, and various nonverbal behaviors such as blinking, breathing, and backchannel nodding, was developed by our team at ATR. The dialogue management framework was developed by a team from Kyoto University using Julius for speech recognition. Speech synthesis was performed using a custom voice developed for Hoya's VoiceText software.

Erica is one of a set of three sister robots, one each at Osaka University, Kyoto University, and ATR. In the future, we plan to develop Erica's personality and capabilities further, creating more engaging, humanlike, and expressive interactions, and to lay a framework for developing interactive android applications for a variety of scenarios.


Miraikan ASIMO Tour Guide Demonstration, 2013

In October 2013, we demonstrated the results of our collaboration with Honda. Using our ambient intelligence systems and several new algorithms developed in the project, we deployed Honda's ASIMO robot as an interactive tour guide in Tokyo's Miraikan science museum.

In this demonstration, the robot relied on our 3D human tracking system to precisely track where and how a visitor was walking, and this data was used to infer the visitor's level of interest in different exhibits and determine the optimal standing position and communication strategy for the robot. For repeat visitors, the robot used information about previous visits to enable richer autonomous spoken interactions.
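As a rough illustration of how trajectory data can signal interest (a made-up heuristic, not the actual inference used in the demonstration): one simple cue is the fraction of time a visitor spends both close to an exhibit and moving slowly.

```python
import math

def interest_score(trajectory, exhibit_pos, near_radius=2.0,
                   slow_speed=0.5, dt=0.1):
    """Crude interest estimate: the fraction of time steps spent within
    `near_radius` metres of the exhibit while walking slower than
    `slow_speed` m/s. `trajectory` is a list of (x, y) positions
    sampled every `dt` seconds."""
    if len(trajectory) < 2:
        return 0.0
    hits = 0
    for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:]):
        speed = math.hypot(x1 - x0, y1 - y0) / dt
        near = math.hypot(x1 - exhibit_pos[0], y1 - exhibit_pos[1]) < near_radius
        if near and speed < slow_speed:
            hits += 1
    return hits / (len(trajectory) - 1)
```

A visitor who lingers at the exhibit scores 1.0; one walking briskly past at a distance scores 0.0. A real system would combine cues like this with gaze direction and visit history.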

Note: Our work should not be confused with an unrelated ASIMO demonstration conducted at Miraikan in July 2013, which received some negative reviews when the robot mistook people holding up their smartphones for people raising their hands to ask a question.

Thank you very much, Mr. Roboto | World Future Society, 2011

Ishiguro-sensei and I were featured in an article in THE FUTURIST Magazine, published by the World Future Society.

The article covers several projects involving Robovie, Geminoid, and our Ambient Intelligence work, describes some of the motivations behind our research, and offers an interesting perspective on it.






APiTA Town Keihanna Shopping Center Wheelchair Demonstration, 2011

On March 30, 2011, we demonstrated the latest developments in the Ubiquitous Network Robot project, this time featuring a robotic wheelchair in a shopping mall.

This demo presented new applications of networked robot systems, demonstrating autonomous planning using ubiquitous sensor networks, location-based services, and integration with remote teleoperators and mobile devices over the internet to ensure the safety and ease of use of the robotic wheelchair.

The system successfully enabled the customer to move freely throughout the shopping mall by herself, giving a new level of independence to someone who would typically depend on a caretaker to accompany her whenever she went out, if she went out at all.

Nara Tourist Information Center, 2010

In December 2010, we placed a Robovie-R3 robot in the Nara Tourist Information Center, near JR Nara Station, to demonstrate a prototype system enabling elderly operators to control a conversational robot over the internet. The system aims to enable people who may find it hard to work outside the home, such as retirees or parents raising young children, to easily engage in part-time work using teleoperated robots.

The teleoperators in the demonstration were members of Suzaku, an association of retirees who act as volunteer tour guides in the Nara area. One operator was located in Nara and the other at ATR in Kyoto, and they took turns controlling the robot to answer questions, give directions, and tell entertaining stories about famous sights in Nara, such as the Great Buddha at Todaiji temple and the deer in Nara Park.

APiTA Town Keihanna Shopping Center, 2009-2010

In 2009-2010, we placed robots and environmental sensors in the APiTA Town shopping mall, with the target application of helping elderly people with their shopping. This field demonstration showcased several new technologies, such as new portable sensors for our laser-based human tracking system and smartphone integration with robot services.

Universal CityWalk Osaka, 2008-2009

In 2008-2009, we performed a series of field studies in which four robots patrolled a part of the shopping area in Universal CityWalk Osaka, greeting customers, recommending shops, and giving directions. During these studies, we demonstrated our multi-robot teleoperation systems, our human tracking and motion primitive analysis systems, a global service allocation and path planning system, and several other technologies.


Robot Services

Integration with environmental sensor networks and primitive analysis enables us to target robot services to the people who appear most likely to need them.

Network Robot System Demonstration

Final demonstration of the Network Robot project, including simultaneous teleoperation of four robots, centralized dynamic service allocation and path planning, robot-robot collaboration, and much more.

Robovie and ASIMO - "Robot Cafe"

Combining the strengths of the two robots, Robovie and Honda's ASIMO work together in a cafe demonstration, where Robovie chats with the customers and takes orders, and ASIMO walks around delivering drinks. The collaboration was enabled by the Network Robot Platform, which mediated messages between the two robots.

Robovie and DustCart Collaboration

DustCart, the autonomous mobile trash-can robot developed as a part of the DustBot project by SSSA, came to Japan for a collaboration with Robovie. The two robots demonstrated a luggage-carrying service scenario mediated by the Network Robot Platform.

RoboPal, 2007

This was the first robot I worked on at ATR. It was designed to be a daily-life companion for elderly people. I developed the entire robot control architecture, including map-based localization, path planning, a safety system, a scripting language for developing applications, and a graphical interface for teleoperation and system monitoring.