Targeting a Robotic Brain Capable of Thoughtful Communication

The Hagiwara Lab in the Department of Information and Computer Science at Keio University's Faculty of Science and Technology is working to realize a robotic brain that can carry on a conversation: one that understands images and words and can communicate thoughtfully with humans.

"Even now, significant progress is being made with robots, and tremendous advancements are being made in the control parts. However, we feel that R&D on the brain has lagged significantly. When we think about what types of functions the brain needs, the first thing we as humans do is visual information processing. In other words, the brain needs to be able to process what is seen. The next is the language information processing that we as humans carry out; by using language capabilities, humans can perform extremely advanced intellectual processing. However, even if a robotic brain can process what it sees and use words, it is still lacking one thing: feelings and emotions. Therefore, as a third pillar, we're conducting research on what is called Kansei Engineering, or affective information processing."

The Hagiwara Lab has adopted an approach of learning from the information processing of the human brain. The team is trying to construct a robotic brain built on three elements: visual information processing, language information processing, and affective information processing. An even more important point is the integration of these three elements.

"With regard to visual information processing, we're using neural networks to recognize objects through mechanisms based on experience and intuition, much as humans do directly, without relying on three-dimensional structures or complicated mathematical processing. In the conventional object recognition field, patterns from the recognized results are merely converted into symbols. By adding language processing to those recognized results, however, we can comprehensively utilize knowledge to obtain a better visual image. For example, once an object is recognized as a robot, knowledge such as 'a robot has a human form' or 'it has arms and legs' can also be used. Language information processing comes next, because the processing of language functions is becoming extremely important. The step after that is affective: for example, recognizing something as cute, not cute, mechanical, or having some other such characteristic. Humans naturally have this kind of emotional capability, but current robotics research is not exploring that direction much. Therefore, at our lab, we're conducting research that enables robots to understand what they see, to use language information processing to turn what they saw into knowledge, and then to comprehensively employ the perspective of human-like feelings and emotions as well."
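The three-pillar pipeline described above — a visual label, enriched with language-based knowledge, and tagged with an affective impression — can be illustrated with a minimal sketch. Everything here (the `KNOWLEDGE_BASE` and `AFFECT_LEXICON` dictionaries, the `interpret` function) is a hypothetical illustration of the idea, not the lab's actual system:

```python
# Hypothetical sketch of the three-pillar integration: a raw visual
# recognition label is enriched with language knowledge and an
# affective ("Kansei") impression. All names and data are assumptions.

# Toy knowledge base: language-level facts attached to a visual label.
KNOWLEDGE_BASE = {
    "robot": ["has a human form", "has arms and legs"],
    "cat": ["is an animal", "has four legs"],
}

# Toy affective lexicon: the impression a label evokes.
AFFECT_LEXICON = {
    "robot": "mechanical",
    "cat": "cute",
}

def interpret(visual_label: str) -> dict:
    """Combine a visual recognition result with knowledge and affect."""
    return {
        "label": visual_label,
        "knowledge": KNOWLEDGE_BASE.get(visual_label, []),
        "impression": AFFECT_LEXICON.get(visual_label, "unknown"),
    }

print(interpret("robot"))
```

In a real system, each dictionary lookup would of course be replaced by a learned model (a neural recognizer, a language model, an affect estimator); the point is only that the three outputs are combined into one interpretation rather than stopping at the bare symbol.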

The robotic brain targeted by the Hagiwara Lab is not merely smart. Instead, the lab is aiming for a robotic brain with emotions, feelings, and spirit that will enable it to interact skillfully with humans and its environment. To achieve this, the lab conducts a broad range of research, from the fundamentals of Kansei Engineering to its applications in fields such as entertainment, design, and healing.

"Most robots thus far move exactly as they are programmed to. However, within the next 10 years, and perhaps even sooner, I believe robots will steadily be introduced into the home. When that happens, the interface with humans, the users, will be extremely important. For example, imagine a robot capable of a variety of movements rather than one like this that doesn't move, and suppose that among those movements there is something that looks like fluctuation; communication then occurs within that movement. And as the contact time with the robot grows longer, the robot will come to understand even the user's feelings and personality, and can respond and act accordingly. We're trying to build a robot capable of that kind of attentiveness."
