MIT’s new AI for robots can ‘feel’ an object just by seeing it

Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a new AI that can predict how an object feels just by seeing it – and vice versa.

The new AI can predict how it would feel to touch an object, just by looking at it. It can also create a visual representation of an object, using only the tactile data it gathers by touching it.
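To make the first direction concrete, here is a minimal, hypothetical sketch (in PyTorch) of what a vision-to-touch predictor could look like: an image is encoded into a latent vector and decoded into a small tactile reading. The `VisionToTouch` name, layer sizes, and dimensions are illustrative assumptions, not the architecture CSAIL actually used.

```python
# Hypothetical sketch of a vision-to-touch predictor; NOT the CSAIL model.
# All layer sizes and dimensions are illustrative assumptions.
import torch
import torch.nn as nn

class VisionToTouch(nn.Module):
    def __init__(self, latent_dim=128, tactile_dim=32):
        super().__init__()
        # Convolutional encoder: image -> latent representation
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, latent_dim),
        )
        # MLP decoder: latent representation -> predicted tactile reading
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64),
            nn.ReLU(),
            nn.Linear(64, tactile_dim),
        )

    def forward(self, image):
        return self.decoder(self.encoder(image))

if __name__ == "__main__":
    model = VisionToTouch()
    image = torch.randn(1, 3, 64, 64)   # a dummy 64x64 RGB image
    predicted_touch = model(image)      # shape: (1, 32)
    print(predicted_touch.shape)
```

The reverse direction (touch to vision) would simply swap the roles of the two modalities, with the tactile reading as input and an image as output.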

Yunzhu Li, CSAIL PhD student and lead author on the paper about the system, said the model can help robots handle real-world objects better: 

By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge. By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings. Bringing these two senses together could empower the robot and reduce the data we might need for tasks involving manipulating and grasping objects.


The research…

https://thenextweb.com/artificial-intelligence/2019/06/17/mits-new-ai-for-robots-can-feel-an-object-just-by-seeing-it/
