The team equipped a KUKA robot arm with GelSight, a tactile sensor created by Ted Adelson’s group at CSAIL. The information collected by GelSight was then fed to an AI system so it could learn the relationship between visual and tactile information.
To teach the AI how to identify objects by touch, the team recorded 12,000 videos of nearly 200 objects, such as fabrics, tools, and household items, being touched. The videos were broken down into still images, and the AI used this dataset to connect tactile and visual data.
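The core of that dataset is pairing: each visual still is matched with the tactile reading captured at the same moment. A minimal sketch of how such pairs might be assembled is below; the function name, the string stand-ins for frames, and the frame-for-frame synchronization are all assumptions for illustration, not the authors' actual pipeline.

```python
def build_pairs(visual_frames, tactile_frames):
    """Align same-timestamp visual stills and tactile readings into training pairs.

    Assumes the two streams are already synchronized frame-for-frame,
    which is a simplification of any real recording setup.
    """
    if len(visual_frames) != len(tactile_frames):
        raise ValueError("streams must be synchronized frame-for-frame")
    return list(zip(visual_frames, tactile_frames))

# Toy stand-ins for decoded video stills and GelSight readings.
visual = [f"rgb_frame_{i}" for i in range(5)]
tactile = [f"gel_frame_{i}" for i in range(5)]

pairs = build_pairs(visual, tactile)
print(len(pairs))   # 5
print(pairs[0])     # ('rgb_frame_0', 'gel_frame_0')
```

A model trained on such pairs can then be asked to predict one modality from the other, which is the cross-modal ability the researchers describe next.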
“By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge,” says Yunzhu Li, a CSAIL PhD student and lead author of a new paper about the system. “By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings. Bringing these two senses together could empower the robot and reduce the data we might need for tasks involving manipulating…