The new AI can predict how it would feel to touch an object just by looking at it. It can also create a visual representation of an object from the tactile data it gathers by touching it. Yunzhu Li, a CSAIL PhD student and lead author of the paper describing the system, said the model can help robots handle real-world objects better.

The research team trained the model using a KUKA robot arm fitted with a special tactile sensor called GelSight. The team had the arm touch 200 household objects 12,000 times, recording the visual and tactile data from each interaction. From those recordings it built a dataset of 3 million visual-tactile images called VisGel.

Andrew Owens, a postdoctoral researcher at the University of California, Berkeley, said this research could help robots judge how firmly to grip an object. The researchers are presenting the paper at the Conference on Computer Vision and Pattern Recognition (CVPR) in the US this week.
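At its core, the system does cross-modal prediction: given a camera image of an object, it predicts the corresponding tactile signal, and vice versa. The sketch below is only an illustration of that idea, a toy vision-to-touch encoder-decoder written in PyTorch (a framework assumed here; the article does not name one). It is not the authors' actual architecture, and the model class and tensors are hypothetical stand-ins.

```python
# Illustrative sketch only: a toy vision-to-touch translator.
# The published model and the VisGel data loaders are not shown in the
# article; everything below is a hypothetical simplification.
import torch
import torch.nn as nn

class VisionToTouch(nn.Module):
    """Maps a camera image to a predicted GelSight-style tactile image."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 256 -> 128
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 128 -> 64
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 64 -> 32
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 64
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 128
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Tanh(),    # 128 -> 256
        )

    def forward(self, rgb):
        return self.decoder(self.encoder(rgb))

model = VisionToTouch()
rgb = torch.randn(1, 3, 256, 256)              # stand-in for a camera frame
predicted_touch = model(rgb)                   # predicted tactile image
# During training, the prediction would be compared against the real
# GelSight reading paired with that frame; random data stands in here.
loss = nn.functional.l1_loss(predicted_touch, torch.randn_like(predicted_touch))
```

A touch-to-vision model would run the same idea in the other direction, taking the tactile reading as input and predicting what the touched region looks like.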

MIT's new AI for robots can "feel" an object just by seeing it