MIT EECS | Angle Undergraduate Research and Innovation Scholar
Sensor Fusion of Visual and Tactile Sensory Data for Object Localization and Robotic Manipulation
After decades of advances in contact-sensing technology, the basic question remains: how should robots make effective use of tactile information? Combining tactile reasoning with current vision-based manipulation systems could make robots significantly more capable of robust and dexterous manipulation, with applications ranging from manipulating momentarily occluded objects to complex manipulation tasks such as screwing. We propose to investigate natural ways of seamlessly integrating high-resolution tactile data with visual data to develop a reliable object pose tracker. The project will explore the use of inference and deep learning techniques to encode favorable tradeoffs between visual and tactile information, and will demonstrate these techniques on a robotic arm system.
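As an illustrative sketch of the kind of tradeoff such a tracker encodes, consider the simplest probabilistic fusion rule: weighting each sensor's pose estimate by its inverse variance, so the more reliable modality dominates (e.g., touch near contact, vision otherwise). The function and variable names below are hypothetical, not part of the project; a real system would use a full filter or a learned model rather than this scalar-variance toy.

```python
import numpy as np

def fuse_estimates(mu_vis, var_vis, mu_tac, var_tac):
    """Inverse-variance weighted fusion of two independent pose estimates.

    Illustrative only: mu_vis / mu_tac are position estimates (arrays),
    var_vis / var_tac their scalar variances. The lower-variance sensor
    receives proportionally more weight in the fused estimate.
    """
    w_vis = 1.0 / var_vis
    w_tac = 1.0 / var_tac
    mu = (w_vis * np.asarray(mu_vis) + w_tac * np.asarray(mu_tac)) / (w_vis + w_tac)
    var = 1.0 / (w_vis + w_tac)  # fused variance is smaller than either input
    return mu, var

# Vision gives a broad but noisy estimate (e.g., during partial occlusion);
# a tactile sensor gives a precise estimate once the gripper makes contact.
mu, var = fuse_estimates(np.array([0.10, 0.20]), 4e-4,
                         np.array([0.12, 0.21]), 1e-4)
```

The fused estimate lands closer to the tactile reading because its variance is lower; during full occlusion one would inflate the visual variance so the tracker leans almost entirely on touch.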
“I have been interested in robotic manipulation since I interned with the Computer Vision group at NASA-JPL on a task-oriented grasping project. My UROP project on this topic last semester was exciting and motivated me to delve further into the field through SuperUROP. I’m excited to apply what I’ve learned in my courses to a real project and to deepen my knowledge of controls and machine learning through this experience.”