Justin Daniel Tunis
MIT EECS | Lincoln Laboratory Undergraduate Research and Innovation Scholar
Indoor Localization via Sensor Fusion of UWB, IMU, and Camera Data
2017–2018
Electrical Engineering and Computer Science - Graphics and Vision
Moe Win
A reliable solution for indoor localization has recently been heavily pursued. Ultra-wideband (UWB) radio frequency (RF) technology has shown promising results for providing accurate indoor localization data. This project combines UWB data with image data from cameras to lay the foundation for potential applications to augmented reality (AR). Existing computer vision techniques will be used to identify optical markers in the camera image. These 2-D features will then be associated with the known 3-D coordinates of the optical markers. The output of the computer vision algorithm will be combined with the UWB and inertial data in a data-fusion algorithm to produce the final position estimate. In future commercial indoor localization applications, optical markers could be replaced by 3-D indoor maps stored in a database.
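As a rough illustration of the vision-plus-UWB idea described above, the sketch below simulates detecting the 2-D corners of one optical marker, recovers the camera position from the known 3-D corner coordinates with a standard perspective-n-point solve (OpenCV's solvePnP), and then blends that fix with a UWB position estimate using a simple inverse-variance weighting. All numbers, the marker layout, and the fusion rule are illustrative assumptions, not the project's actual algorithms.

```python
# Minimal sketch: camera pose from a known optical marker, fused with UWB.
# Marker geometry, intrinsics, noise levels, and the weighted-average fusion
# step are all assumptions for illustration only.
import numpy as np
import cv2

# Assumed pinhole camera intrinsics (fx, fy, cx, cy in pixels).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume negligible lens distortion

# Known 3-D coordinates (metres, world frame) of four marker corners.
marker_3d = np.array([[0.0, 0.0, 0.0],
                      [0.2, 0.0, 0.0],
                      [0.2, 0.2, 0.0],
                      [0.0, 0.2, 0.0]])

# Ground-truth camera pose, used only to simulate a marker detection.
rvec_true = np.array([0.1, -0.2, 0.05])
tvec_true = np.array([0.1, 0.05, 1.5])
marker_2d, _ = cv2.projectPoints(marker_3d, rvec_true, tvec_true, K, dist)
marker_2d = marker_2d.reshape(-1, 2)  # the "detected" 2-D corner features

# Associate the 2-D detections with the known 3-D corners and solve for pose.
ok, rvec, tvec = cv2.solvePnP(marker_3d, marker_2d, K, dist)
R, _ = cv2.Rodrigues(rvec)
cam_pos_vision = (-R.T @ tvec).ravel()  # camera position in the world frame

# A (simulated) UWB position estimate of the same device.
cam_pos_uwb = cam_pos_vision + np.array([0.10, -0.05, 0.02])

# Toy fusion step: inverse-variance weighted average of the two estimates.
var_vision, var_uwb = 0.02**2, 0.10**2
w_vision = (1 / var_vision) / (1 / var_vision + 1 / var_uwb)
fused = w_vision * cam_pos_vision + (1 - w_vision) * cam_pos_uwb
print("vision:", cam_pos_vision)
print("uwb:   ", cam_pos_uwb)
print("fused: ", fused)
```

In a real pipeline the weighted average would presumably be replaced by a recursive filter (e.g. an extended Kalman filter) that also propagates the inertial measurements between camera and UWB updates; the sketch only shows the 2-D-to-3-D association and a single fusion step.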
Indoor localization has interested me for a few years now, going back to my freshman year, when I wanted to develop a better way to guide shoppers around grocery stores. I’m excited to explore new techniques in this space, with potential applications to AR, another area that fascinates me. I love the research process, and I’m excited about what this SuperUROP project has in store for me.