Somaia Saba
MIT EECS | Aptiv Undergraduate Research and Innovation Scholar
Visualizing and Predicting Human Behavior
2022–2023
Electrical Engineering and Computer Science
- Devices
Richard R. Fletcher
Past research has demonstrated that digital phenotyping can be implemented on wearables and smartphones, using sensor data features to assess mental health states with high accuracy [1-5]. This data includes signals from sensors built into the phone or wearable, such as the accelerometer and gyroscope, as well as user-generated data, such as screen unlock events and call activity. Using this data, psychiatric conditions including depression and anxiety can be detected and their severity assessed.
This project expands on these previous studies by examining how the outputs of digital phenotyping algorithms can be visualized so that users can assess patterns in their behavior. Over the past few years, Dr. Fletcher’s group has developed its own digital phenotyping software that collects data from several important sources, including a 3-axis accelerometer, gyroscope, light sensor, screen unlock events, battery charging, GPS, call activity, and SMS/text activity. Using this software to collect data and applying digital phenotyping algorithms to analyze it allows mental states to be detected, as described in past studies. These predicted mental states can then be visualized over time to reveal patterns of behavior.
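As a minimal sketch of this pipeline, raw sensor logs might be aggregated into per-hour features before any prediction step. The column names (timestamp, ax, ay, az), the one-hour window, and the choice of summary statistics below are illustrative assumptions, not the group's actual schema.

import numpy as np
import pandas as pd

def extract_hourly_features(accel: pd.DataFrame, unlocks: pd.DataFrame) -> pd.DataFrame:
    """Aggregate raw accelerometer samples and unlock events into hourly features.
    Assumes the 'timestamp' columns are already pandas datetimes."""
    accel = accel.copy()
    accel["magnitude"] = np.sqrt(accel["ax"]**2 + accel["ay"]**2 + accel["az"]**2)
    accel["hour"] = accel["timestamp"].dt.floor("h")

    # Motion summary per hour: average and variability of acceleration magnitude.
    motion = accel.groupby("hour")["magnitude"].agg(["mean", "std"])
    motion.columns = ["motion_mean", "motion_std"]

    # Phone-interaction summary per hour: number of screen unlock events.
    unlocks = unlocks.copy()
    unlocks["hour"] = unlocks["timestamp"].dt.floor("h")
    unlock_counts = unlocks.groupby("hour").size().rename("unlock_count")

    # Outer join keeps hours with motion but no unlocks (count filled with 0).
    return motion.join(unlock_counts, how="outer").fillna({"unlock_count": 0})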
To the best of our knowledge, the outputs of digital phenotyping algorithms have not been visualized in this way before. Presenting the outputs as a grid makes patterns of behavior more readily visible to users and allows them to more easily address any patterns that indicate clinical relevance.
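One way such a grid could be rendered, shown only as a sketch with synthetic values standing in for real model output, is a day-by-hour heatmap where each cell's color encodes the predicted state for that hour.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n_days, n_hours = 14, 24
# Synthetic "probability awake" values in place of real digital phenotyping output.
probs = rng.random((n_days, n_hours))

fig, ax = plt.subplots(figsize=(8, 4))
im = ax.imshow(probs, aspect="auto", cmap="viridis", vmin=0, vmax=1)
ax.set_xlabel("Hour of day")
ax.set_ylabel("Day")
ax.set_xticks(range(0, 24, 4))
fig.colorbar(im, ax=ax, label="Predicted probability awake")
plt.tight_layout()
plt.show()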
Human behavior and psychosocial factors contribute significantly to a myriad of global diseases, including cardiovascular disease, respiratory disease, diabetes, and drug addiction. Understanding, predicting, and analyzing human behavior remains a major unresolved problem in healthcare and medicine. One technique developed to begin addressing this issue is digital phenotyping, or the use of digital and sensor information to infer human behavior.
Previously, we developed a mobile phone-based platform that collects sensor data to help track, predict, and visualize human behavior and mental health. We have also created data preprocessing scripts and algorithms to predict whether someone is asleep or awake based on the mobile sensor data. Ultimately, the goal is to create a visual representation of the data and digital phenotyping outputs to help users detect patterns in their behavior. To achieve this goal, we will run a pilot study to evaluate the efficacy of our system and show how it can be extended to predict behaviors beyond wakefulness.
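For illustration only, an asleep/awake predictor could be trained on hourly features like those sketched above; logistic regression is an assumed stand-in for whichever model the study ultimately uses, and the labels are assumed to come from a reference source such as self-report or a wearable.

from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def train_sleep_wake_model(X, y):
    """X: hourly feature matrix (e.g. motion_mean, motion_std, unlock_count).
    y: 1 = awake, 0 = asleep, from a reference labeling source."""
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=42, stratify=y
    )
    model = LogisticRegression(max_iter=1000)
    model.fit(X_train, y_train)
    # Report held-out accuracy as a rough check of the predictor.
    print("Held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
    return model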