Cabin Awareness for Autonomous Vehicle Fleets
Katrina Armistead (MDE ‘22) and Noah Deutsch (MDE ‘22)
Independent Design Engineering Project
Fall 2021 – Spring 2022
Jock Herron (GSD) and David Keith (MIT)
Autonomous vehicle fleets promise to fundamentally reshape human mobility, offering safer, cheaper, and more enjoyable transportation for all. However, even if self-driving cars can navigate the roads safely, they currently lack scalable systems for responding to messy passenger behavior. Situations that rideshare drivers routinely handle today—lost items, seatbelt compliance, sleeping passengers—become far more difficult to address once the driver is removed.
The current industry-standard solution to these challenges is costly and invasive one-to-one human monitoring during the ride, which raises the question: "Why go driverless at all?" Without investment in radically new solutions, autonomous vehicle fleets are destined to be plagued by serious safety, customer experience, and efficiency problems that will greatly limit their positive impact on society.
To tackle these problems, we introduce Raeda: an AI-powered sensor stack that characterizes messy human behavior inside autonomous rideshare vehicles. Our approach combines multiple sensing modalities (vision, smell, and audio) with privacy-preserving machine learning models to provide a comprehensive, real-time understanding of cabin status. This detection capability enables autonomous fleet operators to improve customer safety and experience, reduce vehicle depreciation, and increase fleet efficiency at scale.