Interactive Human-Drone Partnerships in Emergency Response Scenarios
Approximately 22,922 firefighters are injured each year in the United States while fighting structural fires. Many of these injuries could be prevented if drones were used to collect actionable information. For example, drones could be equipped with thermal cameras and tasked with autonomously creating a 3D heatmap of a building, or with detecting windows and capturing infrared imagery of smoke-filled rooms to search for people trapped inside. Similar needs arise across many other emergency response scenarios – for example, searching for a missing kayaker on a river or a child lost in the wilderness, surveying a traffic accident or a forest fire, or tracking a suspect in a school shooting. In all of these scenarios, drone autonomy is essential because it frees responders to focus on achieving mission goals without worrying about the details of drone operations.
Drones are ideally suited to supporting emergency response activities. When equipped with appropriate sensing capabilities, they can identify specific objects of interest and surface valuable data from a massive pool of largely uninteresting or irrelevant images, even when that data is collected under less-than-ideal circumstances. We develop and augment algorithms that dynamically construct a high-fidelity scene depicting the context of the mission through scene restoration, scene interpretation, and the mapping of visual data onto physical maps. The generated scene provides the context for specifying missions, geolocating drones, supporting autonomous drone navigation, and directly achieving mission goals, such as detecting victims in the water or through the windows of burning buildings.
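To make the mapping step concrete, the sketch below shows one common way a detection in a drone's camera image can be projected onto a physical map: intersect the pixel's viewing ray with the ground plane, then convert the resulting metric offset to GPS coordinates. This is a minimal illustration under simplifying assumptions (flat terrain, a downward-facing camera, known altitude and heading); the function and parameter names are hypothetical rather than drawn from any particular system.

```python
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius

def project_detection(
    pixel_x: float, pixel_y: float,      # detection center in image pixels
    image_w: int, image_h: int,          # image dimensions in pixels
    hfov_deg: float, vfov_deg: float,    # camera field of view (degrees)
    drone_lat: float, drone_lon: float,  # drone GPS position (degrees)
    altitude_agl_m: float,               # height above ground level (meters)
    heading_deg: float,                  # drone heading, 0 = north
) -> tuple[float, float]:
    """Estimate the GPS position of an object seen by a nadir camera.

    Illustrative sketch only: assumes flat terrain and a camera pointing
    straight down; a real pipeline would also fuse gimbal angles and a
    terrain elevation model.
    """
    # Angular offset of the pixel from the image center.
    ang_x = math.radians((pixel_x / image_w - 0.5) * hfov_deg)
    ang_y = math.radians((pixel_y / image_h - 0.5) * vfov_deg)

    # Ground-plane offset in the camera frame (meters).
    # +x = right of image, +y = toward the top of the image.
    dx = altitude_agl_m * math.tan(ang_x)
    dy = -altitude_agl_m * math.tan(ang_y)  # image y grows downward

    # Rotate the offset into the world frame using the drone heading.
    hdg = math.radians(heading_deg)
    east = dx * math.cos(hdg) + dy * math.sin(hdg)
    north = -dx * math.sin(hdg) + dy * math.cos(hdg)

    # Convert the metric offset to a GPS offset (small-angle approximation).
    dlat = math.degrees(north / EARTH_RADIUS_M)
    dlon = math.degrees(east / (EARTH_RADIUS_M * math.cos(math.radians(drone_lat))))
    return drone_lat + dlat, drone_lon + dlon
```

A production pipeline would replace the flat-ground assumption with a digital elevation model and take camera orientation from the drone's gimbal telemetry, but the geometric core, ray-to-ground intersection followed by a metric-to-geodetic conversion, stays the same, and it is this projection that lets detections such as victims in the water be placed on the shared mission map.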