We aim to develop safe and interactive autonomous systems, with methods for modeling and predicting human behavior, robust decision and control frameworks, and validation of complex multi-agent systems to verify safety. Active research topics include:
Agricultural Robotics


Agriculture is currently facing a labor crisis, and automating large equipment only partially addresses the problem. We focus on deploying small, low-cost robots beneath the crop canopy that can coordinate to create more sustainable agroecosystems. Before agbots can be used ubiquitously at scale, they need to reach high levels of autonomy and be made easy to use by growers who manage large acreage with little time to spare. We are developing tools for resilient autonomy and methods that facilitate interaction with the agbots. Through intelligent interaction, we will enable natural collaboration with robots in the field and effective remote supervision from afar.
Students: Tianchen Ji, Peixin Chang
Sponsors: NRI2.0 // USDA/NIFA
Illinois Center for Digital Agriculture
- Multi-Modal Anomaly Detection for Unstructured and Uncertain Environments.
Tianchen Ji, Sri Theja Vuppala, Girish Chowdhary, and Katherine Driggs-Campbell. CoRL 2020. Available on arXiv
- Proactive Anomaly Detection for Robot Navigation with Multi-Sensor Fusion.
Tianchen Ji, Arun Narenthiran Sivakumar, Girish Chowdhary, and Katherine Driggs-Campbell. RA-L 2022.
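The anomaly detectors above fuse multiple sensing modalities and flag deviations from normal field operation. As a toy illustration of the underlying idea, the sketch below fits a linear subspace to fused features from normal runs and flags inputs that reconstruct poorly; the feature layout, the PCA stand-in for a learned neural model, and the threshold are all illustrative assumptions, not the method from the papers.

```python
# Minimal sketch: reconstruction-based anomaly detection over fused sensor features.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fused feature vectors from normal operation:
# e.g., [lidar summary | wheel-encoder stats | IMU stats] concatenated per timestep.
normal_features = rng.normal(size=(500, 12))

# Fit a low-dimensional linear subspace to "normal" data (a stand-in for a
# learned autoencoder): anomalous inputs should reconstruct poorly.
mean = normal_features.mean(axis=0)
_, _, vt = np.linalg.svd(normal_features - mean, full_matrices=False)
basis = vt[:4]  # top-4 principal directions

def reconstruction_error(x):
    """Distance between x and its projection onto the normal subspace."""
    centered = x - mean
    recon = centered @ basis.T @ basis
    return float(np.linalg.norm(centered - recon))

# Threshold calibrated on normal data (e.g., a high percentile).
threshold = np.percentile([reconstruction_error(x) for x in normal_features], 99)

def is_anomalous(fused_sample):
    return reconstruction_error(fused_sample) > threshold

# A sample far from the training distribution should trip the detector.
print(is_anomalous(rng.normal(size=12) + 8.0))  # -> True (almost surely)
```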
Collaborative Manufacturing


Traditional factories strictly separate human and robot workers to enable safe, high-speed automation. We are developing tools for the next generation of factories, in which manufacturing is collaborative. Our work is based on the premise that fluid interaction among small human-robot teams will enable flexible, high-performance, low-volume production lines. We are exploring precision manipulation, safe interaction, intelligent scheduling, and simulation.
Students: Zhe Huang, Ye-Ji Mun
Sponsors:
Foxconn Interconnect Technology (C-NICE)
ZJU-UIUC Joint Research Center:
Center for Adaptive, Resilient Cyber-Physical Manufacturing Networks
- Long-term Pedestrian Trajectory Prediction using Mutable Intention Filter and Warp LSTM.
Zhe Huang, Aamir Hasan, Kazuki Shin, Ruohua Li, and Katherine Driggs-Campbell. RA-L 2020. Available on arXiv
- Learning Sparse Interaction Graphs of Partially Detected Pedestrians for Trajectory Prediction.
Zhe Huang, Ruohua Li, Kazuki Shin, and Katherine Driggs-Campbell. RA-L 2021. Available on arXiv
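The intention filter in the first paper above maintains a belief over which goal a pedestrian is heading toward and updates it as new positions arrive. The sketch below shows that Bayesian update in its simplest form; the candidate goals, the hand-built heading likelihood, and the concentration parameter are illustrative assumptions, whereas the paper pairs the filter with a learned Warp LSTM motion model.

```python
# Minimal sketch of Bayesian filtering over pedestrian intentions (goals).
import numpy as np

goals = np.array([[10.0, 0.0], [10.0, 10.0], [0.0, 10.0]])  # hypothetical exits
belief = np.full(len(goals), 1.0 / len(goals))               # uniform prior

def update_belief(belief, prev_pos, curr_pos, goals, kappa=4.0):
    """Reweight goal hypotheses by how well the observed step points at each goal."""
    step = curr_pos - prev_pos
    heading = step / (np.linalg.norm(step) + 1e-9)
    likelihoods = np.empty(len(goals))
    for i, g in enumerate(goals):
        to_goal = g - curr_pos
        to_goal /= np.linalg.norm(to_goal) + 1e-9
        # von Mises-style likelihood: high when heading aligns with goal direction.
        likelihoods[i] = np.exp(kappa * heading @ to_goal)
    posterior = belief * likelihoods
    return posterior / posterior.sum()

# Pedestrian walking roughly toward the first goal at (10, 0).
trajectory = [np.array([0.0, 0.0]), np.array([1.0, 0.1]), np.array([2.0, 0.1])]
for prev, curr in zip(trajectory, trajectory[1:]):
    belief = update_belief(belief, prev, curr, goals)
print(belief)  # probability mass concentrates on the (10, 0) hypothesis
```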
Safe, Interactive Autonomous Vehicles



As autonomous systems begin to operate in safety-critical domains, it is vital that we understand their vulnerabilities and devise methods for safe interaction with humans. In domains such as autonomous driving, failures may be rare but highly impactful. As a result, brute-force approaches, such as observing how the system behaves in real-world environments, are inefficient for uncovering failure cases. To overcome this, we adaptively search over the space of failures by formulating failure identification as a reinforcement learning problem, efficiently identifying high-relevance system vulnerabilities. To guarantee safe behavior at run-time, we investigate how to incorporate online reachability-based monitoring to formally assess the safety of a vehicle maneuvering autonomously among pedestrians. When integrated on a real Polaris GEM autonomous vehicle, we demonstrate the ability to provide rigorous safety assurances in near real time.
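As a toy illustration of this adaptive failure search, the sketch below stresses a simple car-pedestrian simulation by searching over disturbance sequences for one that causes a near-collision while remaining plausible. The dynamics, the reward shaping, and the random-search optimizer standing in for the reinforcement learning policy are all illustrative assumptions; see the adaptive stress testing papers listed below for the actual formulation.

```python
# Minimal sketch of adaptive stress testing (AST) on a toy pedestrian-crossing scenario.
import numpy as np

rng = np.random.default_rng(1)
HORIZON, SIGMA = 20, 0.3

def rollout(disturbances):
    """Simulate a car (moving +x) vs. a pedestrian (moving +y) under disturbances
    on the pedestrian's steps; return an AST-style reward: higher = closer to failure."""
    car, ped = np.array([0.0, 0.0]), np.array([8.0, -6.0])
    min_dist, log_prob = np.inf, 0.0
    for d in disturbances:
        car = car + np.array([1.0, 0.0])          # constant-velocity car
        ped = ped + np.array([0.0, 0.5]) + d      # nominal pedestrian + disturbance
        min_dist = min(min_dist, np.linalg.norm(car - ped))
        log_prob += -0.5 * (d @ d) / SIGMA**2     # penalize unlikely disturbances
    failed = min_dist < 0.5                        # collision threshold
    return (1000.0 if failed else -min_dist) + 1e-3 * log_prob, failed

# Random search stands in for the RL policy: keep the best disturbance sequence.
best_seq, best_reward = None, -np.inf
for _ in range(2000):
    seq = rng.normal(scale=SIGMA, size=(HORIZON, 2))
    reward, failed = rollout(seq)
    if reward > best_reward:
        best_seq, best_reward = seq, reward

print("found failure:", rollout(best_seq)[1])
```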
Despite the ultimate goal of full automation without human intervention, existing advanced driver-assistance systems (ADAS) still require collaborative input from a human driver to ensure system safety. Our objective is to enable reliable and smooth interaction between humans and intelligent vehicles. Toward this goal, we explore: (1) methods that help human drivers better understand autopilot behavior; (2) vision-based algorithms that help intelligent vehicles assess a human driver's situational awareness by detecting cognitive distraction using low-cost windshield cameras or mobile devices; and (3) interactions that effectively communicate critical situations between drivers and intelligent vehicles.
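For intuition only, the sketch below flags distraction from per-frame gaze estimates using an off-road-glance ratio over a sliding window. The window size, threshold, and upstream gaze estimator are assumptions; detecting cognitive distraction in our work relies on learned vision models rather than this simple heuristic.

```python
# Minimal sketch: flag driver distraction from camera-based gaze estimates.
from collections import deque

class DistractionMonitor:
    def __init__(self, window=30, off_road_ratio=0.6):
        self.frames = deque(maxlen=window)   # rolling window of recent frames
        self.off_road_ratio = off_road_ratio

    def update(self, gaze_on_road: bool) -> bool:
        """Feed one per-frame gaze estimate; return True if distracted."""
        self.frames.append(gaze_on_road)
        if len(self.frames) < self.frames.maxlen:
            return False  # not enough evidence yet
        off = 1.0 - sum(self.frames) / len(self.frames)
        return off > self.off_road_ratio

monitor = DistractionMonitor()
for frame in [True] * 10 + [False] * 25:   # driver looks away for ~25 frames
    distracted = monitor.update(frame)
print(distracted)  # -> True
```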
Much like human drivers in the real world, who observe other drivers and make inferences, we have designed a framework that treats human drivers as sensors providing environment information to our intelligent system. We use probabilistic learning methods to estimate a sensor model that captures how people dynamically respond to pedestrians (i.e., the relationship between environment state and action), so that a driver's actions can serve as a proxy for detection. This framework has shown significant improvement in overall environment awareness.
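The core inference is an application of Bayes' rule: given a learned model of how drivers act in each environment state, observed actions update a belief over states we cannot see directly. A minimal discrete sketch follows; the action alphabet, conditional probabilities, and prior are illustrative assumptions, while the papers below learn these models from data.

```python
# Minimal sketch of the "people as sensors" idea: invert a driver-action model
# with Bayes' rule to infer a hidden (occluded) state.

# Hypothetical learned sensor model: P(action | occluded crosswalk state).
sensor_model = {
    "occupied": {"brake": 0.80, "coast": 0.15, "accelerate": 0.05},
    "free":     {"brake": 0.10, "coast": 0.30, "accelerate": 0.60},
}
prior = {"occupied": 0.3, "free": 0.7}  # assumed prior over the occluded region

def posterior_after(actions, prior, sensor_model):
    """P(state | observed driver actions), treating actions as conditionally
    independent measurements of the hidden state."""
    belief = dict(prior)
    for a in actions:
        for s in belief:
            belief[s] *= sensor_model[s][a]
        z = sum(belief.values())
        belief = {s: p / z for s, p in belief.items()}
    return belief

# A driver who keeps braking suggests a pedestrian we cannot see directly.
print(posterior_after(["brake", "brake"], prior, sensor_model))
# -> occupied probability rises from the 0.3 prior to roughly 0.97
```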
Students: Aamir Hasan, Peter Du
Sponsors:

Students: Ye-Ji Mun, Masha Itkina
Sponsors: Ford-Stanford Research Alliance

Students: Peter Du
Sponsors: Illinois Center for Autonomy
- AutoPreview: A Framework for Autopilot Behavior Understanding.
Yuan Shen, Niviru Wijayaratne, Peter Du, SJ Jiang, and Katherine Driggs-Campbell. CHI: Extended Abstracts, 2021.
- People as Sensors: Imputing Maps from Human Actions.
Oladapo Afolabi*, Katherine Driggs-Campbell*, Roy Dong, Mykel J. Kochenderfer, and S. Shankar Sastry. IROS 2018. Available on arXiv
- Multi-Agent Variational Occlusion Inference Using People as Sensors.
Masha Itkina, Ye-Ji Mun, Katherine Driggs-Campbell, and Mykel J. Kochenderfer. ICRA 2022. Available on arXiv
- Finding Diverse Failure Scenarios in Autonomous Systems Using Adaptive Stress Testing.
Peter Du and Katherine Driggs-Campbell. SAE International Journal of Connected and Automated Vehicles, 2019.
- Adaptive Failure Search Using Critical States from Domain Experts.
Peter Du and Katherine Driggs-Campbell. ICRA 2021.
- Online Monitoring for Safe Pedestrian-Vehicle Interactions.
Peter Du, Zhe Huang, Tianchen Ji, Tianqi Liu, Ke Xu, Qichao Gao, Hussein Sibai, Katherine Driggs-Campbell, and Sayan Mitra. ITSC 2020.
- Adaptive Stress Testing with Reward Augmentation for Autonomous Vehicle Validation.
Anthony Corso*, Peter Du*, Katherine Driggs-Campbell, and Mykel J. Kochenderfer. ITSC 2019.
Mobile Robotics and Navigation


Many crowd navigation methods are short-sighted and prone to the freezing-robot problem. To tackle these problems, we propose a novel robot planner that reasons about spatial and temporal relationships between the robot and the crowd. In addition, we incorporate human intent estimation into the planner through active sensing by the robot.
To navigate independently in a physical environment, people generally rely on visual cues to understand their surroundings. Recent studies find that this is especially difficult for people with visual impairments. Providing a robot guide that facilitates wayfinding in a variety of environments would significantly improve their quality of life and independence. In this project, we explore the feasibility of robot navigation for guidance and wayfinding in collaboration with the Human Factors & Aging Laboratory.
Students: Shuijing Liu, Peixin Chang, Neeloy Chakraborty, Eric Liang
Students: Shuijing Liu, Aamir Hasan
Sponsors: The Illinois Campus Research Board
- Decentralized Structural-RNN for Robot Crowd Navigation with Deep Reinforcement Learning.
Shuijing Liu, Peixin Chang, Weihang Liang, Neeloy Chakraborty, and Katherine Driggs-Campbell. ICRA 2021. Available on arXiv
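For a sense of how a planner can reason spatially and temporally at once, the sketch below applies attention over robot-pedestrian edge features at each timestep (spatial) and folds the result into a recurrent state (temporal) before scoring actions. All dimensions, the attention form, and the action head are illustrative assumptions rather than the published DSRNN architecture.

```python
# Minimal sketch of spatio-temporal crowd reasoning for robot navigation.
import torch
import torch.nn as nn

class CrowdNavNet(nn.Module):
    def __init__(self, ped_dim=4, robot_dim=5, hidden=64, n_actions=9):
        super().__init__()
        self.edge_enc = nn.Linear(ped_dim, hidden)    # robot-pedestrian edge features
        self.attn = nn.Linear(hidden, 1)              # scores each pedestrian
        self.node_enc = nn.Linear(robot_dim, hidden)  # robot's own state
        self.gru = nn.GRUCell(2 * hidden, hidden)     # temporal reasoning
        self.policy = nn.Linear(hidden, n_actions)    # discrete action scores

    def forward(self, robot_state, ped_states, h):
        # Spatial step: attention-weighted summary of the crowd.
        edges = torch.relu(self.edge_enc(ped_states))     # (n_peds, hidden)
        weights = torch.softmax(self.attn(edges), dim=0)  # (n_peds, 1)
        crowd = (weights * edges).sum(dim=0)              # (hidden,)
        # Temporal step: fold this timestep into the recurrent state.
        x = torch.cat([torch.relu(self.node_enc(robot_state)), crowd])
        h = self.gru(x.unsqueeze(0), h)
        return self.policy(h), h                          # action scores, new state

net = CrowdNavNet()
h = torch.zeros(1, 64)
robot = torch.randn(5)    # e.g., position, velocity, goal offset
peds = torch.randn(6, 4)  # e.g., relative position and velocity per pedestrian
scores, h = net(robot, peds, h)
print(scores.shape)       # torch.Size([1, 9])
```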