CSET Project #: 2101
Project Funding: University of Hawai'i Manoa
Start Date: July 2021
End Date: July 2022
This project aims to develop a series of machine learning algorithms that use drone-mounted surveillance cameras to automatically detect vehicle dynamic characteristics, such as speed, headway, and spacing, and to track vehicle trajectories in real time, so that traffic crashes can be identified and predicted earlier and crash injuries mitigated with minimal response time. Specifically, we plan to develop a drone-based, computer-vision-enabled Convolutional Neural Network (CNN) to detect and identify individual vehicle dynamics using machine learning approaches. A drone-equipped camera will collect sequential images of various vehicle dynamics, and transfer learning will then be used to train the CNN to detect and classify vehicles in the videos captured by drones. A self-calibrated road boundary extraction method based on image sequences will be developed to extract road boundaries and filter out off-road objects, improving the detection accuracy of vehicle dynamics. Using the neural network's detection results, video-based object tracking will be applied to extract vehicle trajectory information. Finally, vehicle dynamics and trajectory information will be calculated, and traffic crash potential will be estimated for real-time crash prediction and response time minimization. Using drone-based surveillance camera systems to monitor traffic flow conditions and identify crashes dynamically is of great significance for transportation agencies, enabling timely countermeasures that improve traffic safety performance and reduce rural crash risks and severities in the states of Alaska, Washington, Idaho, and Hawaii.
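To make the trajectory-derived quantities concrete, the sketch below shows one conventional way to compute vehicle speed, spacing, and time headway from per-frame positions produced by an object tracker. The abstract does not specify the project's formulas or data structures, so the `Track` representation, the frame rate (`FPS`), and the pixel-to-metre scale (`M_PER_PX`) here are illustrative assumptions, not the project's actual implementation.

```python
from dataclasses import dataclass

# Assumed constants for illustration only; a real system would obtain these
# from the camera specification and a ground-plane calibration step.
FPS = 30.0        # hypothetical camera frame rate (frames per second)
M_PER_PX = 0.05   # hypothetical ground-sampling distance (metres per pixel)

@dataclass
class Track:
    """One tracked vehicle: longitudinal position (pixels) per frame index."""
    vehicle_id: int
    positions_px: list[float]

def speed_mps(track: Track, frame: int) -> float:
    """Finite-difference speed estimate at a given frame, in m/s."""
    dx_px = track.positions_px[frame] - track.positions_px[frame - 1]
    return dx_px * M_PER_PX * FPS

def spacing_m(leader: Track, follower: Track, frame: int) -> float:
    """Distance between a leader and its follower at a frame, in metres."""
    gap_px = leader.positions_px[frame] - follower.positions_px[frame]
    return gap_px * M_PER_PX

def headway_s(leader: Track, follower: Track, frame: int) -> float:
    """Time headway: spacing divided by the follower's speed, in seconds."""
    return spacing_m(leader, follower, frame) / speed_mps(follower, frame)
```

For example, a follower that advances 8 pixels between consecutive frames moves 8 × 0.05 × 30 = 12 m/s under these assumed constants; short headways computed this way are the kind of surrogate safety measure that could feed a crash-potential estimate.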