Optimizing License Plate Recognition with Illumination and Blur Analysis


Researchers at the National University of Science and Technology MISIS (NUST MISIS) have collaborated with CITYLABS experts to develop algorithms for CCTV systems that can detect blurred or over-illuminated vehicle license plates. The development was shared with socialbites.ca by MISIS researchers and marks a practical step forward in traffic surveillance technology.

Identifying a specific vehicle by its state registration plate is a central challenge in traffic analysis, especially under the demanding conditions found on technologically advanced roadways. Real-world factors like high vehicle speeds, intense headlight glare, accumulated dust, and limited camera capabilities frequently hinder accurate recognition. This project addresses those persistent bottlenecks by refining how plates are detected and interpreted in challenging visual environments.

To gauge how well a plate is lit in each frame, the team proposed a method based on analyzing the brightness histogram. The approach leverages the YOLOv5 neural network to simultaneously detect both vehicles and their license plates, providing a robust pipeline from scene understanding to plate extraction. The emphasis on histogram-based illumination assessment helps separate readable regions from problematic ones, paving the way for more reliable recognition in varying lighting conditions. [Source: MISIS; CITYLABS]
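For readers who want a concrete picture of what such a pipeline looks like, the following is a minimal sketch, not the MISIS implementation: it loads the open-source YOLOv5 model through torch.hub (the researchers would use their own trained weights and classes), crops each detection, and computes a brightness histogram of the crop with OpenCV. The file path, pretrained weights, and the 240-level "bright pixel" cutoff are illustrative assumptions.

```python
# Sketch: off-the-shelf YOLOv5 detection + per-crop brightness histogram.
# The actual MISIS model, classes, and thresholds are not published here.
import cv2
import torch

model = torch.hub.load("ultralytics/yolov5", "yolov5s")  # placeholder pretrained weights

frame = cv2.imread("frame.jpg")                    # one CCTV frame (path is illustrative)
rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)       # YOLOv5 expects RGB input
detections = model(rgb)

for x1, y1, x2, y2, conf, cls in detections.xyxy[0].tolist():
    crop = frame[int(y1):int(y2), int(x1):int(x2)]
    if crop.size == 0:
        continue
    gray = cv2.cvtColor(crop, cv2.COLOR_BGR2GRAY)
    # 256-bin brightness histogram, normalized to a probability distribution
    hist = cv2.calcHist([gray], [0], None, [256], [0, 256]).ravel()
    hist /= hist.sum()
    bright = hist[240:].sum()  # share of near-saturated pixels (threshold is an assumption)
    print(f"class {int(cls)}: mean brightness {gray.mean():.1f}, bright share {bright:.2%}")
```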

During neural network training, the researchers assembled datasets reflecting different times of day, seasons, and weather conditions. Once the region containing the characters is localized, the three-channel RGB image is converted to a single grayscale representation to simplify processing. A histogram analysis of this grayscale image then isolates the portion associated with overexposure, correctly classifying approximately 95.7% of license plates as overexposed under realistic scenarios. The team also developed a specialized architecture that classifies image blur with high accuracy, attaining 96.4% precision and a processing time as fast as 0.073 milliseconds on an optimized system, according to Igor Temkin, Head of Automated Control Systems (ACS) at NUST MISIS. [Source: MISIS]
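A minimal sketch of such an overexposure check is shown below, assuming OpenCV for the grayscale conversion and histogram. The brightness level of 240 and the 0.35 ratio are illustrative assumptions, not values reported by the researchers.

```python
# Sketch: flag a plate crop as overexposed from its grayscale histogram.
# Both thresholds below are assumed for illustration, not the paper's values.
import cv2
import numpy as np

def overexposure_ratio(plate_bgr: np.ndarray, bright_level: int = 240) -> float:
    """Fraction of plate pixels whose grayscale value is >= bright_level."""
    gray = cv2.cvtColor(plate_bgr, cv2.COLOR_BGR2GRAY)
    hist = cv2.calcHist([gray], [0], None, [256], [0, 256]).ravel()
    return float(hist[bright_level:].sum() / hist.sum())

def is_overexposed(plate_bgr: np.ndarray, max_ratio: float = 0.35) -> bool:
    """True when too many pixels sit in the near-saturated part of the histogram."""
    return overexposure_ratio(plate_bgr) > max_ratio
```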

The creation of a separate dataset for blurring presented another key milestone. The new algorithm provides quantitative assessments of blur and illumination levels and categorizes images as readable or unreadable. This information can be used to automatically adjust camera parameters such as shutter speed and aperture, thereby enhancing the quality of subsequent frames and improving long-term recognition rates. [Source: MISIS; CITYLABS]
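As an illustration of how a blur score can drive such decisions, the sketch below uses the widely known variance-of-Laplacian measure as a stand-in for the team's own (unpublished) blur metric; the sharpness threshold and the suggested camera adjustments are likewise hypothetical.

```python
# Sketch: estimate blur and map it to a coarse camera hint.
# Variance of the Laplacian is a common proxy for sharpness, assumed here
# in place of the researchers' actual blur classifier.
import cv2
import numpy as np

def blur_score(plate_bgr: np.ndarray) -> float:
    """Higher values indicate sharper edges; low values suggest motion blur."""
    gray = cv2.cvtColor(plate_bgr, cv2.COLOR_BGR2GRAY)
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

def suggest_camera_adjustment(plate_bgr: np.ndarray,
                              min_sharpness: float = 100.0) -> str:
    """Classify the frame as readable/unreadable and hint at exposure changes."""
    if blur_score(plate_bgr) < min_sharpness:
        return "unreadable: shorten shutter speed or open the aperture for subsequent frames"
    return "readable: keep current exposure settings"
```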

Experimental validation demonstrated the practicality of the proposed techniques across different hardware platforms, including personal computers and Nvidia Jetson Nano modules. The results indicated consistent performance gains in license plate readability and detection reliability, even in less-than-ideal lighting and weather conditions. The research team notes that the combined use of histogram-based illumination analysis and fast neural network inference creates a resilient pipeline suitable for real-time traffic monitoring and enforcement scenarios. [Source: MISIS]

Through these advances, the collaboration between MISIS and CITYLABS contributes to smarter traffic management and safer roadways by enabling more accurate plate recognition on cameras with varying specifications. The work illustrates how modern computer vision, when applied thoughtfully to edge cases like glare and blur, can significantly improve the fidelity of automated vehicle identification in dynamic urban environments. [Source: MISIS; CITYLABS]
