Traffic Signs Recognition using R-CNN
Format: Conference Proceeding
Language: English
Summary: Traffic signs are an important source of information for both vehicles and pedestrians. They represent the rules that have been set in place to ensure safety and communicate information to pedestrians and drivers, helping to maintain order and reduce accidents. Neglecting them is a dangerous practice. The majority of signs employ visuals rather than text to make them more generally comprehensible. It is therefore critical to understand the significance of each image and use it to guide the driving process. A rider who does not follow the rules may be penalized or involved in a serious accident. According to recent research, a significant number of accidents are caused by driver error or poor judgement. The number of traffic accidents could be reduced by using driverless/automated vehicles, which reach their destination without requiring a human operator. Driverless cars must also be conscious of traffic rules: automated cars should interpret traffic signs and base their decisions on this information. This research addresses the design and development of a neural network model capable of classifying traffic signs into multiple categories, to help self-driving automobiles better interpret traffic signs. Although several models for traffic sign recognition (TSR) already exist, the proposed model outperforms them in terms of accuracy.
ISSN: 2768-5330
DOI: 10.1109/ICICCS53718.2022.9788295