Title: Real-time Traffic Sign Detection and Recognition
Authors: Lin, Shih-Chieh (林士傑)
Hang, Hsueh-Ming (杭學鳴)
Institute of Electronics
Keywords: Traffic Sign; Image Detection; Image Recognition; Machine Learning; Real-time Computation
Issue Date: 2017
Abstract: This thesis proposes a real-time traffic sign detection and recognition system that achieves both real-time processing on a low-end industrial computer and solid detection and recognition performance under varied environments. Many existing systems focus on achieving the best possible accuracy with no constraint on computational cost, and are therefore unsuitable for in-vehicle use. We combine color-based and shape-based segmentation techniques in our traffic sign detection scheme so as to exploit the respective advantages of the two methods. The faster color segmentation runs as the first stage of the system; its objective is to greatly reduce the amount of irrelevant data passed on to the shape segmentation stage, whose objective is to locate candidate traffic signs in the remaining data using shape information. After bounding boxes are placed on the candidate locations, we extract robust HOG features from each candidate and send them to a pre-trained SVM classifier for traffic sign recognition. To ensure that the proposed method is applicable to real-world situations, we ran experiments on three popular datasets (GTSB, LTS, and IEEE VIP cup), using a low-end industrial computer as the computational platform. The system achieves first-tier accuracy at a sufficiently high frame rate.
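The recognition stage described above (HOG features from a candidate region, fed to a pre-trained SVM) can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the 32×32 patch size, HOG parameters, linear kernel, and the synthetic two-class training data are all assumptions for demonstration; it uses scikit-image and scikit-learn.

```python
# Hypothetical sketch of the HOG + SVM recognition stage.
# Patch size, HOG parameters, and the two placeholder sign classes
# are illustrative assumptions, not the thesis's actual settings.
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

def extract_hog(patch):
    """Compute a HOG descriptor for a 32x32 grayscale candidate patch."""
    return hog(patch, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm='L2-Hys')

rng = np.random.default_rng(0)
# Synthetic "training set": random patches standing in for sign crops.
X_train = np.array([extract_hog(rng.random((32, 32))) for _ in range(20)])
y_train = np.array([i % 2 for i in range(20)])  # two placeholder classes

clf = SVC(kernel='linear')  # kernel choice is an assumption here
clf.fit(X_train, y_train)

candidate = rng.random((32, 32))  # stand-in for a detected candidate region
label = clf.predict([extract_hog(candidate)])[0]
```

In a real pipeline, `candidate` would be each bounding-box crop produced by the color-then-shape segmentation stages, and the classifier would be trained offline on labeled sign images from datasets such as those named above.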
URI: http://etd.lib.nctu.edu.tw/cdrfb3/record/nctu/#GT070450270
http://hdl.handle.net/11536/142266
Appears in Collections: Thesis