Accelerating In-Vehicle Network Intrusion Detection System Using Binarized Neural Network
- Delivery method
- Download link provided by the publisher
- Format
- Price
- Regular price (tax incl.): ¥6,600 / Member price (tax incl.): ¥5,280
- Document type
- SAE Paper No. 2022-01-0156
- Pages
- 1-14 (14 pages total)
- Publication date
- March 2022
- Publisher
- SAE International
- Language
- English
- Event
- WCX SAE World Congress Experience 2022
Bibliographic Information
- Authors: 1) Linxi Zhang, 2) Xuke Yan, 3) Di Ma
- Affiliations: 1) University of Michigan-Dearborn, 2) Oakland University, 3) University of Michigan-Dearborn
- Abstract: Controller Area Network (CAN), the de facto standard for in-vehicle networks, has insufficient security features and is thus inherently vulnerable to various attacks. To protect the CAN bus from attacks, intrusion detection systems (IDSs) based on advanced deep learning methods, such as Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs), have been proposed. However, those models generally introduce high latency, require considerable memory, and often result in high energy consumption. To accelerate intrusion detection and reduce memory requirements, we exploit Binarized Neural Networks (BNNs) and hardware-based acceleration for intrusion detection in in-vehicle networks. Because a BNN uses binary values for activations and weights rather than full-precision values, it usually delivers faster computation, lower memory cost, and lower energy consumption than full-precision models. Moreover, unlike other deep learning methods, a BNN can be further accelerated on Field-Programmable Gate Arrays (FPGAs), since it cuts down hardware resource consumption. We design our BNN model to suit CAN traffic data and exploit sequential features of the CAN traffic instead of individual messages. We evaluate the proposed IDS on four different real-vehicle datasets. Our experimental results show that the proposed BNN-based IDS reduces detection latency on the same CPU (3 times faster) while maintaining acceptable detection rates compared with full-precision models. We also implement the proposed IDS on FPGA hardware to reduce latency further and accelerate intrusion detection. Our experiments on multiple platforms demonstrate that the FPGA implementation dramatically reduces detection latency (128 times faster) with lower power consumption compared with an embedded CPU.
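The abstract's core idea, replacing full-precision weights and activations with binary (+1/-1) values so that dot products reduce to cheap bitwise XNOR/popcount operations on hardware such as an FPGA, can be illustrated with a minimal sketch. This is not the paper's implementation; the function names (`binarize`, `bnn_dot`) and the toy values are illustrative assumptions.

```python
import numpy as np

def binarize(x):
    """Map each value to +1 or -1 by its sign (0 maps to +1),
    as in typical BNN forward passes. Illustrative, not the paper's code."""
    return np.where(x >= 0, 1, -1).astype(np.int8)

def bnn_dot(w, a):
    """Dot product of binarized weights and activations.
    With {+1, -1} values this equals n - 2 * popcount(w_bits XOR a_bits),
    which is why BNN inference maps well onto FPGA logic."""
    wb, ab = binarize(w), binarize(a)
    return int(np.dot(wb, ab))

# Toy full-precision weights and activations:
w = np.array([0.4, -1.2, 0.7, -0.3])
a = np.array([-0.9, -0.1, 2.0, 0.5])
# binarized: w -> [+1, -1, +1, -1], a -> [-1, -1, +1, +1]
print(bnn_dot(w, a))  # -1 + 1 + 1 - 1 = 0
```

The memory saving the abstract refers to comes from storing each weight as one bit instead of a 32-bit float, a 32x reduction before any hardware-level packing.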