Paper / Information search system


Accelerating In-Vehicle Network Intrusion Detection System Using Binarized Neural Network

Detailed Information

Authors: 1) Linxi Zhang, 2) Xuke Yan, 3) Di Ma
Affiliations: 1) University of Michigan-Dearborn, 2) Oakland University, 3) University of Michigan-Dearborn
Abstract: Controller Area Network (CAN), the de facto standard for in-vehicle networks, has insufficient security features and is thus inherently vulnerable to various attacks. To protect the CAN bus from attacks, intrusion detection systems (IDSs) based on advanced deep learning methods, such as Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs), have been proposed. However, these models generally introduce high latency, require considerable memory, and often result in high energy consumption. To accelerate intrusion detection and reduce memory requirements, we exploit the use of a Binarized Neural Network (BNN) and hardware-based acceleration for intrusion detection in in-vehicle networks. As a BNN uses binary values for activations and weights rather than full-precision values, it usually yields faster computation, a smaller memory footprint, and lower energy consumption than full-precision models. Moreover, unlike other deep learning methods, a BNN can be further accelerated by leveraging Field-Programmable Gate Arrays (FPGAs), since binarization cuts down hardware consumption. We design our BNN model to suit CAN traffic data and exploit sequential features of the CAN traffic instead of individual messages. We evaluate the proposed IDS with four different real vehicle datasets. Our experimental results show that the proposed BNN-based IDS reduces detection latency on the same CPU (3 times faster) while maintaining acceptable detection rates compared to full-precision models. We also implement the proposed IDS on FPGA hardware to reduce latency further and accelerate intrusion detection. Our experiments on multiple platforms demonstrate that using FPGAs dramatically reduces detection latency (128 times faster) with lower power consumption in comparison with an embedded CPU.
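The abstract's speed claim rests on BNNs replacing full-precision multiply-accumulates with bitwise operations. A minimal illustrative sketch of the underlying XNOR-popcount trick, not the authors' implementation (all names here are hypothetical):

```python
# Illustrative sketch of the XNOR-popcount trick behind BNNs.
# Not the paper's model; function names are hypothetical.

def binarize(values):
    """Map real-valued weights/activations to {-1, +1} via sign."""
    return [1 if v >= 0 else -1 for v in values]

def binary_dot(a_bits, w_bits):
    """Dot product of two {-1, +1} vectors.
    Equals n - 2 * hamming_distance, which hardware can compute
    with XNOR followed by popcount -- no multipliers needed."""
    n = len(a_bits)
    hamming = sum(1 for a, w in zip(a_bits, w_bits) if a != w)
    return n - 2 * hamming

activations = binarize([0.3, -1.2, 0.7, -0.1])   # [+1, -1, +1, -1]
weights     = binarize([0.5,  0.4, -0.9, -0.2])  # [+1, +1, -1, -1]
# Matches the plain dot product over {-1, +1} values:
assert binary_dot(activations, weights) == sum(
    a * w for a, w in zip(activations, weights))
```

Because every operand is a single bit, an FPGA can evaluate many such dot products in parallel with lookup tables alone, which is the kind of hardware saving the abstract refers to.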

About search


How to use the search box

You can enter up to 5 search conditions. The number of search boxes can be increased or decreased with the "+" and "-" buttons on the right.
If you enter multiple words separated by spaces in a single search box, only data that contains all of the entered words is returned (AND search).
Example: X (space) Y → matches data containing both X and Y
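The single-box behavior above (space-separated words are ANDed) can be sketched in a few lines; the helper name is hypothetical:

```python
def box_matches(text, box_input):
    """One search box: space-separated words are ANDed,
    i.e. the text must contain all of them (hypothetical helper)."""
    return all(word in text for word in box_input.split())

assert box_matches("deep learning for CAN bus", "CAN learning")
assert not box_matches("deep learning for CAN bus", "CAN FPGA")
```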

How to use "AND" and "OR" pull-down

If "AND" is specified, only data containing the phrases in both the preceding and following search boxes is returned. If "OR" is specified, data containing the phrase in either of the two boxes is returned.
Example: X AND Y → contains both X and Y; X OR Z → contains X or Z
When AND and OR searches are mixed, OR takes priority.
Example: X AND Y OR Z → X AND (Y OR Z)
The same rule applies when multiple ORs are mixed with AND.
Example: W AND X OR Y OR Z → W AND (X OR Y OR Z)
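The precedence rule above (OR groups bind before AND, so W AND X OR Y OR Z reads as W AND (X OR Y OR Z)) can be sketched as a small evaluator; the function name and substring matching are illustrative assumptions, not the site's actual implementation:

```python
def matches(text, query):
    """Evaluate a query such as 'W AND X OR Y OR Z' against text,
    using the site's rule that OR binds tighter than AND:
    W AND X OR Y OR Z  ->  W AND (X OR Y OR Z).
    Hypothetical helper; terms are plain substring checks."""
    # Splitting on " AND " first makes each piece an OR-group,
    # which is exactly the stated precedence.
    for group in query.split(" AND "):
        alternatives = group.split(" OR ")
        if not any(alt in text for alt in alternatives):
            return False
    return True

assert matches("alpha beta", "alpha AND beta")
assert matches("alpha gamma", "alpha AND beta OR gamma")   # alpha AND (beta OR gamma)
assert not matches("beta gamma", "alpha AND beta OR gamma")
```

Splitting on " AND " before " OR " is what encodes the precedence: every AND-separated piece becomes one OR-group that must match as a whole.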

How to use the search filters

Use the "search filters" to narrow down the results, for example when there are too many of them. Checking an item restricts the results to only the data that includes that item.
The number in parentheses after each item is the count of data entries that include it.

Search tips

When searching by author name, enter the first and last name separated by a space, such as "Taro Jidosha".