TY - JOUR
T1 - TFKAN: Transformer based on Kolmogorov–Arnold Networks for Intrusion Detection in IoT environment
AU - Fares, Ibrahim A.
AU - Abd Elaziz, Mohamed
AU - Aseeri, Ahmad O.
AU - Zied, Hamed Shawky
AU - Abdellatif, Ahmed G.
N1 - Publisher Copyright:
© 2025 The Authors
PY - 2025/6
Y1 - 2025/6
N2 - This work proposes a novel Transformer based on the Kolmogorov–Arnold Network (TFKAN) model for Intrusion Detection Systems (IDS) in the IoT environment. The TFKAN Transformer is built by replacing the Multi-Layer Perceptron (MLP) layers with Kolmogorov–Arnold Network (KAN) layers. Unlike the MLP feed-forward layer, KAN layers have no fixed weights but use learnable univariate function components, enabling a more compact representation; a KAN can therefore achieve comparable performance with fewer trainable parameters than a larger MLP. The RT-IoT2022, IoT23, and CICIoT2023 datasets were used in the evaluation process. The proposed TFKAN Transformer outperforms its MLP-based counterpart, obtaining accuracy scores of 99.96%, 98.43%, and 99.27% on the RT-IoT2022, IoT23, and CICIoT2023 datasets, respectively. The results indicate that the developed KAN-based Transformer shows promising performance for IDS in IoT environments compared with MLP layers. Transformers based on KANs are, on average, 78% lighter in parameter count than Transformers using MLPs, making KANs a promising replacement for MLPs.
AB - This work proposes a novel Transformer based on the Kolmogorov–Arnold Network (TFKAN) model for Intrusion Detection Systems (IDS) in the IoT environment. The TFKAN Transformer is built by replacing the Multi-Layer Perceptron (MLP) layers with Kolmogorov–Arnold Network (KAN) layers. Unlike the MLP feed-forward layer, KAN layers have no fixed weights but use learnable univariate function components, enabling a more compact representation; a KAN can therefore achieve comparable performance with fewer trainable parameters than a larger MLP. The RT-IoT2022, IoT23, and CICIoT2023 datasets were used in the evaluation process. The proposed TFKAN Transformer outperforms its MLP-based counterpart, obtaining accuracy scores of 99.96%, 98.43%, and 99.27% on the RT-IoT2022, IoT23, and CICIoT2023 datasets, respectively. The results indicate that the developed KAN-based Transformer shows promising performance for IDS in IoT environments compared with MLP layers. Transformers based on KANs are, on average, 78% lighter in parameter count than Transformers using MLPs, making KANs a promising replacement for MLPs.
KW - Cybersecurity
KW - Intrusion detection
KW - Kolmogorov–Arnold Networks (KANs)
KW - Multi-Layer Perceptrons (MLPs)
KW - Transformers
UR - http://www.scopus.com/inward/record.url?scp=105002152898&partnerID=8YFLogxK
U2 - 10.1016/j.eij.2025.100666
DO - 10.1016/j.eij.2025.100666
M3 - Article
AN - SCOPUS:105002152898
SN - 1110-8665
VL - 30
JO - Egyptian Informatics Journal
JF - Egyptian Informatics Journal
M1 - 100666
ER -