Friday Jun 13, 2025

Knowledge Distillation for Modulation Classification in Resource-Constrained Devices

Automated modulation classification (AMC) is crucial in electronic warfare because it enhances situational awareness and enables prompt responses to hostile transmissions. This research addresses the challenges of deploying AMC in computationally constrained environments by proposing response-based knowledge distillation (KD) to train compact yet accurate AMC models. The methodology is built on a framework that integrates a signal-based convolutional neural network (SBCNN) and an image-based convolutional neural network (IBCNN). The SBCNN extracts features from preprocessed signal data, which are subsequently used to train the IBCNN; the IBCNN then serves as the teacher, and its responses guide the training of the compact SBCNN student. Experimental results indicate that the SBCNN, when trained with teacher distillation, achieves superior performance compared to its standalone counterpart. Our findings show that KD has significant potential to enhance AMC performance in real-time applications by balancing computational demands with classification accuracy.
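
To make the response-based KD setup concrete, the sketch below pairs an image-based teacher with a compact signal-based student and blends the teacher's temperature-softened responses with the ground-truth modulation labels. This is a minimal PyTorch sketch, not the authors' implementation: the model names, the choice of I/Q sequences as student input and image representations as teacher input, and the temperature and weighting values are illustrative assumptions.

    # Minimal sketch of response-based knowledge distillation.
    # Assumes a pretrained image-based teacher (ibcnn_teacher) and a
    # compact signal-based student (sbcnn_student); all names and
    # hyperparameters here are illustrative, not from the paper's code.
    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels,
                          temperature=4.0, alpha=0.5):
        """Blend the teacher's soft responses with the hard labels."""
        # Soft targets: KL divergence between temperature-scaled distributions.
        soft = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=1),
            F.softmax(teacher_logits / temperature, dim=1),
            reduction="batchmean",
        ) * (temperature ** 2)
        # Hard targets: cross-entropy against the true modulation classes.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

    def train_step(sbcnn_student, ibcnn_teacher, signals, images, labels, optimizer):
        """One KD update: the student sees signals, the teacher sees images."""
        ibcnn_teacher.eval()
        with torch.no_grad():
            teacher_logits = ibcnn_teacher(images)   # teacher responses (logits)
        student_logits = sbcnn_student(signals)      # compact student prediction
        loss = distillation_loss(student_logits, teacher_logits, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

The temperature softens the teacher's output distribution so the student can learn from the relative similarities between modulation classes, not just the top prediction; the factor temperature**2 keeps the soft-target gradient magnitude comparable to the hard-label term.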


Pedro Marcio Raposo Pereira, Felipe Augusto Pereira de Figueiredo, Rausley Adriano Amaral de Souza, National Institute of Telecommunications (Inatel)

