2025-01-31
On Selecting Activation Functions for Neural Network-Based Digital Predistortion Models
By Mostapha Ouadefli, Mohamed Et-tolba, Abdelwahed Tribak, and Tomas Fernandez Ibanez
Progress In Electromagnetics Research C, Vol. 152, 111-120, 2025
Abstract
Neural networks have attracted considerable attention for their ability to capture the complex nonlinear characteristics of power amplifiers (PAs) and to facilitate the design of digital predistortion (DPD) circuits. This capability stems from the nonlinear activation functions (AFs) that form the cornerstone of a neural network architecture. In this paper, we investigate the influence of eight carefully selected AFs on the performance of neural network-based DPD. In particular, we explore their interaction with both the depth and the width of the neural network. In addition, we provide an extensive performance analysis using two crucial metrics: the normalized mean square error (NMSE) and the adjacent channel power ratio (ACPR). Among the AFs under consideration, our findings highlight the superiority of the exponential linear unit (ELU) AF, particularly within deep neural network (DNN) frameworks.
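As a rough illustration (not taken from the paper), the ELU activation function and the NMSE metric named in the abstract can be sketched in Python; the function names and the `alpha` parameter value are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def elu(x, alpha=1.0):
    # ELU: identity for x > 0, alpha * (exp(x) - 1) otherwise.
    # alpha = 1.0 is the common default; the paper may use other values.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def nmse_db(model_output, reference):
    # NMSE in dB: error energy normalized by the reference signal energy.
    # Works for real or complex baseband samples.
    err = model_output - reference
    return 10.0 * np.log10(np.sum(np.abs(err) ** 2) / np.sum(np.abs(reference) ** 2))
```

A more negative NMSE indicates a closer match between the DPD model output and the reference PA behavior; for example, a 10% amplitude error on a unit-amplitude sample yields roughly -20 dB.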
Citation
Mostapha Ouadefli, Mohamed Et-tolba, Abdelwahed Tribak, and Tomas Fernandez Ibanez, "On Selecting Activation Functions for Neural Network-Based Digital Predistortion Models," Progress In Electromagnetics Research C, Vol. 152, 111-120, 2025.
doi:10.2528/PIERC24120508