This paper presents the first research attempt to dynamically optimize the CORDIC algorithm's iteration count using artificial intelligence. Conventional approaches rely on a fixed number of iterations, which frequently results in redundant computation and longer processing times. Our method uses machine learning regression models to predict a near-optimal iteration count for a given input angle, drastically reducing the number of iterations without compromising accuracy. The reduced computational complexity and faster execution increase overall efficiency. We optimized the hyperparameters of several models, including Random Forest, XGBoost, and the Support Vector Machine (SVM) Regressor, using Grid Search and Cross-Validation. Experimental results show that the SVM Regressor performs best, with a mean absolute error of 0.045 and an R² score of 0.998. This AI-driven dynamic iteration prediction thus offers a promising route to efficient and adaptable CORDIC implementations in real-time digital signal processing applications.
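The paper's own implementation is not reproduced here, but a minimal sketch of the pipeline the abstract describes could look as follows: label each angle with the smallest CORDIC iteration count that meets an error tolerance, fit an SVM regressor tuned with grid search and cross-validation, and score it with MAE and R². The helper min_cordic_iterations, the 1e-5 tolerance, the angle range, the dataset size, and the hyperparameter grid are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (assumptions noted above), not the authors' published code.
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error, r2_score

def min_cordic_iterations(theta, tol=1e-5, max_iter=32):
    """Smallest iteration count whose CORDIC sine/cosine error falls below tol (hypothetical labeling rule)."""
    angles = np.arctan(2.0 ** -np.arange(max_iter))  # elementary rotation angles arctan(2^-i)
    for n in range(1, max_iter + 1):
        # Scale-factor compensation for n micro-rotations.
        K = np.prod(1.0 / np.sqrt(1.0 + 2.0 ** (-2.0 * np.arange(n))))
        x, y, z = 1.0, 0.0, theta
        for i in range(n):
            d = 1.0 if z >= 0 else -1.0          # rotation direction from residual angle
            x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
            z -= d * angles[i]
        if abs(K * x - np.cos(theta)) < tol and abs(K * y - np.sin(theta)) < tol:
            return n
    return max_iter

# Illustrative training data: angles inside the CORDIC convergence range.
thetas = np.random.default_rng(0).uniform(-np.pi / 2, np.pi / 2, 2000)
iters = np.array([min_cordic_iterations(t) for t in thetas])

X_train, X_test, y_train, y_test = train_test_split(
    thetas.reshape(-1, 1), iters, test_size=0.2, random_state=0)

# Grid Search + Cross-Validation over an assumed hyperparameter grid.
grid = GridSearchCV(
    SVR(kernel="rbf"),
    {"C": [1, 10, 100], "gamma": ["scale", 0.1, 1.0], "epsilon": [0.01, 0.1]},
    cv=5, scoring="neg_mean_absolute_error")
grid.fit(X_train, y_train)

pred = grid.predict(X_test)
print("MAE:", mean_absolute_error(y_test, pred))
print("R2 :", r2_score(y_test, pred))
```

In a deployment, the regressor's continuous prediction would typically be rounded up to the next integer so that the selected iteration count never undershoots the accuracy target.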
Keywords: Accuracy; Embedded systems; Coordinate transformations; Artificial intelligence; Digital signal processing; Support vector machines; Regression models; Neural networks; Signal processing; Mathematical functions; Medical equipment; Design; Energy efficiency; Algorithms; Computer graphics; Digital signal processors; Real time; Robotics
; Franzoni, Valentina 2; Milani, Alfredo 3; Randieri, Cristian 4
1 Amrita School of Artificial Intelligence, Amrita Vishwa Vidyapeetham, Coimbatore 641112, India; [email protected] (R.S.); [email protected] (L.C.R.)
2 Department of Mathematics and Computer Science, University of Perugia, 06123 Perugia, Italy; [email protected]
3 Department of Human Sciences, Link Campus University, 00165 Roma, Italy
4 Department of Theoretical and Applied Sciences, eCampus University, Via Isimbardi 10, 22060 Novedrate, Italy; [email protected]