Refinement of Extended Accelerated Over-Relaxation Method for Solution of Linear Systems

Given any linear stationary iterative method of the form z = Jz + f, where J is the iteration matrix, any improvement of the iteration matrix that decreases its spectral radius enhances the rate of convergence of the method when solving systems of linear equations of the form Az = b. This motivates us to refine the Extended Accelerated Over-Relaxation (EAOR) method into the Refinement of Extended Accelerated Over-Relaxation (REAOR) method so as to accelerate its convergence rate. In this paper, a refinement of the Extended Accelerated Over-Relaxation method that minimizes the spectral radius, when compared with the EAOR method, is proposed. The method is a three-parameter generalization of the refinement of Accelerated Over-Relaxation (RAOR) method, the refinement of Successive Over-Relaxation (RSOR) method, the refinement of Gauss-Seidel (RGS) method and the refinement of Jacobi (RJ) method. We investigate the convergence of the method for weak irreducible diagonally dominant matrices, L-matrices and M-matrices, and present some numerical examples to assess the performance of the method. The results indicate the superiority of the method over some existing methods.


Introduction
The solution of linear systems is one of the most important aspects of solving physical problems encountered in fields such as science, engineering and the biological sciences. This is due to the fact that it is virtually impossible to perform any sort of numerical analysis without coming across linear algebraic equations (Laskar and Behera, 2014). Moreover, the sets of linear equations generated from partial differential equations are mostly sparse and large, and consume considerable computation time and resources. Direct and iterative methods are usually employed to solve such linear systems, and exploiting properties of the coefficient matrices, such as sparseness, usually makes it possible to reduce storage, run time and cost (Kiusalaas, 2005). Iterative methods are quite effective and usually preferable to direct methods for finding solutions of sparse or large linear systems. Modification of existing iterative methods in order to achieve a higher rate of convergence has led to several refinement methods. Dafchahi (2008) introduced the refinement of Jacobi (RJ) method and proved that it is superior to the Jacobi method. Vatti and Eneyew (2011) and Genanew (2016) enhanced the convergence rate of the Gauss-Seidel method by developing the refinement of Gauss-Seidel (RGS) method. Kyurkhiev and Iliev (2013) modified the SOR and SSOR schemes by proposing the refinement of Successive Over-Relaxation (RSOR) and refinement of Symmetric Successive Over-Relaxation (RSSOR) methods. Vatti et al. (2015) improved on the SOR scheme and proposed the RSOR method. Vatti et al. (2018) sought to magnify the convergence rate of the AOR method through the refinement of Accelerated Over-Relaxation (RAOR) method. Muleta and Gofe (2018) developed a refinement of the generalized Accelerated Over-Relaxation (RGAOR) method. Recently, Audu et al. (2021) introduced the Extended Accelerated Over-Relaxation (EAOR) method, an efficient three-parameter method that generalizes the AOR, SOR, Jacobi and Gauss-Seidel methods and is superior to the methods mentioned above.
This research work is aimed at increasing the rate of convergence of the Extended Accelerated Over-Relaxation (EAOR) method by refining it into a more efficient method that hastens the numerical solution of linear systems. We consider the linear system expressed as

Az = b    (1)

where A = [a_ij] is the coefficient matrix, z is the vector of unknowns and b is the constant right-hand-side vector. Many methods are employed in solving (1) through the splitting

A = D - L - U    (2)

where U, L and D are the strictly upper triangular, strictly lower triangular and diagonal parts of A respectively. Furthermore, premultiplying (1) by D^(-1) transforms the system of equations in (1) into

Ā z = b̄    (5)

where Ā = D^(-1) A and b̄ = D^(-1) b. The popular AOR method by Hadjidimos (1978) for the numerical solution of (1) is expressed as

z^(k+1) = (D - rL)^(-1) [(1 - ω)D + (ω - r)L + ωU] z^(k) + ω (D - rL)^(-1) b,

where r and ω are the acceleration and over-relaxation parameters. The EAOR method extends this scheme with a third parameter s; L_{r,s,ω} represents the EAOR iteration matrix, and the spectral radius of the iteration matrix is denoted ρ(L_{r,s,ω}). We shall consider weak irreducible diagonally dominant, L- and (or) M-matrices for the coefficient matrix in (1).
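As a concrete illustration of the splitting (2) and the AOR scheme above, the following Python sketch assembles the AOR iteration matrix (D - rL)^(-1)[(1 - ω)D + (ω - r)L + ωU] and iterates to convergence. The test matrix, right-hand side and parameter values are illustrative choices only, not taken from this paper.

```python
import numpy as np

# Minimal sketch of the AOR iteration
#   z^(k+1) = (D - rL)^{-1}[(1-w)D + (w-r)L + wU] z^(k) + w (D - rL)^{-1} b,
# using the splitting A = D - L - U (D diagonal, L/U strictly lower/upper).
def aor_solve(A, b, r=0.5, w=1.0, tol=1e-10, max_iter=500):
    D = np.diag(np.diag(A))
    L = -np.tril(A, -1)          # strictly lower part (sign convention A = D - L - U)
    U = -np.triu(A, 1)           # strictly upper part
    M = np.linalg.inv(D - r * L)
    T = M @ ((1 - w) * D + (w - r) * L + w * U)   # AOR iteration matrix
    c = w * M @ b
    x = np.zeros_like(b, dtype=float)
    for k in range(max_iter):
        x_new = T @ x + c
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter

# Small diagonally dominant example system (illustrative values only).
A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([2.0, 4.0, 10.0])
x, iters = aor_solve(A, b)
```

With w = 1 and r = 0 this reduces to the Jacobi scheme, and with w = r it reduces to SOR, which is the sense in which AOR generalizes those methods.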

The REAOR Method
Considering the regular splitting of Ā to solve (1) by a refinement iterative method, we multiply the linear system Ā z = b̄ by the over-relaxation parameter ω to obtain (15); the REAOR formula then takes the form of the EAOR iterate followed by a residual-correction step, where z^(k+1) appearing on the right-hand side is the (k+1)th estimate of the EAOR iterative method. The EAOR method in (7) is inserted into (16) to obtain (19), which is called the REAOR method, with iteration matrix denoted L*_{r,s,ω}. The spectral radius of the REAOR method is the largest eigenvalue in modulus of its iteration matrix, denoted ρ(L*_{r,s,ω}). With specific values of the parameters r, s and ω, some refinement methods can be obtained as special cases, such as the RAOR, RSOR, RGS and RJ methods.

Definition 4: A square matrix A = (a_ij)_{n×n} is an M-matrix if a_ij ≤ 0 for i ≠ j, a_ii > 0 for i = 1, 2, ⋯, n, A is non-singular and A^(-1) ≥ 0.
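The refinement pattern underlying REAOR can be sketched with its simplest special case, the refinement of Jacobi (RJ) method, in which a base-method iterate is corrected with the scaled residual, x + D^(-1)(b - Ax). The exact REAOR correction involves the three parameters r, s and ω and is not reproduced here; the sketch below, on hypothetical data, only illustrates the general refinement idea.

```python
import numpy as np

# Refinement in the style of the refinement of Jacobi (RJ) method:
# take one base iterate x_base, then correct it with the scaled residual
#   x_refined = x_base + D^{-1} (b - A x_base).
def jacobi_step(A, b, x):
    D_inv = 1.0 / np.diag(A)
    return x + D_inv * (b - A @ x)          # one Jacobi sweep

def refined_jacobi_step(A, b, x):
    x_base = jacobi_step(A, b, x)           # ordinary Jacobi iterate
    D_inv = 1.0 / np.diag(A)
    return x_base + D_inv * (b - A @ x_base)  # refinement (residual correction)

# Illustrative 2x2 system with exact solution (1, 2).
A = np.array([[5.0, 1.0], [2.0, 6.0]])
b = np.array([7.0, 14.0])
x_plain = np.zeros(2)
x_refined = np.zeros(2)
for _ in range(20):
    x_plain = jacobi_step(A, b, x_plain)
    x_refined = refined_jacobi_step(A, b, x_refined)
```

After the same number of outer iterations, the refined iterate has a much smaller residual, since each refinement step performs an extra correction with the current residual.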

Corollary 1.
Suppose A is an L- or M-matrix and the parameters satisfy 0 < r + s ≤ ω ≤ 1; then the EAOR method is convergent for any arbitrary initial approximation z^(0). Proof: see Audu et al. (2021).

Theorem 1: If the matrix A is an L- or M-matrix, then the Refinement of Extended Accelerated Over-Relaxation (REAOR) method converges to the exact solution for any initial guess z^(0).

Proof: Let z be the true solution of Az = b. Since the coefficient matrix A is an M-matrix, it follows from Corollary 1 that the EAOR method is convergent. The EAOR iterate z^(k+1) can therefore be brought arbitrarily close to z, hence ‖z - z̄^(k+1)‖ → 0 as k → ∞, which shows that REAOR is a convergent method, and the proof is completed.

Theorem 2: Let A be weak irreducible diagonally dominant. Then for any choice of initial guess z^(0), the REAOR method converges to the true solution z.

Proof: Let z be the true solution of (1) and let A = (a_ij) be weak irreducible diagonally dominant. Then, in view of Corollary 2, the EAOR method is convergent, and applying the procedure of Theorem 1 shows that the REAOR method is convergent for weak irreducible diagonally dominant matrices. This completes the proof.
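The convergence criterion used throughout, ρ(T) < 1 for the iteration matrix T, can be checked numerically. The sketch below does this for the Gauss-Seidel iteration matrix (D - L)^(-1) U, a classical special case of the family of methods considered here, on a small illustrative M-matrix; the matrix entries are assumptions for demonstration, not data from the paper.

```python
import numpy as np

# Numerical check of the convergence criterion: a stationary scheme
# x^(k+1) = T x^(k) + c converges for every starting vector iff the
# spectral radius rho(T) = max |eigenvalue of T| is below 1.
A = np.array([[4.0, -1.0, -1.0],
              [-2.0, 5.0, -1.0],
              [-1.0, -1.0, 3.0]])   # off-diagonal <= 0, strictly diagonally
                                    # dominant with positive diagonal -> M-matrix
D_minus_L = np.tril(A)              # D - L (lower triangle including diagonal)
U = -np.triu(A, 1)                  # strictly upper part, with A = D - L - U
T_gs = np.linalg.solve(D_minus_L, U)   # Gauss-Seidel iteration matrix (D - L)^{-1} U
rho = np.max(np.abs(np.linalg.eigvals(T_gs)))
```

For an M-matrix, the inverse is entrywise nonnegative and the spectral radius of such splittings lies below 1, which is what the convergence theorems above exploit.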

Numerical Experiment
In this section we present some numerical tests of the new method. Specifically, we computed the spectral radius of the REAOR iteration matrix and further obtained its convergence results. We compared the results obtained with those of the Accelerated Over-Relaxation, Refinement of Accelerated Over-Relaxation and Extended Accelerated Over-Relaxation methods to test the efficiency of the new method. The computations were carried out using Maple 2017 software with an accuracy of 10 decimal places, and the results are presented in Tables 1 to 4.
Problem 1: Solve the linear system whose coefficient matrix is an irreducible diagonally dominant matrix by the AOR, RAOR, EAOR and REAOR methods.
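The hypothesis of Problem 1 can be verified programmatically. The sketch below checks weak diagonal dominance, |a_ii| ≥ Σ_{j≠i} |a_ij| in every row with strict inequality in at least one row; the matrix shown is an illustrative stand-in, since the actual 9×9 system of Problem 1 is not reproduced here.

```python
import numpy as np

# Check weak diagonal dominance: every diagonal entry dominates (>=) the sum
# of the absolute off-diagonal entries in its row, strictly in at least one row.
def is_weakly_diagonally_dominant(A):
    diag = np.abs(np.diag(A))
    off = np.sum(np.abs(A), axis=1) - diag
    return bool(np.all(diag >= off) and np.any(diag > off))

# Illustrative 3x3 matrix (not the paper's test matrix).
A_demo = np.array([[3.0, -1.0, -1.0],
                   [-1.0, 3.0, -1.0],
                   [-1.0, -1.0, 3.0]])
ok = is_weakly_diagonally_dominant(A_demo)
```

Irreducibility would additionally require that the directed graph of the matrix be strongly connected, which is usually verified separately.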

Discussion of Results
The rate of convergence is strongly connected to the spectral radius: the closer the spectral radius is to zero, the faster the convergence. Comparison results for the spectral radii of the new REAOR method and other existing methods for Problems 1 and 2 are displayed in Tables 1 and 2 respectively. It is observed that the spectral radius of the new method is lower than the spectral radii of the EAOR, RAOR and AOR methods, since ρ(REAOR) < ρ(RAOR) < ρ(EAOR) < ρ(AOR) < 1. This implies that the REAOR method will converge faster to the true solution than the EAOR, AOR and RAOR methods. Figures 1 and 2 are graphical illustrations of the results in Tables 1 and 2 for easy understanding and comprehension, where "Vatti et al. (2018)" in Figure 2 represents the RAOR method.
To validate the spectral radius results, we examined the convergence results for Problems 1 and 2 given in Tables 3 and 4. The REAOR method converges at the 29th and 22nd iterations, in comparison with the 56th and 43rd iterations of the EAOR method, the 41st and 31st iterations of the RAOR method, and the 81st and 60th iterations of the AOR method, for Problems 1 and 2 respectively. The results show that the new REAOR method converges faster to the true solution and takes a shorter time to achieve convergence when compared with similar existing methods.
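The link between spectral radius and iteration count can be made quantitative: the error after k steps behaves asymptotically like ρ^k, so reaching a tolerance ε requires roughly log(ε)/log(ρ) iterations. The sketch below uses illustrative radii (not the paper's computed values) to show how a smaller spectral radius translates into fewer iterations, the pattern observed in Tables 3 and 4.

```python
import numpy as np

# Rough iteration-count estimate for a stationary scheme with spectral
# radius rho: the error shrinks like rho^k, so rho^k <= eps gives
# k >= log(eps) / log(rho).
def estimated_iterations(rho, eps=1e-10):
    return int(np.ceil(np.log(eps) / np.log(rho)))

k_fast = estimated_iterations(0.45)   # smaller radius (REAOR-like, assumed value)
k_slow = estimated_iterations(0.75)   # larger radius (AOR-like, assumed value)
```

This is only an asymptotic estimate; the actual counts also depend on the initial guess and the non-normality of the iteration matrix.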

Conclusion
Iterative methods are often employed in solving large sparse linear systems. Although convergence is a major requirement of such methods, how fast they converge is equally important. In this study, a new numerical iterative method for solving linear systems, the Refinement of Extended Accelerated Over-Relaxation (REAOR) method, has been proposed. Convergence criteria for some special matrices were examined, and two numerical examples of 9 × 9 and 10 × 10 linear systems were presented. Comparison of the new method with other existing methods in the literature, by their spectral radii and numbers of iterations, indicates that the REAOR method surpasses the other methods. The study has also shown that the refinement increases the convergence rate of the Extended Accelerated Over-Relaxation method.