An Improved Convergence Condition of the MMS Iteration Method for Horizontal LCP of $H_+$-Matrices
In this paper, inspired by the previous work in (Appl. Math. Comput., 369 (2020) 124890), we focus on the convergence condition of the modulus-based matrix splitting (MMS) iteration method for solving the horizontal linear complementarity problem (HLCP) with $H_+$-matrices. An improved convergence condition of the MMS iteration method is given, which enlarges the range of applications of the method beyond that of the above published article.
Keywords: horizontal linear complementarity problem; $H_+$-matrix; MMS iteration method
1. Introduction
As is known, the horizontal linear complementarity problem (HLCP), for given matrices $A, B \in \mathbb{R}^{n \times n}$, is to find two vectors $z, w \in \mathbb{R}^{n}$ that satisfy

$$Az - Bw = q, \quad z \ge 0, \quad w \ge 0, \quad z^{T}w = 0, \qquad (1)$$

where $q \in \mathbb{R}^{n}$ is given. If $B = I$ in (1), where $I$ denotes the identity matrix, the HLCP (1) is no other than the classical linear complementarity problem (LCP) in [[1]]. This implies that the HLCP (1) is a general form of the LCP.
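To make the three requirements in (1) concrete, the following minimal sketch checks whether a candidate pair solves a given HLCP. The 2-by-2 matrices and the pair $(z, w)$ are an illustrative instance of our own choosing, not data from this paper.

```python
# Verify that a pair (z, w) solves the HLCP: Az - Bw = q, z >= 0, w >= 0, z^T w = 0.
# The 2x2 data below is an illustrative instance, not taken from the paper.

def matvec(M, v):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(m_ij * v_j for m_ij, v_j in zip(row, v)) for row in M]

def solves_hlcp(A, B, q, z, w, tol=1e-12):
    """Check the three HLCP conditions for the candidate pair (z, w)."""
    residual = [az - bw - qi for az, bw, qi in zip(matvec(A, z), matvec(B, w), q)]
    nonneg = all(zi >= -tol for zi in z) and all(wi >= -tol for wi in w)
    complementary = abs(sum(zi * wi for zi, wi in zip(z, w))) <= tol
    return max(abs(r) for r in residual) <= tol and nonneg and complementary

A = [[4.0, -1.0], [-1.0, 4.0]]
B = [[3.0, 0.0], [0.0, 3.0]]
z = [1.0, 0.0]          # z and w are complementary: z_i * w_i = 0 componentwise
w = [0.0, 2.0]
q = [az - bw for az, bw in zip(matvec(A, z), matvec(B, w))]  # q = Az - Bw by construction

print(solves_hlcp(A, B, q, z, w))  # True
```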
The HLCP (1), used as a useful tool, often arises in a diverse range of fields, including transportation science, telecommunication systems, structural mechanics, mechanical and electrical engineering, and so on, see [[2], [4], [6]]. In the past several years, some efficient algorithms have been designed to solve the HLCP (1), such as the interior point method [[8]], the neural network [[9]], and so on. Particularly, in [[10]], the modulus-based matrix splitting (MMS) iteration method in [[11]] was adopted to solve the HLCP (1). In addition, the partial motivation of the present paper is from complex systems with matrix formulation, see [[12], [14]] for more details.
Recently, Zheng and Vong [[15]] further discussed the MMS method, as described below.
The MMS method [[10], [15]]. Let $\Omega \in \mathbb{R}^{n \times n}$ be a positive diagonal matrix and $\gamma > 0$, and let $A = M_A - N_A$ and $B = M_B - N_B$ be splittings of the matrices $A$ and $B$, respectively. Assume that $x^{(0)} \in \mathbb{R}^{n}$ is an arbitrary initial vector. For $k = 0, 1, 2, \ldots$ until the iteration sequence converges, compute $(z^{(k+1)}, w^{(k+1)})$ by

$$z^{(k+1)} = \frac{1}{\gamma}\left(|x^{(k+1)}| + x^{(k+1)}\right), \quad w^{(k+1)} = \frac{1}{\gamma}\,\Omega\left(|x^{(k+1)}| - x^{(k+1)}\right), \qquad (2)$$

where $x^{(k+1)}$ is obtained by solving

$$(M_A + M_B\Omega)\,x^{(k+1)} = (N_A + N_B\Omega)\,x^{(k)} + (B\Omega - A)\,|x^{(k)}| + \gamma q. \qquad (3)$$
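The modulus idea behind the MMS method can be illustrated as follows: writing $z = \frac{1}{\gamma}(|x| + x)$ and $w = \frac{1}{\gamma}\Omega(|x| - x)$ turns the HLCP into the absolute value equation $(A + B\Omega)x = (B\Omega - A)|x| + \gamma q$, which a matrix splitting converts into a fixed-point iteration. The sketch below is a simplified instance of this construction (cf. [[10], [11]]), not a transcription of the paper's exact scheme: it uses the trivial splittings $M_A = A$, $M_B = B$, the choices $\gamma = 2$ and $\Omega = I$, and illustrative 2-by-2 data of our own choosing, so each step solves a full linear system.

```python
# Modulus-based fixed-point sketch for the HLCP (illustrative data, gamma = 2, Omega = I):
# with z = (|x| + x)/gamma and w = Omega(|x| - x)/gamma, the HLCP Az - Bw = q becomes
# (A + B*Omega) x = (B*Omega - A)|x| + gamma*q, solved here by repeated substitution.

def solve2(M, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(b[0] * M[1][1] - M[0][1] * b[1]) / det,
            (M[0][0] * b[1] - b[0] * M[1][0]) / det]

A = [[4.0, -1.0], [-1.0, 4.0]]
B = [[3.0, 0.0], [0.0, 3.0]]          # Omega = I, so B*Omega = B
q = [4.0, -7.0]                        # built from the solution z* = (1, 0), w* = (0, 2)
gamma = 2.0

ApB = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]   # A + B*Omega
BmA = [[B[i][j] - A[i][j] for j in range(2)] for i in range(2)]   # B*Omega - A

x = [0.0, 0.0]
for _ in range(100):
    rhs = [sum(BmA[i][j] * abs(x[j]) for j in range(2)) + gamma * q[i] for i in range(2)]
    x = solve2(ApB, rhs)

z = [(abs(xi) + xi) / gamma for xi in x]   # recovered z, approaches (1, 0)
w = [(abs(xi) - xi) / gamma for xi in x]   # recovered w (Omega = I), approaches (0, 2)
print([round(v, 6) for v in z], [round(v, 6) for v in w])  # [1.0, 0.0] [0.0, 2.0]
```

For this data the iteration matrix has infinity norm 1/4, so the scheme contracts and converges from any starting vector.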
For the later discussion, we review some preliminaries. For a square matrix $A = (a_{ij}) \in \mathbb{R}^{n \times n}$, we write $|A| = (|a_{ij}|)$, and let $\langle A \rangle = (\langle a \rangle_{ij})$ denote its comparison matrix, where $\langle a \rangle_{ii} = |a_{ii}|$ and $\langle a \rangle_{ij} = -|a_{ij}|$ for $i \neq j$. A matrix $A$ is called a non-singular M-matrix if $a_{ij} \le 0$ for $i \neq j$ and $A^{-1} \ge 0$; an H-matrix if its comparison matrix $\langle A \rangle$ is a non-singular M-matrix; an $H_+$-matrix if it is an H-matrix with positive diagonal entries; and a strictly diagonally dominant (s.d.d.) matrix if $|a_{ii}| > \sum_{j \neq i} |a_{ij}|$ for all $i$. In addition, for $A, B \in \mathbb{R}^{n \times n}$, $A \ge B$ means $a_{ij} \ge b_{ij}$ for all $i, j$.
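These definitions reduce to finite checks once the comparison matrix is formed. For the non-singular M-matrix property, the sketch below uses a standard characterization (see [[16]]): a Z-matrix $M$ is a non-singular M-matrix if the system $Mx = e$, with $e = (1, \ldots, 1)^T$, has a positive solution $x$. The helper names and the sample matrix are our own illustrations.

```python
# Finite checks for the matrix classes defined above (illustrative 2x2 data of our own).

def comparison(A):
    """Comparison matrix <A>: |a_ii| on the diagonal, -|a_ij| off the diagonal."""
    n = len(A)
    return [[abs(A[i][j]) if i == j else -abs(A[i][j]) for j in range(n)] for i in range(n)]

def is_sdd(A):
    """Strict diagonal dominance: |a_ii| > sum of |a_ij|, j != i, in every row."""
    return all(abs(A[i][i]) > sum(abs(A[i][j]) for j in range(len(A)) if j != i)
               for i in range(len(A)))

def solve2(M, b):
    """Solve a 2x2 linear system by Cramer's rule (enough for these examples)."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(b[0] * M[1][1] - M[0][1] * b[1]) / det,
            (M[0][0] * b[1] - b[0] * M[1][0]) / det]

def is_h_plus_2x2(A):
    """H+-matrix test for a 2x2 matrix: positive diagonal, and <A> a non-singular
    M-matrix, checked via a positive solution of <A> x = e (assumes <A> non-singular)."""
    if not all(A[i][i] > 0 for i in range(2)):
        return False
    x = solve2(comparison(A), [1.0, 1.0])
    return all(xi > 0 for xi in x)

A = [[4.0, -1.0], [-2.0, 5.0]]
print(comparison(A))                 # [[4.0, -1.0], [-2.0, 5.0]]  (here <A> = A)
print(is_sdd(A), is_h_plus_2x2(A))   # True True
```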
For the MMS method with $H_+$-matrices, two new convergence conditions were obtained in [[15]], which are weaker than the corresponding convergence conditions in [[10]]. One of them is given below.
Theorem 1 ([[15]]). Assume that $A, B \in \mathbb{R}^{n \times n}$ are two $H_+$-matrices and the positive diagonal matrix $\Omega$ with , let $A = M_A - N_A$ be an H-splitting of $A$, $B = M_B - N_B$ be an H-compatible splitting of $B$, and be an $H_+$-matrix. Then the MMS method is convergent, provided one of the following conditions holds:

(a) ;

(b) ,
(4)
with and , where , $D$ and are positive diagonal matrices such that and are two strictly diagonally dominant (s.d.d.) matrices.
At present, the difficulty in Theorem 1 is to check condition (4). Moreover, condition (4) of Theorem 1 is limited by the parameter $k$: if the choice of $k$ is improper, then condition (4) cannot be used to guarantee the convergence of the MMS method. To overcome this drawback, the purpose of this paper is to provide an improved convergence condition of the MMS method for solving the HLCP of $H_+$-matrices, which enlarges the range of its applications beyond that of Theorem 1 [[15]].
2. An Improved Convergence Condition
In fact, by investigating condition (b) of Theorem 1, we find that the left inequality in (4) may have a flaw. In particular, when the choice of $k$ is improper, condition (b) of Theorem 1 cannot be used to guarantee the convergence of the MMS method. For instance, we consider the two matrices
To make A and B satisfy the convergence conditions of Theorem 1, we take
By simple computations,

Hence, is a non-singular M-matrix, so that $A = M_A - N_A$ is an H-splitting. On the other hand, , so that $B = M_B - N_B$ is an H-compatible splitting.
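Both splitting notions can be verified numerically: $A = M - N$ is an H-splitting if $\langle M \rangle - |N|$ is a non-singular M-matrix, and an H-compatible splitting if $\langle A \rangle = \langle M \rangle - |N|$. The sketch below checks both for the Jacobi splitting $M = \mathrm{diag}(A)$ of a small $H_+$-matrix of our own choosing; the matrices of this section's example are not reproduced here.

```python
# Check the splitting notions: A = M - N is an H-splitting if <M> - |N| is a
# non-singular M-matrix, and an H-compatible splitting if <A> = <M> - |N|.
# The Jacobi splitting of a small H+-matrix below is our own illustrative choice.

def comparison(A):
    """Comparison matrix <A>."""
    n = len(A)
    return [[abs(A[i][j]) if i == j else -abs(A[i][j]) for j in range(n)] for i in range(n)]

def solve2(M, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(b[0] * M[1][1] - M[0][1] * b[1]) / det,
            (M[0][0] * b[1] - b[0] * M[1][0]) / det]

def is_nonsingular_m_2x2(Z):
    """Z-matrix test plus the positive-solution characterization of M-matrices [16]
    (assumes Z is non-singular so the 2x2 solve is well defined)."""
    if any(Z[i][j] > 0 for i in range(2) for j in range(2) if i != j):
        return False
    x = solve2(Z, [1.0, 1.0])
    return all(xi > 0 for xi in x)

A = [[4.0, -1.0], [-2.0, 5.0]]
M = [[4.0, 0.0], [0.0, 5.0]]                                    # Jacobi part, M = diag(A)
N = [[M[i][j] - A[i][j] for j in range(2)] for i in range(2)]   # N = M - A
H = [[comparison(M)[i][j] - abs(N[i][j]) for j in range(2)] for i in range(2)]  # <M> - |N|

print(is_nonsingular_m_2x2(H))   # True  -> A = M - N is an H-splitting
print(H == comparison(A))        # True  -> the splitting is H-compatible
```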
For convenience, we take , where I denotes the identity matrix. By simple calculations, we have
and
Further, we have
Obviously, when , we do not obtain that

This implies that condition (b) of Theorem 1 may fail when it is used to judge the convergence of the MMS method for solving the HLCP. To overcome this disadvantage, we obtain an improved convergence condition for the MMS method, see Theorem 2, whose proof is similar to that of Theorem 2.5 in [[15]].
Theorem 2. Assume that $A, B \in \mathbb{R}^{n \times n}$ are two $H_+$-matrices, and the positive diagonal matrix $\Omega$ with , . Let $A = M_A - N_A$ be an H-splitting of $A$, $B = M_B - N_B$ be an H-compatible splitting of $B$, and be an $H_+$-matrix. Then the MMS method is convergent, provided one of the following conditions holds:

(a) ;

(b) when ,
(5)
where $D$ is a positive diagonal matrix such that is an s.d.d. matrix.
Proof.
For Case (a), see the proof of Theorem 2.5 in [[15]].
For Case (b), by simple calculations, we have
(6)
Making use of Equation (6) and following the proof of Theorem 2.5 in [[15]], we have
where
Since is an $H_+$-matrix, its comparison matrix is a non-singular M-matrix, and hence there exists a positive diagonal matrix $D$ (see [[16]], p. 137) such that
From the left inequality in (5), we have
(7)
Further, based on the inequality (7), we have
Thus, based on Lemma 2.3 in [[15]], we have
The proof of Theorem 2 is completed. □
Comparing Theorem 2 with Theorem 1, the advantage of the former is that condition (b) of Theorem 2 is not limited by the parameter $k$ of the latter. Moreover, we no longer need to find two positive diagonal matrices $D$ and such that and are, respectively, s.d.d. matrices; we only need to find one positive diagonal matrix $D$ such that is an s.d.d. matrix.
Incidentally, there exists a simple approach to obtain a positive diagonal matrix $D$ in Theorem 2: first, solving the system $Mx = e$ gives a positive vector $x$, where $M$ denotes the non-singular M-matrix to be scaled and $e = (1, 1, \ldots, 1)^{T}$; secondly, we take $D = \operatorname{diag}(x)$, which makes $MD$ an s.d.d. matrix.
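As a minimal sketch of this construction, with an illustrative 2-by-2 non-singular M-matrix of our own choosing: solving $Mx = e$ gives a positive $x$, and $D = \operatorname{diag}(x)$ makes $MD$ strictly diagonally dominant, because every row sum of $MD$ equals 1 while the off-diagonal entries remain nonpositive.

```python
# Build the scaling matrix D of Theorem 2 for an illustrative non-singular M-matrix M:
# solve M x = e, then set D = diag(x); every row sum of M D equals 1, and the
# off-diagonal entries of M D stay nonpositive, so M D is strictly diagonally dominant.

def solve2(M, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(b[0] * M[1][1] - M[0][1] * b[1]) / det,
            (M[0][0] * b[1] - b[0] * M[1][0]) / det]

def is_sdd(A):
    """Strict diagonal dominance in every row."""
    return all(abs(A[i][i]) > sum(abs(A[i][j]) for j in range(len(A)) if j != i)
               for i in range(len(A)))

M = [[4.0, -1.0], [-2.0, 5.0]]      # an illustrative non-singular M-matrix
x = solve2(M, [1.0, 1.0])           # positive because M is a non-singular M-matrix
MD = [[M[i][j] * x[j] for j in range(2)] for i in range(2)]   # M @ diag(x)

print([round(v, 6) for v in x])            # [0.333333, 0.333333]
print(is_sdd(MD))                          # True
print([round(sum(row), 6) for row in MD])  # [1.0, 1.0] -- each row of M D sums to one
```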
In addition, if the $H_+$-matrix itself is an s.d.d. matrix, then we can take $D = I$ in Theorem 2. In this case, we obtain the following corollary.
Corollary 1. Assume that $A, B \in \mathbb{R}^{n \times n}$ are two $H_+$-matrices, and the positive diagonal matrix $\Omega$ with , . Let $A = M_A - N_A$ be an H-splitting of $A$, $B = M_B - N_B$ be an H-compatible splitting of $B$, and the $H_+$-matrix be an s.d.d. matrix. Then, the MMS method is convergent, provided one of the following conditions holds:

(a) ;

(b) when ,
3. Numerical Experiments
In this section, we consider a simple example to illustrate the theoretical results of Theorem 2. All computations are performed in MATLAB R2016b.
Example 1. Consider the HLCP , in which , , where , , , and $\mu, \nu$ are real parameters. Let , with

In our calculations, we take and for $A$ and $B$ in Example 1, and is used for the initial vector. The modulus-based Jacobi (NMJ) method and the modulus-based Gauss–Seidel (NMGS) method, with , are adopted. The NMJ and NMGS methods are stopped once the number of iterations exceeds 500 or the norm of the residual vectors (RES) is less than , where
Here, we consider two cases of Theorem 2. When , we take for the NMJ and NMGS methods; in this case, Table 1 is obtained. When , we take , and obtain that and that is an s.d.d. matrix; in this case, we take and for the NMJ and NMGS methods, and obtain Table 2 and Table 3.
The numerical results in Table 1, Table 2 and Table 3 not only further confirm that the MMS method is feasible and effective, but also show that the convergence condition in Theorem 2 is reasonable.
4. Conclusions
In this paper, the modulus-based matrix splitting (MMS) iteration method for solving the horizontal linear complementarity problem (HLCP) with $H_+$-matrices has been further considered. The main aim of this paper is to present an improved convergence condition of the MMS iteration method, which enlarges the range of its applications beyond the previous work [[15]].
Tables
Table 1 Numerical results for .

| | m | 100 | 200 | 300 |
|---|---|---|---|---|
| NMJ | IT | 30 | 31 | 32 |
| | CPU | 0.0381 | 0.2120 | 0.4114 |
| | RES | 6.35 × 10 | 6.61 × 10 | 5.02 × 10 |
| NMGS | IT | 19 | 20 | 20 |
| | CPU | 0.0314 | 0.0952 | 0.2488 |
| | RES | 6.86 × 10 | 4.79 × 10 | 7.30 × 10 |
Table 2 Numerical results for .

| | m | 100 | 200 | 300 |
|---|---|---|---|---|
| NMJ | IT | 29 | 30 | 31 |
| | CPU | 0.0379 | 0.1553 | 0.3976 |
| | RES | 9.71 × 10 | 9.05 × 10 | 6.65 × 10 |
| NMGS | IT | 18 | 19 | 19 |
| | CPU | 0.0243 | 0.0931 | 0.2300 |
| | RES | 6.01 × 10 | 4.00 × 10 | 6.11 × 10 |
Table 3 Numerical results for .

| | m | 100 | 200 | 300 |
|---|---|---|---|---|
| NMJ | IT | 39 | 39 | 40 |
| | CPU | 0.0474 | 0.1930 | 0.5127 |
| | RES | 6.78 × 10 | 9.78 × 10 | 8.12 × 10 |
| NMGS | IT | 20 | 20 | 21 |
| | CPU | 0.0283 | 0.1109 | 0.2595 |
| | RES | 4.76 × 10 | 8.47 × 10 | 4.59 × 10 |
Author Contributions
Conceptualization, methodology, software, S.W.; original draft preparation, C.L.; translation, editing and review, S.W.; validation, S.W. All authors have read and agreed to the published version of the manuscript.
Data Availability Statement
Data will be made available on request.
Conflicts of Interest
The authors declare no conflict of interest.
Acknowledgments
The authors would like to thank the three referees, whose opinions and comments greatly improved the presentation of the paper.
Footnotes

Disclaimer/Publisher's Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
References

1. Cottle R.W., Pang J.-S., Stone R.E. The Linear Complementarity Problem; Academic Press: San Diego, CA, USA, 1992.
2. Eaves B.C., Lemke C.E. Equivalence of LCP and PLS. Math. Oper. Res. 1981; 6: 475–484. 10.1287/moor.6.4.475
3. Ye Y. A fully polynomial time approximation algorithm for computing a stationary point of the generalized linear complementarity problems. Math. Oper. Res. 1993; 18: 334–345. 10.1287/moor.18.2.334
4. Mangasarian O.L., Pang J.S. The extended linear complementarity problem. SIAM J. Matrix Anal. Appl. 1995; 16: 359–368. 10.1137/S0895479893262734
5. Sznajder R., Gowda M.S. Generalizations of P0- and P-properties; extended vertical and horizontal linear complementarity problems. Linear Algebra Appl. 1995; 223: 695–715. 10.1016/0024-3795(93)00184-2
6. Ferris M.C., Pang J.S. Complementarity and Variational Problems: State of the Art; SIAM: Philadelphia, PA, USA, 1997.
7. Xiu N., Zhang J. A smoothing Gauss–Newton method for the generalized horizontal linear complementarity problems. J. Comput. Appl. Math. 2001; 129: 195–208. 10.1016/S0377-0427(00)00550-1
8. Zhang Y. On the convergence of a class of infeasible interior-point methods for the horizontal linear complementarity problem. SIAM J. Optim. 1994; 4: 208–227. 10.1137/0804012
9. Gao X., Wang J. Analysis and application of a one-layer neural network for solving horizontal linear complementarity problems. Int. J. Comput. Int. Syst. 2014; 7: 724–732. 10.1080/18756891.2013.858903
10. Mezzadri F., Galligani E. Modulus-based matrix splitting methods for horizontal linear complementarity problems. Numer. Algor. 2020; 83: 201–219. 10.1007/s11075-019-00677-y
11. Bai Z.-Z. Modulus-based matrix splitting iteration methods for linear complementarity problems. Numer. Linear Algebra Appl. 2010; 17: 917–933. 10.1002/nla.680
12. Zhang J.-X., Yang G.-H. Low-complexity tracking control of strict-feedback systems with unknown control directions. IEEE Trans. Autom. Control 2019; 64: 5175–5182. 10.1109/TAC.2019.2910738
13. Zhang X.-F., Chen Y.Q. Admissibility and robust stabilization of continuous linear singular fractional order systems with the fractional order α: The 0 < α < 1 case. ISA Trans. 2018; 82: 42–50. 28385193
14. Zhang J.-X., Wang Q.-G., Ding W. Global output-feedback prescribed performance control of nonlinear systems with unknown virtual control coefficients. IEEE Trans. Autom. Control 2022; 67: 6904–6911. 10.1109/TAC.2021.3137103
15. Zheng H., Vong S. On convergence of the modulus-based matrix splitting iteration method for horizontal linear complementarity problems of H+-matrices. Appl. Math. Comput. 2020; 369: 124890. 10.1016/j.amc.2019.124890
16. Berman A., Plemmons R.J. Nonnegative Matrices in the Mathematical Sciences; Academic Press: New York, NY, USA, 1979.
By Cuixia Li and Shiliang Wu