Smallest Eigenvalues (smallest + eigenvalue)
Selected Abstracts

A fast algorithm for computing the smallest eigenvalue of a symmetric positive-definite Toeplitz matrix
NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, Issue 4 2008
N. Mastronardi
Abstract: Recent progress in signal processing and estimation has generated considerable interest in the problem of computing the smallest eigenvalue of a symmetric positive-definite (SPD) Toeplitz matrix. An algorithm for computing upper and lower bounds on the smallest eigenvalue of an SPD Toeplitz matrix was recently derived (Linear Algebra Appl. 2007; DOI: 10.1016/j.laa.2007.05.008). The algorithm relies on the computation of the R factor of the QR factorization of the Toeplitz matrix and the inverse of R. The simultaneous computation of R and R⁻¹ is accomplished efficiently by the generalized Schur algorithm. In this paper, exploiting the properties of the latter algorithm, a numerical method is proposed that accurately computes the smallest eigenvalue and the corresponding eigenvector of an SPD Toeplitz matrix. Copyright © 2008 John Wiley & Sons, Ltd. [source]

Combination of Jacobi–Davidson and conjugate gradients for the partial symmetric eigenproblem
NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, Issue 1 2002
Y. Notay
Abstract: To compute the smallest eigenvalues and associated eigenvectors of a real symmetric matrix, we consider the Jacobi–Davidson method with inner preconditioned conjugate gradient iterations for the arising linear systems. We show that the coefficient matrix of these systems is indeed positive definite, with its smallest eigenvalue bounded away from zero. We also establish a relation between the residual norm reduction in these inner linear systems and the convergence of the outer process towards the desired eigenpair. From a theoretical point of view, this allows us to prove the optimality of the method, in the sense that solving the eigenproblem entails only a moderate overhead compared with solving a single linear system. From a practical point of view, it allows us to set up a stopping strategy for the inner iterations that minimizes this overhead by exiting precisely at the moment where further progress would be useless with respect to the convergence of the outer process. These results are illustrated numerically on a model example. A direct comparison with some other eigensolvers is also provided. Copyright © 2001 John Wiley & Sons, Ltd. [source]

Computation of a few smallest eigenvalues of elliptic operators using fast elliptic solvers
INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN BIOMEDICAL ENGINEERING, Issue 8 2001
Janne Martikainen
Abstract: The computation of a few smallest eigenvalues of generalized algebraic eigenvalue problems is studied. The problems considered are obtained by discretizing self-adjoint second-order elliptic partial differential eigenvalue problems in two- or three-dimensional domains. The standard Lanczos algorithm with complete orthogonalization is used to compute some eigenvalues of the inverted eigenvalue problem. Under suitable assumptions, the number of Lanczos iterations is shown to be independent of the problem size. The arising linear problems are solved using a standard fast elliptic solver. Numerical experiments demonstrate that the inverted problem is much easier to solve with the Lanczos algorithm than the original problem. In these experiments, the underlying Poisson and elasticity problems are solved using a standard multigrid method. Copyright © 2001 John Wiley & Sons, Ltd. [source]
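The shift-and-invert idea described in the last abstract, running Lanczos on the inverse of the discretized operator and letting a fast solver supply each application of that inverse, can be sketched in a few lines. The following is only an illustrative sketch, not the paper's implementation: a 2D Poisson matrix stands in for the discretized elliptic operator, a sparse LU factorization stands in for the fast elliptic (multigrid) solver, and SciPy's eigsh supplies the (implicitly restarted) Lanczos iteration. All names and parameters are illustrative.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def poisson_2d(n):
    """Standard 5-point finite-difference Laplacian on an n-by-n grid (Dirichlet)."""
    T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
    I = sp.identity(n)
    return (sp.kron(I, T) + sp.kron(T, I)).tocsc()

n = 50
A = poisson_2d(n)                       # SPD matrix standing in for the elliptic operator

# Factor A once; every Lanczos step then costs one solve with A, i.e. one
# application of A^{-1}.  The sparse LU factorization plays the role that the
# fast elliptic (multigrid) solver plays in the paper.
lu = spla.splu(A)
Ainv = spla.LinearOperator(A.shape, matvec=lu.solve, dtype=A.dtype)

# The largest eigenvalues of A^{-1} are the reciprocals of the smallest
# eigenvalues of A; Lanczos resolves them in few iterations because they are
# well separated at the top of the inverted spectrum.
k = 6
mu, _ = spla.eigsh(Ainv, k=k, which='LM')
smallest = np.sort(1.0 / mu)

# Known eigenvalues of this discrete Laplacian, for comparison:
# 4 - 2*cos(i*pi*h) - 2*cos(j*pi*h), with h = 1/(n+1).
h = 1.0 / (n + 1)
exact = sorted(4 - 2*np.cos(i*np.pi*h) - 2*np.cos(j*np.pi*h)
               for i in range(1, 4) for j in range(1, 4))[:k]
print(smallest)
print(np.asarray(exact))
```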
Sparse approximate inverse preconditioning of deflated block-GMRES algorithm for the fast monostatic RCS calculation
INTERNATIONAL JOURNAL OF NUMERICAL MODELLING: ELECTRONIC NETWORKS, DEVICES AND FIELDS, Issue 5 2008
P. L. Rui
Abstract: Sparse approximate inverse (SAI) preconditioning of a deflated block generalized minimal residual (GMRES) algorithm is proposed to solve large dense linear systems with multiple right-hand sides arising from monostatic radar cross section (RCS) calculations. The multilevel fast multipole method (MLFMM) is used to accelerate the matrix–vector product operations, and the SAI preconditioning technique is employed to speed up the convergence of the block-GMRES (BGMRES) iterations. The main purpose of this study is to show that the convergence rate of the SAI-preconditioned BGMRES method can be significantly improved by deflating a few of the smallest eigenvalues. Numerical experiments indicate that the combined effect of the SAI preconditioner, which clusters most of the eigenvalues around one, and the deflation technique, which shifts the remaining small eigenvalues in the spectrum, can be very beneficial in the MLFMM, reducing the overall simulation time substantially. Copyright © 2008 John Wiley & Sons, Ltd. [source]
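A Frobenius-norm least-squares construction is one common way to build a sparse approximate inverse of the kind used in the abstract above. The sketch below applies it column by column over a prescribed sparsity pattern and passes the result to SciPy's GMRES as the preconditioner. It is only an illustration under simplifying assumptions: a small sparse test matrix replaces the dense MLFMM-accelerated system, a single right-hand side replaces the block of right-hand sides, the deflation step is omitted, and the pattern of the approximate inverse is simply taken to be that of A.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def spai(A, pattern=None):
    """Right sparse approximate inverse: minimize ||A M - I||_F column by column,
    restricting each column of M to a prescribed sparsity pattern (here: A's own)."""
    A = A.tocsc()
    P = (pattern if pattern is not None else A).tocsc()
    n = A.shape[0]
    rows, cols, vals = [], [], []
    for j in range(n):
        J = P.indices[P.indptr[j]:P.indptr[j + 1]]   # allowed nonzero rows of column j of M
        if J.size == 0:
            continue
        Asub = A[:, J]
        R = np.unique(Asub.indices)                  # rows of A touched by those columns
        e = (R == j).astype(float)                   # restriction of the unit vector e_j
        m, *_ = np.linalg.lstsq(Asub[R, :].toarray(), e, rcond=None)
        rows.extend(J); cols.extend([j] * J.size); vals.extend(m)
    return sp.csc_matrix((vals, (rows, cols)), shape=(n, n))

# A small, well-conditioned nonsymmetric test system standing in for the
# dense method-of-moments system; illustrative only.
n = 400
rng = np.random.default_rng(0)
A = (sp.eye(n) + 0.3 * sp.random(n, n, density=0.01, random_state=0)).tocsc()
b = rng.standard_normal(n)

# SciPy's gmres expects M to approximate the inverse of A, which is exactly
# what the SAI matrix is, so it can be passed in directly.
M = spai(A)
x, info = spla.gmres(A, b, M=M, restart=50)
print(info, np.linalg.norm(A @ x - b))
```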