A NEW CONJUGATE GRADIENT MINIMIZATION METHOD BASED ON EXTENDED QUADRATIC FUNCTIONS
ISSAM A.R. MOGHRABI, Korean Society for Industrial and Applied Mathematics, 2004, Journal of the Korean Society for Industrial and Applied Mathematics Vol.8 No.2
A Conjugate Gradient (CG) algorithm for unconstrained minimization is proposed which is invariant to a nonlinear scaling of a strictly convex quadratic function and which generates mutually conjugate directions for extended quadratic functions. It is derived for inexact line searches and is designed for the minimization of general nonlinear functions. It compares favorably in numerical tests with the original Dixon algorithm on which the new algorithm is based.
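As an illustration of the kind of iteration described, here is a minimal sketch of a nonlinear conjugate gradient loop with an inexact (Armijo backtracking) line search, applied to a nonlinear scaling of a strictly convex quadratic. The beta rule below is the textbook Polak-Ribiere+ choice, used only as a stand-in since the abstract does not give the Dixon-based update; all function names and parameters here are illustrative assumptions.

import numpy as np

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=500):
    # Generic nonlinear CG loop; the conjugacy coefficient is NOT the
    # paper's update, just the standard Polak-Ribiere+ rule.
    x = x0.astype(float)
    g = grad(x)
    d = -g                                    # start along steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Inexact line search: halve alpha until the Armijo condition holds.
        alpha, fx, slope = 1.0, f(x), g.dot(d)
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        if d.dot(g_new) >= 0:                 # safeguard: restart on non-descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Example objective: f(x) = exp(q(x)), a nonlinear scaling of the strictly
# convex quadratic q(x) = 0.5 x'Ax, i.e. an extended quadratic.
A = np.diag([1.0, 10.0])
q = lambda x: 0.5 * x.dot(A.dot(x))
f = lambda x: np.exp(q(x))
grad = lambda x: np.exp(q(x)) * A.dot(x)
print(cg_minimize(f, grad, np.array([0.5, 0.3])))   # tends toward the origin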
MINIMIZATION OF EXTENDED QUADRATIC FUNCTIONS WITH INEXACT LINE SEARCHES
ISSAM A.R. MOGHRABI, Korean Society for Industrial and Applied Mathematics, 2005, Journal of the Korean Society for Industrial and Applied Mathematics Vol.9 No.1
A Conjugate Gradient algorithm for unconstrained minimization is proposed which is invariant to a nonlinear scaling of a strictly convex quadratic function and which generates mutually conjugate directions for extended quadratic functions. It is derived for inexact line searches and for general functions. It compares favourably in numerical tests (over eight test functions and dimensionality up to 1000) with the Dixon (1975) algorithm on which this new algorithm is based.
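For reference, the objects named in this abstract can be written out explicitly; the formulation below is the standard one from the nonlinear-scaling literature, with notation assumed here rather than quoted from the paper:

\[
  f(x) = F\bigl(q(x)\bigr), \qquad
  q(x) = \tfrac{1}{2}\, x^{\mathsf T} G x + b^{\mathsf T} x, \qquad
  F'(q) > 0,
\]

where G is symmetric positive definite, so f (an extended quadratic) shares its minimizer with the strictly convex quadratic q. The search directions d_0, ..., d_{n-1} generated by an invariant method are mutually conjugate with respect to G:

\[
  d_i^{\mathsf T} G\, d_j = 0, \qquad i \neq j.
\]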
A SELF SCALING MULTI-STEP RANK ONE PATTERN SEARCH ALGORITHM
ISSAM A.R. MOGHRABI, Korean Society for Industrial and Applied Mathematics, 2011, Journal of the Korean Society for Industrial and Applied Mathematics Vol.15 No.4
This paper proposes a new, rapidly convergent pattern search quasi-Newton algorithm that employs the multi-step version of the Symmetric Rank One (SR1) update. The new algorithm works on the factorizations of the inverse Hessian approximations to make available a sequence of convergent positive bases required by the pattern search process. The algorithm, in principle, resembles that developed in [1], with multi-step methods dominating the derivation and with numerical improvements obtained, as shown by the numerical results presented herein.
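Since the abstract does not detail the multi-step formulae or the pattern-search coupling, the following is only a minimal sketch of the classical single-step SR1 building block on which such methods rest; the function name and safeguard threshold are illustrative assumptions.

import numpy as np

def sr1_inverse_update(H, s, y, eps=1e-8):
    # Update the inverse-Hessian approximation H from the step s = x+ - x
    # and the gradient difference y = g+ - g (classical SR1, not the
    # paper's multi-step variant).
    v = s - H.dot(y)                          # residual of the secant equation
    denom = v.dot(y)
    # Standard safeguard: skip the update when the denominator is tiny,
    # since SR1 can otherwise become unbounded.
    if abs(denom) < eps * np.linalg.norm(v) * np.linalg.norm(y):
        return H
    return H + np.outer(v, v) / denom        # rank-one correction, H+ y = s

# Usage: on a quadratic f(x) = 0.5 x'Ax the pair satisfies y = A s, and one
# update restores the secant equation H y = s.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
H = np.eye(2)
s = np.array([1.0, 0.5])
y = A.dot(s)
H = sr1_inverse_update(H, s, y)
print(np.allclose(H.dot(y), s))               # True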
A new limited memory Quasi-Newton method for unconstrained optimization
MOGHRABI, ISSAM A.R., Korean Society for Industrial Information and Applied Mathematics, 2003, Korean Society for Industrial Information and Applied Mathematics Vol.7 No.1
The main concern of this paper is to develop a new class of quasi-Newton methods. These methods are intended for use whenever memory space is a major concern and, hence, they are usually referred to as limited memory methods. The methods developed in this work are sensitive to the choice of the memory parameter η, which defines the amount of past information stored within the Hessian (or its inverse) approximation at each iteration. The results of the numerical experiments conducted with different choices of this parameter indicate that these methods improve the performance of limited memory quasi-Newton methods.
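The abstract does not give the update formulae, so as a generic illustration of how a limited memory method uses only the last m stored curvature pairs (the role played by the memory parameter η above), here is the standard L-BFGS two-loop recursion; names and the test problem are assumptions, not the paper's class of methods.

import numpy as np

def two_loop_direction(g, s_list, y_list):
    # Return -H g, where H is the inverse-Hessian approximation implicitly
    # defined by the stored pairs (s_i, y_i), newest last.
    q = g.copy()
    alphas, rhos = [], []
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / y.dot(s)
        a = rho * s.dot(q)
        q -= a * y
        alphas.append(a)
        rhos.append(rho)
    if s_list:                                # scaled initial matrix H0 = gamma*I
        gamma = s_list[-1].dot(y_list[-1]) / y_list[-1].dot(y_list[-1])
        q *= gamma
    for (s, y), a, rho in zip(zip(s_list, y_list), reversed(alphas), reversed(rhos)):
        b = rho * y.dot(q)
        q += (a - b) * s
    return -q

# Usage on a quadratic, keeping only the m most recent pairs:
m = 3
A = np.diag([1.0, 10.0])
x = np.array([1.0, 1.0])
s_list, y_list = [], []
for _ in range(8):
    g = A.dot(x)
    d = two_loop_direction(g, s_list, y_list)
    x_new = x + d                             # unit step, illustration only
    s_list.append(x_new - x)
    y_list.append(A.dot(x_new) - g)
    if len(s_list) > m:                       # discard the oldest pair
        s_list.pop(0)
        y_list.pop(0)
    x = x_new
print(x)                                      # moves toward the minimizer at 0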
Scaling Methods for Quasi-Newton Methods
MOGHRABI, ISSAM A.R., Korean Society for Industrial Information and Applied Mathematics, 2002, Korean Society for Industrial Information and Applied Mathematics Vol.6 No.1
This paper presents two new self-scaling variable-metric algorithms. The first is based on a known two-parameter family of rank-two updating formulae; the second employs an initial scaling of the estimated inverse Hessian which modifies the first self-scaling algorithm. The algorithms are compared with similar published algorithms, notably those due to Oren, to Shanno and Phua, and to Biggs, and with BFGS (the best-known quasi-Newton method). The best of these new and published algorithms are also modified to employ inexact line searches, with marginal effect. The new algorithms are superior, especially as the problem dimension increases.
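As a concrete illustration of the self-scaling idea (rescale the current inverse-Hessian estimate before applying a rank-two update), here is a minimal sketch using the Oren-Luenberger scaling factor with the BFGS inverse update; the abstract's two-parameter family and its initial-scaling variant are not reproduced, and the function name is an assumption.

import numpy as np

def self_scaling_bfgs_inverse(H, s, y):
    # Oren-Luenberger factor: rescales H so its curvature along y matches
    # the observed curvature s'y before the update is applied.
    gamma = s.dot(y) / y.dot(H.dot(y))
    H = gamma * H
    rho = 1.0 / s.dot(y)
    V = np.eye(len(s)) - rho * np.outer(s, y)
    # Standard BFGS update of the inverse Hessian approximation.
    return V.dot(H).dot(V.T) + rho * np.outer(s, s)

# Usage: whatever the scaling, the updated matrix satisfies the secant
# equation H y = s; checked here on a quadratic where y = A s.
A = np.array([[5.0, 2.0], [2.0, 3.0]])
s = np.array([1.0, -0.5])
y = A.dot(s)
H = self_scaling_bfgs_inverse(np.eye(2), s, y)
print(np.allclose(H.dot(y), s))               # True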