Solution of feasibility problems via non-smooth optimization
Abstract
In this study we present a penalty function approach for linear feasibility problems. Our aim is to find an effective algorithm based on an exterior method. Any given feasibility problem (for a set of linear inequalities) is first transformed into an unconstrained minimization of a penalty function, and the problem is thereby reduced to minimizing a convex, non-smooth, quadratic function. Because the penalty function is non-differentiable, gradient-type methods cannot be applied directly, so a modified nonlinear programming technique is used to overcome the difficulties caused by the break points. In this research we present a new algorithm for minimizing this non-smooth penalty function. By dropping the nonnegativity constraints and using the conjugate gradient method, we compute a maximal set of conjugate directions and then perform line searches along these directions in order to minimize the penalty function. Whenever the optimality criterion is not satisfied and the improvements in all directions are insufficient, we compute a new set of conjugate directions by the conjugate Gram-Schmidt process, with one of the directions taken as an element of the subdifferential at the current point.
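The sketch below is only a minimal illustration of the exterior-penalty idea for a system of linear inequalities Ax ≤ b: it minimizes the (non-smooth) sum of constraint violations by line searches along a direction drawn from the subdifferential. It deliberately uses a plain subgradient direction with a sampled line search rather than the conjugate-direction and Gram-Schmidt scheme of the thesis; the function names (`penalty`, `subgradient`, `solve_feasibility`) and all parameters are hypothetical and chosen for illustration only.

```python
import numpy as np

def penalty(A, b, x):
    # Exterior penalty for A x <= b: total amount of constraint violation.
    return np.sum(np.maximum(A @ x - b, 0.0))

def subgradient(A, b, x):
    # One element of the subdifferential of the penalty:
    # the sum of the rows of A belonging to violated constraints.
    violated = (A @ x - b) > 0.0
    return A[violated].sum(axis=0)

def solve_feasibility(A, b, x0, max_iter=500, tol=1e-8):
    # Hypothetical illustration: repeatedly move along the negative
    # subgradient, choosing the step size by sampling a grid of steps.
    x = x0.astype(float)
    for _ in range(max_iter):
        f = penalty(A, b, x)
        if f <= tol:                       # all inequalities (nearly) satisfied
            return x, f
        d = -subgradient(A, b, x)          # descent direction from the subdifferential
        steps = np.geomspace(1e-6, 1.0, 40)
        vals = [penalty(A, b, x + t * d) for t in steps]
        best = int(np.argmin(vals))
        if vals[best] < f:
            x = x + steps[best] * d
        else:
            break                          # no improvement along sampled steps
    return x, penalty(A, b, x)

# Toy example: find x with x1 + x2 <= 1, x1 >= 0, x2 >= 0.
A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([1.0, 0.0, 0.0])
x, f = solve_feasibility(A, b, np.array([3.0, 3.0]))
print(x, f)
```

In this simplified version, the direction set degenerates to a single subgradient direction at each iteration; the thesis instead maintains a full set of conjugate directions and refreshes them via the conjugate Gram-Schmidt process when progress stalls.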