SIAM Journal on Optimization, Vol. 22, No. 3, pp. 879-898
Nonlinear optimization problems in complex variables are frequently encountered in applied mathematics and engineering applications such as control theory, signal processing and electrical engineering. Optimization of these problems often requires a first- or second-order approximation of the objective function to generate a new step or descent direction. However, such methods cannot be applied directly to real-valued functions of complex variables because these functions are necessarily nonanalytic in their argument, i.e., the Taylor series expansion in their argument alone does not exist. To overcome this problem, the objective function is usually redefined as a function of the real and imaginary parts of its complex argument so that standard optimization methods can be applied. However, this approach may needlessly disguise any inherent structure present in the derivatives of such complex problems. Although little known, it is possible to construct an expansion of the objective function in its original complex variables by noting that functions of complex variables can be analytic in their argument and its complex conjugate as a whole. We use these complex Taylor series expansions to generalize existing optimization algorithms for both general nonlinear optimization problems and nonlinear least squares problems. We then apply these methods to two case studies that demonstrate that complex derivatives can lead to greater insight into the structure of the problem, and that this structure can often be exploited to reduce computational complexity and storage costs.
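The idea of expanding in the argument and its conjugate can be illustrated with a minimal sketch (not taken from the paper itself): the real-valued cost f(z) = |z - c|^2 = (z - c)(z - c)* is nonanalytic in z alone, yet analytic in the pair (z, z*), with Wirtinger derivatives df/dz = (z - c)* and df/dz* = z - c. Steepest descent then steps along the negative of df/dz* directly in the complex variable, with no split into real and imaginary parts. The target c, starting point, and step size below are hypothetical illustration values.

```python
# f(z) = |z - c|^2 is nonanalytic in z alone but analytic in (z, z*).
# Its Wirtinger derivatives are
#   df/dz  = conj(z - c)
#   df/dz* = z - c
# For real-valued f, the direction of steepest descent is -df/dz*
# (up to a constant factor; conventions vary by author).

c = 1.0 + 2.0j      # hypothetical minimizer of f
z = 0.0 + 0.0j      # starting point
step = 0.5          # step size

for _ in range(50):
    grad_conj = z - c        # df/dz*: cogradient of f at z
    z = z - step * grad_conj # complex-valued steepest-descent step

# The iterate contracts toward c: z_{k+1} - c = (1 - step)(z_k - c)
print(abs(z - c))
```

Because the iteration contracts the error by the factor (1 - step) each step, the iterate converges linearly to the minimizer c without ever forming the real representation (Re z, Im z).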