3 General Optimization
The general optimization problem can be stated as follows. Given a function \(f: \mathbb{R}^k\rightarrow\mathbb{R}\), we want to find \(\min_{x\in S}f(x)\), where \(S\subset\mathbb{R}^k\). The general approach to solving this problem that we will discuss is called a line search method. With line search methods, given \(f\) and a current estimate \(x_n\) of the location of the minimum, we want to
1. Choose a direction \(p_n\) in \(k\)-dimensional space;
2. Choose a step length in the direction \(p_n\), usually by solving \(\min_{\alpha>0} f(x_n + \alpha p_n)\) to get \(\alpha_n\);
3. Update our estimate with \(x_{n+1} = x_n + \alpha_n p_n\) (a sketch of this loop appears below).
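The following is a minimal sketch of this loop in Python, not an implementation from the text. For illustration it assumes the steepest-descent direction \(p_n = -\nabla f(x_n)\) and uses `scipy.optimize.minimize_scalar` to approximately solve the one-dimensional step-length problem; the function names and defaults are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def line_search_minimize(f, grad, x0, tol=1e-8, max_iter=1000):
    """Generic line search loop: choose a direction, choose a step length, update."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # gradient is (nearly) zero: stop
            break
        p = -g                        # direction p_n (steepest descent, for illustration)
        # step length: approximately solve min_{alpha > 0} f(x_n + alpha p_n)
        alpha = minimize_scalar(lambda a: f(x + a * p),
                                bounds=(0.0, 100.0), method="bounded").x
        x = x + alpha * p             # update: x_{n+1} = x_n + alpha_n p_n
    return x

# Example: f(x) = (x_1 - 1)^2 + 10 (x_2 + 2)^2, minimized at (1, -2)
f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])
print(line_search_minimize(f, grad, x0=[0.0, 0.0]))
```

Other line search methods differ mainly in how the direction \(p_n\) is computed; the surrounding loop structure stays the same.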
Clearly then, with line search methods, the two questions one must answer are: how should we choose the direction, and how far should we step? Almost all line search approaches are variations on the answers to these two questions.
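As one illustration of how the step-length question can be answered differently, many methods accept any \(\alpha\) satisfying a sufficient-decrease condition rather than solving the one-dimensional problem exactly. Below is a minimal sketch of one such rule, Armijo backtracking; this example is not from the text, and the function name and constants are illustrative assumptions.

```python
import numpy as np

def backtracking_step(f, grad, x, p, alpha0=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking: shrink alpha until a sufficient decrease is achieved."""
    fx, gx = f(x), grad(x)
    assert gx @ p < 0, "p must be a descent direction"
    alpha = alpha0
    # sufficient-decrease condition: f(x + alpha p) <= f(x) + c * alpha * grad(x)' p
    while f(x + alpha * p) > fx + c * alpha * (gx @ p):
        alpha *= rho   # shrink the step and try again
    return alpha
```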
Care must be taken with line search methods because we typically assume that the parameter space is large (i.e., \(k\) is large). Therefore, a constraint common to all of these methods is keeping the amount of computation required to a minimum. Efficiency with respect to memory (storage of data or parameters) and computation time is key.