Exercises
Exercise A.1 (Concepts on convexity)
- Define a convex set and provide an example.
- Define a convex function and provide an example.
- Explain the concept of convex optimization problems and provide an example.
- What is the difference between active and inactive constraints in an optimization problem?
- What is the difference between a locally optimal point and a globally optimal point?
- Define a feasibility problem and provide an example.
- Explain the concept of least squares problems and provide an example.
- Explain the concept of linear programming and provide an example.
- Explain the concept of nonconvex optimization and provide an example.
- Explain the difference between a convex and a nonconvex optimization problem.
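As a concrete illustration of the kind of examples asked for in the last few items, the following sketch (assuming the `cvxpy` and `numpy` packages are available, with randomly generated data) sets up a small least-squares problem and a small LP, both convex, and solves them with a generic solver.

```python
# A minimal sketch of two concrete problem instances: a least-squares problem
# and a linear program. All data are randomly generated for illustration only.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)

# Least squares: minimize ||A x - b||_2^2 (unconstrained and convex).
x_ls = cp.Variable(5)
cp.Problem(cp.Minimize(cp.sum_squares(A @ x_ls - b))).solve()

# Linear program: minimize c^T x subject to G x <= h, 0 <= x <= 1.
c, G, h = rng.standard_normal(5), rng.standard_normal((10, 5)), rng.random(10)
x_lp = cp.Variable(5, nonneg=True)
cp.Problem(cp.Minimize(c @ x_lp), [G @ x_lp <= h, x_lp <= 1]).solve()

print(x_ls.value, x_lp.value)
```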
Exercise A.2 (Convexity of sets) Determine the convexity of the following sets:
- \(\mathcal{X} = \left\{x\in\R \mid x^2-3x+2\ge0\right\}\).
- \(\mathcal{X} = \left\{\bm{x}\in\R^n \mid \textm{max}\{x_1,x_2,\dots,x_n\}\le1\right\}\).
- \(\mathcal{X} = \left\{\bm{x}\in\R^n \mid \alpha \le \bm{c}^\T\bm{x} \le \beta\right\}\).
- \(\mathcal{X} = \left\{\bm{x}\in\R^2 \mid x_1\ge1,\; x_2\ge2,\; x_1x_2\ge1\right\}\).
- \(\mathcal{X} = \left\{(x,y)\in\R^2 \mid y \ge x^2\right\}\).
- \(\mathcal{X} = \left\{\bm{x}\in\R^n \mid \|\bm{x}-\bm{c}\|\le \bm{a}^\T\bm{x} + b\right\}\).
- \(\mathcal{X} = \left\{\bm{x}\in\R^n \mid (\bm{a}^\T\bm{x} + b) / (\bm{c}^\T\bm{x} + d) \ge1,\; \bm{c}^\T\bm{x} + d \ge1\right\}\).
- \(\mathcal{X} = \left\{\bm{x}\in\R^n \mid \bm{a}^\T\bm{x} \ge b \textm{ or } \|\bm{x}-\bm{c}\|\le1\right\}\).
- \(\mathcal{X} = \left\{\bm{x}\in\R^n \mid \bm{x}^\T\bm{y}\le1 \textm{ for all } \bm{y}\in\mathcal{S}\right\}\), where \(\mathcal{S}\) is an arbitrary set.
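As a warm-up for this exercise, the first set in the list can be checked directly: since \(x^2-3x+2 = (x-1)(x-2)\), the set is \[ \mathcal{X} = \left\{x\in\R \mid (x-1)(x-2)\ge0\right\} = (-\infty,1] \cup [2,\infty), \] a union of two disjoint intervals. The points \(x=1\) and \(x=2\) belong to \(\mathcal{X}\), but their midpoint \(x=1.5\) does not (since \(1.5^2 - 3\cdot1.5 + 2 = -0.25 < 0\)), so the set is not convex.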
Exercise A.3 (Convexity of functions) Determine the convexity of the following functions:
- \(f(\bm{x}) = \alpha g(\bm{x}) + \beta\), where \(g\) is a convex function, and \(\alpha\) and \(\beta\) are scalars with \(\alpha>0\).
- \(f(\bm{x}) = \|\bm{x}\|^p\) with \(p\ge1\).
- \(f(\bm{x}) = \|\bm{Ax} - \bm{b}\|_2^2\).
- The difference between the maximum and minimum value of a polynomial on a given interval, as a function of its coefficients: \[ f(\bm{x}) = \underset{t\in[0,1]}{\textm{sup}} p_{\bm{x}}(t) - \underset{t\in[0,1]}{\textm{inf}} p_{\bm{x}}(t), \] where \(p_{\bm{x}}(t) = x_1 + x_2 t + x_3 t^2 + \dots + x_n t^{n-1}\).
- \(f(\bm{x}) = \bm{x}^\T\bm{Y}^{-1}\bm{x}\) (with \(\bm{Y}\succ\bm{0}\)).
- \(f(\bm{Y}) = \bm{x}^\T\bm{Y}^{-1}\bm{x}\) (with \(\bm{Y}\succ\bm{0}\)).
- \(f(\bm{x},\bm{Y}) = \bm{x}^\T\bm{Y}^{-1}\bm{x}\) (with \(\bm{Y}\succ\bm{0}\)). Hint: Use the Schur complement.
- \(f(\bm{x}) = \sqrt{\sqrt{\bm{a}^\T\bm{x} + b}}\).
- \(f(\bm{X}) = \textm{log\,det}\left(\bm{X}\right)\) on \(\mathbb{S}^n_{++}\).
- \(f(\bm{X}) = \textm{det}\left(\bm{X}\right)^{1/n}\) on \(\mathbb{S}^n_+\).
- \(f(\bm{X}) = \textm{Tr}\left(\bm{X}^{-1}\right)\) on \(\mathbb{S}^n_{++}\).
- \(f(\bm{x}) = \frac{1}{2}\bm{x}^\T\bSigma\bm{x} - \bm{b}^\T\textm{log}(\bm{x})\), where \(\bSigma\succ\bm{0}\) and the log function is applied elementwise.
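Besides the analytical arguments, a quick numerical spot-check can be useful: every convex function satisfies the midpoint inequality \(f\left(\frac{\bm{x}+\bm{y}}{2}\right) \le \frac{1}{2}\left(f(\bm{x}) + f(\bm{y})\right)\), so random sampling can disprove convexity (though never prove it). A minimal sketch, assuming `numpy` is available and using \(f(\bm{x}) = \|\bm{Ax}-\bm{b}\|_2^2\) from the list with random data:

```python
# Numerical spot-check of the midpoint inequality
#   f((x + y)/2) <= (f(x) + f(y))/2,
# which every convex function must satisfy. Violations disprove convexity;
# the absence of violations is only circumstantial evidence. Applied here to
# f(x) = ||A x - b||_2^2 with random data.
import numpy as np

rng = np.random.default_rng(0)
A, b = rng.standard_normal((10, 4)), rng.standard_normal(10)
f = lambda x: np.sum((A @ x - b) ** 2)

violations = 0
for _ in range(10_000):
    x, y = rng.standard_normal(4), rng.standard_normal(4)
    if f((x + y) / 2) > (f(x) + f(y)) / 2 + 1e-9:
        violations += 1
print("midpoint-inequality violations:", violations)  # expect 0 here
```

The same check applied to a nonconvex function such as \(f(x) = -x^2\) reports violations for essentially every sampled pair.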
Exercise A.4 (Reformulation of problems)
Rewrite the following optimization problem as an LP (assuming \(d > \|\bm{c}\|_1\)): \[ \begin{array}{ll} \underset{\bm{x}}{\textm{minimize}} & \dfrac{\|\bm{Ax} - \bm{b}\|_1}{\bm{c}^\T\bm{x}+d}\\ \textm{subject to} & \|\bm{x}\|_{\infty} \leq 1. \end{array} \]
Rewrite the following optimization problem as an LP: \[ \begin{array}{ll} \underset{\bm{x}}{\textm{minimize}} & \dfrac{\|\bm{Ax} - \bm{b}\|_1}{1 - \|\bm{x}\|_{\infty}}. \end{array} \]
Rewrite the following constraint as an SOC constraint: \[ \left\{(\bm{x},y,z)\in\R^{n+2} \mid \|\bm{x}\|^2\leq yz,y\geq 0,z\geq 0\right\}. \] Hint: You may need the equality \(yz=\frac{1}{4}\left((y+z)^2-(y-z)^2\right)\).
Rewrite the following problem as an SOCP: \[ \begin{array}{ll} \underset{\bm{x},y\ge0,z\ge0}{\textm{minimize}} & \bm{a}^\T\bm{x} + \kappa\sqrt{\bm{x}^\T\bSigma\bm{x}}\\ \textm{subject to} & \|\bm{x}\|^2\leq yz, \end{array} \] where \(\bSigma \succeq \bm{0}\).
Rewrite the following problem as an SOCP: \[ \begin{array}{ll} \underset{\bm{x}}{\textm{minimize}} & \bm{x}^\T\bm{A}\bm{x} + \bm{a}^\T\bm{x}\\ \textm{subject to} & \bm{B}\bm{x}\leq \bm{b}, \end{array} \] where \(\bm{A} \succeq \bm{0}\).
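A useful way to sanity-check reformulations such as the ones in this exercise is to solve the original and the rewritten problem numerically on random data and compare the optimal values. The sketch below (assuming `cvxpy` and `numpy` are available) does this for the quadratic problem above, generating \(\bm{A} = \bm{M}^\T\bm{M}\) and using one possible epigraph/SOC rewriting of \(\|\bm{M}\bm{x}\|^2 \le t\).

```python
# Numerical check of a QP-to-SOCP rewriting: solve the quadratic program
# directly and in epigraph/SOC form, then compare the optimal values.
# All data are randomly generated so that A = M^T M >= 0 and x = 0 is
# feasible (b >= 0).
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 8
M = rng.standard_normal((n, n))
A = M.T @ M
A = (A + A.T) / 2                         # symmetrize to avoid rounding issues
a = rng.standard_normal(n)
B, b = rng.standard_normal((m, n)), rng.random(m)

# Direct quadratic form.
x1 = cp.Variable(n)
p1 = cp.Problem(cp.Minimize(cp.quad_form(x1, A) + a @ x1), [B @ x1 <= b])
p1.solve()

# Epigraph/SOC form: ||M x||^2 <= t  <=>  ||[2 M x; t - 1]||_2 <= t + 1.
x2, t = cp.Variable(n), cp.Variable()
soc = cp.norm(cp.hstack([2 * (M @ x2), t - 1])) <= t + 1
p2 = cp.Problem(cp.Minimize(t + a @ x2), [B @ x2 <= b, soc])
p2.solve()

print(p1.value, p2.value)  # the two values should agree up to solver tolerance
```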
Rewrite the following problem as an SDP: \[ \begin{array}{ll} \underset{\bm{X}\succeq\bm{0}}{\textm{minimize}} & \textm{Tr}\left((\bm{I} + \bm{X})^{-1}\right) + \textm{Tr}\left(\bm{A}\bm{X}\right). \end{array} \]
Exercise A.5 (Concepts on problem resolution)
- How would you determine if a convex problem is feasible or infeasible?
- How would you determine if a convex problem has a unique solution or multiple solutions?
- What are the main ways to solve a convex problem?
- Given a nonconvex optimization problem, what strategies can be used to find an approximate solution?
Exercise A.6 (Linear regression)
- Consider the line equation \(y = \alpha x + \beta\). Choose some values for \(\alpha\) and \(\beta\), and generate 100 noisy pairs \((x_i,y_i),\; i=1,\dots,100\) (i.e., add some random noise to each observation \(y_i\)).
- Formulate a least-squares regression problem to fit a line to the 100 data points. Plot the true and estimated lines along with the data points.
- Repeat the regression using several other definitions of error in the problem formulation. Plot and compare all the estimated lines.
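A possible starting point for this exercise is sketched below (assuming `numpy` and `matplotlib` are available; the values of \(\alpha\), \(\beta\), and the noise level are arbitrary choices). Other error measures, such as the \(\ell_1\) or \(\ell_\infty\) norm of the residuals, can then be handled with a generic convex solver.

```python
# Starter sketch: generate noisy points from y = alpha*x + beta and fit a
# line by least squares.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
alpha, beta = 2.0, -1.0
x = rng.uniform(0, 10, size=100)
y = alpha * x + beta + rng.normal(scale=2.0, size=100)

# Least-squares fit: minimize sum_i (alpha_hat*x_i + beta_hat - y_i)^2.
X = np.column_stack([x, np.ones_like(x)])
(alpha_hat, beta_hat), *_ = np.linalg.lstsq(X, y, rcond=None)

t = np.linspace(0, 10, 2)
plt.scatter(x, y, s=10, label="noisy data")
plt.plot(t, alpha * t + beta, label="true line")
plt.plot(t, alpha_hat * t + beta_hat, "--", label="least-squares fit")
plt.legend()
plt.show()
```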
Exercise A.7 (Concepts on Lagrange duality)
- Define Lagrange duality and explain its significance in convex optimization.
- Give an example of a problem and its dual.
- List the KKT conditions and explain their role in convex optimization.
- Provide an example of a problem with its KKT conditions.
- Try to find a solution that satisfies the previous KKT conditions. Is this always possible?
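As one possible instance for the second item, consider the LP \[ \begin{array}{ll} \underset{\bm{x}}{\textm{minimize}} & \bm{c}^\T\bm{x}\\ \textm{subject to} & \bm{A}\bm{x} \leq \bm{b}. \end{array} \] Its Lagrangian is \(L(\bm{x},\bm{\lambda}) = \bm{c}^\T\bm{x} + \bm{\lambda}^\T(\bm{A}\bm{x}-\bm{b})\) with \(\bm{\lambda}\ge\bm{0}\), and minimizing over \(\bm{x}\) yields the dual problem \[ \begin{array}{ll} \underset{\bm{\lambda}\ge\bm{0}}{\textm{maximize}} & -\bm{b}^\T\bm{\lambda}\\ \textm{subject to} & \bm{A}^\T\bm{\lambda} + \bm{c} = \bm{0}. \end{array} \]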
Exercise A.8 (Solution via KKT conditions) For the following problems, determine the convexity, write the Lagrangian and KKT conditions, and derive a closed-form solution:
Risk parity portfolio: \[ \begin{array}{ll} \underset{\bm{x}\ge\bm{0}}{\textm{minimize}} & \sqrt{\bm{x}^\T\bSigma\bm{x}}\\ \textm{subject to} & \bm{b}^\T\log(\bm{x}) \ge 1, \end{array} \] where \(\bSigma\succ\bm{0}\) and the log function is applied elementwise.
Projection onto the simplex: \[ \begin{array}{ll} \underset{\bm{x}}{\textm{minimize}} & \frac{1}{2}\|\bm{x} - \bm{y}\|_2^2\\ \textm{subject to} & \bm{1}^\T\bm{x} = 1,\; \bm{x}\ge\bm{0}, \end{array} \] as well as the variant in which the equality constraint is relaxed to \(\bm{1}^\T\bm{x} \le 1\).
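For the simplex projection, the KKT analysis of the equality-constrained version is known to lead to a thresholded solution \(\bm{x}^\star = \textm{max}(\bm{y} - \nu\bm{1}, \bm{0})\) (elementwise), with the scalar \(\nu\) chosen so that the entries sum to one. The following sketch (assuming `numpy` is available) implements the standard sorting-based computation of \(\nu\) and can be used to check a hand derivation numerically.

```python
# Reference implementation of the Euclidean projection onto the probability
# simplex {x >= 0, 1^T x = 1}, useful for checking a KKT-based derivation.
# The solution has the thresholded form x = max(y - nu, 0), with nu found
# by sorting.
import numpy as np

def project_simplex(y):
    u = np.sort(y)[::-1]                      # sort in decreasing order
    css = np.cumsum(u)
    k = np.arange(1, y.size + 1)
    rho = np.nonzero(u - (css - 1) / k > 0)[0][-1]
    nu = (css[rho] - 1) / (rho + 1)
    return np.maximum(y - nu, 0)

y = np.array([0.5, 1.2, -0.3, 0.1])
x = project_simplex(y)
print(x, x.sum())                             # entries sum to 1
```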
Projection onto a diamond: \[ \begin{array}{ll} \underset{\bm{x}}{\textm{minimize}} & \frac{1}{2}\|\bm{x} - \bm{y}\|_2^2\\ \textm{subject to} & \|\bm{x}\|_1 \le 1. \end{array} \]
Exercise A.9 (Dual problems) Find the dual of the following problems:
Vanishing maximum eigenvalue problem: \[ \begin{array}{ll} \underset{t, \bm{X}}{\textm{minimize}} & t\\ \textm{subject to} & t\bm{I} \succeq \bm{X},\\ & \bm{X} \succeq \bm{0}. \end{array} \]
Matrix upper bound problem: \[ \begin{array}{ll} \underset{\bm{X}}{\textm{minimize}} & \textm{Tr}(\bm{X})\\ \textm{subject to} & \bm{X} \succeq \bm{A},\\ & \bm{X} \succeq \bm{B}, \end{array} \] where \(\bm{A},\bm{B} \in \mathbb{S}^n_+\).
Log det problem: \[ \begin{array}{ll} \underset{\bm{X}\succeq\bm{0}}{\textm{minimize}} & \textm{Tr}(\bm{C}\bm{X}) + \textm{log\,det}(\bm{X}^{-1})\\ \textm{subject to} & \bm{A}_i^\T\bm{X}\bm{A}_i \preceq \bm{B}_i, \qquad i=1,\dots,m, \end{array} \] where \(\bm{C} \in \mathbb{S}^n_+\) and \(\bm{B}_i \in \mathbb{S}^n_{++}\) for \(i=1,\dots,m\).
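A reminder that applies to all three problems: a matrix inequality constraint, written generically as \(\bm{S}(\bm{X}) \preceq \bm{0}\), is dualized with a positive semidefinite multiplier \(\bm{Z} \succeq \bm{0}\), contributing the term \(\textm{Tr}\left(\bm{Z}\,\bm{S}(\bm{X})\right)\) to the Lagrangian. This works because \[ \underset{\bm{Z}\succeq\bm{0}}{\textm{sup}}\; \textm{Tr}\left(\bm{Z}\bm{S}\right) = \begin{cases} 0, & \textm{if } \bm{S}\preceq\bm{0},\\ +\infty, & \textm{otherwise}, \end{cases} \] so maximizing the Lagrangian over the multipliers recovers the original constrained problem.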
Exercise A.10 (Multi-objective optimization)
- Explain the concept of multi-objective optimization problems.
- What is the significance of the weights in the scalarization of a multi-objective problem?
- Provide an example of a bi-objective convex optimization problem and its scalarization.
- Solve this scalarized bi-objective problem for different values of the weight and plot the optimal trade-off curve.
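For the last two items, a minimal sketch (assuming `cvxpy`, `numpy`, and `matplotlib` are available; the particular pair of objectives is an arbitrary illustrative choice) that scalarizes the bi-objective problem \(\left(\|\bm{Ax}-\bm{b}\|_2^2,\; \|\bm{x}\|_2^2\right)\), sweeps the weight, and plots the optimal trade-off curve:

```python
# Scalarize  minimize (||A x - b||_2^2, ||x||_2^2)  as
#   minimize  w*||A x - b||_2^2 + (1 - w)*||x||_2^2,
# sweep the weight w in (0, 1), and plot the trade-off curve.
import cvxpy as cp
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)

x = cp.Variable(10)
f1, f2 = cp.sum_squares(A @ x - b), cp.sum_squares(x)

curve = []
for w in np.linspace(0.01, 0.99, 50):
    cp.Problem(cp.Minimize(w * f1 + (1 - w) * f2)).solve()
    curve.append((f1.value, f2.value))

curve = np.array(curve)
plt.plot(curve[:, 0], curve[:, 1], "o-")
plt.xlabel(r"$\|Ax-b\|_2^2$")
plt.ylabel(r"$\|x\|_2^2$")
plt.title("Optimal trade-off curve")
plt.show()
```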