\( \newcommand{\bm}[1]{\boldsymbol{#1}} \newcommand{\textm}[1]{\textsf{#1}} \def\T{{\mkern-2mu\raise-1mu\mathsf{T}}} \newcommand{\R}{\mathbb{R}} % real numbers \newcommand{\E}{{\rm I\kern-.2em E}} \newcommand{\w}{\bm{w}} % bold w \newcommand{\bmu}{\bm{\mu}} % bold mu \newcommand{\bSigma}{\bm{\Sigma}} % bold Sigma \newcommand{\bigO}{O} %\mathcal{O} \renewcommand{\d}[1]{\operatorname{d}\!{#1}} \)

Exercises

Exercise A.1 (Concepts on convexity)

  1. Define a convex set and provide an example.
  2. Define a convex function and provide an example.
  3. Explain the concept of convex optimization problems and provide an example.
  4. What is the difference between active and inactive constraints in an optimization problem?
  5. What is the difference between a locally optimal point and a globally optimal point?
  6. Define a feasibility problem and provide an example.
  7. Explain the concept of least squares problems and provide an example.
  8. Explain the concept of linear programming and provide an example.
  9. Explain the concept of nonconvex optimization and provide an example.
  10. Explain the difference between a convex and a nonconvex optimization problem.

Exercise A.2 (Convexity of sets) Determine the convexity of the following sets:

  1. \(\mathcal{X} = \left\{x\in\R \mid x^2-3x+2\ge0\right\}\).
  2. \(\mathcal{X} = \left\{\bm{x}\in\R^n \mid \textm{max}\{x_1,x_2,\dots,x_n\}\le1\right\}\).
  3. \(\mathcal{X} = \left\{\bm{x}\in\R^n \mid \alpha \le \bm{c}^\T\bm{x} \le \beta\right\}\).
  4. \(\mathcal{X} = \left\{\bm{x}\in\R^2 \mid x_1\ge1,\; x_2\ge2,\; x_1x_2\ge1\right\}\).
  5. \(\mathcal{X} = \left\{(x,y)\in\R^2 \mid y \ge x^2\right\}\).
  6. \(\mathcal{X} = \left\{\bm{x}\in\R^n \mid \|\bm{x}-\bm{c}\|\le \bm{a}^\T\bm{x} + b\right\}\).
  7. \(\mathcal{X} = \left\{\bm{x}\in\R^n \mid (\bm{a}^\T\bm{x} + b) / (\bm{c}^\T\bm{x} + d) \ge1,\; \bm{c}^\T\bm{x} + d \ge1\right\}\).
  8. \(\mathcal{X} = \left\{\bm{x}\in\R^n \mid \bm{a}^\T\bm{x} \ge b \textm{ or } \|\bm{x}-\bm{c}\|\le1\right\}\).
  9. \(\mathcal{X} = \left\{\bm{x}\in\R^n \mid \bm{x}^\T\bm{y}\le1 \textm{ for all } \bm{y}\in\mathcal{S}\right\}\), where \(\mathcal{S}\) is an arbitrary set.

Exercise A.3 (Convexity of functions) Determine the convexity of the following functions:

  1. \(f(\bm{x}) = \alpha g(\bm{x}) + \beta\), where \(g\) is a convex function, and \(\alpha\) and \(\beta\) are scalars with \(\alpha>0\).
  2. \(f(\bm{x}) = \|\bm{x}\|^p\) with \(p\ge1\).
  3. \(f(\bm{x}) = \|\bm{Ax} - \bm{b}\|_2^2\).
  4. The difference between the maximum and minimum value of a polynomial on a given interval, as a function of its coefficients: \[ f(\bm{x}) = \underset{t\in[0,1]}{\textm{sup}} p_{\bm{x}}(t) - \underset{t\in[0,1]}{\textm{inf}} p_{\bm{x}}(t), \] where \(p_{\bm{x}}(t) = x_1 + x_2 t + x_3 t^2 + \dots + x_n t^{n-1}\).
  5. \(f(\bm{x}) = \bm{x}^\T\bm{Y}^{-1}\bm{x}\) (with \(\bm{Y}\succ\bm{0}\)).
  6. \(f(\bm{Y}) = \bm{x}^\T\bm{Y}^{-1}\bm{x}\) (with \(\bm{Y}\succ\bm{0}\)).
  7. \(f(\bm{x},\bm{Y}) = \bm{x}^\T\bm{Y}^{-1}\bm{x}\) (with \(\bm{Y}\succ\bm{0}\)). Hint: Use the Schur complement.
  8. \(f(\bm{x}) = \sqrt{\sqrt{\bm{a}^\T\bm{x} + b}}\).
  9. \(f(\bm{X}) = \textm{logdet}\left(\bm{X}\right)\) on \(\mathbb{S}^n_{++}\).
  10. \(f(\bm{X}) = \textm{det}\left(\bm{X}\right)^{1/n}\) on \(\mathbb{S}^n_+\).
  11. \(f(\bm{X}) = \textm{Tr}\left(\bm{X}^{-1}\right)\) on \(\mathbb{S}^n_{++}\).
  12. \(f(\bm{x}) = \frac{1}{2}\bm{x}^\T\bSigma\bm{x} - \bm{b}^\T\textm{log}(\bm{x})\), where \(\bSigma\succ\bm{0}\) and the log function is applied elementwise.
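Before attempting formal proofs, it can help to probe a candidate function numerically: a single random counterexample to midpoint convexity settles the question in the negative, while finding none is (weak) supporting evidence. The sketch below is one such sanity check for item 11's function; the dimension, sampling scheme, and names are our own choices, not part of the exercise:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spd(n):
    """Random symmetric positive definite matrix (well conditioned)."""
    A = rng.standard_normal((n, n))
    return A @ A.T + n * np.eye(n)

def f(X):
    """Candidate function from item 11: Tr(X^{-1}) on S^n_{++}."""
    return np.trace(np.linalg.inv(X))

# Check midpoint convexity: f((X+Y)/2) <= (f(X)+f(Y))/2.
violations = 0
for _ in range(100):
    X, Y = random_spd(4), random_spd(4)
    if f((X + Y) / 2) > (f(X) + f(Y)) / 2 + 1e-9:
        violations += 1
print(violations)  # 0 if no counterexample was found
```

The same scaffold applies to the other items by swapping `f` and the sampling of test points (restricted to the stated domain).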

Exercise A.4 (Reformulation of problems)

  1. Rewrite the following optimization problem as an LP (assuming \(d > \|\bm{c}\|_1\)): \[ \begin{array}{ll} \underset{\bm{x}}{\textm{minimize}} & \dfrac{\|\bm{Ax} - \bm{b}\|_1}{\bm{c}^\T\bm{x}+d}\\ \textm{subject to} & \|\bm{x}\|_{\infty} \leq 1. \end{array} \]

  2. Rewrite the following optimization problem as an LP: \[ \begin{array}{ll} \underset{\bm{x}}{\textm{minimize}} & \dfrac{\|\bm{Ax} - \bm{b}\|_1}{1 - \|\bm{x}\|_{\infty}}. \end{array} \]

  3. Rewrite the following constraint as an SOC constraint: \[ \left\{(\bm{x},y,z)\in\R^{n+2} \mid \|\bm{x}\|^2\leq yz,\; y\geq 0,\; z\geq 0\right\}. \] Hint: You may find the following identity useful: \(yz=\frac{1}{4}\left((y+z)^2-(y-z)^2\right)\).

  4. Rewrite the following problem as an SOCP: \[ \begin{array}{ll} \underset{\bm{x},y\ge0,z\ge0}{\textm{minimize}} & \bm{a}^\T\bm{x} + \kappa\sqrt{\bm{x}^\T\bSigma\bm{x}}\\ \textm{subject to} & \|\bm{x}\|^2\leq yz, \end{array} \] where \(\bSigma \succeq \bm{0}\).

  5. Rewrite the following problem as an SOCP: \[ \begin{array}{ll} \underset{\bm{x}}{\textm{minimize}} & \bm{x}^\T\bm{A}\bm{x} + \bm{a}^\T\bm{x}\\ \textm{subject to} & \bm{B}\bm{x}\leq \bm{b}, \end{array} \] where \(\bm{A} \succeq \bm{0}\).

  6. Rewrite the following problem as an SDP: \[ \begin{array}{ll} \underset{\bm{X}\succeq\bm{0}}{\textm{minimize}} & \textm{Tr}\left((\bm{I} + \bm{X})^{-1}\right) + \textm{Tr}\left(\bm{A}\bm{X}\right). \end{array} \]

Exercise A.5 (Concepts on problem resolution)

  1. How would you determine if a convex problem is feasible or infeasible?
  2. How would you determine if a convex problem has a unique solution or multiple solutions?
  3. What are the main ways to solve a convex problem?
  4. Given a nonconvex optimization problem, what strategies can be used to find an approximate solution?

Exercise A.6 (Linear regression)

  1. Consider the line equation \(y = \alpha x + \beta\). Choose some values for \(\alpha\) and \(\beta\), and generate 100 noisy pairs \((x_i,y_i),\; i=1,\dots,100\) (i.e., add some random noise to each observation \(y_i\)).
  2. Formulate a regression problem to fit the 100 data points with a line based on least squares. Plot the true and estimated lines along with the points.
  3. Repeat the regression using several other definitions of error in the problem formulation. Plot and compare all the estimated lines.
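A minimal sketch of parts 1 and 2, assuming Python with NumPy (the chosen \(\alpha\), \(\beta\), noise level, and sample range are arbitrary; plotting is omitted):

```python
import numpy as np

rng = np.random.default_rng(42)

# Part 1: ground-truth line y = alpha*x + beta plus Gaussian noise.
alpha, beta = 2.0, -1.0
x = rng.uniform(0, 10, 100)
y = alpha * x + beta + rng.normal(scale=1.0, size=100)

# Part 2: least-squares fit, i.e., minimize ||A w - y||_2^2
# with A = [x, 1] and w = (alpha_hat, beta_hat).
A = np.column_stack([x, np.ones_like(x)])
alpha_hat, beta_hat = np.linalg.lstsq(A, y, rcond=None)[0]
print(alpha_hat, beta_hat)  # close to the true (2.0, -1.0)
```

For part 3, the \(\ell_1\) and \(\ell_\infty\) error definitions lead to LPs rather than a linear system, so a generic solver is needed in place of `lstsq`.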

Exercise A.7 (Concepts on Lagrange duality)

  1. Define Lagrange duality and explain its significance in convex optimization.
  2. Give an example of a problem and its dual.
  3. List the KKT conditions and explain their role in convex optimization.
  4. Provide an example of a problem with its KKT conditions.
  5. Try to find a solution that satisfies the previous KKT conditions. Is this always possible?

Exercise A.8 (Solution via KKT conditions) For the following problems, determine the convexity, write the Lagrangian and KKT conditions, and derive a closed-form solution:

  1. Risk parity portfolio: \[ \begin{array}{ll} \underset{\bm{x}\ge\bm{0}}{\textm{minimize}} & \sqrt{\bm{x}^\T\bSigma\bm{x}}\\ \textm{subject to} & \bm{b}^\T\log(\bm{x}) \ge 1, \end{array} \] where \(\bSigma\succ\bm{0}\) and the log function is applied elementwise.

  2. Projection onto the simplex: \[ \begin{array}{ll} \underset{\bm{x}}{\textm{minimize}} & \frac{1}{2}\|\bm{x} - \bm{y}\|_2^2\\ \textm{subject to} & \bm{1}^\T\bm{x} =(\le) 1,\; \bm{x}\ge\bm{0}. \end{array} \]

  3. Projection onto a diamond: \[ \begin{array}{ll} \underset{\bm{x}}{\textm{minimize}} & \frac{1}{2}\|\bm{x} - \bm{y}\|_2^2\\ \textm{subject to} & \|\bm{x}\|_1 \le 1. \end{array} \]
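For part 2, the KKT derivation leads to a sort-and-threshold closed form; the sketch below implements one standard version of it (with equality constraint \(\bm{1}^\T\bm{x}=1\)) so that a derived solution can be checked numerically. The function name and test data are our own:

```python
import numpy as np

def project_simplex(y):
    """Euclidean projection of y onto {x : 1^T x = 1, x >= 0}.

    Sort-and-threshold method obtained from the KKT conditions:
    x_i = max(y_i + lam, 0) with lam chosen so the entries sum to 1.
    """
    n = len(y)
    u = np.sort(y)[::-1]                 # sorted in descending order
    css = np.cumsum(u)
    # largest j (0-based) with u_j + (1 - sum_{i<=j} u_i)/(j+1) > 0
    rho = np.nonzero(u + (1.0 - css) / np.arange(1, n + 1) > 0)[0][-1]
    lam = (1.0 - css[rho]) / (rho + 1.0)  # optimal dual variable
    return np.maximum(y + lam, 0.0)

x = project_simplex(np.array([0.5, 1.2, -0.3]))
print(x)  # feasible: nonnegative entries summing to one
```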

Exercise A.9 (Dual problems) Find the dual of the following problems:

  1. Vanishing maximum eigenvalue problem: \[ \begin{array}{ll} \underset{t, \bm{X}}{\textm{minimize}} & t\\ \textm{subject to} & t\bm{I} \succeq \bm{X}\\ & \bm{X} \succeq \bm{0}. \end{array} \]

  2. Matrix upper bound problem: \[ \begin{array}{ll} \underset{\bm{X}}{\textm{minimize}} & \textm{Tr}(\bm{X})\\ \textm{subject to} & \bm{X} \succeq \bm{A}\\ & \bm{X} \succeq \bm{B}, \end{array} \] where \(\bm{A},\bm{B} \in \mathbb{S}^n_+\).

  3. Logdet problem: \[ \begin{array}{ll} \underset{\bm{X}\succeq\bm{0}}{\textm{minimize}} & \textm{Tr}(\bm{C}\bm{X}) + \textm{logdet}(\bm{X}^{-1})\\ \textm{subject to} & \bm{A}_i^\T\bm{X}\bm{A}_i \preceq \bm{B}_i, \qquad i=1,\dots,m, \end{array} \] where \(\bm{C} \in \mathbb{S}^n_+\) and \(\bm{B}_i \in \mathbb{S}^n_{++}\) for \(i=1,\dots,m\).

Exercise A.10 (Multi-objective optimization)

  1. Explain the concept of multi-objective optimization problems.
  2. What is the significance of the weights in the scalarization of a multi-objective problem?
  3. Provide an example of a bi-objective convex optimization problem and its scalarization.
  4. Solve this scalarized bi-objective problem for different values of the weight and plot the optimal trade-off curve.
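As one concrete instance of items 3 and 4, take \(f_1(\bm{x}) = \|\bm{A}\bm{x}-\bm{b}\|_2^2\) and \(f_2(\bm{x}) = \|\bm{x}\|_2^2\), scalarized as \(\textm{minimize}_{\bm{x}}\; w f_1(\bm{x}) + (1-w) f_2(\bm{x})\) for \(0<w<1\); this particular scalarization has a closed-form solution via the normal equations. The sketch below traces a few points of the trade-off curve (problem data and weights are our own choices; plotting is omitted):

```python
import numpy as np

rng = np.random.default_rng(1)

# Problem data for f1(x) = ||A x - b||^2 and f2(x) = ||x||^2.
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def solve_scalarized(w):
    """Minimize w*||A x - b||^2 + (1-w)*||x||^2 in closed form.

    Normal equations: (w A^T A + (1-w) I) x = w A^T b.
    """
    H = w * A.T @ A + (1 - w) * np.eye(A.shape[1])
    return np.linalg.solve(H, w * A.T @ b)

# Sweep the weight to trace the optimal trade-off curve (f1, f2).
curve = []
for w in [0.1, 0.5, 0.9, 0.99]:
    x = solve_scalarized(w)
    curve.append((np.sum((A @ x - b) ** 2), np.sum(x ** 2)))
# As w grows, f1 decreases while f2 increases: the Pareto trade-off.
```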