# ECE599/CS539 Nonlinear Optimization (Spring 2020) Midterm
(Due: 11:59pm on May 22, Friday)
Instructions: Students should show the logical steps used to derive their answers in sufficient detail; answers without adequate justification will receive partial or no credit. For questions involving MATLAB experiments, provide your code with comments. The maximum possible score is 100.
1. Convergence of Gradient Descent with Armijo’s Rule: Convex Optimization [30 points]. Suppose that f is in C2, i.e., twice continuously differentiable, and that there exist A, a > 0 such that for every x ∈ Rn,

a ≤ all eigenvalues of F(x) ≤ A, (1)

where F(x) denotes, as usual, the Hessian of f at x. This implies that f is a strictly convex function.
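As an illustrative (not exam-required) sanity check of condition (1): for a quadratic f(x) = ½ xᵀQx with symmetric positive definite Q, the Hessian F(x) = Q is constant, so the bounds a and A are simply the smallest and largest eigenvalues of Q. The matrix below is a hypothetical example of mine, not taken from the exam.

```python
import numpy as np

# Hypothetical symmetric positive definite Hessian Q (example choice).
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# For f(x) = 0.5 * x^T Q x, F(x) = Q for all x, so condition (1)
# holds with a = min eigenvalue and A = max eigenvalue of Q.
eigs = np.linalg.eigvalsh(Q)
a, A = eigs.min(), eigs.max()

assert a > 0, "positive eigenvalues => strictly convex f"
print(f"a = {a:.4f}, A = {A:.4f}")
```

Since a > 0, this f satisfies the hypothesis of the problem with condition number A/a.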
Consider the unconstrained optimization problem:
min f(x). (2)
In this question, we will analyze convergence of the Gradient Descent algorithm employing Armijo’s
rule as the line search method.
Recall Armijo’s rule with parameters ε ∈ (0, 1/2) and η > 1: in the k-th iteration, we choose the step size αk ≥ 0 such that it satisfies

(i) f(xk − αk g(xk)) ≤ f(xk) − ε · αk g(xk)T g(xk), and
(ii) f(xk − η αk g(xk)) > f(xk) − ε · η αk g(xk)T g(xk),

where g(xk) denotes ∇f(xk)T. Now, let {xk} denote the sequence generated by Gradient Descent with Armijo’s rule. Solve the following questions related to the convergence of this sequence.
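The iteration above can be sketched in code. This is a minimal illustration, not a model answer: the exam asks for MATLAB, and the test function, parameter values (ε = 0.2, η = 2), and starting point below are my own assumptions. The step-size search enforces both condition (i) at αk and the failure of the sufficient-decrease inequality at η·αk, which is condition (ii).

```python
import numpy as np

def armijo_step(f, g, x, eps=0.2, eta=2.0, alpha=1.0, max_tries=60):
    """Return a step size satisfying Armijo conditions (i) and (ii)."""
    gx = g(x)
    gg = gx @ gx

    def sufficient_decrease(a):      # condition (i) at step size a
        return f(x - a * gx) <= f(x) - eps * a * gg

    if sufficient_decrease(alpha):
        # Enlarge alpha until eta*alpha violates (i), giving condition (ii).
        while sufficient_decrease(eta * alpha) and max_tries > 0:
            alpha *= eta
            max_tries -= 1
    else:
        # Shrink alpha until (i) holds; the previous (failed) trial
        # step equals eta*alpha, so condition (ii) holds as well.
        while not sufficient_decrease(alpha) and max_tries > 0:
            alpha /= eta
            max_tries -= 1
    return alpha

def gradient_descent(f, g, x0, tol=1e-8, max_iter=500):
    """Gradient Descent with Armijo's rule as the line search."""
    x = x0
    for _ in range(max_iter):
        gx = g(x)
        if np.linalg.norm(gx) < tol:
            break
        x = x - armijo_step(f, g, x) * gx
    return x

# Hypothetical strongly convex test problem: f(x) = 0.5 x^T Q x, Q > 0,
# whose unique minimizer is the origin.
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
f = lambda x: 0.5 * x @ Q @ x
g = lambda x: Q @ x

x_star = gradient_descent(f, g, np.array([5.0, -3.0]))
print(x_star)  # converges toward the unique minimizer at 0
```

Note that g(xk) is evaluated once per outer iteration and held fixed throughout the line search, exactly as in conditions (i) and (ii) above.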
(Pages 240 and 241 of the textbook contain the convergence analysis of Gradient Descent with Exact Line Search and of Gradient Descent with Armijo’s rule. I strongly recommend that you read those two pages before you start working on this question; they contain useful hints. Note, however, that you are required to provide full derivations with sufficient justification in your answers, even for the results that were derived on pages 240-241.)