I like the way that Nocedal and Wright introduce the interior point method. They seem to focus more on the primal-dual version. Also, just in general, I find their book more cohesive than Convex Optimization because it builds a story instead of just collecting sets of facts and examples. Both are important references!
As I've written, I find the BV story very cohesive when viewed as a programming language. NW (which is also an incredible textbook) is focused solely on algorithms, and that gives it a cleaner, more focused story.
I can definitely relate to what you are writing about covering examples from applications, given that the 2024-2025 academic year is my optimization year: I am about to wrap up teaching a graduate class based on Luenberger's _Optimization by Vector Space Methods_ and switch over to teaching optimization to undergrads using the book you wrote with Steve Wright. Most of Luenberger's book holds up remarkably well, including the examples, although I also make sure to cover a lot of the "current thing" kind of material (RKHS, optimal transport, statistical inference using optimization à la Juditsky-Nemirovski). The main difficulty is exactly as you outline: conveying enough of the philosophy of each application domain (economics, optimal control, statistics, ML) to ground the problem. I suspect it will be easier with the undergrad course, since the emphasis is mostly on applications in ML and data analysis.
IMO, the book with Steve is easier to motivate because
(a) we're so focused on solving regularized ERM problems, and
(b) it's a course on algorithms, not a broader theory.
Boyd and Vandenberghe have a more ambitious agenda. Perhaps too ambitious for a semester?
But I'm very curious to hear how that material goes over with the undergrads. Please report back. Or live blog it!