These topics are most relevant in specialized fields like theoretical computer science, control theory, or operations research. But a basic understanding of these powerful techniques can also be fruitful in the practice of machine learning. Virtually every machine-learning algorithm aims to minimize some kind of estimation error subject to various constraints, which is an optimization problem. Here are the topics to learn:

- Basics of optimization, how to formulate the problem

- Maxima, minima, convex functions, global solutions

- Linear programming, simplex algorithm

- Integer programming

- Constraint programming, knapsack problem

- Randomized optimization techniques: hill climbing, simulated annealing, genetic algorithms
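To make the last item concrete, here is a minimal sketch of randomized hill climbing in plain Python: propose a small random perturbation of the current solution and keep it only if it improves the objective. The function and parameters are illustrative choices, not taken from any particular library.

```python
import random

def hill_climb(f, x0, step=0.1, iters=1000, seed=0):
    """Randomized hill climbing: repeatedly perturb the current point
    and accept the move only when it lowers the objective f."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    for _ in range(iters):
        candidate = x + rng.uniform(-step, step)
        f_candidate = f(candidate)
        if f_candidate < fx:  # greedy: accept only improving moves
            x, fx = candidate, f_candidate
    return x, fx

# Minimize a simple convex function with a known minimum at x = 3.
x_best, f_best = hill_climb(lambda x: (x - 3) ** 2, x0=0.0)
```

Simulated annealing differs only in that it sometimes accepts *worsening* moves with a probability that decays over time, which helps it escape local minima that pure hill climbing would get stuck in.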

**Where You Might Use It**

Simple linear regression problems using a least-squares loss function often have an exact analytical solution, but logistic regression problems don't. To understand the reason, you need to be familiar with the concept of "convexity" in optimization. This line of investigation will also illuminate why we must settle for "approximate" solutions in most machine-learning problems.
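The contrast can be shown in a few lines of NumPy. The least-squares loss is quadratic, so setting its gradient to zero yields the closed-form normal equations; the logistic loss is convex but not quadratic, so we must iterate toward the minimizer. The synthetic data and the step size below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one feature

# Linear regression: quadratic loss, so the normal equations
# X'X w = X'y give the exact minimizer in one solve.
y_lin = 2.0 + 3.0 * X[:, 1] + rng.normal(scale=0.1, size=n)
w_exact = np.linalg.solve(X.T @ X, X.T @ y_lin)

# Logistic regression: no closed form exists, so we approximate the
# minimizer of the (convex) log loss by gradient descent.
y_log = (X[:, 1] > 0).astype(float)
w = np.zeros(2)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ w))           # predicted probabilities
    w -= 0.1 * X.T @ (p - y_log) / n           # gradient step on log loss
```

The first fit recovers the true coefficients (2.0 and 3.0) exactly up to noise in one step; the second only converges toward the optimum, which is exactly the "approximate solution" situation described above.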