Robert Bixby is the Chief Strategy Officer and co-founder of Gurobi; in 1987 he also co-founded CPLEX Optimization. Gurobi and CPLEX are two of the main commercial solvers for linear programming (LP) and mixed-integer programming (MIP). Bixby published a 2012 survey paper titled A Brief History of Linear and Mixed-Integer Programming Computation.
One finds that of the three listed algorithms, primal simplex is now rarely the winner; dual simplex and barrier dominate. Moreover, because of current trends in computing machinery, with individual processors making relatively little progress and most increased power coming from more cores per CPU chip, the ability to exploit parallelism is becoming more and more important. Barrier algorithms can be, and have been, very effectively parallelized, while there has been essentially no success in parallelizing simplex algorithms. The result is that barrier is increasingly the winning algorithm when solving large linear programs from scratch. However, since crossover to a basis is an essential part of barrier algorithms, and this step is fundamentally a simplex computation and hence sequential, the fraction of time taken by crossover keeps growing.
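To make the simplex-vs-barrier distinction concrete, here is a toy sketch (my own made-up LP, not one of Bixby's benchmarks) using SciPy's interface to the HiGHS solver, which exposes both a dual-simplex method and an interior-point (barrier-style) method. Note that the HiGHS interior-point method performs a crossover to a basic solution by default, which is exactly the sequential step the quote above is worried about.

```python
# Sketch: solve the same small LP with a dual-simplex method and an
# interior-point (barrier-style) method via SciPy's HiGHS interface.
from scipy.optimize import linprog

# maximize x + 2y  subject to  x + y <= 4,  x + 3y <= 6,  x, y >= 0.
# linprog minimizes, so the objective is negated.
c = [-1, -2]
A_ub = [[1, 1], [1, 3]]
b_ub = [4, 6]

dual_simplex = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs-ds")
barrier = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs-ipm")

# Both methods reach the same optimum: x = 3, y = 1, objective value 5.
print(dual_simplex.x, -dual_simplex.fun)
print(barrier.x, -barrier.fun)
```

On a toy problem like this the method choice is irrelevant; the point of Bixby's observation is that on large-scale LPs the barrier iterations parallelize well while the simplex (and crossover) work does not.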
Here is Bixby’s dblp webpage and LinkedIn page (I always like how odd the expertise section of LinkedIn is; look at Bixby’s score in Optimization, maybe it doesn’t mean what I think it means). Google maintains an open-source LP solver, GLOP, and GNU maintains GLPK. Geraldo Veiga has a nice set of slides for a 2015 talk titled A Brief History of Computational Linear Programming. Hans Mittelmann at ASU maintains a spectacular LP performance benchmark site, Decision Tree for Optimization Software. Greg Glockner has a set of 2015 talk slides, Parallel and Distributed Optimization with Gurobi Optimizer, that seem useful for larger optimization problems. You can run optimization problems against the contemporary codes at NEOS; see the NEOS guide and NEOS server. Standard texts include: Nocedal and Wright, Numerical Optimization; Luenberger and Ye, Linear and Nonlinear Programming; Ruszczynski, Nonlinear Optimization; Bertsekas, Convex Optimization Algorithms; and Williams, Model Building in Mathematical Programming.
How does all this optimization play on Wall Street currently? Risk Magazine offered to train you up in ALM & Balance Sheet Optimization as recently as September 2016, predictably from a risk management perspective. Betsy Graseck, the Wholesale Banking equity analyst at Morgan Stanley, contributed to Morgan Stanley/Oliver Wyman, Mar. 2014, Wholesale & Investment Banking Outlook: Mis-allocated Resources: Why Banks Need to Optimize Now, and Mar. 2016, Learning to Live with Less Liquidity. We will be revisiting Ms. Graseck’s publications later in the blog, since she is the senior analyst covering Capital One Financial. McKinsey/Buehler et al., 2013, Between deluge and drought: The future of US bank liquidity and funding, has a good discussion of transfer pricing. Deloitte, Apr 2015, Capital efficiency and optimization, spends some effort discussing “optimizing the capital structure.” In all these publications I think the usage of the word “optimization” leans more in the direction of improvement, or “do better than you did last year,” than numerically optimizing relative to some objective function. Nevertheless the publications are useful in describing the infrastructure needed to run a (multi)trillion dollar accrual portfolio. JPM’s collateral optimization is somewhat more numerical – see Rivett, The Optimization Imperative, JP Morgan, 2012. Lichtfous and Hefner at Deloitte in 2014 wrote Optimisation Under Regulatory Constraints, outlining an approach that sounds more rigorous and numerical. There are optimization texts online and oriented to finance, for example Cornuejols and Tutuncu, 2006, Optimization Methods in Finance. Finally, optimization in the context of innovation, FinTech, and digital banking seems yet again distinct from simple numerical optimization of the accrual portfolio; see for example PWC, Retail Banking 2020.