Congratulations to Francesco Ceccon, Joshua Haddad, Jordan Jalving, Carl D. Laird, Ruth Misener, Alexander Thebelt, and Calvin Tsay for winning the 2022 COIN-OR Cup for their submission OMLT (https://github.com/cog-imperial/OMLT).
We didn’t *only* select OMLT because it makes us hungry for a yummy 🍳 omelet 🍳. OMLT represents neural networks and gradient-boosted trees in the algebraic modeling language Pyomo, so that trained models can be embedded directly into optimization formulations. Optimizing over trained surrogate models matters because it lets NNs or GBTs participate in larger decision-making problems. Computer science applications include maximizing a neural acquisition function and verifying neural networks; in engineering applications, machine learning models may replace complicated constraints or serve as surrogates in larger design and operations problems. OMLT supports GBTs through an ONNX interface and NNs through both ONNX and Keras interfaces, transforming these pre-trained models into Pyomo objects that encode the corresponding optimization formulations.
But also, the association with our tummies did help the winners, who we hope will make us breakfast.