One of the fastest-growing fields in optimization is derivative-free optimization, or DFO, and Katya Scheinberg is among its pioneers. She holds a patent on DFO software and is coauthor of a textbook on the subject published in 2008.

Scheinberg, associate professor of industrial and systems engineering, uses an analogy to explain DFO. As you walk in the dark down a slope into a valley, your feet detect the incline. When you reach the bottom, your feet detect level ground.

“The typical derivative-based algorithm tells you which way is down,” says Scheinberg. “But in DFO, you have to find the bottom of the valley when it is filled with water. You can’t feel the slope, but you can drop an anchor at different points to measure depth.

“Intuitively, DFO requires much more effort. Our research helps minimize simulation costs by designing intelligent approaches that do not use too many measurements and [that] progress to solutions.”
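The anchor-dropping idea can be made concrete with compass search, one of the simplest derivative-free methods. The sketch below is a generic illustration, not Scheinberg's patented algorithm: the objective is treated as a black box, and the method only "drops anchors" by evaluating it at trial points, never computing a slope. The function and starting point in the example are hypothetical.

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_evals=10_000):
    """Minimize f over R^n using only function evaluations (no derivatives)."""
    x = list(x0)
    fx = f(x)
    evals = 1
    n = len(x)
    while step > tol and evals < max_evals:
        improved = False
        # Probe along each coordinate direction, +step and -step.
        for i in range(n):
            for sign in (+1.0, -1.0):
                trial = x[:]
                trial[i] += sign * step
                ft = f(trial)
                evals += 1
                if ft < fx:          # found a deeper "anchor point"
                    x, fx = trial, ft
                    improved = True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5              # no descent found: refine the mesh
    return x, fx

# Example: minimize a smooth bowl f(x, y) = (x - 1)^2 + (y + 2)^2
xmin, fmin = compass_search(lambda v: (v[0] - 1)**2 + (v[1] + 2)**2,
                            [0.0, 0.0])
# xmin approaches (1, -2), the bottom of the bowl
```

Each unsuccessful sweep halves the step size, which is exactly where the measurement cost mounts: every refinement spends another round of evaluations, so methods that waste fewer probes matter when each "depth measurement" is an expensive simulation.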

At IBM, Scheinberg patented software that has been used to fine-tune decisions about wire width, transistor properties, power consumption and other concerns related to circuit design.

At Lehigh, she is working with Prof. Brian Chen of computer science and engineering to detect the binding properties of a class of proteins used in drugs. Chen uses Scheinberg’s algorithm to discover similar patterns in proteins important in drug development.

Optimization has become an even hotter field, Scheinberg says, since the web made it possible to build algorithms on large-scale data derived from Internet searches.

“When optimization originated in 1947, algorithms handled 10 variables. Now we have billions. We kept pace before because computer speed increased. But as soon as we thought we could solve anything, along came the web, creating new problems and opportunities. More data allows for new models, and so we’ve seen a boom in interest in optimization in general and in DFO in particular.”