Optimizing A Faster, Smarter, and Easier World

Big data plus increasing computational power empower researchers to use optimization algorithms to build a better world on a grander scale.

Searching for information online has become an integral part of everyday life. When you want to find something, a search engine rifles through 60 trillion pages and returns ranked results tailored precisely to your needs based on your location, language, previous searches, and more.

It all happens within a fraction of a second, and most people take it for granted—giving little or no thought to the complex programs that power these searches behind the scenes.

“People don’t fully understand that a search engine is a technological marvel,” says Jorge Nocedal, the David A. and Karen Richards Sachs Professor and Chair of Industrial Engineering and Management Sciences. “It’s an incredible human accomplishment. You type a fairly complicated sentence and receive reliable answers within half a second. I never expected that we would be able to get so much information so easily and so reliably in such a short amount of time.”

How does this happen? Much of the answer can be found in the science of optimization, and search engines are just one example of how this fast-growing discipline is making the world run faster, smarter, and easier. At Northwestern, engineers use optimization to improve everything from software to healthcare to renewable energy. With the University’s new Center for Optimization and Statistical Learning (OSL) now open to support this work, optimization is finding its way into innovative collaborations across disciplines and schools.


Fueled By Data

The science of optimization has been around for a long time. Early engineers often used it to find the best designs for performance and aerodynamics. Nocedal, for example, remembers using optimization algorithms decades ago during his first job as an undergraduate at the National University of Mexico in Mexico City.

“I was probably 17 or 18 years old, and the astronomy department asked me to help design a new telescope with optimal optical properties,” he says. “The question was how to use optimization algorithms to find the exact shape of a lens that would deliver the best possible quality images. I got hooked and have been optimizing ever since.”

Yet the field is anything but static. The relentless influx of big data, paired with increasingly powerful computing, has transformed optimization in ways unimaginable even a decade ago.

“It’s really the scale that has changed,” says David Morton, professor of industrial engineering and management sciences and director of the new Center for Optimization and Statistical Learning. “The number of parameters or decision variables that you can optimize has increased to scales on the order of millions or tens of millions. It’s not just big data and increased computational power, but new algorithms that can handle and exploit those things.”

Squeezing Uncertainty

The new Center, with its dual focus on optimization and statistical learning, has turned optimization science into a team sport. Optimization makes processes as effective as possible; statistical learning ensures that optimization algorithms keep getting smarter and better able to handle incoming data.

Statistical learning provides a framework for machine learning, a subfield of computer science that builds algorithms capable of learning from data and using that learning to make better predictions. To help accelerate machine learning, the Center connects researchers in engineering with experts in computer science.

“We have trained algorithms with a lot of data, using massive computing resources, and something big has happened,” Nocedal says. “We have created systems that are more intelligent than ever before.”
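
At its core, “training with data” means solving an optimization problem: choose a model’s parameters to minimize a measure of error on the data. The sketch below is a generic illustration, not code from the Center; it fits a linear predictor by gradient descent, the workhorse algorithm behind much of machine learning.

```python
import numpy as np

# A minimal sketch of learning as optimization (illustrative only):
# fit a linear predictor by minimizing mean squared error on data
# using plain gradient descent.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                 # 200 examples, 3 features
true_w = np.array([1.5, -2.0, 0.5])           # "ground truth" used to fake data
y = X @ true_w + 0.1 * rng.normal(size=200)   # noisy observations

w = np.zeros(3)        # decision variables: the model's parameters
lr = 0.1               # step size
for _ in range(500):
    grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
    w -= lr * grad                         # step downhill toward the minimizer

print(w)  # converges close to true_w
```

Real systems follow the same recipe at the scale Morton describes: millions of parameters, updated with stochastic variants of this gradient step on batches of data.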

The ultimate goal is to develop better decision-making models that squeeze out as much uncertainty as possible. Uncertainty arises from imperfect or unknown information, and optimization models attempt to account for it.

For example, Morton’s latest work, a multi-institutional collaboration funded by the US Department of Energy, focuses on designing and building systems that generate solar power by using mirrors to concentrate large areas of sunlight into a small area.

“There is uncertainty in terms of solar radiance,” he says. “What will happen when there’s cloud cover? What if the mirrors fail? How does the system change as it degrades over time? You have to take uncertainty into account to find the optimal design under all those conditions.”
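
One standard way to take such uncertainty into account is scenario-based stochastic optimization: enumerate possible futures, weight each by its probability, and pick the design that minimizes expected cost. The toy sketch below is purely illustrative; its scenarios, prices, and penalty are invented, not taken from Morton’s project.

```python
import numpy as np

# Illustrative only: size a mirror field under uncertain solar radiance.
# Each scenario is a possible radiance level with a probability; any
# shortfall against demand incurs a penalty cost.

scenarios = np.array([0.4, 0.7, 1.0])   # power produced per unit of mirror area
probs     = np.array([0.2, 0.5, 0.3])   # probability of each scenario
demand    = 100.0                       # power that must be delivered
build_cost, penalty = 1.0, 3.0          # cost per unit area; cost per unit shortfall

def expected_cost(area):
    output = scenarios * area                     # production in each scenario
    shortfall = np.maximum(demand - output, 0.0)  # unmet demand in each scenario
    return build_cost * area + penalty * (probs @ shortfall)

areas = np.linspace(0.0, 400.0, 4001)             # candidate field sizes
best = areas[np.argmin([expected_cost(a) for a in areas])]
print(best)  # about 142.9: enough area to cover the middle scenario
```

Note how the optimizer overbuilds relative to the sunniest scenario: paying for extra mirror area up front beats the expected penalty of falling short on cloudy days.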

Northwestern Engineering Puts Optimization to Work

Northwestern researchers and innovators leverage optimization principles across wide-ranging endeavors in groundbreaking ways.

Optimizing Teams

Noshir Contractor works to map, understand, and enable the most effective networks in a wide variety of contexts, including for teams traveling to Mars.

Optimizing Wi-Fi

Randall Berry helps develop distributed resource allocation techniques for wireless networks.

Optimizing Search

Douglas Downey develops techniques and prototypes that extend the state of the art for search engines.

Optimizing Software

Andreas Wachter develops open-source, general-purpose optimization software that can be used in many different fields across both academia and industry.

Optimizing Music

Bryan Pardo turns complicated audio production software into easy-to-understand interfaces and terms.

Optimizing The Grid

Ermin Wei works to help make the power grid more cost-effective and sustainable.

Optimizing Shipping

Diego Klabjan helps logistics companies plan the most efficient shipping routes.

Optimizing The Marathon

Karen Smilowitz works to make the Chicago Marathon safer and smoother.

Optimizing Organs

Sanjay Mehrotra developed a model to help optimize kidney distribution for organ donation.

Robust Optimization

Sometimes uncertainty can mean the difference between life and death. Omid Nohadani, associate professor of industrial engineering and management sciences, uses algorithms to optimize radiation therapy. His work accounts for how cancer changes as a tumor shrinks or grows, and even how a tumor’s position shifts as a patient coughs or breathes during therapy. The radiation must be delivered at angles that take these subtle changes into account.

“You want to use the best angles and radiation intensity to harm the tumor while sparing the healthy tissue,” Nohadani says. “There are many variables you have to take into account to deliver the optimal dose.”

To meet this goal, Nohadani uses a strategy called robust optimization, a methodology that has developed rapidly over just the past 15 years. Simply stated, its algorithms take risk and uncertainty into account and acknowledge that the nominally optimal solution is not always the best one. With radiation therapy, for example, a dosage at the intensity that shrinks the tumor fastest might not be the best therapy for the patient’s overall health. A lower dose might take longer to shrink the tumor, but it can also minimize damage to the surrounding tissue when uncertainties arise.
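
Stated compactly, robust optimization replaces “minimize the cost” with “minimize the worst-case cost over a set of possible uncertainties.” The toy sketch below contrasts the two; the dose model and all numbers are invented for illustration and are not Nohadani’s formulation.

```python
import numpy as np

# Illustrative only: choose a dose intensity d. Residual tumor shrinks
# with the delivered dose, tissue damage grows with d, and an uncertain
# factor u (e.g., patient motion) scales how much dose reaches the tumor.

doses = np.linspace(0.0, 2.0, 201)   # candidate dose intensities
u_set = np.linspace(0.7, 1.0, 31)    # uncertainty set for the delivery factor

def cost(d, u):
    residual_tumor = np.exp(-u * d)  # tumor remaining after treatment
    tissue_damage = 0.3 * d**2       # harm to surrounding healthy tissue
    return residual_tumor + tissue_damage

# Nominal optimum: assume perfect delivery (u = 1).
nominal = doses[np.argmin([cost(d, 1.0) for d in doses])]
# Robust optimum: guard against the worst u in the uncertainty set.
robust = doses[np.argmin([max(cost(d, u) for u in u_set) for d in doses])]

print(nominal, robust)  # the robust dose comes out slightly lower
```

Because the worst case assumes less of the dose reaches the tumor, the marginal benefit of intensity shrinks, and the robust solution settles on a slightly lower, safer dose, the kind of “safer bet” Nohadani describes.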

“In healthcare, if you make a small mistake, it can make a big impact,” Nohadani says. “You want to make the safer bet.”

Optimizing The Future

When Jorge Nocedal develops optimization algorithms, he keeps an eye on the future. In recent years, his work has focused on speech recognition systems. He imagines a world in which computers understand and immediately translate what people say into other languages, which would enable two people speaking different languages to converse meaningfully in real time.

Nocedal also envisions that the optimization algorithms for speech recognition could be applied to image recognition. To do this, he rejects the linear models that dominated the field for so long and instead uses non-linear models, which allow for greater complexity and freedom.

“Non-linear algorithms used to scare people,” he says, “but they are responsible for recent advances in speech and image recognition.”

Nocedal illustrates the difference between linear and non-linear models with basic economics. In a linear model, automobile prices rise steadily, a little each year. A non-linear model can also capture disruptions: a sale, or an industry that falters, causing prices to drop periodically.
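
The contrast is easy to see in a few lines of code. The sketch below uses invented price data: it fits both a straight line and a model with a sinusoidal term to prices that rise steadily but dip periodically, and only the non-linear model captures the dips.

```python
import numpy as np

# Illustrative only: prices that rise linearly but dip on a cycle.
years = np.arange(20, dtype=float)
prices = 20.0 + 0.8 * years - 3.0 * np.sin(years)

# Linear model: price = a + b * year, fit by least squares.
A_lin = np.column_stack([np.ones_like(years), years])
coef_lin, *_ = np.linalg.lstsq(A_lin, prices, rcond=None)

# Non-linear model: add a sin(year) feature to capture the dips.
A_nl = np.column_stack([np.ones_like(years), years, np.sin(years)])
coef_nl, *_ = np.linalg.lstsq(A_nl, prices, rcond=None)

for name, A, c in [("linear", A_lin, coef_lin), ("non-linear", A_nl, coef_nl)]:
    rmse = np.sqrt(np.mean((A @ c - prices) ** 2))
    print(f"{name} model error: {rmse:.2f}")  # the non-linear fit is far tighter
```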

“A lot of behavior does not follow a line,” Nocedal says. “A lot of behavior follows curves. After all, the world is not a linear place.”