News

September 1st, 2022

Dr. Yurii Nesterov Appointed Distinguished Visiting Professor

Yurii Nesterov, one of the leading scholars in the field of optimization, is joining the faculty of the IEMS Department as a Distinguished Visiting Professor after retiring from the University of Louvain, Belgium. He is an expert in convex optimization and complexity theory. In 2022 he was elected a foreign member of the US National Academy of Sciences. He will be collaborating with OSL researchers and will teach a one-month course on modern convex analysis this spring.

March 8th, 2022

Kim Yu Selected As One of the Rising Stars in Computational and Data Sciences

IEMS Ph.D. student Qimeng (Kim) Yu has been selected to attend the Rising Stars in Computational and Data Sciences event, to be held April 20-21 in Albuquerque, New Mexico, and hosted by Sandia National Laboratories. With an acceptance rate of 26%, comparable to the selection rates of the most prestigious conferences in data science, the event brings together junior faculty and Ph.D. students who show exceptional promise. Kim's research, under the supervision of Prof. Simge Küçükyavuz, focuses on generalized submodular optimization.

November 8th, 2021

Two OSL Teams Among Winners of ARPA-E Grid Competition

A team including Jorge Nocedal and a team including Andreas Wächter and Ermin Wei placed 2nd and 7th, respectively, in the national grid competition organized by the Department of Energy (https://gocompetition.energy.gov/). The challenge was to develop software to create a more resilient and secure American electricity grid. The teams were tasked with devising optimal decisions for starting up or shutting down generators and for closing or opening transmission lines; a toy illustration of this type of problem appears at the end of this item. The teams employed the optimization software packages Knitro and Ipopt developed by OSL faculty:

https://www.mccormick.northwestern.edu/research/optimization-machine-learning-center/software-downloads/

The two Northwestern teams were collectively awarded more than $500,000 in prizes.
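
To give a flavor of the decisions involved, the following is a toy sketch only: a made-up three-generator commitment and dispatch problem, far simpler than the competition formulation, solved here with SciPy rather than Knitro or Ipopt. All generator data and the demand value are hypothetical.

    # Toy sketch: enumerate on/off commitments for 3 generators and, for each
    # feasible commitment, solve the continuous dispatch with SciPy's SLSQP.
    from itertools import product
    import numpy as np
    from scipy.optimize import minimize

    demand = 180.0                               # MW to serve (hypothetical)
    p_min = np.array([20.0, 30.0, 10.0])         # MW lower limits when a unit is on
    p_max = np.array([100.0, 120.0, 60.0])       # MW upper limits
    cost_quad = np.array([0.04, 0.03, 0.06])     # $/MW^2 fuel cost coefficients
    cost_lin = np.array([12.0, 10.0, 15.0])      # $/MW fuel cost coefficients
    startup = np.array([200.0, 300.0, 100.0])    # $ fixed cost of committing a unit

    best = None
    for u in product([0, 1], repeat=3):          # enumerate on/off decisions
        u = np.array(u, dtype=float)
        if (u * p_max).sum() < demand:           # committed capacity must cover demand
            continue

        def fuel_cost(p):
            return float(np.sum(cost_quad * p**2 + cost_lin * p))

        cons = [{"type": "eq", "fun": lambda p, u=u: float(np.dot(u, p) - demand)}]
        bounds = [(lo * ui, hi * ui) for lo, hi, ui in zip(p_min, p_max, u)]
        res = minimize(fuel_cost, x0=u * (p_min + p_max) / 2,
                       bounds=bounds, constraints=cons)
        if res.success:
            total = res.fun + float(np.dot(startup, u))
            if best is None or total < best[0]:
                best = (total, u.copy(), res.x.copy())

    print("commitment:", best[1], "dispatch (MW):", np.round(best[2], 1),
          "cost ($):", round(best[0], 2))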

November 4th, 2021

New GPU Machine for Deep Learning Research

OSL has purchased a dedicated 10-GPU server to support deep neural network computations.

The server currently holds 5 NVIDIA RTX A6000 GPU cards, each with 48 GB of GDDR6 RAM, and has 5 unused slots for future expansion. It has 2 Intel Xeon 6242R 3.1 GHz 20-core CPUs and 384 GB of system RAM. For permanent storage, it has eight 10K RPM drives in a RAID-5 configuration providing 6 TB of drive space.
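
As a quick sanity check (a sketch only, assuming PyTorch is installed on the machine), researchers can verify that the installed cards are visible before launching deep learning jobs:

    # Sketch: list the CUDA devices PyTorch can see on the new server.
    import torch

    if torch.cuda.is_available():
        for i in range(torch.cuda.device_count()):
            props = torch.cuda.get_device_properties(i)
            print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.0f} GB")
    else:
        print("No CUDA devices visible.")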

September 1, 2021

Tammy Kolda Appointed Distinguished Visiting Professor

Formerly a distinguished scientist at Sandia National Laboratories, Dr. Kolda currently serves as founding editor of the SIAM Journal on the Mathematics of Data Science. She is an expert in tensor decompositions for data analysis and in optimization. In 2020 she was elected to the National Academy of Engineering. Dr. Kolda will be collaborating with OSL researchers and will give lectures at the undergraduate and graduate levels.

October 24, 2019

Léon Bottou from Facebook AI Research discussed some of the key challenges facing machine learning today.

In the 2019 Distinguished OSL lecture “Learning Representations Using Causal Invariance”, he noted that machine learning algorithms often capture spurious correlations in the training data distribution because the data collection process is subject to confounding biases. If, however, one has access to multiple datasets exemplifying the same concept, whose distributions exhibit different biases, one can learn something common to all of them by projecting the data into a representation space that satisfies a causal invariance criterion. This idea differs from previous work on statistical robustness and adversarial objectives. An open question facing AI before it can claim “world domination” is therefore how to discover the actual mechanism underlying the data instead of modeling its superficial statistics.
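
As a rough illustration of the idea (a sketch, not the lecture’s own formulation), the code below fits a linear predictor on two synthetic environments whose spurious correlations differ, adding an invariance-style penalty in the spirit of invariant risk minimization that asks the per-environment gradient of a scalar probe to vanish; all data and the penalty weight are made up.

    # Sketch: learn a predictor that works across environments with different biases.
    import torch

    torch.manual_seed(0)

    def make_env(n, spurious_strength):
        # Toy environment: y depends causally on x1; x2 correlates with y with an
        # environment-dependent (spurious) strength.
        x1 = torch.randn(n, 1)
        y = x1 + 0.1 * torch.randn(n, 1)
        x2 = spurious_strength * y + 0.1 * torch.randn(n, 1)
        return torch.cat([x1, x2], dim=1), y

    envs = [make_env(500, 2.0), make_env(500, -1.0)]   # biases differ across environments

    w = torch.zeros(2, 1, requires_grad=True)          # shared predictor weights
    opt = torch.optim.Adam([w], lr=0.05)
    penalty_weight = 100.0                             # hypothetical trade-off parameter

    for step in range(500):
        risks, penalties = [], []
        for x, y in envs:
            dummy = torch.tensor(1.0, requires_grad=True)   # scalar probe "classifier"
            loss = ((x @ w * dummy - y) ** 2).mean()
            grad = torch.autograd.grad(loss, dummy, create_graph=True)[0]
            risks.append(loss)
            penalties.append(grad ** 2)                # gradient w.r.t. probe should vanish
        total = torch.stack(risks).mean() + penalty_weight * torch.stack(penalties).mean()
        opt.zero_grad()
        total.backward()
        opt.step()

    print("learned weights (causal x1 vs. spurious x2):", w.detach().view(-1).tolist())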

Léon’s research has followed many turns that reflect the evolution of machine learning: neural network applications in the late 1980s, stochastic gradient learning algorithms and statistical properties of learning systems in the early 1990s, computer vision applications with structured outputs in the late 1990s, and the theory of large-scale learning in the 2000s. Bottou’s current research aims to clarify the relation between learning and reasoning, with emphasis on causation (inference, invariance, reasoning, and intuition).