From left, Leon Bottou, Yann LeCun, and Rob Fergus work in their corner of Facebook's New York City office. Dave Gershgorn / Popular Science

World Domination – Postponed

October 24, 2019

Leon Bottou of the Facebook Artificial Intelligence Lab discussed some of the key challenges facing machine learning today.

In his 2019 Distinguished OSL lecture, “Learning Representations Using Causal Invariance”, he noted that machine learning algorithms often capture spurious correlations in the training data distribution because the data collection process is subject to confounding biases. But if one has access to multiple datasets exemplifying the same concept, whose distributions exhibit different biases, one can learn what is common to all of them by projecting the data into a representation space that satisfies a causal invariance criterion. This idea differs from previous work on statistical robustness or adversarial objectives. An open question facing AI before it can claim “world domination” is therefore how to discover the actual mechanism underlying the data instead of modeling its superficial statistics.
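
In practical terms, this recipe amounts to training a shared representation on several environments (datasets collected under different biases) at once, while penalizing any environment in which a fixed classifier sitting on top of that representation could still be improved. The JAX sketch below is only an illustration of that general idea, not the method presented in the lecture: the linear featurizer, the gradient-norm penalty, the toy random data, and all names and hyperparameters (representation, invariance_penalty, lam, and so on) are assumptions made for this example.

```python
import jax
import jax.numpy as jnp

def representation(params, x):
    """Featurizer Phi(x); a single linear layer here, deeper in practice."""
    return x @ params["W"] + params["b"]

def env_risk(params, w, x, y):
    """Logistic risk of a linear classifier w applied to the representation
    (labels y are assumed to be in {-1, +1})."""
    logits = representation(params, x) @ w
    return jnp.mean(jnp.logaddexp(0.0, -y * logits))

def invariance_penalty(params, x, y):
    """Squared gradient of the per-environment risk with respect to a fixed
    'dummy' classifier (a vector of ones). If this gradient is small in every
    environment, the same classifier is near-optimal everywhere, which is the
    invariance criterion the representation is asked to satisfy."""
    dummy_w = jnp.ones(params["W"].shape[1])
    grad_w = jax.grad(env_risk, argnums=1)(params, dummy_w, x, y)
    return jnp.sum(grad_w ** 2)

def objective(params, environments, lam=1.0):
    """Sum over environments of the risk plus the weighted invariance penalty."""
    dummy_w = jnp.ones(params["W"].shape[1])
    return sum(
        env_risk(params, dummy_w, x, y) + lam * invariance_penalty(params, x, y)
        for x, y in environments
    )

# One gradient step on two toy environments (random data stands in for
# datasets collected under different confounding biases).
k0, k1, k2, k3, k4 = jax.random.split(jax.random.PRNGKey(0), 5)
params = {"W": 0.1 * jax.random.normal(k0, (5, 4)), "b": jnp.zeros(4)}
envs = [
    (jax.random.normal(k1, (64, 5)), jnp.sign(jax.random.normal(k2, (64,)))),
    (jax.random.normal(k3, (64, 5)), jnp.sign(jax.random.normal(k4, (64,)))),
]
grads = jax.grad(objective)(params, envs)
params = jax.tree_util.tree_map(lambda p, g: p - 0.01 * g, params, grads)
```

Driving the penalty toward zero in every environment pushes the representation toward features whose relation to the label stays stable across the differing biases, rather than toward spurious correlations that hold in only one of the datasets.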

Leon’s research has followed many turns that reflect the evolution of machine learning: neural network applications in the late 1980s, stochastic gradient learning algorithms and statistical properties of learning systems in the early 1990s, computer vision applications with structured outputs in the late 1990s, and the theory of large-scale learning in the 2000s. His current research aims to clarify the relation between learning and reasoning, with an emphasis on causation (inference, invariance, reasoning, and intuition).