From Submissive Sidekick to Powerful Partner

How artificial intelligence can power true collaboration between humans and machines.


Alexa and Siri, move over. Virtual collaborators are coming.

It’s a longstanding promise that’s finally within reach: the transformation of artificial intelligence (AI) from a tool we use for routine tasks to a constant companion and tireless co-worker that complements our strengths and compensates for our weaknesses.

“We’re on the cusp of a shift in how people interact with computers,” says Ken Forbus, the Walter P. Murphy Professor of Computer Science at Northwestern Engineering. “Alexa and Siri are relatively simplistic and crude compared to what’s to come.”

In the not-too-distant future, Forbus predicts, computers using sufficiently smart “software social organisms” will gain intimate knowledge of the humans they encounter, anticipate their needs, and find flaws in their reasoning. Those general yet complex capabilities will enable the human and the machine to become true collaborators—bouncing ideas off one another and playing off each other’s strengths.

“Having AI that complements us as human beings, and can even collaborate with us, will make everything work better,” Forbus says. “While the sheer complexity of the problems we face is constantly increasing, as a species we’re not becoming more complex. The human mind and our cognitive capacity have reached their limits.”

Software as Social Organism

Though this quantum leap for AI remains years away, researchers have begun to create “software social organisms” that can learn a vast range of knowledge and have the capacity to reason. “Social” refers to the software communicating with humans naturally and effectively. “Organisms” refers to the software being self-sufficient.

“Instead of humans learning to code using the language of the computer, the machines will know how to speak with us using language and images,” Forbus says. “They will come to us.

“Today, all machine learning involves humans in the process,” he continues, “but soon that will go away. In these new systems, there will be no human fiddling behind the scenes, which is what we have with Google and other leading-edge companies now. The machine will set its own learning goals and priorities for self-improvement.”

Take Heart, Alexa

“The current intelligent personal assistants, like Alexa, are really kind of cool and at the same time really kind of sad,” he says. “They can help with little tasks, but they don’t get to know you or your preferences over time. They can’t put things in context. They can’t answer complex questions using multiple data sets.”

Forbus says that will change as AI continues to advance exponentially. Just as apprentices start working as assistants to their mentors and move incrementally toward autonomy, software social organisms will learn from their mentors, growing to the point of engaging in true dialogue, posing robust follow-up questions, and making complex connections.

“These organisms will not be sycophants,” Forbus says. “They’ll not only collaborate, they can also serve as devil’s advocates, looking at the other side of an issue to help point out flaws in our arguments and reasoning while pushing our work to higher levels.”

Impact on Pedagogy

This coming shift—which essentially removes humans from the process of regulating machines—already has implications for pedagogy. “Today’s students are being taught how to run machine-learning software, but that’s not a long-term skill. We’re not going to have data scientists in the long term,” he says.

Anticipating that shift, Northwestern Engineering’s new Master of Science in Artificial Intelligence program will expose students to the research and technology that develops virtual collaborators. Forbus’s lab has developed one model for teaching computers to reason like humans and even to make moral decisions, and another for equipping them to perform at human levels on standardized tests.

“While everyone else is focused on creating AI tools,” he says, “we’re talking about something entirely different. Software social organisms would have agency with unlimited opportunities to help students. There are never enough humans in education. Imagine if you had a one-on-one system that helped motivate you and tutor you on your specific needs. That would be amazing.”

How much research, engineering, and evaluation is needed to make these collaborators a reality remains unknown, but progress has been made. For example, Narrative Science, an AI company co-founded by Northwestern computer science professors Kris Hammond and Larry Birnbaum, interprets data and turns it into English-language stories written for specific audiences.

Ultimately, Forbus says, software social organisms will fit into the culture where they reside and become full-fledged partners, making our increasingly complex work more productive and efficient.