
New App Replaces Passwords with Gestures

RiskCogs uses a smartphone’s motion sensors and machine learning to determine how a user handles the device

You might not think the way you hold your smartphone is anything special. But everyone has their own signature hold that includes their grip, gestures, and the angle at which they hold the device. Northwestern Engineering’s Yan Chen is using this distinctive information to develop a dynamic authentication system for mobile devices that does not use sensitive data and is nearly impossible to hack.

Working with his PhD student Tiantian Zhu, Chen developed RiskCogs, a novel security system that collects data from the smartphone’s motion sensors — accelerometer, gyroscope, and gravity sensor — and uses machine learning to determine how its owner handles the device. The app then “knows” whether the device is being held by its owner or another person and only unlocks for the owner.
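The article does not describe RiskCogs’s internals, but the idea it outlines — summarize short windows of accelerometer, gyroscope, and gravity readings, then score how closely they match the owner’s learned profile — can be sketched as below. The feature choice (per-axis mean and standard deviation) and the simple distance-based classifier are illustrative assumptions, not the actual RiskCogs algorithm.

```python
import statistics

def window_features(accel, gyro, gravity):
    """Reduce one window of (x, y, z) samples per sensor to a feature
    vector: the mean and standard deviation of each axis."""
    feats = []
    for stream in (accel, gyro, gravity):
        for axis in range(3):
            samples = [sample[axis] for sample in stream]
            feats.append(statistics.mean(samples))
            feats.append(statistics.pstdev(samples))
    return feats  # 3 sensors x 3 axes x 2 stats = 18 features

def owner_score(features, owner_profile):
    """Euclidean distance to the owner's average feature vector;
    smaller means the window looks more like the owner."""
    return sum((f - p) ** 2 for f, p in zip(features, owner_profile)) ** 0.5

def should_unlock(features, owner_profile, threshold=1.0):
    """Unlock only when the current window is close enough to the
    owner's profile (threshold is a hypothetical tuning parameter)."""
    return owner_score(features, owner_profile) < threshold
```

In practice a production system would use a trained classifier rather than a fixed distance threshold, but the pipeline shape — sensor windows in, features out, owner-or-not decision — is the same.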

“Everyone has their own signature way of holding their phone,” said Chen, professor of computer science in Northwestern’s McCormick School of Engineering. “Every time you use the phone, the app continues to learn from you.”

From credit card numbers and bank account information to personal photographs, more and more people are storing sensitive information on their mobile devices. To keep this information secure, developers have used passwords, fingerprint and iris scans, facial recognition, and more. But all are imperfect solutions.

“Passwords are hard to remember and easy to guess,” Chen said. “Biometric security, such as fingerprint and iris scans, poses huge privacy risks. And facial recognition authentication is easily hackable.”

Gestures, by contrast, are not sensitive data, and they are nearly impossible to imitate. That makes RiskCogs extremely difficult to hack.

To train RiskCogs’s machine learning algorithm, Chen and Zhu used motion sensor data from more than 1,500 users. So far, it operates with 95 percent accuracy and will continue to improve as the algorithm learns. Because all smartphones come with embedded motion sensors, RiskCogs does not need any additional hardware to operate. It just takes one or two weeks of learning its owner’s motions, and then it can make the decision, in real time, to stay locked or to unlock.
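Chen’s remark that “every time you use the phone, the app continues to learn from you” suggests a profile that keeps adapting after the initial one- to two-week learning period. One common way to sketch that (an assumption here, not the documented RiskCogs method) is an exponential moving average: each window that successfully authenticates is blended into the stored profile with a small weight.

```python
def update_profile(profile, new_features, alpha=0.05):
    """Blend a small fraction (alpha) of each newly authenticated
    feature window into the stored owner profile, so the profile
    tracks gradual changes in how the owner handles the phone."""
    return [(1 - alpha) * p + alpha * f
            for p, f in zip(profile, new_features)]
```

With a small alpha, one unusual window barely moves the profile, while weeks of consistent use dominate it — matching the described behavior of gradual, continuous learning.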

“It’s completely unobtrusive,” Zhu said. “You don’t have to do anything except hold your phone, and it uses non-private data.”

Chen and Zhu imagine that RiskCogs could be particularly useful in devices such as smartwatches, which do not have embedded cameras for facial recognition or room for fingerprint identification.

Patent-pending, RiskCogs is now available for Android smartphones.