Can a Piano-Playing Robot Cure Lonely Feelings?

Master of Science in Robotics student Musheng He looks back on her independent study project and what she learned from the experience.

When Musheng He considered what she wanted to pursue for her independent study project in Northwestern Engineering's Master of Science in Robotics (MSR) program, she thought about how technology has brought the world closer together. Yet despite these innovations, studies have shown that rates of loneliness are rising.

Intrigued by this paradox, He wondered if there was a way to have robots fill that loneliness void.

She decided to try to design a system that would allow the Baxter robot to detect negative emotions in a person and then play songs on a piano to help them manage those feelings. She recently took the time to talk about the project and what she learned from the experience.

What was your motivation for the project?

My motivation came from John Donne's "No Man is an Island." Living in an era when a friend is only one video chat away, communication seems simpler and faster than ever before. However, rates of loneliness have significantly increased in the last few years, and many people feel as though they are on their own isolated islands. Even I have times when I feel lonely and wish someone could be by my side. Inspired by this, I designed this project to see if robots can help. If you feel sad and can't find someone you trust to talk with, a robot can be by your side to make you happy.

Why did you choose the piano?

I personally like playing the piano, and the piano is a widely used instrument in music therapy.

How would you describe how you used machine learning to make the robot play music?

Making the robot play music involves aspects of motion planning, force control, and computer vision. The project has two parts. The first uses machine learning to build a model that takes an image of a human face as input and outputs an emotion: when the camera captures your face, the system sends the image to the model, which predicts your emotional state. The second is motion planning: if a negative emotion such as sadness is detected, the motion-planning code runs and the robot starts to play music.
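The detect-then-play loop she describes can be sketched as follows. This is a minimal illustration, not the project's actual code: `classify_emotion` and `play_song` are hypothetical stand-ins for the trained model and Baxter's motion-planning routine.

```python
# Sketch of the pipeline: camera frame -> emotion label -> piano routine.
# All names here are illustrative stand-ins, not the real project code.

NEGATIVE_EMOTIONS = {"sad", "angry", "fearful"}

def classify_emotion(face_image):
    """Stand-in for the trained model: face image in, emotion label out."""
    return "sad"  # placeholder prediction

def play_song(log):
    """Stand-in for the motion-planning routine that plays the piano."""
    log.append("played a song")

def on_frame(face_image, log):
    """Classify the captured face; trigger the piano on a negative emotion."""
    emotion = classify_emotion(face_image)
    if emotion in NEGATIVE_EMOTIONS:
        play_song(log)
    return emotion

log = []
on_frame("frame.png", log)  # returns "sad"; log now holds the piano action
```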

What was the biggest challenge you faced?

The biggest challenge of my project was the motion planning for Baxter. The MoveIt! controller can fail easily: even when a path was planned and executable, Baxter wasn't always able to actually follow the trajectory. Since the piano keyboard is relatively small, the system needed a lot of parameter tuning and tinkering. Getting a smooth trajectory also wasn't easy. It's still not perfect, but the whole trajectory looks nice, and I'm so glad I finally was able to make it work.
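One common way to cope with a controller that plans successfully but fails during execution is a plan-then-execute retry loop. The sketch below illustrates that pattern with stubbed functions; it is not the MoveIt! API, and the planner and executor here are assumptions for demonstration only.

```python
import random

# Illustrative plan-then-execute retry loop. plan_to_key and execute are
# stand-ins for a real planner and a controller that sometimes fails.

random.seed(0)

def plan_to_key(key):
    """Stand-in planner: always returns a trajectory in this sketch."""
    return f"trajectory-to-{key}"

def execute(trajectory):
    """Stand-in executor: succeeds only some of the time, as a flaky
    hardware controller might."""
    return random.random() > 0.5

def press_key(key, max_attempts=5):
    """Re-plan and retry until the controller accepts the trajectory."""
    for attempt in range(1, max_attempts + 1):
        trajectory = plan_to_key(key)
        if execute(trajectory):
            return attempt  # number of attempts it took
    raise RuntimeError(f"could not press {key} in {max_attempts} attempts")
```

Re-planning on each attempt (rather than re-sending the same trajectory) gives the planner a chance to find a path the controller can actually track.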

What are the most important lessons you've learned from this project?

The biggest takeaway from this project was learning to make an effective action plan. Breaking a large project into several smaller milestones and working step by step kept me focused and committed to what mattered, and ultimately helped me reach my goal.

How do you think you'll be able to incorporate the lessons you learned from the project into your life after graduation?

This project was my first independent robotics project, and it really boosted my self-confidence in my ability to get things done. I also know I can tackle big problems by breaking them into smaller tasks.

What do you hope to do after MSR?

My ultimate goal is to design robotics systems to make people live better lives. So after MSR, I hope to use my skills and knowledge to develop applications for robots to perform actions that benefit children, seniors or people with disabilities.

McCormick News Article