McCormick Magazine


Robotics: Specialized Machines Inspired By Biology


Admit it: as a child you thought you’d have a robotic housecleaner by now—like Rosie on The Jetsons. Sure, the Roomba robotic vacuum is one of the best-selling consumer robots, but the humanoid robots that seemed inevitable in science fiction have not yet become a reality. Over the past 50 years the robotics research community has experienced a series of stops and starts as scientists discovered that the problems of robotic sensing and movement were more difficult than anyone imagined. At McCormick, engineers have set their sights on designing specialized robots that can perform specific tasks—like navigating crowded, murky waters or replacing lost limbs—that humans find difficult or impossible to do. Six McCormick professors—all winners of National Science Foundation CAREER awards, all well regarded in the robotics field—are working to make some of these specialized robots a reality by focusing on the relationship between engineering and biology, increasing collaboration among disciplines, and partnering with the Rehabilitation Institute of Chicago.

From puppets to prosthetics
Todd Murphey, assistant professor of mechanical engineering, found his somewhat unlikely robotic subject a few years ago when he asked an undergraduate seeking a final project to create a mathematical model of how marionettes move. Around the same time, Murphey met Magnus Egerstedt, professor of electrical and computer engineering at Georgia Tech, who mentioned his collaboration with the Center for Puppetry Arts in Atlanta. “We began talking about how we could create an autonomous puppet system,” Murphey says, “and it started to become exciting from a technical engineering perspective.” Their ultimate goal is to stage an autonomous robotic puppet show that would be indistinguishable from a human-powered performance.

The implications of their work go beyond a puppet show, of course. The mathematical equations and computational simulations underlying the project could provide a new basis for controlling prosthetics. When Murphey and his group began creating computational methods to simulate marionettes in real time, they collaborated with Wendy Murray, assistant professor of biomedical engineering and of physical medicine and rehabilitation, who uses computer simulation and experimental methods to investigate how humans move and control their arms and hands.

These simulations are complicated. Consider a simple action like a hand holding a ball: there is a mechanical connection that runs from the palm through the thumb through the ball and then back through the other fingers before returning to the palm. The mathematics Murphey is using to simulate puppetry turned out to also be useful in creating simulations like Murray’s—which ultimately could make better prosthetics. “We want to be able to take healthy subjects, read electrical activity from their muscles as they move their hand, and see a simulated hand moving with the same motions in real time,” Murphey says. “If we can do that, that will be a huge step toward correctly interpreting what the nervous system is attempting to do. Then we could tell a prosthetic hand to make the same motion.”
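
As an illustration only (this is not Murphey’s actual algorithm; the linear decoder, weight matrix, and update rule below are hypothetical stand-ins), the real-time pipeline he describes can be sketched as a loop that turns measured muscle activity into simulated joint motion:

```python
# Hypothetical sketch of an EMG-to-motion pipeline: each joint's angular
# velocity is modeled as a weighted sum of muscle activation levels, and the
# simulated hand pose is advanced one control tick at a time.

def decode_joint_velocities(emg, weights):
    """emg: list of muscle activation levels (0..1, from surface electrodes).
    weights[j][m]: assumed contribution of muscle m to joint j.
    Returns one angular velocity (rad/s) per joint."""
    return [sum(w * a for w, a in zip(row, emg)) for row in weights]

def step_pose(angles, emg, weights, dt=0.01):
    """Advance the simulated joint angles by one control tick of length dt."""
    vel = decode_joint_velocities(emg, weights)
    return [q + v * dt for q, v in zip(angles, vel)]
```

In reality the mapping from muscle activity to motion is nonlinear and constrained by the closed-chain mechanics described above, which is exactly what makes the problem so hard.
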

Murphey’s goal has turned out to be much more difficult to reach than he anticipated. The group’s first 20-second puppet simulation took eight straight days to complete, and it took a year of algorithm development to get that down to less than a minute. The coding for the hand simulation took a year and a half. “This project has gotten stuck many times,” Murphey says, laughing. “That’s why we’ve learned a lot from it.”

Along the way Murphey has met with puppeteers to ask them what they think about when they perform. It turns out they do three related things: they imitate human motions, simplify the movements needed to create those motions, and then exaggerate the motions for display on stage. Replicating that process with robots is more challenging than it sounds. Even a motion as simple as waving a hand requires coordinated movement at both the shoulder and the elbow, making it difficult for a robot to appear natural.

In the course of its research, Murphey’s team has worked with the Walt Disney Company, which partially funds the project and is interested in creating autonomous puppets for its theme parks. The group also works with Elizabeth Jochum, a graduate theater student at the University of Colorado at Boulder, who created the puppets for the team’s first puppet show: a version of Ovid’s Pygmalion, in which a sculptor falls in love with his creation. The group also worked with Egerstedt at Georgia Tech and used motion-capture suits to record dancers performing the show; it will use that data to create the numerical model of how the puppets will perform.

In his lab in the basement of the Technological Institute, Murphey and his team have created a makeshift stage where puppets are strung by wires to tiny cars that control the wires while rolling around on a ceiling. The puppets move and wave according to motions made offstage by students using remotes from Nintendo’s Wii gaming system. Murphey hopes to stage a human-controlled puppet show later this year, but the ultimate goal is making the performance completely autonomous. Murphey estimates that is still far off but credits his successes along the way to his collaborations at McCormick.

“The robotics group here is amazing,” he says. “We have some of the best people in the country, each doing very different work. For me, the huge motivation in coming here was this diversity of faculty with whom I can collaborate and with whom my graduate students can commingle. I think that’s really valuable.”


Nature as inspiration
A neuroscientist and engineer by training, Malcolm MacIver, associate professor of biomedical and mechanical engineering, studies the black ghost knifefish, which lurks in the rivers of the Amazon basin. The fish is special: it hunts for prey by sensing distortions in a weak electric field it generates around its entire body, and it swims both forward and backward using a ribbon-like fin on the underside of its body. These qualities make the fish an excellent model for understanding how a nervous system implements sensing and movement.

The sensory capabilities and agility of the knifefish also make it an intriguing model for specialized underwater technologies, such as plugging a leaking oil pipe or monitoring fragile oceanic environments like coral reefs. A particularly surprising movement of the fish came to MacIver’s attention when one of his graduate students noticed the fish suddenly moving vertically in the water. How could it do that? And how could they build a robot that could do the same? Further observations and computer simulations revealed that when the fish moves vertically, it ripples its fin in a special way to create a downward jet of fluid. The jet produces a reaction force that pushes the fish up, allowing it to move in a purely vertical way.
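
The physics behind that vertical motion is Newton’s third law: the fin throws water downward, and the reaction to that momentum flux pushes the fish up. A back-of-envelope sketch (the jet area and speed below are illustrative numbers, not measured values from MacIver’s lab):

```python
# Upward reaction force from a steady downward water jet. The momentum
# carried away per second is (mass flow rate) * (jet speed), so
# F = rho * A * v * v, where A is the jet cross-section and v its speed.

RHO_WATER = 1000.0  # density of water, kg/m^3

def jet_reaction_force(jet_area_m2, jet_speed_m_s):
    """Reaction force (N) on the fish from a downward jet of water."""
    return RHO_WATER * jet_area_m2 * jet_speed_m_s ** 2
```

For example, a 1 cm² jet at 5 cm/s yields a force of about 2.5 × 10⁻⁴ N: tiny, but enough to hover a small, nearly neutrally buoyant body.
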

With this knowledge, MacIver’s group hired Kinea Design to design and build a robot that mimicked the fish’s maneuverability and electrosensory system. The company, which specializes in human interactive mechatronics and which was cofounded by McCormick mechanical engineering professors Ed Colgate and Michael Peshkin, began to fashion a waterproof robot with 32 motors that each independently control one ray of a spandex-covered artificial fin. (That means the robot has 32 degrees of freedom; in comparison, industrial robot arms typically have fewer than 10.) Seven months and $150,000 later, the GhostBot came to life.

“The robot is a tool for uncovering the extremely complicated story of how to coordinate movement in animals,” MacIver says. “By simulating and then performing the motions of the fish, we’re getting insight into the mechanical basis of the remarkable agility of a very acrobatic, nonvisual fish.”

MacIver and his team hope to improve the robot so it can autonomously use its sensory signals to detect an object and then use its mechanical system to position itself near the object.

MacIver isn’t the only professor who looks to animals for clues to sensing: Mitra Hartmann, associate professor of biomedical and mechanical engineering, and her group use the rat whisker system as a model to understand how the brain seamlessly integrates the sense of touch with movement. Using high-speed video to examine the relationship between rat head and whisker movements, Hartmann aims to gain insight into the underlying organization of the nervous system. She has developed several artificial whisker arrays that mimic the sensing powers of rats. One of them is able to use only information about how whiskers bend to determine an object’s complete 3-D shape. That technology could find applications on assembly lines, in pipelines, or on rovers or underwater vehicles.
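
The core idea behind sensing shape from bending alone can be illustrated with a simplified cantilever model (this is an idealized sketch, not Hartmann’s full algorithm): for a point contact at radial distance d along the whisker, the bending moment at the base is M = F · d, so the ratio of measured moment to transverse force recovers where the contact occurred. Sweeping the whisker through known angles then builds up a point cloud of the object’s surface.

```python
import math

def contact_distance(base_moment_nm, transverse_force_n):
    """Radial distance (m) from the whisker base to a point contact,
    from the cantilever relation M = F * d."""
    return base_moment_nm / transverse_force_n

def contact_point(whisker_angle_rad, base_moment_nm, transverse_force_n):
    """2-D contact location for a whisker swept to a known angle.
    Repeating this across many sweeps yields the object's shape."""
    d = contact_distance(base_moment_nm, transverse_force_n)
    return (d * math.cos(whisker_angle_rad), d * math.sin(whisker_angle_rad))
```
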

“We’re interested in the principles that underlie mammalian brain structure and function and in how information is transformed at various stages in the nervous system,” Hartmann says. “The rat whisker system is a wonderful model for looking at these sorts of questions.”

Humans and robots collaborate
The robotics community at McCormick has a solid foundation in the Laboratory for Intelligent Mechanical Systems, directed by Peshkin, Colgate, Murphey, and Kevin Lynch, professor of mechanical engineering.

Lynch’s specialty lies in robots that can do dynamic manipulation and locomotion—catching, juggling, running, hopping, and climbing and leaping between two walls. These robots could lead to new military technology for rough terrain where wheeled or tracked vehicles won’t work. With colleagues at the Northwestern Institute on Complex Systems, which he codirects, Lynch also looks to swarming behavior in nature (like flocks of birds) to create group intelligence in small “swarming” robots.
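
The flocking idea referenced above is usually illustrated with Craig Reynolds’s classic “boids” rules: each agent steers toward the group’s center (cohesion), matches its neighbors’ average heading (alignment), and avoids crowding (separation). The sketch below is a minimal textbook version of those rules, not Lynch’s actual swarm controllers:

```python
# One synchronous "boids" update: each agent adjusts its velocity using
# cohesion, alignment, and separation, then all agents move for time dt.

def flock_step(positions, velocities, dt=0.1,
               cohesion=0.01, alignment=0.1, separation=0.05, min_dist=1.0):
    n = len(positions)
    cx = sum(p[0] for p in positions) / n          # flock center of mass
    cy = sum(p[1] for p in positions) / n
    avx = sum(v[0] for v in velocities) / n        # flock average velocity
    avy = sum(v[1] for v in velocities) / n
    new_vel = []
    for i, ((x, y), (vx, vy)) in enumerate(zip(positions, velocities)):
        # Cohesion: steer toward the flock's center of mass.
        vx += cohesion * (cx - x)
        vy += cohesion * (cy - y)
        # Alignment: nudge heading toward the average velocity.
        vx += alignment * (avx - vx)
        vy += alignment * (avy - vy)
        # Separation: push away from neighbors that are too close.
        for j, (ox, oy) in enumerate(positions):
            if j != i and abs(ox - x) + abs(oy - y) < min_dist:
                vx += separation * (x - ox)
                vy += separation * (y - oy)
        new_vel.append((vx, vy))
    new_pos = [(x + vx * dt, y + vy * dt)
               for (x, y), (vx, vy) in zip(positions, new_vel)]
    return new_pos, new_vel
```

Simple local rules like these, iterated over time, are what produce the coherent group motion seen in bird flocks, with no leader and no global plan.
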

In addition, he works with professors at the Rehabilitation Institute of Chicago on restoring arm function to people who have suffered spinal cord injury. The approach is based on functional electrical stimulation, in which stimulators implanted into the muscles are used to bypass the brain-muscle nervous system connection that has been broken at the spinal cord. “RIC is a great resource,” he says. “Anybody who does anything with human-robot systems wishes they had RIC nearby.”

Colgate and Peshkin have also collaborated with RIC, most notably with Todd Kuiken to create the world’s most advanced bionic arm. Kuiken is professor of biomedical engineering at McCormick and of physical medicine and rehabilitation at the Feinberg School of Medicine, as well as director of RIC’s Neural Engineering Center for Artificial Limbs.

Using an innovative procedure called targeted reinnervation surgery, Kuiken grafted the nerve endings that once went into an amputee’s limb onto the amputee’s pectoral muscle. Once the nerve endings grew into their new location, Kuiken and his team could use sensors to read the impulses of the nerve to move a prosthetic limb. Unexpectedly, these nerve endings were also able to receive input, meaning that new prosthetic devices could actually provide touch sensation to the user similar to the way a real limb would. Colgate and Peshkin used their research in haptics—tactile feedback technology that uses touch as an interface—to give the arm “touch feedback” capabilities (see McCormick by Design, spring 2007).

“We’re interested in how people and mechatronic systems interact,” Colgate says. “We’re trying to create haptic devices that make that interaction better.”

Colgate and Peshkin found success several years ago developing what they called “cobots”—robots that help humans perform tasks. Cobots have been used as high-quality haptic displays, rehabilitation devices, and assistive devices for workers in automobile assembly. This last application has been developed into a spin-off company whose products are now used in auto plants around the world.

Colgate and Peshkin also remain busy with Kinea Design, which they founded in 2003. In addition to MacIver’s fish, the company has created rehabilitation robots such as the KineAssist, which helps stroke patients regain the ability to balance and walk, and, most recently, a robot that assists workers in meat-processing plants, helping them avoid repetitive-strain injuries. The company employs six full-time engineers, several of whom are McCormick alumni.
   
Meeting student demand
Not only has this core group of six professors expanded robotics research at McCormick, they’ve also created new mechatronics courses in response to student demand. The popular ME 333 Introduction to Mechatronics course, in which students work in teams to produce computer-controlled electromechanical projects of their own design, has been expanded into a three-course sequence covering electronics design, embedded computing, and mechatronics projects. A new lecturer, Nick Marchuk (MS ’10), is also working to expand the curriculum and runs the annual Design Competition, in which students design, build, and program robots to operate autonomously.

All of these faculty members work together on research projects, and in the next two years they will move into new shared lab space in an expansion of the Technological Institute.

As Colgate says, “McCormick is a force to be reckoned with in the robotics world.”

Emily Ayshford