Master Blaster Robotics

Not too long ago, I was watching a story on the evening news about a student graduating college who had been paralyzed in a drunk driving accident (he was the one who was driving drunk) just before starting college. Having known someone killed by a drunk driver, I wasn’t too sympathetic. But we all make mistakes, right? So I was touched, as most people would be, when he stood up and walked across the stage at graduation with the help of a robotic prosthesis built by the college’s technology department.

[Read the full article here.]   [Austin Whitney’s web site for drunk driving prevention]

But what got me was the technology they used.  I could understand it if this were a project for the engineering department’s curriculum.  But there was nothing unique about it, like memory metals for muscles.  In fact, it wasn’t much different from what we’ve already seen in Cyberdyne’s HAL-5 and Raytheon’s Sarcos XOS and XOS 2.  And unlike the HAL-5, with its 8:1 lifting boost, or the XOS, with its 20:1 lifting ratio, no lifting ratio was specified.  But it worked.  And with a little assistance, a paraplegic walked.

This got me thinking.  Sure, I thought about Ripley and Iron Man.  But I also started thinking about robots, specifically humanoid robots designed to move autonomously through the environment by analyzing and adapting to it.  And then I thought about the 1985 film Mad Max Beyond Thunderdome.  A couple of significant characters came to mind.  The first was called Master.  He was brilliant, a true genius.  But he was also a dwarf.  The other was called Blaster.  He was strong, powerful.  But he was also intellectually disabled.  Together they were called Master Blaster and formed a symbiotic relationship between mind and muscle.  So what about a symbiotic relationship between mind and machine?

What if there were a Master Blaster Robotics Platform which could move autonomously through an environment based on simple commands from the pilot?  Imagine that.  Instead of having to learn how to walk all over again, someone needs only to know some basic commands for a robot which can then carry out the movement on its own.  The idea certainly isn’t new.  But the technology to make it happen is right around the corner.

Robotics has come a long way toward autonomy.  Robots range anywhere from small toys like Sony’s Aibo to high-end research projects out of places like MIT and Caltech.  But the crown jewel of humanoid automatons is still Honda’s ASIMO.  It’s not perfect, but it has shown it can handle self-mobility in a varying environment.  So the basic platform is not the greatest issue.

The biggest factor to deal with is communication between man and machine.  The machine must be able to receive commands from the pilot, and the pilot must keep tabs on the processes of the machine.  Each one is likely to be quite different from the other.

There are a few different ways for a human to command a robot transport.  Sure, there’s the joystick method and the keyboard method.  But a couple of others are more likely.  The first is voice command.  This would work as long as the computer could differentiate between commands and talking to other humans (or to one’s self, for that matter), and as long as the environment didn’t require the pilot to be silent.  Of course, there’s also laryngitis.
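One common way to keep a robot from mistaking conversation for commands is to gate voice input behind a wake word, the way consumer voice assistants do.  Here’s a minimal sketch of that idea; the wake word, command set, and function names are my own illustrations, not part of any real transport’s interface.

```python
# Hypothetical sketch: gate voice input behind a wake word so the robot
# can tell commands apart from ordinary conversation.

WAKE_WORD = "pilot"
COMMANDS = {"forward", "back", "left", "right", "stop", "sit", "stand"}

def parse_utterance(text: str):
    """Return a command only if the utterance starts with the wake word
    followed by a known command; everything else is ignored as chatter."""
    words = text.lower().split()
    if len(words) >= 2 and words[0] == WAKE_WORD and words[1] in COMMANDS:
        return words[1]
    return None
```

With this gate in place, saying "I walked forward yesterday" to a friend does nothing, while "pilot, forward" moves the machine.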

The other method is what I believe to be the most likely: the brain/machine interface (BMI).  A BMI uses electroencephalography (EEG) as an interface.  (If you’ve ever seen somebody with a whole bunch of sensors attached around their cranium, then you’re somewhat familiar with EEG.)  BMI control was demonstrated in 2009 by Honda, when a person mentally commanded an ASIMO unit.  It was pretty rudimentary (right hand, left foot, etc.) and used a rather large computer, but it was over 90% effective.  Emotiv has an EEG interface with much more impressive results, and it only requires a PC or Mac; it connects over USB and costs less than US $300.  As long as the computer can tell the difference between thinking about going somewhere and actually wanting to go somewhere, BMI control would be the most effective.
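One simple way a BMI could separate a fleeting thought from a genuine intention is a dwell filter: the decoded intent has to stay stable across several consecutive EEG windows before the robot acts.  This is a hypothetical sketch of that scheme; the class, the "rest" label, and the window counts are my own illustrative assumptions, not taken from Honda’s or Emotiv’s systems.

```python
# Hypothetical sketch: require a decoded EEG intent to persist for N
# consecutive windows before it counts as a command, so idly thinking
# "walk" for a moment does not move the machine.

from collections import deque

class DwellFilter:
    def __init__(self, windows_required=5):
        self.required = windows_required
        self.recent = deque(maxlen=windows_required)

    def update(self, decoded_intent: str):
        """Feed one decoded EEG window; return an intent only once the
        buffer is full of that same non-rest intent."""
        self.recent.append(decoded_intent)
        if (len(self.recent) == self.required
                and len(set(self.recent)) == 1
                and decoded_intent != "rest"):
            return decoded_intent
        return None
```

A stray "walk" window returns nothing; only a sustained run of identical "walk" windows triggers the command, at the cost of a small reaction delay.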

As for the machine communicating with the human operator, there are also a number of ways to accomplish this.  The two most likely are audio and video.  Audio can be handled by a simple earpiece producing tones, speech, or both.  On the video side, which I think would be the better option, there could be a set of goggles with a heads-up display, similar to the laser goggle displays made by Microvision.  Either way, the machine talking to the pilot is probably the easiest hurdle to clear.
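The nice thing about machine-to-pilot feedback is that the same internal status can feed both channels: rendered as a spoken phrase for an earpiece, or as a compact line for a heads-up display.  A minimal sketch, with made-up status fields and formats purely for illustration:

```python
# Hypothetical sketch: one machine status, two renderings —
# a spoken phrase for an earpiece and a terse line for a HUD.

def to_audio(status: dict) -> str:
    """Phrase for text-to-speech in the pilot's earpiece."""
    return f"Battery {status['battery_pct']} percent. Gait {status['gait']}."

def to_hud(status: dict) -> str:
    """Compact single-line readout for a heads-up display."""
    return (f"BAT {status['battery_pct']:>3}% | "
            f"GAIT {status['gait'].upper()} | "
            f"OBST {status['obstacle_m']:.1f}m")

status = {"battery_pct": 72, "gait": "walking", "obstacle_m": 2.4}
```

Keeping the status model separate from its rendering means the pilot could switch between audio and HUD, or run both, without the machine changing anything about what it reports.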

As for powering the machine, batteries are currently being developed that behave like capacitors.  This would allow them to fully charge in minutes rather than hours.  That’s good, since hauling a human being around on a humanoid transport is a real drain on the power supply.
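The minutes-versus-hours difference comes down to the charge rate the chemistry can tolerate.  A back-of-the-envelope sketch, using made-up pack numbers and ignoring charging losses and taper:

```python
# Back-of-the-envelope sketch: ideal charge time scales inversely with
# charge current. The 20 Ah pack and the currents are illustrative
# assumptions, not specs of any real battery.

def charge_minutes(capacity_ah: float, charge_current_a: float) -> float:
    """Ideal charge time in minutes, ignoring losses and charge taper."""
    return capacity_ah / charge_current_a * 60

# A 20 Ah pack at a conventional 10 A (0.5C) rate: 120 minutes.
# The same pack at a capacitor-like 200 A (10C) rate: 6 minutes.
```

That 10C-style rate is exactly what capacitor-like batteries promise; conventional lithium chemistries typically can’t accept it without overheating.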

So the technology is there.  If it can be refined and combined, then the disabled walking by way of a Master Blaster robot is a very real possibility in the near future.  At the risk of being overly optimistic, I believe that, given enough focus and barring economic, business, political, human, and technological hurdles, paraplegics gaining full robotic mobility is not only a real possibility, it could become commonplace within a decade.  Quadriplegics could follow soon after.


Daniel C. Handley

Dan Handley was raised a Trekkie, fell in love with "Star Wars" at an early age, and became obsessed with comic book superheroes. He spent his youth dreaming of how to get real superpowers, starships, and so on.
