TORONTO/NEW YORK – The robot hand has been wrapped in artificial skin in order to make it feel more human, the professor from Japan is explaining. Hearing this, I want to take a quick, discreet glance around to see if everybody else in the room thinks it's as weird as I do, but I can't. Like them, I am getting to see this talk because I'm attending a massive conference on human-computer interaction, which has brought thousands from around the world to a convention center in Toronto. But unlike them, I'm not quite there.
Hideyuki Nakanishi explains that he and his team from Osaka University have built a "remote handshaking system" that gives the power of touch to people video-chatting with each other through a screen. By sliding your real hand into the fake one attached to your monitor, you can gauge the confidence in a prospective business partner's grip, feel your far-away girlfriend's fingers, or greet your father-in-law with a manly shake. The hand even maintains human body temperature, so when you touch it, it'll be warm, just like real flesh.
By the time Nakanishi shows a video of a Japanese pop idol shaking hands with the device on national television, the MIT researcher who brought me to the conference is losing his mind with delight. "SO CRAZY," he types into a chat window. Soon after, the talk draws to a close and everyone in the room starts enthusiastically clapping.
Everyone but me. As Nakanishi thanks the audience and prepares to take questions, I find myself looking down at my own hands, which are perched on a keyboard about 350 miles away, in a New York apartment that also contains the rest of my body. In the conference room, I am only "present" in the form of a very simple robot. My face is being projected on an iPad screen, my voice can be heard through a speaker, and the rest of me is represented by an adjustable pole connected to some kind of computerized base with wheels that I can control from my distant laptop.
The MIT team responsible for bringing this machine to Toronto and arranging for it to receive press credentials (yes, I'm wearing a badge around my aluminum neck) has great hopes for it. Having christened it the People's Bot, they have been using it over the past several days to demonstrate how robotic "telepresence" (controlling the movement and interactions of a physical proxy in a distant place) could help radically expand our individual horizons.
Their ambitions are of the world-changing variety, but they do not solve my immediate problem. Under normal circumstances, it would be strange of me not to join in the show of appreciation for Nakanishi's presentation. But I am at home by myself eating cereal, a fact I am trying to conceal from anyone who might be looking at my screen in Toronto. Putting my spoon down and clapping would make me feel ridiculous.
And then something happens. Audience members with questions about the robot hand rise from their seats and line up behind the microphone that's been set up in the middle of the room. I use the right arrow key on my computer keyboard to rotate myself, so that I can see them. As I do, it dawns on me that, regardless of where I am physically, the truth is I'm "here" enough that, if I wanted to, I could wheel over, get in line, and ask a question. Merely imagining myself doing this makes me suddenly feel very present indeed. I feel a new resolve: After the next presentation, I will clap, and I'll keep my hands in the frame so that the person delivering the talk can see me doing it.
On impulse, I spin myself around to get a look at everybody else in the room, which also shows them my face. As I rotate past, one guy takes out a camera and snaps a photo of me. It won't be the last time that happens today.
***
The basic notion of virtual presence (your body is here, but you're also magically moving around there) is a deep-seated desire of humankind, not unlike invisibility, immortality, and flying. But the set of technologies needed to make it practical has come together only recently. The person credited with kick-starting the modern wave of telepresence research is Massachusetts Institute of Technology professor Marvin Minsky, a pioneer in artificial intelligence. "We could have a remote controlled economy by the twenty first century if we start planning right now," he wrote more than 30 years ago.
Now, fast data connections have made videoconference applications like FaceTime standard on smartphones, and teleconferencing someone into an office meeting via Skype or Google is no big deal. The model serving as the People's Bot, a $2,499 device called Double, is a novel combination of iPad and self-balancing remote-controlled scooter. Engineers around the world are working on more advanced versions, including one that allows people with disabilities to control telepresence robots with their minds.
As telepresence technology has improved, people have come up with a number of ways it can fill vital needs. There are housebound children around the country who attend school remotely by using robot decoys. Many hospitals in California have telepresence devices set up that allow specialist doctors to see distant patients who would otherwise never have access to them. A urologist at Boston Children's Hospital has been sending telepresence robots home with patients who have gone through surgery and would otherwise need to be brought to the hospital several times a week for checkups. Researchers at UMass Lowell have been working on a way for nursing-home residents to be telepresent with their far-flung families; the Department of Defense hopes telepresent surgeons can improve battlefield trauma care.
But there's another, conceptually distinct way telepresence might change things: by helping people leap societal barriers rather than physical ones. This notion was illustrated most dramatically at the TED conference in Vancouver this March, when NSA leaker Edward Snowden was interviewed on stage using a telepresence robot called the Beam (created in 2012 by a Palo Alto company that's selling them for $16,000) and then spent some time mingling with the crowd, even as his body was stuck in Moscow, where he has been granted temporary asylum by the Russian government. TED attendees thronged the robot to take "Snowden selfies," even though the star "self" was someone who wasn't there, and who couldn't be there, in fact, because he would likely have been arrested if he'd tried to fly to Canada.
The team of researchers who developed the People's Bot, associates of MIT's Center for Civic Media and the Media Lab, see telepresence as a tool to help all kinds of people facing constraints on where they can go. Snowden's constraint is a legal one, of course. But a simple bot could also leap security walls, or financial ones: A democracy activist could attend a human-rights summit while trapped in Syria; a family in Indiana could take their artist daughter to the Louvre even if they can't afford the airfare. Just hypothetically, it could also help a reporter in New York who wants to attend a conference in Toronto on extremely short notice, on a day when he already has plans with his mom.
As we prepare for a near future in which robotic telepresence is a routine part of life, in which meetings might be attended by some mix of real co-workers and bots, and in which you'll go to a party and see some old college buddies beaming in from the opposite side of the country, researchers are trying to figure out how being "present" in this strange new way should feel, how it should work, and how much it could ultimately change human interaction.
True, I could have just asked someone attending the conference to FaceTime me on their iPhone and carry me around. But the unequivocal lameness of that scenario underscores the potential value of telepresence bots. There are certain situations where it really matters to be in the room, controlling where you go and what you're looking at, and, perhaps more strangely, taking up physical space. The people you cross paths with encounter a version of you and are forced to grapple with your presence.
At a tech conference, of course, the entire point is for people to mingle and network and eat together and bump into each other unexpectedly. "Presence" is exactly what we're buying when we pay for the plane ticket, the hotel room, and the sometimes high price of entry. So what difference does it make when you're sort of present and sort of not? This is the question I'm trying to answer in Toronto. The answers surprise me in a whole bunch of ways that I could not have predicted.
***
As I wait for the second talk to start, the person sitting next to me in the audience begins to cough and sneeze riotously. I am suddenly, irrationally, seized with the desire to be sure he's covering his mouth. I even find myself a bit frustrated that I can't subtly look over and check without theatrically rotating the screen that my camera is mounted on. When I start paying attention again, I realize that the person who has taken over the podium, a researcher from the University of Wisconsin-Madison named Irene Rae, whom I'm supposed to meet a little later on the conference floor, is giving a talk about telepresence bots exactly like mine. Hearing her describe the experiments she conducted on them, I feel self-conscious.
After the talks conclude, I glide slowly out into the hallway, and am carried by my companion from MIT to a different floor. (Escalators, it turns out, are tough for robots.) Once I am back on my own two wheels, I notice crowds of people all around me, standing and chatting and figuring out where they have to be in order to make their next session. As I navigate myself, using the arrow keys on my computer, I worry about hitting someone as I move among them. I notice with some satisfaction that lots of people are looking at me. Some are even waving.
When I hit a clearing, a friendly young woman comes up to me, introduces herself as Leila, and asks where I am. I am very briefly confused by the question: We're in Toronto, of course! But when I catch her drift and admit I am actually in New York, she doesn't seem to hear me. Before long, it becomes clear that the volume on the People's Bot just doesn't go loud enough to carry my voice in this noisy hallway. To hear what I'm saying, Leila has to put her face right up against mine. This seems to work, and after a bit of basic back and forth, I ask her what it feels like to be talking to me. "Do I seem like a human or a robot to you?" Leila thinks this over, and after a moment, says something thrilling: "It's like a hybrid of both. Like a cyborg!"
Electrified by this assessment, I spend the next 15 minutes roaming around trying to meet people, and crowing "HELLO!" to anyone who makes eye contact with me, like an over-excited parrot. All the attention makes me want to perform a little, but my technical capabilities limit the tricks I can do to raising and lowering the pole that connects my head to my motor, and rotating in circles. Before I can get too hammy with any one conference-goer, however, I find that my impossibly quiet voice (for which I try to compensate by leaning in toward the microphone on my laptop, only to realize this means leaving the camera's field of vision) leads most of my potential new friends to lose patience and move on to other things. When someone says to me, "I'm really sorry, but I can't hear you!" I know they're not really rejecting me, but it hurts my feelings a little bit anyway.
Some frustration sets in, and perhaps as a result, I get distracted and bump rather violently into a large man. "Excuse me," I shout to him, with my hands around my mouth, as he gets out of my way with a bewildered look on his face. When I try to follow him so I can ask what it was like getting bumped into by a cyborg, I find I am a lot slower than he is, and I lose him in the crowd.
I have better luck with a group of three young people, who are willing to go out of their way to hear what I'm saying. After they take some selfies with me, I ask them, in a voice so loud that I wonder if my neighbors will think I have lost my mind, if they could imagine letting someone like me hang out with them. "Sure!" says a cheerful boy named Javier. "Maybe we could get some drinks or some coffee." He is holding a coffee cup as he says this, so I pick up the one on my desk and show it to him. Everyone laughs.
Soon it is time for my meeting with Irene Rae, the researcher from the Wisconsin Human-Computer Interaction Lab, and her adviser, Bilge Mutlu. When Rae tracks me down, she says cautiously, leaning into the frame of my camera, "I think I'm supposed to be meeting you?" It feels like we're two strangers who have agreed to meet for lunch but have neglected to describe what we look like.
When we find a quiet place to talk, Rae explains that robotic telepresence research is still in its early stages: at this point, experts still don't know exactly what is needed to make people feel physically present in a place where they are not, or how best to help them interact with people who are. Mutlu, who has joined us, notes that this is not merely a question of technology, but of social norms as well. According to one study, people who are telepresent feel "violated" when people who are present-present move them around without their permission, or put their feet up on them as if they were furniture. Then there's the question of how close people should get when they're interacting with someone who is telepresent. "Right now," Mutlu admits, "I'm getting very close to you, in order to hear you, and it feels a little uncomfortable for me."
***
Over the course of the two hours or so that I spend as the People's Bot, I feel genuinely transported, and I'll come to remember it much more as time spent at a conference than as time spent sitting in front of my computer. I ride around and wonder who I'm going to talk to next. I spontaneously ask questions of experts I would otherwise have to arrange formal interviews with over e-mail. In some meaningful way, I have broken through the barrier separating me from this distant group of people. Of course, I cover a lot less ground than I normally would, because my motor just doesn't move me as fast as my legs do, and I am somewhat dependent on a handler (J. Nathan Matias, who is cocaptain of the People's Bot project with his MIT colleague Chelsea Barabas) to navigate and avoid getting lost, stolen, or knocked over. I also feel a bit clumsy and impaired in various ways, almost like I am drunk. (Perhaps if I were invited to the bar with the grad students afterwards, I would fit in a little better.)
As my time with the People's Bot winds down, Matias asks me if I would mind going over and showing myself off for some folks from Microsoft, who had helped him with a Wi-Fi problem the other day. Valentina Grigoreanu, a researcher at Microsoft, seems very impressed with me. "It's just amazing that you're so far away and you can attend this conference," she says. But then she betrays that she's probably just being polite; it turns out she comes into contact with beings like me on a not infrequent basis. "We have a few workers at Microsoft who are exactly like this all the time. They've got their own office and they go to meetings and they can move around like you do," she says. This makes me feel like I am not alone in the universe. But it also kind of blows my mind that I'm talking to someone for whom this is just no big deal.
Grigoreanu then invites me to dance with her. I agree before I know exactly what this will entail. A moment later I find myself standing in front of an Xbox system set to a game called "Just Dance," which uses Kinect technology to give players the ability to manipulate virtual versions of themselves by standing and moving around in front of a TV screen. The idea is to dance to the music, try to stay on rhythm, and do as many of the right moves as possible. When asked what song I'd like to dance to, I request "I Will Survive."
Soon the beat kicks in and I start to do my thing, such as it is. Side to side, up and down, round and round: I don't have a lot of options. As the chorus hits, I start to experiment, jerking around as quickly as possible by hitting the left and right arrow keys on my computer in rapid succession. When I turn myself around to look at Grigoreanu and Matias, who are dancing behind me, I see that they are able to do much more complex things with their human bodies. I feel a bit jealous of their versatility; I'm so focused on finding cool-looking ways to move that I can't even tell what the on-screen me is doing, or if the Xbox is registering my strange body at all. Nevertheless, I feel like we're all three dancing together. In the end, Matias and Grigoreanu both get more than 6,000 points, and I get 962, which is not bad, I think, considering I don't have arms or legs.
Not long after this exhilarating exercise, Matias informs me that he has to go, which means it's time for us to say goodbye. As I thank him for his help, I am possessed of a desire to shake his hand, and must settle for an exaggerated, friendly head-nod instead. Nakanishi's remote hand-shaking system suddenly starts to make more sense.
As Matias leaves my field of vision, I find myself staring at a wall, listening to a conversation I can hear taking place behind me and wondering, with some nervousness, whether I'm about to be unplugged. After a few seconds Matias pokes his head in front of the iPad and says, with a friendly chirp, "Feel free to log out when you're ready!" And just like that, I turn myself off.