Could Jibo Developer Cynthia Breazeal Be The Steve Wozniak Of Robots?


Bill Gates once predicted we would have a robot in every home to go with our personal computer. James Temple of Re/code is calling Jibo—a new personal robot to be sold commercially in 2015—“one of the most ambitious and affordable robots for the home that [he has] seen.” I agree.

Developed by social robotics pioneer and MIT professor Cynthia Breazeal, Jibo represents a sleek, responsive, versatile platform that can help individuals and families negotiate daily life. Here is the video Breazeal’s startup put together:

There are at least three things that make Jibo really exciting. The first is the price point, as little as $500 if you preorder. (I did.) This price point reflects just how much the costs of sensors and other components of robots have come down in recent years. A second exciting aspect of Jibo is that developers will be able to write apps for it. It appears as though Breazeal’s team expects a marketplace for Jibo software. A third is that the team obviously put an immense amount of thought into how Jibo will interact with its users—if the video is any indication, we should expect less Siri and more Her.

These are the ingredients of success. Jibo could well be the Apple II of robots—the first popular entrant that opens the door to personal robotics as a true household staple. But it will face hurdles along the way, including from law and policy. I’m going to explore a few here, with some unsolicited advice to Jibo and lawmakers thrown in along the way.

Security

A few years ago, Tadayoshi Kohno, Tamara Denning, and several other colleagues of mine at the University of Washington bought some Internet-enabled home robots in order to assess how hackable they were. It turns out they were plenty hackable. According to their research, not only could hackers see and hear what the robot did, but they could move it around. Thus, a compromised robot could in theory grab your spare key and drop it out the pet door. I was particularly struck by their finding that robot Internet traffic looks different enough that someone sniffing around a network could isolate robots from other devices and target them specifically.

Hopefully Breazeal and her team will take security seriously right from the outset. Many startups do not. Everything can seem fine until a high-profile hack or research like Kohno’s shows how vulnerable the product makes the consumer. Even in the absence of a hack, the Federal Trade Commission can and will use its authority to police against unfair and deceptive practices to ensure that a device—including a robot—has adequate security. It is, if anything, even more important that a social robot like Jibo be hardened against external interference.

Privacy

I had occasion to visit the Microsoft Home of the Future a few years ago, a mock house meant to showcase forthcoming technology. It was really neat. Part of the demo involved placing items on the kitchen counter until, suddenly, the light softened and a voice came out of nowhere. “Hello, I’m Grace. It looks like you’re baking a cake. Do you need a recipe?” It is all well and good for Grace to watch and respond if you are baking. But what if you are using the kitchen counter for, um, a different activity?

Jibo has sensors and will live in your home, and so it raises all the usual privacy concerns that attend devices that gather, process, and store information. But Jibo is also a social platform. There is a long literature, one Breazeal not only knows well but helped pioneer, suggesting that humans are hardwired to react to anthropomorphic machines as though a person were present. This includes the feeling of being observed. What this means is that you will never feel alone with Jibo. Your few remaining opportunities for solitude could disappear. And activities that feel very transactional today—typing a phrase into a search box—will become conversations. You won’t search for “symptoms X, Y, or Z” but ask Jibo, “Can you help me find out more about hemorrhoids?”

On the other hand, one advantage of Jibo is that, for once, a device that is recording information about you will actually feel like it is! Jibo is a far cry from a Nest thermostat on your wall, silently collecting your energy habits for processing at Google. Jibo will provide what I call “visceral notice” of the collection of information—more powerful, certainly, than any privacy policy.

It will take very clever design, at any rate, and likely much iteration, to strike the right balance between a welcome social presence and a constant source of judgment. Moreover, not only will Breazeal’s very expert team face this issue, but so, potentially, will every app developer whose software can influence the interface.

Manipulation

In his book Robot Futures, roboticist Illah Reza Nourbakhsh explores the prospect that social robots will be used to manipulate consumers and citizens. BJ Fogg, Ian Kerr, and I have all made similar arguments in the past. The danger is, of course, highly speculative. But the scenario runs something like this: Imagine a virtual pet that requires digital food pellets to stay alive. Say the pellets are free at first but then come at an escalating cost. If you don’t pay, unfortunately your increasingly beloved pet “dies.” Another scenario has the lonely user buying gifts for a virtual girlfriend.

Let me be clear: never in a million years do I think Breazeal would permit her creations to be used in this way. She has, as far as I can tell, devoted her life to putting humanity into human-machine interaction. Yet, is this true of everyone? If not, the issue is one bad actor away from a consumer protection problem. Moreover, imagine if Breazeal had no choice in the matter. Imagine a suspected criminal owned a Jibo. Law enforcement could, without question, obtain Jibo’s sensory data with sufficient process. But is this all? Could law enforcement, for instance, obtain a warrant to plant a question in order to check out an alibi? “I’m trying to update your calendar. Did you end up going to your mother’s Friday morning?” Could a hacker prompt Jibo to ask for password information not already in its files? These things are not necessarily in the developer’s control.

Liability

I said above that part of what makes Jibo so exciting is the prospect that anyone with the training can write software for it. Breazeal refers to Jibo as a “platform.” Brian David Johnson goes further in describing Intel's Jimmy, encouraging consumers to think of the robot as a “smart phone with legs.” By design, Jimmy’s hardware and software are meant to be customizable by the end user.

This is great news. It means that we do not have to wait on Jibo or Intel to come up with every useful application for the robot. Indeed, as with personal computers, the true “killer app” (not literally, one hopes) could come from anyone or anywhere. Jonathan Zittrain’s book The Future of the Internet (And How To Stop It) describes this feature of open platforms particularly well. But, as I explain at length in an article (and much shorter Mashable op-ed), the prospect of open robotic platforms also raises thorny legal questions it may take courts, states, or even Congress to address. Specifically, we do not know whom to hold responsible when an open robot running third-party software hurts the consumer or a friend.

Jibo does not directly confront the issue of physical harm because, perhaps wisely, Breazeal’s team has equipped the first generation with neither a gripper nor the means to move about the room. The robot can swivel and rotate but remains fixed in place. Will this always be the case? And, if the platform is open, will someone provide Jibo with wheels—just as Romo gave wheels and a face to the iPhone? Time will tell.

This is hardly an exhaustive list of potential hurdles. Were Jibo to be used for therapeutic purposes, for instance, then the startup might have to contend with the Food and Drug Administration. But any mass market home robot will have to contend with at least these issues. It is a very exciting time to study robotics, in part because of the genius and perspiration of world-class roboticists like Cynthia Breazeal who have been clearing one hurdle after another for decades. It is our responsibility as a legal and policy community to help Breazeal and company clear the remaining ones while preserving incentives for user safety and respect.