Listen to this phone call.
And now listen to this phone call.
Just somebody calling to make reservations and appointments. Nothing odd about that, right? Well, nothing aside from the fact that neither of the callers is human.
Today we announce Google Duplex, a new technology for conducting natural conversations to carry out “real world” tasks over the phone. The technology is directed towards completing specific tasks, such as scheduling certain types of appointments. For such tasks, the system makes the conversational experience as natural as possible, allowing people to speak normally, like they would to another person, without having to adapt to a machine.
One of the key research insights was to constrain Duplex to closed domains, which are narrow enough to explore extensively. Duplex can only carry out natural conversations after being deeply trained in such domains. It cannot carry out general conversations.
While sounding natural, these and other examples are conversations between a fully automatic computer system and real businesses.
The Google Duplex technology is built to sound natural, to make the conversation experience comfortable. It’s important to us that users and businesses have a good experience with this service, and transparency is a key part of that. We want to be clear about the intent of the call so businesses understand the context. We’ll be experimenting with the right approach over the coming months.
That is both amazing and extraordinarily creepy and unnerving. It’s yet another case of “it’s cool that you can do that, but should you really be doing it?” As someone who often gets headaches because of the telephone, I like the idea of somebody else doing all my scheduling and such, but having a computer dressed up as a person do it feels dishonest and wrong, like I’m tricking people.
I want my machines to sound like machines. Clear-voiced machines, sure, but still obviously machines. It’s a trust thing, like I said. Right now it’s just scheduling appointments, but eventually it’s going to get more sophisticated. And the better it gets, the more its use will spread. The more it spreads, the more likely it becomes that any of us could find ourselves interacting with it. And the more we interact with it, the less we understand what’s real and what isn’t. Google might want to be transparent somehow, but what about the next company? There are already enough things in this life that can’t be trusted (audio clips, video clips, photography, corporations, the government); the last thing any of us needs is for our basic one-on-one interactions to join that list.