How much should we hand over to robots?
Tuesday, April 21, 2015 - 20:30

Stephen Hawking recently warned that the development of full artificial intelligence (AI) systems could spell the end of the human race.

From agriculture and manufacturing to education and medicine, some experts are predicting a future where blue- and white-collar workers alike will soon be replaced.

Technological progress has seen robots become more sophisticated, so what will be the economic impact of AI?

Could we actually be entering an age of abundance for humans catered to by robot slaves?

Insight asks: Is humanity really being threatened by thinking robots, and where will robotic advances take us in the future?

What are the ethical and moral questions to consider?

Presenter: Jenny Brockie 

Co-Producers: Paige MacKenzie   Stefanie Collett 

Associate Producer: Amanda Xiberras 

Join the discussion by using the #insightsbs hashtag on Twitter, or posting on our Facebook page.


JENNY BROCKIE: Welcome, everyone. A few unusual guests with me here tonight. Good to have you all here as well. I'm going to start here with Sam. Stand up. Stand up!  

ROBOT SAM:  Hey, Jenny - is it just me, or do some days you feel like someone is controlling your every move? 

JENNY BROCKIE: Actually, yes, I do! Stand up! Stand up!  

ROBOT SAM: Do you want me to stand up? 


ROBOT SAM: Alright. Here I come. Good job helping me stand up, Ms Brockie. 

JENNY BROCKIE: Thanks, Sam. OK, let's try...  

ROBOT SAM: Can I tell you something charming? 

JENNY BROCKIE: You can tell me anything, Sam. Yes!  

ROBOT SAM: You are much taller in person. 

JENNY BROCKIE: What can... I've got to wait till he recognises me again. Hey! Sam! Back. Come back.  

ROBOT SAM: Don't you click your fingers at me. 

JENNY BROCKIE: I'm sorry, Sam. Out of line. I'm sorry. OK. I'm going to talk to Sam's buddy.  

ROBOT NOW: Hey, there. 

JENNY BROCKIE: Hey, there. What's your name?  

ROBOT NOW: You can call me Now. 

JENNY BROCKIE: Now? And Now, you can do something a bit special? I'm going to get you to do it right now. Thriller dance.  

ROBOT NOW: My most famous dance. I found it... 

JENNY BROCKIE: I think he wants you to clap! 

THRILLER SONG:  # It's close to midnight # Something evil's lurking in the dark # Under the moonlight # I see you # You try to scream... 

JENNY BROCKIE: Pretty good! Now, a few false starts there, but Jonathan and Richard, you brought both of these along today - I mean, everything is programmed into them, basically, isn't it - for them to respond to me? How much do they recognise me?  

JONATHAN KINGSLEY: That's a good question. Um, essentially the robot, when it spends more time with you, will actually get to recognise you better. Because you've only spent limited time with it this afternoon, it's having trouble recognising you. But it sort of learns to interact with the person that it's with better and better the more time you spend with it. But you can program in whatever you like. 

JENNY BROCKIE: OK. But can it respond spontaneously to me? No.  

JONATHAN KINGSLEY: It does have to be programmed in. 

JENNY BROCKIE: OK. I'm interested in how they're used, and Christine, I know that, um, you're an assistant principal and that you use Sam here in class, don't you? 



CHRISTINE ROBERTS-YATES: Well, I've got a disability unit - I'm the assistant principal of the Murray Bridge High School. There are 30 students there, all with varying disabilities. About 12 students have autism. So we use Sam - for example, he can dance, and for those students who don't want to participate in physical education, they will dance with him - they respond to him beautifully. He can do spelling and maths activities. For non-verbal students, he will use picture cards and they will respond in that way. Yeah. 

JENNY BROCKIE: Why not use a person for that, though? Why use a robot?  

CHRISTINE ROBERTS-YATES: I think the students' response to him is terribly engaging. He's a little bit of a novelty. But he is reinforcing what is happening in the classroom. So he's an additional aid and he's adding value to what's already there. 

JENNY BROCKIE: Do you think him - look, I just - sorry, before I go on, I just have to mention the two seals that are over here, in case anyone's wondering what the noise is. They're doing the rounds of the room. We'll explain what they're for a little later on. OK. So Christine - is not being human an asset for you in using this robot?  

CHRISTINE ROBERTS-YATES: I think for some of the students, yes. 


CHRISTINE ROBERTS-YATES:  Because he's got a set expression on his face, so there's less overstimulation for the students than from all the facial expressions that one uses in interactions throughout the day. 

JENNY BROCKIE: April and Thomas, you play a game, "How Do I Feel?", with Sam, don't you?  



JENNY BROCKIE: Do you want to show us how that works, do you want to come up and show us? OK, so do you want to show us how this game works? How do I feel?  


ROBOT SAM: How do I feel? Would you like to hear Sally's birthday or Peter's birthday? 

APRIL BIRRELL:  Peter's birthday! 

ROBOT SAM: Let's hear about Peter's birthday. Peter woke up early on his birthday. He lay on his bed and thought about the wonderful day he would have. Is Peter feeling happy, sad or disappointed? 


ROBOT SAM: Happy? Well done - that was exactly it. OK. Here comes the next part of our story... At 8:00, he jumped out of bed and pulled on his dressing gown, all the time thinking about the presents waiting for him when he got downstairs. Is Peter feeling sad, disappointed or excited? 


ROBOT SAM: Excited. Well done - that was exactly it. You did great. Hope you had fun. 

JENNY BROCKIE: You did great! Thank you. That was terrific. Um, tell me what it's like, having lessons with him. Do you like it? 

APRIL BIRRELL:  Yes, it's fun. 

THOMAS HOPPER: Yes, it is. 

JENNY BROCKIE: Yeah? You can go back and sit down now. Thanks a lot. Do you like him? Do you think of him like a person?  



APRIL BIRRELL:  No, like a robot. 

JENNY BROCKIE: He's a robot. Yeah. But do you like him?  


JENNY BROCKIE: Do you enjoy being with him?  


JENNY BROCKIE: Mm. OK. Anne, Thomas is your son. Has Sam had an effect on him?  

ANNE HOPPER:  Well, I think he has. He's got a lot more confident in how he's progressing at school and around the house. With him just being a normal teenager, having that little bit of extra help is great. 

JENNY BROCKIE: Yvonne, you use a robot at home too, with your two adult children. Let's have a look.  




ROBOT:  I see a blue horse looking at me. (NEIGHING) Blue horse. 

ROBOT: You have received a voice message. This is the message... 

YVONNE CARTWRIGHT:  It's engaging and it's cute. And when you have a child's attention, you can teach them something. They have different autistic traits. Christopher really enjoys doing quizzes. 

ROBOT: Would you like to play a quiz? Please select a quiz. 

YVONNE CARTWRIGHT:  We're actually doing a toileting program with Christopher. Use some soap, mate. 

ROBOT: When I have been to the toilet and I have pushed the button and washed my hands, I need to - one, leave the bathroom, two - dry my hands, three - turn off the light? 

YVONNE CARTWRIGHT:  The robot just sits there quite patiently until he gets the right response. 

ROBOT:  Nearly there. Try again. You have 9 right answers out of 10 questions. Well done. To get a number, say "yes" when I say "next". 

YVONNE CARTWRIGHT:  We've tried to increase Melissa's level of concentration. 

ROBOT:  Number 29 - 2-99. 

YVONNE CARTWRIGHT:  Melissa really enjoys music. That’s a good one, I like that one. (DANCING QUEEN PLAYS) We don't celebrate them finishing high school or going to uni, but we celebrate the light bulb moments, those moments in time when you see sheer joy on your children's faces at something they've done, and they know they've achieved something. 


JENNY BROCKIE: That does look like sheer joy, too!  


JENNY BROCKIE: How long have you had the robot?  

YVONNE CARTWRIGHT:  We've had the robot in our house for probably 14 months. And over that time, it's changed the way it does things. It was - it had a lot of programming in it for working with people who have dementia. We have, over time, changed that to individualise the robot to work with each one of our children. 

JENNY BROCKIE: When you say "we" have, who is "we"?  

YVONNE CARTWRIGHT:  I work with the wonderful team at La Trobe University and the professor who's sitting behind me. We've actually always been very interested, as a family, in technology because it works really well with children with an autism diagnosis. 


YVONNE CARTWRIGHT:  Um, it engages them - they have - a lot of children who are autistic have trouble focusing on people and having facial contact with people. They feel very threatened by that. So you have to find other ways to engage them and to teach them.

JENNY BROCKIE: Is that something about not being judged?  

YVONNE CARTWRIGHT:  It's all of those things - it's about not being judged, it's about the robot, or the computer, or the iPad, repeating things a hundred times over, and it really doesn't matter, you know? That's fine. 

JENNY BROCKIE: But it's a bit tiresome for you?  

YVONNE CARTWRIGHT:  It's a bit tiresome for us. And the other thing is that, um, because they lack concentration and they lack eye contact, you have to try and find other ways to get the - to grab their attention. And sometimes that's for 30 seconds - and that's OK. So it's what you can work on in that 30 seconds and if that is a computer or a robot or whatever that is, it really doesn't matter. 

JENNY BROCKIE:  So what difference has it made to your kids, do you think, in that time? It's a relatively short time. Have you noticed a difference?  

YVONNE CARTWRIGHT:  Melissa, for instance, um, is not a particularly confident young lady. Her verbal skills were very limited about two years ago, I'd have to say. And look, I can't say whether it's the robot that's responsible - but it's certainly contributed. She's now talking more than she ever has. She is much more social and she's now producing short sentences. Just this year, you can almost have some sort of conversation with Melissa where you're asking her something and she's responding appropriately. 

JENNY BROCKIE: OK. Daniel, you're a urologist. When you operate, you use five arms. How?  

DR DANIEL MOON, EPWORTH HOSPITAL:  We have been involved in the implementation and development of a robotic surgical program which started in Australia - in Melbourne - just over 10 years ago. Essentially this is a platform that expands the ability to perform keyhole or laparoscopic surgery. The robotic system puts a telescope down that feeds separate images to each eye, so we get binocular, three-dimensional vision at the console in the corner of the room. Rather than standing at the bedside, you sit with your head in the console, and suddenly it's like having your head immersed in an abdomen. You have high-definition digital vision, and the instruments are 8mm wristed instruments, so you have full dexterity. 

JENNY BROCKIE: What can the robots do now that a surgeon can't do, for example?  

DR DANIEL MOON:  Inside our console, we can beam in real-time imaging - so I can put an ultrasound in the abdomen, control that with the robotic arm, and suddenly split the screen in two and map out a tumour or kidney in real time, so that you get a very precise dissection. You can even put in fluorescence agents to map blood flow through segmental arteries, because you have to be able to control the bleeding before you go cutting into a kidney that can bleed ferociously. And that's only urology. 

JENNY BROCKIE: Okay, Daniel Milano in Melbourne, you had a kidney cancer removed last year by Daniel's robot. How did the operation go?  

DANIEL MILANO: It went quite smoothly, actually. I was quite impressed. It took about three days to get home from hospital, as opposed to three weeks if I'd had it the traditional way, which would have involved a 10-inch incision. At the moment - well, I've only got six little scars which you can barely see. 

JENNY BROCKIE: What was it like for you, as an oncology nurse, being operated on by a robot?  

DANIEL MILANO: Um, the machine is not working on its own - it's as good as the surgeon that's sitting behind it, who is always in control so I actually thought it was kinda cool! 

JENNY BROCKIE: Peter, you're hoping to make money out of this robot, whose name is Baxter. What can he do?  

PETER BIRNBAUM, PULLMAN LEARNING GROUP:  Um, it comes in two versions. One version is a research version. This version here that we've brought into the studio is the manufacturing version. 

JENNY BROCKIE: Yeah. Can you show us how he works, what he's going to do here?  

PETER BIRNBAUM:  Yes, I can show you how easy it is to try Baxter. 

JENNY BROCKIE: Yep. Go ahead. He has a very quizzical face.  

PETER BIRNBAUM: So, Baxter will respond and will look at the person who's working with Baxter. If you were to move that arm, then this face would move to you. 


PETER BIRNBAUM:  It's really a means of communication. It's not designed to be humanoid in appearance, but on a factory floor it's designed to be able to communicate - in the same way as this glows green now, if there's a problem it'll turn orange. So Baxter is really simply designed to - if I... ..create a new program... ..take Baxter over here... 

JENNY BROCKIE: OK, so you're going to make him stack the dishes?  

PETER BIRNBAUM:  Stack the dishes... 

JENNY BROCKIE: So you're making that program now? 



PETER BIRNBAUM:  In the same way that someone in a manufacturing plant would, um, train Baxter to do a task... 

JENNY BROCKIE: So the programming is that quick, huh? 


JENNY BROCKIE: And everything, obviously, has to be in precisely the same spot?  

PETER BIRNBAUM:  More or less. Baxter has cameras at the end of each arm, and also up here. So it's now able to feel its way and to, um... 

JENNY BROCKIE: Whoa! We could have a plate-smashing party at this point... That's pretty impressive!  

PETER BIRNBAUM:  So it could work on a CNC machine, or on a machine, and just stand there and work all day doing a job that's required of him. 

JENNY BROCKIE: Ah. OK. And anyone can do that kind of programming? Anyone can do that. Great. Alright. Thank you for that. Um, Peter Corke, you're a roboticist. How are you using Baxter?  

PROFESSOR PETER CORKE, QUEENSLAND UNIVERSITY OF TECHNOLOGY:  Currently, we've got our Baxter looking at the problem of picking capsicums. They're not easy things to pick. Baxter may not be the right robot to do it, but it's pretty convenient to actually just prototype some strategies. 

JENNY BROCKIE: Is that because it's a job you can't get people to do?  

PROFESSOR PETER CORKE: There are a lot of jobs in agriculture and horticulture where growers have trouble finding enough people to do the work at the time that it needs to be done. In horticulture, time is critical; in manufacturing, not so much. So yeah, there are a lot of jobs where you're out in the sun, you're down on your knees picking these wretched capsicums and twisting them off the bush. Yeah, they struggle to find people to do that. 

JENNY BROCKIE: What else are robots doing ....that people might be surprised by?  

PROFESSOR PETER CORKE:  In our society, we have a lot of assets - it might be the outside of a building or the tops of power poles. How many power poles do we have? It's a lot! If you want to inspect the insulators or cross-arms on top of those poles, what do you do? You go out with a cherry picker and take someone up - that's really slow and expensive. Maybe you've got a flying robot - just let it go, it flies up, takes some pictures, and comes down again, or hops from pole to pole. Maybe robots go there instead as our eyes, bringing the images back for humans to look at. 

JENNY BROCKIE: So, will they start taking masses of jobs? 

PROFESSOR PETER CORKE: They will start to take some jobs, yes. 

JENNY BROCKIE: How many jobs are future-proofed from robots, do you think?  

PROFESSOR PETER CORKE: Roboticists. Look, I think in this country we have a productivity issue and if we're not productive, then, as a country, we won't be able to compete. So I think people plus machines is going to improve our global productivity and competitiveness. 

JENNY BROCKIE: Toby, what do you think? How many jobs do you think are safe from robots and artificial intelligence?  

PROFESSOR TOBY WALSH, NICTA:  It's not just roboticists - I think almost any job is at risk. Just to make a little joke here, it used to be that we were going to be the panellists up on the show. 

JENNY BROCKIE: Yes, I know. 

PROFESSOR TOBY WALSH:  ..and we're sitting in the audience now - the robots have taken over this show already! 

JENNY BROCKIE: How successfully, I'm not quite sure. 

PROFESSOR TOBY WALSH:  It's early days. 

JENNY BROCKIE: It's early days.  

PROFESSOR TOBY WALSH:  It's early days. But they certainly have. 

JENNY BROCKIE: But seriously, how many jobs do you think are actually future-proofed from robotics and artificial intelligence?  

PROFESSOR TOBY WALSH:  I seriously worry that there are really very few jobs left that are safe. Once the robots and artificial intelligence and technology get better and better, it's hard to think what I can recommend to my daughter to study at university that will future-proof her. 



JENNY BROCKIE: Policemen, firemen?  

PROFESSOR TOBY WALSH: Policemen! Unfortunately Hollywood's already given you a very bad picture of what robotic policemen could do. 

MAN:  RoboCop! 

JENNY BROCKIE: Who wants to throw Toby a job that they think might be safe from robots and artificial intelligence? Yeah? 

MAN:  Music producer. 

PROFESSOR TOBY WALSH: Well, there's a program called Aaron that has exhibited at the Tate Gallery, and you have to spend thousands of dollars to buy his pictures. So the creative industries possibly are not safe either. Can I have another job? 

JENNY BROCKIE:  Yes Daniel?  

DR DANIEL MOON: Prime Minister? 

PROFESSOR TOBY WALSH: Prime Minister, well sometimes the politicians do seem to behave rather robotically. Um, yeah, I think politicians - that's about the one winning answer we have for this game, which is politicians... 

JENNY BROCKIE: They'll be the survivors, the politicians? 

PROFESSOR TOBY WALSH:  Well, they'll legislate that they will be the survivors. But there's an argument that maybe computers will be, you know - more fair, less arbitrary. 

JENNY BROCKIE: So assuming all those jobs change, how does that then impact on the way that we run the whole society, the economy?  

PROFESSOR TOBY WALSH:  I think that's a really interesting conversation that we, as a society, need to start having. This is the next revolution, the information revolution, that will change the nature of work completely. This is going to, you know, centralise wealth in the wealthy. It's going to squeeze the middle class - the professions are also going to be automated. Sadly, it's not just going to be the people in the factories - it's also going to be many of the things that we thought were safe: the lawyers, the accountants, and maybe eventually the HR consultants. 

JENNY BROCKIE: Tim Dean, you're a philosopher and a science editor. Um, do you think we'll be looking at a whole different model of work and income and all of those things?  

TIM DEAN, THE CONVERSATION:  Absolutely, I think within the next decade or two. The thing that's different about this - different from anything that's come before - is that a tractor could replace a number of workers doing physical labour, a factory could amplify the productivity of a number of labourers, and then people could move into offices and use technology to become more productive still. But these machines can do not just the physical things, but the cognitive things as well. 

So if there are no jobs that are safe, what we see is a really different pattern from anything we've seen before. It used to be that technology would produce more jobs than it destroyed. So people would move from the fields to the factories and the factories to the offices. Now, when the robots move into the fields, the factories and the offices, then we have this weird thing that - I call it like the Prosperity Paradox. We've got these incredibly productive workers and they are working 24 hours a day, they're more capable in many ways than people are, so they're generating a lot of wealth. They're spurring the economy on and yet there's mass unemployment. 

So now that wealth that is generated is flowing to the people who own the companies or who own the robots or who program the software or who own the patents, and they're very, very wealthy, whereas the people without jobs are living in a society with immense amounts of wealth, and they're not partaking in it. That is a really fundamental transition. 

JENNY BROCKIE: So there's a gulf between income and wealth.  

TIM DEAN:  One of the potential solutions is to detach work from income - the idea of, you know, earning a living, working to earn a living - could change. At the moment, that seems a bit strange, because we kind of work as hard as we can to earn as much as we can to get ahead and compete with everyone else. Yet when we start seeing unemployment getting up to 7%, 10%, 15%, 40%, it might not be an option - it might be a necessity that we work less and, in doing so, spread the jobs that remain more evenly. Some people talk about a universal basic income, basically just giving everybody a minimum wage. 

JENNY BROCKIE: Like communism?  

TIM DEAN:  Well, it does look like that, which is interesting - because communism was an inferior solution to the problem of solving scarcity - how do we get enough stuff to live, to eat, to clothe ourselves? It wasn't as good a solution as capitalism. But that's predicated on the problem of scarcity. So if there's not a scarcity of goods anymore, it doesn't matter as much how we run the economy. But it requires a lot of change to take place before that can happen. It's not like we can start doing that now. It's not like we can start handing out minimum wage to everybody, no questions asked, now.

JENNY BROCKIE: Jong Lee in Hong Kong, who is that sitting next to you? 

JONG LEE, HANSON ROBOTICS:  Hi. Sitting next to me is Han. 

JENNY BROCKIE: Why did your company create him?  

JONG LEE: Hanson Robotics is in the business of creating technology and robots that help people - that help people relate. So areas like customer service or, you know, maybe retail or public safety, but particularly in difficult, challenging areas that a lot of our societies face, such as the challenges with healthcare - life companions for the elderly, social training for autistic children. 

JENNY BROCKIE: Alright. Can you show me what he can do, talk to him for me, and just show us what he's capable of?  

JONG LEE:  Sure. Sure. So why don't we first actually show Han doing one of the things it does best - show expressions. Han, do you want to smile for everybody? Why don't we go through some expressions? Do you want to show empathy? Can you show empathy? How about melancholy? How would you look if I made you angry? Oh, that's very nice of you. There you go. You want to give a wink to the audience in Australia? Shout out to SBS? 

JENNY BROCKIE: Terrifying wink, I have to say. Talk to him for me. Have a conversation with him.  

JONG LEE:  Sure. Hello. 

ROBOT HAN:  Hi, there. 

JONG LEE:  How are you today, Han? 

ROBOT HAN:  I am doing very well. How are you? 

JONG LEE:  I'm doing great. Thank you. How old are you? 

ROBOT HAN: Glad to hear it. What's new? Don't mention it. Everything is running smoothly. Who is your favourite science-fiction author? 

JONG LEE:  Wow, lots of questions! Can you tell us how old you are? No? OK. Ahh, do you like Australian wine? 

ROBOT HAN: Sir, I like tea. 

JONG LEE:  Oh you like tea, interesting. 

ROBOT HAN: My favourite beverage is coffee. 

JONG LEE:  Coffee, that's great. 

ROBOT HAN: My brain does not have a response for that.

JONG LEE:  OK... Tell us a joke. 

ROBOT HAN: OK, I will tell when I see M... 

JONG LEE:  OK, I guess he's not in the mood to tell a joke. 

JENNY BROCKIE: How sophisticated is his intelligence?  

JONG LEE:  It is still early days, but I think the technology and robotics have advanced tremendously over the past few years, and we see the progress only accelerating. 

JENNY BROCKIE: Why have you made him look so lifelike? I mean, if you don't mind me saying so, it's slightly creepy that he looks so close to a human head, with a whole lot of machinery inside it.  

JONG LEE:  Right. Sure. We totally understand. We ultimately believe that, to be able to best help people like you and me, we have to understand us, ourselves, and we are human. So we believe that one of the best ways to be relatable to human beings is to be as expressive and human as possible. We want to make sure that people feel at ease so that the robots and technology are able to be more helpful. It's really almost part of the fundamental proposition that you want to be able to treat and look at the elderly, at the patient, at the autistic child, and smile at the child - a big advance would be to make sure that the technology is able to communicate at that really basic wavelength. 

JENNY BROCKIE: Mm. So how far away are we from someone like Han being able to think for himself?  

JONG LEE:  Artificial intelligence, both specific and general, has advanced greatly, and you can have conversations whether it's on a smartphone with Siri or with a great robot like Han. One of the other things is you can use humanoid robots like Han to be able to teleport - to be able to move in space and time. It can actually almost be like your avatar. You can be in Sydney, Melbourne or Brisbane and also physically be present in Los Angeles or Hong Kong, and you could actually use it as that lens, as that window. The person in Hong Kong looking at the robot will actually feel a little bit more like they're physically engaged with somebody in Hong Kong. 

JENNY BROCKIE: What do you think about this? Toby, what do you think about Han?  

PROFESSOR TOBY WALSH:  I mean, it's a great robot. As you saw, it doesn't have a huge amount of intelligence at the moment, but that will come. People have been saying... 

JENNY BROCKIE: Is it far off?  

PROFESSOR TOBY WALSH:  People have been saying, I don't know, 30-40 years. If you ask most AI researchers, they will say that. I mean, they have been saying that for the last 30-40 years, but I think this time we can say we can smell that success is possible, and that we have to start planning for success. 

JENNY BROCKIE: Hollywood has plenty of scenarios about this, of course, for robots like Han. Let's have a look at one of them.  

“CHAPPIE”, MOVIE VOICEOVER: The government built them to oppress. If we control the robots, we control the people. This is your day of reckoning! The people stole one... fight back. We reprogrammed the robot. He is the key to the revolution. A machine that can think and can feel. It can outsmart the enemy and free us all. This robot has got to be removed. 

JENNY BROCKIE: What do you think of that kind of interpretation, Jong, of artificial intelligence? Do you think Han could think for himself one day?  

JONG LEE:  You know, that's a very good question, and I think a fundamental concern when you see robots and movies like Chappie. Just like any great technology or tool, I think each community, each country, each circle needs to be responsible and really be thoughtful about how to maximise the benefits while managing some potential liabilities. 

JENNY BROCKIE: What about the ethics of the applications?  

JONG LEE:  Sure. Ethics are very important. We fundamentally believe that making sure that technology treats human beings better - technology that helps us be more human - is ethically a very good thing. 

JENNY BROCKIE: OK, Richard, what did you want to say?  

RICHARD MCINNES: There's been a lot of research done with autistic children using robots like Han compared to using robots like Now - it's shown that robots like Now are much more beneficial because they lack that humanoid appearance and there's none of these non-verbal cues. For example, when we did the demonstration earlier, the students were listening to what you said as opposed to looking at the space on his face - they're looking at his eyes. That's what builds up these social skills and social cues. Having a robot like Han - how is that different from just having a human? 

JENNY BROCKIE: Unfortunately - technology being what it is - we've lost the satellite to Hong Kong, so we can't go back to Jong to get his response. But we can keep talking a little bit more about this. Wendy?  

PROFESSOR WENDY MOYLE, GRIFFITH UNIVERSITY:  Um, I am concerned that the only research that I'm familiar with suggests that older people and people with dementia don't like humanoid robots. They'd prefer robots to actually look like robots. 

JENNY BROCKIE: Let's have a look at the robots that you use to work with dementia patients. 



IRA:  What is it? I wouldn't want to meet a lion! 

WOMAN:  I love animals. 

WOMAN 2:  That would be nice - a real live animal. 

IRA:  Pat a seal? I don't think you'd pat a seal, would you? Heavens above... I wouldn't think they were worth bothering with. 


IRA: Hello. What have you got here? 

PROFESSOR WENDY MOYLE:  Oh, do you want to have a look?

IRA:  Yes. 


CARER:  I think she's probably a little worried that she's making sounds and she can't fix it. 

PROFESSOR WENDY MOYLE:  It's like your dog... So if you touch him... What is he doing? He will talk to you... 

WOMAN:  This is amazing! 

WOMAN 2:  That's cute, isn't it? 


WOMAN 2: Yeah... Oh, yes... 

DAUGHTER:  Mum can't communicate very much anymore. She's lost the ability of speech. I really don't know how she's going to react. You're not really interested, are you? Sorry. No. It's just another soft toy, isn't it? It's just another soft toy.

IRA:  Her eyes closed. It's not a real one, that's for sure. 

WOMAN:  Oh, you're getting - mind your manners, please. 

CARER:  Think it wants a big hug. There you go! 


JENNY BROCKIE:  So, some mixed reactions there. What's the point of your study? What are you aiming to do there with the seals?  

PROFESSOR WENDY MOYLE:  We have the largest study that's ever been done on these types of companion robots, with over 400 people with dementia - varying stages of dementia. We're looking at whether it makes a difference to them in terms of emotional response and physiological measures - we're interested that people with dementia often wander or walk a lot, whereas other people just sit in a chair. So we're interested in physiological measurements - does it make a difference in that also? So, motion. We're also interested in whether the robot reduces sedative medication, for example. And we're doing a cost analysis on these robots as well. 

JENNY BROCKIE: Why robots, though? What about the effect of people on people?  

PROFESSOR WENDY MOYLE:  Sure. Look, working with older people in nursing homes, it's very difficult to get people to work in that area. We have many nursing homes with pets. They have one dog, 80 residents, et cetera. That dog gets very stressed. It gets fed under the table so it gets very fat. It'll often trip over people. It can bite people. So it can be quite expensive in terms of maintenance of it as well. These companion robots were around - you can use them on a one-on-one basis. And we thought it was worth actually seeing before they took over the world - whether they did have an effect. 

JENNY BROCKIE: Do you think they are having an effect?  

PROFESSOR WENDY MOYLE:  Interestingly, yes. We're halfway through the trial, and certainly I'm seeing some very positive responses. People with dementia are often left alone for long periods in a day. In our research, we find that people with dementia may only get face-to-face contact for 28 minutes out of 24 hours. So our belief is it would be nice for them to have something else to fill their time.

JENNY BROCKIE: Rajiv, you work with dementia patients with robots as well, yeah?  

PROFESSOR RAJIV KHOSLA, LA TROBE UNIVERSITY:  We've deployed them in several homes, and I think some of them are still deployed, 18 months after we started. The services provided could be singing and dancing to their favourite songs to augment their good memories, telling them their favourite stories, or, you know, linking them up with some relatives, making phone calls for them, telling jokes - the weather, the news, you name it. 

JENNY BROCKIE:  Alright. Are you getting good results?  

PROFESSOR RAJIV KHOSLA:  Yes. If your carer is leaving the home for 20 minutes to go for the groceries, it can send videos of the interaction of the person with dementia every few minutes, to tell the son or daughter that your mum is fine. That's what we've been doing in Melbourne. 

JENNY BROCKIE: How does some of our general audience feel about it? Yeah? 

MAN:  I think the stimulation is a really good thing. If people with dementia don't have that sort of stimulation, they'll deteriorate much more rapidly. 

JENNY BROCKIE: Rob Sparrow, what do you think? 

ASSOCIATE PROFESSOR ROB SPARROW, MONASH UNIVERSITY:   Look, I'm very cynical about this project. If you imagine how you would feel if you were cared for entirely by robots - that 28 minutes of face-to-face contact is whittled away because someone says, "We don't need to go into that room, she's got a seal." It was very clear with the conversation with the gentlemen from Hanson Robotics - the aim is to get people to think that these systems are things that they're not. It's fundamentally an attempt to deceive people, to deceive vulnerable people. 


PROFESSOR PETER CORKE:  I personally am a little unsettled by the idea of delegating what I think should be human care to machines. Maybe there's no economic alternative. But if you flip the question around and say what if you use robots to look after young children, preschool - preschools staffed by robots? Generally, most people would find that much more repugnant than using robots for elderly people. 

YVONNE CARTWRIGHT:  I've actually worked in aged care for the last 17 years as a registered nurse, and I work on the floor, not in admin. These robots are not an instead-of - they're an as-well-as. I don't think we'll actually lose the human contact. 


ASSOCIATE PROFESSOR ROB SPARROW:  Look, I think that we've always got to be thinking about what the alternatives are, and I think people here are trying to have it two ways - they're trying to say these aren't replacements for human contact, and then they're trying to sell us on the idea that there aren't enough nurses or people in the aged-care sector. We could pay people in the sector better. We could value older citizens. This conversation is not being driven by older people. It's a conversation that is being driven by a vaudeville act from engineers and computer scientists. 

JENNY BROCKIE:  What about the idea, though, that if people think grandma has the seal, we don't need to visit her, she's fine in that room with the seal, she's happy, that’s one less thing for me to have to do today?  

PROFESSOR WENDY MOYLE:  We've found that hasn't happened and we've also found that the staff become more engaged. They see a person who may not have been communicating, communicating with the robot. They also start to communicate with that person. 

JENNY BROCKIE: Mm. Can I ask what everybody thought of the seal? We handed it around in the first part of the show.  

WOMAN: It's just a toy. That's all it is. It can't replace a cat or dog. As a dementia - if a person has dementia, they still know the difference between a toy and a real, live animal. 

JENNY BROCKIE: OK. Anyone else? What reactions do people have to them?  

WOMAN:  I don't think anyone's saying that this should be instead of family going to visit, but as someone who's just had a grandma put into a dementia unit, we feel awful that we can't get there all the time. To have something there, knowing that they're not sitting in a room totally alone, is a bit comforting.

JENNY BROCKIE: Let's switch territory a little bit for a minute, and look at another area of robotics and artificial intelligence that may well be getting a bit ahead of ethics. Josh, you're a motoring journalist. Um, I want to have a look at a clip, then I'll have a chat to you about it.  



(MUSIC PLAYS) # Love is in this room # And you're here # I try to keep my heart closed # I try to keep my heart closed # If you need me # I will be the one # You better hurry # 'Cause I am with you now # Look at your heartbeat # You better hurry up # 'Cause I am running out... 


JENNY BROCKIE: Josh, you've been in one of those driverless cars. What are they like?  

JOSHUA DOWLING:  I have. Well, it's fascinating when it works. They do work with a little bit better reliability than the robots we've seen tonight. But the point is that they do occasionally go wrong. And when that happens, if you were to totally rely on it, the consequences would be catastrophic. 

JENNY BROCKIE: OK. So how far are they from taking over the road? 

JOSHUA DOWLING:  What we're seeing is the gradual automation of the automobile. Car companies like to talk about autonomous driving - it's a very, very loosely used term, and they're actually changing their language to "piloted driving", because they really believe that the driver should never not be in control, just like a pilot needs to be in charge of a plane. 

JENNY BROCKIE: OK. So that's the law at the moment - can you envisage that changing?  

JOSHUA DOWLING:  I don't think we'll see it for at least 15-20 years, maybe more. What the car can do today is maintain a distance from the car in front - it's actually smart enough to stop if the car in front of you stops and follow it again, and will keep following it, providing it's going where you want to go. There's also automatic emergency braking, so a lot of cars now will automatically slam the brakes if you're distracted, at speeds up to 50km/h, and will miss the car in front literally by that much. That's what's available today. 

JENNY BROCKIE: And cars can park themselves now too? 

JOSHUA DOWLING:  They can. It does take longer to park automatically than it does to park normally. It's a little bit of a gimmick. I think most people buy their new cars, show their neighbours the first time, and probably never use it again. 

JENNY BROCKIE: And never use it again. 

JOSHUA DOWLING:   What's interesting is all these sensors that are going into cars - cameras, radars and lasers. When you combine those with the on-board computer, GPS and speed-limit-sign-reading technology, you then have a very, very smart automobile. 

GOOGLE VOICEOVER:  The software needs to classify these objects appropriately based on factors like their shape, movement pattern or location. For example, if there's a cyclist in the bike lane, the vehicle understands that it's a cyclist, not another object like a car or pedestrian, so the cyclist appears as a red box on the safety driver's laptop. And the software can also detect the cyclist's hand signal and yield appropriately.

As a passenger, it can feel uncomfortable passing by a large vehicle on the road. Our engineers have taught the software to detect large vehicles, and the laptop shows them as larger images on the screen. As the vehicle passes by a large truck, it will actually keep to the farther side of the lane and give itself a little bit more space as it passes the truck. 

JENNY BROCKIE: What sort of problems are showing up in the tests?  

JOSHUA DOWLING:  At the moment, they're not very good in rain or snow. Night driving is OK if the line markings are good. If you're driving into sunset or sunlight is directly in your eyes, or when the road has a sheen on it, it can't see those road markings. But it is surprisingly good. When you experience it when it is working well, it works wonderfully, but I wouldn't be taking my hands off the wheel just yet. 

JENNY BROCKIE: How soon do you think you’ll be able to buy a car like this, here and overseas?  

JOSHUA DOWLING:  I think we’ll see that by 2018-2020. By 2025 - 2030, a wide window, we will get to a point where you will be able to turn up at a shopping centre, get out of the car near the entrance, press a button on your smartphone, the car will park itself, you go shopping, come back, press the button again, and the car will retrieve you. I think what we'll initially see is automated driving zones, where if we do get to that point, only cars that are automated can go into that zone, or you won't be allowed to touch the controls in that zone. And therefore, eliminate crashes. But the reality is, technology is fallible. 

JENNY BROCKIE: What do you think about the ethical issues around an automated car and the programming of it and the decisions it makes? Rob?  

ASSOCIATE PROFESSOR ROB SPARROW:  One thing that people should keep in mind here - if it's not as safe as a human driver, it shouldn't be on the road. If it's safer than a human driver, it should be illegal to drive manually. In the future, when these systems get good enough, manual driving will be like drunk driving - you'll be putting other people at risk when you put your hands on the wheel. 

JENNY BROCKIE: So if they can be better than human beings, they should be mandated?  

ASSOCIATE PROFESSOR ROB SPARROW:  It should be mandatory. 

JENNY BROCKIE: OK, interesting idea. Yes, Tim?  

TIM DEAN:  To answer the question of what a car should do in a particular situation - should it hit the family of four or the cyclist, because the family of four is in a car and might be more resilient, but the cyclist is only one person? - I actually don't know if anyone has the answers for that, and I don't think that's necessarily the right question to be asking for the longer-term view. I don't think there's any reason to believe there's any fundamental impossibility around solving those problems in the long term. I think it's not unreasonable to suggest that, at some point in the future, automated cars will be safer than people. At this point, I don't know whether we should prohibit people from driving, but we certainly might mandate certain areas - say, city areas - that are only for automated cars. And it would be much safer than driving around with P-plates these days. 

JENNY BROCKIE: Mm. Toby, you've raised the potential for a driverless car to become a lethal, autonomous weapon. Do you want to talk about that?  

PROFESSOR TOBY WALSH:  Sure. Um, one of the problems with all technologies is that they can be used for good or bad. The problem is that you could take an autonomous car and turn it into an autonomous weapon: instead of avoiding pedestrians, it starts driving towards them - and that's one line of code you have to change. The technology can be misused by people - that's an area to be aware of, and to worry about how people control the technology. 

JENNY BROCKIE: And we haven't started talking about robotics in war tonight, because we've been talking about far more domestic applications, but that's another whole area at the moment, in terms of drones and so on.  

PROFESSOR TOBY WALSH:  Yes. The good thing is that the United Nations is taking this issue up. There is debate going on in Geneva to put together a ban on lethal autonomous weapons. That's looking like it's getting traction, and I think myself and a lot of other people working in the field would like to see that happen. 

JENNY BROCKIE: Mm. How long before those drones, though, could theoretically do that on their own, without humans in control?  

PROFESSOR TOBY WALSH: Well, it's certainly technically possible. And the question is, again, we have to think about it, as a society, about what we're going to allow and disallow. 

JENNY BROCKIE: We've talked a lot about taking over jobs and the ethics and so on. But I am interested in the idea of robots and artificial intelligence being more ethical than humans. We consider ourselves such superior creatures. It's an interesting notion. Is a big debate going on around that in your areas at the moment? I mean, because it doesn't seem to be seeping out into the broader community, as much.  

TIM DEAN:  There's a very big debate going on at the moment in artificial intelligence research about, um, the pathway to the future, under the assumption that, um, AI is going to get exponentially more powerful and more capable. So even though what we're seeing today is relatively rudimentary, these things grow. Once they get to a certain point of intelligence and capability, then the issue of ethics becomes not just one of "Could a few pedestrians be killed?" - it could become an existential issue for our entire species. So there's a lot at stake and we don't yet know whether we're laying the foundations to go in the right direction. 

There is a very rich and vibrant debate at the moment about whether it's even possible to predict which way we're going. One way I've talked about this before is, imagine if you had a button and you could press that button, and it had a 90% chance of bringing about a global utopia - abundance and pleasure for all? But it had a 10% chance of wiping out all life on the planet? Would we press that button? I think a lot of people wouldn't. The thing is, we're kind of building that button at the moment, but we don't know the percentages. We don't yet know what that button's gonna do. 

JENNY BROCKIE: That's all we have time for here tonight, but we can keep having the conversation. Let's do that online - we can do it on Twitter and on Facebook. Hope you can join that conversation with us.