HRI Ethics

  1. 2/21/19 Ethics and Robotics: HRI Ethics
     Title-slide images: Nao pill-reminder robot; MAARS (Modular Advanced Armed Robotic System), an American remote-control gunbot; ATLAS; Robear transfer robot.
     What is Ethics?
     • Normative ethics ("Ethics"): the study of right and wrong action.
       • Consequentialism/utilitarianism (consequence based): maximize the overall good; "the greatest good for the greatest number."
       • Deontology (duty based): it is moral to follow rules, and intent is what matters; "because it's the law."
       • Contractualism (contract based): do exactly those things where a rational agent would want to live in a world where everyone did those things; the "Golden Rule."
     • Descriptive (or comparative) ethics: How do people think about these issues? What are the laws? What are the cultural norms?
     • Prescriptive ethics: How should one act? What should the laws and norms be?
     • Neither absolute nor relative.
     • Applied ethics: identifying the correct course of action for real problems; extremely near-term questions!
     Robot Vision
     • You're building a vision system:
     Cybernetics

  2. 2/21/19
     Categories of Harm
     • Robots can physically harm people.
       • How? Negligently, deliberately, ...?
       • This is a huge category, but is it the most likely?
     • Robots can impinge on people's self-determination.
     • Robots can fail to act as expected, causing physical harm, emotional distress, or disappointment.
     Meta-Questions
     • Questions we will not answer today:
       • What do "right" and "wrong" mean?
       • Who gets to decide what's right and wrong?
       • How do/should those decisions be made?
       • What should we do about things that are wrong?
     • We'll use commonly understood ideas of wrong:
       • It's wrong to harm people (physically, emotionally, financially...)
       • It's wrong to discriminate against people
       • It's wrong to steal from people
       • It's wrong to invade people's privacy
       • It's wrong to be unfair to people
     • "Without extenuating circumstances," and understanding that sometimes there's no "right" alternative; "victimless" ethics is rarely "applied."
     Categories of Harm (cont.)
     • Robots can change our definition of "humanity."
     • Robots can have rights that are impinged upon.
     • Robots can discriminate.
     • Robots can do environmental damage.
     • Robots can increase the have/have-not gap: for people, for nations.
     What Robots, and How?
     • Military robots, cybernetics, caretaker robots (search-and-rescue, elderly, children), automated cars, assistive robots, bipeds and quadrupeds, exploration robots, factory robots, surgical robots.
     • Why is ATLAS scary?
     Big Questions
     • Can computers "hurt" people? Sure.
     • What about robots? Even more so.
     • Can a machine be "unfair" (discriminatory)? Sort of; there's a GIGO (garbage in, garbage out) aspect.
     • Why do we, as roboticists, care? Ethics and morals; legal liability.
     • What are some immediate issues, right now?
     Topics
     • Drive discussion with an example: self-driving cars.
     • And generalize from there.

  3. 2/21/19
     Self-Driving Cars
     • Cars can hurt or kill people.
       • How many fatalities is acceptable?
     • Should it be legal for a person to drive, even if cars are demonstrably better at it?
       • Why? Because I wanna? Because we dislike giving up control?
     • People cause accidents! ~38,000 deaths per year in the U.S., and lately it's been going up.
       • How many of you text and drive?
     • Do cars have to be perfect? Just better than humans? Somewhere in between? (A rough per-mile rate is worked out after this page.)
     • Who's legally liable? ← This is a big question that will affect the future.
     Harder Questions
     • What about naked self-driving cars? No control mechanisms inside at all.
     • Is it enough to not cause accidents?
     • Even if you accept the risks, what about my rights?
     The Hardest One
     • When an accident is inevitable...
       • Should the car occupants get hurt? That is, the person who paid for it? If it's not their fault?
     • Would you buy a car that could hurt or kill you?
       • If it could be avoided by hurting or killing someone else?
       • Can you buy any other kind? What's the difference?
     • But consider: Would you swerve to avoid a kid in the road? What about a baby stroller?
     • Who should be deciding these things? Uber?
     Harm in HRI
     • Therapy: taking it away
     • Privacy rights and vulnerable populations
     • Physical contact
     • Discouraging bonding
     • Deceptive studies
     • Sales agents
     • Haves and have-nots
     • Diversity
     Code of Ethics
     • Dignity: respect privacy, frailty, and emotional needs.
     • Design: be transparent, predictable, and trustworthy; convey status; allow opting out.
     • Legality: respect all laws; all decisions reconstructable; always seek informed consent.
     • Social: avoid deception; consider emotional bonding; disallow human morphology; avoid -ist behaviors.
     • Do we like it?
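     As a rough benchmark for the "just better than humans?" question above: the slide's ~38,000 U.S. deaths per year, divided by the roughly 3.2 trillion vehicle-miles driven annually in the U.S. (an outside ballpark figure, not from the slides), works out to on the order of one fatality per hundred million miles:

     \[
     \frac{3.8\times 10^{4}\ \text{deaths/yr}}{\approx 3.2\times 10^{12}\ \text{miles/yr}} \approx 1.2\times 10^{-8}\ \text{deaths/mile} \approx 1.2\ \text{deaths per } 10^{8}\ \text{miles.}
     \]

     Under that hedged baseline, "just better than humans" would mean demonstrating a fatality rate below roughly one per hundred million miles driven.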

  4. 2/21/19
     Ethical Governor
     • A component that intervenes in ethical cases
       • ...by computer science students
     • "More sensitive signals [are] hard to detect ... currently, but we can add those more sensitive signals to our architecture later by developing the technology."
     • Uses rules to provide actions. (A minimal sketch of this idea follows after this page.)
     Rules
     • Drawn from existing literature
     • Anger-based
     • Emotional or physical withdrawal based
     Rule Structure
     Evaluation and Conclusions
     • Evaluation: reviewed by an OT expert; qualitative analysis → changes; actual user trials planned.
     • Pros and cons: PD-specific; early stages; is it scalable? (No. Not at all. Does it matter?)
     • How did they do w.r.t. Riek?
     • Overall
     Disabled, Able, and Super-Able
     • Disabled: "Having a physical or mental condition that limits movements, senses, or activities."
       • Compared to who? I can't lift 180 kg.
       • 20/20 vision is comparatively new.
     • Prosthetics: pacemakers, cochlear implants, bionic limbs, retinal implants.
     • Beyond able-bodied...
       • For work: TALOS; EKSO
       • For information: Google Glass
       • For fun?
     • What are our responsibilities?
     • What about cultural context?
     • Caregivers vs. patients
     • Diversity, again
     • Military applications
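     To make the "uses rules to provide actions" idea concrete, here is a minimal, hypothetical sketch of a rule-based governor that vetoes or replaces a proposed robot action when a monitored user signal (e.g. anger, withdrawal) fires. The rule format, signal names, and action names are assumptions for illustration only; they are not the rule structure of the paper discussed in lecture.

     ```python
     # Hypothetical sketch of a rule-based ethical governor (illustrative only;
     # the rule format, signal names, and actions are assumptions, not the paper's).

     from dataclasses import dataclass

     @dataclass
     class Rule:
         signal: str      # detected user signal, e.g. "anger" or "withdrawal"
         override: str    # action the governor substitutes when the signal fires

     # Per the slide, rules are drawn from existing literature:
     # anger-based and emotional/physical-withdrawal-based interventions.
     RULES = [
         Rule(signal="anger", override="pause_and_apologize"),
         Rule(signal="withdrawal", override="back_off_and_offer_break"),
     ]

     def govern(proposed_action: str, detected_signals: set) -> str:
         """Pass the planner's action through unless a rule fires on a detected signal."""
         for rule in RULES:
             if rule.signal in detected_signals:
                 return rule.override      # governor intervenes
         return proposed_action            # no concern detected: action passes through

     # Example: the task planner wants to repeat a prompt, but the user shows anger.
     print(govern("repeat_medication_prompt", {"anger"}))  # -> pause_and_apologize
     ```

     More sensitive signals could be added later, as the quoted passage suggests, by extending the rule list once the corresponding detectors exist.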
