SLIDE 1

2/21/19

HRI Ethics

Ethics and Robotics

Image captions:
  • Modular Advanced Armed Robotic System (MAARS): American remote-control gunbot
  • ATLAS
  • Nao pill reminder robot
  • Robear transfer robot

What is Ethics?

  • The study of right and wrong action
  • Descriptive (or comparative) ethics:
    • How do people think about these issues?
    • What are the laws? What are the cultural norms?
  • Prescriptive ethics:
    • How should one act?
    • What should the laws and norms be?
  • Applied ethics:
    • Identifying the correct course of action for real problems

Neither absolute nor relative. Extremely near-term questions!

Normative Ethics (“Ethics”)

  • Consequentialism/utilitarianism (consequence based)
    • Maximize the overall good
    • 'The greatest good for the greatest number.'
  • Deontology (duty based)
    • It is moral to follow rules, and intent is what matters.
    • 'Because it's the law.'
  • Contractualism (contract based)
    • Do exactly those things where a rational agent would want to live in a world where everyone did those things.
    • 'Golden Rule'
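Two of the frameworks above can be contrasted as toy action-selection policies. This is a sketch of my own, not from the lecture: the action names, utility numbers, and rule set are invented for illustration, and contractualism is omitted for brevity.

```python
# Toy contrast of two normative frameworks as action-selection policies.
# All actions, utilities, and rules are invented for illustration.

def consequentialist_choice(actions):
    # Utilitarianism: maximize the summed good across everyone affected.
    return max(actions, key=lambda a: sum(a["utilities"]))

def deontological_choice(actions, rules):
    # Deontology: following the rules is what matters, regardless of outcome.
    permitted = [a for a in actions if not (a["violates"] & rules)]
    return permitted[0] if permitted else None

actions = [
    # "swerve" produces the best total outcome but breaks a no-harm rule.
    {"name": "swerve", "utilities": [10, -2], "violates": {"do_no_harm"}},
    {"name": "brake",  "utilities": [-2, -2], "violates": set()},
]
rules = {"do_no_harm"}

print(consequentialist_choice(actions)["name"])      # -> swerve (total +8)
print(deontological_choice(actions, rules)["name"])  # -> brake (rule-abiding)
```

The point of the contrast: the same situation yields different "right" actions depending on which framework is encoded, which is exactly the design problem the rest of the lecture circles around.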

Robot Vision

  • You’re building a vision system:

Cybernetics

SLIDE 2

Categories of Harm

  • Robots can physically harm people.
  • How?
  • Negligently
  • Deliberately
  • ..?
  • Robots can impinge on people's self-determination.
  • Robots can fail to act as expected, causing…
    • Physical harm
    • Emotional distress
    • Disappointment

"Victimless" ethics is rarely "applied." This is a huge category, but is it the most likely?

Meta-Questions

  • Questions we will not answer today:
    • What do "right" and "wrong" mean?
    • Who gets to decide what's right and wrong?
    • How do/should those decisions be made?
    • What should we do about things that are wrong?
  • We'll use commonly understood ideas of wrong:
    • It's wrong to harm people
      • Physically, emotionally, financially…
    • It's wrong to discriminate against people
    • It's wrong to steal from people
    • It's wrong to invade people's privacy
    • It's wrong to be unfair to people

“Without extenuating circumstances,” and understanding that sometimes there’s no “right” alternative

Categories of Harm

  • Robots can change our definition of "humanity."
  • Robots can have rights that are impinged upon.
  • Robots can discriminate.
  • Robots can do environmental damage.
  • Robots can increase the have/have-not gap.
    • For people
    • For nations

What Robots, and How?

  • Military robots
  • Caretaker robots
    • Elderly
    • Children
  • Assistive robots
  • Exploration robots
  • Factory robots
  • Surgical robots
  • Cybernetics
  • Search-and-rescue
  • Automated cars
  • Bipeds and quadrupeds
  • Why is ATLAS scary?

Big Questions

  • Can computers "hurt" people? (Sure.)
  • What about robots? (Even more so.)
  • Can a machine be "unfair" (discriminatory)? (Sort of. There's a GIGO aspect.)
  • Why do we, as roboticists, care? (Ethics and morals; legal liability.)
  • What are some immediate issues, right now?

Topics

  • Drive discussion with an example: Self-driving cars
  • And generalize from there
SLIDE 3

Self-Driving Cars

  • Cars can hurt or kill people.
  • How many fatalities is acceptable?
  • Is it enough to not cause accidents?
    • People cause accidents!
    • ~38,000 deaths per year in the U.S.
    • Lately it's been going up
    • How many of you text and drive?
  • Do cars have to be perfect? Just better than humans? Somewhere in between?

Harder Questions

  • What about naked self-driving cars?
    • No control mechanisms inside at all
  • Should it be legal for a person to drive?
    • Even if cars are demonstrably better at it?
  • Why?
    • Because I wanna?
    • Because we dislike giving up control?
    • Even if you accept the risks, what about my rights?
  • Who's legally liable? ← this is a big question that will affect the future

The Hardest One

  • When an accident is inevitable…
  • Should the car occupants get hurt?
  • That is, the person who paid for it?
  • If it’s not their fault?
  • Would you buy a car that could hurt or kill you?
  • If it could be avoided by hurting or killing someone else?
  • Can you buy any other kind? What’s the difference?
  • But consider:
  • Would you swerve to avoid a kid in the road?
  • What about a baby stroller?
  • Who should be deciding these things? Uber?

Harm in HRI

  • Therapy: Taking it away
  • Privacy rights and vulnerable populations
  • Physical contact
  • Discouraging bonding
  • Deceptive studies
  • Sales agents
  • Haves and have-nots
  • Diversity

Code of Ethics

  • Dignity: Respect privacy, frailty, and emotional needs
  • Design: Be transparent, predictable, and trustworthy; convey status; allow opting out
  • Legality: Respect all laws; all decisions reconstructable; always seek informed consent
  • Social: Avoid deception; consider emotional bonding; disallow human morphology; avoid -ist behaviors
  • Do we like it?
SLIDE 4

Ethical Governor

  • A component that intervenes in ethical cases
  • Uses rules to provide actions

Rules

  • Drawn from existing literature
    • …by computer science students
  • "More sensitive signals [are] hard to detect … currently, but we can add those more sensitive signals to our architecture later by developing the technology."
  • Anger-based
  • Emotional or physical withdrawal based
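The governor idea can be sketched as a filter between the planner's proposed action and the actuators: each rule inspects the perceived situation and may substitute an override action. This is my own minimal illustration, not the system from the lecture; the rule triggers here loosely follow the anger-based and withdrawal-based rule types mentioned above, and every name is invented.

```python
# Toy ethical-governor sketch: rules inspect the perceived situation and may
# veto or replace the planner's proposed action. All names are illustrative.

def anger_rule(situation, proposed_action):
    # If the patient shows anger, back off rather than persist.
    if situation.get("patient_state") == "angry":
        return "pause_and_deescalate"
    return None  # rule does not fire

def withdrawal_rule(situation, proposed_action):
    # Emotional or physical withdrawal also triggers intervention.
    if situation.get("patient_state") == "withdrawn":
        return "notify_caregiver"
    return None

RULES = [anger_rule, withdrawal_rule]

def govern(situation, proposed_action):
    """Return the first rule-mandated override, else pass the proposal through."""
    for rule in RULES:
        override = rule(situation, proposed_action)
        if override is not None:
            return override
    return proposed_action

print(govern({"patient_state": "angry"}, "continue_exercise"))  # -> pause_and_deescalate
print(govern({"patient_state": "calm"}, "continue_exercise"))   # -> continue_exercise
```

The scalability worry raised in the evaluation shows up directly in this shape: every new harm category means another hand-written rule in the list.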

Rule Structure

Evaluation and Conclusions

  • Evaluation
    • Reviewed by an OT expert
    • Qualitative analysis → changes
    • Actual user trials planned
  • Pros and cons
    • PD-specific
    • Early stages
    • Is it scalable? (No. Not at all. Does it matter?)
    • How did they do w.r.t. Riek?

Overall

  • What are our responsibilities?
  • What about cultural context?
  • Caregivers vs. patients
  • Diversity, again
  • Military applications

Disabled, Able, and Super-Able

  • Disabled: "Having a physical or mental condition that limits movements, senses, or activities."
  • Compared to whom? I can't lift 180 kg.
  • 20/20 vision is comparatively new.
  • Prosthetics: pacemakers, cochlear implants, bionic limbs, retinal implants
  • Beyond able-bodied…
    • For work: TALOS; EKSO
    • For information: Google Glass
    • For fun?