Some Afterthoughts on Knowledge-based Obligation — Rahul Bendre (Graduate Center of CUNY; IIT Kanpur), Rohit Parikh (Graduate Center of CUNY; Brooklyn College of CUNY), Andreas Witzel (Graduate Center of CUNY; ILLC, University of Amsterdam) — Jan 9, 2009
Outline Introduction The Easy Cases The Kitty Genovese case Issues with the Original Model Ideas and Suggestions Action Tuples as Events Adding Probabilities and Expected Values Putting it all Together Applying the Framework Previous Examples Revisited Diffusion of Obligation Moral Obligation: a Co-ordination Game Summary References
Introduction In [KBO] the authors developed a framework for dealing with situations where an agent's moral obligation depends on her knowledge. Two contrasting cases discussed in that paper were: ◮ Example 1: Uma is a physician whose neighbour is ill. Uma does not know and has not been informed. Uma has no obligation (as yet) to treat the neighbour. ◮ Example 2: Uma is a physician whose neighbour Sam is ill. The neighbour's daughter Ann comes to Uma's house and tells her. Now Uma does have an obligation to treat Sam, or perhaps to call an ambulance or a specialist. The difference in obligation arises because in the second example Uma has knowledge which she does not have in the first.
Knowledge Framework The [KBO] paper uses the history-based semantics of [PR]. ◮ Events happen sequentially to form a Global History. ◮ The Local History h of an agent i is a homomorphic image of a finite prefix H_t of the actual Global History. ◮ The Local View function λ_i derives the local history from the global history: λ_i(H_t) = h. ◮ Each Local History may be compatible with many Global Histories: [h]_i = { H ∈ H | λ_i(H_t) = h }. ◮ There is an equivalence relation ∼_i given by: H_t ∼_i H′_t iff λ_i(H_t) = λ_i(H′_t). ◮ An agent knows a formula φ about the global situation if φ is true of all global histories compatible with her local history. This framework, together with an assignment of social utilities to the Global Histories, is used to define the obligation of an agent to do the action a which maximizes social utility.
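The local-view machinery above can be rendered as a small sketch (a toy encoding, not from [KBO]; the function names `local_view`, `compatible`, and `knows`, and the event labels, are my own, with `'-v'` standing in for ¬v):

```python
# Toy sketch of the history-based semantics of [PR].
# Histories are tuples of events; an agent observes only some events,
# seeing a non-informative clock tick 'c' for the rest.

def local_view(history, observable):
    """lambda_i: project a global history onto the agent's observables."""
    return tuple(e if e in observable else 'c' for e in history)

def compatible(histories, observable, local):
    """[h]_i: all global histories whose local view equals local."""
    return [G for G in histories if local_view(G, observable) == local]

def knows(histories, observable, local, phi):
    """The agent knows phi iff phi holds in every compatible history."""
    return all(phi(G) for G in compatible(histories, observable, local))

# Example 1: Uma cannot observe whether Sam visits (event 'v' / '-v').
H = [('-v', '-h'), ('v', 'h'), ('-v', 'h'), ('v', '-h')]
uma_obs = {'h', '-h'}
print(knows(H, uma_obs, ('c', 'h'), lambda G: G[0] == 'v'))  # False
```

Here `knows` simply checks the truth of φ over the equivalence class [h]_i, matching the last bullet above.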
Knowledge-based Obligation ◮ An agent i has a knowledge-based obligation to perform action a iff a is an action that (only) i can perform and i knows that it is good to perform a. ◮ For an agent i to know that it is good to perform an action a, all maximum-valued extensions of all the histories considered possible by i must have a as their next action. ◮ An underlying assumption of this framework is that all the agents act in a turn-based manner. We shall see how this assumption affects the application of the framework to certain examples.
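Under the turn-based assumption, the obligation clause can be sketched as a check over the prefixes the agent considers possible (an illustrative toy; the helper names `best_next_events` and `obligation` are my own, and histories are tuples of events with utilities in a dict):

```python
def best_next_events(histories, values, prefix):
    """Next events of the maximum-valued histories extending prefix."""
    exts = [G for G in histories if G[:len(prefix)] == prefix]
    best = max(values[G] for G in exts)
    return {G[len(prefix)] for G in exts if values[G] == best}

def obligation(histories, values, compatible_prefixes):
    """a is obligatory iff every maximum-valued extension of every
    prefix the agent considers possible has a as its next event."""
    nexts = set()
    for p in compatible_prefixes:
        nexts |= best_next_events(histories, values, p)
    return nexts.pop() if len(nexts) == 1 else None

# Two histories extend the prefix (v, m); only (v, m, h) is maximal,
# so h comes out as the unique obligatory action.
H = [('v', 'm', 'h'), ('v', 'm', '-h')]
vals = {('v', 'm', 'h'): 2, ('v', 'm', '-h'): 0}
print(obligation(H, vals, {('v', 'm')}))  # h
```

When different compatible prefixes disagree on the best next event, `obligation` returns `None`: the agent does not know that any single action is good, so no knowledge-based obligation arises.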
Sam and Uma Example 1: Uma is a physician whose neighbour Sam is ill. Uma does not know and has not been informed. Uma has no obligation (as yet) to treat the neighbour. ◮ The histories in the protocol are: H = { H1 = (¬v¬h), H2 = (vh), H3 = (¬vh), H4 = (v¬h) }, in decreasing order of social utility. ◮ Uma cannot observe Sam's action (the first event), so her local history at this time consists of a non-informative clock tick c. ◮ At this time her local history is compatible with all histories in the protocol, and she has a Default Obligation 1 to do action ¬h. Thus, Uma has no obligation to offer her help to Sam, due to her lack of knowledge. 1 A system of [Grove Spheres] is defined on the protocol; a default obligation is an obligation that holds only in the most plausible histories considered by the agent.
Sam, Ann and Uma Example 2: Uma is a physician whose neighbour Sam is ill. The neighbour's daughter Ann comes to Uma's house and tells her. Now Uma does have an obligation to treat Sam, or perhaps to call an ambulance or a specialist. ◮ H = { (¬v¬m¬h), (vmh), (¬v¬mh), (vm¬h) }. Thus, the protocol prohibits Ann from lying. ◮ After Uma receives the message, at time t2, her local history looks like this: (cm). ◮ This knowledge makes the histories (¬v¬m¬h) and (¬v¬mh) incompatible with her local history. ◮ Of the remaining histories (vmh) and (vm¬h), Uma now knows that (only) she can perform the action h, which guarantees the maximum-valued history, making it her Knowledge-Based Obligation. The [KBO] framework works perfectly in these cases, capturing the notion of a moral obligation and the influence of knowledge on it.
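Example 2 can be walked through end-to-end in the same style of toy encoding (the event labels and the utility numbers 4..1 are mine, chosen only to respect the decreasing-utility order on the slide):

```python
def local_view(G, obs):
    """Project a (prefix of a) global history onto the observable events."""
    return tuple(e if e in obs else 'c' for e in G)

# Histories listed in decreasing order of social utility, as on the slide.
H = [('-v', '-m', '-h'), ('v', 'm', 'h'), ('-v', '-m', 'h'), ('v', 'm', '-h')]
vals = dict(zip(H, [4, 3, 2, 1]))
uma_obs = {'m', '-m', 'h', '-h'}   # Uma sees messages and her own action

local = ('c', 'm')                 # Uma's local history after Ann's visit
prefixes = {G[:2] for G in H if local_view(G[:2], uma_obs) == local}
best_next = set()
for p in prefixes:
    exts = [G for G in H if G[:2] == p]
    top = max(vals[G] for G in exts)
    best_next |= {G[2] for G in exts if vals[G] == top}
print(best_next)  # {'h'}: treating Sam is Uma's knowledge-based obligation
```

Only the prefix (v, m) survives the comparison with Uma's local view (c, m), and its unique maximal extension is (v, m, h), mirroring the reasoning on the slide.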
The story The K.G. case is a classic murder case that poses challenges to attempts to explain when feelings of obligation arise. ◮ In the early morning hours of March 13, 1964, Catherine Genovese was brutally attacked and killed in the Kew Gardens section of Queens, New York City. ◮ Many neighbours saw what was happening, but no one called the police. ◮ People reasoned that there must have been many calls already and that there was no point in calling themselves and adding to the commotion. ◮ When the police finished polling the immediate neighbourhood, they discovered at least 38 people who had heard or observed some part of the fatal assault on Kitty Genovese. Some 35 minutes passed between the attack on Kitty and someone calling the police. This runs counter to the intuition that the more people are present, the more likely it should be that one of them will intervene and help.
Bystander Effect ◮ The Kitty Genovese case sparked off a line of research in psychology examining the so-called bystander effect. ◮ Diffusion of Responsibility is offered as one explanation of this effect: each bystander assumes that someone else is going to intervene and so refrains from acting himself. ◮ It is intriguing to try to formalize the underlying issues of knowledge and obligation in this case, and diffusion of responsibility is the explanation which most naturally lends itself to mathematical modeling. ◮ Some technical assumptions in the [KBO] framework create problems for explaining the diffusion of responsibility in the K.G. case. ◮ Our modifications to the [KBO] framework will be directed at capturing this idea of diffusion of responsibility. In the subsequent sections we will look at the issues with the original model and some suggestions for overcoming them.
The Original Model The following are the salient points of the original [KBO] model: ◮ The original framework consists of: ◮ A set of agents; ◮ Sets of (disjoint) actions, or events, for each agent; ◮ A set of possible histories, which are basically (infinite) sequences of events that take place; ◮ And an associated value representing each history's utility to society. ◮ All possible extensions of any finite prefix of a possible history start with some action of one and the same agent. That is, at any point in time, there is exactly one agent whose turn it is to take the next action or effect the next event.
The Original Model ◮ Agents can only observe certain events (including their own actions and possibly those of other agents), and can only distinguish histories which differ in events observable by them (given by λ i and ∼ i ). ◮ An agent knows everything which holds of all the histories that he cannot distinguish from the actual one. ◮ Some agent’s action a is obligatory at some finite prefix of a history, if all maximum-valued extensions of that prefix start with a , and she knows this. Now, we would like to try to use this formal framework in order to explain why no single bystander felt an obligation to act in the K.G. case.
The Model applied to the KG case In [KBO], the formal details are not spelled out; trying to do that, we encounter some complications. ◮ First, assume that the values of histories decrease with time (in this emergency case). ◮ So, due to the (commonly known) turns, the agent whose turn it is immediately after the event does have the obligation to act because there is exactly one history with maximum value and that agent knows this. ◮ On the other hand, assume that all histories in which anyone helps at just any time have the same utility. Then, indeed no single agent has the obligation to help.
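The contrast between the two assumptions can be sketched numerically (a toy with made-up numbers: a history in which someone calls at round t has value 10 − t when utility decays, and a constant value 10 when it does not matter when someone calls):

```python
def best_rounds(n, decay):
    """Rounds at which calling the police yields maximal social utility,
    for n bystanders acting in (commonly known) turns."""
    values = {t: (10 - t if decay else 10) for t in range(n)}
    best = max(values.values())
    return [t for t, v in values.items() if v == best]

print(best_rounds(5, decay=True))   # [0]: only the first turn is best
print(best_rounds(5, decay=False))  # [0, 1, 2, 3, 4]: any time is as good
```

With decaying values there is exactly one maximum-valued history, so the agent whose turn comes first is obliged; with flat values every round ties, and no single agent's action is uniquely best.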
Issues with the Model ◮ So, the only way to explain the K.G. case using the framework from [KBO] is by making the unintuitive assumption that it does not matter how much time passes before someone calls the police. ◮ And, in a way, that removes the whole twist of the case, since if it indeed did not matter how much time passed, then it would not be a scandal that it took 35 minutes for the first call to come in. ◮ The assumption of turn-based actions is the main cause for the framework to fail, at a technical level, to explain the behavior observed in the KG case.