Towards Semantics for Provenance Security
Stephen Chong, Harvard University
TaPP '09

Provenance security

- Some data are sensitive. Must ensure provenance does not reveal sensitive data.
  E.g., "John participated in medical study S" reveals "John has disease D".
- Some provenance is sensitive.
  - Must ensure the output does not reveal sensitive provenance.
    E.g., workshop referee reports should not contain the name/email of the referee.
  - Must ensure provenance does not reveal sensitive provenance.
    E.g., if a student is in a disciplinary hearing, then the student's advisor must attend; "Prof. Smith participated as an advisor" may reveal "John participated as respondent".
- How do we know if we have security right? The interaction between information security and provenance is complex and not well understood.

Semantics for provenance security

Goal: precise, useful, intuitive definitions of provenance security, to understand provenance-security principles and mechanisms to apply in practice.

This work: formal definitions for provenance security, ensuring that
- public data does not reveal sensitive provenance,
- public provenance does not reveal sensitive provenance,
- public provenance does not reveal sensitive data,
- (public data does not reveal sensitive data).

Language model

Simple language-based model (based on Cheney, Acar, and Ahmed [2008]).

A program c has input locations and produces a single output:
  ⟨l1 = v1, …, ln = vn ; c⟩ ⇒ v

Provenance is a trace T describing the execution:
  ⟨l1 = v1, …, ln = vn ; c⟩ ⇒ v with trace T

E.g., ⟨l1 = 3, l2 = 5, l3 = 7 ; x = l1; if (x) then l2 else l3⟩ ⇒ 5
with trace x = l1; cond(x, true, l2).

Partial provenance: parts of T may be elided, written ε. For the execution above, increasingly partial traces include:
  x = l1; cond(x, true, l2)    (the full trace)
  cond(x, true, l2)
  cond(x, true, ε)
  cond(x, ε, ε)
  ε                            (everything elided)
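The evaluation-with-trace judgment and the elision relation above can be sketched concretely. The encoding below (statement tuples, list-shaped traces, the `erases` check) is illustrative, not the paper's formal semantics:

```python
# A minimal sketch of the toy language: a program reads input locations,
# may bind variables, and a conditional chooses between two locations.
# Evaluation returns both the output value and a provenance trace;
# `erases` checks that a partial trace is obtained from a full trace by
# dropping a prefix and/or replacing components with the marker EPSILON.

EPSILON = "ε"

def run(inputs, program):
    """Evaluate `program` (a list of statements) against `inputs`
    (a dict mapping location names to values); return (value, trace).
    Hypothetical statement encoding:
      ("assign", x, loc)        -- x = loc
      ("cond", x, loc_t, loc_f) -- if (x) then loc_t else loc_f
    """
    env, trace, value = {}, [], None
    for stmt in program:
        if stmt[0] == "assign":
            _, x, loc = stmt
            env[x] = inputs[loc]
            trace.append(("assign", x, loc))
        elif stmt[0] == "cond":
            _, x, loc_t, loc_f = stmt
            branch = bool(env[x])
            value = inputs[loc_t] if branch else inputs[loc_f]
            trace.append(("cond", x, branch, loc_t if branch else loc_f))
    return value, trace

def erases(partial, full):
    """True if `partial` elides (parts of) `full`: the whole trace may be
    EPSILON, a prefix of steps may be dropped, and each remaining step is
    either EPSILON or agrees with the full step componentwise."""
    if partial == EPSILON:
        return True
    if len(partial) > len(full):
        return False
    tail = full[len(full) - len(partial):]
    return all(p == EPSILON or
               all(a == EPSILON or a == b for a, b in zip(p, f))
               for p, f in zip(partial, tail))

# The running example: l1=3, l2=5, l3=7; x = l1; if (x) then l2 else l3
inputs = {"l1": 3, "l2": 5, "l3": 7}
program = [("assign", "x", "l1"), ("cond", "x", "l2", "l3")]
value, trace = run(inputs, program)
```

Under this encoding the example evaluates to 5 with trace `x = l1; cond(x, true, l2)`, and each of the increasingly partial traces listed above is accepted by `erases` against it.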

Security policies

Each input location has a security policy for both its data and its provenance, e.g.:
  Γ(l1) = LL    Γ(l2) = LH    Γ(l3) = HH

(Here the first letter is the data policy and the second the provenance policy.)
  Data security: H = high security (secret), L = low security (public)
  Provenance security: H = high provenance (secret), L = low provenance (public)

The user knows the low-security inputs, and is given the output and a partial provenance trace.
- The user should not learn high-security data.
- The user should not learn which high-provenance locations were involved in the computation.

What (partial) provenance can we give to the user?
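A policy table like the one above can be modeled as a map from locations to two-letter strings. The sketch below assumes the first letter is the data level and the second the provenance level; names are illustrative:

```python
# Security policies as (data, provenance) level pairs, e.g. "LH" means
# low-security data with high (secret) provenance.

HIGH, LOW = "H", "L"

policies = {"l1": "LL", "l2": "LH", "l3": "HH"}

def data_level(loc):
    return policies[loc][0]

def provenance_level(loc):
    return policies[loc][1]

def low_security_inputs(inputs):
    """The part of the input store the user is assumed to know."""
    return {loc: v for loc, v in inputs.items() if data_level(loc) == LOW}

def high_provenance_locations():
    """Locations whose involvement in a computation must stay secret."""
    return {loc for loc in policies if provenance_level(loc) == HIGH}
```

With the example policy, the user knows the values of l1 and l2, and must not learn whether l2 or l3 was involved in the computation.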

First attempt

We might think T is secure for execution ⟨l1 = v1, …, ln = vn ; c⟩ ⇒ v if:
  ⟨l1 = v1, …, ln = vn ; c⟩ ⇒ v with trace T, and
  T does not contain any high-provenance locations.

E.g., ⟨… ; if (l1) then l2 + l3 else l4 + l5⟩ ⇒ 5 with trace cond(l1, true, l2 + l3), under the policies
  Γ(l1) = HL   Γ(l2) = HH   Γ(l3) = HL   Γ(l4) = HH   Γ(l5) = HL

The trace cond(l1, true, l2 + l3) mentions the high-provenance location l2, so we elide it, giving the partial trace cond(l1, true, ε + l3). This passes the check, yet it still reveals that the true branch was taken, so the elided location can only be l2: syntactic absence of high-provenance locations is not enough.
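The first-attempt check is purely syntactic, which a few lines make precise. Traces here are nested tuples over location names and the elision marker, an illustrative encoding rather than the paper's:

```python
# The "first attempt": a (partial) trace is accepted iff no high-provenance
# location appears anywhere in it syntactically.

EPSILON = "ε"
HIGH_PROVENANCE = {"l2", "l4"}   # from Γ(l2) = Γ(l4) = HH in the example

def locations_in(trace):
    """All location names mentioned anywhere in a nested trace
    (locations are assumed to be strings starting with 'l')."""
    if isinstance(trace, (tuple, list)):
        return set().union(*(locations_in(t) for t in trace))
    return {trace} if isinstance(trace, str) and trace.startswith("l") else set()

def naive_secure(trace):
    return not (locations_in(trace) & HIGH_PROVENANCE)

# Full trace of the example: cond(l1, true, l2 + l3)
full = ("cond", "l1", True, ("+", "l2", "l3"))
# Eliding l2: cond(l1, true, ε + l3)
partial = ("cond", "l1", True, ("+", EPSILON, "l3"))
```

The full trace is rejected and the elided trace is accepted, even though the elided trace still betrays which branch ran; the next slide's definition closes exactly this gap.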

Provenance security

T satisfies provenance security for execution ⟨l1 = v1, …, ln = vn ; c⟩ ⇒ v if:
  ⟨l1 = v1, …, ln = vn ; c⟩ ⇒ v with trace T, and
  for any high-provenance location li, there is an execution ⟨l1 = w1, …, ln = wn ; c⟩ ⇒ v such that
  - if lj is low security then vj = wj,
  - ⟨l1 = w1, …, ln = wn ; c⟩ ⇒ v with trace T as well (to the user, it looks the same), and
  - li is involved in ⟨l1 = v1, …, ln = vn ; c⟩ ⇒ v iff li is not involved in ⟨l1 = w1, …, ln = wn ; c⟩ ⇒ v (it looks the same, but li's involvement is flipped).
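For a finite value domain, this definition can be checked by brute force: for each high-provenance location, search for an alternative input store that agrees on low-security inputs, produces the same value and the same partial trace, but flips that location's involvement. The sketch below hard-codes the two-branch example from the previous slides; the encodings, policy tables, and search domain are illustrative:

```python
# Brute-force check of the provenance-security definition on the example
# c = "if (l1) then l2 + l3 else l4 + l5".

from itertools import product

EPSILON = "ε"
LOCS = ["l1", "l2", "l3", "l4", "l5"]
DATA = {"l1": "H", "l2": "H", "l3": "H", "l4": "H", "l5": "H"}
PROV = {"l1": "L", "l2": "H", "l3": "L", "l4": "H", "l5": "L"}

def run(inputs):
    """Evaluate the example program; return (value, full_trace, involved)."""
    if inputs["l1"]:
        return (inputs["l2"] + inputs["l3"],
                ("cond", "l1", True, ("+", "l2", "l3")),
                {"l1", "l2", "l3"})
    return (inputs["l4"] + inputs["l5"],
            ("cond", "l1", False, ("+", "l4", "l5")),
            {"l1", "l4", "l5"})

def matches(partial, full):
    """A partial trace agrees with a full trace except at elisions."""
    if partial == EPSILON:
        return True
    if isinstance(partial, tuple) and isinstance(full, tuple):
        return len(partial) == len(full) and all(map(matches, partial, full))
    return partial == full

def provenance_secure(inputs, partial, domain=range(6)):
    value, full, involved = run(inputs)
    if not matches(partial, full):
        return False
    for li in (l for l in LOCS if PROV[l] == "H"):
        found = False
        for ws in product(domain, repeat=len(LOCS)):
            w = dict(zip(LOCS, ws))
            # alternative store must agree on low-security inputs
            if any(DATA[l] == "L" and w[l] != inputs[l] for l in LOCS):
                continue
            v2, _, inv2 = run(w)
            if (v2 == value and matches(partial, run(w)[1])
                    and (li in involved) != (li in inv2)):
                found = True
                break
        if not found:
            return False
    return True

inputs = {"l1": 1, "l2": 2, "l3": 3, "l4": 0, "l5": 0}   # evaluates to 5
partial_elided = ("cond", "l1", True, ("+", EPSILON, "l3"))
```

On this example the check rejects cond(l1, true, ε + l3), since every store matching that trace takes the true branch and so involves l2, but accepts the fully elided trace ε, where a false-branch store with l4 + l5 = 5 flips the involvement of both l2 and l4.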
