Security and Cooperation in Wireless Networks
Additional Problems

edited by Levente Buttyán and Jean-Pierre Hubaux

July 2009, Lausanne

Preface

The problems hereafter have been generated by students participating in the course Security and Cooperation in Wireless Networks. The course is part of the doctoral school of EPFL in information and communication sciences (http://phd.epfl.ch/edic) and its URL is http://secowinetcourse.epfl.ch/. We are particularly grateful to students Joppe Bos, Zarko Milosevic, Seyyd Hasan Mirjalili, and Onur Özen, whose contributions were convincing enough to be included in the present document.

Problems

Problem 1

In self-organized mobile networks, nodes need to be able to generate their own addresses and to verify those of others. One technique to solve this problem is to use self-certifying addresses, which allow hosts and domains to prove that they have the address they claim to have without relying on any global trusted authority. The notion of a self-certifying name is straightforward: the name of the object is the public key (or, for convenience, the hash of the public key) that corresponds to that object (similar to the concept of CGA as described in Chapter 4 of the book).

(a) Compare the efficiency/security trade-off of address generation with hashing the public key and without hashing (i.e., using the public key itself). What are the advantages and disadvantages of both techniques?

(b) Recall the basic security properties of a cryptographic hash function. What are the computational complexities of the brute-force attacks aiming to defeat those properties?

Consider CGA (described in Chapter 4 of the book) without hash extension (i.e., when parameter sec = 0), where the addresses are generated by hashing only the public key, subnet prefix, and collision count.

(c) How do the generic attacks on the underlying hash function relate to CGA without hash extension? What are the computational complexities of these generic attacks on CGA without hash extension?

Now consider CGA with hash extension:

(d) Calculate the number of SHA-1 evaluations needed to generate an IPv6 address using the value of sec as a parameter. What is the security/efficiency trade-off of generating an IPv6 address with increasing values of sec?

(e) What are the computational complexities of the generic attacks on CGA with hash extension?

Problem 2

Observe that in CGA (described in Chapter 4 of the book) the subnet prefix is not used in the computation of Hash2.

(a) How can this observation be used to perform an attack? (Hint: Use a time-memory trade-off.)

(b) What is the overall complexity of this attack (required storage, time)?

(c) Does including the subnet prefix in the computation of Hash2 prevent this attack? What is the disadvantage of using the subnet prefix here?
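To make the mechanism behind Problems 1 and 2 concrete, the following is a minimal Python sketch of a CGA-like address generation routine with hash extension. It is a simplification and not the exact RFC 3972 encoding: the field widths, the placement of sec in the interface identifier, and the function names are assumptions made for illustration. Note that Hash2 covers only the modifier and the public key, while Hash1 additionally covers the subnet prefix and the collision count.

import hashlib
import os

def hash2_ok(modifier: bytes, pubkey: bytes, sec: int) -> bool:
    # Hash2 is computed over the modifier and the public key only (no subnet
    # prefix); its leftmost 16*sec bits must be zero. For sec = 0 any value passes.
    digest = hashlib.sha1(modifier + pubkey).digest()
    return int.from_bytes(digest, "big") >> (160 - 16 * sec) == 0

def generate_interface_id(pubkey: bytes, prefix: bytes, sec: int,
                          collision_count: int = 0) -> bytes:
    # Brute-force the modifier until Hash2 has 16*sec leading zero bits;
    # on average this costs roughly 2^(16*sec) SHA-1 evaluations.
    modifier = int.from_bytes(os.urandom(16), "big")
    while not hash2_ok(modifier.to_bytes(16, "big"), pubkey, sec):
        modifier = (modifier + 1) % (1 << 128)
    # Hash1 binds the address to the subnet prefix and the collision count as well.
    h1 = hashlib.sha1(modifier.to_bytes(16, "big") + prefix
                      + bytes([collision_count]) + pubkey).digest()
    iid = bytearray(h1[:8])                 # leftmost 64 bits form the interface identifier
    iid[0] = (iid[0] & 0x1F) | (sec << 5)   # encode sec in the three leftmost bits (simplified)
    return bytes(iid)

# Example (sec = 1 already requires about 2^16 SHA-1 calls):
# iid = generate_interface_id(b"example public key", b"\x20\x01\x0d\xb8" + b"\x00" * 4, sec=1)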

Problem 3

Let us consider the following trust estimation engine for a server. The basic idea is to rate users after the completion of transactions in order to derive a trust score, which can assist the system in deciding whether or not to transact with that user in the future. In effect, the system attempts to measure the trustworthiness of a user. Suppose the server's rating of each transaction is binary, i.e., positive or negative. Posterior probabilities of binary events can be represented by the Beta distribution.¹

We can interpret trust as the probability expectation with which positive behavior will happen in the future. The probability expectation value of the Beta distribution is given as

E(p) = α / (α + β)

after observing α − 1 independent events with probability p and β − 1 events with probability 1 − p, if the prior distribution of p was uniform. Let the Trust Score of user i, denoted by Θ_i, be equal to E(p). Furthermore, let r be the observed number of positive outcomes and s be the observed number of negative outcomes.

(a) What is the Trust Score of user i after T transactions? Assume that at the beginning, when the server does not have any experience with the user (i.e., r = s = 0), Θ_i(0) = 1/2, i.e., a neutral opinion about the user. Note that 0 < Θ_i(T) < 1 for all T, and that Θ_i(T) ≈ 0 means distrust while Θ_i(T) ≈ 1 means trust.

(b) Old behavior may not always be relevant for the current trust score, because the user may change its behavior over time. What is needed is a model that gives less weight to old behavior and more weight to recent behavior, i.e., one that gradually forgets old behavior. Introduce a forgetting factor λ ∈ [0, 1] into your equation which can be adjusted according to the expected rapidity of change in the observed user. λ = 1 means that nothing is forgotten; the other extreme, λ = 0, means that only the last rating is counted and all earlier ones are completely forgotten. Here the order in which the ratings were given is important.

(c) The equations in Parts (a) and (b) can be written in a recursive way, i.e., Θ_i(t) = f(Θ_i(t − 1)). If you have written the equations in a non-recursive way, the disadvantage is that all ratings given by the system must be kept. This can be avoided by transforming your equation into a recursive one. Translate your equation in Part (b) into a recursive function.

¹ In probability theory and statistics, the Beta distribution is a family of continuous probability distributions defined on the interval [0, 1], parameterized by two positive shape parameters, typically denoted by α and β. The Beta distribution is the conjugate prior of the binomial distribution.
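As a quick numerical illustration of the expectation formula above (a sketch only; the function name is made up, and the closed form asked for in Part (a) is deliberately not coded here):

def beta_expectation(alpha: float, beta: float) -> float:
    # Expectation E(p) of a Beta(alpha, beta) distribution, as given above.
    return alpha / (alpha + beta)

# With a uniform prior (alpha = beta = 1), i.e. before any transaction has been
# observed, the score is the neutral value 1/2 stated in Part (a):
print(beta_expectation(1, 1))    # 0.5

# Each observed positive or negative outcome raises the corresponding
# pseudo-count by one; Part (a) asks how r and s enter the score after
# T = r + s transactions.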

Problem 4

In Sections 8.3.1 and 8.3.2 of the book, it is shown that coordinated changing of pseudonyms inside mix zones is one possible solution for providing location privacy. This solution assumes that all nodes change pseudonyms inside mix zones (i.e., nodes are always cooperative), which may not be a realistic assumption, as changing a pseudonym has a cost (obtaining the new pseudonym, routing overhead due to changing the pseudonym, etc.). The goal of this exercise is to model, using game theory (see Appendix B), the pseudonym changing approach for achieving location privacy under the assumption that nodes are not always cooperative (rather, the nodes are rational).

(a) Define a strategic-form game that represents the pseudonym changing approach (let us call it the pseudonym changing game), assuming that

• two players meet in a mix zone and engage in the game;
• the players have two possible strategies: C - changing the pseudonym (cooperating) and D - no pseudonym change (defecting);
• the achieved level of privacy L is equal to log2(n), where n is the number of players that changed their pseudonym (i.e., played C); if n = 0, the achieved level of privacy is equal to 0;
• the cost of changing the pseudonym is γ; and
• the goal of each player is to maximize its utility (the level of its privacy).

(b) Identify the Nash equilibria (NE). What is the Pareto-optimal NE strategy profile?

(c) Let us modify the pseudonym changing game such that player P2 is malicious, i.e., the goal of player P2 is to minimize the utility of the rational player P1. The gain G(P2) of P2 is defined as G(P2) = 1 − L(P1). The cost of changing the pseudonym is γ for both players. The goal of each player is to maximize its utility, defined as the difference between the obtained gain and the incurred cost. Give the strategic-form representation of this game and identify the Nash equilibria.

(d) Let us now assume that the players can be malicious with some predefined probability q. Furthermore, let us assume that the players make their moves sequentially (i.e., the game is dynamic, see Appendix B). Player P1 moves first and then player P2 moves. The advantage of P2 is that it can observe the move of player P1. Identify the Nash equilibria in this game.

Problem 5

In the improved anonymous routing protocol described in Section 8.4 of the book, we introduced a counter c_SD whose value is synchronously maintained by the source and the destination. Does this protocol ensure forward secrecy? If so, why? If not, could the protocol be modified to ensure it?

Problem 6

This problem is related to the ElGamal asymmetric-key encryption scheme.

(a) Assume that Alice and Bob use the ElGamal asymmetric-key encryption scheme without the use of certificates, i.e., without ensuring the authenticity of the public keys. Think of a way for Eve to successfully read, and possibly modify, messages going from Alice to Bob without either of them noticing.

(b) Show that the ElGamal scheme is unconditionally malleable, and hence not secure under a chosen-ciphertext attack; i.e., given an encryption (R, C) of some (possibly unknown) message m, construct a valid encryption (R′, C′) ≠ (R, C) of some other message m′ ≠ m.

(c) Show that the version of ElGamal presented in Appendix A of the book does not have the IND-CPA property (indistinguishability under chosen-plaintext attack). This means that a challenger can freely choose two messages m0 and m1, challenge someone to encrypt one of them, receive the encrypted message back, and always tell which of the two messages was encrypted. (Hint: What is the order of Z_p^*?)

(d) Find a way to solve the problem stated in Part (c).
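For reference when working on Problem 6, here is a toy sketch of textbook ElGamal over Z_p^*, in the spirit of the scheme in Appendix A. The tiny parameters and the function names encrypt/decrypt are illustrative assumptions only; nothing here is secure, and the attacks asked for in Parts (a) to (c) are not shown.

import random

p = 467                           # tiny prime, for illustration only
g = 2                             # assumed element of Z_p^* used as the base
x = random.randrange(1, p - 1)    # Bob's private key
y = pow(g, x, p)                  # Bob's public key, y = g^x mod p

def encrypt(m: int) -> tuple[int, int]:
    # Ciphertext (R, C) with R = g^k mod p and C = m * y^k mod p,
    # using a fresh ephemeral exponent k for every message.
    k = random.randrange(1, p - 1)
    return pow(g, k, p), (m * pow(y, k, p)) % p

def decrypt(R: int, C: int) -> int:
    # m = C * R^(-x) mod p; the inverse is computed as R^(p-1-x) by Fermat's little theorem.
    return (C * pow(R, p - 1 - x, p)) % p

m = 123
R, C = encrypt(m)
assert decrypt(R, C) == m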
