
I-Complexity and Discrete Derivative of Logarithms: A Group-Theoretic Explanation (PowerPoint PPT Presentation)



1. Title page
   I-Complexity and Discrete Derivative of Logarithms: A Group-Theoretic Explanation
   Vladik Kreinovich and Jaime Nava
   Department of Computer Science, University of Texas at El Paso
   500 W. University, El Paso, TX 79968, USA
   Emails: vladik@utep.edu, jenava@miners.utep.edu

2. Kolmogorov Complexity
   • The best way to describe the complexity of a given string s is to find its Kolmogorov complexity K(s).
   • K(s) is the shortest length of a program that computes s.
   • For example, a sequence is random if and only if its Kolmogorov complexity is close to its length.
   • We can check how close two DNA sequences s and s′ are by comparing K(ss′) with K(s) + K(s′) (a rough sketch of this test is given below):
     – if they are unrelated, the only way to generate ss′ is to generate s and then generate s′, so K(ss′) ≈ K(s) + K(s′);
     – if they are related, we have K(ss′) ≪ K(s) + K(s′).
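The test above cannot be run with K itself, since K is not computable (as the next slide explains). The following is a minimal sketch of the idea, using zlib's compressed length as a crude stand-in for K; the strings, the 0.9 threshold, and the helper names are illustrative choices, not from the slides.

```python
import os
import zlib

def k_hat(s: bytes) -> int:
    """Crude stand-in for K(s): length of the losslessly compressed string
    (this is the compression-based approximation discussed on the next slide)."""
    return len(zlib.compress(s, 9))

def looks_related(s: bytes, t: bytes, ratio: float = 0.9) -> bool:
    """Slide's test: for related strings, K(st) << K(s) + K(t).
    The 0.9 threshold is an arbitrary illustrative cutoff."""
    return k_hat(s + t) < ratio * (k_hat(s) + k_hat(t))

s = os.urandom(1000)              # a "complex" (incompressible) string
related = bytearray(s)
related[100] ^= 0xFF              # related: differs from s in a single byte
unrelated = os.urandom(1000)      # an independent random string

print(looks_related(s, bytes(related)))    # expected: True
print(looks_related(s, unrelated))         # expected: False
```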

3. Need for Approximate Complexity
   • The big problem is that the Kolmogorov complexity is, in general, not algorithmically computable.
   • Thus, it is desirable to come up with computable approximations.
   • At present, most algorithms for approximating K(s):
     – use some lossless compression technique to compress s, and
     – take the length K̂(s) of the compression as the desired approximation (illustrated below).
   • However, this approximation has limitations: for example,
     – in contrast to K(s), where a one-bit change in s cannot change K(s) much,
     – a small change in s can lead to a drastic change in K̂(s).
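A minimal illustration of this compression-based approximation, assuming Python's standard zlib module as the lossless compressor (any lossless compressor would do); the example strings are illustrative. The instability under small changes mentioned above is a limitation of such approximations in general and is not demonstrated here.

```python
import os
import zlib

def k_hat(s: bytes) -> int:
    """Compression-based approximation of K(s): compress s losslessly
    and take the length of the result, as described on the slide."""
    return len(zlib.compress(s, 9))

samples = {
    "constant": b"a" * 1000,            # very regular string
    "periodic": b"abracadabra" * 91,    # regular, but with somewhat more structure
    "random":   os.urandom(1000),       # incompressible with overwhelming probability
}

for name, s in samples.items():
    print(f"{name:9s} length={len(s):5d} k_hat={k_hat(s):5d}")
# Expected pattern: k_hat is small for the regular strings and close to the string's
# length for the random one, mirroring slide 2's criterion that a random string has
# complexity close to its length.
```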

4. I-Complexity
   • Limitation of K̂(s): a small change in s = (s_1 s_2 ... s_n) can lead to a drastic change in K̂(s).
   • To overcome this limitation, V. Becher and P. A. Heiber proposed the following new notion of I-complexity.
   • For each position i, we find the length B_s[i] of the largest repeated substring within s_1 ... s_i.
   • For example, for aaaab, the corresponding values of B_s[i] are 01233.
   • We then define I(s) = Σ_{i=1..n} f(B_s[i]), for an appropriate decreasing function f(x).
   • Specifically, it turned out that the discrete derivative of the logarithm works well: f(x) = dlog(x + 1), where dlog(x) = log(x + 1) − log(x).
   • (A naive implementation of B_s[i] and I(s) is sketched below.)
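A minimal, naive sketch of this definition, not the authors' algorithm: it recomputes the longest repeated substring for every prefix (far from the linear time mentioned on the next slide) and uses base-2 logarithms as an illustrative convention.

```python
from math import log2

def dlog(x: float) -> float:
    """Discrete derivative of the logarithm: dlog(x) = log(x + 1) - log(x)."""
    return log2(x + 1) - log2(x)

def longest_repeat(prefix: str) -> int:
    """Length of the longest substring occurring at least twice in `prefix`
    (occurrences may overlap).  Naive search, for illustration only."""
    for length in range(len(prefix) - 1, 0, -1):
        seen = set()
        for start in range(len(prefix) - length + 1):
            piece = prefix[start:start + length]
            if piece in seen:
                return length
            seen.add(piece)
    return 0

def b_values(s: str) -> list[int]:
    """B_s[i] for i = 1..n: longest repeated substring within s_1 ... s_i."""
    return [longest_repeat(s[:i]) for i in range(1, len(s) + 1)]

def i_complexity(s: str) -> float:
    """I(s) = sum_{i=1..n} f(B_s[i]) with f(x) = dlog(x + 1), as on the slide."""
    return sum(dlog(b + 1) for b in b_values(s))

print(b_values("aaaab"))     # [0, 1, 2, 3, 3] -- the "01233" from the slide
print(i_complexity("aaaab"))
```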

5. Good Properties of I-Complexity
   • Reminder: I(s) = Σ_{i=1..n} f(B_s[i]), where:
     – B_s[i] is the length of the largest repeated substring within s_1 ... s_i, and
     – f(x) = dlog(x + 1), with dlog(x) = log(x + 1) − log(x).
   • Similarly to K(s) (spot-checked numerically below):
     – If s starts s′ (i.e., s is a prefix of s′), then I(s) ≤ I(s′).
     – We have I(0s) ≈ I(s) and I(1s) ≈ I(s).
     – We have I(ss′) ≤ I(s) + I(s′).
     – Most strings have high I-complexity.
   • In contrast to K(s): I-complexity can be computed in linear time.
   • A natural question: why this function f(x)?
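Continuing the previous sketch (it assumes the `i_complexity` function defined there is in scope), a quick numerical spot-check of the prefix and subadditivity properties; the random binary strings are an illustrative choice.

```python
import random

random.seed(0)
for _ in range(100):
    s = "".join(random.choice("01") for _ in range(random.randint(1, 30)))
    t = "".join(random.choice("01") for _ in range(random.randint(1, 30)))
    # If s starts s', then I(s) <= I(s'):
    assert i_complexity(s) <= i_complexity(s + t) + 1e-9
    # Subadditivity: I(ss') <= I(s) + I(s'):
    assert i_complexity(s + t) <= i_complexity(s) + i_complexity(t) + 1e-9
print("spot-checks passed")
```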

6. Towards Precise Formulation of the Problem
   • We view the desired function f(x) as a discrete analogue of an appropriate continuous function F(x):
       f(x) = ∫_x^{x+1} g(y) dy = F(x + 1) − F(x)
     (a worked instance is given below).
   • Which function F(x) should we choose?
   • In the continuous case, the numerical value of each quantity depends:
     – on the choice of the measuring unit and
     – on the choice of the starting point.
   • By changing them, we get a new value x′ = a·x + b.
   • For length x, the starting point 0 is fixed.
   • So, we only have re-scaling x → x′ = a·x.
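A worked instance of this correspondence (not spelled out on the slide): taking F(x) = ln(x) makes the discrete analogue exactly the discrete derivative of the logarithm, and evaluating it at the shifted argument B_s[i] + 1 gives the weight dlog(B_s[i] + 1) used in the I-complexity definition.

```latex
% With F(x) = \ln x, the discrete analogue is
\[
  f(x) \;=\; F(x+1) - F(x) \;=\; \ln(x+1) - \ln(x) \;=\; \operatorname{dlog}(x),
\]
% i.e. the discrete derivative of the logarithm; applied to B_s[i] + 1 this is the
% weight \operatorname{dlog}(B_s[i] + 1) from slide 4.
```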

7. Our Result
   • By changing a measuring unit, we get x′ = a·x.
   • When we thus re-scale x, the value y = F(x) changes to y′ = F(a·x).
   • It is reasonable to require that the value y′ represent the same quantity.
   • So, we require that y′ differ from y by a similar re-scaling:
       y′ = F(a·x) = A(a)·F(x) + B(a) for some A(a) and B(a).
   • It turns out that all monotonic solutions of this equation are linearly equivalent to log(x) or to x^α, i.e.:
       F(x) = ã·ln(x) + b̃  or  F(x) = ã·x^α + b̃
     (both families are verified below).
   • So, symmetries do explain the selection of the function F(x) for I-complexity.
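A quick algebraic check, not shown on the slide, that both families indeed satisfy the required equation F(a·x) = A(a)·F(x) + B(a), with the corresponding A(a) and B(a) written out:

```latex
% Logarithmic family, F(x) = \tilde a\,\ln x + \tilde b:
\[
  F(a\cdot x) = \tilde a\,\ln x + \tilde a\,\ln a + \tilde b
              = 1\cdot F(x) + \tilde a\,\ln a,
  \qquad\text{so}\quad A(a)=1,\;\; B(a)=\tilde a\,\ln a.
\]
% Power family, F(x) = \tilde a\,x^{\alpha} + \tilde b:
\[
  F(a\cdot x) = \tilde a\,a^{\alpha}x^{\alpha} + \tilde b
              = a^{\alpha}\,F(x) + \tilde b\,(1-a^{\alpha}),
  \qquad\text{so}\quad A(a)=a^{\alpha},\;\; B(a)=\tilde b\,(1-a^{\alpha}).
\]
```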

8. Proof
   • Reminder: for some monotonic function F(x), for every a, there exist values A(a) and B(a) for which
       F(a·x) = A(a)·F(x) + B(a).
   • Known fact: every monotonic function is almost everywhere differentiable.
   • Let x_0 > 0 be a point where the function F(x) is differentiable.
   • Then, for every x, by taking a = x/x_0, we conclude that F(x) is differentiable at this point x as well.
   • For any x_1 ≠ x_2, we have F(a·x_1) = A(a)·F(x_1) + B(a) and F(a·x_2) = A(a)·F(x_2) + B(a).
   • We get a system of two linear equations with two unknowns A(a) and B(a) (solved explicitly below).
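For completeness, solving this 2×2 system explicitly (this step is implicit on the slide), assuming x_1 and x_2 are chosen so that F(x_1) ≠ F(x_2), which is possible for a non-constant monotonic F:

```latex
\[
  A(a) = \frac{F(a\cdot x_1) - F(a\cdot x_2)}{F(x_1) - F(x_2)},
  \qquad
  B(a) = F(a\cdot x_1) - A(a)\cdot F(x_1).
\]
% Both right-hand sides are built from the differentiable function F,
% which is why A(a) and B(a) are differentiable, as used on the next slide.
```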

9. Proof (cont-d)
   • We get a system of two linear equations with two unknowns A(a) and B(a):
       F(a·x_1) = A(a)·F(x_1) + B(a),
       F(a·x_2) = A(a)·F(x_2) + B(a).
   • Thus, both A(a) and B(a) are linear combinations of the differentiable functions F(a·x_1) and F(a·x_2).
   • Hence, both functions A(a) and B(a) are differentiable.
   • So, F(a·x) = A(a)·F(x) + B(a) for differentiable functions F(x), A(a), and B(a).
   • Differentiating both sides with respect to a, we get
       x·F′(a·x) = A′(a)·F(x) + B′(a).
   • In particular, for a = 1, we get x·(dF/dx) = A·F + B, where A = A′(1) and B = B′(1).

10. Proof (final part)
   • Reminder: x·(dF/dx) = A·F + B.
   • So, dF/(A·F + B) = dx/x; now, we can integrate both sides (this integration is cross-checked symbolically below).
   • When A = 0: we get F(x)/B = ln(x) + C, so F(x) = B·ln(x) + B·C.
   • When A ≠ 0: for F̃ = F + B/A, we get dF̃/(A·F̃) = dx/x, so (1/A)·ln(F̃(x)) = ln(x) + C, and ln(F̃(x)) = A·ln(x) + A·C.
   • Thus, F̃(x) = C_1·x^A, where C_1 = exp(A·C).
   • Hence, F(x) = F̃(x) − B/A = C_1·x^A − B/A.
   • The theorem is proven.
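As a cross-check of this integration (a sketch, not part of the slides; it assumes the SymPy library is available), one can let a computer-algebra system solve the same equation x·F′(x) = A·F(x) + B:

```python
import sympy as sp

x, A, B = sp.symbols("x A B", positive=True)
F = sp.Function("F")

# Case A = 0:  x*F'(x) = B  should give  F(x) = B*ln(x) + const.
print(sp.dsolve(sp.Eq(x * F(x).diff(x), B), F(x)))

# Case A != 0:  x*F'(x) = A*F(x) + B  should give  F(x) = C1*x**A - B/A
# (up to the equivalent form SymPy chooses to print).
print(sp.dsolve(sp.Eq(x * F(x).diff(x), A * F(x) + B), F(x)))

# Direct verification of the claimed closed form F(x) = C1*x**A - B/A:
C1 = sp.Symbol("C1")
F_closed = C1 * x**A - B / A
print(sp.simplify(x * sp.diff(F_closed, x) - (A * F_closed + B)))   # prints 0
```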

11. Acknowledgments
   This work was supported in part:
   • by the National Science Foundation grants HRD-0734825 and DUE-0926721, and
   • by Grant 1 T36 GM078000-01 from the National Institutes of Health.
