  1. Solution of Algebraic Equations by Using Autonomous Computational Methods
  Andrew Pownuk (1), Jose Gonzalez (2)
  (1) Department of Mathematical Sciences, University of Texas at El Paso, El Paso, Texas, ampownuk@utep.edu
  (2) Undergraduate Student at the Department of Electrical and Computer Engineering, University of Texas at El Paso, El Paso, Texas, jhgonzalez5@miners.utep.edu
  25th Joint NMSU/UTEP Workshop on Mathematics, Computer Science, and Computational Sciences

  2. Outline
  1. Differentiation
  2. Algebraic Equations
  3. Machine Learning
  4. Conclusions and Generalizations

  3. Differentiation - Input Information
  (f(x) + g(x))′ = f′(x) + g′(x)
  (f(x) − g(x))′ = f′(x) − g′(x)
  (f(x) g(x))′ = f′(x) g(x) + f(x) g′(x)
  (f(x) / g(x))′ = (f′(x) g(x) − f(x) g′(x)) / g²(x)
  (f(g(x)))′ = f′(g(x)) g′(x)
  together with the arithmetic operations.
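The rules above can be sketched as a small recursive differentiator. This is a minimal illustration written for this text, not the authors' system: expressions are nested tuples such as `('*', 'x', 'x')`, and `d()` applies exactly the rules listed above by structural recursion.

```python
def d(e):
    """Differentiate a tuple-based expression with respect to 'x'."""
    if e == 'x':
        return 1
    if isinstance(e, (int, float)):
        return 0
    op, a, b = e
    if op == '+':   # (f + g)' = f' + g'
        return ('+', d(a), d(b))
    if op == '-':   # (f - g)' = f' - g'
        return ('-', d(a), d(b))
    if op == '*':   # product rule: (f g)' = f' g + f g'
        return ('+', ('*', d(a), b), ('*', a, d(b)))
    if op == '/':   # quotient rule: (f/g)' = (f' g - f g') / g^2
        return ('/', ('-', ('*', d(a), b), ('*', a, d(b))), ('*', b, b))
    raise ValueError(f"unknown operator: {op}")

print(d(('*', 'x', 'x')))  # ('+', ('*', 1, 'x'), ('*', 'x', 1))
```

Applying `d` to `('*', 'x', 'x')` (i.e. x·x) yields the unsimplified derivative 1·x + x·1, mirroring how the system produces a correct but not yet simplified result.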

  4. Differentiation - Sample Application
  Product rule (input information):
  (f ∗ g)′ = f′ ∗ g + f ∗ g′
  After calculations (new theorem created automatically):
  ((f ∗ g) ∗ h)′ = (f ∗ g)′ ∗ h + (f ∗ g) ∗ h′
  ((f ∗ g) ∗ h)′ = (f′ ∗ g + f ∗ g′) ∗ h + (f ∗ g) ∗ h′
  (f ∗ g ∗ h)′ = (f′ ∗ g) ∗ h + (f ∗ g′) ∗ h + (f ∗ g) ∗ h′
  (f ∗ g ∗ h)′ = f′ ∗ g ∗ h + f ∗ g′ ∗ h + f ∗ g ∗ h′
  The new theorem can be used in exactly the same way as the original theorem.
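The automatically derived three-factor theorem can be checked independently. Here is a sketch using SymPy (an assumption for illustration; the talk's autonomous system is its own implementation, not SymPy):

```python
# Verify (f*g*h)' = f'*g*h + f*g'*h + f*g*h' symbolically.
import sympy as sp

x = sp.Symbol('x')
f, g, h = [sp.Function(name)(x) for name in ('f', 'g', 'h')]

lhs = sp.diff(f * g * h, x)
rhs = sp.diff(f, x) * g * h + f * sp.diff(g, x) * h + f * g * sp.diff(h, x)

assert sp.simplify(lhs - rhs) == 0  # the derived theorem holds
```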

  5. Differentiation (LaTeX source)

  6. Differentiation (step 1)

  7. Differentiation (step 2)

  8. Differentiation (step 3)

  9. Differentiation (step 4)

  10. Differentiation (step 5)

  11. Group Axioms
  A group is a set G together with a binary operation "·" (called the group law of G) that combines any two elements a and b to form another element, denoted a · b. To qualify as a group, the set and operation (G, ·) must satisfy four requirements known as the group axioms:
  1. Closure: for all a, b ∈ G, the result of the operation, a · b, is also in G.
  2. Associativity: for all a, b, c ∈ G, (a · b) · c = a · (b · c).
  3. Identity element: there exists an element e ∈ G such that, for every element a ∈ G, e · a = a · e = a. Such an element is unique.
  4. Inverse element: for each a ∈ G, there exists an element b ∈ G, commonly denoted a⁻¹ (or −a, if the operation is denoted +), such that a · b = b · a = e, where e is the identity element.
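For a finite set, the four axioms can be checked by brute force. This is a hedged illustration added here (the names `is_group`, `G`, `op` are chosen for this sketch, not taken from the talk):

```python
def is_group(G, op):
    """Check the four group axioms for a finite set G under operation op."""
    # Closure: a . b stays in G
    closure = all(op(a, b) in G for a in G for b in G)
    # Associativity: (a . b) . c == a . (b . c)
    assoc = all(op(op(a, b), c) == op(a, op(b, c))
                for a in G for b in G for c in G)
    # Identity: some e with e . a == a . e == a for all a
    e = next((c for c in G if all(op(c, a) == a == op(a, c) for a in G)), None)
    # Inverses: every a has b with a . b == b . a == e
    inverses = e is not None and all(
        any(op(a, b) == e == op(b, a) for b in G) for a in G)
    return closure and assoc and e is not None and inverses

Z5 = set(range(5))
print(is_group(Z5, lambda a, b: (a + b) % 5))  # True: (Z_5, + mod 5) is a group
print(is_group(Z5, lambda a, b: (a * b) % 5))  # False: 0 has no multiplicative inverse
```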

  12. Expression Evaluation

  13. Expression Evaluation
  Expressions can be evaluated without specifying any explicit evaluation method.
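One way to read this claim: evaluation can follow the structure of the expression itself, with the operators supplied as data rather than hard-coded control flow. A minimal sketch under that interpretation (not the talk's implementation):

```python
# Evaluate a tuple-based expression tree by structural recursion;
# the operator table OPS is the only "method" the evaluator knows.
import operator

OPS = {'+': operator.add, '-': operator.sub,
       '*': operator.mul, '/': operator.truediv}

def evaluate(e, env):
    if isinstance(e, str):            # variable: look up its value
        return env[e]
    if isinstance(e, (int, float)):   # constant
        return e
    op, a, b = e
    return OPS[op](evaluate(a, env), evaluate(b, env))

# x + (2 + x) with x = 3
print(evaluate(('+', 'x', ('+', 2, 'x')), {'x': 3}))  # 8
```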

  14. Computational Graph
  COCONUT Project: COntinuous CONstraints - Updating the Technology.
  https://www.mat.univie.ac.at/~neum/glopt/coconut/

  15. Solution of Algebraic Equations
  step 1: x = (x + (2 + x))
  step 3: x = (x + 2 + x)
  step 4: x = (x + 2 + x)
  step 5: x = x + 2 + x
  step 6: x = x + 2 + x
  step 7: x = 2 + 2 ∗ x
  step 9: x + (−1) ∗ x + (−1) ∗ x = 2

  16. Solution of Algebraic Equations
  step 10: x + (−2) ∗ x = 2
  step 11: (−1) ∗ x = 2
  step 12: x = (2 / (−1))
  step 13: x = (−2)
  step 14: x = ((−2) / 1)
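The worked example can be cross-checked with an off-the-shelf solver. This is a SymPy sketch (an assumption for illustration; the talk's autonomous solver derives the steps itself rather than calling a library):

```python
# Solve x = x + (2 + x), the equation from the step-by-step derivation above.
import sympy as sp

x = sp.Symbol('x')
solution = sp.solve(sp.Eq(x, x + (2 + x)), x)
print(solution)  # [-2], matching step 13 above
```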

  17. Automatically Generated LaTeX Report

  18. Automatically Generated PDF Reports
  Hundreds, thousands, or an unlimited number of examples:
  http://andrew.pownuk.com/research/AlgebraicEquations/

  19. Simplification of the Solution

  20. Simplification of the Solution
  Finding the optimal form of the computational process is one of the main goals of this project.
  The quality of the simplification depends on the amount of knowledge available in the system and on processing power.
  For small problems the simplification can be found uniquely. For more complex problems, simplification of the expressions never stops.
  It is possible to use new computational results in future calculations (self-adaptivity). In this way the system can generate knowledge in an autonomous way.
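The point that the "optimal" form depends on available knowledge can be seen even in a library setting. A hedged SymPy sketch (SymPy is an assumption here, not the talk's system): the same expression has several equivalent forms, and which one is best depends on what the consumer of the result needs.

```python
# One expression, several equivalent simplified forms.
import sympy as sp

x = sp.Symbol('x')
e = (x**2 - 1) / (x - 1)

print(sp.simplify(e))        # x + 1        (canceled form)
print(sp.factor(x**2 - 1))   # (x - 1)*(x + 1)  (factored form)
print(sp.expand((x + 1)**2)) # x**2 + 2*x + 1   (expanded form)
```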

  21. Finite Number of Steps
  Limits: lim_{n→∞} 1/n = 0.
  Convergence of infinite series: Σ_{n=1}^{∞} 1/n² converges (p-series with p = 2 > 1).
  If a function f is integrable, then the sequence of Riemann sums Σ_{n=1}^{N} f(xₙ) Δxₙ converges to ∫_a^b f(x) dx (for appropriate partitions).
  The presented methodology can be applied to many scientific theories (mathematics, engineering, chemistry, biology, etc.) with a finite number of steps.
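A finite number of steps already gets arbitrarily close to these limits. A numerical illustration written for this text (the slide's examples, checked with finite partial sums; the known value Σ 1/n² = π²/6 is used as the reference):

```python
# Finite partial sums approximate the infinite limits from the slide.
import math

# lim 1/n = 0: the terms become arbitrarily small.
assert 1.0 / 10**9 < 1e-8

# sum_{n=1}^{N} 1/n^2 approaches pi^2/6 as N grows.
partial = sum(1.0 / n**2 for n in range(1, 100_001))
print(abs(partial - math.pi**2 / 6))  # on the order of 1e-5
```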

  22. What is Special About This Research?
  All calculations are done in a fully autonomous way.
  Why autonomous calculations?

  23. What is Special About This Research?
  All calculations are done in a fully autonomous way.
  Why autonomous calculations? Correctness and Scalability.

  24. What is Special About This Research?
  All calculations are done in a fully autonomous way.
  Why autonomous calculations? Results are NOT biased or subjective.

  25. Classification Problem
  Figure: Classification of pictures

  26. Classification Problem from a Mathematical Point of View
  Example:
  If x₁ is in class A, then f(x₁, W) > 0.
  If x₂ is in class B, then f(x₂, W) < 0.
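A toy instance of this formulation (the names f, W, x₁, x₂ follow the slide; the linear score and the data points are made up for illustration):

```python
# A linear score f(x, W) = W . x whose sign determines the class.
def f(x, W):
    return sum(wi * xi for wi, xi in zip(W, x))

W = (1.0, -1.0)      # weight vector: decision boundary is the line y = x
x1 = (2.0, 0.5)      # a point intended to be in class A
x2 = (0.5, 2.0)      # a point intended to be in class B

print(f(x1, W))  # 1.5  > 0, so class A
print(f(x2, W))  # -1.5 < 0, so class B
```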

  27. Training of the Model
  Least-squares error:
  J(W) = Σ_{i=1}^{n} Σ_{j=1}^{n_O} ( y_j^{(i)} − ŷ_j^{(i)} )²
  Learning process:
  W* = arg min_W J(W) = arg min_W Σ_{i=1}^{n} Σ_{j=1}^{n_O} ( y_j^{(i)} − ŷ_j^{(i)} )²
  Prediction:
  y_j = ŷ_j(x, W*)
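A minimal sketch of this training loop, written for this text: a one-parameter model ŷ = W·x is fit by gradient descent on the least-squares error J(W). The data and learning rate are made up; the slide specifies only the objective, not the optimizer.

```python
# Fit yhat = W * x to data generated by y = 2x by minimizing
# J(W) = sum_i (y_i - W * x_i)^2 with plain gradient descent.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 2.0, 4.0, 6.0]   # y = 2x, so the minimizer is W* = 2

W, lr = 0.0, 0.01
for _ in range(2000):
    # dJ/dW = sum_i 2 * (W * x_i - y_i) * x_i
    grad = sum(2 * (W * x - y) * x for x, y in zip(xs, ys))
    W -= lr * grad

print(W)  # converges to 2.0, the arg-min of J(W)
```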

  28. Predicting Solution Method
  Let x be a given mathematical problem; after training, it is possible to predict the solution step y:
  y = f(x, W)
  Google AI system proves over 1200 mathematical theorems, April 26, 2019.
  Kshitij Bansal, Sarah M. Loos, Markus N. Rabe, Christian Szegedy, and Stewart Wilcox, HOList: An Environment for Machine Learning of Higher-Order Theorem Proving, 24 May 2019, https://arxiv.org/pdf/1904.03241.pdf

  29. Finite Number of Steps? Logic?
  The fundamental difference between if-else and AI is that if-else models are static deterministic environments, while machine learning (ML) algorithms, the primary underpinning of AI, model probabilistic stochastic environments.
  Remark: machine-learning (probabilistic) reasoning uses mathematics that is itself based on the methodology described above.
