

  1. Logics in Machine Learning and Data Mining: Achievements and Open Issues
  Francesca A. Lisi
  University of Bari "Aldo Moro", Department of Computer Science
  Lab of Knowledge Acquisition and Machine Learning (LACAM)
  francesca.lisi@uniba.it
  June 19, 2019

  2. Overview
  1 Introduction
  2 Three cases for Logics in ML/DM
    - Combining rules and ontologies
    - Dealing with imprecision and granularity
    - Modeling and metamodeling
  3 Final remarks
  4 References

  3. Introduction I: Model-free AI vs. Model-based AI
  - The current hype about AI is due to many successful ML applications
  - Deep learning follows the model-free approach: ML tasks become function-fitting problems!
  - Model-based AI ("good old-fashioned AI") needs to construct and use models, with logics and probability as the main tools
  - Is there any ML algorithm following the model-based approach?

  4. Introduction II: Inductive Logic Programming in a nutshell
  - The major logic-based approach to rule learning [Muggleton, 1990]
  - Concept Learning within the LP framework
  - Use of background knowledge (BK)
  - A bunch of techniques for structuring, searching and bounding the hypothesis space [Nienhuys-Cheng and de Wolf, 1997]
  - Two representative ILP algorithms (a covering-loop sketch follows below):
    1 Foil [Quinlan, 1990]
    2 Warmr [Dehaspe and Toivonen, 1999]
  - Several extensions, e.g., towards statistical learning and other probabilistic approaches (see [Riguzzi et al., 2014] for a survey)
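To make the Foil-style control flow concrete, here is a minimal runnable sketch in Python. It is my own illustration on a deliberately simplified propositional encoding (examples as sets of facts, a rule body as a set of conditions); the real Foil learns function-free Horn clauses with variables and a more refined weighted information gain.

```python
import math

def covers(conditions, example):
    # A rule covers an example if all its body conditions hold in it.
    return conditions <= example

def gain(conditions, literal, pos, neg):
    # Simplified Foil-style gain: positives still covered, times the
    # improvement in the log-proportion of positives among covered.
    def stats(cs):
        p = sum(covers(cs, e) for e in pos)
        n = sum(covers(cs, e) for e in neg)
        return p, math.log((p + 1e-9) / (p + n + 1e-9))
    _, before = stats(conditions)
    p_after, after = stats(conditions | {literal})
    return p_after * (after - before)

def learn(pos, neg, literals):
    theory, uncovered = [], list(pos)
    while uncovered:                              # outer covering loop
        conditions = set()
        while any(covers(conditions, e) for e in neg):
            # inner loop: greedily add the highest-gain literal until
            # the rule covers no negative example
            best = max(literals, key=lambda l: gain(conditions, l, uncovered, neg))
            conditions |= {best}
        theory.append(conditions)
        uncovered = [e for e in uncovered if not covers(conditions, e)]
    return theory

# Toy run: which animals fly?
pos = [{"bird", "small"}, {"bird", "feathered"}]
neg = [{"mammal", "small"}]
print(learn(pos, neg, {"bird", "mammal", "small", "feathered"}))  # [{'bird'}]
```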

  5. Three cases for Logics in ML/DM
  1 Combining rules and ontologies
  2 Dealing with imprecision and granularity
  3 Modeling and metamodeling

  6. Combining rules and ontologies I: Logic Programming vs. Description Logics
  1 CWA vs. OWA (see the sketch after this list)
  2 Single vs. multiple models
  3 Negation as failure vs. classical negation
  4 Strong negation vs. classical negation
  5 Treatment of equality: the Unique Names Assumption (UNA) [Reiter, 1980] might not hold in DLs
  6 Existential quantification: recent work on Datalog± [Calì et al., 2009]
  7 Decidability
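The first mismatch is easy to see in code. This toy sketch (my own, not from the talk) contrasts the two query-answering attitudes: under the Closed World Assumption of LP, a fact that is not derivable is false; under the Open World Assumption of DLs, it is merely unknown.

```python
facts = {("citizen_of", "anna", "italy")}

def cwa_holds(query):
    # LP-style answer: anything not among the derivable facts is false.
    return query in facts

def owa_holds(query):
    # DL-style answer: True/False only if entailed either way;
    # here nothing negative is entailed, so absence means "unknown".
    return True if query in facts else "unknown"

q = ("citizen_of", "anna", "france")
print(cwa_holds(q))   # False: not derivable, hence false under CWA
print(owa_holds(q))   # 'unknown': absence of information under OWA
```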

  7. Combining rules and ontologies II: Exploiting the power of combination
  - Combination is more than the sum of the parts
  - Very expressive FOL languages as an outcome
  - Solutions to the semantic mismatch between LP and DLs
  Approaches:
  - Non-hybrid: combination within a homogeneous semantic framework, e.g., description logic programs [Grosof et al., 2003]
  - Hybrid: combination within a heterogeneous semantic framework, e.g., AL-log [Donini et al., 1998]

  8. Combining rules and ontologies III: Learning hybrid rules with ILP

                       | Learning Carin-ALN rules       | Learning AL-log rules             | Learning SHIQ+log rules
                       | [Rouveirol and Ventos, 2000;   | [Lisi, 2008]                      | [Lisi, 2010]
                       |  Kietz, 2003]                  |                                   |
  prior knowledge      | Carin-ALN KB                   | AL-log KB                         | SHIQ+log KB
  ontology language    | ALN                            | ALC                               | SHIQ
  rule language        | HCL                            | Datalog                           | Datalog
  hypothesis language  | non-recursive Carin-ALN rules  | non-recursive AL-log rules        | non-recursive SHIQ+log rules
  target predicate     | Horn predicate                 | Datalog predicate                 | SHIQ/Datalog predicate
  logical setting      | interpretations                | interpretations/entailment        | entailment
  scope of induction   | prediction                     | prediction/description            | prediction/description
  generality order     | generalized subsumption        | generalized subsumption           | generalized subsumption
  coverage test        | Carin query answering          | AL-log query answering            | DL+log∨ query answering
  ref. operators       | n.a.                           | downward                          | downward/upward
  implementation       | unknown                        | yes, see [Lisi, 2011]             | no
  application          | no                             | yes, see [Lisi and Malerba, 2004] | no

  9. Combining rules and ontologies IV: AL-QuIn: general features [Lisi, 2011]
  - Task: multi-level association rule mining
  - Method: levelwise search
  - BK: hybrid (relational DB + ontology)
  - Knowledge representation formalism: AL-log
  - Upgrade from: Warmr

  10. Combining rules and ontologies V: AL-QuIn: problem statement
  Given:
  - a relational data set Π
  - a taxonomic ontology Σ
  - a multi-grained language L = {L_l}_{1≤l≤maxG}
  - a set {minsup_l}_{1≤l≤maxG} of support thresholds
  the problem of frequent pattern discovery in Π at l levels of description granularity w.r.t. Σ, 1 ≤ l ≤ maxG, is to find the set F of all the patterns P ∈ L_l frequent in B = (Π, Σ), namely P's with support s such that (i) s ≥ minsup_l and (ii) all ancestors of P w.r.t. Σ are frequent. A levelwise-search sketch follows below.
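The following runnable sketch (my own illustration, not the AL-QuIn code) shows the levelwise search the slide refers to on an itemset toy case: breadth-first exploration of a pattern space ordered by generality, pruning any candidate that has an infrequent ancestor, in the style of the Apriori/Warmr family.

```python
from itertools import combinations

def support(pattern, dataset):
    # Pattern = frozenset of items; support = fraction of transactions
    # containing it. In AL-QuIn patterns are AL-log queries and support
    # is computed by hybrid query answering over B = (Π, Σ).
    return sum(pattern <= t for t in dataset) / len(dataset)

def levelwise(items, dataset, minsup):
    frequent = []
    level = [frozenset({i}) for i in items]
    while level:
        level = [p for p in level if support(p, dataset) >= minsup]
        frequent += level
        seen = set(level)
        # Candidate generation: join same-size patterns and keep only
        # those whose immediate ancestors are all frequent (prune rule).
        level = list({a | b for a, b in combinations(level, 2)
                      if len(a | b) == len(a) + 1
                      and all(frozenset(c) in seen
                              for c in combinations(a | b, len(a)))})
    return frequent

data = [frozenset("abc"), frozenset("abd"), frozenset("acd")]
print(levelwise("abcd", data, minsup=2/3))
```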

  11. Combining rules and ontologies VI: AL-QuIn: an example
  Q1 = q(X) ← believes(X,Y) & MiddleEastCountry(X)
  Q3 = q(X) ← believes(X,Y) & MiddleEastCountry(X), Religion(Y)
  Q4 = q(X) ← believes(X,Y) & MiddleEastCountry(X), MonotheisticReligion(Y)
  Q5 = q(X) ← believes(X,Y), speaks(X,Z) & MiddleEastCountry(X), MonotheisticReligion(Y), IndoEuropeanLanguage(Z)
  Note how Q4 refines Q3 at a finer granularity level w.r.t. Σ (MonotheisticReligion specializes Religion).

  12. Dealing with imprecision and granularity I: Fuzzy Description Logics
  - Several ways of extending DLs with fuzzy logic [Straccia, 2013]
  - Some ad-hoc reasoners already available (e.g., FuzzyDL [Bobillo and Straccia, 2008])
  - Fuzzy quantifiers in Fuzzy DLs [Sanchez and Tettamanzi, 2006]
  - Proposal of Fuzzy OWL 2 [Bobillo and Straccia, 2010]

  13. Dealing with imprecision and granularity II: Fuzzy EL(D) [Straccia, 2005]
  - Complex concepts are built according to the following syntactic rules:
    C → ⊤ | ⊥ | A | C1 ⊓ C2 | ∃R.C | ∃T.d
    where d can be one of the membership functions of fuzzy sets
  - ⊓ and ∃ are interpreted as truth combination functions
  - Concepts are interpreted as fuzzy sets
  - Axioms are graded, i.e. have a truth degree α (if omitted, α = 1); e.g., I satisfies an axiom ⟨a : C, α⟩ if C^I(a^I) ≥ α
  - The best entailment degree of an axiom τ w.r.t. K is defined as
    bed(K, τ) = sup{α | K ⊨ ⟨τ, α⟩}.   (1)
  - For a crisp axiom τ, we also write K ⊨+ τ iff bed(K, τ) > 0
  (A small numeric sketch of these truth combination functions follows below.)
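Here is a small numeric sketch of this semantics (my own toy numbers, under the Gödel choice of truth combination functions, which is one of several options): ⊓ is interpreted by a t-norm (min) and ∃ by a sup over role fillers, while trapezoidal() plays the role of a concrete-domain membership function d in ∃T.d.

```python
domain = ["h1", "a1", "a2"]
Attraction = {"a1": 1.0, "a2": 1.0, "h1": 0.0}    # fuzzy atomic concept
closeTo = {("h1", "a1"): 0.8, ("h1", "a2"): 0.3}  # fuzzy role

def trapezoidal(a, b, c, d):
    # Membership function usable as the concrete domain d in ∃T.d.
    def mu(x):
        if x < a or x > d: return 0.0
        if b <= x <= c: return 1.0
        return (x - a) / (b - a) if x < b else (d - x) / (d - c)
    return mu

def conj(x, y):
    # Gödel t-norm interpreting C1 ⊓ C2.
    return min(x, y)

def exists(role, concept, ind):
    # (∃R.C)^I(ind) = sup_y  R^I(ind, y) ⊗ C^I(y)
    return max((conj(role.get((ind, y), 0.0), concept.get(y, 0.0))
                for y in domain), default=0.0)

print(exists(closeTo, Attraction, "h1"))  # 0.8 = max(min(0.8,1), min(0.3,1))
low = trapezoidal(0, 0, 200, 500)         # "low" distance in metres (made up)
print(low(300))                           # ≈ 0.67
```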

  14. Dealing with imprecision and granularity III: Fuzzy DL Learning
  - [Konstantopoulos and Charalambidis, 2010] propose an ad-hoc translation of fuzzy Łukasiewicz ALC DL constructs into LP in order to apply a conventional ILP method for rule learning. Unsound method.
  - [Iglesias and Lehmann, 2011] propose to interface CELOE with the fuzzyDL reasoner. Incomparable method.
  - [Lisi and Straccia, 2013] present a method for learning fuzzy EL GCI axioms from fuzzy DL-Lite KBs. Unimplemented method.

  15. Dealing with imprecision and granularity IV: Foil-DL: general features [Lisi and Straccia, 2014]
  - Task: classification rule mining
  - Method: sequential covering
  - BK: DL KB
  - Hypothesis description language: fuzzy EL(D)
  - Upgrade from: Foil

  16. Dealing with imprecision and granularity V: Foil-DL: problem statement
  Given:
  - a consistent DL KB K = ⟨T, A⟩ (the background theory);
  - an atomic concept At (the target concept);
  - a set E = E+ ∪ E− of crisp DL concept assertions labelled as either positive or negative examples for At (the training set);
  - a set L_H of fuzzy EL(D) GCIs of the form C ⊑ At (the language of hypotheses), where C is a complex concept
  Find: a set H ⊂ L_H (a hypothesis) such that:
  - Completeness: ∀e ∈ E+, K ∪ H ⊨+ e, and
  - Consistency: ∀e ∈ E−, K ∪ H ⊭+ e.
  (A sketch of checking these two conditions follows below.)
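A minimal sketch of checking the two conditions above: an example is covered iff its best entailment degree, as in Eq. (1), is positive. Here bed is simulated by a lookup table of precomputed degrees; in practice it would be delegated to a fuzzy DL reasoner such as fuzzyDL.

```python
def bed(degrees, example):
    # Best entailment degree of `example` w.r.t. K ∪ H (simulated here).
    return degrees.get(example, 0.0)

def complete(degrees, positives):
    # Completeness: every positive example is covered, i.e. bed > 0.
    return all(bed(degrees, e) > 0 for e in positives)

def consistent(degrees, negatives):
    # Consistency: no negative example is covered.
    return all(bed(degrees, e) == 0 for e in negatives)

degrees = {"hotel1:Good_Hotel": 0.7, "hotel2:Good_Hotel": 0.3}
print(complete(degrees, ["hotel1:Good_Hotel", "hotel2:Good_Hotel"]))  # True
print(consistent(degrees, ["hotel3:Good_Hotel"]))                     # True
```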

  17. Dealing with imprecision and granularity VI: Foil-DL: an example

  Confidence  Axiom
  1.000       Hostel subclass of Good_Hotel
  1.000       hasPrice_veryhigh subclass of Good_Hotel
  0.739       hasDistance some (isDistanceFor some (Bus_Station) and hasValue_low) and hasDistance some (isDistanceFor some (Town_Hall) and hasValue_fair) and hasRank some (Rank) and hasPrice_verylow subclass of Good_Hotel
  0.569       hasPrice_high subclass of Good_Hotel
  0.289       Hotel_3_Stars and hasDistance some (isDistanceFor some (Train_Station) and hasValue_verylow) and hasPrice_fair subclass of Good_Hotel
  0.198       Hotel_4_Stars and hasDistance some (isDistanceFor some (Square) and hasValue_high) and hasRank some (Rank) and hasPrice_fair subclass of Good_Hotel

  (One possible way of computing such confidence degrees is sketched below.)
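One plausible reading of these scores (my assumption, not necessarily the exact Foil-DL definition) is a fuzzified confidence: the mass of positive membership degrees in the body concept C, divided by the number of covered examples, positive or negative.

```python
def confidence(degrees_pos, degrees_neg):
    # degrees_pos / degrees_neg: membership degrees of positive and
    # negative examples in the body concept C (0 = not covered).
    covered = [d for d in degrees_pos + degrees_neg if d > 0]
    if not covered:
        return 0.0
    return sum(d for d in degrees_pos if d > 0) / len(covered)

# Toy numbers: three positives covered to various degrees, one negative.
print(round(confidence([0.9, 0.8, 0.5], [0.3]), 3))   # 0.55
```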

  18. Dealing with imprecision and granularity VII: Information granulation and DLs [Lisi and Mencar, 2017, Lisi and Mencar, 2018]
  Support for both coarser-granularity fuzzy quantified sentences such as "Many hotels have a low distance from attractions" and finer-granularity fuzzy quantified sentences such as "Hotel Verdi has a low distance from many attractions" (see the sketch below)
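As an illustration (my own, using Zadeh's relative sigma-count, one standard way to evaluate fuzzy quantified sentences), the finer-granularity sentence "Hotel Verdi has a low distance from many attractions" can be evaluated by measuring the proportion of attractions at a low distance and filtering it through a fuzzy quantifier "many".

```python
def many(proportion):
    # Fuzzy quantifier: 0 below 0.3, 1 above 0.7, linear in between.
    return min(1.0, max(0.0, (proportion - 0.3) / 0.4))

def low_distance(metres):
    # Membership in "low distance" (made-up thresholds).
    return min(1.0, max(0.0, (500 - metres) / 300))

distances_from_verdi = [150, 220, 480, 900]   # toy distances to attractions
sigma_count = sum(low_distance(m) for m in distances_from_verdi)
proportion = sigma_count / len(distances_from_verdi)
print(many(proportion))   # 0.5: truth degree of the quantified sentence
```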
