Ontologies and Knowledge-based Systems

Is there a flexible way to represent relations? How can knowledge bases be made to interoperate semantically?

© D. Poole and A. Mackworth 2017, Artificial Intelligence, Lecture 14.1


Choosing Individuals and Relations

How to represent "Pen #7 is red"?
- red(pen7): it's easy to ask "What's red?", but you can't ask "What is the color of pen7?"
- color(pen7, red): it's easy to ask "What's red?" and "What is the color of pen7?", but you can't ask "What property of pen7 has value red?"
- prop(pen7, color, red): it's easy to ask all of these questions.

prop(Individual, Property, Value) is the only relation needed: this is called the individual-property-value representation, or triple representation.
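The flexibility of the triple representation can be sketched in Python. This is a minimal illustration, not from the slides; the `query` helper and the extra `size` fact are hypothetical.

```python
# A minimal triple store: each fact is an (individual, property, value) tuple.
# pen7/color/red is the slide's example; the size fact is added for illustration.
facts = [
    ("pen7", "color", "red"),
    ("pen7", "size", "medium"),
]

def query(ind=None, prop=None, val=None):
    """Return all triples matching the pattern; None acts as a wildcard."""
    return [(i, p, v) for (i, p, v) in facts
            if ind in (None, i) and prop in (None, p) and val in (None, v)]

# All three questions from the slide are now expressible with one relation:
print(query(val="red"))                 # What's red?
print(query(ind="pen7", prop="color"))  # What is the color of pen7?
print(query(ind="pen7", val="red"))     # What property of pen7 has value red?
```

Because every fact goes through the same three-place relation, a single pattern-matching query mechanism answers all the question forms that the specialized representations could not.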

Universality of prop

To represent "a is a parcel":
- prop(a, type, parcel), where type is a special property, or
- prop(a, parcel, true), where parcel is a Boolean property.

Reification

To represent scheduled(cs422, 2, 1030, cc208), meaning "section 2 of course cs422 is scheduled at 10:30 in room cc208", let b123 name the booking:

  prop(b123, course, cs422).
  prop(b123, section, 2).
  prop(b123, time, 1030).
  prop(b123, room, cc208).

We have reified the booking. To reify means to make into an individual. What if we want to add the year?
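The point of reification is that extending the booking is just adding another triple, with no change to any relation's arity. A Python sketch (the `value_of` helper is hypothetical, as is the choice of 2017 for the year):

```python
# Reifying the booking: b123 is an individual, and each argument of the
# original 4-ary scheduled relation becomes one triple about b123.
booking = [
    ("b123", "course", "cs422"),
    ("b123", "section", 2),
    ("b123", "time", 1030),
    ("b123", "room", "cc208"),
]

# Adding the year is one more triple; nothing else changes.
booking.append(("b123", "year", 2017))

def value_of(facts, ind, prop):
    """Look up the value of a property of an individual."""
    return next((v for (i, p, v) in facts if i == ind and p == prop), None)

print(value_of(booking, "b123", "room"))  # cc208
```

With the original 4-ary scheduled relation, adding the year would have forced a 5-ary relation and a rewrite of every rule that mentions scheduled.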

Semantic Networks / Knowledge Graphs

When there is only one relation, prop, it can be omitted without loss of information:
- Logic: prop(Individual, Property, Value)
- Triple: ⟨Individual, Property, Value⟩
- Simple sentence: Individual Property Value.
- Graphically: an arc labeled Property from node Individual to node Value.

An Example Semantic Network / Knowledge Graph

[Figure: a knowledge graph with individuals comp_2347, craig, ming, r107, r117, comp_sci, lemon_laptop_10000, lemon_computer, lemon_disc, cardboard_box, light, brown, and medium, connected by arcs labeled owned_by, deliver_to, model, brand, logo, color, room, building, packing, weight, and size.]

Equivalent Logic Program

prop(comp_2347, owned_by, craig).
prop(comp_2347, deliver_to, ming).
prop(comp_2347, model, lemon_laptop_10000).
prop(comp_2347, brand, lemon_computer).
prop(comp_2347, logo, lemon_disc).
prop(comp_2347, color, brown).
prop(craig, room, r107).
prop(r107, building, comp_sci).
...
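Loaded as data, these triples support joins across arcs of the graph. A Python sketch of one such chained query (the `building_of_owner` helper is hypothetical; the trailing "..." facts of the slide are omitted, not invented):

```python
# The slide's triples as Python data.
kg = [
    ("comp_2347", "owned_by", "craig"),
    ("comp_2347", "deliver_to", "ming"),
    ("comp_2347", "model", "lemon_laptop_10000"),
    ("comp_2347", "brand", "lemon_computer"),
    ("comp_2347", "logo", "lemon_disc"),
    ("comp_2347", "color", "brown"),
    ("craig", "room", "r107"),
    ("r107", "building", "comp_sci"),
]

def building_of_owner(facts, thing):
    """Join three triples: thing -> owner -> room -> building."""
    owner = next(v for (i, p, v) in facts if i == thing and p == "owned_by")
    room = next(v for (i, p, v) in facts if i == owner and p == "room")
    return next(v for (i, p, v) in facts if i == room and p == "building")

print(building_of_owner(kg, "comp_2347"))  # comp_sci
```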

A Structured Semantic Network / Knowledge Graph

[Figure: the same knowledge graph restructured with classes. comp_2347 has type lemon_laptop_10000, which is subClassOf lemon_computer, which is subClassOf computer. Properties are attached at the most appropriate class: for example, weight (light) and size (medium) at lemon_laptop_10000, logo (lemon_disc) and color (brown) at lemon_computer, and packing (cardboard_box) at computer; owned_by, deliver_to, room, and building arcs connect the individuals as before.]

Logic of Property

An arc from a class c to a value v, labeled with property p, means that every individual in the class has value v for property p:

  prop(Obj, p, v) ← prop(Obj, type, c).

Examples:

  prop(X, weight, light) ← prop(X, type, lemon_laptop_10000).
  prop(X, packing, cardboard_box) ← prop(X, type, computer).
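These class-level rules can be sketched as a table of defaults applied to whatever an individual is typed as. This is an illustrative Python rendering of the two rules above, not the book's implementation; `class_defaults` and `derived_props` are hypothetical names.

```python
# Each rule "prop(X, p, v) <- prop(X, type, c)" becomes an entry c -> (p, v).
class_defaults = {
    "lemon_laptop_10000": [("weight", "light")],
    "computer": [("packing", "cardboard_box")],
}

def derived_props(facts, x):
    """Properties x gets from the classes it is directly typed as."""
    out = []
    for (i, p, v) in facts:
        if i == x and p == "type":
            out.extend(class_defaults.get(v, []))
    return out

type_facts = [("comp_2347", "type", "lemon_laptop_10000")]
print(derived_props(type_facts, "comp_2347"))  # [('weight', 'light')]
```

Note that comp_2347 does not yet get the packing property, because only its direct type is consulted here; that is exactly the gap the inheritance rule on the next slide fills.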

Logic of Property Inheritance

Inheritance can be done through the subclass relationship:

  prop(X, type, T) ← prop(S, subClassOf, T) ∧ prop(X, type, S).
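One way to read this rule operationally is as a fixpoint computation: keep adding derived type facts until nothing changes. A Python sketch under that reading (the `close_types` helper is hypothetical; a Prolog system would instead derive these facts on demand during query answering):

```python
# The subclass rule as a fixpoint: add prop(X, type, T) whenever
# prop(S, subClassOf, T) and prop(X, type, S), until no new facts appear.
kb = {
    ("comp_2347", "type", "lemon_laptop_10000"),
    ("lemon_laptop_10000", "subClassOf", "lemon_computer"),
    ("lemon_computer", "subClassOf", "computer"),
}

def close_types(facts):
    facts = set(facts)
    while True:
        new = {(x, "type", t)
               for (s, r, t) in facts if r == "subClassOf"
               for (x, p, c) in facts if p == "type" and c == s}
        if new <= facts:
            return facts
        facts |= new

closed = close_types(kb)
print(("comp_2347", "type", "computer") in closed)  # True
```

Combined with the class-level property rules, this is what lets comp_2347 inherit packing = cardboard_box from computer even though its stated type is lemon_laptop_10000.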

Multiple Inheritance

An individual is usually a member of more than one class; for example, the same person may be a wine expert, a teacher, and a football coach. The individual can inherit the properties of all of the classes it is a member of: multiple inheritance. With default values, what happens if an individual inherits conflicting defaults from different classes? This is the multiple inheritance problem.

Choosing Primitive and Derived Properties

Associate a property value with the most general class that has that value. Don't associate contingent properties of a class with the class: for example, don't make brown a property of the class of computers just because all current computers happen to be brown.

Knowledge Sharing

A conceptualization is a mapping from the problem domain into the representation. A conceptualization specifies:
- what sorts of individuals are being modeled,
- the vocabulary for specifying individuals, relations, and properties,
- the meaning or intention of the vocabulary.

If more than one person is building a knowledge base, they must be able to share the conceptualization. An ontology is a specification of a conceptualization: it specifies the meanings of the symbols in an information system.

© D. Poole and A. Mackworth 2017, Artificial Intelligence, Lecture 14.2

Mapping from a Conceptualization to a Symbol

[Figure]

Semantic Web

Ontologies are published on the web in machine-readable form. Builders of knowledge bases or web sites adhere to, and refer to, a published ontology:
- A symbol defined by an ontology means the same thing across web sites that obey the ontology.
- If someone wants to refer to something not yet defined, they publish an ontology defining the terminology. Others adopt the terminology by referring to the new ontology. In this way, ontologies evolve.
- Mappings between separately developed ontologies can be published.

Challenges of Building Ontologies

- They can be huge: finding the appropriate terminology for a concept may be difficult.
- How one divides the world can depend on the application. Different ontologies describe the world in different ways.
- People can fundamentally disagree about an appropriate structure.
- Different knowledge bases can use different ontologies. To allow KBs based on different ontologies to interoperate, there must be mappings between the ontologies.
- It has to be in users' interests to use an ontology.

