Knowledge representation (Artificial intelligence)



An ontology represents knowledge as a set of concepts within a domain and the relationships between those concepts.



Knowledge representation and knowledge engineering are central to AI research. Many of the problems machines are expected to solve require extensive knowledge about the world. Among the things AI needs to represent are: objects, properties, categories and relations between objects; situations, events, states and time; causes and effects; knowledge about knowledge (what we know about what other people know); and many other, less well researched domains. A representation of "what exists" is an ontology: the set of objects, relations, concepts, and properties formally described so that software agents can interpret them. The semantics of these are captured as description logic concepts, roles, and individuals, and are typically implemented as classes, properties, and individuals in the Web Ontology Language. The most general ontologies are called upper ontologies, which attempt to provide a foundation for all other knowledge by acting as mediators between domain ontologies, which cover specific knowledge about a particular knowledge domain (a field of interest or area of concern). Such formal knowledge representations are suitable for content-based indexing and retrieval, scene interpretation, clinical decision support, knowledge discovery via automated reasoning (inferring new statements based on explicitly stated knowledge), and so on. Video events can be represented as SWRL rules, which can be used, among other things, to automatically generate subtitles for constrained videos.
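As a minimal sketch of how such an ontology can be encoded, the following Python snippet uses the rdflib library (not mentioned in the text above; the example.org namespace, class and property names, and the individual are all hypothetical) to state a couple of concepts, one relation, and one individual as OWL/RDF triples, then manually applies the single inference that the subclass axiom implies, much as a reasoner would:

```python
# A minimal sketch, assuming the rdflib package and a made-up namespace.
from rdflib import Graph, Namespace, RDF, RDFS, OWL

EX = Namespace("http://example.org/onto#")  # hypothetical namespace
g = Graph()
g.bind("ex", EX)

# Concepts (description-logic "concepts" become OWL classes)
g.add((EX.Animal, RDF.type, OWL.Class))
g.add((EX.Bird, RDF.type, OWL.Class))
g.add((EX.Bird, RDFS.subClassOf, EX.Animal))

# A relation ("role") between objects becomes an object property
g.add((EX.eats, RDF.type, OWL.ObjectProperty))

# An individual and its assertions
g.add((EX.tweety, RDF.type, EX.Bird))
g.add((EX.tweety, EX.eats, EX.seed))

# A trivial inference step: every Bird is also an Animal, so add the
# implied type assertion that an automated reasoner would derive.
for s in list(g.subjects(RDF.type, EX.Bird)):
    g.add((s, RDF.type, EX.Animal))

print(g.serialize(format="turtle"))
```

In a real system a dedicated reasoner would perform the inference step automatically; it is spelled out here only to make the "inferring new statements from explicitly stated knowledge" idea concrete.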


Among the most difficult problems in knowledge representation are:



Default reasoning and the qualification problem
Many of the things people know take the form of "working assumptions". For example, if a bird comes up in conversation, people typically picture an animal that is fist-sized, sings, and flies. None of these things are true of all birds. John McCarthy identified this problem in 1969 as the qualification problem: for any commonsense rule that AI researchers care to represent, there tends to be a huge number of exceptions. Almost nothing is simply true or false in the way that abstract logic requires. AI research has explored a number of solutions to this problem.
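The non-monotonic flavour of default reasoning can be illustrated with a small sketch. The plain-Python fragment below (all predicates and individuals are made up for illustration) applies the default rule "birds fly" unless an individual is known to be an exception, and shows how a conclusion is withdrawn once contradicting knowledge arrives:

```python
# A minimal sketch of default reasoning with exceptions (hypothetical facts).
facts = {("bird", "tweety"), ("bird", "opus")}
abnormal = set()          # individuals exempt from the default rule

def flies(x):
    """Default rule: a bird flies unless it is known to be abnormal."""
    return ("bird", x) in facts and x not in abnormal

print(flies("tweety"))    # True  - default conclusion
print(flies("opus"))      # True  - nothing says otherwise, yet

# New knowledge arrives: opus is a penguin, an exception to the default.
facts.add(("penguin", "opus"))
abnormal.add("opus")

print(flies("opus"))      # False - the earlier conclusion is retracted
```

The point is that adding knowledge can remove conclusions, which is exactly what classical logic does not allow and what non-monotonic formalisms are designed to handle.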
The breadth of commonsense knowledge
The number of atomic facts that the average person knows is very large. Research projects that attempt to build a complete knowledge base of commonsense knowledge (e.g., Cyc) require enormous amounts of laborious ontological engineering; they must be built, by hand, one complicated concept at a time. A major goal is to have the computer understand enough concepts to be able to learn by reading from sources such as the Internet, and thus be able to add to its own ontology.
The subsymbolic form of commonsense knowledge
Much of what people know is not represented as facts or statements that they could express verbally. For example, a chess master will avoid a particular chess position because it "feels too exposed", or an art critic can take one look at a statue and realize that it is a fake. These are non-conscious and sub-symbolic intuitions or tendencies in the human brain. Knowledge like this informs, supports, and provides context for symbolic, conscious knowledge. As with the related problem of sub-symbolic reasoning, it is hoped that situated AI, computational intelligence, or statistical AI will provide ways to represent this kind of knowledge.
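As a rough illustration of what sub-symbolic representation means in a statistical setting, the toy sketch below (plain Python, made-up data) trains a tiny perceptron; afterwards, everything the model "knows" about the toy task lives in its numeric weights rather than in any explicit rule that could be stated verbally:

```python
# A minimal sketch of sub-symbolic knowledge in a statistical model.
# Toy task (invented data): classify points as "exposed" (1) or "safe" (0).
data = [((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.1, 0.2), 0), ((0.2, 0.1), 0)]

w = [0.0, 0.0]
b = 0.0
lr = 0.1

# Simple perceptron training loop
for _ in range(20):
    for (x1, x2), label in data:
        pred = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
        err = label - pred
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

# The "knowledge" is just these numbers; there is no rule to read off.
print("weights:", w, "bias:", b)
print("prediction for (0.85, 0.75):",
      1 if w[0] * 0.85 + w[1] * 0.75 + b > 0 else 0)
```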









