Hobbs, J., Croft, W., Davies, T., Edwards, D., & Laws, K. (1987). Commonsense metaphysics and lexical semantics. Computational Linguistics, 13(3-4), 241-250.
In the TACITUS project for using commonsense knowledge in the understanding of texts about mechanical devices and their failures, we have been developing various commonsense theories that are needed to mediate between the way we talk about the behavior of such devices and causal models of their operation. Of central importance in this effort is the axiomatization of what might be called “commonsense metaphysics”. This includes a number of areas that figure in virtually every domain of discourse, such as scalar notions, granularity, time, space, material, physical objects, causality, functionality, force, and shape. Our approach to lexical semantics is to construct core theories of each of these areas, and then to define, or at least characterize, a large number of lexical items in terms provided by the core theories. In the TACITUS system, processes for solving the pragmatics problems posed by a text will use the knowledge base consisting of these theories, in conjunction with the logical forms of the sentences in the text, to produce an interpretation. In this paper, we do not stress these interpretation processes; they are another important aspect of the TACITUS project and will be described in subsequent papers.
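The architecture described above can be sketched in miniature: a knowledge base of axioms drawn from core theories is combined with the logical form of a sentence, and inference derives an interpretation. The following Python sketch is illustrative only, not the TACITUS implementation; the predicates (wear, lose_material, part_of, at_risk) and the two toy axioms are hypothetical stand-ins for real core-theory content, and simple forward chaining stands in for the system's actual pragmatics processes.

```python
def unify(pattern, fact, bindings):
    """Match a pattern atom (variables start with '?') against a ground fact,
    returning extended bindings, or None on mismatch."""
    if len(pattern) != len(fact):
        return None
    b = dict(bindings)
    for p, f in zip(pattern, fact):
        if isinstance(p, str) and p.startswith("?"):
            if p in b and b[p] != f:
                return None
            b[p] = f
        elif p != f:
            return None
    return b

def match_all(premises, facts, bindings):
    """Yield every binding under which all premise atoms match ground facts."""
    if not premises:
        yield bindings
        return
    for fact in facts:
        b = unify(premises[0], fact, bindings)
        if b is not None:
            yield from match_all(premises[1:], facts, b)

def forward_chain(facts, axioms):
    """Apply Horn-clause axioms (conclusion, premises) to fixpoint."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conclusion, premises in axioms:
            # Collect new atoms first so we never mutate facts mid-iteration.
            derived = [tuple(b.get(t, t) for t in conclusion)
                       for b in match_all(premises, facts, {})]
            for atom in derived:
                if atom not in facts:
                    facts.add(atom)
                    changed = True
    return facts

# Hypothetical axioms: wearing entails gradual loss of material, and a
# part losing material puts the containing device at risk of failure.
axioms = [
    (("lose_material", "?x"), [("wear", "?x")]),
    (("at_risk", "?d"), [("lose_material", "?x"), ("part_of", "?x", "?d")]),
]

# Logical form of a sentence like "The bearing in the compressor is worn."
logical_form = {("wear", "bearing"), ("part_of", "bearing", "compressor")}
interpretation = forward_chain(logical_form, axioms)
```

Here the “interpretation” of the sentence includes the derived atoms (lose_material, bearing) and (at_risk, compressor), illustrating how core-theory axioms mediate between the lexical content of a text and a causal model of the device.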
This work represents a convergence of research in lexical semantics in linguistics and efforts in AI to encode commonsense knowledge. Over the years, lexical semanticists have developed formalisms of increasing adequacy for encoding word meaning, progressing from simple sets of features to notations for predicate-argument structure, but these early attempts still assumed only limited access to world knowledge and only very restricted sorts of processing. Workers in computational linguistics introduced inference and other complex cognitive processes into our understanding of the role of word meaning. Recently, linguists have given greater attention to the cognitive processes that would operate on their representations. Independently, in AI an effort arose to encode large amounts of commonsense knowledge. The research reported here represents a convergence of these various developments. By developing core theories of certain fundamental phenomena and defining lexical terms within these theories, using the full power of predicate calculus, we are able to cope with complexities of word meaning that have hitherto escaped lexical semanticists. Moreover, we can do this within a framework that gives full scope to the planning and reasoning processes that manipulate representations of word meaning.
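To make the last point concrete, a definition of a lexical item within a core theory might take the following shape. This is an illustrative sketch only, not an axiom from the paper: it characterizes a verb like "wear" in terms of assumed core-theory predicates for material, parthood, loss, and granular/gradual change.

```latex
% Illustrative predicate-calculus characterization of "wear":
% an event e of x wearing is a gradual loss, by x, of some bit of
% material m that is part of x. All predicate names are hypothetical.
\forall e\, x \; \bigl( \mathit{wear}(e, x) \rightarrow
  \exists m \; ( \mathit{material}(m) \wedge \mathit{partOf}(m, x)
  \wedge \mathit{lose}(e, x, m) \wedge \mathit{gradual}(e) ) \bigr)
```

The point is that each conjunct on the right-hand side belongs to an independently axiomatized core theory (material, parthood, change of state, granularity), so reasoning about wearing inherits whatever inferences those theories license.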