For the past forty years, linguistics has been dominated by the idea that language is categorical and linguistic competence discrete. It has become increasingly clear, however, that many levels of representation, from phonemes to sentence structure, show probabilistic properties, as does the language faculty. Probabilistic linguistics conceptualizes categories as distributions and views knowledge of language not as a minimal set of categorical constraints but as a set of gradient rules that may be characterized by a statistical distribution.
This is the first detailed study to explore the little-understood notions of "knowing who someone is," "knowing a person's identity," and related locutions. It locates these notions within the context of a general theory of believing and a semantical theory of belief- and knowledge-ascriptions.
This collection of articles and associated discussion papers focuses on a problem that has attracted increasing attention from linguists and psychologists throughout the world during the past several years. Reduced to essentials, the problem is that of discovering the character of the mental capacities that make it possible for human beings to attain knowledge of their language on the basis of fragmentary and haphazard early linguistic experience.
The idea that the language we speak influences the way we think has evoked perennial fascination and intense controversy. According to the strong version of this hypothesis, called the Sapir-Whorf hypothesis after the American linguists Edward Sapir and Benjamin Lee Whorf, who propounded it, languages vary in their semantic partitioning of the world, and the structure of one's language influences how one understands the world. Thus speakers of different languages perceive the world differently.
In this two-volume set Leonard Talmy defines the field of cognitive semantics. He approaches the question of how language organizes conceptual material both at a general level and by analyzing a crucial set of particular conceptual domains: space and time, motion and location, causation and force interaction, and attention and viewpoint. Talmy maintains that these are among the most fundamental parameters by which language structures conception.
What does our ability to use words—that is, our lexical competence—consist of? What is the difference between a system that can be said to understand language and one that cannot? Most approaches to word meaning fail to account for an essential aspect of our linguistic competence, namely, our ability to apply words to the world. This monograph proposes a dual picture of human lexical competence in which inferential and referential abilities are separate—a proposal confirmed by neuropsychological research on brain-damaged persons.
The psychologist William James observed that "a native talent for perceiving analogies is ... the leading fact in genius of every order." The centrality and the ubiquity of analogy in creative thought have been noted again and again by scientists, artists, and writers, and understanding and modeling analogical thought have emerged as two of the most important challenges for cognitive science.
Connectionist approaches, Andy Clark argues, are driving cognitive science toward a radical reconception of its explanatory endeavor. At the heart of this reconception lies a shift toward a new and more deeply developmental vision of the mind: a vision that has important implications for the philosophical and psychological understanding of the nature of concepts, of mental causation, and of representational change.
According to the received view of linguistic communication, the primary function of language is to enable speakers to reveal the propositional contents of their thoughts to hearers. Speakers are able to do this because they share with their hearers an understanding of the meanings of words. Christopher Gauker rejects this conception of language, arguing that it rests on an untenable conception of mental representation and yields a wrong account of the norms of discourse.