Charles J. Fillmore (born 1929) is an American linguist and an Emeritus Professor of Linguistics at the University of California, Berkeley. He was one of the first linguists to introduce a representation of linguistic knowledge that blurred the strong distinction between syntactic and semantic knowledge of a language.
He introduced what was termed case structure grammar and this representation subsequently had considerable influence on psychologists as well as computational linguists.
Case grammar is a system of linguistic analysis, focusing on the link between the valence, or number of subjects, objects, etc., of a verb and the grammatical context it requires.
The system was created by the American linguist Charles J. Fillmore in 1968, in the context of Transformational Grammar. This theory analyzes the surface syntactic structure of sentences by studying the combination of deep cases (i.e., semantic roles) such as Agent, Object, Benefactor, Location or Instrument that are required by a specific verb.
According to Fillmore, each verb selects a certain number of deep cases which form its case frame. Thus, a case frame describes important aspects of the semantic valency of verbs, adjectives, and nouns.
Case frames are subject to certain constraints, such as that a deep case can occur only once per sentence. Some of the cases are obligatory and others are optional.
Obligatory cases cannot be deleted without producing ungrammatical sentences. The case structure representation served to inspire the development of what was termed a frame-based representation in AI research.
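Before turning to that frame-based representation, here is a minimal sketch, in Python, of a case frame for the verb "give" with obligatory and optional deep cases; the role names and the check function are illustrative assumptions rather than Fillmore's own notation.

# A toy case frame: the deep cases a verb selects, split into obligatory
# and optional roles (illustrative only, not Fillmore's notation).
CASE_FRAMES = {
    "give": {
        "obligatory": {"Agent", "Object"},
        "optional": {"Benefactor", "Location", "Instrument"},
    },
}

def missing_obligatory_cases(verb, filled_roles):
    """Return the obligatory deep cases that are not filled for this verb."""
    return CASE_FRAMES[verb]["obligatory"] - set(filled_roles)

# "John gave Mary a book": the obligatory cases are all filled.
print(missing_obligatory_cases("give", {"Agent": "John", "Object": "book", "Benefactor": "Mary"}))
# prints set() -- the case frame is satisfied

# Deleting the obligatory Object ("John gave to Mary") leaves the frame unsatisfied.
print(missing_obligatory_cases("give", {"Agent": "John", "Benefactor": "Mary"}))
# prints {'Object'}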
Within a frame-based architecture it is quite natural to have inferences of this kind, such as filling in unstated but expected case roles, triggered by the representation of the sentence. (For those familiar with certain types of object-oriented programming languages, the frame-based architecture in AI was a somewhat more complicated and elaborate programming environment.)
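To give a flavour of what such triggered inferences might look like, here is a small sketch of a frame whose slots carry defaults and a procedure that fires when a slot is filled. The design is loosely in the spirit of classic AI frame systems and is offered as an illustrative assumption, not a description of any particular implementation.

class Slot:
    """A frame slot with an optional default value and an 'if-added' trigger."""
    def __init__(self, default=None, if_added=None):
        self.value = default
        self.if_added = if_added

class GiveFrame:
    """A toy frame for 'give' events; filling a slot can trigger an inference."""
    def __init__(self):
        self.inferences = []
        self.slots = {
            "Agent": Slot(),
            "Object": Slot(),
            # If no instrument is stated, a default is assumed silently.
            "Instrument": Slot(default="hands"),
            # Filling the Benefactor triggers an inference about possession.
            "Benefactor": Slot(if_added=self._infer_possession),
        }

    def _infer_possession(self, beneficiary):
        self.inferences.append(f"{beneficiary} now has {self.slots['Object'].value}")

    def fill(self, name, value):
        slot = self.slots[name]
        slot.value = value
        if slot.if_added:
            slot.if_added(value)

frame = GiveFrame()
frame.fill("Agent", "John")
frame.fill("Object", "a book")
frame.fill("Benefactor", "Mary")
print(frame.inferences)                  # ['Mary now has a book']
print(frame.slots["Instrument"].value)   # 'hands' (never stated, filled by default)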
One of the consistent findings in human sentence understanding is that we seem to draw such inferences automatically, and we rarely remember whether or not the information was explicitly stated in the sentence. This observation is consistent with some of the features of a frame-based representation as suggested by case structure grammar. Another aspect of the case grammar representation is that it can be effectively used to parse incomplete or noisy sentences.
For example, while "John gave book" is not grammatical, it is still possible to create an appropriate case grammar parse of this string of words. However, case grammar is not a particularly good representation for use in parsing sentences that involve complex syntactic constructions. The web page on representing textual information will give you some appreciation of this difficulty.
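A deliberately naive sketch of how a case-grammar-style role assignment can still be produced for the ungrammatical string "John gave book" (the heuristic and the tiny verb lexicon are my own illustrative assumptions):

# Naive heuristic: whatever precedes a known verb is taken as the Agent,
# whatever follows it as the Object. A missing article or missing optional
# case does not prevent a (partial) case-grammar parse.
KNOWN_VERBS = {"gave": "give"}

def rough_case_parse(words):
    for i, word in enumerate(words):
        if word in KNOWN_VERBS:
            parse = {"Verb": KNOWN_VERBS[word]}
            if i > 0:
                parse["Agent"] = " ".join(words[:i])
            if i + 1 < len(words):
                parse["Object"] = " ".join(words[i + 1:])
            return parse
    return None  # no verb found, so no case frame to anchor the parse

print(rough_case_parse("John gave book".split()))
# prints {'Verb': 'give', 'Agent': 'John', 'Object': 'book'}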
Structural Semantics according to William Chafe's perspective
Structural semantics is the study of relationships between the meanings of terms within a sentence, and of how meaning can be composed from smaller elements. However, some critical theorists suggest that meaning is only divided into smaller structural units via its regulation in concrete social interactions; outside of these interactions language may become meaningless.
In the approaches labelled "structural semantics" by cognitive linguists, word meanings, or lexical meanings, can be broken down into atomic semantic features, which are in a way the distinctive properties of the meaning of a word.
In accordance with the objectivist bias of structural semantics, semantic features are believed to refer to actual properties, objects or relations in the exterior world.
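As a rough illustration of this kind of lexical decomposition, the sketch below represents a few word meanings as bundles of binary semantic features; the particular words and features are stock textbook examples chosen for illustration.

# Word meanings decomposed into atomic binary features (True = +, False = -).
FEATURES = {
    "man":      {"HUMAN": True, "ADULT": True,  "MALE": True},
    "woman":    {"HUMAN": True, "ADULT": True,  "MALE": False},
    "boy":      {"HUMAN": True, "ADULT": False, "MALE": True},
    "bachelor": {"HUMAN": True, "ADULT": True,  "MALE": True, "MARRIED": False},
}

def shared_features(a, b):
    """Features on which two lexical meanings agree."""
    return {k: v for k, v in FEATURES[a].items() if FEATURES[b].get(k) == v}

def is_more_specific(a, b):
    """True if meaning a contains every feature of meaning b."""
    return all(FEATURES[a].get(k) == v for k, v in FEATURES[b].items())

print(shared_features("man", "woman"))      # {'HUMAN': True, 'ADULT': True}
print(is_more_specific("bachelor", "man"))  # True: bachelor = man plus [-MARRIED]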
Syntactic description has usually taken the sentence to be its basic unit of organization, although probably no one would deny that systematic constraints exist across sentence boundaries as well.
From time to time some attention has been given to "discourse" structure, but the structure of sentences has seemed to exhibit a kind of closure which allows it to be investigated in relative, if not complete, independence.
When language is seen from a semantic perspective, intersentential constraints play a role that is probably more important than under other views of language, for a number of the limitations which cross sentence boundaries are clearly semantic in nature.
While the term sentence provides a convenient way of referring to a verb and its accompanying nouns, the status of the sentence as an independent structural entity is doubtful. There seems to be no need for an independent symbol as the starting point for the generation of sentences; the verb is all the starting point needed.
A sentence is either a verb alone, a verb accompanied by one or more nouns, or a configuration of this kind to which one or more coordinate or subordinate verbs have been added.
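This verb-centred view can be pictured with a small recursive data structure in which the verb is the root and nouns attach to it through their cases; the class and the particular case labels below are illustrative assumptions.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class VerbCentredSentence:
    """A sentence as a verb, its case-labelled nouns, and any coordinate or
    subordinate verb configurations attached to it."""
    verb: str
    nouns: List[Tuple[str, str]] = field(default_factory=list)           # (case, noun) pairs
    attached: List["VerbCentredSentence"] = field(default_factory=list)  # coordinate/subordinate verbs

# A verb alone already counts as a sentence on this view.
s1 = VerbCentredSentence(verb="rain")

# A verb accompanied by one or more nouns.
s2 = VerbCentredSentence(verb="give",
                         nouns=[("Agent", "John"), ("Object", "book"), ("Benefactor", "Mary")])

# A configuration of this kind to which a subordinate verb has been added.
s3 = VerbCentredSentence(verb="say", nouns=[("Agent", "Sue")], attached=[s2])
print(s3)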