What are the key principles and applications of Relational Frame Theory?

Relational Frame Theory (RFT) is a psychological theory that aims to explain how human language and cognition work. It is based on the concept of relational framing: the learned ability to relate things to one another, not only on the basis of their physical similarities and differences but in arbitrarily applicable ways such as sameness, opposition, and comparison. RFT has been applied to various areas of psychology, including language development, behavior therapy, and the understanding of human thought. In this essay, we will discuss the key principles of RFT and its applications in different fields of psychology, exploring how the theory has contributed to our understanding of language and cognition and how it has been used to improve therapeutic interventions.

Relational frame theory, or RFT, is a psychological theory of human language and cognition. It was developed largely through the efforts of Steven C. Hayes of the University of Nevada, Reno and Dermot Barnes-Holmes of the National University of Ireland, Maynooth and is currently being tested in about three dozen laboratories around the world.

Relational frame theory is rooted in the philosophy of functional contextualism and focuses on how humans learn language through interactions with the environment. Functional contextualism is an extension and contextualistic interpretation of B.F. Skinner’s radical behaviorism; it emphasizes the importance of predicting and influencing psychological events, such as thoughts, feelings, and behaviors, by focusing on manipulable variables in their context.

Development

RFT is a behavioral approach to language. B.F. Skinner proposed one such approach in 1957 in his book Verbal Behavior. Skinner presented his approach as an interpretation, not an experimental research program, and researchers commonly acknowledge that the research products are somewhat limited in scope. For example, it has been useful in some aspects of language training in developmentally disabled children, but it has not led to a robust research program in the range of areas relevant to language and cognition, such as problem-solving, reasoning, metaphor, logic, and so on. RFT advocates are fairly bold in stating that their goal is an experimental behavioral research program in all such areas, and RFT research has indeed emerged in a number of these areas including grammar.

In a review of Skinner’s book, linguist Noam Chomsky argued that the generativity of language shows that it cannot simply be learned, that there must be some innate “language acquisition device”. Many have seen this review as a turning point, when cognitivism took the place of behaviorism as the mainstream in psychology. Behavior analysts generally viewed the criticism as unfair and largely off point (for a behavior analytic response to Chomsky, see MacCorquodale (1970), On Chomsky’s Review Of Skinner’s Verbal Behavior), but it is undeniable that psychology turned its attention elsewhere and the review was very influential in helping to produce the rise of cognitive psychology.

Despite the lack of attention from the mainstream, behavior analysis is alive and growing. Its application has been extended to areas such as language and cognitive training, animal training, business and school settings, hospitals, and other areas of research.

RFT distinguishes itself from Skinner’s work by identifying and defining a particular type of operant conditioning known as derived relational responding. This is a learning process that to date appears to occur only in humans possessing a capacity for language. Derived relational responding is theorized to be a pervasive influence on almost all aspects of human behavior. The theory represents an attempt to provide a more empirically progressive account of complex human behavior while preserving the naturalistic approach of behavior analysis.

Evidence

Several dozen studies have tested RFT ideas. Supportive data exists in the areas needed to show that an action is “operant” such as the importance of multiple examples in training derived relational responding, the role of context, and the importance of consequences. Derived relational responding has also been shown to alter other behavioral processes such as classical conditioning, an empirical result that RFT theorists point to in explaining why relational operants modify existing behavioristic interpretations of complex human behavior. Empirical advances have also been made by RFT researchers in the analysis and understanding of such topics as metaphor, perspective taking, and reasoning.

Proponents of RFT often point to the failure to establish a vigorous experimental program in language and cognition as the key reason why behavior analysis fell out of the mainstream of psychology despite its many contributions, and argue that RFT might provide a way forward. The theory is still somewhat controversial within behavioral psychology, however. At present the controversy is not primarily empirical, since RFT studies are published regularly in mainstream behavioral journals and few empirical studies have yet claimed to contradict RFT findings. Rather, the controversy seems to revolve around whether RFT is a positive step forward, especially given that its implications seem to go beyond many existing interpretations and extensions from within this intellectual tradition.

Applications

Acceptance and commitment therapy

RFT underlies the therapeutic practice known as acceptance and commitment therapy (ACT). Through its detailed treatment and analysis of derived relational responding and the transformation of function, RFT provides conceptual and procedural guidance for enhancing the cognitive and language development components of early intensive behavioral intervention (EIBI) programs for young children with autism and related disorders. RFT has also become important in predicting the differences between standard cognitive therapy, which works through thought change, and acceptance-based interventions such as ACT.

The IRAP

The Implicit Relational Assessment Procedure (IRAP), developed by Dermot Barnes-Holmes, is an implicit measure with its theoretical basis in RFT. It is similar to the Implicit Association Test (IAT), the key difference being that it measures specific relations between stimuli rather than general associations.

Emergentism

Emergentist theories, such as MacWhinney’s competition model, posit that language acquisition is a cognitive process that emerges from the interaction of biological pressures and the environment. According to these theories, neither nature nor nurture alone is sufficient to trigger language learning; both of these influences must work together in order to allow children to acquire a language. The proponents of these theories argue that general cognitive processes subserve language acquisition and that the end result of these processes is language-specific phenomena, such as word learning and grammar acquisition. The findings of many empirical studies support the predictions of these theories, suggesting that language acquisition is a more complex process than many believe.

Syntax

Generativism

Generative grammar, associated especially with the work of Noam Chomsky, is currently one of the principal approaches to children’s acquisition of syntax. The leading idea is that human biology imposes narrow constraints on the child’s “hypothesis space” during language acquisition. In the principles and parameters framework, which has dominated generative syntax since Chomsky’s Lectures on Government and Binding (1981), the acquisition of syntax resembles ordering from a menu: the human brain comes equipped with a limited set of choices, and the child selects the correct options using her parents’ speech, in combination with the context.

An important argument in favor of the generative approach is the poverty-of-the-stimulus argument. The child’s input (a finite number of sentences encountered by the child, together with information about the context in which they were uttered) is in principle compatible with an infinite number of conceivable grammars. Moreover, few if any children can rely on corrective feedback from adults when they make a grammatical error. Yet, barring situations of medical abnormality or extreme privation, all the children in a given speech-community converge on very much the same grammar by the age of about five years. An especially dramatic example is provided by children who for medical reasons are unable to produce speech, and therefore can literally never be corrected for a grammatical error, yet nonetheless converge on the same grammar as their typically developing peers, according to comprehension-based tests of grammar.

Considerations such as these have led Chomsky, Jerry Fodor, Eric Lenneberg and others to argue that the types of grammar that the child needs to consider must be narrowly constrained by human biology (the nativist position). These innate constraints are sometimes referred to as universal grammar, the human “language faculty,” or the “language instinct.”

Empiricism

Since Chomsky’s initial proposals in the 1950s, many criticisms of the basic assumptions of generative theory have been put forth. Critics argue that the concept of a Language Acquisition Device (LAD) is unsupported by evolutionary anthropology, which tends to show a gradual adaptation of the human brain and vocal cords to the use of language, rather than a sudden appearance of a complete set of binary parameters delineating the whole spectrum of possible grammars ever to have existed and ever to exist. (Binary parameters are common to digital computers but not, as it turns out, to neurological systems such as the human brain.)

Further, generative theory posits several hypothetical constructs (such as movement, empty categories, complex underlying structures, and strict binary branching) that cannot possibly be acquired from any amount of linguistic input, and it is unclear that human language is actually anything like the generative conception of it. Since language, as imagined by nativists, is unlearnably complex, subscribers to this theory argue that it must therefore be innate. A different theory of language, however, may yield different conclusions: while all theories of language acquisition posit some degree of innateness, a less convoluted theory might involve less innate structure and more learning. Under such a theory of grammar, the input, combined with both general and language-specific learning capacities, might be sufficient for acquisition.

From around 1980, linguists studying children, such as Melissa Bowerman, and psychologists following Jean Piaget, such as Elizabeth Bates and Jean Mandler, came to suspect that many learning processes may indeed be involved in acquisition, and that ignoring the role of learning may have been a mistake.

In recent years, opposition to the nativist position has multiplied. The debate has centered on whether the inborn capabilities are language-specific or domain-general, such as those that enable the infant to visually make sense of the world in terms of objects and actions. The anti-nativist view has many strands, but a frequent theme is that language emerges from usage in social contexts, using learning mechanisms that are a part of a general cognitive learning apparatus (which is what is innate). This position has been championed by Elizabeth Bates, Catherine Snow, Brian MacWhinney, Michael Tomasello, Michael Ramscar, William O’Grady, and others. Philosophers, such as Fiona Cowie and Barbara Scholz with Geoffrey Pullum have also argued against certain nativist claims in support of empiricism.

Statistical learning

Some language acquisition researchers, such as Elissa Newport, Richard Aslin, and Jenny Saffran, believe that language acquisition is based primarily on general learning mechanisms, namely statistical learning. The development of connectionist models that are able to successfully learn words and syntactical conventions supports the predictions of statistical learning theories of language acquisition, as do empirical studies of children’s learning of words and syntax.
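The core mechanism these researchers study, tracking transitional probabilities between adjacent syllables and positing word boundaries where the probability dips, can be sketched as follows. The syllable stream, the invented trisyllabic "words", and the 0.7 boundary threshold are illustrative assumptions, not parameters from the published experiments:

```python
from collections import Counter

def transitional_probabilities(syllables):
    """P(next syllable | current syllable) for each adjacent pair in the stream."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

def segment(syllables, tps, threshold=0.7):
    """Posit a word boundary wherever transitional probability falls below threshold."""
    words, current = [], [syllables[0]]
    for a, b in zip(syllables, syllables[1:]):
        if tps[(a, b)] < threshold:
            words.append("".join(current))
            current = []
        current.append(b)
    words.append("".join(current))
    return words

# A stream built from three made-up "words" (bidaku, padoti, golabu) in varied order:
# transitions inside a word are perfectly predictable, transitions across words are not.
stream = ("bi da ku pa do ti go la bu bi da ku go la bu "
          "pa do ti bi da ku pa do ti go la bu").split()
print(segment(stream, transitional_probabilities(stream)))
# → ['bidaku', 'padoti', 'golabu', 'bidaku', 'golabu', 'padoti', 'bidaku', 'padoti', 'golabu']
```

Infants in these experiments appear sensitive to exactly this statistic: within-word transitional probabilities are high, while probabilities spanning a word boundary are markedly lower.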

Chunking

Chunking theories of language acquisition constitute a group of theories related to statistical learning theories in that they assume that the input from the environment plays an essential role; however, they postulate different learning mechanisms. The central idea of these theories is that language development occurs through the incremental acquisition of meaningful chunks of elementary constituents, which can be words, phonemes, or syllables. Recently, this approach has been highly successful in simulating several phenomena in the acquisition of syntactic categories and the acquisition of phonological knowledge. The approach has several features that make it unique: the models are implemented as computer programs, which enables clear-cut and quantitative predictions to be made; they learn from naturalistic input, made of actual child-directed utterances; they produce actual utterances, which can be compared with children’s utterances; and they have simulated phenomena in several languages, including English, Spanish, and German.
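The incremental chunk-acquisition idea can be sketched in miniature. This is not a reimplementation of any specific published model; the merge threshold, the greedy longest-match parse, and the toy utterances are invented for illustration:

```python
from collections import Counter

class ChunkLearner:
    """Toy incremental chunker: frequently adjacent chunks merge into larger chunks."""
    def __init__(self, threshold=2):
        self.chunks = set()           # multi-unit chunks learned so far
        self.pair_counts = Counter()  # co-occurrence counts of adjacent chunks
        self.threshold = threshold    # illustrative merge threshold

    def parse(self, units):
        """Greedy longest-match parse of an utterance into known chunks."""
        out, i = [], 0
        while i < len(units):
            for j in range(len(units), i, -1):
                cand = tuple(units[i:j])
                if len(cand) == 1 or cand in self.chunks:
                    out.append(cand)
                    i = j
                    break
        return out

    def learn(self, units):
        parsed = self.parse(units)
        for a, b in zip(parsed, parsed[1:]):
            self.pair_counts[(a, b)] += 1
            if self.pair_counts[(a, b)] >= self.threshold:
                self.chunks.add(a + b)  # concatenate into a larger chunk
        return parsed

learner = ChunkLearner()
for _ in range(3):
    learner.learn(["the", "dog", "ran"])
print(sorted(learner.chunks))
# → [('dog', 'ran'), ('the', 'dog')]
```

After repeated exposure the learner parses the utterance with its acquired multi-word chunks rather than word by word, mirroring the incremental build-up of larger constituents that chunking theories describe.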

Researchers at the Max Planck Institute for Evolutionary Anthropology have developed a computer model analyzing early toddler conversations to predict the structure of later conversations. They showed that toddlers develop their own individual rules for speaking with slots into which they could put certain kinds of words. A significant outcome of the research was that rules inferred from toddler speech were better predictors of subsequent speech than traditional grammars.
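The reported finding, that item-based rules with open slots predict later speech, can be illustrated with a toy sketch. The two-word utterances and the frame-plus-slot representation are invented for illustration; the actual model is far more sophisticated:

```python
from collections import defaultdict

# Invented two-word child utterances: a fixed frame word plus a filler.
utterances = [("want", "milk"), ("want", "ball"), ("more", "milk"),
              ("want", "juice"), ("more", "juice")]

# Infer item-based "slot" rules: each frame word accumulates its observed fillers.
slots = defaultdict(set)
for frame, filler in utterances:
    slots[frame].add(filler)

# A slot rule licenses novel combinations: any frame with any attested filler.
fillers = set().union(*slots.values())
novel = {(frame, f) for frame in slots for f in fillers} - set(utterances)
print(sorted(novel))
# → [('more', 'ball')]
```

The predicted novel utterance ("more ball") was never heard in the input, which is the sense in which slot rules inferred from earlier speech can predict subsequent speech.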

Vocabulary acquisition

The capacity to learn the pronunciation of new words depends on the capacity for speech repetition. Children with a reduced ability to repeat nonwords (a marker of speech-repetition ability) show a slower rate of vocabulary expansion than children for whom this is easy. It has been proposed that the elementary units of speech have been selected to enhance the ease with which sound and visual input can be mapped onto motor vocalization. Several computational models of vocabulary acquisition have been proposed.

Meaning

Children learn on average 10 to 15 new word meanings each day, but only about one of these can be accounted for by direct instruction. The remaining nine to fourteen must be picked up in some other way. It has been proposed that children acquire these meanings through processes modeled by latent semantic analysis: when they meet an unfamiliar word, children can use information in its context to correctly guess its rough area of meaning.
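The latent-semantic-analysis idea, that a word's meaning can be estimated from the contexts it shares with other words, can be sketched with a truncated singular value decomposition over a tiny invented corpus. The sentences and the choice of two dimensions are illustrative assumptions:

```python
import numpy as np

# Tiny invented corpus: each "document" is one context sentence.
docs = [
    "the cat chased the mouse",
    "the dog chased the cat",
    "the mouse ate the cheese",
    "the dog ate the bone",
]
vocab = sorted({w for d in docs for w in d.split()})
idx = {w: i for i, w in enumerate(vocab)}

# Term-document count matrix: rows are words, columns are contexts.
X = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for w in d.split():
        X[idx[w], j] += 1

# Truncated SVD projects each word into a low-dimensional "semantic" space.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
word_vecs = U[:, :k] * s[:k]

def similarity(w1, w2):
    """Cosine similarity between two words in the reduced space."""
    a, b = word_vecs[idx[w1]], word_vecs[idx[w2]]
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Words that occur in overlapping contexts end up with similar vectors.
print(round(similarity("cat", "dog"), 3), round(similarity("cat", "cheese"), 3))
```

The point of the reduced space is that a word can end up near words it never directly co-occurred with, which is how the mechanism lets a child locate an unfamiliar word's rough area of meaning from context alone.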

Neurocognitive research

According to several linguists, neurocognitive research has confirmed many principles of language learning, such as: “learning engages the entire person (cognitive, affective, and psychomotor domains), the human brain seeks patterns in its searching for meaning, emotions affect all aspects of learning, retention and recall, past experience always affects new learning, the brain’s working memory has a limited capacity, lecture usually results in the lowest degree of retention, rehearsal is essential for retention, practice [alone] does not make perfect, and each brain is unique” (Sousa, 2006, p. 274). In terms of genetics, the gene ROBO1 has been associated with phonological buffer integrity or length.
