B. F. Skinner (1904-1990) was an American psychologist who continued the development of behaviorism, building on the work of Pavlov, Thorndike, Watson, Tolman, Guthrie, Hull, and others.
By the time of Skinner’s death in 1990, behaviorism had long since ceased to be the dominant force in psychology. Its influence remains strong today, however, in applications to special education, mental health, developmental disabilities, and animal behavior, and Skinner was a major contributor to all of these developments.
Skinner argued that psychology is the study of behavior--more specifically, the study of the prediction and control of behavior. Unlike Tolman, Hull, and other behaviorists, Skinner rejected the need for any intervening variables, calling them explanatory fictions. For example, to say that a rat runs a maze because he is hungry, that a child explores and solves a puzzle because she is curious, or that a person behaves badly toward a minority group because he or she is prejudiced explains nothing. In Skinner’s model, events such as hunger, curiosity, thinking, physiological events, and attitudes are private events. They do not cause behavior; rather, they are themselves subject to the same laws of learning as public behavioral events, such as talking, running, and singing.
To achieve the prediction and control of private and public events, a psychologist must conduct a careful experimental analysis of behavior: systematic observation and measurement of the behavior and of the environmental conditions present just before and just after the behavior occurs.
Like other behavioral models, Skinner’s model relies heavily on the concepts of S-R contiguity and association. However, Skinner’s model adds another critical concept: the consequences of behavior. An organism’s behavior is strengthened or weakened by the immediate consequences of that behavior. When the immediate consequences are positive, such as obtaining food or praise, the behavior is strengthened (positive reinforcement). Behavior is also strengthened when a response removes an aversive condition (negative reinforcement). When the consequences of behavior are negative, the behavior is weakened (punishment). When behavior that was previously reinforced is no longer reinforced, that behavior weakens (extinction).
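These four consequence operations can be summarized in a small illustrative sketch. The code below is a toy model, not anything from Skinner: it simply treats response strength as a number between 0 and 1 and nudges it up or down according to the type of consequence. The function name update_strength and every numerical value are arbitrary assumptions made for illustration.

```python
def update_strength(strength: float, consequence: str) -> float:
    """Adjust a toy 0-1 response strength after one consequence (illustrative values)."""
    if consequence == "positive_reinforcement":   # pleasant stimulus delivered
        return min(1.0, strength + 0.10)
    if consequence == "negative_reinforcement":   # aversive condition removed by the response
        return min(1.0, strength + 0.10)
    if consequence == "punishment":               # aversive consequence delivered
        return max(0.0, strength - 0.10)
    if consequence == "extinction":               # expected reinforcement withheld
        return max(0.0, strength - 0.05)
    raise ValueError(f"unknown consequence: {consequence}")

strength = 0.50
for c in ("positive_reinforcement", "negative_reinforcement",
          "punishment", "extinction"):
    strength = update_strength(strength, c)
    print(f"{c:25s} -> strength {strength:.2f}")
```

Note that both kinds of reinforcement raise the strength, while punishment and extinction lower it, which is the essential distinction the paragraph above draws.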
Skinner called this type of conditioning operant conditioning, because behavior operates on the environment and produces consequences. He distinguished this from Pavlov’s classical conditioning, which Skinner called respondent conditioning. In classical conditioning, the organism responds to stimuli, but does not operate on the environment to cause consequences.
Behavior can be systematically strengthened or weakened by the careful application or withholding of positive reinforcement and punishment. Skinner’s research examined the effects of varying the pattern, or schedule, of reinforcement. Reinforcement can be continuous or intermittent, and it can be delivered on fixed or variable schedules defined either by elapsed time or by the number of responses. Each pattern is associated with differences in how quickly learning occurs, how much reinforcement is necessary, and how resistant to extinction the behavior becomes.
Suppose a parent wants to shape a certain behavior in a child, such as hand washing before dinner. To establish the behavior quickly, the parent reinforces every instance of correct hand washing (continuous reinforcement). Once the behavior is established, it must be made resistant to extinction, so the parent gradually thins the reinforcement to perhaps one reinforcement for every several correct hand washing responses (partial or intermittent reinforcement). This strategy optimizes the learning process, because continuous reinforcement leads to faster learning, whereas intermittent reinforcement increases resistance to extinction.
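The trade-off described in this example (faster acquisition under continuous reinforcement, greater resistance to extinction under intermittent reinforcement) can be made concrete with a small simulation. The sketch below is only an illustration under stated assumptions, not Skinner’s model or data: it assumes that each reinforced response adds a fixed amount to response strength and that extinction erodes strength more quickly when past reinforcement was denser (a rough stand-in for why intermittent schedules resist extinction). The function simulate and all numerical values are hypothetical.

```python
import random

def simulate(p_reinforce: float, seed: int = 1) -> tuple[int, int]:
    """Return (trials to acquire the response, responses emitted during extinction)."""
    rng = random.Random(seed)
    strength, trials_to_acquire = 0.1, 0
    while strength < 0.9:                          # acquisition phase
        trials_to_acquire += 1
        if rng.random() < p_reinforce:             # was this response reinforced?
            strength = min(1.0, strength + 0.05)   # assumed fixed increment
    responses_in_extinction = 0
    while strength > 0.1 and responses_in_extinction < 500:   # extinction phase
        strength -= 0.02 * p_reinforce             # assumed decay: denser past
        responses_in_extinction += 1               # reinforcement, faster extinction
    return trials_to_acquire, responses_in_extinction

for label, p in [("continuous (p=1.00)", 1.00), ("intermittent (p=0.25)", 0.25)]:
    acquired, persisted = simulate(p)
    print(f"{label:22s} acquired in {acquired:3d} trials; "
          f"persisted {persisted:3d} responses in extinction")
```

Run as written, the continuous schedule reaches the acquisition criterion in far fewer trials, while the intermittently reinforced response keeps going much longer once reinforcement stops, mirroring the parent’s two-stage strategy.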
In applying operant conditioning to education and child rearing, Skinner advised that positive reinforcement is far more effective than punishment. Indeed, punishment can produce disruptive behavior and other negative side effects, such as anxiety and fear in the child.
Skinner placed heavy emphasis on the environment: behavior, human or animal, is shaped by its environmental consequences. Like Watson, Skinner described a society organized along behavioral principles, an operant utopia governed by behavioral engineering (Walden Two, 1948). His books Verbal Behavior (1957) and Beyond Freedom and Dignity (1971) applied his operant model to complex human behavior.
Skinner and Watson had another similarity: their ideas generated considerable discussion and public controversy. Skinner believed that there is no individual freedom in our society; the environment always controls people. In essence, the lives of ordinary people are under the control of contingencies (the environmental reinforcing stimuli and patterns of reinforcement) created by the actions of government, employers, industry, traditional beliefs, religion, and so on. People do not control these contingencies; the contingencies control people. Personal freedom and self-determination are fictions.
To get out from under the control of contingencies created by others, people need to recognize how control works, identify the controlling contingencies, and then operate on the environment to change them. Not long before his death, Skinner observed that perhaps we have gone too far in allowing ourselves to become so deeply and completely controlled, and have built up too many fictions, such as the belief that we are free. He despaired that it may now be too late, that we might never change those contingencies but have instead adopted the fictions and ignored the realities.
Skinner dominated much of psychology from the 1940s well into the 1970s and remained influential right up to his death in 1990. The implications of his ideas and his development of behavioral technology have had an enormous impact in many applied areas.