By Robert Forto
Skinner, Keller, and Schoenfeld
B.F. Skinner (1904-1990) continued the work that Thorndike started. He was the leading advocate of a more modern version of Thorndike’s Law of Effect, which states, “The frequency of a behavior increases or decreases according to the result it [the behavior] produces.”
When Skinner was pursuing his doctorate at Harvard University, he discovered that he could methodically change the behavior of lab rats by rewarding them with food. This study proceeded in the following stages:
“First, the rat was rewarded simply for facing the correct end of the cage. Next, the rat was rewarded only when it stood next to the lever. Later stages delayed the reward until the rat touched the lever with its body. Eventually the rat learned it had to press the lever to receive a pellet of food.”
Skinner’s viewpoints were unique in that he felt the proper study of behavior should be limited to “observable events” rather than how the subject might think. He consistently argued against making interpretations based on events that could not be observed. Skinner did not discuss intervening variables, such as hunger or thirst, when interpreting behavioral learning.
In 1938, B.F. Skinner published The Behavior of Organisms (New York: D. Appleton-Century Co.). Many consider this milestone work the leading authority on the science of operant conditioning. Today many dog trainers use clickers for training canines; clickers are conditioned reinforcers that have been used by conditioning experts since the 1940s. Skinner wrote about clickers, which he called “crickets,” in a 1951 paper called How to Teach Animals.
While on the faculty of the University of Minnesota, Skinner expanded his study of operant conditioning principles to include pigeons. He was studying a phenomenon known as extinction when it occurred to him to ask himself, “Are theories of learning necessary?” As previously discussed, Skinner felt the study of behavior should be limited to events that were observable and measurable. Skinner maintained that the science of behavior should deal with behavior in its relation to variables that could be systematically manipulated.
Skinner was a leading critic of expectancy theory; it was his contention that explaining learning in terms of expectancy merely restated the problem. He wrote, “When we assert that an animal acts in a given way because it expects to receive food [or any reinforcers], then what began as the task of accounting for a learned behavior becomes the task of accounting for expectancy.” Skinner is also partially credited with moving the science of operant conditioning beyond the lab and toward a viable technology for changing behavior.
Fred S. Keller (1899-1996) is well known for his work on a teaching method known as the Personalized System of Instruction (PSI). Keller was a classmate and lifelong friend of B. F. Skinner. While it is true that Skinner ultimately wound up on the faculty at Harvard, whereas Keller taught at Columbia, they remained colleagues throughout their lives.
In 1947, Fred Keller teamed up with William Schoenfeld (1915-1996) at Columbia University and began to teach the first college psychology course employing Skinner’s methods. Undergraduate students taught rats to respond to stimuli in order to obtain reinforcement. In 1950, Keller and Schoenfeld published the first text in the emerging field of operant conditioning, entitled Principles of Psychology.
If you have any questions or comments we would love to hear from you at firstname.lastname@example.org