5 Steps to a 5 AP Psychology, 2010-2011 Edition - Laura Lincoln Maitland
Reinforcing behavior only some of the time, called partial reinforcement or an intermittent schedule, maintains behavior better than continuous reinforcement. Partial reinforcement schedules based on the number of desired responses are ratio schedules; schedules based on time are interval schedules.

Fixed ratio schedules reinforce the desired behavior after a specific number of responses have been made. For example, every three times a rat presses a lever in a Skinner box, it gets a food pellet.

Fixed interval schedules reinforce the first desired response made after a specific length of time has passed. They produce a burst of behavior as the time for reinforcement approaches, but little behavior just after reinforcement, until the next opportunity draws near. For example, the night before an elementary school student takes her weekly spelling test, she studies her spelling words, but not the night after (see Figure 10.2).

In a variable ratio schedule, the number of responses needed before reinforcement occurs changes at random around an average. For example, if another of your flashlights works only after several clicks rather than lighting on the first click, you keep clicking it again and again; because your expectation for this flashlight is different, you are more likely to keep emitting the clicking behavior. Slot machines in gambling casinos work the same way: gamblers will pull the lever hundreds of times because the anticipation of the next reward stays strong.

On a variable interval schedule, the amount of time that must elapse before a response is reinforced varies around an average. For example, if your French teacher gives pop quizzes, you never know when to expect one, so you study every night.
Figure 10.2 Partial reinforcement schedules.
fixed ratio schedule—know how much behavior for reinforcement
fixed interval schedule—know when behavior is reinforced
variable ratio schedule—how much behavior for reinforcement changes
variable interval schedule—when behavior is reinforced changes
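The four schedules above can be read as simple decision rules for when a response earns a reward. As an illustrative sketch (the function names, the simulated clock, and all parameters are invented here, not part of the text), in Python:

```python
import random

def fixed_ratio(n):
    """Reinforce every n-th response (e.g. a pellet per 3 lever presses)."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True       # reinforcement delivered
        return False
    return respond

def variable_ratio(mean_n):
    """Reinforce after a random number of responses averaging mean_n."""
    count, target = 0, random.randint(1, 2 * mean_n - 1)
    def respond():
        nonlocal count, target
        count += 1
        if count >= target:
            count, target = 0, random.randint(1, 2 * mean_n - 1)
            return True
        return False
    return respond

def fixed_interval(period, clock):
    """Reinforce the first response after `period` time units have elapsed."""
    last = clock()
    def respond():
        nonlocal last
        if clock() - last >= period:
            last = clock()
            return True
        return False
    return respond

def variable_interval(mean_period, clock):
    """Reinforce the first response after a random delay averaging mean_period."""
    last, wait = clock(), random.uniform(0, 2 * mean_period)
    def respond():
        nonlocal last, wait
        if clock() - last >= wait:
            last, wait = clock(), random.uniform(0, 2 * mean_period)
            return True
        return False
    return respond

# A rat on a fixed ratio 3 schedule: every third press earns a pellet.
fr = fixed_ratio(3)
presses = [fr() for _ in range(6)]
print(presses)   # [False, False, True, False, False, True]
```

Notice how the ratio rules count responses while the interval rules consult a clock, which is exactly the distinction the schedule names encode.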
Superstitious Behavior
Have you ever wondered how people develop superstitions? B. F. Skinner accounted for the development of superstitious behaviors in partial reinforcement schedule experiments he performed with pigeons. He found that if food pellets were delivered when a pigeon was performing some idiosyncratic behavior, the pigeon would tend to repeat the behavior to get more food. If food pellets were again delivered when the pigeon repeated the behavior, the pigeon would tend to repeat the behavior over and over, thus indicating the development of “superstitious behavior.” Although there was a correlation between the idiosyncratic behavior and the appearance of food, there was no causal relationship between the superstitious behavior and delivery of the food to the pigeon. But the pigeons acted as if there were. People who play their “lucky numbers” when they gamble or wear their “lucky jeans” to a test may have developed superstitions from the unintended reinforcement of unimportant behavior, too.
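Skinner's accidental-reinforcement account can be modeled in a few lines: food arrives on a timer regardless of what the pigeon is doing, yet whichever behavior happens to coincide with the food has its selection weight boosted, so it is emitted more and more often. This is a toy sketch with invented names and parameters, not Skinner's actual procedure:

```python
import random

def superstition_experiment(behaviors, steps=300, food_every=15, boost=1.0):
    """Food is delivered every `food_every` time steps, independent of
    behavior. The behavior emitted at that moment is accidentally
    'reinforced': its weight grows, making it more likely in the future."""
    weights = {b: 1.0 for b in behaviors}
    for t in range(1, steps + 1):
        # The pigeon emits one behavior, chosen in proportion to its weight.
        emitted = random.choices(list(weights), weights=list(weights.values()))[0]
        if t % food_every == 0:          # non-contingent reinforcement
            weights[emitted] += boost    # accidental strengthening
    return weights

weights = superstition_experiment(["turn", "peck", "hop"])
print(weights)   # one arbitrary behavior tends to dominate
```

The behavior that ends up dominating differs from run to run, mirroring the idiosyncratic rituals Skinner observed: the food delivery never depends on the behavior, but the feedback loop makes it look as if it did.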
Cognitive Processes in Learning
John B. Watson and B. F. Skinner typified behaviorists. They studied only behaviors they could observe and measure—the ABCs of behavior: antecedents, observable behaviors, and their consequences. They disregarded thought processes because they could not observe or measure them. They considered learned behaviors the result of nurture (the environment).
The Contingency Model
Cognitivists interpret classical and operant conditioning differently. Beyond making associations between stimuli and learning from rewards and punishment, cognitive theorists believe that humans