Variable Interval Schedule of Reinforcement

In operant conditioning, a variable-interval schedule is a schedule of reinforcement where a response is rewarded after an unpredictable amount of time has passed. This schedule produces a moderate, steady rate of response.
As you probably recall, operant conditioning can either strengthen or weaken behaviors through the use of reinforcement and punishment. This learning process involves forming an association between a behavior and the consequences of that action.
Psychologist B.F. Skinner is credited with introducing the concept of operant conditioning. He observed that reinforcement could be used to strengthen a behavior and punishment could be used to weaken one. He also noted that the rate at which a behavior was reinforced affected both the strength and frequency of the response.
How Does a Variable-Interval Schedule Work?
To understand how a variable-interval schedule works, let's start by taking a closer look at the term itself. Schedule refers to the rate of reinforcement delivery, or how frequently the reinforcement is given. Variable indicates that this timing is not consistent and may vary from one trial to the next. Finally, interval means that delivery is controlled by time. So, a variable-interval schedule means that reinforcement is delivered at varying and unpredictable intervals of time.
Imagine that you are training a pigeon to peck at a key to receive a food pellet. You put the bird on a variable-interval 30 (VI-30) schedule. This means that the pigeon will receive reinforcement an average of every 30 seconds. It is important to note that this is an average, however. Sometimes the pigeon might be reinforced after 10 seconds; sometimes it might have to wait 45 seconds. The key is that the timing is unpredictable.
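To make the timing concrete, here is a minimal Python sketch of a VI-30 session. It assumes the unpredictable intervals are drawn from an exponential distribution with a 30-second mean (one common way to generate a variable-interval schedule; real laboratory schedules often use a preset list of intervals instead) and that the pigeon pecks at random moments. The function name and parameter values are purely illustrative.

    import random

    def simulate_vi_schedule(mean_interval=30.0, session_length=600.0,
                             peck_rate=0.5, seed=42):
        # mean_interval:  average seconds between reinforcer setups (VI-30 -> 30.0)
        # session_length: total simulated session time, in seconds
        # peck_rate:      average pecks per second (pecks occur at random times)
        rng = random.Random(seed)
        t = 0.0
        pecks = 0
        reinforcers = 0
        # Time at which the next reinforcer becomes available. Exponential
        # draws average mean_interval seconds, but any single interval is
        # unpredictable -- sometimes 10 seconds, sometimes 45.
        setup_time = rng.expovariate(1.0 / mean_interval)
        while True:
            # The next peck arrives after a random delay.
            t += rng.expovariate(peck_rate)
            if t >= session_length:
                break
            pecks += 1
            if t >= setup_time:
                # Only the first peck after the interval has elapsed is
                # reinforced; then a new unpredictable interval begins.
                reinforcers += 1
                setup_time = t + rng.expovariate(1.0 / mean_interval)
        return pecks, reinforcers

    pecks, reinforcers = simulate_vi_schedule()
    print(f"{pecks} pecks earned {reinforcers} reinforcers in 10 minutes")

Running the sketch shows the signature of the schedule: many pecks earn comparatively few reinforcers, and because pecking faster cannot make the next interval end sooner, steady responding is the most efficient strategy.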
Characteristics of the Variable-Interval Schedule
  • Very resistant to extinction
  • The rate of response is moderate but steady
  • Very minimal pause after reinforcement is given
Examples of Variable-Interval Schedules
  • Checking Your Email: Typically, you check your email at random times throughout the day instead of checking every time a single message is delivered. In most cases, you never know when a message will arrive, so emails roll in sporadically at unpredictable times. When you check and see that you have received a message, it acts as a reinforcer for checking your email.
  • Your Employer Checking Your Work: Does your boss drop by your office a few times throughout the day to check your progress? This is an example of a variable-interval schedule. These check-ins occur at unpredictable times, so you never know when they might happen. The chances are good that you work at a fairly steady pace throughout the day since you are never quite sure when your boss is going to pop in, and you want to appear busy and productive when she does happen to stop by. Immediately after one of these check-ins, you might briefly pause and take a short break before resuming your steady work pace.
  • Pop Quizzes: Your psychology instructor might issue periodic pop quizzes to test your knowledge and to make sure you are paying attention in class. While these quizzes occur with some frequency, you never know exactly when one might be given. One week you might end up taking two quizzes, but then go a full two weeks without one. Because you never know when you might receive a pop quiz, you will probably pay attention and stay caught up in your studies so that you are prepared.
