Operant conditioning is a learning process whereby deliberate behaviors are reinforced through consequences. It differs from classical conditioning, also called respondent or Pavlovian conditioning, in which involuntary behaviors are triggered by external stimuli.
With classical conditioning, a dog that has learned that the sound of a bell precedes the arrival of food may begin to salivate at the bell alone, even if no food arrives. By contrast, a dog might learn that, by sitting and staying, it will earn a treat. If the dog then sits and stays more reliably in order to receive the treat, that is an example of operant conditioning.
The core concept of operant conditioning is simple: when a certain deliberate behavior is reinforced, that behavior will become more common. Psychologists divide the consequences that shape behavior into four main categories, described below. Timing and frequency also matter: reinforcement is most effective when it follows the behavior promptly and on a consistent schedule.
Positive reinforcement describes the best known examples of operant conditioning: receiving a reward for acting in a certain way.
Negative reinforcement is a different but equally straightforward form of operant conditioning: it rewards a behavior by removing an unpleasant stimulus, rather than adding a pleasant one. For example, a car that stops beeping once the driver buckles the seat belt negatively reinforces buckling up.
In psychology, punishment doesn't necessarily mean what it means in casual usage. Psychology defines punishment as something done after a given deliberate action that lowers the chance of that action taking place in the future. Whereas reinforcement is meant to encourage a certain behavior, punishment is meant to discourage a certain behavior.
Just as reinforcement comes in positive and negative forms, so does punishment. Positive punishment adds an unpleasant stimulus after an undesired behavior. Negative punishment removes a pleasant stimulus when an undesired behavior is performed; for example, a parent may take a favorite toy away from a child who is misbehaving.
Psychology defines extinction as the loss of conditioning over time when the conditioning stimuli are no longer present. Over time, an animal (or person) will become less conditioned unless the stimuli that conditioned them in the first place are reapplied.
Burrhus Frederic Skinner was a psychologist and researcher credited with establishing the principles of operant conditioning. He began with Thorndike's law of effect, which states that behaviors that cause satisfactory results will be repeated. Skinner considered satisfaction to be insufficiently specific to measure, and set out to design a means of measuring learned behaviors instead.
The operant conditioning chamber, popularly known as a Skinner box, was his solution. He kept his test subjects, primarily pigeons and rats, in circumstances that allowed him to closely observe their behavior. He would isolate the animal and every time the animal performed a defined behavior, like pushing a lever, it'd be rewarded with food. When the animal began to reliably push the lever, he'd know it had been conditioned.
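The conditioning loop described above can be sketched as a toy simulation: each rewarded lever press makes the next press a little more likely, until the behavior becomes reliable. This is an illustrative model only; the function name, learning rate, and starting probability are assumptions, not values from Skinner's experiments.

```python
import random

def run_skinner_box(trials=200, learning_rate=0.05, seed=0):
    """Toy model of positive reinforcement in a Skinner box.

    Each time the simulated animal presses the lever, the press is
    rewarded, and the probability of pressing again is nudged upward.
    """
    rng = random.Random(seed)
    press_prob = 0.1  # assumed initial chance of pressing the lever
    history = []
    for _ in range(trials):
        pressed = rng.random() < press_prob
        if pressed:
            # Reward the behavior: pressing becomes more likely next time.
            press_prob = min(1.0, press_prob + learning_rate * (1.0 - press_prob))
        history.append(press_prob)
    return history

history = run_skinner_box()
print(f"press probability: start={history[0]:.2f}, end={history[-1]:.2f}")
```

Over enough trials the press probability climbs toward certainty, which is the simulated analogue of the animal being "conditioned." Withholding the reward would leave the probability flat, and adding a decay term when rewards stop would model extinction.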
Skinner extended these principles to human behavior, and his work came to represent the school of psychology called behaviorism. Behaviorism defined much of psychology for the second half of the 20th century, and it has since been integrated with other psychological perspectives.
It can be uncomfortable to talk about human behavior in the clinical language of psychology. That said, operant conditioning describes a simple phenomenon that happens in every part of life. It's just one of the mechanisms by which people learn. It's vital to understand how that mechanism works to make sure it works best for you.
For more on the science behind conditioning, check out our article on Examples of Behaviorism. It's the school of psychology that focuses on observable behavior, rather than emotions or motives, to explain how and why people do the things they do.