B. F. Skinner (1904–1990) is often referred to as the father of operant conditioning, and his work is frequently cited in connection with this topic. His 1938 book "The Behavior of Organisms: An Experimental Analysis" [6] initiated his lifelong study of operant conditioning and its application to human and animal behavior.
To study operant conditioning, he invented the operant conditioning chamber (also known as the Skinner box), [8] and to measure response rate he invented the cumulative recorder. Using these tools, he and Charles Ferster produced Skinner's most influential experimental work, outlined in their 1957 book Schedules of Reinforcement.
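The schedules described above are operationally simple rules for when a response is reinforced. As a minimal sketch, the following code simulates a fixed-ratio (FR) schedule and the cumulative record a recorder would trace; the function name, the response model, and all parameters are illustrative assumptions, not Skinner's apparatus or data.

```python
# Hypothetical sketch: a fixed-ratio (FR) schedule, where every n-th
# response earns a reinforcer, together with the running response count
# that a cumulative recorder would plot against time.

def cumulative_record(presses, ratio):
    """Return (cumulative_responses, reinforcers_delivered) under FR-ratio."""
    cumulative = []
    reinforcers = 0
    for i in range(1, presses + 1):
        cumulative.append(i)          # the pen steps up once per response
        if i % ratio == 0:            # FR-n: every n-th response is reinforced
            reinforcers += 1
    return cumulative, reinforcers

record, rewards = cumulative_record(presses=20, ratio=5)  # an FR-5 schedule
print(rewards)  # 4 reinforcers over 20 responses
```

On a real cumulative record, the slope of the line is the response rate; steeper segments mean faster responding, and schedule type shapes that slope in characteristic ways.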
The operant conditioning chamber was created by B. F. Skinner while he was a graduate student at Harvard University. The chamber can be used to study both operant conditioning and classical conditioning. [1] [2] Skinner created the operant conditioning chamber as a variation of the puzzle box originally created by Edward Thorndike. [3]
The experimental analysis of behavior is a science that studies the behavior of individuals across a variety of species. A key early scientist was B. F. Skinner, who discovered operant behavior, reinforcers, secondary reinforcers, contingencies of reinforcement, stimulus control, shaping, intermittent schedules, discrimination, and generalization.
B. F. Skinner, the American psychologist credited with developing the modern understanding of operant conditioning, is associated with the study of instinctive drift. Skinner was a behaviorist inspired by John Watson's philosophy of behaviorism. [5] He was captivated by the idea of systematically controlling behavior to produce desirable or beneficial outcomes.
People know him only for discovering operant conditioning, schedules of reinforcement, and for books like Walden Two, Verbal Behavior, Beyond Freedom and Dignity, and more. Even the B.F. Skinner Foundation fails to put a missile on its hat, so thank you for finally putting the record straight." [5]
B. F. Skinner first identified and described the principles of operant conditioning that are used in clicker training. [6] [7] Two students of Skinner's, Marian Kruse and Keller Breland, worked with him researching pigeon behavior and training projects during World War II, when pigeons were taught to "bowl" (push a ball with their beaks). [8]
In his autobiography, B. F. Skinner noted how he accidentally discovered the extinction of an operant response due to the malfunction of his laboratory equipment: My first extinction curve showed up by accident. A rat was pressing the lever in an experiment on satiation when the pellet dispenser jammed.
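The extinction curve Skinner describes is a decline in responding once reinforcement stops arriving. As a rough illustration under an assumed exponential-decay model (the decay rate, bin size, and function name are all hypothetical, not fitted to any real data), the jammed dispenser can be sketched as:

```python
# Hypothetical sketch of an extinction curve: after the dispenser jams,
# no responses are reinforced, and the response rate per time bin decays
# toward zero. The exponential model and its parameters are assumptions
# chosen only to show the characteristic declining shape.

def extinction_curve(initial_rate, decay, steps):
    """Response rate per time bin after reinforcement is withheld."""
    rates = []
    rate = initial_rate
    for _ in range(steps):
        rates.append(rate)
        rate *= decay            # each bin, responding declines by (1 - decay)
    return rates

curve = extinction_curve(initial_rate=30.0, decay=0.8, steps=5)
# rates decline monotonically: 30.0, 24.0, 19.2, ...
```

Real extinction curves are noisier and often show a brief burst of responding before the decline, but the overall trend is the monotone falloff modeled here.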