Associative Learning Example Psychology Personal Statement

Extinction is observed in both operantly conditioned and classically conditioned behavior. When operant behavior that has previously been reinforced no longer produces reinforcing consequences, the behavior gradually stops occurring.[1] In classical conditioning, when a conditioned stimulus is presented alone, so that it no longer predicts the coming of the unconditioned stimulus, conditioned responding gradually stops. For example, after Pavlov's dog was conditioned to salivate at the sound of a metronome, it eventually stopped salivating once the metronome had been sounded repeatedly without food following. Many anxiety disorders, such as posttraumatic stress disorder, are believed to reflect, at least in part, a failure to extinguish conditioned fear.[2]


The dominant account of extinction involves associative models. However, there is debate over whether extinction involves simply "unlearning" the conditioned stimulus (CS)–unconditioned stimulus (US) association (e.g., the Rescorla–Wagner account) or, alternatively, "new learning" of an inhibitory association that masks the original excitatory association (e.g., the Konorski and Pearce–Hall accounts). A third account concerns non-associative mechanisms such as habituation, modulation, and response fatigue. Laboratory work on fear extinction in rodents by Myers and Davis has suggested that multiple mechanisms may be at work, depending on the timing and circumstances in which extinction occurs.[3]
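The "unlearning" account can be made concrete with the Rescorla–Wagner update rule, under which associative strength V moves toward the current US magnitude on every trial. The following Python sketch is illustrative only; the learning-rate value and trial counts are assumptions, not parameters from any cited study:

```python
def rescorla_wagner(trials, alpha_beta=0.3):
    """Track associative strength V across trials.

    Each element of `trials` is lambda, the US magnitude on that trial:
    1.0 when the US follows the CS (acquisition), 0.0 when the CS is
    presented alone (extinction). Update rule: dV = alpha_beta * (lambda - V).
    """
    v = 0.0
    history = []
    for lam in trials:
        v += alpha_beta * (lam - v)
        history.append(v)
    return history

# 20 CS-US pairings, then 20 CS-alone extinction trials
curve = rescorla_wagner([1.0] * 20 + [0.0] * 20)
# V climbs toward 1.0 during acquisition and decays toward 0.0 during
# extinction: on this account, extinction erases the same association
# that acquisition built, rather than adding a new inhibitory one.
```

Phenomena such as spontaneous recovery sit awkwardly with this pure-erasure reading, which is one reason "new learning" accounts remain in play.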

Given the competing accounts and the difficulty of distinguishing them by behavioral observation alone, researchers have turned to investigations at the cellular level (most often in rodents) to tease apart the specific brain mechanisms of extinction, in particular the role of brain structures (the amygdala, hippocampus, and prefrontal cortex) and of specific neurotransmitter systems (e.g., GABA, NMDA).[3] A study in rodents by Amano, Unal, and Paré published in Nature Neuroscience found that extinction of a conditioned fear response is correlated with synaptic inhibition in the fear output neurons of the central amygdala that project to the periaqueductal gray, which controls freezing behavior. They infer that this inhibition derives from the ventromedial prefrontal cortex and suggest promising cellular-level targets for new treatments of anxiety.[4]

Operant conditioning

In the operant conditioning paradigm, extinction refers to the process of no longer providing the reinforcement that has been maintaining a behavior. Operant extinction differs from forgetting, in that the latter refers to a decrease in the strength of a behavior over time when it has not been emitted.[5] For example, a child who climbs under his desk, a response that has been reinforced by attention, is subsequently ignored until the attention-seeking behavior no longer occurs. In his autobiography, B. F. Skinner noted how he accidentally discovered the extinction of an operant response when his laboratory equipment malfunctioned:

My first extinction curve showed up by accident. A rat was pressing the lever in an experiment on satiation when the pellet dispenser jammed. I was not there at the time, and when I returned I found a beautiful curve. The rat had gone on pressing although no pellets were received. ... The change was more orderly than the extinction of a salivary reflex in Pavlov's setting, and I was terribly excited. It was a Friday afternoon and there was no one in the laboratory who I could tell. All that weekend I crossed streets with particular care and avoided all unnecessary risks to protect my discovery from loss through my accidental death.[6]

When the extinction of a response has occurred, the discriminative stimulus is then known as an extinction stimulus (SΔ, or S-delta). When an S-delta is present, the reinforcing consequence that characteristically follows a behavior does not occur. This is the opposite of a discriminative stimulus, which signals that reinforcement will occur. For instance, in an operant chamber, if food pellets are delivered only when a response is emitted in the presence of a green light, the green light is a discriminative stimulus. If food is not delivered when a red light is present, the red light is an extinction stimulus (food here being used as an example of a reinforcer).

Successful extinction procedures

For extinction to work effectively, it must be applied consistently. Extinction is considered successful when responding in the presence of an extinction stimulus (a red light, or a teacher withholding attention from a misbehaving student, for instance) is zero. When a behavior reappears after it has undergone extinction, this is called resurgence.


While extinction, when implemented consistently over time, eventually decreases the undesired behavior, in the short term the subject might exhibit what is called an extinction burst. An extinction burst often occurs when the extinction procedure has just begun: a sudden and temporary increase in the frequency of the response, followed by its eventual decline and extinction. Novel behaviors, emotional responses, or aggressive behavior may also occur.[1]

Take, as an example, a pigeon that has been reinforced to peck an electronic button. During its training history, every time it pecked the button it received a small amount of bird seed as a reinforcer, so whenever the bird is hungry it pecks the button to receive food. If the button is then turned off, the hungry pigeon will first try pecking just as it has in the past. When no food is forthcoming, the bird will likely try again ... and again, and again. After a period of frantic activity in which its pecking yields no result, the pigeon's pecking will decrease in frequency.
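The pigeon's declining response rate can be caricatured as a response-strength value that decays geometrically with each unreinforced minute. This is a toy model; the starting rate and decay constant below are arbitrary assumptions:

```python
def extinction_rates(initial_rate=60.0, decay=0.85, minutes=20):
    """Pecks per minute under extinction: each minute without food
    multiplies the remaining response strength by `decay`."""
    rates = []
    rate = initial_rate
    for _ in range(minutes):
        rates.append(rate)
        rate *= decay
    return rates

rates = extinction_rates()
# the rate falls from 60 pecks/min toward zero as extinction proceeds,
# tracing the kind of smooth extinction curve Skinner describes above
```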

Although not explained by reinforcement theory, the extinction burst can be understood using control theory. In perceptual control theory, the degree of output involved in any action is proportional to the discrepancy between the reference value (desired rate of reward in the operant paradigm) and the current input. Thus, when reward is removed, the discrepancy increases, and the output is increased. In the long term, 'reorganisation', the learning algorithm of control theory, would adapt the control system such that output is reduced.
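A minimal sketch of that control-theoretic reading follows, with an assumed leaky estimate of the perceived reward rate; the gain and smoothing constants are arbitrary, and the slower 'reorganisation' process that would eventually reduce output is omitted:

```python
def control_outputs(perceived_rewards, reference=1.0, gain=2.0, smoothing=0.8):
    """Output (response vigor) is proportional to the error between the
    desired reward rate (`reference`) and a leaky estimate of the
    currently perceived reward rate."""
    outputs = []
    perceived = reference  # begin with no discrepancy
    for r in perceived_rewards:
        perceived = smoothing * perceived + (1 - smoothing) * r
        outputs.append(gain * (reference - perceived))
    return outputs

# reward present for 5 steps, then withheld (extinction begins)
out = control_outputs([1.0] * 5 + [0.0] * 15)
# output sits near zero while reward matches the reference, then climbs
# once reward is removed: the extinction burst as mounting control error
```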

The evolutionary advantage of this extinction burst is clear. In a natural environment, an animal that persists in a learned behavior even when it no longer produces immediate reinforcement might still have a chance of obtaining reinforcing consequences by trying again. Such an animal would be at an advantage over one that gives up too easily.

Despite the name, however, not every extinction burst subsides: a small minority of individuals persist in the response indefinitely.

Extinction-induced variability

Extinction-induced variability serves an adaptive role similar to the extinction burst. When extinction begins, subjects can exhibit variations in response topography (the movements involved in the response). Response topography is always somewhat variable, owing to differences in environment or idiosyncratic causes, but a subject's history of reinforcement normally keeps slight variations stable by maintaining successful variations over less successful ones. Extinction can increase these variations significantly as the subject attempts to obtain the reinforcement that previous behaviors produced. If a person attempts to open a door by turning the knob but is unsuccessful, they may next try jiggling the knob, pushing on the frame, knocking on the door, or other behaviors to get the door open. Extinction-induced variability can be exploited in shaping: desirable behaviors that emerge from this variability can be reinforced to replace problematic ones.
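One way to caricature this increase in variability is a softmax choice rule over response variants, where extinction is modeled, purely as an assumption, by a rising temperature that spreads choice probability across previously unlikely topographies:

```python
import math

def response_probs(values, temperature):
    """Softmax over learned response values (e.g., turn knob, jiggle
    knob, push frame). Higher temperature -> more variable responding."""
    weights = [math.exp(v / temperature) for v in values]
    total = sum(weights)
    return [w / total for w in weights]

values = [1.0, 0.2, 0.1]                      # trained response valued highest
reinforced = response_probs(values, 0.1)      # stable, stereotyped responding
extinction = response_probs(values, 1.0)      # variability increases
# under extinction the dominant response loses probability mass to the
# alternative topographies, mirroring extinction-induced variability
```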

Classical conditioning

Extinction learning can also occur in a classical conditioning paradigm. In this model, a neutral cue or context can come to elicit a conditioned response when it is paired with an unconditioned stimulus. An unconditioned stimulus is one that naturally and automatically triggers a certain behavioral response. A certain stimulus or environment can become a conditioned cue or a conditioned context, respectively, when paired with an unconditioned stimulus. An example of this process is a fear conditioning paradigm using a mouse. In this instance, a tone paired with a mild footshock can become a conditioned cue, eliciting a fear response when presented alone in the future. In the same way, the context in which a footshock is received, such as a chamber with certain dimensions and a certain odor, can elicit the same fear response when the mouse is placed back in that chamber in the absence of the footshock.

In this paradigm, extinction occurs when the animal is re-exposed to the conditioned cue or conditioned context in the absence of the unconditioned stimulus. As the animal learns that the cue or context no longer predicts the coming of the unconditioned stimulus, conditioned responding gradually decreases, or extinguishes.



Glutamate is a neurotransmitter that has been extensively implicated in the neural basis of learning.[7] D-Cycloserine (DCS) is a partial agonist at the NMDA glutamate receptor, and has been trialed as an adjunct to conventional exposure-based treatments based on the principle of cue extinction.

A role for glutamate has also been identified in the extinction of cocaine-associated environmental stimuli through testing in rats. Specifically, the metabotropic glutamate 5 receptor (mGlu5) is important for the extinction of a cocaine-associated context[8] and a cocaine-associated cue.[9]


Dopamine is another neurotransmitter recently implicated in extinction learning across both appetitive and aversive domains.[10] Dopamine signaling has been implicated in the extinction of conditioned fear[11][12][13][14][15] and in the extinction of drug-related learning.[16][17]


The brain region most extensively implicated in extinction learning is the infralimbic cortex (IL) of the medial prefrontal cortex (mPFC).[18] The IL is important for the extinction of reward- and fear-associated behaviors, while the amygdala has been strongly implicated in the extinction of conditioned fear.[3] The posterior cingulate cortex (PCC) and temporoparietal junction (TPJ) have also been identified as regions that may be associated with impaired extinction in adolescents.[19]

Across development

There is a strong body of evidence to suggest that extinction alters across development.[20][21] That is, extinction learning may differ during infancy, childhood, adolescence, and adulthood. During infancy and childhood, extinction learning is especially persistent, which some have interpreted as erasure of the original CS-US association,[22][23][24] but this remains contentious. In contrast, during adolescence and adulthood extinction is less persistent, which is interpreted as new learning of a CS-no US association that exists alongside, and in opposition to, the original CS-US memory.[25][26]

References


  1. ^ a b Miltenberger, R. (2012). Behavior Modification: Principles and Procedures (5th ed., pp. 87–99). Wadsworth Publishing Company.
  2. ^VanElzakker, M. B.; Dahlgren, M. K.; Davis, F. C.; Dubois, S.; Shin, L. M. (2014). "From Pavlov to PTSD: The extinction of conditioned fear in rodents, humans, and anxiety disorders". Neurobiology of Learning and Memory. 113: 3–18. doi:10.1016/j.nlm.2013.11.014. PMC 4156287. PMID 24321650. 
  3. ^ a b c Myers; Davis (2007). "Mechanisms of Fear Extinction". Molecular Psychiatry. 12: 120–150. doi:10.1038/ PMID 17160066. 
  4. ^Amano, T; Unal, CT; Paré, D (2010). "Synaptic correlates of fear extinction in the amygdala". Nature Neuroscience. 13: 489–494. doi:10.1038/nn.2499. PMC 2847017. PMID 20208529. 
  5. ^Vargas, Julie S. (2013). Behavior Analysis for Effective Teaching. New York: Routledge. p. 52. 
  6. ^B.F. Skinner (1979). The Shaping of a Behaviorist: Part Two of an Autobiography, p. 95.
  7. ^Riedel, Gernot; Platt, Bettina; Micheau, Jacques (2003-03-18). "Glutamate receptor function in learning and memory". Behavioural Brain Research. 140 (1–2): 1–47. doi:10.1016/s0166-4328(02)00272-3. ISSN 0166-4328. PMID 12644276. 
  8. ^Kim, Jee Hyun; Perry, Christina; Luikinga, Sophia; Zbukvic, Isabel; Brown, Robyn M.; Lawrence, Andrew J. (2015-05-01). "Extinction of a cocaine-taking context that protects against drug-primed reinstatement is dependent on the metabotropic glutamate 5 receptor". Addiction Biology. 20 (3): 482–489. doi:10.1111/adb.12142. ISSN 1369-1600. 
  9. ^Perry, Christina J; Reed, Felicia; Zbukvic, Isabel C; Kim, Jee Hyun; Lawrence, Andrew J (2016-01-01). "The metabotropic glutamate 5 receptor is necessary for extinction of cocaine associated cues". British Journal of Pharmacology. 173: 1085–1094. doi:10.1111/bph.13437. ISSN 1476-5381. PMID 26784278. 
  10. ^Abraham, Antony D.; Neve, Kim A.; Lattal, K. Matthew (2014-02-01). "Dopamine and extinction: A convergence of theory with fear and reward circuitry". Neurobiology of learning and memory. 108: 65–77. doi:10.1016/j.nlm.2013.11.007. ISSN 1074-7427. PMC 3927738. PMID 24269353. 
  11. ^Haaker, Jan; Lonsdorf, Tina B.; Kalisch, Raffael (2015-10-01). "Effects of post-extinction l-DOPA administration on the spontaneous recovery and reinstatement of fear in a human fMRI study". European Neuropsychopharmacology. 25 (10): 1544–1555. doi:10.1016/j.euroneuro.2015.07.016. ISSN 1873-7862. PMID 26238968. 
  12. ^Haaker, Jan; Gaburro, Stefano; Sah, Anupam; Gartmann, Nina; Lonsdorf, Tina B.; Meier, Kolja; Singewald, Nicolas; Pape, Hans-Christian; Morellini, Fabio (2013-06-25). "Single dose of L-dopa makes extinction memories context-independent and prevents the return of fear". Proceedings of the National Academy of Sciences of the United States of America. 110 (26): E2428–2436. doi:10.1073/pnas.1303061110. ISSN 1091-6490. PMC 3696794. PMID 23754384. 
  13. ^Ponnusamy, Ravikumar; Nissim, Helen A.; Barad, Mark (2005-07-01). "Systemic blockade of D2-like dopamine receptors facilitates extinction of conditioned fear in mice". Learning & Memory. 12 (4): 399–406. doi:10.1101/lm.96605. ISSN 1072-0502. PMC 1183258. PMID 16077018. 
  14. ^Zbukvic, Isabel C.; Ganella, Despina E.; Perry, Christina J.; Madsen, Heather B.; Bye, Christopher R.; Lawrence, Andrew J.; Kim, Jee Hyun (2016-03-05). "Role of Dopamine 2 Receptor in Impaired Drug-Cue Extinction in Adolescent Rats". Cerebral Cortex. 26: bhw051. doi:10.1093/cercor/bhw051. ISSN 1047-3211. PMC 4869820. PMID 26946126. 
  15. ^Madsen, Heather B.; Guerin, Alexandre A.; Kim, Jee Hyun. "Investigating the role of dopamine receptor- and parvalbumin-expressing cells in extinction of conditioned fear". Neurobiology of Learning and Memory. 145: 7–17. doi:10.1016/j.nlm.2017.08.009. 
  16. ^Abraham, Antony D.; Neve, Kim A.; Lattal, K. Matthew (2016-07-01). "Activation of D1/5 Dopamine Receptors: A Common Mechanism for Enhancing Extinction of Fear and Reward-Seeking Behaviors". Neuropsychopharmacology. 41 (8): 2072–2081. doi:10.1038/npp.2016.5. PMC 4908654. PMID 26763483. 
  17. ^Zbukvic, Isabel C.; Ganella, Despina E.; Perry, Christina J.; Madsen, Heather B.; Bye, Christopher R.; Lawrence, Andrew J.; Kim, Jee Hyun (2016-03-05). "Role of Dopamine 2 Receptor in Impaired Drug-Cue Extinction in Adolescent Rats". Cerebral Cortex. 26: bhw051. doi:10.1093/cercor/bhw051. ISSN 1047-3211. PMC 4869820. PMID 26946126. 
  18. ^Do-Monte, Fabricio H.; Manzano-Nieves, Gabriela; Quiñones-Laracuente, Kelvin; Ramos-Medina, Liorimar; Quirk, Gregory J. (2015-02-25). "Revisiting the Role of Infralimbic Cortex in Fear Extinction with Optogenetics". The Journal of Neuroscience. 35 (8): 3607–3615. doi:10.1523/JNEUROSCI.3137-14.2015. ISSN 0270-6474. PMC 4339362. PMID 25716859. 
  19. ^Ganella, Despina E.; Drummond, Katherine D.; Ganella, Eleni P.; Whittle, Sarah; Kim, Jee Hyun (2018). "Extinction of Conditioned Fear in Adolescents and Adults: A Human fMRI Study". Frontiers in Human Neuroscience. 11. doi:10.3389/fnhum.2017.00647. ISSN 1662-5161. 
  20. ^Yap, C. S.; Richardson, R. (2007). "Extinction in the developing rat: an examination of renewal effects". Developmental Psychobiology. 49: 565–575. doi:10.1002/dev.20244 – via Wiley Online Library. 
  21. ^Ganella, Despina E; Kim, Jee Hyun (2014-10-01). "Developmental rodent models of fear and anxiety: from neurobiology to pharmacology". British Journal of Pharmacology. 171 (20): 4556–4574. doi:10.1111/bph.12643. ISSN 1476-5381. PMC 4209932. PMID 24527726. 
  22. ^Kim, Jee Hyun; Richardson, Rick (2008-02-06). "The Effect of Temporary Amygdala Inactivation on Extinction and Reextinction of Fear in the Developing Rat: Unlearning as a Potential Mechanism for Extinction Early in Development". The Journal of Neuroscience. 28 (6): 1282–1290. doi:10.1523/JNEUROSCI.4736-07.2008. ISSN 0270-6474. PMID 18256248. 
  23. ^Kim, Jee Hyun; Richardson, Rick (2007). "A developmental dissociation in reinstatement of an extinguished fear response in rats". Neurobiology of Learning and Memory. 88: 48–57. doi:10.1016/j.nlm.2007.03.004. PMID 17459734. 
  24. ^Kim, Jee Hyun; Hamlin, Adam S.; Richardson, Rick (2009-09-02). "Fear Extinction across Development: The Involvement of the Medial Prefrontal Cortex as Assessed by Temporary Inactivation and Immunohistochemistry". The Journal of Neuroscience. 29 (35): 10802–10808. doi:10.1523/JNEUROSCI.0596-09.2009. ISSN 0270-6474. PMID 19726637. 
  25. ^Kim, Jee Hyun; Li, Stella; Richardson, Rick (2010-06-24). "Immunohistochemical Analyses of Long-Term Extinction of Conditioned Fear in Adolescent Rats". Cerebral Cortex. 21: bhq116. doi:10.1093/cercor/bhq116. ISSN 1047-3211. PMID 20576926. 
  26. ^Kim, Jee Hyun; Ganella, Despina E. (2015). "A Review of Preclinical Studies to Understand Fear During Adolescence". Australian Psychologist. 50 (1): 25–31. doi:10.1111/ap.12066. 

Association in psychology refers to a mental connection between concepts, events, or mental states that usually stems from specific experiences.[1] Associations are seen throughout several schools of thought in psychology including behaviorism, associationism, psychoanalysis, social psychology, and structuralism. The idea stems from Plato and Aristotle, especially with regard to the succession of memories, and it was carried on by philosophers such as John Locke, David Hume, David Hartley, and James Mill.[2] It finds its place in modern psychology in such areas as memory, learning, and the study of neural pathways.[3]

Learned associations

Associative learning occurs when a subject forms a relationship between two stimuli (such as auditory or visual cues), or between a behavior and a stimulus. The acquisition of associations is the basis for learning.[4] This learning is seen in classical and operant conditioning.

Law of Effect

Edward Thorndike did research in this area and developed the law of effect, whereby associations between a stimulus and response are affected by the consequence of the response.[5] For example, behaviors increase in strength and/or frequency when they have been followed by reward. This occurs because of an association between the behavior and a mental representation of the reward (such as food). Conversely, a negative consequence lowers the frequency of the behavior because of the negative association.[5] An example of this would be a rat in a cage with a bar lever. If pressing the lever results in a food pellet, the rat learns to press the lever to receive food. If pressing the lever instead results in an electric shock through the cage floor, the rat learns to avoid pressing it.
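The rat example can be written as a simple strength-update rule in the spirit of the law of effect; the update rate and starting probability below are illustrative assumptions, not values from the cited literature:

```python
def law_of_effect(p, rewarded, rate=0.2):
    """Update the probability of repeating a response.

    Reward moves p toward 1 (the response is 'stamped in');
    punishment moves p toward 0 (the response is 'stamped out').
    """
    if rewarded:
        return p + rate * (1.0 - p)
    return p - rate * p

p = 0.5
for _ in range(10):
    p = law_of_effect(p, rewarded=True)    # press -> food pellet
p_after_reward = p                         # pressing now highly likely
for _ in range(10):
    p = law_of_effect(p, rewarded=False)   # press -> shock
p_after_punishment = p                     # pressing suppressed again
```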

Classical conditioning

Classical conditioning is an example of a learned association. The classical conditioning process consists of four elements: unconditioned stimulus (UCS), unconditioned response (UCR), conditioned stimulus (CS), and conditioned response (CR).[1]

Without conditioning, there is already a relationship between the unconditioned stimulus and the unconditioned response. When a second stimulus is paired with the unconditioned stimulus, the response becomes associated with both stimuli. The secondary stimulus is known as the conditioned stimulus and elicits a conditioned response.[6]

The strength of the response to the conditioned stimulus increases over the period of learning, as the CS becomes associated with UCS. The strength of the response can diminish if CS is presented without UCS.[6] In his famous experiment, Pavlov used the unconditioned response of dogs salivating at the sight of food (UCS), and paired the sound of a bell (CS) with receiving food, and later the dog salivated (CR) to the bell alone, indicating that an association had been established between the bell and food.[7][8]

Operant conditioning

In operant conditioning, behaviors are changed as a result of the experienced outcomes of those behaviors. Rather than a stimulus eliciting the behavior, as in classical conditioning, associations are formed between a behavior and its consequence, extending Thorndike's law of effect.[8][9]

B. F. Skinner was well known for his studies of the effects of reinforcers on behavior. His studies included the concept of contingency, which refers to the connection between a specific action and the consequence or reinforcement that follows.[9] Skinner described three contingencies: positive reinforcement, negative reinforcement, and punishment. Reinforcements create a positive association between the action and its consequence in order to promote the continuation of the action. This is done in one of two ways: positive reinforcers introduce a rewarding stimulus, whereas negative reinforcers remove an aversive stimulus, making the environment less aversive. Punishments create a negative relationship between the action and the consequence so that the action does not continue.[9]


In the Little Albert experiment, Watson presented an infant with a white rat and repeatedly paired it with a loud noise, creating a connection between the rat and the sound. Through this experience, Little Albert came to associate a feeling of fear with the rat, a fear that generalized to other white, furry objects.[10]


Memory seems to operate as a sequence of associations: concepts, words, and opinions are intertwined, so that stimuli such as a person’s face will call up the associated name.[11][12] Understanding the relationships between different items is fundamental to episodic memory, and damage to the hippocampal region of the brain has been found to hinder learning of associations between objects.[13]

Testing associations

Associations in humans can be measured with the Implicit Association Test, a psychological test which measures the implicit (subconscious) relation between two concepts. It has been used in investigations of subconscious racial bias. The test measures the associations between different ideas, such as race and crime. Reaction time is used to distinguish associations; faster reaction time is an indicator of a stronger association.[14]
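The logic of the reaction-time comparison can be sketched as below. This is a simplified illustration, not the published IAT scoring algorithm, and the sample reaction times are invented:

```python
import statistics

def association_score(congruent_rts, incongruent_rts):
    """Mean reaction-time difference (incongruent minus congruent),
    scaled by the pooled standard deviation. A larger positive score
    suggests a stronger implicit association with the congruent pairing."""
    mean_diff = statistics.mean(incongruent_rts) - statistics.mean(congruent_rts)
    pooled_sd = statistics.stdev(congruent_rts + incongruent_rts)
    return mean_diff / pooled_sd

congruent = [650, 700, 620, 680]      # ms: faster responses in this block
incongruent = [850, 900, 820, 880]    # ms: slower responses in this block
score = association_score(congruent, incongruent)
# positive score: the congruent pairing was categorized faster,
# indicating the stronger association
```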

References



Boring, E. G. (1950) "A History of Experimental Psychology" New York, Appleton-Century-Crofts

Crisp, R. J.; Turner, R. N. (2007). "Attitude formation". In Essential Social Psychology. SAGE. p. 77. 

Gallistel, C. R. & Gibbon, J. (2002) "The Symbolic Foundations of Conditioned Behavior" Mahwah New Jersey:Erlbaum.

Gazzaniga, M. S.; Ivry, R. B. & Mangun, G. R (2009). Learning and memory. In Cognitive neuroscience: The biology of the mind. W.W. Norton. p. 312. 

Greenwald, A. G; McGhee, D. E.; Schwartz, J. L. K. (1998). "Measuring individual differences in implicit cognition: The implicit association test". Journal of Personality and Social Psychology. 74 (6): 1464–1480. doi:10.1037/0022-3514.74.6.1464. PMID 9654756. 

Klein, Stephen (2012). Learning: Principles and Applications (6 ed.). SAGE Publications. ISBN 978-1-4129-8734-9.

Shettleworth, S. J. (2010) "Cognition, Evolution and Behavior" New York, Oxford

Smith, E. E. & Kosslyn, S. M. (2007) "Cognitive Psychology: Mind and Brain", Upper Saddle River, New Jersey: Pearson/Prentice Hall

Stark, C. E. L; Bayley, P. J.; Squire, L. R. (2002). "Recognition memory for single items and for associations is similarly impaired following damage to the hippocampal region". Learning & Memory. 5 (9): 238–242. doi:10.1101/lm.51802. 

Timberlake, W (1994). "Behavior systems, associationism, and pavlovian conditioning". Psychonomic Bulletin & Review. 4 (1): 405–420. doi:10.3758/bf03210945. 

Watier, N.; Collin, C. (2012). "The effects of distinctiveness on memory and metamemory for face–name associations". Memory. 1 (20): 73–88. doi:10.1080/09658211.2011.637935. 

Notes

  1. ^ a b Klein, Stephen (2012). Learning: Principles and Applications (6 ed.). SAGE Publications. ISBN 978-1-4129-8734-9.
  2. ^Boring, E. G. (1950)
  3. ^Smith, E. E. & Kosslyn, S. M. (2007)
  4. ^Eich, Eric; Forgas, Joseph (2003). "Mood, Cognition, and Memory". In Healy, Alice; Proctor, Robert. Handbook of Psychology. 4. New Jersey: John Wiley & Sons, Inc. 
  5. ^ a b Miller, Ralph; Grace, Randolph (2003). "Conditioning and Learning". In Healy, Alice. Handbook of Psychology. 4. New Jersey: John Wiley & Sons, Inc. 
  6. ^ a b Klein, Stephen (2012). Learning: Principles and Applications (6 ed.). SAGE Publications. ISBN 978-1-4129-8734-9. 
  7. ^Timberlake, 1994
  8. ^ a b Shettleworth, S. J. (2010)
  9. ^ a b c Klein, Stephen (2012). Learning: Principles and Applications (6 ed.). SAGE Publications. ISBN 978-1-4129-8734-9. 
  10. ^Eich, Eric; Forgas, Joseph (2003). "Mood, Cognition, and Memory". In Healy, Alice; Proctor, Robert. Handbook of Psychology. 4. New Jersey: John Wiley & Sons, Inc. 
  11. ^Watier & Collin 2012
  12. ^Gazzaniga, Ivry & Mangun, 2009
  13. ^Stark, Bayley & Squire, 2002
  14. ^Greenwald, McGhee & Schwartz, 1998
