- Previous research has shown that STM for word sequences is affected by acoustic similarity between words but less affected by semantic similarity
Aims:
- To test whether LTM for word sequences is affected by acoustic and semantic similarity in the same way as STM
- To test the effect of semantic and acoustic similarity on the learning and recall of word sequences in LTM, using an interference task to prevent rehearsal between presentation and testing and so minimise STM effects
Participants:
- Males and females selected from the Applied Psychology Research Unit
- Condition A: 21 participants, 10 acoustically similar words
- Condition B: 20 participants, 10 acoustically dissimilar words
- Condition C: 16 participants, 10 semantically similar words
- Condition D: 21 participants, 10 semantically dissimilar words
Procedure:
- Participants in each of the four conditions were shown their list of 10 words, presented one at a time for 3 seconds each on a slide show
- After seeing the words, they completed a numerical interference task: three different number sequences were read aloud, and participants had 8 seconds to write down each sequence
- The participants were then given 1 minute to write down their 10 words in the order presented; the words remained visible around the room, so the task tested learning of the order in which the words appeared, not the words themselves
- Having completed this learning-and-recall cycle 4 times, they were then given another interference task, this time lasting 15 minutes
- Finally, they were given a 5th, surprise recall task
Results:
- There was only a 6 percentage-point difference in recall between the acoustically similar and dissimilar word lists
- However, there was a significant 27 percentage-point difference between the semantically similar and dissimilar words by the 5th recall task, suggesting that LTM encodes semantically
- Semantically similar words showed 59 percent accuracy by the 5th trial, whereas semantically dissimilar words showed 86 percent accuracy
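The reported gap can be sanity-checked with simple arithmetic, using only the accuracy figures stated above:

```python
# Recall accuracy on the 5th (surprise) trial, from the results above
semantically_similar = 59     # percent correct
semantically_dissimilar = 86  # percent correct

# Difference in percentage points between the two semantic conditions
difference = semantically_dissimilar - semantically_similar
print(difference)  # 27
```

This confirms the 27-point gap quoted in the results: the large drop for semantically similar lists is what points to semantic encoding in LTM.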
Conclusion:
- LTM differs from STM in how it encodes information: encoding in STM is largely acoustic, whereas encoding in LTM is largely semantic