Beaujean & Osterlind. Assessing the Lynn-Flynn effect in the College Basic Academic Subjects Examination.
Program Abstract
- This study examined the Lynn-Flynn Effect (LFE) using data from the Mathematics section of the College Basic Academic Subjects Examination (Osterlind & Merz, 1990) from 1996 to 2001. This study used Item Response Theory (IRT) methods to assess the magnitude of change in cognitive abilities, because, as Beaujean (2005) showed, under certain conditions, score comparison methods derived from Classical Test Theory (CTT) are unable to distinguish between real rises in cognitive abilities (Lynn, 1989) and mere psychometric artifacts (Burt, 1952; Brand, 1989), a limitation that IRT comparison methods were able to overcome. This study found a trend similar to that of Sundet, Barlaug, and Torjussen (2004) and Teasdale and Owen (in press), namely a dysgenic effect since the mid-1990s.
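The methodological point in the abstract, that raw-score (CTT) comparisons can confound real changes in ability with changes in item difficulty, can be illustrated with a small simulation. The sketch below is only an illustration of that general idea, not a reproduction of Beaujean's (2005) analysis: the Rasch item difficulties, the amount of item drift, and the ability distribution are all invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Same hypothetical ability distribution in both cohorts (no real change in ability).
theta = rng.normal(0.0, 1.0, 10_000)

# Hypothetical Rasch difficulties for two test editions; the later edition is
# assumed to be slightly harder on average (item drift). Values are made up.
b_1996 = np.array([-1.5, -0.8, -0.2, 0.3, 0.9, 1.4])
b_2001 = b_1996 + 0.3

def expected_raw_score(theta, b):
    """Expected number-correct score under the Rasch model, P = 1/(1+exp(-(theta-b)))."""
    p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
    return p.sum(axis=1)

# A CTT-style comparison of mean raw scores suggests a decline...
print(f"Mean raw score on 1996 items: {expected_raw_score(theta, b_1996).mean():.2f}")
print(f"Mean raw score on 2001 items: {expected_raw_score(theta, b_2001).mean():.2f}")
# ...even though the underlying ability distribution is identical. An IRT
# calibration that estimates theta and b separately would attribute the gap
# to the items, not to a change in ability.
```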
Live presenter comments
- IRT can help ascertain true differences in abilities over time, which under CTT can be confounded by changes in item difficulties across test administrations.
- IRT can be used to place item difficulties on a common scale across test editions, allowing ability estimates from different editions to be expressed on a common ability (latent trait) scale (see the linking sketch after this list).
- [Blogster editorial comment - Caution in interpretation: the sample size is n = 619 and the measures are academic achievement measures.]
- Found a reverse Flynn Effect. Speculated that Flynn Effect research based on CTT (which is most of it) may have overestimated the effect.
- From the audience (Earl Hunt) - need to consider that effects may be due to changes in the college populations over time.
- From the audience (Bouchard) - the study demonstrates the importance of researchers and testing organizations finding a way to archive items so that this better methodology can be applied across time.
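To make the equating comments above concrete, here is a minimal sketch of one common linking procedure, mean/sigma linking on anchor items, which puts ability estimates from two test editions on the same Rasch metric. Everything in it is hypothetical: the anchor-item difficulties, the sample sizes, and the theta estimates are invented, and these notes do not say which linking or calibration method Beaujean and Osterlind actually used.

```python
import numpy as np

# Hypothetical Rasch difficulties (b) for the same anchor items, estimated
# separately in the 1996 and 2001 calibrations. Values are invented.
b_1996_anchors = np.array([-1.20, -0.45, 0.10, 0.65, 1.30])
b_2001_anchors = np.array([-1.05, -0.30, 0.22, 0.80, 1.42])

# Mean/sigma linking: the linear transformation that maps the 2001 scale
# onto the 1996 scale, chosen so the anchor items' difficulties line up.
A = b_1996_anchors.std(ddof=1) / b_2001_anchors.std(ddof=1)
B = b_1996_anchors.mean() - A * b_2001_anchors.mean()

# Hypothetical examinee ability (theta) estimates from each edition.
theta_1996 = np.random.default_rng(1).normal(0.00, 1.0, 250)
theta_2001_raw = np.random.default_rng(2).normal(-0.10, 1.0, 250)

# Re-express the 2001 abilities on the 1996 metric before comparing cohorts.
theta_2001 = A * theta_2001_raw + B

print(f"Linking constants: A = {A:.3f}, B = {B:.3f}")
print(f"Mean theta, 1996 cohort:          {theta_1996.mean():.3f}")
print(f"Mean theta, 2001 cohort (linked): {theta_2001.mean():.3f}")
print(f"Estimated change in mean ability: {theta_2001.mean() - theta_1996.mean():.3f}")
```

Concurrent calibration (fitting all editions in a single IRT run with the anchor items shared) would accomplish the same thing; the point is simply that, once items sit on a common scale, cohort means can be compared in ability units rather than raw scores.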