Thursday, June 13, 2013

Journal Alert - INTELLIGENCE

> Title:
> Convergent and divergent validity of integrative versus mixed model measures of emotional intelligence
>
> Authors:
> Webb, CA; Schwab, ZJ; Weber, M; DelDonno, S; Kipman, M; Weiner, MR;
> Killgore, WDS
>
> Source:
> *INTELLIGENCE*, 41 (3):149-156; MAY-JUN 2013
>
> Abstract:
> The construct of emotional intelligence (EI) has garnered increased
> attention in the popular media and scientific literature. Several
> competing measures of EI have been developed, including self-report and
> performance-based instruments. The current study replicates and expands
> on previous research by examining three competing EI measures
> (Mayer-Salovey-Caruso Emotional Intelligence Test, MSCEIT; Bar-On
> Emotion Quotient Inventory, EQ-i; and Self-Rated Emotional Intelligence
> Scale, SREIS) and their relationships with cognitive functioning
> (Wechsler Abbreviated Scale of Intelligence; WASI), Big Five personality
> traits (NEO-PI-R) and emotional well-being (Beck Depression Inventory,
> BDI and Positive and Negative Affect Schedule, PANAS). Results indicated
> that significant variability in the self-report EI measures was
> accounted for by personality and emotional well-being measures, whereas
> the MSCEIT was more strongly associated with IQ. Overall, nearly
> two-thirds (62%) of the variance in EQ-i scores was accounted for by Big
> Five personality traits, emotional well-being and full scale IQ; whereas
> only 14% of the variance in MSCEIT scores was accounted for by these
> same variables. The present findings raise questions regarding the
> divergent validity of self-report EI measures from existing personality
> and emotional well-being measures. The implications of these results and
> directions for future research are discussed. (c) 2013 Elsevier Inc. All
> rights reserved.
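
The "variance accounted for" figures above are multiple-regression R-squared values. A minimal sketch of that calculation, using simulated placeholder data rather than the study's data:

    # Minimal sketch: "variance accounted for" as regression R-squared.
    # All data below are simulated placeholders, not the study's data.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200                                   # hypothetical sample size
    big_five = rng.normal(size=(n, 5))        # NEO-PI-R domain scores (standardized)
    well_being = rng.normal(size=(n, 2))      # e.g., BDI and PANAS composites
    fsiq = rng.normal(size=(n, 1))            # WASI full-scale IQ (standardized)

    # Simulate a self-report EI score that depends mostly on personality/well-being.
    eq_i = (big_five @ np.array([0.5, -0.3, 0.2, 0.4, -0.2])
            + well_being @ np.array([-0.4, 0.3])
            + 0.1 * fsiq[:, 0]
            + rng.normal(scale=0.8, size=n))

    X = np.column_stack([np.ones(n), big_five, well_being, fsiq])  # intercept + predictors
    beta, *_ = np.linalg.lstsq(X, eq_i, rcond=None)                # OLS fit
    pred = X @ beta
    r2 = 1 - np.sum((eq_i - pred) ** 2) / np.sum((eq_i - eq_i.mean()) ** 2)
    print(f"Share of variance accounted for (R^2): {r2:.2f}")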
>
> ========================================================================
>
>
> *Pages: 157-168 (Article)
> *View Full Record: http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcAuth=Alerting&SrcApp=Alerting&DestApp=CCC&DestLinkType=FullRecord;KeyUT=CCC:000318833500002
> *Order Full Text [ ]
>
> Title:
> Explanatory item response modeling of children's change on a dynamic test of analogical reasoning
>
> Authors:
> Stevenson, CE; Hickendorff, M; Resing, WCM; Heiser, WJ; de Boeck, PAL
>
> Source:
> *INTELLIGENCE*, 41 (3):157-168; MAY-JUN 2013
>
> Abstract:
> Dynamic testing is an assessment method in which training is
> incorporated into the procedure with the aim of gauging cognitive
> potential. Large individual differences are present in children's
> ability to profit from training in analogical reasoning. The aim of this
> experiment was to investigate sources of these differences on a dynamic
> test of figural matrix analogies. School children (N = 252, M = 7 years,
> SD = 11 months, range 5-9 years) were dynamically tested using a
> pretest-training-posttest design. The children were randomly allocated
> to a training condition: graduated prompts (N = 127) or feedback (N =
> 123). All children were presented with figural analogies without help or
> feedback during the pretest. The children were trained to solve
> analogies with either graduated prompts, a form of step-wise elaborated
> feedback in which increasingly detailed instruction is provided on how
> to solve the task, or feedback, in which only correctness of each
> solution was provided. A comparable figural analogies test was
> administered during the posttest measure. Explanatory IRT models were
> used to investigate sources of individual differences in initial ability
> and improvement after training. We found that visual and verbal working
> memory and age group were related to initial ability. Improvement after
> training was influenced by training type: children trained with graduated
> prompts improved more than children trained with feedback. Improvement was
> also related to initial ability, with children who had lower initial scores
> improving more in both groups. Furthermore, there was an additive effect of
> math achievement on degree of improvement, with higher-achieving children
> improving more from pretest to posttest. Potential to learn as measured
> by dynamic tests is not often included in traditional cognitive
> assessment. However, learning potential does appear to be an important
> construct to include in psychoeducational testing. (c) 2013 Elsevier
> Inc. All rights reserved.
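
A rough sketch of the explanatory item response modeling idea described above, not the authors' exact specification: item correctness is modeled with item effects plus person covariates (working memory, age group), so that individual differences are "explained" by the covariates. Data and effect sizes are simulated placeholders:

    # Hedged sketch of an explanatory item response model (not the authors'
    # exact specification): item fixed effects plus person covariates.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n_children, n_items = 252, 20
    wm = rng.normal(size=n_children)                 # working-memory composite
    age_group = rng.integers(0, 3, size=n_children)  # e.g., three age bands
    ability = 0.6 * wm + 0.4 * age_group + rng.normal(size=n_children)
    difficulty = rng.normal(size=n_items)

    rows = []
    for child in range(n_children):
        for item in range(n_items):
            p = 1 / (1 + np.exp(-(ability[child] - difficulty[item])))  # Rasch-type
            rows.append({"correct": rng.binomial(1, p), "item": item,
                         "wm": wm[child], "age_group": age_group[child]})
    df = pd.DataFrame(rows)

    # Explanatory side: person covariates stand in for individual person parameters.
    model = smf.logit("correct ~ C(item) + wm + age_group", data=df).fit(disp=False)
    print(model.params[["wm", "age_group"]])         # effects of the person covariates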
>
> ========================================================================
>
>
> *Pages: 169-177 (Article)
> *View Full Record: http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcAuth=Alerting&SrcApp=Alerting&DestApp=CCC&DestLinkType=FullRecord;KeyUT=CCC:000318833500003
> *Order Full Text [ ]
>
> Title:
> The Flynn effect and population aging
>
> Authors:
> Skirbekk, V; Stonawski, M; Bonsang, E; Staudinger, UM
>
> Source:
> *INTELLIGENCE*, 41 (3):169-177; MAY-JUN 2013
>
> Abstract:
> Although lifespan changes in cognitive performance and Flynn effects
> have both been well documented, there has been little scientific focus
> to date on the net effect of these forces on cognition at the population
> level. Two major questions guided this study:
> (1) Does the Flynn effect indeed continue in the 2000s for older adults
> in a UK dataset (considering immediate recall, delayed recall, and
> verbal fluency)? (2) What are the net effects of population aging and
> cohort replacement on average cognitive level in the population for the
> abilities under consideration?
> First, in line with the Flynn effect, we demonstrated continued
> cognitive improvements among successive cohorts of older adults. Second,
> projections based on different scenarios for cognitive cohort changes as
> well as demographic trends show that if the Flynn effect observed in
> recent years continues, it would offset the corresponding age-related
> cognitive decline for the cognitive abilities studied. In fact, if
> observed cohort effects should continue, our projections show
> improvements in cognitive functioning on a population level until
> 2042, in spite of population aging. (c) 2013 Elsevier Inc. All rights
> reserved.
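
The projection logic can be illustrated with simple bookkeeping: the population mean is the age-share-weighted mean of age-group scores, with population aging shifting the weights and cohort replacement shifting the group means. All numbers below are illustrative, not the paper's:

    # Illustrative sketch of the projection logic; all numbers are made up.
    import numpy as np

    scores_2013 = np.array([10.0, 9.0, 7.5, 6.0])      # mean recall score by age group
    shares_2013 = np.array([0.35, 0.30, 0.22, 0.13])   # population shares, 2013
    shares_2042 = np.array([0.28, 0.29, 0.26, 0.17])   # older age structure, 2042

    flynn_per_decade = 0.5                              # assumed cohort gain per decade
    scores_2042 = scores_2013 + flynn_per_decade * 2.9  # ~29 years of cohort replacement

    mean_2013 = shares_2013 @ scores_2013
    mean_2042_aging_only = shares_2042 @ scores_2013    # aging, no cohort gains
    mean_2042_with_flynn = shares_2042 @ scores_2042    # aging plus cohort gains
    print(f"2013: {mean_2013:.2f}  2042 (aging only): {mean_2042_aging_only:.2f}  "
          f"2042 (aging + Flynn): {mean_2042_with_flynn:.2f}")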
>
> ========================================================================
>
>
> *Pages: 178-180 (Editorial Material)
> *View Full Record: http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcAuth=Alerting&SrcApp=Alerting&DestApp=CCC&DestLinkType=FullRecord;KeyUT=CCC:000318833500004
> *Order Full Text [ ]
>
> Title:
> FLynn effect in Turkey: A comment on Kagitcibasi and Biricik (2011)
>
> Authors:
> Rindermann, H; Schott, T; Baumeister, AEE
>
> Source:
> *INTELLIGENCE*, 41 (3):178-180; MAY-JUN 2013
>
> Abstract:
> Kagitcibasi and Biricik (2011) presented generational IQ gains for
> Turkey on the Goodenough Draw-a-Man test (Draw-a-Person). Following
> their results (their Table 1), the mean IQ gain from 1977 to 2010 (33
> years) across three different population groups was 5.24 IQ points (dec =
> 1.59 IQ points per decade). However, Kagitcibasi and Biricik did not
> acknowledge the changing social composition of society: the share of
> groups which had a lower IQ in 1977 and 2010 decreased, and the share of
> groups which had a higher IQ in 1977 and 2010 increased. Considering
> this, we came to an estimate of dec = 3.52 IQ points for the FLynn
> effect in Turkey over the past decades. (c) 2013 Elsevier Inc. All
> rights reserved.
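
The per-decade figure follows directly from dividing the total gain by the time span in decades; a quick check (the composition-adjusted estimate of 3.52 additionally requires the group shares, which are not quoted here):

    # Quick check of the per-decade conversion reported in the comment.
    total_gain = 5.24          # IQ points gained from 1977 to 2010
    years = 2010 - 1977        # 33 years
    per_decade = total_gain / (years / 10)
    print(f"{per_decade:.2f} IQ points per decade")   # ~1.59, matching the abstract
    # The composition-adjusted estimate (dec = 3.52) additionally reweights the
    # group-specific gains by each group's changing population share, which is
    # not reproducible from the figures quoted here.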
>
> ========================================================================
>
>
> *Pages: 181-192 (Article)
> *View Full Record: http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcAuth=Alerting&SrcApp=Alerting&DestApp=CCC&DestLinkType=FullRecord;KeyUT=CCC:000318833500005
> *Order Full Text [ ]
>
> Title:
> Quantitative differences in retest effects across different methods used to construct alternate test forms
>
> Authors:
> Arendasy, ME; Sommer, M
>
> Source:
> *INTELLIGENCE*, 41 (3):181-192; MAY-JUN 2013
>
> Abstract:
> Allowing respondents to retake a cognitive ability test has been shown to
> increase their test scores. Several theoretical models have been
> proposed to explain this effect, which make distinct assumptions
> regarding the measurement invariance of psychometric tests across test
> administration sessions with regard to narrower cognitive abilities and
> general mental ability. We modeled retest effects in four psychometric
> tests as a function of specific retest form and general mental ability
> in order to compare the validity of these models and their
> generalizability across three different kinds of retest forms. To do so,
> automatic item generation was used to construct two kinds of alternate retest
> forms: (1) isomorphic retests and (2) psychometrically matched retests.
> A total of N = 358 respondents completed all four measures twice,
> receiving either identical retest forms, isomorphic retest forms or
> psychometrically matched retest forms at the second test administration
> session. Item response theory modeling supported strict measurement
> invariance across all test forms and time-points of measurement but
> indicated variation in respondents' retest score gains due to individual
> differences in general mental ability and the kind of retest form used.
> In general, retest effects were more pronounced for high-g respondents,
> for identical and isomorphic retest forms, and for mental rotation and
> algebra word problems. Latent mean and covariance structure analyses
> indicated that retesting did not affect the g-factor saturation of the
> four cognitive ability tests but revealed that retest score gains were
> hollow with respect to psychometric g. (c) 2013 Elsevier Inc. All rights
> reserved.
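
A hedged sketch of the general idea, not the authors' latent-variable models: retest gains regressed on general mental ability and retest-form condition, with an interaction. Data are simulated placeholders:

    # Sketch: retest gain as a function of g and retest-form condition.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 358
    g = rng.normal(size=n)                                   # general mental ability
    form = rng.choice(["identical", "isomorphic", "matched"], size=n)
    form_effect = {"identical": 0.6, "isomorphic": 0.5, "matched": 0.2}
    gain = (np.array([form_effect[f] for f in form])
            + 0.3 * g + rng.normal(scale=0.5, size=n))

    df = pd.DataFrame({"gain": gain, "g": g, "form": form})
    fit = smf.ols("gain ~ g * C(form)", data=df).fit()
    print(fit.params)   # larger gains for higher g and identical/isomorphic forms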
>
> ========================================================================
>
>
> *Pages: 193-202 (Article)
> *View Full Record: http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcAuth=Alerting&SrcApp=Alerting&DestApp=CCC&DestLinkType=FullRecord;KeyUT=CCC:000318833500006
> *Order Full Text [ ]
>
> Title:
> Allocation of talent in society and its effect on economic development
>
> Authors:
> Strenze, T
>
> Source:
> *INTELLIGENCE*, 41 (3):193-202; MAY-JUN 2013
>
> Abstract:
> Several studies in psychology and economics have demonstrated that the
> average cognitive ability (talent) of people living in a society affects
> the economic development of the society. There is, however, reason to
> expect that the economic development of societies depends not just on
> the average level of talent but also on the allocation of talent in
> society: societies that allocate people with different talents more
> efficiently should be more successful in economic terms. Efficient
> allocation of talent means that people with higher ability do jobs of
> higher complexity. The present paper constructed several measures of
> allocation of talent and analyzed their effect on the economic growth
> rate of countries and U.S. states. Overall, the analyses support the
> idea that the countries and states that have a better allocation of
> talent exhibit higher levels of economic growth. (c) 2013 Elsevier Inc.
> All rights reserved.
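
One plausible way to operationalize allocation of talent, offered only as an illustration and not necessarily the paper's measure, is the rank correlation between individual ability and the complexity of the job held within a society:

    # Hypothetical allocation-of-talent index: rank correlation between a
    # person's ability and the complexity of their job. All data are simulated.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(3)
    n = 1000
    ability = rng.normal(size=n)
    # Society A: abler people tend to hold more complex jobs; society B: weak sorting.
    complexity_a = 0.7 * ability + rng.normal(scale=0.7, size=n)
    complexity_b = 0.2 * ability + rng.normal(scale=1.0, size=n)

    rho_a, _ = spearmanr(ability, complexity_a)
    rho_b, _ = spearmanr(ability, complexity_b)
    print(f"Allocation index, society A: {rho_a:.2f}; society B: {rho_b:.2f}")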
>
>
