Showing posts with label info proc.

Tuesday, April 17, 2007

Cognitive load, working memory and instruction



Yesterday the Eide Neurolearning blog had a nice post on "cognitive load" with many links to a news article, a PPT file, etc. I've been very intrigued by cognitive load theory (viz., "optimum learning occurs in humans when the load on working memory is kept to a minimum to best facilitate the changes in long term memory") for years, primarily because it appears to be a potential link from research in cognitive psychology (information processing theory) to instructional practices. More than once I've started blog posts on the topic...only to recognize that I needed to read the material more deeply.

The ENL post has given me the idea that I should simply post the articles I've accumulated in hopes that readers can read and extract the information they need. Maybe someone will post some nice comments after reading these articles. Or...if someone wants to read them and do a guest blog post, contact me re: this possibility (iap@earthlink.net).

Paas, F., Renkl, A., & Sweller, J. (2003). Cognitive load theory and instructional design: Recent developments. Educational Psychologist, 38(1), 1-4. (click here to view)

Paas, F., Tuovinen, J. E., Tabbers, H., & Van Gerven, P. W. M. (2003). Cognitive load measurement as a means to advance cognitive load theory. Educational Psychologist, 38(1), 63-71. (click here to view)
  • Abstract
  • In this article, we discuss cognitive load measurement techniques with regard to their contribution to cognitive load theory (CLT). CLT is concerned with the design of instructional methods that efficiently use people's limited cognitive processing capacity to apply acquired knowledge and skills to new situations (i.e., transfer). CLT is based on a cognitive architecture that consists of a limited working memory with partly independent processing units for visual and auditory information, which interacts with an unlimited long-term memory. These structures and functions of human cognitive architecture have been used to design a variety of novel efficient instructional methods. The associated research has shown that measures of cognitive load can reveal important information for CLT that is not necessarily reflected by traditional performance-based measures. Particularly, the combination of performance and cognitive load measures has been identified to constitute a reliable estimate of the mental efficiency of instructional methods. The discussion of previously used cognitive load measurement techniques and their role in the advancement of CLT is followed by a discussion of aspects of CLT that may benefit by measurement of cognitive load. Within the cognitive load framework, we also discuss some promising new techniques.
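The "mental efficiency" idea at the end of the abstract has a simple computational form in the CLT literature: standardize performance and mental-effort ratings across learners, then take their difference. Below is a minimal sketch of that calculation; the variable names and toy numbers are mine for illustration, not values from the article.

```python
import math
import statistics

def instructional_efficiency(performance, mental_effort):
    """Relative instructional efficiency in the Paas & van Merrienboer style:
    E = (z_performance - z_effort) / sqrt(2), computed per learner after
    standardizing both measures across the sample."""
    def z_scores(values):
        mean = statistics.mean(values)
        sd = statistics.stdev(values)
        return [(v - mean) / sd for v in values]

    z_perf = z_scores(performance)
    z_effort = z_scores(mental_effort)
    return [(p - e) / math.sqrt(2) for p, e in zip(z_perf, z_effort)]

# Toy example: positive E = relatively high performance for relatively low effort.
perf   = [72, 80, 65, 90, 55, 85]   # hypothetical post-test scores
effort = [ 6,  4,  7,  3,  8,  5]   # hypothetical 9-point subjective effort ratings
print(instructional_efficiency(perf, effort))
```

Instructional methods whose learners cluster at high efficiency are, in this framework, making better use of limited working memory than methods that produce the same performance only at high subjective effort.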

Monday, April 09, 2007

The dark side of being an expert

An interesting (brief) article in Psychological Science describes how the advantage experts enjoy in a specific domain (domain-specific expertise) can also have a slight "dark side"...namely, having a well-organized body of knowledge can result in "intrusion errors" when recalling information. All in all, though, being an expert is still what we all strive for in given domains.

This may also explain why experts may make errors when interviewed live on TV. Their recall is so automatic that they may not be alert to possible errors in their recalled information.

I'll now use this article to explain any recall errors I make when I misspeak during a live professional presentation...it will be nice to blame my slight errors on "the price paid for being an expert." :)



Reading fluency, automaticity and prosody

Reading fluency has recently been mentioned as an important reading subskill in the identification of students with significant reading disorders. In CHC theory this factor has been labeled "Reading Speed" (RS), and is defined as "the ability to silently read and comprehend connected text (e.g., a series of short sentences; a passage) rapidly and automatically (with little conscious attention to the mechanics of reading)."

I just skimmed the intro section of the following article (I often find the intro sections of articles very informative even if I'm not interested in the primary purpose of the study) and found the definition of the components of reading fluency informative, esp. the emphasis on the cognitive process of automaticity. The one new bit of information I learned was the potential importance of the concept of "prosody" in the definition of reading fluency.
  • Kuhn, M. R., Schwanenflugel, P. J., Morris, R. D., Morrow, L. M., Woo, D. G., Meisinger, E. B., Sevcik, R. A., Bradley, B. A., & Stahl, S. A. (2006). Teaching children to become fluent and automatic readers. Journal of Literacy Research, 38(4), 357-387. (click to view)
Select extracted text
  • Fluent reading is typically defined by three constructs (Kuhn & Stahl, 2003; National Reading Panel, 2000). Most commonly, these constructs include quick and accurate word recognition (Jenkins, Fuchs, van den Broek, Espin, & Deno, 2003), and, when oral reading is considered, the appropriate use of prosody (Cowie, Douglas-Cowie, & Wichmann, 2002; Schwanenflugel, Hamilton, Kuhn, Wisenbaker, & Stahl, 2004). Some definitions also include comprehension as part of fluent reading (Fuchs, Fuchs, Hosp, & Jenkins, 2001; Wolf & Katzir-Cohen, 2001), as fluency is seen as a factor in readers’ ability to understand and enjoy text (e.g., Jenkins et al., 2003; Rasinski & Hoffman, 2003; Samuels, 2006). According to automaticity theorists, reading is composed of several concurrent elements, including decoding and comprehension (LaBerge & Samuels, 1974). However, individuals have a limited amount of attentional resources available for reading (or any other cognitive task). As a result, attentional resources spent on decoding are necessarily unavailable for comprehension (Kintsch, 1998; Stanovich, 1984). Fortunately, as word recognition becomes automatic, less attention needs to be expended on decoding and more cognitive resources can be devoted to the construction of meaning.
  • According to automaticity theory, the most effective way for students to develop such automatic word recognition is through extensive exposure to print (Adams, 1990; Samuels, 1979; Stanovich, 1984). Such practice leads to familiarity with a language’s orthographic patterns and allows learners to recognize words with increasing accuracy and automaticity, thereby permitting readers to focus on text meaning rather than simply on the words.
  • In addition to automatic word recognition, prosody may be an important indicator of fluent reading (Schwanenflugel et al., 2004). Reading prosody consists of those elements that comprise expressive reading, including intonation, emphasis, rate, and the regularly reoccurring patterns of language (Hanks, 1990; Harris & Hodges, 1981, 1995). When readers are able to apply these elements to text, it serves as an indicator that they can transfer elements that are present in oral language to print (Dowhower, 1991; Schreiber, 1991). Some recent research has suggested that prosody in fluent reading may serve primarily as an indicator that a child has achieved automaticity in text reading (Miller & Schwanenflugel, 2006; Schwanenflugel et al., 2004). However, the exact role of prosody in reading comprehension is open to further research (e.g., Cowie et al., 2002; Levy, Abello, & Lysynchuk, 1997; Schwanenflugel et al., 2004; T. Shanahan, personal communication, December 2, 2004).
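To make the resource trade-off in the first excerpt concrete, here is a toy sketch (my own illustration, not from the article) in which a fixed attentional budget is split between decoding and comprehension, and decoding cost is assumed to shrink with print exposure following a power-law-of-practice curve.

```python
# Toy illustration of automaticity theory's resource trade-off: attention spent on
# decoding is unavailable for comprehension, and decoding cost falls with practice.
# All parameter values are assumptions chosen for illustration.

ATTENTION_BUDGET = 1.0   # arbitrary units of attentional capacity

def decoding_cost(exposures, initial_cost=0.9, learning_rate=0.3):
    """Hypothetical decoding cost that shrinks with print exposure (power law)."""
    return initial_cost * (exposures + 1) ** -learning_rate

for exposures in (0, 10, 100, 1000):
    cost = decoding_cost(exposures)
    left_for_comprehension = max(ATTENTION_BUDGET - cost, 0.0)
    print(f"{exposures:>5} exposures: decoding={cost:.2f}, "
          f"resources left for comprehension={left_for_comprehension:.2f}")
```

The numbers are arbitrary, but the shape of the curve captures the argument: early on, decoding consumes nearly the whole budget; with extensive exposure to print, most of the budget is freed for constructing meaning.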

Thursday, January 18, 2007

Brain processing of numerical/quantitative information - new studies

New research summarized at Science Daily is providing new insights into how the brain processes quantitative/numerical (Gq/Gf-RQ) information. Below is the first paragraph of the article.
  • Two studies in the January 18, 2007, issue of the journal Neuron, published by Cell Press, shed significant light on how the brain processes numerical information--both abstract quantities and their concrete representations as symbols. The researchers said their findings will contribute to understanding how the brain processes quantitative information as well as lead to studies of how numerical representation in the brain develops in children. Such studies could aid in rehabilitating people who suffer from dyscalculia--an inability to understand, remember, and manipulate numbers. The researchers also said their findings offer insight into the mystery of how the brain learns to associate abstract symbols precisely with quantities.
Scientific American also provides coverage of these two studies.


Saturday, November 11, 2006

New black-white IQ (g) comparison study

Bias in intelligence testing, particularly black-white IQ differences, has been a contentious area of study in the field of intelligence. A new study by Edwards and Oakland (reference, abstract, select findings, and article link provided below) adds new information to this area of study.

Briefly, using the K-12 (school-age) standardization data from the Woodcock-Johnson--Third Edition (WJ III; conflict of interest disclosure: I'm a co-author of the WJ III), Edwards and Oakland examined, across blacks and whites, (a) the similarity of the structure of g (general intelligence), (b) mean differences in general intelligence, and (c) the similarity in predictive validity (do the WJ III g-scores predict achievement similarly across both groups). Given my potential conflict of interest, I'm only going to report the abstract and select direct quotes from the article. Readers are encouraged to read and digest the complete article.

The one comment I will make is not specifically WJ III related, but is theory related. Consistent with Carroll's (1993) conclusion that the structure of cognitive abilities is largely the same (invariant) as a function of gender and race, Edwards and Oakland's findings indicate that the structure of g (general intelligence), when operationalized by seven different CHC ability indicators (Gf, Gc, Glr, Gsm, Gv, Ga, Gs), is similar across whites and blacks.

Edwards, O., & Oakland, T. (2006). Factorial invariance of Woodcock-Johnson III scores for African Americans and Caucasian Americans. Journal of Psychoeducational Assessment, 24(4), 358-366. (click here to view)


Abstract
  • Bias in testing has been of interest to psychologists and other test users since the origin of testing. New or revised tests often are subject to analyses that help examine the degree of bias in reference to group membership based on gender, language use, and race/ethnicity. The pervasive use of intelligence test data when making critical and, at times, life-changing decisions warrants the need by test developers and test users to examine possible test bias on new and recently revised intelligence tests. This study investigates factorial invariance and criterion-related validity of the Woodcock-Johnson III for African American and Caucasian American students. Data from this study suggest that although their mean scores differ, Woodcock-Johnson III scores have comparable meaning for both groups.
Select author conclusions from the article
  • Results from factor analysis, SEM, congruence coefficients, correlations coefficients, and Fisher’s Z statistic are uniform in indicating the factor structure of the WJ III is consistent for African Americans and Caucasian Americans.
  • The high congruence coefficient of .99 suggests the g factor structure is essentially identical for African Americans and Caucasian Americans. In addition, all fit indices are > .95, indicative of excellent fit and suggests covariant structural equivalence between the two groups. Although the mean IQs for the groups differ, the WJ III scores from the Cognitive Battery have comparable meaning for African American and Caucasian American students. Additionally, correlations between GIA and three achievement clusters and nine achievement subtests are similarly high and statistically significant for both groups.
  • The collective findings from this and other studies using the WJ III provide some support for Carroll’s (1993) assertion that CHC theory, one that forms the theoretical basis for the WJ III, is essentially invariant across racial/ethnic groups.
  • Thus, when using the WJ III Cognitive with African American and Caucasian American students, practitioners can be somewhat assured that possible score differences reflect differences in the underlying latent constructs rather than variations in the measurement operation itself (Watkins & Canivez, 2001).
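For readers unfamiliar with the congruence coefficient the authors report, the sketch below shows how Tucker's coefficient is computed from two vectors of factor loadings. The loadings are made-up numbers chosen purely for illustration; they are NOT the Edwards and Oakland values.

```python
import math

def tucker_congruence(loadings_a, loadings_b):
    """Tucker's congruence coefficient between two factor-loading vectors."""
    num = sum(a * b for a, b in zip(loadings_a, loadings_b))
    den = math.sqrt(sum(a * a for a in loadings_a) *
                    sum(b * b for b in loadings_b))
    return num / den

# Hypothetical g loadings for the seven CHC indicators (Gf, Gc, Glr, Gsm, Gv, Ga, Gs)
# in two groups -- illustrative numbers only, not the published results.
group_1 = [0.78, 0.74, 0.62, 0.58, 0.55, 0.60, 0.52]
group_2 = [0.76, 0.75, 0.60, 0.61, 0.53, 0.58, 0.55]
print(round(tucker_congruence(group_1, group_2), 3))
```

Coefficients approaching 1.0 (such as the .99 reported in the study) are conventionally read as indicating that the factor is essentially the same construct in both groups.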

Wednesday, August 16, 2006

WJ III NU (normative update): "Specification of cognitive processes involved in tests" publication

Dr. Fred Schrank, a colleague and co-author of the WJ III [this is a conflict of interest disclosure for both of us], has just published a new Assessment Service Bulletin (ASB # 7): "Specification of the Cognitive Processes Involved in Performance on the Woodcock-Johnson III NU". (click here to view/download).

The paragraph below, which was extracted from the ASB, is self-explanatory. Enjoy.
  • This bulletin integrates information on the Woodcock-Johnson III (WJ III®), Cattell-Horn-Carroll (CHC) theory, and selected research in cognitive psychology—the branch of psychology that is based on the scientific study of human cognitive processes. Support for a specification of the cognitive processes involved in performance on the WJ III is described in terms of an integration of CHC theory with selected classic and contemporary cognitive and neuroscience research.
This publication was made available to IAP (and this blog) by Riverside Publishing.
  • "Woodcock-Johnson III NU Assessment Service Bulletin Number 7 used with permission of the publisher. Copyright 2006 by the Riverside Publishing Company. All rights reserved"



Tuesday, August 01, 2006

Expertise, ACT-R, CLT (cognitive load theory): Instructional design

In my readings I frequently run across references to Anderson's ACT-R framework for describing the acquisition of expertise. If pushed hard, I would be hard pressed to provide a concise explanation/description of this model. Thus, I found a quick skim of a recent article dealing with CLT (cognitive load theory) a pleasant surprise. The following concise summary (italics added by blogmaster) of the ACT-R framework was provided.

  • "Using worked examples in problem-solving instruction is consistent with a four-stage model of expertise that is based on the well-known ACT-R framework (Anderson, Fincham,& Douglass, 1997). In this model, learners who are in the first stage of skill acquisition solve problems by analogy; they use known examples of problems, and try to relate those problems to the new problem to be solved. At the second stage, learners have developed abstract declarative rules or schemas, which guide them in future problem solving. At the third stage, with sufficient practice, the schemas become proceduralised, leading to the fourth stage of expertise where automatic schemas and analogical reasoning on a large pool of examples are combined to successfully solve a variety of problem types. Empirical evidence has shown that learning with worked examples is most important during initial skill acquisition stages for well-structured domains such as physics, programming, and mathematics (Van-Lehn, 1996)."
Although not the primary purpose of this post, readers may find the complete article, which deals with CLT, of interest. I've been collecting articles on CLT but have yet to devote sufficient time to understanding the implications of CLT (which would allow me to do some intelligent posting). All I can say is that I think CLT appears to have significant implications for instructional interventions when framed within a cognitive information processing framework.


Thursday, November 03, 2005

Visual system (Gv) information bottleneck research review


I just skimmed an interesting review article in Trends in Cognitive Sciences on the information processing bottleneck of the visual system (Gv?).

The abstract is posted below, plus some clarifying information from the introduction. The figure in this post represents the authors' neat attempt to summarize the research literature re: the areas of the brain associated with the three primary sources of the bottleneck (I like the way they graphically show the areas of the brain via the overlay of research study citations).

Marois, R., & Ivanoff, J. (2005). Capacity limits of information processing in the brain. Trends in Cognitive Sciences, 9(6), 296-305.

Abstract
  • Despite the impressive complexity and processing power of the human brain, it is severely capacity limited. Behavioral research has highlighted three major bottlenecks of information processing that can cripple our ability to consciously perceive, hold in mind, and act upon the visual world, illustrated by the attentional blink (AB), visual short-term memory (VSTM), and psychological refractory period (PRP) phenomena, respectively. A review of the neurobiological literature suggests that the capacity limit of VSTM storage is primarily localized to the posterior parietal and occipital cortex, whereas the AB and PRP are associated with partly overlapping fronto-parietal networks. The convergence of these two networks in the lateral frontal cortex points to this brain region as a putative neural locus of a common processing bottleneck for perception and action.
Additional information from the introduction
  • A rich history of cognitive research has highlighted three major processing limitations during the flow of information from sensation to action, each exemplified by a specific experimental paradigm. The first limitation concerns the time it takes to consciously identify and consolidate a visual stimulus in visual short-term memory (VSTM), as revealed by the attentional blink paradigm. This process can take more than half a second before it is free to identify a second stimulus.
  • A second, severely limited capacity is the restricted number of stimuli that can be held in VSTM, as exemplified by the change detection paradigm.
  • A third bottleneck arises when one must choose an appropriate response to each stimulus. Selecting an appropriate response for one stimulus delays by several hundred milliseconds the ability to select a response for a second stimulus (the ‘psychological refractory period’).
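The third bottleneck lends itself to a simple toy model: if only one central response-selection stage can run at a time, responding to the second stimulus is delayed whenever the two stimuli arrive close together. The sketch below uses assumed stage durations of my own choosing, not values from the review.

```python
# Toy simulation of a response-selection bottleneck (psychological refractory period).
# Each task passes through perception, central response selection, and motor stages;
# the central stage is assumed to be strictly serial. Durations below are assumptions.

PERCEPTION, CENTRAL, MOTOR = 0.10, 0.15, 0.05   # stage durations in seconds

def rt2_given_soa(soa):
    """Reaction time to stimulus 2 as a function of stimulus onset asynchrony (SOA)."""
    t1_central_end = PERCEPTION + CENTRAL                      # task 1 frees the bottleneck
    t2_central_start = max(soa + PERCEPTION, t1_central_end)   # task 2 may have to wait
    t2_response = t2_central_start + CENTRAL + MOTOR
    return t2_response - soa                                   # measured from stimulus 2 onset

for soa in (0.0, 0.05, 0.1, 0.2, 0.4):
    print(f"SOA={soa:.2f}s -> RT2={rt2_given_soa(soa):.2f}s")
```

The output shows the classic PRP pattern: second-task reaction time is inflated at short SOAs and falls to its baseline once the first task has cleared the central stage.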

Tuesday, May 03, 2005

Cognitive efficiency and achievement

The following article, which is "in press" in Intelligence, provides interesting information regarding the potential importance of measures of cognitive efficiency in predicting/explaining school achievement. The abstract is printed below along with a few highlights from the study.

Luo, D., Thompson, L. A., & Detterman, D. K. (2005). The criterion validity of tasks of basic cognitive processes. Intelligence, In Press, Corrected Proof.

Abstract
  • The present study evaluated the criterion validity of the aggregated tasks of basic cognitive processes (TBCP). In age groups from 6 to 19 of the Woodcock-Johnson III Cognitive Abilities and Achievement Tests normative sample, the aggregated TBCP, i.e., the processing speed and working memory clusters, correlate with measures of scholastic achievement as strongly as the conventional indexes of crystallized intelligence and fluid intelligence. These basic processing aggregates also mediate almost exhaustively the correlations between measures of fluid intelligence and achievement, and appear to explain substantially more of the achievement measures than the fluid ability index. The results from the Western Reserve Twin Project sample using TBCP with more rigorous experimental paradigms were similar, suggesting that it may be practically feasible to adopt TBCP with experimental paradigms into the psychometric testing tradition. Results based on the latent factors in structural equation models largely confirmed the findings based on the observed aggregates and composites.
Summary/Comments
  • The measures of TBCP in the present study were taken from two data sources, Woodcock-Johnson III Cognitive Abilities and Achievement Tests (W-J III; Woodcock et al., 2001a, 2001c; Woodcock, McGrew, & Mather, 2001b) normative data and the Western Reserve Twin Project (WRTP) data. The WJ III results will be the focus of this post.
  • Luo et al. examined (via multiple regression and SEM) the extent to which measures of what the WJ III authors (myself included – see home page conflict of interest disclosure) call “cognitive efficiency” (CE - Gs and Gsm tests/clusters) add to the prediction of total achievement, above and beyond Gc and Gf.
  • These researchers found that CE measures/abilities demonstrated substantial correlations with scholastic performance (WJ III Total Achievement). The CE-Ach correlations were similar to correlations between conventional test composites and scholastic performance. These results suggested that measures of CE provide incremental predictive validity beyond Gc. Collectively, CE+Gc accounted for approximately 60% of the variability in achievement when observed measures were analyzed (multiple regression) and up to 70% or more of the variance when the SEM latent traits were analyzed. The authors concluded that these levels of prediction were "remarkable."
  • Gf measures did NOT contribute significantly to the prediction of achievement beyond that already accounted for by the CE measures. [Editorial note – my prior research with the WJ III and WJ-R suggests this result may be due to the authors using “total achievement” as their criterion. My published research has consistently found that Gf is an important predictor/causal variable in the domain of mathematics].
  • A potential explanation for the power of the CE measures/variables has previously been published and posted to the web (click here).
  • The current results, IMHO, fit nicely within CHC-based information model frameworks that have been suggested. Simplified schematic models (based on the work of Woodcock) can be viewed by clicking here.
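For readers who want to see the incremental-validity logic described above in miniature, the sketch below regresses a simulated achievement score on Gc alone and then on Gc plus a cognitive efficiency (Gs/Gsm) composite, and compares the R-squared values. The data are simulated for illustration only; they are not the WJ III norming data, and the effect sizes are arbitrary.

```python
# Hierarchical-regression sketch of incremental validity: does adding a cognitive
# efficiency (CE) composite improve prediction of achievement beyond Gc alone?
# All data below are simulated; coefficients are assumptions, not WJ III results.

import numpy as np

rng = np.random.default_rng(0)
n = 500
gc = rng.normal(size=n)                                  # simulated Gc composite
ce = 0.5 * gc + rng.normal(scale=0.9, size=n)            # simulated CE, correlated with Gc
ach = 0.5 * gc + 0.4 * ce + rng.normal(scale=0.8, size=n)  # simulated achievement

def r_squared(predictors, outcome):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(outcome))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    resid = outcome - X @ beta
    return 1 - resid.var() / outcome.var()

r2_gc = r_squared([gc], ach)
r2_gc_ce = r_squared([gc, ce], ach)
print(f"R^2 (Gc only)  = {r2_gc:.2f}")
print(f"R^2 (Gc + CE)  = {r2_gc_ce:.2f}  (increment = {r2_gc_ce - r2_gc:.2f})")
```

The size of the R-squared increment is what "incremental predictive validity beyond Gc" refers to in the post above; Luo et al. report the analogous comparisons (and the SEM latent-variable versions) with the actual WJ III and WRTP data.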