Showing posts with label assessment.

Saturday, May 03, 2025

Book nook-chapter: Foundations of #AI in #Educational #Assessment

 


Abstract

This chapter explores the evolution and transformative potential of artificial intelligence (AI) in educational assessment, highlighting its ability to enhance the evaluation of student learning through adaptive, personalized, and dynamic approaches. AI technologies such as machine learning, natural language processing, and computer vision are revolutionizing assessment design by enabling the measurement of higher-order skills like critical thinking, problem-solving, and creativity. The chapter also addresses ethical and practical considerations, including algorithmic bias, data privacy, and equity in implementation, emphasizing the importance of responsible innovation. By examining historical assessment practices alongside contemporary AI applications, this chapter provides a comprehensive foundation for understanding how AI is reshaping education and establishing a roadmap for its equitable adoption.

Tuesday, November 05, 2024

Now Available: #ATP-#NCTA #Assessment Industry #Standards and Best Practices for #OnlineObservationofTests (“Standards and #BestPractices”)


The National College Testing Association (NCTA) and the Association of Test Publishers (ATP) represent two leading organizations dedicated to the advancement and study of testing and assessment. The ATP-NCTA Assessment Industry Standards and Best Practices for Online Observation of Tests (“Standards and Best Practices”) provides consensus-based requirements and considerations for the online observation of tests, with or without a proctor. It reflects current thinking on fast-moving technologies, such as artificial intelligence, biometrics, and advanced algorithms, and on evolving regulations concerning privacy and accessibility. Click here for more info.


The Standards and Best Practices were contributed to and reviewed by testing professionals globally. These guidelines are intended for use by test sponsors, such as certification bodies and online observation providers, to outline responsibilities and best practices that ensure privacy, test security, and data validity. The Standards and Best Practices were developed during rapid changes in the assessment industry, driven by the COVID-19 pandemic. As online monitoring evolved, the document was revised multiple times and will be updated periodically due to the fast-paced advancements in assessment technologies.


ATP and NCTA members can access this document for free as a membership benefit by logging into the members-only section of the website and navigating to "Publications Discounts," while non-members can purchase it for $19.95. Click here for more info.


Friday, August 19, 2011

One of the best texts on intellectual assessment has been revised: Flanagan & Harrison's Contemporary Intellectual Assessment

Just in time for your XMAS shopping!!!! This is one of the best texts on intellectual assessment and on the theories related to the practice of intellectual assessment.

The publisher has given me permission to post this information. The text below does not show all the formatting in the original document sent to me, so if you want a nicer PDF version to share with others, click here.


Conflict of interest disclosure: I have coauthored a chapter in the book and will be splitting an honorarium check (not big, trust me) and will be receiving a free copy. But, I get no royalties (I wish I did).


Kudos to Drs. Flanagan and Harrison for revising what I consider one of the best texts on intellectual assessment.





NEW FROM THE GUILFORD PRESS (Revised and Expanded!)

Contemporary Intellectual Assessment, Third Edition
Theories, Tests, and Issues
Edited by Dawn P. Flanagan, PhD, Department of Psychology, St. John's University; and
Patti L. Harrison, PhD, Department of Educational Studies in Psychology, Research Methodology, and Counseling, University of Alabama

Available to Ship: December 2011
Copyright: 2012
Pages: 926
Size: 7" x 10"
Hardcover: ISBN 978-1-60918-995-2
Hardcover Price: $95.00 tentative/short discount
Prior edition copyright: 2005
Prior edition cloth ISBN: 978-1-59385-125-5
Website Category: EDUCATION: Educational Psychology; School Psychology. PSYCHOLOGY: School Psychology; Neuropsychology & Neuroscience; Child/Adolescent Clinical Psychology & Psychiatry.
Subject Areas/Keywords: adults, assessment instruments, batteries, children, cognitive assessments, developmental, educational psychology, evaluations, intellectual assessments, intellectual disabilities, intelligence testing, learning disabilities, response to intervention, RTI, tests
Internal Code: F
Date Issued: August 15, 2011

CRITICAL ACCLAIM
"This is the most comprehensive, authoritative, and up-to-date text on intellectual assessment available. It covers current theories of intelligence, methods of intelligence testing, and their applications to special populations. The chapters are all written by leading scholars and combine clear research summaries with practical advice, making this a valuable book for graduate students and professionals interested in both research and practice."—Paul J. Frick, PhD, University Distinguished Professor and Chair, Department of Psychology, University of New Orleans

"A single source for essential, detailed information on the past, present, and future of intellectual and cognitive assessment practices. The table of contents provides an objective sweep of all major theories, tests, and evaluation procedures at a glance; the expertise of the chapter authors results in a work that is consistently outstanding. The third edition reflects the latest efforts in clinical inquiry that explore cognitive abilities and processes as they relate to the real world. It emphasizes cross-cultural issues in assessment and incorporates new approaches and instruments related to learning and developmental problems. This is an essential text for graduate-level assessment courses."
—Elaine Fletcher-Janzen, EdD, Department of School Psychology, Chicago School of Professional Psychology

"This updated volume is a valuable addition to the field of intellectual and psychological assessment. The editors have assembled the leaders in the field to present the most up-to-date information available. Many of the chapters are written by the test authors and theory creators themselves; readers will benefit from the firsthand approach to test and theory interpretation. The book is written in a way that will appeal to both experienced practitioners and graduate students just starting out in the field."—Andrew S. Davis, PhD, Department of Educational Psychology, Ball State University

"A superb theoretical and clinical overview....The standards of theoretical and methodological rigor, comprehensive topical coverage, balanced and objective critical analysis, life span cognitive evaluation, and advocacy for special populations...are beautifully balanced. One could not ask for more from a review and critical evaluation of this extensive, rich, and complex literature."—PsycCRITIQUES

"Should be required reading in all intelligence testing courses and by anyone involved in the assessment of human and cognitive abilities."—Psychotherapy in Private Practice

DESCRIPTION
In one volume, this authoritative reference presents a current, comprehensive overview of intellectual and cognitive assessment, with a focus on practical applications. Leaders in the field describe major theories of intelligence and provide the knowledge needed to use the latest measures of cognitive abilities with individuals of all ages, from toddlers to adults. Evidence-based approaches to test interpretation, and their relevance for intervention, are described. The book addresses critical issues in assessing particular populations—including culturally and linguistically diverse students, gifted students, and those with learning difficulties and disabilities—in today's educational settings.

New to This Edition
*Incorporates major research advances and legislative and policy changes.
*Covers recent test revisions plus additional tests: the NEPSY-II and the WNV.
*Expanded coverage of specific populations: chapters on autism spectrum disorders, attention-deficit/hyperactivity disorder, sensory and physical disabilities and traumatic brain injury, and intellectual disabilities.
*Chapters on neuropsychological approaches, assessment of executive functions, and multi-tiered service delivery models in schools.

KEY POINTS
> An authoritative reference, revised and expanded: features 11 new chapters.
> Comprehensive and current: covers all major tests and how they relate to educational services and policy.
> Chapters are written by the test developers themselves.
> A successful practitioner handbook and graduate-level text.

AUDIENCE
Practitioners, students, and researchers in school, educational, cognitive, and child clinical psychology.

COURSE USE
Serves as a primary text in graduate-level intellectual/cognitive assessment courses.

CONTENTS
I. The Origins of Intellectual Assessment
1. A History of Intelligence Assessment: The Unfinished Tapestry, John D. Wasserman
2. A History of Intelligence Test Interpretation, Randy W. Kamphaus, Anne Pierce Winsor, Ellen W. Rowe, and Sangwon Kim
II. Contemporary Theoretical Perspectives
3. Foundations for Better Understanding of Cognitive Abilities, John L. Horn and Nayena Blankson
4. The Cattell–Horn–Carroll (CHC) Model of Intelligence, W. Joel Schneider and Kevin S. McGrew
5. Assessment of Intellectual Profile: A Perspective from Multiple-Intelligences Theory, Jie-Qi Chen and Howard Gardner
6. The Triarchic Theory of Successful Intelligence, Robert J. Sternberg
7. Planning, Attention, Simultaneous, Successive (PASS): A Cognitive Processing–Based Theory of Intelligence, Jack A. Naglieri, J. P. Das, and Sam Goldstein
III. Contemporary Intelligence, Cognitive, and Neuropsychological Batteries (and Associated Achievement Tests)
8. The Wechsler Adult Intelligence Scale–Fourth Edition (WAIS-IV) and the Wechsler Memory Scale–Fourth Edition (WMS-IV), Lisa Whipple Drozdick, Dustin Wahlstrom, Jianjun Zhu, and Lawrence G. Weiss
9. The Wechsler Preschool and Primary Scale of Intelligence–Third Edition (WPPSI–III), the Wechsler Intelligence Scale for Children–Fourth Edition (WISC–IV), and the Wechsler Individual Achievement Test–Third Edition (WIAT–III), Dustin Wahlstrom, Kristina C. Breaux, Jianjun Zhu, and Lawrence G. Weiss
10. The Stanford–Binet Intelligence Scales, Fifth Edition (SB5), Gale H. Roid and Mark Pomplun
11. The Kaufman Assessment Battery for Children–Second Edition (KABC-II) and the Kaufman Test of Educational Achievement–Second Edition (KTEA-II), Jennie Kaufman Singer, Elizabeth O. Lichtenberger, James C. Kaufman, Alan S. Kaufman, and Nadeen L. Kaufman
12. The Woodcock–Johnson III Normative Update (WJ III NU): Tests of Cognitive Abilities and Tests of Achievement, Fredrick A. Schrank and Barbara J. Wendling
13. The Differential Ability Scales–Second Edition (DAS-II), Colin D. Elliott
14. The Universal Nonverbal Intelligence Test (UNIT): A Multidimensional Nonverbal Alternative for Cognitive Assessment, R. Steve McCallum and Bruce A. Bracken
15. The Cognitive Assessment System (CAS): From Theory to Practice, Jack A. Naglieri and Tulio M. Otero
16. The Reynolds Intellectual Assessment Scales (RIAS) and the Reynolds Intellectual Screening Test (RIST), Cecil R. Reynolds, Randy W. Kamphaus, and Tara C. Raines
17. The NEPSY-II, Robb N. Matthews, Cynthia A. Riccio, and John L. Davis
18. The Wechsler Nonverbal Scale of Ability (WNV): Assessment of Diverse Populations, Jack A. Naglieri and Tulio M. Otero
IV. Contemporary Interpretive Approaches and Their Relevance for Intervention
19. The Cross-Battery Assessment (XBA) Approach: An Overview, Historical Perspective, and Current Directions, Dawn P. Flanagan, Vincent C. Alfonso, and Samuel O. Ortiz
20. Cognitive Hypothesis Testing (CHT): Linking Test Results to the Real World, Catherine A. Fiorello, James B. Hale, and Kirby L. Wycoff
21. Processing Approaches to Interpreting Information from Cognitive Ability Tests: A Critical Review, Randy G. Floyd and John H. Kranzler
22. Testing with Culturally and Linguistically Diverse Populations: Moving beyond the Verbal–Performance Dichotomy into Evidence-Based Practice, Samuel O. Ortiz, Salvador Hector Ochoa, and Agnieszka M. Dynda
23. Linking Cognitive Abilities to Academic Interventions for Students with Specific Learning Disabilities (SLD), Nancy Mather and Barbara J. Wendling
V. Assessment of Intelligence and Cognitive Functioning in Different Populations
24. Cognitive Assessment in Early Childhood: Theoretical and Practical Perspectives, Laurie Ford, Michelle L. Kozey, and Juliana Negreiros
25. Use of Intelligence Tests in the Identification of Giftedness, David E. McIntosh, Felicia A. Dixon, and Eric E. Pierson
26. Use of Ability Tests in the Identification of Specific Learning Disabilities (SLD) within the Context of an Operational Definition, Dawn P. Flanagan, Vincent C. Alfonso, Jennifer T. Mascolo, and Marlene Sotelo-Dynega
27. Assessment of Intellectual Functioning in Autism Spectrum Disorder (ASD), Laura Grofer Klinger, Sarah E. O’Kelley, Joanna L. Mussey, Sam Goldstein, and Melissa DeVries
28. Cognitive and Neuropsychological Assessment of ADHD: Redefining a Disruptive Behavior Disorder, James B. Hale, Megan Yim, Andrea N. Schneider, Gabrielle Wilcox, Julie N. Henzel, and Shauna G. Dixon
29. Intellectual and Neuropsychological Assessment of Individuals with Sensory and Physical Disabilities and Traumatic Brain Injury, Scott L. Decker, Julia A. Englund, and Alycia M. Roberts
30. Use of Intelligence Tests in the Identification of Children with Intellectual and Developmental Disabilities (IDD), Kathleen Armstrong, Jason Hangauer, and Joshua Nadeau
VI. Contemporary and Emerging Issues in Intellectual Assessment
31. Using Joint Test Standards to Evaluate the Validity Evidence for Intelligence Tests, Jeffery P. Braden and Bradley C. Niebling
32. Using Confirmatory Factor Analysis (CFA) to Aid in Understanding the Constructs Measured by Intelligence Tests, Timothy Z. Keith and Matthew R. Reynolds
33: The Emergence of Neuropsychological Constructs into Tests of Intelligence and Cognitive Abilities, Daniel C. Miller and Denise E. Maricle
34. The Role of Cognitive and Intelligence Tests in the Assessment of Executive Functions, Denise E. Maricle and Erin Avirett
35. Intelligence Tests in the Context of Emerging Assessment Practices: Problem-Solving Applications, Rachel Brown-Chidsey and Kristina J. Andren
36. Intellectual, Cognitive, and Neuropsychological Assessment in Three-Tier Service Delivery Practices in Schools, George McCloskey, James Whitaker, Ryan Murphy, and Jane Rogers
Appendix. The Three-Stratum Theory of Cognitive Abilities, John B. Carroll

CONTRIBUTORS
Vincent C. Alfonso, PhD, Graduate School of Education, Fordham University, New York, New York
Kristina J. Andren, PsyD, School Psychology Program, University of Southern Maine, Gorham, Maine
Kathleen Armstrong, PhD, Department of Pediatrics, University of South Florida, Tampa, Florida
Erin Avirett, BA, Department of Psychology and Philosophy, Texas Woman’s University, Denton, Texas
Nayena Blankson, PhD, Department of Psychology, Spelman College, Atlanta, Georgia
Bruce A. Bracken, PhD, School of Education, The College of William and Mary, Williamsburg, Virginia
Jeffery P. Braden, PhD, Department of Psychology, North Carolina State University, Raleigh, North Carolina
Kristina C. Breaux, PhD, The Psychological Corporation, San Antonio, Texas
Rachel Brown-Chidsey, PhD, School Psychology Program, University of Southern Maine, Gorham, Maine
John B. Carroll, PhD, Emeritus Professor of Psychology, University of North Carolina, Chapel Hill, North Carolina
Jie-Qi Chen, PhD, Erikson Institute, Chicago, Illinois
J. P. Das, PhD, Department of Educational Psychology, University of Alberta, Edmonton, Alberta, Canada
John L. Davis, MA, Department of Educational Psychology, Texas A&M University, College Station, Texas
Scott L. Decker, PhD, Department of Psychology, Barnwell College, University of South Carolina, Columbia, South Carolina
Melissa DeVries, PhD, Neurology, Learning, and Behavior Center, Salt Lake City, Utah
Felicia A. Dixon, PhD, Department of Educational Psychology, Ball State University, Muncie, Indiana
Shauna G. Dixon, MS, Graduate School of Education, Harvard University, Cambridge, Massachusetts
Lisa Whipple Drozdick, PhD, The Psychological Corporation, San Antonio, Texas
Agnieszka M. Dynda, PsyD, Department of Psychology, St. John’s University, Jamaica, New York
Colin D. Elliott, PhD, The Gevirtz School of Education, University of California, Santa Barbara, California
Julia A. Englund, BA, Department of Psychology, University of South Carolina, Columbia, South Carolina
Catherine A. Fiorello, PhD, NCSP, School Psychology Program and Department of Psychological Studies in Education, College of Education, Temple University, Philadelphia, Pennsylvania
Dawn P. Flanagan, PhD, Department of Psychology, St. John’s University, Jamaica, New York
Randy G. Floyd, PhD, Department of Psychology, University of Memphis, Memphis, Tennessee
Laurie Ford, PhD, Department of Educational and Counseling Psychology, University of British Columbia, Vancouver, British Columbia, Canada
Howard Gardner, PhD, Graduate School of Education, Harvard University, Cambridge, Massachusetts
Sam Goldstein, PhD, Neurology, Learning, and Behavior Center, Salt Lake City, Utah
James B. Hale, PhD, School Psychology Program, Philadelphia College of Osteopathic Medicine, Philadelphia, Pennsylvania
Jason Hangauer, EdS, Department of Pediatrics, University of South Florida, Tampa, Florida
Julie N. Henzel, PsyD, The Nisonger Center, Ohio State University, Columbus, Ohio
John L. Horn, PhD (deceased), Department of Psychology, University of Southern California, Los Angeles, California
Randy W. Kamphaus, PhD, College of Education, Georgia State University, Atlanta, Georgia
Alan S. Kaufman, PhD, Child Study Center, Yale University, New Haven, Connecticut
James C. Kaufman, PhD, Department of Psychology, California State University, San Bernardino, California
Nadeen L. Kaufman, PhD, Child Study Center, Yale University, New Haven, Connecticut
Timothy Z. Keith, PhD, Department of Educational Psychology, University of Texas at Austin, Austin, Texas
Sangwon Kim, PhD, Graduate School of Education, Fordham University, New York, New York
Laura Grofer Klinger, PhD, Department of Psychology, University of Alabama, Tuscaloosa, Alabama
Michelle L. Kozey, MA, Department of Educational and Counseling Psychology, University of British Columbia, Vancouver, British Columbia, Canada
John H. Kranzler, PhD, Special Education Program, College of Education, University of Florida, Gainesville, Florida
Elizabeth O. Lichtenberger, PhD, private practice, Carlsbad, California
Denise E. Maricle, PhD, Department of Psychology and Philosophy, Texas Woman’s University, Denton, Texas
Jennifer T. Mascolo, PsyD, Department of Psychology, St. John’s University, Jamaica, New York
Nancy Mather, PhD, Department of Disability and Psychoeducational Studies, College of Education, University of Arizona, Tucson, Arizona
Robb N. Matthews, MA, Department of Educational Psychology, Texas A&M University, College Station, Texas
R. Steve McCallum, PhD, Department of Educational Psychology and Counseling, University of Tennessee, Knoxville, Tennessee
George McCloskey, PhD, Department of Psychology, Philadelphia College of Osteopathic Medicine, Philadelphia, Pennsylvania
Kevin S. McGrew, PhD, Institute for Applied Psychometrics, St. Cloud, Minnesota
David E. McIntosh, PhD, Department of Educational Psychology, Ball State University, Muncie, Indiana
Daniel C. Miller, PhD, Department of Psychology and Philosophy, Texas Woman’s University, Denton, Texas
Ryan Murphy, EdS, Department of School Psychology, Philadelphia College of Osteopathic Medicine, Philadelphia, Pennsylvania
Joanna L. Mussey, MA, Department of Psychology, University of Alabama, Tuscaloosa, Alabama
Joshua Nadeau, MS, Department of Pediatrics, University of South Florida, Tampa, Florida
Jack A. Naglieri, PhD, ABAP, Department of Psychology, George Mason University, Fairfax, Virginia
Juliana Negreiros, MA, Department of Educational and Counseling Psychology, University of British Columbia, Vancouver, British Columbia, Canada
Bradley C. Niebling, PhD, Midwest Instructional Leadership Council, Urbandale, Iowa
Salvador Hector Ochoa, PhD, Department of Educational Psychology, University of Texas–Pan American, Edinburg, Texas
Sarah E. O’Kelley, PhD, Department of Psychiatry and Behavioral Neurobiology, University of Alabama, Tuscaloosa, Alabama
Samuel O. Ortiz, PhD, Department of Psychology, St. John’s University, Jamaica, New York
Tulio M. Otero, PhD, School Psychology Program, Chicago School of Professional Psychology, Chicago, Illinois
Eric E. Pierson, PhD, NCSP, HSPP, Department of Educational Psychology, Ball State University, Muncie, Indiana
Mark Pomplun, PhD, Riverside Publishing, Itasca, Illinois
Tara C. Raines, PsyS, Gwinnett County Public Schools, Gwinnett County, Georgia
Cecil R. Reynolds, PhD, Department of Educational Psychology, Texas A&M University, College Station, Texas
Matthew R. Reynolds, PhD, Department of Psychology and Research in Education, University of Kansas, Lawrence, Kansas
Cynthia A. Riccio, PhD, Department of Educational Psychology, Texas A&M University, College Station, Texas
Alycia M. Roberts, BA, Department of Psychology, University of South Carolina, Columbia, South Carolina
Jane Rogers, PsyD, Department of Psychology, Philadelphia College of Osteopathic Medicine, Philadelphia, Pennsylvania
Gale H. Roid, PhD, Department of Institutional Research, Warner Pacific College, Portland, Oregon
Ellen W. Rowe, PhD, Center for Psychological Services, George Mason University, Fairfax, Virginia
Andrea N. Schneider, BA, Department of Psychology, University of Victoria, Victoria, British Columbia, Canada
W. Joel Schneider, PhD, Department of Psychology, Illinois State University, Normal, Illinois
Fredrick A. Schrank, PhD, Woodcock–Muñoz Foundation, Olympia, Washington
Jennie Kaufman Singer, PhD, College of Health and Human Services, Sacramento State University, Sacramento, California
Marlene Sotelo-Dynega, PhD, Department of Psychology, St. John’s University, Jamaica, New York
Robert J. Sternberg, PhD, Provost, Oklahoma State University, Stillwater, Oklahoma
Dustin Wahlstrom, PhD, The Psychological Corporation, San Antonio, Texas
John D. Wasserman, PhD, Department of Psychology, George Mason University, Fairfax, Virginia
Lawrence G. Weiss, PhD, The Psychological Corporation, San Antonio, Texas
Barbara J. Wendling, MA, Consulting Services, Dallas, Texas
James Whitaker, PsyD, Department of Psychology, Philadelphia College of Osteopathic Medicine, Philadelphia, Pennsylvania
Gabrielle Wilcox, PsyD, Department of School Psychology, Philadelphia College of Osteopathic Medicine, Philadelphia, Pennsylvania
Anne Pierce Winsor, PhD, Department of Educational Psychology, University of Georgia, Athens, Georgia
Kirby L. Wycoff, EdM, NCSP, Graduate School of Applied and Professional Psychology, Rutgers, The State University of New Jersey, Piscataway, New Jersey
Megan Yim, BA, Department of Psychology, Victoria Island University, Victoria, British Columbia, Canada
Jianjun Zhu, PhD, The Psychological Corporation, San Antonio, Texas

GUILFORD PUBLICATIONS, INC.
72 Spring Street, New York, NY 10012
Tel: (212) 431-9800 | Toll Free: (800) 365-7006
Fax: (212) 966-6708 | E-mail: info@guilford.com
Visit our website: www.guilford.com


- iPost using BlogPress from Kevin McGrew's iPad



Saturday, April 24, 2010

Journal of Psych Assessment: New editor's focus on measurement



Kudos to Cecil Reynolds.  I like the new emphasis on basic measurement for the journal.

Reynolds, C. R. (2010). Measurement and assessment: An editorial view. Psychological Assessment, 22(1), 1–4.

If a thing exists, it can be measured. Measurement is a central component of assessment if we believe that fear, anxiety, intelligence, self-esteem, attention, and similar latent variables exist and are useful to us in developing an understanding of the human condition and leading us to ways to improve it. Much of what is published in Psychological Assessment deals with the development and the application of measurement devices of various sorts with the end goal of applications in assessment practice. What is submitted but not published largely deals with the same topics. As the new Editor writing the inaugural editorial, I am focusing on this topic for two major reasons. The first is that the most frequent reason why manuscripts are rejected in the peer-review process for Psychological Assessment and other high-quality journals devoted to clinical or neuropsychological assessment is inadequate attention to sound and high-quality measurement practices. The second reason is my surmise that measurement as a science is no longer taught with the rigor that characterized the earlier years of professional psychology. One of the tasks of Psychological Assessment is to promote a strong science of clinical assessment as practiced throughout professional psychology. To that end, I have attempted to pull together an eclectic group of Associate Editors and Consulting Editors. Our hope is to attract more and better manuscripts that deal with issues focusing on all aspects of clinical assessment.


Tuesday, August 04, 2009

Historical Brazil psychological assessment instruments

While attending the Brazilian assessment/measurement conference we were treated to a visit to a collection of historical psychological assessment instruments. Below are pictures of a few...and a few select comments. I've requested the name of the professional who has these instruments in her collection and will post her name as soon as I receive it. I found this very interesting....as I also collect old measures of cognitive functioning.

The materials were organized by Dr. Maria do Carmo Guedes, Pontifical Catholic University at Sao Paulo, Coordinator of the Center for Studies on the History of Psychology




An obvious concept formation category test (Gf)




A tool to measure some aspect of psychomotor strength (Gp)




A measure of psychomotor dexterity and speed (Gp, Gps)




A very neat HUGE single block assembly test (Gv)




A measure of psychomotor coordination (remember the game "Operation"---very similar feedback concept)




The classic Kohs blocks




An ingenious test. You need to remove one piece at a time without the thing falling apart. Most likely a mixed measure of Gv, Gf (planning), psychomotor dexterity and executive functions.





Some kind of Gv assembly task. Reminded me of our old "erector set" toy from childhood.



A "lock box" test. You must figure out how to unlock the box, via the various locking mechanisms as quickly as you can.






Tuesday, July 07, 2009

WJ III CHC cluster g and specificity characteristics: Floyd et al. (2009) published

The following article, which was previously previewed in an earlier post, has now been published. Check the prior post for links to supplementary tables mentioned in the article. [Conflict of interest disclosure - I am a co-author of the WJ III, which was the focus of this publication]
  • Floyd, R. G., McGrew, K. S., Barry, A., Rafael, F., & Rogers, J. (2009). General and Specific Effects on Cattell–Horn–Carroll Broad Ability Composites: Analysis of the Woodcock–Johnson III Normative Update CHC Factor Clusters Across Development. School Psychology Review, 38(2), 249–265.
Abstract
Many school psychologists focus their interpretation on composite scores from intelligence test batteries designed to measure the broad abilities from the Cattell–Horn–Carroll theory. The purpose of this study was to investigate the general factor loadings and specificity of the broad ability composite scores from one such intelligence test battery, the Woodcock–Johnson III Tests of Cognitive Abilities Normative Update (Woodcock, McGrew, Schrank, & Mather, 2007). Results from samples beginning at age 4 and continuing through age 60 indicate that Comprehension–Knowledge, Long-Term Retrieval, and Fluid Reasoning appear to be primarily measures of the general factor at many ages. In contrast, Visual–Spatial Thinking, Auditory Processing, and Processing Speed appear to be primarily measures of specific abilities at most ages. We offer suggestions for considering both the general factor and specific abilities when interpreting Cattell–Horn–Carroll broad ability composite scores.
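The distinction the abstract draws between general factor variance and specificity can be made concrete with a classical-test-theory sketch. This is my own toy illustration, not code or data from the article: a composite's reliable variance splits into the portion shared with the general factor (the squared g loading) and the portion unique to the broad ability (specificity), with the remainder being error. The numeric values below are hypothetical.

```python
# Toy variance partition for a composite score, classical-test-theory style.
# Assumed convention: specificity = reliability - (g loading squared).

def variance_partition(g_loading: float, reliability: float) -> dict:
    """Split a composite's total variance into general, specific, and error parts.

    general     = g_loading ** 2          (variance shared with g)
    specificity = reliability - general   (reliable variance unique to the ability)
    error       = 1 - reliability
    """
    general = g_loading ** 2
    specificity = reliability - general
    error = 1.0 - reliability
    return {"general": general, "specificity": specificity, "error": error}

# Hypothetical composites: one with a strong g loading behaves mainly as a
# g measure; one with a weaker g loading retains more specific variance.
strong_g = variance_partition(g_loading=0.85, reliability=0.90)
weak_g = variance_partition(g_loading=0.55, reliability=0.90)
print(strong_g)  # general (0.7225) dominates specificity (0.1775)
print(weak_g)    # specificity (0.5975) dominates general (0.3025)
```

With these made-up numbers, the first composite would be interpreted mainly as a g indicator and the second mainly as a measure of its specific broad ability, which is the interpretive contrast the abstract describes.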

Wednesday, July 01, 2009

Applied Psych Test Development Series: Parts D/E--Develop norm plan and calculate norms

The fourth and fifth modules in the series Art and Science of Applied Test Development are now available.

The fourth module (Part D--Develop norm [standardization] plan) is now available.

The fifth module (Part E--Calculate norms and derived scores) is also now available.


These are the fourth and fifth in a series of PPT modules explicating the development of psychological tests in the domain of cognitive ability using contemporary methods (e.g., theory-driven test specification; IRT-Rasch scaling; etc.). The presentations are intended to be conceptual and not statistical in nature. Feedback is appreciated.
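As a toy illustration of the "calculate norms and derived scores" step these modules cover (my own hypothetical example, not material from the slides): a raw score is located in a norm sample's distribution as a z score and then rescaled to the familiar standard score metric (mean 100, SD 15) used by most intelligence batteries.

```python
import statistics

# Hedged sketch: convert a raw score to a derived standard score against a
# hypothetical norm group. Real norming uses smoothed, continuous norms;
# this shows only the final raw-to-derived-score arithmetic.

def standard_score(raw: float, norm_sample: list[float],
                   mean: float = 100.0, sd: float = 15.0) -> float:
    """z = (raw - sample mean) / sample SD, then rescale to SS = 100 + 15z."""
    z = (raw - statistics.mean(norm_sample)) / statistics.pstdev(norm_sample)
    return mean + sd * z

# Hypothetical raw scores for one age group of a norm sample:
norms = [38, 42, 45, 47, 50, 52, 55, 58, 61, 52]
print(round(standard_score(50, norms)))  # raw score at the sample mean -> 100
```

In practice the norm plan (Part D) determines who is in `norm_sample` and how scores are smoothed across age groups; the arithmetic above is only the last step.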

This project can be tracked on the left-side pane of the blog under the heading of Applied Test Development Series.

The first module (Part A: Planning, development frameworks & domain/test specification blueprints) was posted previously and is accessible via SlideShare.

The second module (Part B: Test and item development) was posted previously and is accessible via SlideShare.

The third module (Part C--Use of Rasch scaling technology) was posted previously and is accessible via Slideshare.

You are STRONGLY encouraged to view the modules in order, as the concepts, graphic representations, and ideas build on each other from start to finish.

Enjoy...more to come.


Monday, June 29, 2009

Applied Psych Test Development Series: Part C--Use of Rasch scaling technology

The third module in the series Art and Science of Applied Test Development (Part C: Test and Item Development--Use of Rasch Scaling Technology) is now available.

This is the third in a series of PPT modules explicating the development of psychological tests in the domain of cognitive ability using contemporary methods (e.g., theory-driven test specification; IRT-Rasch scaling; etc.). The presentations are intended to be conceptual and not statistical in nature. Feedback is appreciated.
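For readers unfamiliar with the model behind the "IRT-Rasch scaling" mentioned above, a minimal sketch may help. This is my own illustration, not material from the slide modules: in the Rasch model, the probability of a correct response depends only on the difference between person ability (theta) and item difficulty (b), both expressed on the same logit scale.

```python
import math

# Rasch model item response function:
#   P(correct) = exp(theta - b) / (1 + exp(theta - b))
# theta = person ability in logits, b = item difficulty in logits.

def rasch_p_correct(theta: float, b: float) -> float:
    """Probability that a person at ability theta answers an item of
    difficulty b correctly, under the one-parameter (Rasch) model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty the model predicts a 50% success rate;
# each logit of advantage multiplies the odds of success by e.
print(rasch_p_correct(0.0, 0.0))   # 0.5
print(rasch_p_correct(1.0, 0.0))   # ~0.731
print(rasch_p_correct(-1.0, 0.0))  # ~0.269
```

This shared logit scale is what lets test developers place items and people on one common metric, which is the practical payoff of Rasch scaling in applied test construction.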

This project can be tracked on the left-side pane of the blog under the heading of Applied Test Development Series.

The first module (Part A: Planning, development frameworks & domain/test specification blueprints) was posted previously and is accessible via SlideShare.

The second module (Part B: Test and item development) was posted previously and is accessible via SlideShare.

You are STRONGLY encouraged to view the modules in order, as the concepts, graphic representations, and ideas build on each other from start to finish.

Enjoy...more to come.


Wednesday, June 17, 2009

WMF Press: Dean-Woodcock Neuropsych Report Software


The Woodcock-Muñoz Foundation (WMF) Press has just released its first publication: a piece of neuropsychological assessment report software.

The Dean-Woodcock Neuropsychological Report is a scoring and interpretive program that assists qualified evaluators and service providers in creating neuropsychological reports based on test results from the Dean-Woodcock Neuropsychological Battery, the Woodcock-Johnson® III, and the Batería III NU Woodcock-Muñoz®. Click here to be taken to the page, which also lists requirements necessary to download the software for free.

[Conflict of interest: I'm the Research Director for WMF and am also a coauthor of the Woodcock-Johnson III.]


Tuesday, June 16, 2009

WCST: Does it really measure frontal lobe executive functions?



Does the WCST measure executive functioning? There is little doubt that the WCST is one of the predominant tests used in neuropsychological assessment to assess executive functions. However, studies have recently questioned the validity of drawing inferences about the site of executive functions (the frontal lobes of the brain) from performance on the WCST. The following "in press" article, which presents a nice review of the literature, suggests that in its current administration formats the WCST is not the sensitive measure of frontal lobe executive functioning it is often thought to be. Below I present the abstract, a few select passages, and the primary conclusion from this excellent review article.

Nyhus, E., & Barceló, F. (in press). The Wisconsin Card Sorting Test and the cognitive assessment of prefrontal executive functions: A critical update. Brain and Cognition.

Abstract

For over four decades the Wisconsin Card Sorting Test (WCST) has been one of the most distinctive tests of prefrontal function. Clinical research and recent brain imaging have brought into question the validity and specificity of this test as a marker of frontal dysfunction. Clinical studies with neurological patients have confirmed that, in its traditional form, the WCST fails to discriminate between frontal and non-frontal lesions. In addition, functional brain imaging studies show rapid and widespread activation across frontal and non-frontal brain regions during WCST performance. These studies suggest that the concept of an anatomically pure test of prefrontal function is not only empirically unattainable, but also theoretically inaccurate. The aim of the present review is to examine the causes of these criticisms and to resolve them by incorporating new methodological and conceptual advances in order to improve the construct validity of WCST scores and their relationship to prefrontal executive functions. We conclude that these objectives can be achieved by drawing on theory-guided experimental design, and on precise spatial and temporal sampling of brain activity, and then exemplify this using an integrative model of prefrontal function [i.e., Miller, E. K. (2000). The prefrontal cortex and cognitive control. Nature Reviews Neuroscience, 1, 59–65.] combined with the formal information theoretical approach to cognitive control [Koechlin, E., & Summerfield, C. (2007). An information theoretical approach to prefrontal executive function. Trends in Cognitive Sciences, 11, 229–235.].

According to the authors, there are at least two different systems of administration and scoring of the WCST. There is the "standard version by Grant and Berg (1948) with Milner's (1963) correction criteria and the shortened version by Heaton (Heaton, 1981; Heaton, Chelune, Talley, Kay, & Curtis, 1993)." Furthermore, the test has been administered in modified versions by Nelson (1976), Delis, Squire, Bihrle, and Massman (1992), and Barceló (1999, 2003).

In the conventional administration:
the WCST consists of four key cards and 128 response cards with geometric figures that vary according to three perceptual dimensions (color, form, or number). The task requires subjects to find the correct classification principle by trial and error and examiner feedback. Once the subject chooses the correct rule they must maintain this sorting principle (or set) across changing stimulus conditions while ignoring the other – now irrelevant – stimulus dimensions. After ten consecutive correct matches, the classification principle changes without warning, demanding a flexible shift in set. The WCST is not timed and sorting continues until all cards are sorted or a maximum of six correct sorting criteria have been reached.
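The administration mechanics quoted above (trial-and-error matching against examiner feedback, and an unsignaled rule shift after ten consecutive correct sorts, ending after six correct sorting criteria) can be sketched as a toy simulation. The simulated "subject" and its simple hypothesis-switching strategy are illustrative assumptions of mine, not part of the actual instrument or a model of real examinees:

```python
import random

DIMENSIONS = ["color", "form", "number"]  # the three perceptual dimensions

def simulate_wcst(n_trials=128, seed=1):
    """Toy simulation of the conventional WCST administration.
    The 'subject' holds a single sorting hypothesis and, on negative
    feedback, abandons it and tries another dimension by trial and error."""
    rng = random.Random(seed)
    rule = rng.choice(DIMENSIONS)        # examiner's hidden sorting principle
    hypothesis = rng.choice(DIMENSIONS)  # subject's current sorting hypothesis
    consecutive, completed_sets = 0, 0
    for _ in range(n_trials):            # 128 response cards in the standard deck
        if hypothesis == rule:           # positive feedback: maintain the set
            consecutive += 1
            if consecutive == 10:        # ten consecutive correct matches:
                completed_sets += 1      # criterion met; rule changes without warning
                if completed_sets == 6:  # test ends after six correct sorting criteria
                    break
                rule = rng.choice([d for d in DIMENSIONS if d != rule])
                consecutive = 0
        else:                            # negative feedback: shift hypothesis
            hypothesis = rng.choice([d for d in DIMENSIONS if d != hypothesis])
            consecutive = 0
    return completed_sets
```

The sketch makes the review's point easy to see in miniature: what the test scores is behavioral success at maintaining and shifting a set, which by itself says nothing about *where* in the brain that control is implemented.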
Conclusions
The present interest in prefrontal cortex function has renewed the use of the WCST in clinical and experimental settings. However, much criticism has questioned the utility of this test as a marker of prefrontal function. A critical review of clinical studies suggests that the original WCST does not distinguish between frontal and non-frontal lesions. Likewise, functional neuroimaging studies confirm that delivery of negative feedback during WCST rule transitions activates a widespread network of frontal and non-frontal regions within a split-second time scale. New methodological and conceptual advances from theory-guided experimental designs, precise spatial and temporal sampling of brain activity, and modern integrative models of prefrontal function (Miller, 2000) combined with a formal information theoretical approach to cognitive control (Koechlin & Summerfield, 2007) can improve our understanding of the WCST and its relationship to prefrontal executive functions. These advances suggest that simple modifications of the original version of the WCST may offer more valid and reliable measures of key component operations, such as the maintenance, shifting, and updating of task-set information over trials. Fast brain imaging techniques help us put into perspective the specificity of the test as a marker of prefrontal function as a key node within the widely distributed and tightly interconnected neural networks subserving human cognition.


Thursday, June 11, 2009

Art and Science of Test Development: Encyclopedia of Gv (visual-spatial) tests



Yesterday I blogged about my "Art and Science of Test Development" project.  Today I continue with another tidbit.

When trying to develop new tests to measure cognitive abilities, a test developer often looks at existing tests (what's been done before).  Many years ago I was given a Xeroxed copy of the large and out-of-print "International Directory of Spatial Tests," which is a godsend if you are thinking of developing a new measure in the domain of Gv (visual processing).  It includes images and brief descriptions of over 390 different visual-spatial tests grouped into 13 categories.  We (authors of the WJ III) used it to decide on what new Gv test we might add to the WJ III.  We ended up with the WJ III Block Rotation test (in the WJ III Diagnostic Supplement).  Although dated, and thus not inclusive of Gv test innovations since 1983, it is a tremendous resource.

Even if you are not looking to develop new Gv tests, reviewing all the various tests (and permutations of common formats) is very interesting, and you can often see the historical roots of individual tests in many contemporary intelligence batteries.

While searching the net yesterday I discovered that this publication is now available for download as a PDF file.  You can find it at the link above.  Be forewarned: it is a huge file (35 MB), so my advice is not to download it over a wifi connection.


Wednesday, May 27, 2009

European Journal of Psych Assessment V25(2)

A new issue is available for the following Hogrefe & Huber journal:

European Journal of Psychological Assessment

Volume 25, Issue 2

Assessing cognitive failures.
Pages 69-72
Efklides, Anastasia; Sideridis, Georgios D.
FACT-2—The Frankfurt Adaptive Concentration Test: Convergent validity with self-reported cognitive failures.
Pages 73-82
Goldhammer, Frank; Moosbrugger, Helfried; Krawietz, Sabine A.
Situational variability of experiential and rational information-processing styles in stressful situations.
Pages 107-114
Claes, Laurence; Witteman, Cilia; van den Bercken, John
The Blank in the Mind Questionnaire (BIMQ).
Pages 115-122
Moraitou, Despina; Efklides, Anastasia
Realism of confidence judgments.
Pages 123-130
Stankov, Lazar; Lee, Jihyun; Paek, Insu