EVALUATING DIMENSIONALITY OF SCALES: IMPLICATIONS FOR TOOL UTILITY

Arnold S1, Kolobe THA1, Smith E2
1University of Oklahoma Health Sciences Center, Department of Rehabilitation Sciences, Oklahoma City, United States, 2University of Illinois at Chicago, Department of Psychology, Chicago, United States

Background: Physical therapy (PT) literature and practice rely on outcome measures that use total or subscale scores. While the responsiveness of these outcome measures is at the center of controversy regarding inconclusive clinical findings and patient change, their validation process has received less attention. Most PT outcome measures are validated using True Score Theory (TST) approaches, despite the limitations of criterion scores and the difficulty of interpreting change scores when raw scores are used. For measuring change, item difficulty and responsiveness are as important as subscale or total scores.

Purpose: To evaluate the dimensionality of the School Outcomes Measure (SOM), which was constructed and validated with the TST approach, using the Rasch analysis model. The SOM is a minimal-dataset program outcome measure of PT and occupational therapy (OT) services provided to students with disabilities in school settings. Rasch analysis, a form of item response theory, assesses item difficulty and a range of abilities that are specific to individual performance, and uses item fit statistics to measure underlying scale dimensions.
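For reference, the dichotomous Rasch model expresses the probability of an affirmed response as a function of person ability and item difficulty (this is the standard formulation, shown for context; the specific parameterization used for the SOM is not detailed here):

```latex
P(X_{ni} = 1 \mid \theta_n, b_i) = \frac{e^{\theta_n - b_i}}{1 + e^{\theta_n - b_i}}
```

where \(\theta_n\) is the ability of student \(n\) and \(b_i\) is the difficulty of item \(i\), both on a common logit scale.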

Methods: We used Rasch analysis on SOM data from 100 elementary and 108 secondary students to determine scale dimensionality and compute student ability and item difficulty estimates. We examined item fit statistics to evaluate whether the data adhered to the requirements of a unidimensional model. An Outfit MnSq > 1.3 that co-occurred with an Outfit Zstd > 2.0 represented misfit. We further assessed model requirements through principal component analysis (PCA) of the residuals to detect departures from the unidimensionality requirement of the Rasch model. We inspected the item maps to see whether the SOM item difficulties and thresholds were spaced along the continuum, and whether items were too easy or too difficult relative to students' abilities. We considered items well targeted if estimates of item difficulties and thresholds fell within +/-2 logits of a person's ability.
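The two decision rules above (joint Outfit MnSq/Zstd misfit, and the +/-2 logit targeting window) can be sketched as follows. This is a minimal illustration, not the authors' analysis software; the item names and fit values are hypothetical:

```python
# Sketch of the Rasch screening rules described in the Methods:
# an item misfits only when Outfit MnSq > 1.3 AND Outfit Zstd > 2.0,
# and it is well targeted if its difficulty is within +/-2 logits of ability.

def is_misfit(outfit_mnsq, outfit_zstd, mnsq_cut=1.3, zstd_cut=2.0):
    """Flag misfit only when both criteria are exceeded."""
    return outfit_mnsq > mnsq_cut and outfit_zstd > zstd_cut

def is_well_targeted(item_difficulty, person_ability, window=2.0):
    """Item difficulty falls within +/-2 logits of the person's ability."""
    return abs(item_difficulty - person_ability) <= window

# Hypothetical items: (Outfit MnSq, Outfit Zstd)
items = {
    "item_A": (1.45, 2.6),  # both thresholds exceeded -> misfit
    "item_B": (1.40, 1.1),  # elevated MnSq alone -> not flagged
    "item_C": (0.95, 0.3),  # fits the model
}

misfits = [name for name, (mnsq, zstd) in items.items() if is_misfit(mnsq, zstd)]
print(misfits)  # ['item_A']
```

Requiring both statistics to exceed their cutoffs guards against flagging items whose inflated MnSq reflects small-sample noise rather than genuine departure from the model.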

Results: The results supported the construct validity and multidimensionality of the SOM for both groups. However, the PCA revealed two subscales for elementary students (Mobility and Manipulation in Learning) and three for secondary students (Mobility, Manipulation in Learning, and Behavior), compared to the original five. Furthermore, some of the items that misfit the Rasch model for elementary students (n = 3) differed from those that misfit it for secondary students (n = 5). Several items showed redundancy in terms of difficulty and were eliminated.

Conclusion(s): Our findings, while supporting the construct validity of the SOM, revealed scalar and psychometric problems that have been shown to negatively affect the responsiveness of outcome measures. First, a relatively high number of items had redundant difficulty levels. Second, items functioned differentially for elementary and secondary students. Third, several items misfit the model. The new subscale items fit the Rasch model, formed predictable hierarchies, and showed limited redundancy in item difficulty. Further analysis should examine dimensionality based on disability severity levels.

Implications: The SOM has accountability implications for PT. The revised SOM increases the potential to measure change in student performance compared to the previously used total raw score that could obscure change. The results can also inform responsiveness studies of the SOM.

Keywords: Outcome measure, Rasch analysis, school-based therapy

Funding acknowledgements: Presbyterian Health Foundation
U.S. Department of Education, Institute of Education Sciences

Topic: Outcome measurement; Paediatrics; Disability & rehabilitation

Ethics approval required: Yes
Institution: The University of Oklahoma Health Sciences Center (OUHSC)
Ethics committee: OUHSC Institutional Review Board
Ethics number: 1825; 7032

