Date of Award
12-2016
Degree Type
Dissertation
Degree Name
Doctor of Philosophy (PhD)
Department
Psychological Sciences
First Advisor
Deborah E. Rupp
Committee Chair
Deborah E. Rupp
Committee Member 1
Brian Hoffman
Committee Member 2
Jeffrey D. Karpicke
Committee Member 3
Rong Su
Abstract
Decades of assessment center (AC) research have resulted in an inevitable “validity ceiling,” whereby increasing the validity of the AC method has become progressively more difficult. To overcome this challenge, new avenues for collecting and evaluating AC participant behaviors must be explored, with a particular focus on overcoming the inherent limitations of human observation—a hallmark of the AC method. This study examines detailed logs of AC participant behaviors captured automatically and unobtrusively during a computer-based simulation assessment. Using a decision making framework, basic characteristics of the new behavioral data are tested against existing theories of decisional efficacy. The construct-related validity of the new decision-related behavioral data is examined through the effect of decisional efficacy on overall assessor ratings of simulation performance. Results support the validity of these new behavioral data as containing information related to decisional processes. Additionally, multiple types of cluster analysis are considered in order to identify general patterns within the decisional process data that impact overall simulation performance as rated by assessors. Depending on the method used, a 7-group, 3-group, or 2-group solution appeared to fit the data, and both the 7-group and 3-group solutions identified groups with mean differences in assessor ratings, suggesting that, despite the absence of distinct, stable groups, patterns in behavioral data are related to simulation performance. Furthermore, the clusters identified in all analyses appear to be driven both by time-related characteristics of decisional process steps and by patterns in how time is distributed among multiple tasks within the simulation. Conclusions, immediate next steps, and general directions for future research are discussed.
Recommended Citation
Guidry, Brett W., "Finding the ghost with the machine: Breaking through the assessment center validity ceiling by exploring decisional processes using new sources of behavioral data within virtual assessments" (2016). Open Access Dissertations. 927.
https://docs.lib.purdue.edu/open_access_dissertations/927