- Multilevel/multidimensional/mixture item response theory (IRT) models for both dichotomous and polytomous responses (e.g., Likert scales, nominal responses), with applications in achievement testing, aptitude testing, personality assessment, and health measurement
- Development of new scales/instruments
- Refinement and validation of existing scales (e.g., checking for measurement invariance across cultures, groups, etc.)
- Computerized adaptive testing (CAT) and applications
- Cognitive diagnostic modeling and its applications in classroom assessment and learning
- General statistical methods and applications including adaptive design, longitudinal models, etc.
Areas of Interest
My scientific career is broadly situated in the field of educational and psychological measurement, with particular devotion to methodological advances that lead to better assessment with higher reliability/fidelity, fairness, and security. I am passionate about improving methods for measuring a wide range of educational and psychological variables, as well as developing, refining, and extending methods for analyzing multivariate data that are widely used in the behavioral sciences. The first thrust of my work has centered on resolving challenges that have emerged from the wide-ranging implementation of CAT. The second line of my core research agenda has focused on developing innovative models and methods to better understand nonlinear relationships among observed and latent variables using state-of-the-art latent variable methods, including multidimensional and/or multilevel item response theory models, cognitive diagnostic models, and mixture models. I also welcome potential collaborations to apply advanced psychometric methods broadly in education, psychology, and health research.
B.S. in Psychology, Peking University (Beijing, China)
M.S. in Statistics, University of Illinois at Urbana-Champaign
Ph.D. in Quantitative Psychology, University of Illinois at Urbana-Champaign
2018, Best Reviewer Award for Psychometrika (Psychometric Society)
2017, McKnight Presidential Fellow, University of Minnesota
2017, Early Career Award, Psychometric Society
2016, Best Reviewer Award for Psychometrika (Psychometric Society)
2016, Outstanding Reviewer Award for Journal of Educational and Behavioral Statistics (AERA & ASA)
2015, Early Career Award, AERA Division D (Quantitative Research Methodology)
2014, Early Career Award, International Association for Computerized Adaptive Testing
2014, Post-doctoral Fellow, National Academy of Education/Spencer Foundation
2014, Jason Millman Promising Measurement Scholar Award, NCME
2013, State-of-the-Art Lecturer Award, Psychometric Society
2013, Alicia Cascallar Best Paper Award, NCME
Latest articles (* indicates an advised student)
Wang, C., Xu, G., & Zhang, X.* (Accepted). Correction for item response theory latent trait measurement error in linear mixed effects models. Psychometrika.
Lu, J.*, Wang, C., Zhang, J., & Tao, J. (Accepted). A mixture model for responses and response times with a higher-order ability structure to detect rapid guessing behavior. British Journal of Mathematical and Statistical Psychology.
Wang, C., David, W., & Su, S. (2019). Modeling response time and responses in multidimensional health measurement. Frontiers in Psychology. https://doi.org/10.3389/fpsyg.2019.00051
Wang, C., David, W., & Shang, Z. (2018). Variable-length stopping rules for multidimensional computerized adaptive testing. Psychometrika. https://doi.org/10.1007/s11336-018-9644-7
Wang, C., & Zhang, X.* (2018). A note on the conversion of item parameter standard errors. Multivariate Behavioral Research. https://doi.org/10.1080/00273171.2018.1513829
Zhu, Z.*, Wang, C., & Tao, J. (2018). A two-parameter logistic extension model: An efficient variant of the three-parameter logistic model. Applied Psychological Measurement. https://doi.org/10.1177/0146621618800273
For a full list of publications, please see my Google Scholar page.
For source code from these projects, please see my Lab (Psychometrics/Measurement Lab) page.