Lee, Young-Sun (sly2003)

Young-Sun Lee

Associate Professor of Psychology and Education
212-678-8304

Office Location:

549 Grace Dodge

Office Hours:

By appointment. Students can book office hours with me through https://yslee.youcanbook.me/

Educational Background

Ph.D. University of Wisconsin-Madison, Educational Psychology (Educational Measurement & Statistics), 2002.
M.A. Ewha Womans University, Seoul, South Korea, Educational Measurement & Evaluation, 1995.
B.A. Ewha Womans University, Seoul, South Korea, Education, 1992.

Scholarly Interests

Psychometrics (Classical Test Theory, Item Response Theory, & Cognitive Diagnosis Modeling), Educational and Psychological Measurement, and Applied Statistics

Selected Publications

von Davier, M., & Lee, Y.-S. (Eds.). (2019). Handbook of Diagnostic Classification Models. New York: Springer.

von Davier, M., & Lee, Y.-S. (2019). Introduction: From Latent Class Analysis to DINA and Beyond. In von Davier, M., & Lee, Y.-S. (Eds.), Handbook of Diagnostic Classification Models (pp. 1-17). New York: Springer.  

Park, Y. S., & Lee, Y.-S. (2019). Explanatory Diagnostic Models. In von Davier, M., & Lee, Y.-S. (Eds.), Handbook of Diagnostic Classification Models (pp. 207-222). New York: Springer.  

Lee, Y.-S., & Luna Bazaldúa, D. (2019). How to Conduct a Study with Diagnostic Models. In von Davier, M., & Lee, Y.-S. (Eds.), Handbook of Diagnostic Classification Models (pp. 525-545). New York: Springer.  

Park, Y. S., Xing, K., & Lee, Y.-S. (2018). Explanatory Cognitive Diagnostic Models: Incorporating Latent and Observed Predictors. Applied Psychological Measurement, 42(5), 376-392. 

Luna Bazaldúa, D. A., Lee, Y.-S., Keller, B., & Fellers, L. (2017). Assessing the Performance of Classical Test Theory Item Discrimination Estimators in Monte Carlo Simulations. Asia Pacific Education Review, 18(4), 585-598.

Park, J. Y., Lee, Y.-S., & Johnson, J. (2017). An Efficient Standard Error Estimator of the DINA Model Parameters When Analyzing Clustered Data. International Journal of Quantitative Research in Education, 4, 159-190. 

Schnur, J. B., Chaplin, W. F., Khurshid, K., Mogavero, J. N., Goldsmith, R. E., Lee, Y.-S., Litman, L., & Montgomery, G. H. (2017). Development of the Healthcare Triggering Questionnaire in Adult Sexual Abuse Survivors. Psychological Trauma: Theory, Research, Practice, and Policy.

Chiang, H.-M., Ni, X., & Lee, Y.-S. (2017). Life Skills Training for Middle and High School Students with Autism. Journal of Autism and Developmental Disorders, 47(4), 1113-1121.

Lee, Y.-S., Park, Y. S., & Ginsburg, H.P. (2016). Socio-Economic Status Differences in Mathematics Accuracy, Strategy Use, and Profiles in the Early Years of Schooling. ZDM: The International Journal on Mathematics Education, 48(7), 1065-1078.

Lembke, E., Lee, Y.-S., Park, Y. S., & Hampton, D. (2016). Longitudinal Growth on Curriculum-Based Measurements Mathematics Measures for Early Elementary Students. ZDM: The International Journal on Mathematics Education, 48(7), 1049-1063.

Ginsburg, H.P., Lee, Y.-S., & Pappas, S. (2016). Using the Clinical Interview and Curriculum Based Measurement to Examine Risk Levels. ZDM: The International Journal on Mathematics Education, 48(7), 1031-1048.  

Lee, Y.-S., & Lembke, E. (2016). Developing and evaluating a Kindergarten to third grade CBM mathematics assessment. ZDM: The International Journal on Mathematics Education, 48(7), 1019-1030. 

Ginsburg, H.P., Lee, Y.-S., & Pappas, S. (2016). A Research-Inspired and Computer-Guided Clinical Interview for Math Assessment: Introduction, Reliability and Validity. ZDM: The International Journal on Mathematics Education, 48(7), 1003-1018.

Park, Y. S., Lee, Y.-S., & Xing, K. (2016). Investigating the Impact of Item Parameter Drift for Item Response Theory Models with Mixture Distributions. Frontiers in Psychology, Quantitative Psychology and Measurement, 7(255). DOI: 10.3389/fpsyg.2016.00255.

Lee, Y.-S. (2016). Psychometric analyses of the Birthday Party. ZDM: The International Journal on Mathematics Education, 48(7), 961-975. 

Park, J. Y., Johnson, M., & Lee, Y.-S. (2015). Posterior predictive model checks for cognitive diagnostic models. International Journal of Quantitative Research in Education, 2(3/4), 244-264.

Choi, K. M., Lee, Y.-S., & Park, Y. S. (2015). What CDM can tell about what students have learned: An analysis of TIMSS eighth grade mathematics. Eurasia Journal of Mathematics, Science and Technology Education, 11(2), 219-231.

Liu, H.-T., & Lee, Y.-S. (2015). Measuring self-regulation in second language learning: Rasch analysis. SAGE Open, July-September 2015, 1-12. DOI: 10.1177/2158244015601717

Park, Y. S., & Lee, Y.-S. (2014). An Extension of the DINA model using covariates: Examining factors affecting response probability and latent classification. Applied Psychological Measurement, 38(5), 376-390.

de la Torre, J., & Lee, Y.-S. (2013). Evaluating the Wald test for item-level comparison of saturated and reduced models in cognitive diagnosis. Journal of Educational Measurement, 50(4), 355-373. 

Lee, Y.-S., Park, Y. S., Song, M. Y., Kim, S. E., Lee, Y. J., & In, B. R. (2012). Investigating Score Reporting of Attribute Profiles from the National Assessment of Educational Achievement using Cognitive Diagnostic Models. Journal of Educational Evaluation, 25(3), 411-433. (Written in Korean)

Lee, Y.-S., de la Torre, J., & Park, Y. S. (2012). Cognitive diagnosticity of IRT-constructed assessment: An empirical investigation. Asia Pacific Education Review, 13(2), 333-345. 

Park, Y. S., Lee, Y.-S., Lee, Y. J., In, B. R., Kim, S. E., & Song, M. Y. (2012). Multilevel analysis of a cognitive diagnostic model using National Assessment of Educational Achievement: Examining differences between regions. Journal of Educational Evaluation, 25(2), 193-212. (Written in Korean)

Kim, S. E., Park, Y. S., & Lee, Y.-S. (2012). Application of Latent Class Model to Multiple Strategy CDM Analysis. Journal of Educational Evaluation, 25(1), 49-68. (Written in Korean)

Lee, Y.-S., Krishnan, A., & Park, Y. S. (2012). Psychometric properties of the Children's Depression Inventory: An IRT analysis across age in a non-clinical, longitudinal, adolescent sample. Measurement and Evaluation in Counseling and Development, 45(2), 84-100.

Lee, J., & Lee, Y.-S. (2012). The Effects of Testing. In Hattie, J., & Anderman, E., International Guide to Student Achievement. New York: Routledge Publishers.

Lee, Y.-S., Lembke, E., Moore, D., Ginsburg, H., & Pappas, S. (2012). Item-Level and Construct Evaluation of Early Numeracy Curriculum-Based Measures. Assessment for Effective Intervention, 37(2), 107-117.

Hampton, D. D., Lembke, E. S., Lee, Y.-S., Pappas, S., Chiong, C., & Ginsburg, H. (2012). Technical Adequacy of Early Numeracy Curriculum-Based Progress Monitoring Measures for Kindergarten and First-Grade Students. Assessment for Effective intervention, 37(2), 118-126. 

Lee, Y.-S., Park, Y. S., & Taylan, D. (2011). A cognitive diagnostic modeling of attribute mastery in Massachusetts, Minnesota, and the U.S. national sample using the TIMSS 2007. International Journal of Testing, 11, 144-177.

Ginsburg, H. P., Pappas, S., Lee, Y.-S., & Chiong, C. (2011). mCLASS:Math: Insights into Children's Mathematical Minds and Performance. In Noyce, P., & Hickey, D. T., Formative Assessment in Learning Contexts, the Next Generation (pp. 46-97). Harvard Education Press.

Park, Y. S., & Lee, Y.-S. (2011). Diagnostic Cluster Analysis: An Empirical Study of Mathematics Skills via TIMSS 2007. IERI Monograph Series: Issues and Methodologies in Large-Scale Assessments, 4, 75-108.

Kim, S.-H., Sherry, A., Lee, Y.-S., & Kim, C.-D. (2011). Psychometric Properties of a Translated Korean Adult Attachment Measure. Measurement and Evaluation in Counseling and Development, 44, 135-150. 

Lee, Y.-S., & Park, Y. S. (2011). Examining the Mastery of Mathematics Skills in Italy Using a Cognitive Diagnostic Model. RicercAzione, 3(1), 59-74.

de la Torre, J., & Lee, Y.-S. (2010). A Note on the Invariance of the DINA Model Parameters. Journal of Educational Measurement, 47(1), 115-127.

Xu, X., Douglas, J., & Lee, Y.-S. (2010). Linking with Nonparametric IRT Models. In von Davier, A. A. (Ed.), Statistical Models for Test Equating, Scaling, and Linking (pp. 243-260). New York: Springer Verlag.  

Lee, Y.-S., Cohen, A., & Toro, M. (2009). Examining Type I error and power for detection of differential item and testlet functioning. Asia Pacific Education Review, 10, 365-375.

Lee, Y.-S., Wollack, J., & Douglas, J. (2009). On the use of nonparametric ICC estimation techniques for checking parametric model fit. Educational and Psychological Measurement, 69, 181-197. 

Cervellione, K., Lee, Y.-S., & Bonanno, G. A. (2009). Rasch Modeling of the Self-Deception Scale of the Balanced Inventory of Desirable Responding. Educational and Psychological Measurement, 69, 438-458.

Lee, Y.-S., Grossman, J., & Krishnan, A. (2008). Cultural Relevance of Adult Attachment: Rasch Modeling of the Revised Experiences in Close Relationships in a Korean Sample. Educational and Psychological Measurement, 68, 824-844.

Lee, Y.-S. (2007). A Comparison of Methods for Nonparametric Estimation of Item Characteristic Curves for Binary Items. Applied Psychological Measurement, 31(2), 121-134. 

Lee, Y.-S., Douglas, J., & Chewning, B. (2007). Techniques For Developing Health Quality of Life Scales For Point of Service Use. Social Indicators Research: An International and Interdisciplinary Journal for Quality-Of-Life Measurement, 83(2), 331-350.  

TECHNICAL REPORTS   

Kim, C.-D., Lee, Y.-S., Lee, J. Y., Yoo, H. S., Lee, D. H., Oh, I., & Lee, S. M. (2012). Multiphasic Developmental Potential Inventory-Seoul Form: MDPI-S. Seoul Metropolitan Office of Education, Seoul, Republic of Korea. 

Song, M.-Y., Lee, Y.-S., & Park, Y. S. (2011). Analysis and score reporting based on cognitive diagnostic models using the National Assessment of Educational Achievement. Seoul, Republic of Korea: Korea Institute for Curriculum and Evaluation (KICE) Research Report RRE 2011-8.

Lee, Y.-S., Pappas, S., Chiong, C., & Ginsburg, H. (2011). mCLASS:Math - Technical Manual. Brooklyn, NY: Wireless Generation, Inc.

Lee, Y.-S., Park, Y. S., & Lee, S. Y. (2010). Post-Smoothing by Kernel Equating to Compare College Scholastic Aptitude Test Performance between Offices of Education via Multilevel Methods. Seoul, Republic of Korea: Ministry of Education, Science and Technology.

Romero, M., & Lee, Y.-S. (2008). How Maternal, Family and Cumulative Risk Affect Absenteeism in Early Schooling: Facts for Policymakers. New York, NY: National Center for Children in Poverty, Columbia University, Mailman School of Public Health. (available  at http://www.nccp.org/publications/pdf/text_802.pdf)

Romero, M., & Lee, Y.-S. (2008). Brief #2: The Influence of Maternal and Family Risk on Chronic Absenteeism in Early Schooling. "What data tell us about the role of chronic absenteeism in early schooling?"  New York, NY: National Center for Children in Poverty, Columbia University, Mailman School of Public Health. (available at http://www.nccp.org/publications/pdf/text_792.pdf)

Lee, Y.-S., Lembke, E., Moore, D., Ginsburg, H., & Pappas, S. (2007). mCLASS:MATH - Identifying technically adequate early mathematics measures. Brooklyn, NY: Wireless Generation, Inc.  

Romero, M., & Lee, Y.-S. (2007). Brief #1: A National Portrait of Chronic Absenteeism in the Early Grades. "What data tell us about the role of chronic absenteeism in early schooling?" New York, NY: National Center for Children in Poverty, Columbia University, Mailman School of Public Health. (available at http://www.nccp.org/publications/pdf/text_771.pdf)

Dr. Young-Sun Lee joined the department in the fall of 2002. She received her Ph.D. in Educational Measurement and Statistics (Department of Educational Psychology) with a minor in Statistics at the University of Wisconsin-Madison. Dr. Lee's research interests focus primarily on psychometric approaches to solving practical problems in educational and psychological testing. Studies currently in progress focus on the development and application of mixture IRT models, cognitive diagnostic models, international comparative studies using large-scale assessment data, and test construction/scale development for young children. Dr. Lee currently teaches courses in test theory (HUDM 6051 - Psychometric Theory I (Classical Test Theory) and HUDM 6052 - Psychometric Theory II (Item Response Theory)) and statistical methods (HUDM 4120 - Basic Concepts in Statistics, HUDM 4122 - Probability and Statistical Inference, and HUDM 5122 - Applied Regression Analysis).

Emerging Research-Empirical: Development and Application of a Multilevel Multiple-Group CDM to Compare Cognitive Attribute Distributions based on Eighth Grade TIMSS Mathematics (NSF funded) 
 
Students in the United States have consistently performed below many of their international peers on mathematics assessments like TIMSS and PISA (AFT, 1999). This gap in performance is described in terms of a few very broad content domains (e.g., Algebra, Geometry), but it remains unclear exactly what math skills U.S. students are missing. The proposed research will develop and apply methods based on cognitive diagnostic modeling to understand exactly what math skills U.S. students lack, and compare the distributions of skills across countries and within countries across years. The research team will also investigate the relationship between the presence or absence of these skills and core background variables (e.g., gender). The proposed research focuses on the TIMSS eighth grade mathematics assessments from 1999, 2003, and 2007.

mCLASS:Math: Development and analysis of an integrated screening, progress monitoring, and cognitive assessment system for K-3 mathematics (IES funded)
 
We conduct a series of studies to evaluate the reliability and validity of an integrated assessment system for K-3 mathematics and to modify it appropriately. The assessment includes Curriculum Based Measurement (CBM) screening and progress monitoring measures and diagnostic cognitive interviews. It is intended especially for use with under-achieving students from a variety of ethnic, linguistic, and economic backgrounds. The two assessment methods use a proven technology platform, a hand-held computer that guides teachers' assessments, to shorten and simplify the assessment administration process and to make the resulting data readily available to teachers.
 
Over four years, we will: (1) evaluate the reliability and validity of all items and measures in both the CBM and the diagnostic interviews, and make any necessary revisions; (2) create growth models that describe students' typical trajectories and "aim-lines" on the CBM measures; (3) establish cut-points for the CBM, to aid teachers in identifying students in need of special help; (4) investigate student profiles within and across the CBM and diagnostic interviews; and (5) conduct predictive validity studies to establish which combinations of CBM measures and diagnostic questions best predict student performance in later years.
 
Computer Guided Comprehensive Mathematics Assessment for Young Children (NIH funded)
 
Our first aim is to develop an Early Mathematics Assessment System (EMAS) appropriate for young children (3- to 5-year-olds) that will serve three major functions. The long form (L-EMAS) can be used to (1) evaluate the effectiveness of a variety of curricula and (2) provide immediate and specific cognitive process information that can be used to guide instruction. (3) The short form (S-EMAS) can be used as a screening instrument for identifying children at risk for mathematical difficulties and those who might require comprehensive assessment and intervention. The EMAS will have several key features. It will measure a broad range of mathematical content, assessing number, operations, shape, space, measurement, and pattern. It also will measure a broad range of mathematical proficiency, including performance; cognitive processes underlying performance; comprehension and use of mathematical language; and children's "motivation" (attentiveness and affect). The EMAS will be research-based, drawing on modern cognitive science and developmental and educational research. It will engage and motivate young children through the use of games or purposeful activities such as a birthday party. This type of context should ensure that the EMAS, which will be translated into Spanish, can be used with a diverse population.
 
Our second aim is to develop innovative technology whereby a personal digital assistant (PDA) will guide assessors in administering the EMAS. The technology will help them to use flexible probes modeled on the clinical interview to gain greater insight into student proficiency.
 
Our third aim is to use statistical procedures to ensure that the EMAS is reliable and valid. The psychometric properties of both EMAS forms will be investigated using both classical test theory (CTT) and item response theory (IRT) approaches to ensure high reliability and validity of the scale and quality of the items. We will conduct reliability studies and will use IRT to provide item and test information. We will also conduct validation studies. Once their reliability and validity have been shown to be strong, the L-EMAS and S-EMAS will be ready for norming.
 
Our fourth aim is to help assessors to use the EMAS and to investigate their use of it. We will design and evaluate professional developmental activities to enable early childhood professionals to use the PDA comfortably to administer the EMAS and to interpret the results. We will also investigate how assessors use the EMAS and learn from it. The EMAS will contribute to the evaluation and improvement of mathematics education for young children.

Principal Investigator: "An Extension of the DINA Model Using Covariates: Examining Attribute Mastery Prevalence and Mastery Profile on 4th Grade TIMSS Science", Dean's Grant for Tenured Faculty Research, Teachers College, Columbia University, Dates: 9/1/2012 - 8/31/2013, a semester research leave with salary & $5,000 research funds.

Co-Principal Investigator: "Developing an instrument for measuring Multiphasic Developmental Potential Inventory (MDPI)" (다면적 성장잠재력 검사 개발 연구), Seoul Metropolitan Office of Education, Seoul, Republic of Korea, Chang-Dai Kim (PI), Dates: 7/1/2011-2/29/2012, 50,000,000 won (approximately $50,000) (Written in Korean).  
 
Principal Investigator: "A Study on Examining and Improving the National Assessment of Educational Achievement using Cognitive Diagnostic Models" (인지진단모형을 통한 국가수준 학업성취도 평가자료의 분석 및 결과보고 개선 연구), Korea Institute for Curriculum and Evaluation (KICE), Dates: 6/1/2011 – 11/30/2011, 48,000,000 won (approximately $44,200) (Written in Korean).
 
Co-Principal Investigator: "Emerging Research-Empirical: Development and Application of a Multilevel Multiple-Group CDM to Compare Cognitive Attribute Distributions based on Eighth Grade TIMSS Mathematics", Johnson (PI), National Science Foundation (NSF), Dates: 9/1/2010 – 8/31/2012, $1,399,144 requested.
 
Principal Investigator: "Post-Smoothing by Kernel Equating to compare College Scholastic Aptitude Test Performance between Offices of Education via Multilevel Methods"(비모수 커널 추정법을 통한 대학수학능력시험의 검사동등화 및 다층모형 분석: 시도교육청간의 학업 성취도 비교 연구), Korea Institute for Curriculum and Evaluation (KICE) - The College Scholastic Aptitude Test (CSAT) Research Project, Dates: 8/15/2010 - 12/7/2010, 20,000,000 won (approximately $18,000) (Written in Korean).
 
Co-Principal Investigator: "mCLASS:Math: Development and analysis of an integrated screening, progress monitoring, and cognitive assessment system for K-3 mathematics", Institute of Education Sciences (IES; R305B070325), Ginsburg (PI), Dates: 9/1/2007 - 6/30/2011, total costs $1,565,455.
 
Co-Principal Investigator: "Computer Guided Comprehensive Mathematics Assessment for Young Children", National Institutes of Health (NIH; 1 R01 HD051538-01), Ginsburg (PI), Dates: 10/01/2005 - 8/31/2010, total costs $3,170,839.
 
Statistician: "What Data Tell Us About the Role of Chronic Absenteeism in Early School", Annie E. Casey Foundation, Knitzer & Romero (PI), National Center for Children in Poverty, School of Public Health, Columbia University, Dates: 3/01/2007 - 2/28/2008, total costs $35,000.

PROFESSIONAL ACTIVITIES

Member, Editorial Board, Early Childhood Education & Care (ECEC), The Korean Society for Early Childhood Education & Care, 2016 – Present

Member, Board of Directors, Korean-American Educational Researchers Association (KAERA), 2018 – Present

Member, Editorial Board, Korean Journal of School Education Research (KJSER), 2017 – Present 

Associate Editor, Journal of Asia Pacific Counseling (JAPC), 2012 – Present

Advisory Editor, Asia Pacific Education Review (APER), 2016 – Present

Review Editor, Frontiers in Applied Mathematics and Statistics and Frontiers in Psychology (Quantitative Psychology and Measurement), 2016 – Present

Review Panel, Methodology, Measurement, and Statistics (MMS) Program, National Science Foundation (NSF), 2016

Junior Faculty Mentor, New Faculty Mentoring Program, Korean-American Educational Researchers Association (KAERA), 2012 – Present. The primary role is to provide professional guidance and support for junior faculty members.

Graduate Student Mentor, KAERA Mentoring Program, 2012 – Present. The primary role is to provide professional guidance and support for international graduate students.

Served as a contributor to the joint project of the Korean-American Educational Researchers Association (KAERA) and the Korean Society for Educational Evaluation (KOSEEV) that translated the new edition of 'Educational Measurement (4th ed.)' edited by R. L. Brennan. The Korean version of the book was published in 2015.

Nomination Committee, KAERA, whose tasks include nominating candidates for vice-president and for the two positions on the board of directors, 2013 – 2014.

Mentoring Committee, KAERA, whose tasks include drafting a mentoring plan and guidelines for the KAERA mentoring program, 2013 - 2018

Treasurer, Korean American Educational Researchers Association (KAERA), 2010 – 2013

Committee Member, Brenda H. Loyd Outstanding Dissertation Award Committee, National Council on Measurement in Education (NCME), 2007-2011

Review Panel, Child-related interventions research applications (ZMH1 ERB-P(01)) for the National Institute of Mental Health (NIMH), 2005

Ad-hoc Reviewer for Applied Psychological Measurement, Asia Pacific Education Review, Educational and Psychological Measurement, International Journal of Testing, Journal of Educational and Behavioral Statistics, Journal of Educational Evaluation (Korean), Journal of Educational Measurement, Psychological Methods

Conference Proposal Reviewer for American Educational Research Association (AERA), AERA Division D Graduate Student In-progress Research Gala, National Council on Measurement in Education (NCME), and Psychometric Society Meetings.

PROFESSIONAL MEMBERSHIPS

American Educational Research Association (AERA)
American Psychological Association (APA)
National Council on Measurement in Education (NCME)
Psychometric Society
Society for Research in Child Development (SRCD)
PRESENTATIONS AT MEETINGS AND CONFERENCES  
*designates graduate student

111 Yoo, N.*, Bezirhan, U.*, & Lee, Y.-S. (2019, April). Investigating Item Position and Psychological Factor Effects with Missing Data. Poster presented at the annual meeting of the National Council on Measurement in Education, Toronto, Canada.

110 Bai, Y.*, & Lee, Y.-S. (2018, July). Comparison of group-based and model-based approaches to person-fit analysis. Paper presented at the 83rd International Meeting of the Psychometric Society, New York, NY.

109 Yoo, N.*, & Lee, Y.-S. (2018, July). Investigating item-position and psychological factor effects on item parameter estimation. Paper presented at the 83rd International Meeting of the Psychometric Society, New York, NY.

108 Bezirhan, U.*, & Lee, Y.-S. (2018, July). Analysis of Graphical Diagnostic Classification Models with Covariates. Paper presented at the 83rd International Meeting of the Psychometric Society, New York, NY.

107 Yoo, N.*, Bezirhan, U.*, & Lee, Y.-S. (2018, April). Effects of Item Positions and Psychological Factors on Item Parameter Estimation. Poster presented at the annual meeting of the National Council on Measurement in Education, New York, NY.

106 Bezirhan, U.*, Bai, Y.*, & Lee, Y.-S. (2018, April). Handling Missing Data with Imputation in Cognitive Diagnostic Models. Poster presented at the annual meeting of the National Council on Measurement in Education, New York, NY. 

105 Cintron, D. W.*, & Lee, Y.-S. (2018, April). Diagnostic Models of Science Attribute Mastery: An International Multi-group Study. Paper presented at the annual meeting of the American Educational Research Association, New York, NY.

104 Park, Y. S., & Lee, Y.-S. (2018, April). Multidimensional Higher-Order Models for Skills Diagnosis: Descriptive and Explanatory Approaches. Paper presented at the annual meeting of the National Council on Measurement in Education, New York, NY.

103 Luna Bazaldúa, D. A.*, Lee, Y.-S., & Park, Y. S. (2016, April). Development of a reparametrized compensatory cognitive diagnostic model with covariates. Poster presented at the annual meeting of the American Educational Research Association, Washington, DC.  

102 Liu, X.*, Lee, Y.-S., & Zhao, Y.* (2016, April). Bayesian Inferences of Q-matrix with Presence of Anchor Items. Paper presented at the annual meeting of the National Council on Measurement in Education, Washington, DC.  

101 Park, Y. S., Lee, Y.-S., & Xing, K.* (2016, April). Incorporating latent and observed predictors in cognitive diagnostic models. Paper presented at the annual meeting of the National Council on Measurement in Education, Washington, DC.  

100 Lee, Y.-S., Park, J. Y.*, & Yoo, N.* (2015, July). Parameterizing the General Diagnostic Model to Examine Differences in Attribute Distributions between Multiple Groups. Paper presented at the 80th annual meeting of the Psychometric Society, Beijing, China.

99 Park, J. Y.*, Johnson, M., & Lee, Y.-S. (2015, July). Bayesian Model Checking Methods for Cognitive Diagnosis Models. Paper presented at the 80th annual meeting of the Psychometric Society, Beijing, China.

98 Park, Y. S., Lee, Y.-S., & Xing, K.* (2015, July). Mixture Higher-Order Diagnostic Classification Model with Covariates. Paper presented at the 80th annual meeting of the Psychometric Society, Beijing, China.

97 Park, J. Y.*, Johnson, M., & Lee, Y.-S. (2015, April). The Robust Sandwich Variance Estimators for the DINA Model. Paper presented at the annual meeting of the National Council on Measurement in Education, Chicago, IL.  

96 Luna Bazaldúa, D. A.*, & Lee, Y.-S. (2015, April). Classification accuracy of Mixture IRT and Cognitive Diagnostic Models. Paper presented at the annual meeting of the National Council on Measurement in Education, Chicago, IL. 

95 Luna Bazaldúa, D. A.*, Fellers, L.*, Keller, B., & Lee, Y.-S. (2015, March). Comparative study on item discrimination through the use of Classical Test Theory and Item Response Theory. Symposium at the II Latin-American Congress on Measurement and Educational Evaluation. Federal District, Mexico.

94 Park, J. Y.*, Johnson, M., & Lee, Y.-S. (2014, July). Robust Variance Estimation for Cognitive Diagnostic Models. Poster presented at the 9th conference of the International Test Commission at San Sebastian, Spain.

93 Park, Y. S., & Lee, Y.-S. (2014, July). Analysis of Covariates using TIMSS data based on Multiple-Groups Higher-Order Reparameterized DINA model. Poster presented at the 9th conference of the International Test Commission at San Sebastian, Spain.  

92 Lee, Y.-S., Lee, S. Y., & Park, Y. S. (2014, April). A Mixture IRT Model Analysis in Exploring Kindergarteners’ Mathematical Strategy. Paper presented at the annual meeting of the National Council on Measurement in Education, Philadelphia, PA.

91 Park, Y. S., Seong, T.-J., & Lee, Y.-S. (2014, April). Parameterizing Covariates Affecting Attribute Classification in Multiple-Groups DINA. Paper presented at the annual meeting of the National Council on Measurement in Education, Philadelphia, PA.

90 Kwon, S.-H.*, Lee, Y.-S., & Lee, S. Y. (2014, January). The Development of Kindergarten Teacher Resilience Scale using Rasch Modeling. Poster presented at the annual meeting of the Hawaii International Conference on Education, Honolulu, HI.

89 Park, Y. S., & Lee, Y.-S. (2014, January). Studying the Effect of Covariates on Mathematics Skill Mastery: Analysis of TIMSS 4th Grade Mathematics using the Multiple-Groups RDINA model. Poster presented at the annual meeting of the Hawaii International Conference on Education, Honolulu, HI.

Program Director in Measurement & Evaluation, 2017 – Present

Advisor, Korean Graduate Students Association (TCKGSA), 2002 – Present

Committee Member & Chair, Faculty Executive Committee – Personnel Subcommittee, 2017 – Present  

Committee Member – Benefit Committee, 2014 – Present

Committee Member, Dean’s Grant for Student Research Committee, 2005 – 2006, 2013 – 2017

Committee Member, Research Dissertation Fellowship Committee, 2016 – 2017

Committee Member, Counseling Psychology Search Committee, 2013 – 2014

Committee Member, Institutional Review Board (IRB), 2011 – 2014

Committee Member, Affirmative Action Committee (AAC), 2004 – 2005, 2011 – 2012

Committee Member, TESOL Search Committee, 2005 – 2006
