Madhabi Chatterji
Office Location: 282 GDodge, TC
Educational Background
1990 Ph.D. University of South Florida, Tampa, Florida
1980 M.Ed. St. Christopher's College, University of Madras, Madras, India
1975 B.Ed. University of Bombay, Bombay, India
1973 B.Sc. (Honors) Lady Brabourne College, University of Calcutta, W.B., India
Scholarly Interests
Madhabi Chatterji is Professor Emerita of Measurement, Evaluation, and Education at Columbia University’s Teachers College (TC), where she founded and still directs the Assessment and Evaluation Research Initiative (AERI, www.tc.edu/aeri), a center dedicated to promoting meaningful use of assessment-evaluation information to improve equity and the quality of practices and policies in education, psychology, and the health professions. She retired from TC on August 31, 2022, following almost 22 years of service (2001-2022); prior to that, she was an assistant professor of educational measurement and research at the University of South Florida (1996-2000) and supervisor of research and evaluation services at the Pasco County School District, Florida (1988-1995). She is an award-winning and internationally recognized methodologist and educationist. Her 100+ publications focus on these themes:
- Instrument design, validation, validity and test use issues
- Evidence-based practices (EBP), evidence standards, the “evidence debate”, and improving evidence-gathering and evidence-synthesis methods on “what works”
- Educational equity; closing students' learning and achievement gaps with proximal, diagnostic assessments
- Standards-based education reforms
- Assessment policies in U.S. and global settings
Selected Publications
Note: Published as Madhabi Banerji from 1990-2000, and as Madhabi Chatterji from January 2001 to the present.
REFEREED PUBLICATIONS BY THEME:
I. Instrument Design, Validation and Validity Issues
Chatterji, M. (2024, in press). User-Centered Assessment Design: An Integrated Methodology for Diverse Populations. New York, NY: Guilford Press. [BOOK]
Chatterji, M. (Ed.) (2013). Validity and Test Use: An International Dialogue on Educational Assessment, Accountability, and Equity. Bingley, UK: Emerald Group Publishing Limited. [BOOK]
Chatterji, M. (2003). Designing and Using Tools for Educational Assessment. Boston, MA: Allyn & Bacon/Pearson. [BOOK]
Chatterji, M., & Lin, M. (2018). Designing non-cognitive construct measures that improve mathematics achievement in grade 5-6 learners: A user-centered approach. Quality Assurance in Education, 26(1), 70-100.
Chatterji, M., Tripken, J., Johnson, S., Koh, N. J., Sabain, S., Allegrante, J.P., & Kukafka, R. (2017). Development and validation of a health information technology curriculum: Toward more meaningful use of electronic health records. Pedagogy in Health Promotion, 3(3), 154-166. Electronic release: © 2016 Society for Public Health Education.
Wyer, P. W. & Chatterji, M. (2013). Designing outcome measures for the accreditation of medical education programs as an iterative process combining classical test theory and Rasch measurement. The International Journal of Educational and Psychological Assessment, 13 (2), 35-61.
Chatterji, M., Sentovich, C., Ferron, J., & Rendina-Gobioff, G. (2002). Using an iterative validation model to conceptualize, pilot-test, and validate scores from an instrument measuring Teacher Readiness for Educational Reforms. Educational and Psychological Measurement, 62, 442-463.
Banerji, M., Smith, R. M., & Dedrick, R. F. (1997). Dimensionality of an early childhood scale using Rasch analysis and confirmatory factor analysis. Journal of Outcome Measurement, 1 (1), 56-86. [Received the Distinguished Paper Award from the Florida Education Research Association, 1993]
Banerji, M. & Ferron, J. (1998). Construct validity of a developmental assessment made up of mathematical patterns tasks. Educational and Psychological Measurement, 58 (4), 634-660.
II. Evidence-Based Practices, Evidence Standards, the “Evidence Debate”, and Improving Evidence-Gathering and Evidence-Synthesis Methods
Chatterji, M. (2016). Causal inferences on the effectiveness of complex social programs: Navigating assumptions, sources of complexity and evaluation design challenges. Evaluation and Program Planning, 56 (6), 128-140.
Chatterji, M., Green, L. W., & Kumanyika, S. (2014). L.E.A.D.: A framework for evidence gathering and use for the prevention of obesity and other complex public health problems. Health Education & Behavior, 41 (1), 85-99. First released on June 19, 2013.
Chatterji, M. (2008). Synthesizing evidence from impact evaluations in education to inform action: Comments on Slavin. Educational Researcher, 37 (1), 23-26.
Chatterji, M. (2007). Grades of Evidence: Variability in quality of findings in effectiveness research on complex field interventions. American Journal of Evaluation, 28 (3), 3-17.
Chatterji, M. (2004). Evidence on “what works”: An argument for extended-term mixed method (ETMM) evaluation designs. Educational Researcher, 33 (9), 3-13. (Reprinted in Educational Researcher, 34 (5), 14-24, 2005). [Received an Outstanding Publication Award for Advances in Research Methodology from the American Educational Research Association, 2004]
III. Standards-based Reforms and Educational Equity
Chatterji, M. (2012). Development and validation of indicators of teacher proficiency in diagnostic classroom assessment. The International Journal of Educational and Psychological Assessment, 9 (2), 4-25. Special Issue on Teacher Assessments.
Chatterji, M. (2006). Reading achievement gaps, correlates and moderators of early reading achievement: Evidence from the Early Childhood Longitudinal Study (ECLS) kindergarten to first grade sample. Journal of Educational Psychology, 98 (3), 489-507.
Chatterji, M. (2005). Achievement gaps and correlates of early mathematics achievement: Evidence from the ECLS K-first grade sample. Education Policy Analysis Archives, 13 (46).
Chatterji, M., Koh, N., Choi, L., & Iyengar, R. (2009). Closing learner gaps proximally with teacher-mediated diagnostic assessment. Research in the Schools, 16 (2), 60-77.
Chatterji, M. (2002). Models and methods for examining standards-based reforms: Have the tools of inquiry answered the pressing questions on improving schools? Review of Educational Research, 72 (3), 345-386.
IV. Evaluation Research and Evaluation Methods
Chatterji, M. (2005). Applying the Joint Committee's 1994 standards in international contexts: A case study of educational evaluations in Bangladesh. Teachers College Record, 107 (10), 2373-2400. Special Issue on New Perspectives in Program Evaluation.
Banerji, M. & Dailey, R.A. (1994). A study of the effects of an inclusion program on elementary students with specific learning disabilities. Journal of Learning Disabilities, 28 (8), 511-522.
Pearson, C.L. & Banerji, M. (1993). Effects of a ninth-grade dropout prevention program on student academic achievement, school attendance, and dropout rate. Journal of Experimental Education, 61 (3), 247-256.
V. Assessment Policy
Chatterji, M. (2019). A Consumer’s Guide to Testing under the Every Student Succeeds Act (ESSA): What Can the Common Core and Other ESSA Assessments Tell Us? Boulder, CO: National Education Policy Center, University of Colorado Boulder. See: https://nepc.colorado.edu/publication/rd-assessment-guide [BOOK]
Chatterji, M. (2014). Validity Counts: Let’s mend, not end, educational testing. Education Week, Issue 24. Published on March 12, 2014, and archived at: www.edweek.org. [Op-Ed]
Chatterji, M. & Harvey J. (2014). (Co-facilitators). Assessing the Assessments: K-12 Measurement and Accountability in the 21st Century. A blog featuring debate and dialogue between scholars and K-12 school officials/practitioners at Education Week’s blog site: http://blogs.edweek.org/edweek/assessing_the_assessments [Blog]
Chatterji, M. (2013). Bad tests or bad test use? A case of SAT® use to examine why we need stakeholder conversations on validity. Teachers College Record, 115 (9), 1-7. [Foreword]
Popham, W. J., Berliner, D.C., Kingston, N., Fuhrman, S.H., Ladd, S.M., Charbonneau, J. & Chatterji, M. (2014). Can today's standardized tests yield instructionally useful data? Challenges, promises and the state of the art. Quality Assurance in Education, 22 (4), 300-315. [Moderated Policy Discussion]
Pizmony-Levy, O., Harvey, J., Schmidt, W., Noonan, R., Engel, L., Feuer, M.J., Santorno, C., Rotberg, I., Ash, P., Braun, H., Torney-Purta, J., & Chatterji, M. (2014). On the merits of, and myths about, international assessments. Quality Assurance in Education, 22 (4), 316-335. [Moderated Policy Discussion]
Gordon, E. W., McGill, M.V., Sands, D.I., Kalinich, K., Pellegrino, J.W., & Chatterji, M. (2014). Bringing formative assessment to schools and making it count. Quality Assurance in Education, 22 (4), 336-350. [Moderated Policy Discussion]
Chatterji, M. (2010). Review of “Closing the Racial Achievement Gap: Learning from Florida’s Reforms.” Boulder, CO: National Education Policy Center. Available at http://nepc.colorado.edu/thinktank/learnin-from-florida. [Policy Brief]
Chatterji, M. (2005, April). Closing Florida’s achievement gaps: Florida Institute of Education (FIE) Policy Brief 4. Jacksonville, FL: Florida Institute of Education at the University of North Florida. [Policy Brief]
Chatterji, M. (2004, April). Good and bad news about Florida student achievement: Performance trends on multiple indicators since passage of the A+ legislation. Education Policy Research Unit, Doc. No. EPSL-0401-105-EPRU. Tempe, AZ: Education Policy Studies Laboratory. [Policy Brief]
NOTE: Publications are available on: PubMed, ERIC, PsycINFO and Psychological Abstracts.
Biographical Information
MADHABI CHATTERJI, Ph.D.
Madhabi Chatterji, Ph.D., M.Ed., B.Sc. (Hons.) is Professor Emerita of Measurement, Evaluation, and Education at Columbia University’s Teachers College (TC), where she founded and directs the Assessment and Evaluation Research Initiative (AERI, www.tc.edu/aeri), a center dedicated to promoting meaningful use of assessment-evaluation information to improve equity and the quality of practices and policies in education, psychology, and the health professions. She retired from TC on August 31, 2022, following almost 22 years of service. Prior to joining TC, Chatterji was an Assistant Professor in the Department of Educational Measurement and Research at the College of Education, University of South Florida (1996-2000), and Specialist/Supervisor of Research and Evaluation Services at the Pasco County School System in Florida (1988-1995). She emigrated to the U.S. as a doctoral student in January 1985, with her then-young daughters following shortly after. They are now settled permanently in the U.S. as naturalized citizens.
An award-winning and internationally recognized methodologist and educationist, Chatterji has taught and mentored numerous doctoral students and post-doctoral researchers over her 35+ year career. Her “signature course” at TC was Instrument Design and Validation, which drew participants pursuing advanced graduate and professional degrees in various fields, including students and faculty from other universities in and around New York City. Her additional academic interests include improving methods and technical standards for Evidence-Based Practices (EBP); standards-based education reforms; educational equity; and cognitively-based proximal models of diagnostic assessment for detecting and closing students’ learning gaps across the lifespan. Her 100+ publications on these themes include over 50 refereed articles in top-tier academic journals, two peer-reviewed books, multiple edited volumes and special issues of journals, policy briefs, blogs, and numerous technical reports. Her most widely cited refereed publications to date include her earlier assessment book, Designing and Using Tools for Educational Assessment (2003, Allyn & Bacon/Pearson), and articles in the Journal of Educational Psychology (2006), Educational Researcher (2004/05 and 2008), Review of Educational Research (2002), Journal of Learning Disabilities (1994), Journal of Outcome Measurement (1997), Educational and Psychological Measurement (1998; 1999; 2002), and the American Journal of Evaluation (2007).
A public intellectual, Professor Chatterji has spoken out frequently on the limitations of large-scale standardized tests and the adverse social consequences of misused high-stakes educational assessments. Her long-standing scholarly interests lie in instrument design, validation, validity, and test use issues, the central thrust of her forthcoming 12-chapter textbook, User-Centered Assessment Design: An Integrated Methodology for Diverse Populations and Settings (Guilford Press, NY, in press). Chatterji’s policy briefs and a book-length guide on educational testing, “A Consumer’s Guide to Testing Under the Every Student Succeeds Act (ESSA): What Can the Common Core and Other Assessments Tell Us?”, were published by the National Education Policy Center (NEPC), where she is a Fellow, and as op-eds and blogs in Education Week. Her membership as a methodological scientist on an expert consensus committee of the Institute of Medicine (now part of the National Academies of Sciences, Engineering, and Medicine) led to new evidence standards for decision-making in obesity prevention and a systems-based, multi-method framework for evidence synthesis and evidence generation to address major public health problems (published in Health Education & Behavior, 2014). She has served on numerous international and national advisory panels and journal editorial boards in measurement-evaluation, including flagship journals of the American Educational Research Association (AERA) and the National Council on Measurement in Education (NCME).
Professor Chatterji’s notable list of recognitions includes a Fulbright Research Scholar Award (2007-08) for studies examining gender equity issues in primary schools of selected Bengali-speaking regions in India and Bangladesh; an Outstanding Publication Award (2004) from the American Educational Research Association (AERA) for her lead article in the Educational Researcher, titled Evidence on “What Works”: An Argument for Extended-Term Mixed Methods Evaluation Designs; a Distinguished Paper Award from the Florida Educational Research Association (1993) for demonstrating the combined utility of Rasch and confirmatory factor analysis models to examine the dimensionality and construct validity of test-based data (published in the Journal of Outcome Measurement, 1997); and Reviewer Recognitions from the Educational Researcher and the AERA publications committee (2006), the Journal of Graduate Medical Education (2012, 2013), and Studies in Educational Evaluation (2019).
At her center, AERI at TC, Professor Chatterji served as Principal Investigator (PI) or Co-PI on numerous projects supported by competitive research grants from the National Science Foundation, the Stemmler Fund of the National Board of Medical Examiners, various non-profit/state/federal government agencies, including the Educational Testing Service, and most recently, the William T. Grant Foundation and Spencer Foundation.
Chatterji is also a frequently invited speaker at international conferences and forums sponsored by governments, non-governmental organizations, and major national universities in the U.S. and abroad. Most recently, she served as Co-Editor of Quality Assurance in Education, an international peer-reviewed journal in educational evaluation. She hopes to continue as an active member of the Faculty Steering Committee of the Columbia Global Centers, a select cadre of university-wide scholars whose work has international reach and impact.
Note: Madhabi Chatterji’s academic degrees and scholarly publications prior to December, 2000 are listed under the name, Madhabi Banerji; from 2001 onwards, they are under her current name, Madhabi Chatterji.
Grants
Note: Each entry lists the funded project and sponsor/funder, followed by the project goals, years, grant amount, and role.

1. AERI Partnership with the Provost’s Office and the Institute for Urban and Minority Education, Teachers College, for a centennial conference supported by the Spencer, Hewlett, and William T. Grant foundations. Title: Learning and Thriving Across the Lifespan: A Centennial Celebration of the Intellectual Life and Legacy of Dr. Edmund W. Gordon—The Minister of Education.
Co-hosted a virtual conference, with plans to publish proceedings, for the E.W. Gordon centennial celebration. AERI Project, 2021-2022. The Spencer Foundation: $75,000, Principal Investigator; The William T. Grant Foundation: $25,000, Principal Investigator; The Hewlett Foundation: $20,000, Co-Principal Investigator.

2. Contract with the Department of Anesthesiology at the Columbia University Medical Center.
Supported the career development goals of Anesthesiology faculty by providing technical support/consultation. Annually renewable contract of $10,000-$15,000, AERI Project, 2018-2023.

3. Foundation for Anesthesia Education Research (FAER). Sub-award. Title: A mixed-methods, randomized controlled trial comparing two methods of debriefing for a serious game designed to teach novice anesthesia residents to perform anesthesia for emergency caesarian delivery.
Served as Primary Faculty Mentor to guide research proposed by Allison Lee, M.D., Department of Anesthesiology at the Columbia University Medical Center. $4,918 (year 1); $7,500 (year 2); $7,500 (year 3); $7,500 (year 4). Total = $27,418, 2017-23.

4. Teachers College Global Investment Fund. Seed grant. Title: Addressing inequities through comprehensive, ecologically-based models of primary education: A capacity-building effort to support teacher education institutions in India.
Delivered lectures/workshops at the Columbia Global Center (CGC)-Mumbai to serve higher education institutions in India. $8,000, AERI Project, 2014-15. Principal Investigator.

5. Subcontract with the International Medical Corps (IMC). Title: Evaluating comprehensive mental health and psychosocial support services for vulnerable refugees.
Selected and validated outcome measures, and designed a randomized field trial to evaluate the effectiveness of IMC’s new health intervention model for displaced Syrian refugees at camps and urban centers in Amman, Jordan. $52,683, AERI Project, 2013-15. Co-Principal Investigator (with the Department of Clinical and Counseling Psychology, TC).

6. Subcontract with Barnard College on the Howard Hughes Medical Institute project. Title: Hughes Science Pipeline Project for middle schools in New York City.
Supported the development and evaluation of the Hughes Science Pipeline Project. $31,998, AERI Project, 2013-17. Principal Investigator.

7. National Science Foundation (NSF) REESE Program Award. Title: Improving validity at the nexus of assessment design and use: A proposal for an international conference and capacity-building institute in assessment and evaluation.
Supported the design/hosting of AERI’s inaugural conference and publication of a volume with conference proceedings (with the Educational Testing Service). $124,747, AERI Project, 2011-12. Principal Investigator.

8. Provost’s Investment Fund Award, Teachers College, Columbia University. Title: Building capacity at home and abroad: A proposal for rotating institutes and conferences to disseminate cutting-edge knowledge in the assessment and evaluation sciences.
Designed and delivered AERI’s inaugural training institute and published policy briefs. $20,000, AERI Project, 2011-12. Principal Investigator.

9. Educational Testing Service (ETS). Title: Educational assessment, accountability and equity—Conversations on validity around the world.
Co-designed/co-hosted AERI’s inaugural conference with ETS at TC and published proceedings. $52,200, AERI Project, 2011-12. Principal Investigator.

10. Subcontract to TC and AERI from the Office of the National Coordinator, United States Department of Health and Human Services, Washington, D.C. Curriculum Development Center award to the Department of Biomedical Informatics, Columbia University (lead investigator).
Co-led the development and validation of curriculum goal frameworks and educational assessments in health information technology, and an evaluation protocol for future workforce training programs. $204,000, AERI Project, 2010-2012. Co-Principal Investigator (with the Department of Health and Behavior Studies, TC).

11. The Nand and Jeet Khemka Foundation, India. Title: Design and Evaluation of a Curriculum for the Foundation’s Life Skills and Global Leadership Programme.
Developed student outcome frameworks and curriculum-based assessments, performed a formative evaluation of pilot programs, and provided training/capacity-building in assessment and evaluation to staff. $754,000 (approx. half for the assessment and evaluation components), AERI Project, 2008-11. Co-Principal Investigator (with the Program of Social Studies and the President’s office, TC).

12. Fulbright Research Scholar Award, Competition #7410. Center for International Exchange of Scholars, Washington, D.C. Title: Education for All (EFA) – case studies on gender equity in selected primary schools in West Bengal, India and Bangladesh.
Conducted case study research on gender equity issues in primary schools under the state/national government’s EFA policy. $13,837, Fulbright Commission, 2007-08. Principal Investigator.

13. Stemmler Fund of the National Board of Medical Examiners. Title: Designing cognitive measures of practice-based learning and improvement as an iterative process combining Rasch and classical measurement methods.
Created and validated competency assessments for resident physicians in compliance with the Accreditation Council for Graduate Medical Education’s standards. $145,000, AERI Project, 2006-09. Co-Principal Investigator (with the Center for Educational Research and Evaluation, Columbia University College of Physicians and Surgeons).

14. Community Foundation of Elmira/Corning/Finger Lakes areas. Title: The Chemung County School Readiness Studies.
Conducted an evaluation of the county-wide school readiness project and supported local instrument design/research efforts to examine inequities in preschool programs. $94,000, AERI Project, 2006-09. Principal Investigator.

15. National Science Foundation EREC Program Award 03-542. Title: Improving mathematics achievement in elementary/middle school students with systemic use of proximal assessment data.
Conducted research, development, and field-testing of the Proximal Assessment for Learner Diagnosis (PALD) model for closing learning gaps in Black/minority students, with classroom teachers in four schools in East Ramapo, NY. $501,925, 2005-2009. Principal Investigator.

16. Subcontract with the Family Services of Westchester, NY/U.S. Department of Education-funded program. Title: School-based mentoring program evaluation.
Evaluated the long-term effects of the mentoring program on minority adolescents in the Peekskill School District, NY. $10,000 per year, 2004-07. Principal Investigator.

17. Contract with the Carnegie Learning Corporation. Title: Cognitive Tutor program evaluation.
Evaluated the effects of the Cognitive Tutor math program on student performance at 13 Brooklyn high schools. $20,155, 2003-2004. Principal Investigator.

18. National Center for Education Statistics (NCES). AERA Statistical Analysis and Policy Institute, April 2002.
Received training with the ECLS database for conducting research on early childhood achievement gaps. $1,500, Spring 2002.

19. Kumon North America, Inc. Title: Kumon program evaluations at Public School 180 in the Chancellor's District, Harlem, New York.
Evaluated the effects of the Kumon supplementary math and reading programs. $28,750, 2001-2003. Principal Investigator.

20. Pinellas County Schools, Florida, Goals 2000 district-level training grant. Title: Data-based decision-making in the classroom.
Developed a training manual for teachers/leaders in basic statistical analysis and use of assessment data for educational decision-making. $29,000, 1999-2001. Principal Investigator.

21. Bureau of Teacher Education, Florida Department of Education. Title: Teacher Readiness for Statewide Assessment Reforms and its Influences on School Practices and Outcomes.
Conducted a large-scale survey to evaluate needs related to state-initiated reforms in nine Florida school districts. $25,000, 1999-2001. Principal Investigator.

22. University of South Florida, Division of Sponsored Research, Creative Scholarship Grants Competition for Faculty. Title: Teacher and School Leader Readiness Levels for Statewide Assessment Reforms and its Influences on School Practices and Outcomes.
Conducted a large-scale survey to evaluate needs related to state-initiated reforms in nine Florida school districts. $7,500, 1999-2001. Principal Investigator.

23. University of South Florida, Instructional Technology Grants Competition for Faculty, Center for Teaching Enhancement. Title: Designing and Validating Educational Assessments: A Computer-based Module.
Designed technology-based modules to teach courses in assessment/test design. $7,500, 1997-1998. Principal Investigator.

24. Bureau of Curriculum, Instruction, and Assessment, Florida Department of Education. Invitational grant awarded to the Pasco County School System. Title: Developing Teacher-friendly Guides to Teach with Florida’s Goal 3 Standards.
Developed teacher guides at Levels 1-4 and delivered training to elementary and secondary school teachers. $67,000, 1995-1996. Project Leader and Primary Author.

25. Florida Educational Research Council. Title: Evidence of Consequential Validity of Alternative Assessments Aligned to an Elementary Mathematics Curriculum: A Pilot Study.
Conducted validation studies on the Pasco 2001 mathematics assessments for students. $2,060, 1994-95. Principal Investigator.
Honors and Awards
Fulbright Research Scholar, 2007-08
Center for International Exchange of Scholars, Washington, D.C.
Study title: A study of gender equity in primary education in Bengali-speaking regions of India and Bangladesh: Evaluating access, opportunities, and factors affecting school outcomes and completion rates.
Outstanding Publication Award
American Educational Research Association, 2004
Advances in Research Methodology-Division H
Paper title: Evidence on what works: An argument for Extended-term Mixed Methods (ETMM) Evaluation Designs
Note: This paper was published as a lead article in the Educational Researcher in 2004; reprinted in 2005.
Fellow
National Education Policy Center (NEPC), 2006-present
University of Colorado at Boulder
Previously Fellow, Education Policy Research Unit (EPRU), Arizona State University.
Distinguished Paper Award
Florida Educational Research Association, 1993
Paper title: Examining dimensionality of data generated from an early childhood scale using Rasch analysis and confirmatory factor analysis (with R.F. Dedrick and R.M. Smith)
Note: This paper was published as a lead article in the Journal of Outcome Measurement in 1997.
Outstanding Reviewer
American Educational Research Association, 2006.
Publications Committee, Educational Researcher
Reviewer Recognitions
Journal of Graduate Medical Education- 2012, 2013
Studies in Educational Evaluation- 2019
Institutional Awards
TC Global Investment Fund Award, 2014
Teachers College, Columbia University
Provost’s Investment Fund Award, 2011
Teachers College, Columbia University
Creative Scholarship Award, 1999
University of South Florida
Instructional Technology Award, 1997
University of South Florida
Early Academic Recognitions
Elected Member, Phi Kappa Phi (Academic Honor Society), University of South Florida.
Dean’s List of Scholars, 1986-1989, University of South Florida.
Elected Member, Delta Kappa Gamma (International Academic Honor Society for Educators), 1987, University of South Florida.
Principal Publications
Note: Published as Madhabi Banerji from 1990-2000, and as Madhabi Chatterji from January, 2001-present.
REFEREED PUBLICATIONS BY THEME:
I. Instrument Design, Validation and Validity Issues
Chatterji, M. (2024, in press). User-Centered Assessment Design: An Integrated Methodology for Diverse Populations. New York, NY: Guilford Press. [BOOK]
Chatterji, M. (Ed.) (2013). Validity and Test Use: An International Dialogue on Educational Assessment, Accountability, and Equity. Bingley, UK: Emerald Group Publishing Limited. [BOOK]
Chatterji, M. (2003). Designing and Using Tools for Educational Assessment. Boston, MA: Allyn & Bacon/Pearson. [BOOK]
Chatterji, M., & Lin, M. (2018). Designing non-cognitive construct measures that improve mathematics achievement in grade 5-6 learners. A user-centered approach. Quality Assurance in Education, 26(1), 70-100.
Chatterji, M., Tripken, J., Johnson, S., Koh, N. J., Sabain, S., Allegrante, J.P., & Kukafka, R. (2017). Development and validation of a health information technology curriculum: Toward more meaningful use of electronic health records. Pedagogy in Health Promotion, 3(3) 154–166. Electronic release: © 2016 Society for Public Health Education
Wyer, P. W. & Chatterji, M. (2013). Designing outcome measures for the accreditation of medical education programs as an iterative process combining classical test theory and Rasch measurement. The International Journal of Educational and Psychological Assessment, 13 (2), 35-61.
Chatterji, M., Sentovich, C, Ferron, J., & Rendina-Gobioff, G. (2002). Using an iterative validation model to conceptualize, pilot-test, and validate scores from an instrument measuring Teacher Readiness for Educational Reforms. Educational and Psychological Measurement, 62, 442-463.
Banerji, M., Smith, R. M., & Dedrick, R. F. (1997). Dimensionlity of an early childhood scale using Rasch analysis and confirmatory factor analysis. Journal of Outcome Measurement, 1 (1), 56-86. [Received the Distinguished Paper Award from the Florida Education Research Association, 1993]
Banerji, M. & Ferron, J. (1998). Construct validity of a developmental assessment made up of mathematical patterns tasks. Educational and Psychological Measurement, 58 (4), 634-660.
II. Evidence based practices, Evidence Standards, the “Evidence Debate”, Improving Evidence-gathering and Evidence-synthesis Methods
Chatterji, M. (2016). Causal inferences on the effectiveness of complex social programs: Navigating assumptions, sources of complexity and evaluation design challenges, Evaluation and Program Planning, 56 (6) 128–140.
Chatterji, M., Green, L. W., & Kumanyika, S. (2014). L.E.A.D.: A framework for evidence gathering and use for the prevention of obesity and other complex public health problems. Health Education & Behavior, 41 (1) 85-99. First released on June 19, 2013.
Chatterji, M. (2008). Synthesizing evidence from impact evaluations in education to inform action: Comments on Slavin. Educational Researcher, 37 (1) 23-26.
Chatterji, M. (2007). Grades of Evidence: Variability in quality of findings in effectiveness research on complex field interventions. American Journal of Evaluation, 28 (3), 3-17.
Chatterji, M. (2004). Evidence on “what works”: An argument for extended-term mixed method (ETMM) evaluation designs. Educational Researcher, 33 (9), 3-13. (Reprinted in Educational Researcher, 34 (5), 14-24, 2005). [Received an Outstanding Publication Award for Advances in Research Methodology from the American Educational Research Association, 2004]
III. Standards-based Reforms and Educational Equity
Chatterji, M. (2012). Development and validation of indicators of teacher proficiency in diagnostic classroom assessment. The International Journal of Educational and Psychological Assessment, 9 (2), 4-25. Special Issue on Teacher Assessments.
Chatterji, M., Koh, N., Choi, L., & Iyengar, R. (2009). Closing learner gaps proximally with teacher-mediated diagnostic assessment. Research in the Schools, 16 (2), 60-77.
Chatterji, M. (2006). Reading achievement gaps, correlates and moderators of early reading achievement: Evidence from the Early Childhood Longitudinal Study (ECLS) kindergarten to first grade sample. Journal of Educational Psychology, 98 (3), 489-507.
Chatterji, M. (2005). Achievement gaps and correlates of early mathematics achievement: Evidence from the ECLS K-first grade sample. Educational Policy Analysis Archives, 13 (46).
Chatterji, M. (2002). Models and methods for examining standards-based reforms: Have the tools of inquiry answered the pressing questions on improving schools? Review of Educational Research, 72 (3), 345-386.
IV. Evaluation Research and Evaluation Methods
Chatterji, M. (2005). Applying the Joint Committee's 1994 standards in international contexts: A case study of educational evaluations in Bangladesh. Teachers College Record, 107 (10), 2373-2400. Special Issue on New Perspectives in Program Evaluation.
Banerji, M. & Dailey, R.A. (1994). A study of the effects of an inclusion program on elementary students with specific learning disabilities. Journal of Learning Disabilities, 28 (8), 511-522.
Pearson, C.L. & Banerji, M. (1993). Effects of a ninth-grade dropout prevention program on student academic achievement, school attendance, and dropout rate. Journal of Experimental Education, 61 (3), 247-256.
V. Assessment Policy
Chatterji, M. (2019). A Consumer’s Guide to Testing under the Every Student Succeeds Act (ESSA): What Can the Common Core and Other ESSA Assessments Tell Us? Boulder, CO: National Education Policy Center. See: https://nepc.colorado.edu/publication/rd-assessment-guide [BOOK]
Chatterji, M. (2014). Validity Counts: Let’s mend, not end, educational testing. Education Week, Issue 24. Published on March 12, 2014, and archived at: www.edweek.org. [Op-Ed]
Chatterji, M. & Harvey J. (2014). (Co-facilitators). Assessing the Assessments: K-12 Measurement and Accountability in the 21st Century. A blog featuring debate and dialogue between scholars and K-12 school officials/practitioners at Education Week’s blog site: http://blogs.edweek.org/edweek/assessing_the_assessments [Blog]
Chatterji, M. (2013). Bad tests or bad test use? A case of SAT® use to examine why we need stakeholder conversations on validity. Teachers College Record, 115 (9), 1-7. [Foreword]
Popham, W. J., Berliner, D.C., Kingston, N., Fuhrman, S.H., Ladd, S.M., Charbonneau, J. & Chatterji, M. (2014). Can today's standardized tests yield instructionally useful data? Challenges, promises and the state of the art. Quality Assurance in Education, 22 (4), 300-315. [Moderated Policy Discussion]
Pizmony-Levy, O., Harvey, J., Schmidt, W., Noonan, R., Engel, L., Feuer, M.J., Santorno, C., Rotberg, I., Ash, P., Braun, H., Torney-Purta, J., & Chatterji, M. (2014). On the merits of, and myths about, international assessments. Quality Assurance in Education, 22 (4), 316-335. [Moderated Policy Discussion]
Gordon, E. W., McGill, M.V., Sands, D.I., Kalinich, K., Pellegrino, J.W., & Chatterji, M. (2014). Bringing formative assessment to schools and making it count. Quality Assurance in Education, 22 (4), 336-350. [Moderated Policy Discussion]
Chatterji, M. (2010). Review of “Closing the Racial Achievement gap: Learning from Florida’s Reforms.” Boulder, CO: National Education Policy Center. Available at http://nepc.colorado.edu/thinktank/learnin-from-florida. [Policy Brief]
Chatterji, M. (2005, April). Closing Florida’s achievement gaps: Florida Institute of Education (FIE) Policy Brief 4. Jacksonville, FL: Florida Institute of Education at the University of North Florida. [Policy Brief]
Chatterji, M. (2004, April). Good and bad news about Florida student achievement: Performance trends on multiple indicators since passage of the A+ legislation. Education Policy Research Unit, Doc No. EPSL-0401-105-EPRU. Tempe, AZ: Educational Policy Studies Laboratory. [Policy Brief]
NOTE: Publications are available on: PubMed, ERIC, PsycINFO and Psychological Abstracts.
Professional Experiences
9/2022-present Professor Emerita of Measurement, Evaluation, and Education
Teachers College (TC), Columbia University
Founding Director,
Assessment and Evaluation Research Initiative (AERI)
Website: www.tc.edu/aeri
6/2015-8/2022 Professor of Measurement, Evaluation, and Education (tenured)
Teachers College (TC), Columbia University,
Dept. of Organization and Leadership, Program of Social and Organizational Psychology
Major responsibilities:
- Developed and taught graduate- and doctoral-level courses in areas of specialization; advised doctoral students on dissertations and theses
- Conducted research and engaged in innovative scholarship in areas of specialization
- Provided leadership in scholarly fields
- Provided service to the college/university, profession, and larger national and global community in areas of specialization
5/2006-present Director, Assessment and Evaluation Research Initiative (AERI)
Teachers College, Columbia University. Website: www.tc.edu/aeri
Major responsibilities:
- Established and directed a self-sustaining center guided by its mission: to promote meaningful use of assessment and evaluation information to improve equity and the quality of practices and policies in education, psychology, and the health professions, with education as the primary field of action
- Sought external funding and contracts to support AERI’s activities and annual agenda, including sponsored or contracted research and evaluation projects, conferences, training institutes, academic publications, and blogs/policy briefs
- Built a long-term vision and strategic plan for continuing AERI’s work
- Developed partnerships with leaders, scholars, policy-makers, practitioners, and organizations to pursue AERI’s agenda
8/2015-6/2018 Co-editor, Quality Assurance in Education (QAE, Term: 2015-2018)
Emerald Group Publishing, UK.
For QAE’s Aims and Scope, see:
http://www.emeraldgrouppublishing.com/products/journals/journals.htm?id=qae
- Provided leadership in expanding the aims, scope, visibility, impact and standing of the journal
- Invited new thinking and publications by widening contributors, the reviewer database, and readership
- Added an assessment policy section, and created criteria for reviewing and accepting policy-relevant submissions
- Invited prominent authors as guest-editors for producing special issues on innovative topics
- Upheld standards of rigor and originality of the journal
Guest Editor: QAE, Vol. 22, No. 1 (2014)
Guest Editor: QAE, Vol. 22, No. 4 (2014)
1/2006-5/2015 Associate Professor of Measurement, Evaluation, and Education (tenured)
Dept. of Organization and Leadership,
Teachers College (TC), Columbia University
1/2001-12/2005 Associate Professor of Measurement, Evaluation, and Education
(Tenured in December, 2005; Reappointed in 2003)
Dept. of Human Development,
Teachers College, Columbia University.
1/1996-12/2000 Assistant Professor
Department of Educational Measurement and Research,
College of Education, University of South Florida.
1/1988-12/1995 Supervisor, Research and Evaluation Services
District School Board of Pasco County
Land O' Lakes, FL 34639.
Accomplishments and Responsibilities:
- Led district or state-sponsored assessment development and standards-based reform projects, such as Pasco 2001 and Goals 2000
- Conducted program evaluations, needs assessments, and research studies to address school, district, and state needs
- Provided training in assessment, evaluation, and research areas to department, district, and school-based staff
- Supervised doctoral student interns (from the University of South Florida) and department staff on various projects.
- Served on state and district-wide steering committees on testing, assessment, research, or evaluation issues
- Participated in departmental and district-wide planning activities
- Served on School Advisory Councils and assisted schools in using data for school improvement, planning and decision-making
- Wrote grant proposals, as needed
- Participated in district-wide and departmental budgeting activities.
1/1988-12/1995 Adjunct Professor, University of South Florida (periodic)
Department of Educational Measurement and Research, College of Education.
Taught graduate-level courses:
EDF 6432 - Foundations of Educational Measurement
EDF 6481 - Foundations of Educational Research
6/1991-7/1991 Adjunct Faculty, Center for Continuing Education, St. Leo College, Florida.
Taught a teacher recertification course in educational measurement.
8/1987-12/1987 Visiting Faculty Member (full-time, Fall semester), University of South Florida
Department of Educational Measurement and Research, College of Education.
Taught undergraduate and graduate-level courses in measurement.
EDF 4430 - Measurement for Teachers, 2 sections
EDF 6432 - Foundations of Educational Measurement, 2 sections
6/1975-5/1977 Secondary School Teacher in Chemistry
Jamnabai Narsee School, Bombay 400056, India.
7/1974-5/1975 Elementary School Teacher in English
Maneckji Cooper Education Trust School, Bombay 400056, India.
ADVISORY PANELS AND CONSULTING CONTRACTS (OUTSIDE AGENCIES)
2019 Technical Advisor/ Measurement-Evaluation Consultant
Accenture Global HR, New York, US
2016 Technical Advisor/ Measurement-Evaluation Consultant
Centro Nacional De Evaluación Para La Educación Superior (CENEVAL), Mexico City, Mexico, October 25-27, 2016.
2015-17 Member-Editorial Advisory Board for Publications
National Education Policy Center (NEPC) at the University of Colorado, Boulder.
2015-16 Member- Technical Advisory Board,
New York City Performance Series: computer-adaptive test development for bilingual populations, for Scantron Inc. and the NYC Board of Education
2006-12 Measurement and Evaluation Consultant,
Scholastic Publishing, Inc., New York.
Conducted critical reviews of national research reports and evaluations of Scholastic’s reading and mathematics programs (2008-10).
Conducted psychometric reviews of the Scholastic Reading Inventory and Scholastic Phonics Inventory (2014)
2010-11 Member, Advisory Panel, Great Public School Indicators—Educational Assessment Committee.
National Education Association, Washington, D.C.
2008-10 Member, Expert Consensus Committee on Evidence Frameworks for Obesity Prevention Decision-making,
Institute of Medicine, now the National Academies of Sciences, Engineering, and Medicine, Washington, D.C. (Food and Nutrition Board)
Report: Bridging the evidence gap in obesity prevention: A framework for decision-making.
This report, published on April 23, 2010, was based on a two-year expert consensus study and co-authored by committee members.
2008-09 Member, National Advisory Panel (Research Methods Task Force)
Robert Wood Johnson Foundation’s study on synthesizing evidence for policies on nutrition awareness and obesity prevention.
11/2007 Member, Planning Committee on Evidence Frameworks for Obesity Prevention Decision-making
Institute of Medicine, The National Academy of Sciences, Washington, D.C. (Food and Nutrition Board).
2006-10 Senior Research Advisor,
Center for Social and Emotional Education, New York.
Provided consultation on instrumentation and validation programs.
2005-08 Measurement/Evaluation Consultant, Bangladesh Rural Advancement Committee (BRAC).
Provided lectures and workshops at BRAC University- Institute for Educational Development, BRAC Education Programs and BRAC’s Research and Evaluation Division.
2005-08 Measurement and Evaluation Consultant,
Columbia University, School of Medicine-Center for Educational Research and Evaluation (CERE)
Provided consultation on instrument design, validation, and assessment system development for accreditation of resident preparation and undergraduate medical education programs. Provided support via graduate students.
2/2004-6/2005 Measurement Consultant, Survey Research Group at The Channing-Bete Co., MA.
4/2004 Evaluation Consultant, Mid-continental Regional Educational Laboratory (MCREL), Oregon
Provided advice on conducting a research review on standards-based educational reforms in the United States.
9/2002-2006 Measurement Consultant, Ballard & Tighe Test Publishers, California
Served on the technical advisory board (design of large-scale language tests).
2002-03 Member of International Advisory Panel
Support the launching of BRAC University- Institute for Educational Development (IED), Dhaka, Bangladesh.
8/2001-4/02 Evaluation Consultant, Columbia University, Asst. Provost's office and Metro Teaching and Learning Corp.
Developed proposals for formative and summative studies of the Metro reading curriculum at selected schools in District 9, Bronx, New York.
4/2002 Measurement Consultant, Columbia University, School of Business
Provided a faculty development seminar in testing, assessment, and grading.
9/2001 Measurement Consultant, Ministry of Education, Jamaica
Contact: Charmers Thompson
Consultant for test development for GED (HEART) program.
1998-2001 Measurement Consultant, Department of Curriculum and Instruction Services,
Goals 2000: Student Achievement Grant, Pinellas County School System, Florida
Conducted assessment design workshops for teachers and content experts (1998-2000). Developed a training manual and delivered training to help teachers use data for classroom decision-making (2001).
7/2000 Measurement Consultant, Department of Staff Development, Hernando County School System, Florida
Provided training in assessment design as part of curriculum reform projects. Developed a training manual and delivered training to help teachers use data for classroom decision-making.
6/1998-8/98 Measurement Consultant, IBM Corporation (Network Services and Skills Group), Tampa, Florida
Provided training in assessment design to trainers in the Network Services and Skills Group. Reviewed tests for content validity.
4/1998 Measurement Consultant, Osceola County School System, Florida
Provided training in assessment design as part of local curriculum reforms.
1/1996-4/97 Measurement Consultant, Department of Curriculum and Instruction Services,
Pasco County School System, Florida
Authored five studies on a district-initiated assessment reform project (Pasco 2001 Assessment Project). Each report presents validity and reliability data on mathematics assessments for K-5 levels.
5/1996-11/96 Evaluation Consultant, Manatee County School System, Florida
Conducted a district-wide needs assessment study for Safe and Drug-Free Schools programs.
5/1995-1/97 Project Leader and Primary Author, Bureau of Curriculum, Instruction and Assessment,
Florida Department of Education
(See also grants and technical manuals/reports.) Developed teacher-friendly guides for teaching and assessing with Florida's Goal 3 standards.
7/1992-7/96 Evaluation Consultant, Center for Excellence, Miami Museum of Science
Conducted three state and national program evaluation studies of the “Intech” program.
11/1992-95 Research Consultant, Department of Internal Medicine, University of South Florida College of Medicine
Assisted with vaccine survey research, questionnaire development, and data analysis.
7/1990-12/91 Research Associate, School Management Institute led by Drs. K.J. Snyder and R.H. Anderson,
University of South Florida
Wrote proposals for research grants; conducted applied research on the School Work Culture Profile and Managing Productive Schools Program.
2/1991-8/92 Evaluation Consultant, West Central Regional Management
Development Network, University of South Florida
Conducted a formative and summative evaluation of the Hillsborough County Teacher Mentoring program.
8/1988-8/1992 Evaluation Consultant, Department of Health Education, Sarasota
County Schools, Florida (1989, 1990, 1992)
Assisted with and conducted survey research on substance abuse prevention programs in middle and high schools.
5/1986-12/86 Measurement Consultant, Hillsborough County Sheriff's Office, Tampa, Florida.
Developed competency tests for law enforcement officer promotion.
Professional Organization Membership
American Evaluation Association (AEA)
American Educational Research Association (AERA)
Eastern Evaluation Research Society (EERS), an affiliate of the American Evaluation Association
National Council on Measurement in Education (NCME)
Florida Educational Research Association (FERA)
Florida Educational Research Council (FERC)
Florida Public Health Association (FPHA)
Professional Presentations
2018
Invited Lecture
Columbia Global Center (CGC), Mumbai, India on February 12, 2018
Audience: International audience of education scholars/faculty, policy-makers, leaders from K-12 and higher education institutions in India and Bangladesh.
Title: Evidence-based Approaches for Enhancing Educational Quality
Sponsor: CGC
2018
Panel Presentation
“Educational Leaders Data Analytics Summit” held on June 8, 2018 at Teachers College, Columbia University
Audience: Education leaders, data analytics scholars, K-12 educators in New York and the US.
Title: Test Scores, Evidence, and Educational Quality: Searching for the Best Indicators
2016
Keynote Address
The XIIth National Forum on Educational Evaluation held on October 28-29, 2016 at the Universidad Autónoma de San Luis Potosí (UASLP), Mexico, hosted by the Centro Nacional De Evaluación Para La Educación Superior (CENEVAL).
Audience: University presidents, ministers of education, university program directors, faculty, researchers and the media.
Title: Assuring Quality, Assessing Student Learning and Evaluating the Effects of “Complex” Programs in Higher Education: Issues, Contemporary Methodologies and New Directions
Sponsor: CENEVAL, Mexico City.
2015
Invited Lecture
Universidad Nacional de Educación a Distancia (UNED), Madrid, Spain, on January 12, 2015.
Audience: Faculty of Education, Departamento de Métodos de Investigación y Diagnóstico en Educación, and the Doctoral Program of Education.
Lecture 1 Title: Mixed Methods Evaluations
Lecture 2 Title: Improving Learning with Classroom Assessment: The Proximal Assessment for Learner Diagnosis (PALD) Model
Sponsor: UNED (Departamento de Métodos de Investigación y Diagnóstico en Educación).
2014
Keynote Speech
International Conference on Educational Assessment organized by the Ministry of Education and Culture and Yogyakarta State University, Indonesia on November 8, 2014.
Title: Issues in Implementing Classroom Assessment with the Proximal Assessment for Learner Diagnosis (PALD) Model
Audience: Academics, educators, teachers and administrators in higher education
Sponsor: Yogyakarta State University, Indonesia
2014
Invited Seminar
Calcutta University’s Department of Education, Alipore Campus, Kolkata, India on February 5, 2014.
Title: Measures and Correlates of Mathematics Self-efficacy, Mathematics Self-Concept and Mathematics-Anxiety in Elementary Students: An Instrument Design and Validation Study.
Audience: Faculty and graduate students of Calcutta University’s Department of Education
Host: Calcutta University
2012
Invited Workshop
Workshop series organized by Pearson and the World Bank for Indonesian education delegates, held in New York City, U.S., on December 21, 2012.
Audience: Indonesian education delegates
Title: Validity Considerations with Large Scale Assessments.
Host/Sponsor: Pearson Education, New York, NY
2012
Plenary Lecture
International Conference on Educational Measurement-Evaluation Needs in Manila, Philippines. August 10, 2012.
Audience: Educators, university-based faculty, measurement researchers, PEMEA membership
Title: Teacher Proficiency Indicators in Cognitively-based Diagnostic Classroom Assessment.
Sponsor and Host: Philippine Educational Measurement and Evaluation Association (PEMEA)
2011
Keynote Speech
International Forum on Talent Cultivation in Higher and Vocational Education held at Ningbo City, China, sponsored by Ningbo Polytechnic Institute and Institute of Higher Education, Xiamen University, China, June 16, 2011.
Audience: Educators, faculty and researchers
Title: Talent and Expertise Development in Higher and Vocational Education: Utility of Diagnostic Classroom Assessments
Sponsor: Xiamen University, China and Ningbo Polytechnic, China
2010
Invited Lecture
At the Institute of Higher Education, Xiamen University, China on March 15, 2010.
Title: Models of Quality Assessment and Evaluation in Higher Education Systems
Sponsor: Institute of Higher Education, Xiamen University, China
2010
Fulbright Lecture
United States-India Educational Foundation (USIEF), Kolkata, India—60th Anniversary Seminar Series, on February 18, 2010.
Title: Gender Equity in Primary Education in West Bengal and Bangladesh: Educational Opportunities, Achievement Outcomes, and School Completion Rates.
Audience: Education scholars, university-based faculty, policy-makers, leaders from K-12 and higher education institutions, including local Fulbright alumni.
Sponsor: USIEF, Fulbright Commission, Washington, D.C.
2009
Panelist - Panel Discussion
Institute of Medicine, The National Academy of Sciences, now the National Academies of Sciences, Engineering, and Medicine (NASEM), Washington, D.C., on January 8, 2009.
Title: Alternatives and Tradeoffs in Generating and Evaluating Evidence: Perspectives from Education.
Sponsor: Institute of Medicine, Washington, D.C.
2008
Invited Lecture
At the BRAC-Research and Evaluation Division (RED), Dhaka, Bangladesh on March 10, 2008.
Title: Assessing Student Learning: Building Assessment Capacity in Bangladesh’s Schools and Education Systems.
Audience: The Directorate of Primary Education, National Board of Textbook and Curriculum Development, Ministry of Secondary Education, BRAC University-Institute for Educational Development, BRAC Education Programs and BRAC-Research and Evaluation Division.
Sponsor: BRAC-RED and BRAC Education Programs
2008
Invited Special Lecture
At the 12th International and 43rd National Conference of the Indian Academy of Applied Psychology, National Library, Kolkata, India on February 7, 2008.
Title: Mixed-method Designs for Studying Effects of Complex Field Interventions: Criteria for Screening the Type and Grade of Evidence.
Audience: International audience of educational scholars/faculty and psychologists in India.
Sponsor: Indian Statistical Institute, Kolkata, India
2007
Invited Lecture
At BRAC University-Institute for Educational Development, Dhaka, Bangladesh on February 12, 2007.
Title: Assessment and Evaluation in School Organizations.
Audience: Bangladeshi educational scholars/faculty, policy-makers, leaders from K-12 and higher education institutions.
Sponsor: BRAC University-Institute for Educational Development
2007
Invited Lecture
Psychometric Research Unit, Indian Statistical Institute (ISI), Kolkata, India, on January 22, 2007.
Title: Using Structural Equation Modeling to Study the Internal Structure of Attitudinal Measures: The Teacher Readiness for Education Reform (TRER) Scales.
Audience: International audience of faculty and scholars from ISI.
Sponsor: ISI, Kolkata, India
2006
Panelist
Panel discussion at the Inaugural Session of the American Educational Research Association (AERA), Mixed Methods SIG, at the AERA annual meeting on April 7, 2006
Title: Grades of Evidence in Effectiveness Research and How Mixed-method Designs Help
Audience: AERA members.
2006
Panelist
Conference closing panel discussion with the Director of the Institute of Education Sciences, U.S. Department of Education, and others at the annual meeting of the Eastern Evaluation Research Society, an affiliate of the American Evaluation Association, on April 29, 2006.
Title: Rigorous Evaluations: Is There a Gold Standard?
Audience: Eastern Evaluation Research Society members.
2004
Colloquium Speaker
At Fordham University, Department of Clinical Psychology and Psychometrics colloquium series, February 28, 2004.
Title: Designing and Validating Construct Measures using a Unified Process Model.
2003
Panelist
At the American Educational Research Association (AERA)/Institute of Education Sciences (IES) Postdoctoral Fellows' Summer Retreat, August 15, 2003.
Lecture Title (1): Instrumentation and Validity of Indicators.
Lecture Title (2): Knowledge Production through Documentation and Evaluation (with Edmund W. Gordon).
Audience: Postdoctoral fellows and AERA/IES members
Host/Sponsor: AERA
2003
Invited Lecture
At the Eastern Evaluation Research Society, an affiliate of the American Evaluation Association, April 28, 2003.
Title: Models and Methods for Examining Standards-based Reforms
This talk was based on a lead article published in the Review of Educational Research (2002).
Sponsor: Eastern Evaluation Research Society
1991
Invited Panelist
At the American Association of University Women’s meeting, University of South Florida, 1991.
Title: Continuous progress: New directions in elementary education at Pasco County.
1989
Invited Presentation
At the Florida Council on Elementary Education, April, 1989.
Title: Results of the developmental kindergarten study.
NOTE: For papers presented at National and International Conferences, see CV.