The articles listed below use one or more of the assessments available on this site to measure student learning and answer a substantive research question. If you publish research using these assessments, please let us know so we can list it here.

"Racial and Gender Achievement Gaps in an Economics Classroom" (Bottan, McKee, Orlov, and McDougall) International Review of Economic Education, 2022. (AESA)

In this paper, we document gender and racial/ethnic achievement gaps over four semesters of an intermediate-level economics course. We find that male underrepresented minority (URM) students earned lower final exam scores than male non-URM students, but this gap disappears when we control for differences in prior preparation. In contrast, female URM students performed significantly worse than female non-URM students, even after controlling for prior preparation. We analyze scores on low-stakes assessments and surveys about study behavior and find that the theory of stereotype threat most consistently explains our results. As these issues are unlikely to be unique to our classroom, we offer several potential pedagogical solutions to address the differences in prior preparation and stereotype threat that underlie the observed achievement gaps.

"Learning during the COVID-19 pandemic: It's not who you teach, but how you teach" (Orlov, McKee, Berry, Boyle, DiCiccio, Ransom, Rees-Jones, and Stoye) Economics Letters, 2021. (ESSA, AESA, IESA-Micro)

We use unique data from seven intermediate economics courses taught at four R1 institutions to examine the effects of the COVID-19 pandemic on student learning. Because the same assessments of course knowledge mastery were administered across semesters, we can cleanly infer the impact of the unanticipated switch to remote teaching in Spring 2020. During the pandemic, total assessment scores declined by 0.2 standard deviations on average. However, we find substantial heterogeneity in learning outcomes across courses. We surveyed course instructors about their pedagogical practices, and our analysis suggests that prior online teaching experience and teaching methods that encouraged active engagement, such as the use of small-group activities and projects, played an important role in mitigating this negative effect. In contrast, we find that student characteristics, including gender, race, and first-generation status, had no significant association with the decline in student performance in the pandemic semester.

"Identifying Students at Risk with a New Math Skills Assessment" (Orlov, McKee, Foster, Bottan, and Thomas) AEA Papers and Proceedings, 2021. (MESA-Foundations, MESA-Intermediate)

Math skills are critical for success in economics courses. Introductory courses require students to be comfortable with arithmetic, algebra, and working with graphs. Intermediate-level courses add calculus to this list. Students with weak math skills usually perform poorly in their economics courses or drop them midway through the semester. If at-risk students can be identified at the beginning of the term, they can be given the support they need to succeed. That support may involve encouraging them to attend office hours regularly, use a department tutoring service, or enroll in a supplemental course. In this paper we present two new math assessments that economics instructors can use to evaluate students early in the semester. We then show how our assessments, given at the beginning of the term, predicted the performance of 900 students in an introductory microeconomics class and 200 students in an intermediate microeconomic theory class. We analyze course completion and grades in both courses and demonstrate a variety of techniques that use math test scores and easy-to-collect covariates to identify students for remedial interventions.

"The Economic Statistics Skills Assessment (ESSA)" (McKee and Orlov) (under review) (ESSA)

Measurements of student knowledge and skills are highly useful both when students enter a course, so that gaps in prerequisite knowledge can be addressed, and when they complete it, so that the impact of any interventions and changes to the course can be evaluated. Final examinations often do not provide the desired coverage and are difficult to compare across terms and institutions. Within most STEM fields, this problem is solved by the use of concept inventories, which are designed as low-stakes standardized assessments of students' core knowledge. With the exception of the Test of Understanding College Economics (TUCE), which tests introductory economics knowledge, economics as a field has lacked such assessments. In this paper, we document the design, development, and validation of the 20-question Economic Statistics Skills Assessment (ESSA), which we created to test student knowledge and understanding of probability and statistics concepts. The assessment was reviewed by economics faculty across multiple public and private institutions, validated via think-aloud interviews with students, and taken by students at multiple institutions at the conclusion of their statistics for economics courses or the start of their econometrics courses. We demonstrate, using statistical analysis, that the items in the ESSA capture whether students have developed an understanding of specific probability and statistics concepts.