The Teaching Practices Inventory (TPI) was modified to be valid for both face-to-face and online course modalities. The result is a new version we are calling the Faculty Inventory of Methods and Practices Associated with Competent Teaching, or F-IMPACT. The original TPI was found to lack face validity based on responses from 90 instructors teaching general education science, mathematics, and social science courses, primarily online, at a Midwestern research university during the COVID-19 pandemic. Several TPI items were rewritten by a team of discipline-based education researchers from the sciences and social sciences together with education researchers. The inventory was further refined, and content validity established, through external review by experts in online education and a diverse group of instructors. The resulting survey was administered to 92 instructors representing 12 departments and teaching in a variety of modalities (online, hybrid, and face-to-face), establishing face validity across modalities and disciplines. Concurrent validity was established by aligning the modified TPI items with items from the Open SUNY Course Quality Review (OSCQR) rubric. The resulting inventory provides a generalized self-report on the prevalence of high-impact practices that is valid for science, mathematics, and social science courses whether offered online, face-to-face, or in a hybrid of the two.
This is the first version of the F-IMPACT survey and is the result of one complete validation round. The second validation round is currently ongoing.
This is the poster presented at the 2021 Physics Education Research Conference on the development and validation of the F-IMPACT.
Student Observed IMPACT
We have also begun developing the Student Observed Inventory of Methods and Practices Associated with Competent Teaching, or SO-IMPACT. The SO-IMPACT is based on the F-IMPACT, with items that would not be observable to students removed (specifically sections V and VI) and language modified to solicit students' observations rather than instructors' intent. This inventory is being designed as a potential replacement for traditional student course evaluations: it focuses on observed, research-verified high-impact practices, omits subjective and affective items, and provides feedback to instructors that is directly actionable. The inventory is currently in early development. A draft has been developed by a team of discipline-based education researchers from the sciences and social sciences together with education researchers, and has been reviewed by faculty across a diverse selection of disciplines (STEM, social science, humanities, and arts) to establish broad, multi-discipline face validity. Initial student focus groups and pilot deployments in live courses are ongoing to determine face validity for the intended audience. Inter-rater reliability will be established by measuring the variability (standard deviation) of SO-IMPACT scores across respondents in the same course. We will also compare SO-IMPACT scores with faculty self-reported F-IMPACT scores and expert observations to determine criterion validity.
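The planned inter-rater reliability check can be sketched as a simple per-course computation: for each course, take the SO-IMPACT totals reported by its student respondents and compute their mean and standard deviation, with low spread suggesting respondents agree on what practices they observed. The course labels, score values, and function name below are hypothetical illustrations, not part of the actual instrument or dataset.

```python
# Minimal sketch of the inter-rater variability check described above:
# mean and standard deviation of respondent scores within each course.
# All course names and score values here are hypothetical placeholders.
from statistics import mean, stdev

def score_spread(scores_by_course):
    """Return {course: (mean, stdev)} of respondent totals per course.

    Courses with fewer than two respondents are skipped, since a sample
    standard deviation needs at least two observations.
    """
    return {
        course: (mean(scores), stdev(scores))
        for course, scores in scores_by_course.items()
        if len(scores) >= 2
    }

# Hypothetical SO-IMPACT totals from respondents in two course sections
example = {
    "PHYS-101": [34, 36, 35, 33],
    "SOC-210": [28, 31, 30],
}
print(score_spread(example))
```

A fuller analysis would likely use a dedicated inter-rater agreement statistic (e.g., an intraclass correlation) rather than raw standard deviations, but the per-course spread above matches the variability measure the text describes.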
Version 1 of the SO-IMPACT will be posted here after the first complete validation and reliability round.
This work has been funded by NSF DUE #2021315.