Participation and Performance on Paper- and Computer-based Low-stakes Assessments

dc.contributor.author: Nissen, Jayson
dc.contributor.author: Jariwala, Manher
dc.contributor.author: Close, Eleanor
dc.contributor.author: Van Dusen, Ben
dc.date.accessioned: 2019-04-15T22:41:34Z
dc.date.available: 2019-04-15T22:41:34Z
dc.date.issued: 2018-05
dc.description.abstract: Background: High-stakes assessments, such as the Graduate Record Examination, have transitioned from paper to computer administration. Low-stakes research-based assessments (RBAs), such as the Force Concept Inventory, have only recently begun this transition to computer administration with online services. These online services can simplify administering, scoring, and interpreting assessments, thereby reducing barriers to instructors' use of RBAs. By supporting instructors' objective assessment of the efficacy of their courses, these services can stimulate instructors to transform their courses to improve student outcomes. We investigate the extent to which RBAs administered outside of class with the online Learning About STEM Student Outcomes (LASSO) platform provide equivalent data to tests administered on paper in class, in terms of both student participation and performance. We use an experimental design to investigate the differences between these two assessment conditions with 1310 students in 25 sections of 3 college physics courses spanning 2 semesters. Results: Analysis conducted using hierarchical linear models indicates that student performance on low-stakes RBAs is equivalent for online (out-of-class) and paper-and-pencil (in-class) administrations. The models also show differences in participation rates across assessment conditions and student grades, but indicate that instructors can achieve participation rates with online assessments equivalent to paper assessments by offering students credit for participating and by providing multiple reminders to complete the assessment. Conclusions: We conclude that online out-of-class administration of RBAs can save class and instructor time while providing participation rates and performance results equivalent to in-class paper-and-pencil tests.
dc.description.department: Physics
dc.format: Text
dc.format.extent: 17 pages
dc.format.medium: 1 file (.pdf)
dc.identifier.citation: Nissen, J. M., Jariwala, M., Close, E. W., & Van Dusen, B. (2018). Participation and performance on paper- and computer-based low-stakes assessments. International Journal of STEM Education, 5(21).
dc.identifier.doi: https://doi.org/10.1186/s40594-018-0117-4
dc.identifier.uri: https://hdl.handle.net/10877/7980
dc.language.iso: en
dc.publisher: Springer Open
dc.rights.license: This work is licensed under a Creative Commons Attribution 4.0 International License.
dc.source: International Journal of STEM Education, 2018, Vol. 5, No. 21.
dc.subject: participation
dc.subject: performance
dc.subject: computer-based test
dc.subject: research-based assessments
dc.subject: hierarchical linear models
dc.subject: Physics
dc.title: Participation and Performance on Paper- and Computer-based Low-stakes Assessments
dc.type: Article

Files

Original bundle

Name: 2018-Close-Participation-and-performance.pdf
Size: 1.26 MB
Format: Adobe Portable Document Format
License bundle

Name: license.txt
Size: 2.54 KB
Description: Item-specific license agreed to upon submission