Technology Assisted Student Assessment Research: Phase 2

Assessment technologies show promise in Canadian K-12 schools, but require further examination

Summary

The second phase of the Technology Assisted Student Assessment (TASA) project pursued six avenues of inquiry. The project undertook an independent evaluation of an online assessment pilot, synthesized research on computer-scored essays, reviewed the degree to which paper and online tests produce the same results, examined accessibility for students with special needs, presented a cost-benefit analysis of the return on investment (ROI) in education technology, and produced a guide to existing e-portfolio technologies. The goal of the project was to link provincial ministries of education with assessment technology development firms. The researchers set out to provide informed recommendations about the evaluative quality and feasibility of implementing new technologies in jurisdictions across Canada.

Grant Outputs

Automated Essay Scoring: A literature review – http://maxbell.org/sites/default/files/036.pdf

Automated Essay Scoring (AES) draws upon diverse disciplines, including writing instruction, computational linguistics, and computer science. This report provides a literature review of the state of AES research and its implications for K-12 schools in Canada. It concludes with a series of recommendations, including calls for further research on how best to reproduce human marking, further examination of whether AES can be adjusted to reflect rubrics in different municipalities, and testing to determine whether detailed, descriptive feedback could be provided to students so that computerized test responses are more personalized.

Establishing a Business Case for Transition to Online Testing: Factors for a Return on Investment (ROI) Analysis –

This report examines the long-term return on investment (ROI) schools might expect from implementing technology-based testing. The findings are based on an analysis of Alberta Education’s Science 30 online testing pilot. The report concludes that the potential cost savings of moving test and examination development to an online environment could prove substantial.
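For readers unfamiliar with the term, an ROI analysis of this kind weighs the benefits of a change, such as avoided printing, shipping, and hand-scoring costs, against the costs of the technology investment. A generic formulation, offered here for illustration only and not drawn from the report itself, is:

ROI = (total benefits - total costs) / total costs x 100%

A positive ROI over the period analysed indicates that long-term savings outweigh the up-front transition costs.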

Innovations in Testing Technology: From promise to practice (Conference) – https://www.scienceandtechnologynetwork.ca/main/modules/news/article.php?storyid=26

The purpose of this conference was to bring together experts and partners in the fields of technology and education assessment to share knowledge and discuss future research opportunities. Eighty-five people attended the event, including representatives from six provincial Ministries of Education, school districts, and technology and assessment companies, as well as educators and researchers.

Comparisons Between Paper and Computer Based Tests: Foundation Skills Assessment Data, 2001-2006 – http://maxbell.org/sites/default/files/038.pdf

This report is based on the results from 15 schools that chose to administer the Foundation Skills Assessment tests electronically to Grade 7 students between 2004 and 2006. The study found that students performed significantly better on both the numeracy and reading multiple-choice components when tests were delivered on paper, and the difference between paper and electronic modes was greater for male students than for female students. The report suggests that there appears to be little significant difference between paper and electronic modes for the reading constructed-response tests, and no difference for the writing focused-response or writing extended-response tests.
