Writing to Learn with Automated Feedback through Latent Semantic Analysis (LSA): Experiences Dealing with Diversity in Large Online Courses

Miguel Santamaria Lancho
UNED, Spain
msantamaria@cee.uned.es

Mauro Hernandez
UNED, Spain
mhernandez@cee.uned.es

Jose Maria Luzon Encabo
UNED, Spain
jmluzon@psi.uned.es

Guillermo Jorge-Botana
UNED, Spain
gdejorge@psi.uned.es

Abstract

The increasing demand for higher education and lifelong training has driven a rising supply of online courses, provided both by distance education institutions and by conventional face-to-face universities. At the same time, public universities' budgets have suffered serious cuts, at least in Europe. Owing to this shortage of human and material resources, large online courses face great challenges in providing an extremely diverse student community with quality formative assessment, especially the kind that offers rich and personalized feedback. Peer-to-peer assessment could partially address the problem, but it involves its own shortcomings. The act of writing has been identified as a high-impact learning tool across disciplines, and competence in writing has been shown to aid access to higher education and retention. Writing to learn (WTL) is also a way to foster critical thinking and a suitable method for training soft skills such as analysis and synthesis. These skills underpin other complex learning methodologies such as problem-based learning (PBL) and the case method. However, the WTL approach requires regular feedback from dedicated lecturers, and assessing free-text answers consistently is more difficult than we usually assume, especially in large or massive courses. Multiple-choice objective assessment appears to be an obvious alternative, but the authors feel it shows serious shortcomings when the intended outcomes involve written expression and complex analysis. To face this dilemma, the authors decided to test GRubric, an LSA-based automatic assessment tool developed by researchers from the Developmental and Educational Psychology Department at UNED (Spanish National Distance Education University). The experience was launched in 2014-2015. Using GRubric, we provided automated, iterative formative feedback on our students' answers to open-ended questions (70-200 words).
This allowed students to improve their answers and practice their writing skills, thus contributing both to better organization of concepts and to knowledge building. In this paper, we present the encouraging results of our first two experiences with UNED Business Degree students in 2014/15 and 2015/16.
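To make the underlying idea concrete: LSA scores a free-text answer by projecting documents into a low-dimensional "latent semantic" space via singular value decomposition and comparing them with cosine similarity. The sketch below is a minimal, hypothetical illustration of that general technique using NumPy only; it is not GRubric's actual implementation, and the corpus, function names, and the choice of k dimensions are assumptions for demonstration.

```python
# Minimal LSA-style similarity sketch (hypothetical; NOT GRubric's code).
# Documents become columns of a term-document matrix, SVD projects them
# into a k-dimensional latent space, and an answer is scored by its
# cosine similarity to a reference (model) answer.
import numpy as np

def build_vocab(docs):
    # Map each distinct lowercase token to a row index.
    vocab = sorted({w for d in docs for w in d.lower().split()})
    return {w: i for i, w in enumerate(vocab)}

def term_doc_matrix(docs, vocab):
    # Raw term counts; real systems would typically apply tf-idf weighting.
    M = np.zeros((len(vocab), len(docs)))
    for j, d in enumerate(docs):
        for w in d.lower().split():
            M[vocab[w], j] += 1.0
    return M

def lsa_similarity(reference, answer, corpus, k=2):
    # Append the reference and the student answer as extra documents.
    docs = corpus + [reference, answer]
    M = term_doc_matrix(docs, build_vocab(docs))
    # Truncated SVD: keep only the k largest singular components.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # one latent vector per document
    ref, ans = doc_vecs[-2], doc_vecs[-1]
    denom = np.linalg.norm(ref) * np.linalg.norm(ans)
    return float(ref @ ans / denom) if denom else 0.0
```

An answer semantically close to the reference scores near 1.0 even when it shares only some surface vocabulary, which is what makes the approach usable for formative feedback on short open-ended answers.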
