Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/73498
Full metadata record
DC Field: Value
dc.contributor.author: Papinczak, T.
dc.contributor.author: Peterson, R.
dc.contributor.author: Babri, A.
dc.contributor.author: Ward, J.
dc.contributor.author: Kippers, V.
dc.contributor.author: Wilkinson, D.
dc.date.issued: 2012
dc.identifier.citation: Assessment and Evaluation in Higher Education, 2012; 37(4):439-452
dc.identifier.issn: 0260-2938
dc.identifier.issn: 1469-297X
dc.identifier.uri: http://hdl.handle.net/2440/73498
dc.description.abstract: In small groups, medical students were involved in generating questions to contribute to an online item bank. This study sought to support collaborative question‐writing and enhance students' metacognitive abilities, in particular, their ability to self‐regulate learning and moderate understanding of subject material. The study focused on supporting students to write questions requiring higher order cognitive processes. End‐of‐year formal examinations comprised 25% student‐generated questions (SGQs), while mid‐year examination items were completely unseen. Data were gathered from repeated administration of a questionnaire and from examination results. No statistically significant changes were identified in self‐rated monitoring of understanding and regulation of learning. The activity of generating questions supported students to work collaboratively in developing questions and answers. The bank of questions was appreciated by students as a source of revision material, even though it was not strongly focused on higher order processes. Based on scores, it would appear that many students chose to memorise the question bank as a 'high‐yield' strategy for mark inflation, paradoxically favouring surface rather than deep learning. The study has not directly identified improvements in metacognitive capacity, and this is an area for further investigation. Continual refinement of the study method will be undertaken, with an emphasis on educating students in developing questions that address higher order cognitive processes. Although students may have memorised the questions and answers, there is no evidence that they do not understand the information.
dc.description.statementofresponsibility: Tracey Papinczak, Raymond Peterson, Awais Saleem Babri, Kym Ward, Vaughan Kippers & David Wilkinson
dc.language.iso: en
dc.publisher: Carfax Publishing Ltd
dc.rights: © 2012 Taylor & Francis
dc.source.uri: http://dx.doi.org/10.1080/02602938.2010.538666
dc.subject: assessment
dc.subject: small‐group learning
dc.subject: metacognition
dc.title: Using student-generated questions for student-centred assessment
dc.type: Journal article
dc.contributor.department: Faculty of Health Sciences
dc.identifier.doi: 10.1080/02602938.2010.538666
pubs.publication-status: Published
Appears in Collections:Aurora harvest 5
Medical Sciences publications

Files in This Item:
File: RA_hdl_73498.pdf (Restricted Access, 250.77 kB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.