Alderman and Milne (2005) observe that work-based learning is now a significant part of numerous higher education qualifications; however, its monitoring, among other aspects, remains the subject of much debate. “Work-based learning (WBL) is undertaken in a wide variety of higher education contexts and is increasingly viewed as a valuable, and increasingly essential, component of both the undergraduate and postgraduate student learning experience. However, the development of rigorous pedagogies to underpin WBL and its assessment is still embryonic” (Brodie & Irving 2007). This post attempts to add a little to the existing body of knowledge.
Some qualifications, in light of their particular outcomes, are designed to incorporate periods of required work experience (real-life learning) that integrate with academic study. Where work-integrated learning (WIL) is a structured part of such a qualification, the volume of learning allocated to it (the number of academic credits) should be appropriate to the purpose of the qualification and to the cognitive demands of the learning outcomes and assessment criteria contained in the relevant level descriptors. The preceding is an expanded version of a paragraph (p. 9) in the South African Higher Education Qualifications Framework (HEQF), gazetted as policy in terms of the Higher Education Act by the Department of Education (DoE 2007). Following that paragraph, the HEQF states:
It is the responsibility of institutions which offer programmes requiring WIL credits to place students into WIL programmes. Such programmes must be appropriately structured, properly supervised and assessed.
Two 2004 publications of the Higher Education Quality Committee (HEQC), a permanent structure of the South African Council on Higher Education (CHE), elaborate on the last sentence of the quotation above. The CHE is an independent statutory body responsible for advising the South African Minister of Education on higher education policy matters. The HEQC, led by an Executive Director, has executive responsibility for quality promotion and quality assurance at all higher education institutions. The two sets of HEQC (2004) criteria, namely those for Institutional Audits and those for Programme Accreditation, therefore serve as important imperatives. A synopsis thereof includes:
· Effective management and coordination, clear delineation of responsibilities and adequate provision of resources
· Structured learning programme to accomplish the outcomes and learning agreements
· Communication between host organisations, the institution and students
· Mentoring system (supervision in the work place) that enables the student to recognise strengths and weaknesses, develop abilities, and to gain knowledge of work practices
· Monitoring and recording systems to regularly and systematically assess the progress and learning of students in the workplace
An assessment workshop of the Geraldine Rockefeller Dodge Foundation (Grant 2007:1) generated nine significant concepts and principles regarding assessment, which tie in with the short course mentioned earlier, namely:
1. “The primary purpose of assessment is to improve performance, not audit it”, which ties up with the mentoring imperative of the HEQC.
2. “Good assessment requires being clear about mission and goals, the standards to which you aspire, and the criteria by which you would measure success.
3. Therefore, it is about MEASURING WHAT MATTERS. (If you assess what you value, others will value what you assess.)
4. And, necessarily, it becomes about PLANNING BACKWARDS.
5. Assessment that improves performance involves FEEDBACK.
6. One tool for getting useful feedback on what matters most is the RUBRIC.
7. Good assessment requires a variety of measures, data, and feedback.
8. Good assessment is ongoing. It is about continuous improvement. And unless we designate and protect the TIME to do this work, it will not happen.
9. Done collectively, assessment builds community.”
Astin, Banta, Cross, El-Khawas, Ewell, Hutchings, Marchese, McClenney, Mentkowski, Miller, Moran and Wright (s.a.) support and supplement the nine principles above: “…it begins with educational values; …is most effective when it reflects an understanding of learning as multidimensional, integrated, and revealed in performance over time; …works best when the programs it seeks to improve have clear, explicitly stated purposes; …requires attention to outcomes but also and equally to the experiences that lead to those outcomes; …works best when it is ongoing not episodic; …fosters wider improvement when representatives from across the educational community are involved; …makes a difference when it begins with issues of use and illuminates questions that people really care about; …is most likely to lead to improvement when it is part of a larger set of conditions that promote change; and through assessment, educators meet responsibilities to students and to the public.”
Burns (2008) remarks that assessing the learning of students in clinical conditions is unfortunately not always easy. She mentions patient participation in the assessment of interpersonal skills, but expresses reservations about the ability to assess other competencies in this way. The Making Practice Based Learning Work (s.a.) website aims to enhance the quality of student experiences whilst on practice (i.e. work-based) placements. The website is the product of a collaborative project involving staff from Bournemouth, Northumbria and Ulster universities, funded by a grant from the Department for Employment and Learning (Northern Ireland) and the Higher Education Funding Council for England, to make practitioners more effective in, and to promote the quality of, practice-based learning. In one of the many resources on this website, Allin and Turnock (s.a.) point out that assessment should be based on criteria that are known to the student, and also that assessment should be continuous: a final judgement should never rest on a single incident. Gray (in Allin & Turnock s.a.) observes that the contemporary focus of work-based learning is on competency standards, also called occupational and employment-related standards. Placement or work-based learning is therefore defined by means of outcomes to be achieved by students. Assessment therefore takes on a problem-centred approach rather than a knowledge orientation. Demonstrating competency is about proving attainment of skills, not just writing about them, and assessment becomes an integral part of the learning process. Bortot, Culberson, Modak, Becan-McBride and Nieman (2004) also emphasise that students must master defined competencies.
In contrast to the emphasis on outcomes and competencies, Bates (2003) argues that specific behavioural objectives cannot be specified in advance, because the moments of challenge cannot be predicted. Peterson (2004:60) described such learning opportunities as “ambiguous, messy, and unclear” and observes that students “often do unexpected and serendipitous things”. Clark (2006:584) emphasises, in addition to the “scientific dimensions of practice” and the “technical knowledge and skills of the profession”, also “the so-called artistic … the ability to grapple with the gray or indeterminate areas of practice where moral ambiguity, value conflicts, and ethical dilemmas are commonplace”. Raths (in Bates 2003) states that the work-based learning (WBL) curriculum ought to be based on propositions about learning rather than specifically prescribed objectives. Bates (2003:303) found that a substantial body of literature addresses curriculum design and expected learning outcomes, but she did not find studies analysing “how students make sense” of the workplace experience. Baxter-Magolda (in Bates 2003) uses the term ‘self-authorship’ for the process that takes place as students incorporate their newfound learning into their self-concept: “students construct knowledge as they construct ideas [that] they form about themselves” (p. 322). The WBL journals of students indicate that significant things happen unpredictably during WBL and that each placement entails different challenges. Gibbs & Angelides (2004) to some extent echo this by arguing that the assessment practices of higher education ought not to enframe students, but rather to liberate them through a notion of ‘letting learn’. Clark (2006:581) phrases the notion well by saying that “learning is a continuous process grounded in experience, not an outcome”.
Ulmer (2001:68) cautioned against ‘patterning’ instead of learning, which, oversimplified, results in “Give me an example of how to think and do, and I’ll think and do like you.” Ulmer (2001) argues for problem-based learning, that is, “learning that results from the process of working toward the understanding or resolution of a problem” (Barrows & Tamblyn in Peterson 2004), and for self-grading to achieve formative assessment. Ulmer emphasises that it is important that students know how they are learning, that they know whether they are thinking critically, and that they can reflect on their own patterns of thought. Buchy & Quinlan (2000) made monitoring a participatory activity through a self-assessment scoring matrix, which develops students’ reflective thinking as well as critical awareness of their own learning. Morton, Anderson, Frame, Moyes and Cameron (2006), however, found a stark contrast between the self-assessment of students and objective performance-based assessment of medical procedural competency.
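Purely as an illustration of what such a participatory instrument might look like in practice, and not as a representation of Buchy & Quinlan’s actual matrix, the sketch below models a self-assessment scoring matrix in Python; all criteria, rating levels and names in it are invented for the example. The student records self-ratings against placement criteria, a mentor records independent ratings, and the points of divergence become the agenda for a formative feedback conversation, which is also where the kind of gap reported by Morton et al. (2006) would surface.

```python
# Hypothetical sketch of a self-assessment scoring matrix for work-integrated
# learning. Criteria, levels and ratings below are invented for illustration.

from dataclasses import dataclass, field

LEVELS = {1: "novice", 2: "developing", 3: "competent", 4: "proficient"}

@dataclass
class ScoringMatrix:
    """Grid of placement criteria with self- and mentor ratings (1-4)."""
    criteria: list
    self_ratings: dict = field(default_factory=dict)
    mentor_ratings: dict = field(default_factory=dict)

    def rate_self(self, criterion: str, level: int) -> None:
        self.self_ratings[criterion] = level

    def rate_by_mentor(self, criterion: str, level: int) -> None:
        self.mentor_ratings[criterion] = level

    def feedback_gaps(self):
        """Yield criteria where the self-rating and mentor rating diverge,
        i.e. the points most worth discussing in formative feedback."""
        for criterion in self.criteria:
            own = self.self_ratings.get(criterion)
            mentor = self.mentor_ratings.get(criterion)
            if own is not None and mentor is not None and own != mentor:
                yield criterion, LEVELS[own], LEVELS[mentor]

# Example use with invented placement criteria.
matrix = ScoringMatrix(criteria=["communication with clients",
                                 "applying theory to practice",
                                 "reflective record keeping"])
matrix.rate_self("communication with clients", 4)
matrix.rate_by_mentor("communication with clients", 2)
matrix.rate_self("applying theory to practice", 3)
matrix.rate_by_mentor("applying theory to practice", 3)

for criterion, own, mentor in matrix.feedback_gaps():
    print(f"{criterion}: student sees '{own}', mentor sees '{mentor}'")
```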
It seems that there are two camps: one emphasising competencies and another that believes specifying them in advance is limiting. There appears to be a correlation between these schools of thought and the nature of the field of study: some fields do require specified competence, others are more open, and some require both.
REFERENCES
Alderman, B. & Milne, P. 2005. A model for work-based learning. Metuchen, NJ: Scarecrow Pr.
Allin, L. & Turnock, C. s.a. Helping you understand assessment [online]. Making Practice Based Learning Work. Available on the Internet at: http://www.practicebasedlearning.org/students/docs/Assessment%20in%20the%20Work%20Place%20for%20students.doc
Astin, A.W.; Banta, T.W.; Cross, K.P.; El-Khawas, E.; Ewell, P.T.; Hutchings, P.; Marchese, T.J.; McClenney, K.M.; Mentkowski, M.; Miller, M.A.; Moran, E.T. & Wright, B.D. s.a. Principles of Good Practice for Assessing Student Learning [online]. Mercer University. Available on the Internet at: http://www2.mercer.edu/Assessment/Principles.htm
Bates, M. 2003. The assessment of work integrated learning: symptoms of personal change. Journal of Criminal Justice Education, 14(2), 303-326.
Bortot, A.T.; Culberson, J.W.; Modak, I.; Becan-McBride, K. & Nieman, L.Z. 2004. Assessment of competencies in physicians-in-training through the delivery of a community-based health curriculum using distance learning. Medical Teacher, 26(7), 615-620.
Brodie, P. & Irving, K. 2007. Assessment in work-based learning: investigating a pedagogical approach to enhance student learning. Assessment & Evaluation in Higher Education, 32(1), 11-19.
Buchy, M. & Quinlan, K.M. 2000. Adapting the scoring matrix: a case study of adapting disciplinary tools for learning centred evaluation. Assessment & Evaluation in Higher Education, 25(1), 81-91.
Burns, D. 2008. A test of fitness: Dianne Burns asks if the profession should adopt new approaches to assessing student learning. Nursing Standard, 61(1), 23 April 2008, 22-23.
Clark, P.G. 2006. What would a theory of interprofessional education look like? Some suggestions for developing a theoretical framework for teamwork training. Journal of Interprofessional Care, 20(6), 577-589.
Dictionary.com. s.a. Clinical [online]. Available on the Internet at: http://dictionary.reference.com/browse/clinical
DoE see South Africa. Department of Education.
Gibbs, P. & Angelides, P. 2004. Accreditation of knowledge as being-in-the-world. Journal of Education and Work, 17(3), September 2004, 333-346.
Grant, D. 2007. Assessment Principles & Concepts [online]. Geraldine Rockefeller Dodge Foundation. Available on the Internet at: http://www.grdodge.org/learning/assessment/principles.htm
HEQC see South Africa. Council on Higher Education, Higher Education Quality Committee.
Making Practice Based Learning Work [online]. s.a. Available on the Internet at: http://www.practicebasedlearning.org/home.htm
Morton, J.; Anderson, L.; Frame, F.; Moyes, J. & Cameron, H. 2006. Back to the future: teaching medical students clinical procedures. Medical Teacher, 28(8), 723-728.
Peterson, T.O. 2004. Assessing performance in problem-based service-learning projects. New Directions for Teaching and Learning, 100, Winter 2004, 55-63.
South Africa. Department of Education. 2007. Higher Education Qualifications Framework. (Government Notice No 928), Government Gazette (No. 30353), 5 October 2007.
South Africa. Council on Higher Education, Higher Education Quality Committee. 2004. Criteria for Institutional Audits. Pretoria: Compress. Available on the Internet at: http://www.che.ac.za/documents/d000061/CHE_Institutional-Audit-Criteria_June2004.pdf
South Africa. Council on Higher Education, Higher Education Quality Committee. 2004. Criteria for Programme Accreditation. Pretoria. Available on the Internet at: http://www.che.ac.za/documents/d000084/CHE_accreditation_criteria_Nov2004.pdf
Starr-Glass, D. 2005. Metaphors and maps in evaluation. Assessment & Evaluation in Higher Education, 30(2), April 2005, 195-207.
Ulmer, M.B. 2001. Self-grading for formative assessment in problem-based learning. Academic Exchange Quarterly, 5(1), 68-71.