ENSURING CONSISTENCY: THE NEMP PROCESS OF CROSS-MARKING
Nicole Brown
This report describes and reviews the cross-marking process developed by NEMP to score performance-based assessment tasks on a consistent basis. Cross-marking uses facilitated discussion among markers to establish, apply and verify marking criteria. For each set of NEMP tasks, the work of up to six students who have completed the tasks is randomly selected. All NEMP markers then grade this work at strategic points during the marking process so as to build and check consistency.
A group of markers independently graded one student performance, using the criteria given on the marking schedule. Markers were then asked, in a group session, to explain their scores. The goal was to clarify the choices and decision-making processes markers used and to obtain agreement on the grades to be assigned. The marking of each task involved three cross-marking sessions, held at the beginning, middle and end of the marking period. Two student performances were considered in each session. To gauge the consistency of scoring overall,
markers were asked not to change the grades they assigned during the cross-marking sessions. All cross-marking grades were collated into a grid showing the pattern of scores awarded by each marker, then averaged to provide a final score for each cross-marked student script. The cross-marking process was then observed with a view to identifying the factors that hindered or facilitated it. The extent to which the process met its consistency objectives was also considered.
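To make the collation step concrete, the sketch below is purely illustrative: the report does not describe any software, and every name here (the function, the marker and script identifiers) is an assumption used only to show one way grades could be arranged into a marker-by-script grid and averaged to give each script's final score.

```python
# Illustrative sketch only; all names are assumptions, not part of the NEMP report.

def collate_and_average(scores: dict[str, dict[str, float]]) -> dict[str, float]:
    """Collate cross-marking grades into a marker-by-script grid and
    average each script's grades to give its final score.

    `scores` maps marker name -> {script id -> grade awarded}.
    """
    # Build the grid: script id -> list of grades from every marker.
    grid: dict[str, list[float]] = {}
    for marker, by_script in scores.items():
        for script, grade in by_script.items():
            grid.setdefault(script, []).append(grade)

    # The final score for each cross-marked script is the mean of all markers' grades.
    return {script: sum(grades) / len(grades) for script, grades in grid.items()}


# Hypothetical example: three markers scoring two cross-marked scripts.
example = {
    "marker_A": {"script_1": 3, "script_2": 4},
    "marker_B": {"script_1": 3, "script_2": 5},
    "marker_C": {"script_1": 2, "script_2": 4},
}
print(collate_and_average(example))  # {'script_1': 2.67, 'script_2': 4.33} (approx.)
```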
The marking process
• Collaboration and joint ownership help develop good performance
assessment rubrics, and are of more value if they occur during the
marking process.
• Markers need to share their collective experience of students’ responses during scoring and use it to adjust the rubrics where appropriate. Some rubrics designed collaboratively before the marking period proved difficult to apply when used by a different group of markers during the marking itself.
• An active facilitative approach is necessary. Markers may
be uncomfortable about explaining their scoring decisions, so positive
and sensitive facilitation skills are needed to encourage, acknowledge
and consider different points of view. Once markers are comfortable
about contributing to discussion, the facilitation process can focus
on identifying key issues and resolving them through the flexibility
of the marking schedule.
Consistency
• As tasks require greater professional judgement, the consistency
of scoring decreases.
• Consistency in scoring is generally greatest when a student’s
performance is considered particularly strong.
• Where the scores awarded for a single student performance spread across three different grades, the inconsistency seemed to arise from most markers holding a similar view of the performance but coming down on either side of a marking ‘boundary’.
• Teachers who awarded scores falling significantly outside the common range of agreement appeared to be influenced by whether they had independently marked a series of very good or very poor performances just before the cross-marking took place.
Collaborative
discussions about the cross-marking process appear to have a strong
influence on its validity and reliability. The essential principle
of treating the scoring of student performance as a discussion process
involving collaboration, joint ownership and facilitation is one that
can easily be replicated in school settings.
The full report of this probe study will be available on this website by Jan 2004 or can be obtained from USEE.