Teachers acknowledge that the extensive information they gather on student achievement and progress is useful, but they often report that this activity limits teaching time. There is therefore a need to determine the most appropriate and effective formats for gathering comprehensive information on individual student progress, along with an understanding of when and how these should be applied. NEMP uses a variety of task formats to gather rich information on what students know and can do. This probe study endeavoured to determine the appropriateness and effectiveness of different task formats.
Twenty-seven NEMP mathematics and science tasks at the Year 8 level were selected for this study. Parallel versions of these tasks were then constructed in four different formats: multiple-choice, short answer, stations (working independently on tasks set up at stations around the room), and one-to-one interviews. Students were given equipment to use for the interview and stations tasks, but not for the short answer and multiple-choice formats. For interview tasks, the teachers read the questions and student responses were video-recorded. For stations tasks and the multiple-choice format, students read the questions and wrote their answers. A total of 258 students took part in the study; each version of each task was completed by a minimum of 62 students. Tasks were grouped into four sets, with each set having a selection of assessment types. Students were assigned to four groups of equivalent ability (as rated by their classroom teacher).
• Students with lower reading abilities did not do as well in task formats requiring them to read and write. These students struggled to understand task requirements or to read task scripts. Reading load became a greater issue when tasks were set in an everyday context.
• Conversely, there were times when needing to read the question assisted students. Complex questions, questions requiring involved or extended answers, and responses requiring a synthesis of information were answered better when presented in written form. The written format was more appropriate than the oral format because students could control the time they needed to re-read and process the information being presented.
• Oral answers were generally more extended than written answers. In many cases, written answers presented minimal amounts of information while spoken answers were extended (i.e., contained examples and comprehensive explanations). The use of technical terms was also greater in spoken answers, suggesting that students were reluctant to write down terms they were unsure how to spell.
• In most cases, students performed better when equipment was available. Equipment seemed to help students understand questions, support their thinking, and assist them in demonstrating solutions.
By using a variety of assessment task formats, and achieving a closer match between the format used and the type of information required, teachers can improve the quality of the information they gather. In some cases, an adequate picture of student progress can be obtained through checklists. In other cases, particularly when measuring the attainment of objectives reflecting higher-order thinking skills, more complex assessment methods provide a more informative picture.
The full report of this probe study will be available on this website by January 2004, or can be obtained from USEE.