BACKGROUND
New Zealand's national monitoring of students' educational achievements started in 1995. This marked the realisation of a succession of recommendations, spanning at least thirty years, from governmental enquiries and reports which highlighted the need for regular, dependable and consistent information about the educational achievements, attitudes and interests of New Zealand students. Prior to 1995 New Zealand had no national programme for systematically monitoring student learning outcomes. While valuable information has been available through participation in international IEA surveys, these cover only some areas of our mandatory curriculum and provide only modest coverage of learning outcomes. Furthermore, they are restricted mainly to paper and pencil tests and questionnaires which are not tailored to New Zealand's school system and curriculum. National monitoring, in contrast, covers most of the curriculum in diverse ways to give a rich picture of student achievement.

PURPOSE
The National
Education Monitoring Project (NEMP) is a national assessment programme with
the purpose of obtaining a dependable national picture of what New Zealand
students know and can do. The programme monitors achievement trends over
time and provides information which is particularly relevant to the work
of both policy makers and practitioners. Because national monitoring also
meets a public accountability function, its descriptive reports on students'
achievements and attitudes are widely circulated. An important goal of the project is to help identify areas of strength, areas of concern, and priorities for future improvement in student achievement.

ORGANISATION AND ADMINISTRATION OF NEMP

Random Samples of Schools and Students
Each year a nationally representative random sample of 2,880 students was drawn from 260 schools chosen at random from national lists of state, integrated and private schools; half of the students were at year 4 (ages 8-9) and half at year 8 (ages 12-13). From each school, or pair of neighbouring small schools, twelve students were randomly chosen to take part. The twelve students were then assigned to three groups of four, and each group worked on a different set of tasks across all of the curriculum areas being assessed that year. Over a period of a week, each student typically took part in about four hours of assessment activities. Schools, parents and students were individually notified of their selection to take part in national monitoring, and given the opportunity to withdraw.
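
The sampling and group assignment just described can be pictured with a small sketch. The following Python fragment is purely illustrative and is not NEMP's actual selection software: the school list, roll sizes and student names are hypothetical, the pairing of small neighbouring schools and the year 4/year 8 split are ignored, and 240 school visits of twelve students each are used simply because 240 visits of twelve students account for the 2,880 students mentioned above.

    import random

    rng = random.Random(1995)  # arbitrary seed, for a reproducible illustration

    # Hypothetical national school list: school name -> list of student names.
    national_list = {
        f"school_{n:03d}": [f"student_{n:03d}_{i:02d}" for i in range(30)]
        for n in range(1, 1001)
    }

    TASK_SETS = ["task set A", "task set B", "task set C"]

    def assign_groups(roll, rng):
        """Randomly choose twelve students and split them into three groups of
        four, each group receiving a different set of tasks."""
        students = rng.sample(roll, 12)
        groups = [students[i:i + 4] for i in range(0, 12, 4)]
        return dict(zip(TASK_SETS, groups))

    # 240 school visits of twelve students each = 2,880 students in total.
    selected = rng.sample(sorted(national_list), 240)
    allocation = {school: assign_groups(national_list[school], rng)
                  for school in selected}

    print(allocation[selected[0]]["task set A"])  # one group of four students
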
Approximately 11,500 students from over 1,000 schools took part in the first four years of national monitoring. The project's directors made over 1,000 personal contacts with school principals and held numerous discussions with individual parents. In the first four years ten schools withdrew, for reasons not related to the project, and were subsequently replaced. The number of students replaced for reasons other than change of school or absence averaged approximately 2 percent of the original sample.

Learning Areas Assessed
Within repeating four-year cycles, NEMP assessed and reported on all major curriculum areas. This reflects the considerable value and importance New Zealand attaches to a broad-based curriculum which relates to the world around the school, and to learning for life beyond school.
In the first four years of national monitoring, assessments were reported in 15 areas of the curriculum: science, art, information skills (graphs, tables and maps) (1995); reading and speaking, aspects of technology, music (1996); mathematics, social studies, information skills (library and research) (1997); writing, listening and viewing, health and physical education (1998).

Curriculum Advisory Panels
The process of identifying the key learning outcomes to be assessed (knowledge, skills, understandings and attitudes) and deciding on suitable assessment tasks involved curriculum advisory panels made up of curriculum specialists, classroom practitioners and Māori educators. These panels helped to draw up assessment frameworks that guided the development and selection of tasks, without being unduly prescriptive or limiting, and that also helped those interested in contributing task ideas. The panels played an important part in generating task ideas and guiding the final selection of the tasks used in the assessment programme.
National monitoring drew upon the experience and insights of 65 nationally acknowledged curriculum specialists, classroom practitioners and Māori educators who were members of the project's nine curriculum panels and its Māori Reference Group, Te Pitau Whakarei.

Task Administration and Marking
Each year more than 100 national monitoring tasks and survey questionnaires were administered by trained teachers seconded from their own schools for periods of six weeks. Each teacher attended a one-week national training workshop and then spent the following five weeks working with a paired colleague in a group of selected schools. Each week the pair visited either one school, where twelve randomly selected students were assessed, or two small neighbouring schools which together provided the required twelve students. At the conclusion of the assessment programme in schools, tasks were marked by senior tertiary education students and teachers. Tertiary students generally marked tasks which required reasonably clear-cut answers, while teachers marked tasks requiring a higher degree of experienced professional judgement. The professional development benefits of these involvements are an established feature of the project.
In the first four-year cycle of national monitoring, 384 teachers had opportunities to work as task administrators, and 120 senior tertiary students and 660 experienced teachers were involved in marking. Systematic feedback each year from all groups showed a high level of professional satisfaction and extended understandings of assessment principles and methods.

Approaches to Assessment
Assessing a full cross-section of students across a broad range of curriculum outcomes in authentic, contextualised ways required varied approaches suited to the different processes and outcomes being assessed. National monitoring used five main approaches for presenting assessment tasks, each one allowing students easy access to the support of a trained teacher assessor. Most sessions took approximately one hour.
One-to-one interview: each student worked individually with a teacher, with the whole session recorded on videotape.
Stations: four students worked independently, moving around a series of task activity stations.
Team: four students worked collaboratively, with the session usually recorded on videotape.
Independent: four students worked individually on paper and pencil tasks or art-making tasks.
Open space (physical education): four students, supervised by two teachers, attempted a series of physical skill tasks, with performances recorded on video.
Across the first four years of national monitoring, approximately 15,000 hours of video-recorded performances and 240,000 pages of paper responses (including art works) were gathered for marking from a total of 499 tasks. Total student assessment time amounted to approximately 45,000 hours. Approximately four tonnes of supplies and equipment were in use around New Zealand during each year's assessment programme, and about one million separate pieces of information were produced from the marking of individual tasks each year. The largest proportion of tasks used performance assessment methods; very few involved paper and pencil multiple-choice methods.

Low Stakes, High Impact Assessment
The low stakes nature of national monitoring (no school, teacher or student is identified at any point) created a considerable design imperative: tasks and their presentation had to be such that students would feel strongly inclined to produce and sustain their best efforts, regardless of individual differences in ability, background and experience. The project also recognised that any form of externally administered assessment inevitably constrains the sorts of things students can be asked to do. Despite this, there has been a strong commitment to developing and administering tasks that reflect the best of day-to-day teaching methods, learning experiences, and the world in which we live. Emphasis was placed on assessing important or "big picture" learning outcomes in order to sum up students' achievements at particular points in time.
Every student who took part in national monitoring assessment was asked to rate their impressions of the tasks they attempted as "really enjoyed" or "not enjoyed". Feedback on this variable is useful for gauging task attributes that might have a negative impact on student performance. Of the 499 tasks administered in the first four years, only 4 had fewer students saying they really enjoyed them than saying they did not enjoy them, and each of these was at the year 8 level.
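
As a purely illustrative aside (the ratings data and analysis code below are hypothetical, not NEMP's), the comparison reported above amounts to a simple per-task tally of the two rating categories:

    from collections import Counter

    # Hypothetical records: (task identifier, one student's rating).
    ratings = [
        ("task_001", "really enjoyed"),
        ("task_001", "not enjoyed"),
        ("task_002", "not enjoyed"),
        ("task_002", "not enjoyed"),
        ("task_002", "really enjoyed"),
    ]

    # Tally ratings per task.
    tallies = {}
    for task, rating in ratings:
        tallies.setdefault(task, Counter())[rating] += 1

    # Flag tasks where "not enjoyed" responses outnumber "really enjoyed" ones.
    flagged = [task for task, counts in tallies.items()
               if counts["not enjoyed"] > counts["really enjoyed"]]
    print(flagged)  # -> ['task_002']
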

Reporting NEMP Results
The project attaches great importance to wide distribution of, and easy access to, its reports on student achievement. Annual reports of assessment results provide task-by-task descriptive information in preference to statistically aggregated data, which can be largely meaningless and easily misused. Approximately one third of each year's tasks will be used again as "link tasks" in the second four-year cycle to allow comparisons of performance over time; the descriptive details of these tasks will not be fully published until they have been repeated. The relative performance of subgroups was also reported using nine demographic variables. Prior to public release, all reports were examined by national reporting forums made up of curriculum and assessment specialists, teachers and Māori educators. Each year the forum produced a summary statement of key findings and highlighted implications for policy and practice.
In the first four years of national monitoring a total of approximately 210,000 copies of reports were distributed to major educational institutions and agencies, and to every New Zealand school and its Board of Trustees. Around 170,000 copies of the Forum Comment, which highlights major findings and their implications for policy and practice, were also distributed.