Main Samples, Assessed in English
In 2003, 2878 children from 252 schools were selected in the main samples
to participate in national monitoring. Half were in year 4 and the other
half in year 8. At each level, 120 schools were selected randomly
from national lists of state, integrated and private schools teaching
at that level, with their probability of selection proportional to
the number of students enrolled in the level. The process used ensured
that each region was fairly represented. Schools with fewer than
four students enrolled at the given level were excluded from these
main samples, as were special schools and Māori immersion schools
(such as Kura Kaupapa Māori).
Late in May 2003, the Ministry of Education provided computer files
containing lists of eligible schools with year 4 and year 8 students,
organised by region and district, including year 4 and year 8 roll
numbers drawn from school statistical returns based on enrolments
at 1 March 2003.
From these lists, we randomly selected 120 schools with year 4 students
and 120 schools with year 8 students. Schools with four students
in year 4 or 8 had about a one percent chance of being selected,
while some of the largest intermediate (year 7 and 8) schools had
a more than 90 percent chance of inclusion. In the two cases where
the same school was chosen at both year 4 and year 8 levels, a replacement
year 4 school was chosen from the same region and district, matched
as closely as possible on school type and size.
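To illustrate the kind of selection used, a minimal sketch of probability-proportional-to-size (PPS) sampling is given below, in Python. The school names, roll figures and the systematic PPS method shown are illustrative assumptions; they are not the Project's actual data or code.

import random

def pps_sample(schools, n):
    # Systematic PPS sampling: each school's chance of selection is
    # proportional to its roll at the relevant year level.
    # `schools` is a list of (name, roll) pairs; illustrative only.
    schools = list(schools)
    random.shuffle(schools)                      # random ordering
    total = sum(roll for _, roll in schools)
    step = total / n                             # sampling interval
    start = random.uniform(0, step)
    points = [start + i * step for i in range(n)]

    selected, cumulative, idx = [], 0.0, 0
    for name, roll in schools:
        cumulative += roll
        while idx < len(points) and points[idx] <= cumulative:
            selected.append(name)
            idx += 1
    return selected

# Example: a school with 4 students is rarely chosen, while a school
# with 250 students is chosen most of the time.
example = [("Small School", 4), ("Medium School", 60), ("Large School", 250)]
print(pps_sample(example, 1))

This mirrors the pattern reported above, where schools with four students at the level had about a one percent chance of selection and the largest intermediate schools had a more than 90 percent chance.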
Māori Immersion Sample, Assessed Predominantly in Te Reo
Details of the sample for the Māori immersion assessments
are given in the separate report Assessment Results for Māori
Students 2003: Science; Visual Arts; Graphs, Tables and Maps.
Pairing Small Schools
At the year 8 level, six of the 120 chosen schools in the main sample
had fewer than 12 year 8 students. For each of these schools, we identified
the nearest small school meeting our criteria and paired the two schools.
Wherever possible, schools with eight to eleven students were paired
with schools with four to seven students, and vice versa.
However, the travelling distances between the schools were also taken
into account.
Similar pairing procedures were followed at the year 4 level. Four
pairs and one trio of very small schools were included in the sample
of 120 schools.
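A sketch of the kind of pairing rule this describes is given below, assuming each small-school record carries a roll count and map coordinates; the field names, distance measure and greedy matching are illustrative assumptions rather than the Project's actual procedure.

import math

def pair_small_schools(selected_small, candidate_small):
    # Pair each selected small school (fewer than 12 students) with the
    # nearest available small school, preferring complementary sizes
    # (8-11 students with 4-7, and vice versa). Sketch only.
    def distance(a, b):
        # straight-line distance; a real procedure would weigh
        # actual travelling distance between the schools
        return math.hypot(a["x"] - b["x"], a["y"] - b["y"])

    def complementary(a, b):
        return ((8 <= a["roll"] <= 11 and 4 <= b["roll"] <= 7) or
                (4 <= a["roll"] <= 7 and 8 <= b["roll"] <= 11))

    pairs, available = [], list(candidate_small)
    for school in selected_small:
        pool = [c for c in available if complementary(school, c)] or available
        partner = min(pool, key=lambda c: distance(school, c))
        available.remove(partner)
        pairs.append((school["name"], partner["name"]))
    return pairs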
Contacting Schools
In the first week of June, we attempted to telephone the principals
or acting principals of all schools in the year 8 sample. In these
calls, we briefly explained the purpose of national monitoring, the
safeguards for schools and students, and the practical demands that
participation would make on schools and students. We informed the
principals about the materials which would be arriving in the school
(a copy of a 20-minute NEMP videotape plus copies for all staff and
trustees of the general NEMP brochure and the information booklet
for sample schools). We asked the principals to consult with their
staff and Board of Trustees and confirm their participation by the
end of June.
A similar procedure was followed at the end of July with the principals
of the schools selected in the year 4 sample, and they were asked
to respond to the invitation by the end of August.
Response from Schools
Of the 252 schools originally invited to participate, 246 agreed.
Four schools in the year 8 sample declined to participate: two intermediate
schools and one independent school because of severe space problems
associated with active building projects, and an intermediate school
because of an extreme overload of special projects. Two schools were
replaced in the year 4 sample: an integrated primary school because
of major building work and resultant space problems, and another
integrated school that failed to respond to repeated communications
after initially expressing interest in participating.
The six schools not participating were replaced by schools from the
same district, matched as closely as possible on size and socio-economic
decile rating.
Sampling of Students
Each school sent a list of the names of all year 4 or year 8 students
on their roll. Using computer-generated random numbers, we randomly
selected the required number of students (12, or four plus eight
in a pair of small schools), at the same time clustering them into
random groups of four students. The schools were then sent a list
of their selected students and invited to inform us if special care
would be needed in assessing any of those children (e.g. children
with disabilities or limited skills in English).
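The within-school selection step can be sketched as follows; the function name and the list-of-names input are illustrative assumptions, and only the effect of the Project's random-number procedure is shown, not the procedure itself.

import random

def select_and_group(roll, n_students=12, group_size=4):
    # Randomly select n_students from the school's roll and cluster
    # them into random groups of group_size. random.sample returns
    # the chosen students in random order, so consecutive slices
    # already form random groups. Illustrative sketch only.
    selected = random.sample(roll, n_students)
    return [selected[i:i + group_size]
            for i in range(0, n_students, group_size)]

# Example: 12 students drawn from a roll of 30, in three groups of four
roll = ["student_%02d" % i for i in range(1, 31)]
print(select_and_group(roll))

For a pair of small schools, the same approach would be applied with four students drawn from one school and eight from the other.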
For the year 8 sample, we received 112 comments from 67 schools about
particular students. In 54 cases, we randomly selected replacement
students because the children initially selected had left the school
between the time the roll was provided and the start of the assessment
programme in the school, or were expected to be away or involved
in special activities throughout the assessment week, or had been
included in the roll by mistake. The remaining 58 comments concerned
children with special needs. Each such child was discussed with the
school and a decision agreed. Fourteen students were replaced because
they were very recent immigrants or overseas students who had extremely
limited English language skills. Thirteen students were replaced
because they had disabilities or other problems of such seriousness
that it was agreed that the students would be placed at risk if they
participated. Participation was agreed upon for the remaining 31
students, but a special note was prepared to give additional guidance
to the teachers who would assess them.
For the year 4 sample, we received 92 comments from 55 schools
about particular students. Thirty-four students originally selected
were replaced because the lists originally supplied were incorrect
or the student had left the school or was expected to be away
throughout the assessment week. Nine students were replaced because
of their non-English speaking background (NESB) “status” and
very limited English. Twenty-two students were replaced because
they had disabilities or other problems of such seriousness the
students appeared to be at risk if they participated. Special
notes for the assessing teachers were made about 27 children
retained in the sample.
Communication with Parents
Following these discussions with the schools, Project staff prepared
letters to all of the parents, including a copy of the NEMP brochure,
and asked the schools to address the letters and mail them. Parents
were told they could obtain further information from Project staff
(using an 0800 number) or their school principal, and advised that
they had the right to ask that their child be excluded from the assessment.
At the year 8 level, we received a number of phone calls including
several from students or parents wanting more information about what
would be involved. Seven children were replaced as a result of these
contacts, two at the child’s request and five at the parents’ request.
At the year 4 level we also received several phone calls from
parents. Some wanted details confirmed or explained (notably
about reasons for selection). Three children were replaced at
their parents’ request
and one at the child’s request.
Practical Arrangements with Schools
On the basis of preferences expressed by the schools, we then allocated
each school to one of the five assessment weeks available and gave
them contact information for the two teachers who would come to the
school for a week to conduct the assessments. We also provided information
about the assessment schedule and the space and furniture requirements,
offering to pay for hire of a nearby facility if the school was too
crowded to accommodate the assessment programme. This proved necessary
in several cases.
Results of the Sampling Process
As a result of the considerable care taken, and the attractiveness
of the assessment arrangements to schools and children, the attrition
from the initial sample was quite low. Only about two percent of
selected schools in the main samples did not participate, and less
than two percent of the originally sampled children had to be replaced
for reasons other than their transfer to another school or planned
absence for the assessment week. The main samples can be regarded
as very representative of the populations from which they were chosen
(all children in New Zealand schools at the two class levels except
the one to two percent in special schools, in Māori immersion
programmes, or in schools with fewer than four year 4 or year 8 children).
Of course, not all the children in the samples actually could be
assessed. One place in each sample was not filled because insufficient
students were available in a school. Nine year 8 students and 23
year 4 students left school at short notice and could not be replaced.
Parents withdrew a year 8 student too late to be replaced, and four
year 8 students declined to participate when first approached by
the teacher administrators. Two year 8 students were suspended at
the time of the assessments and a year 4 student was found to be
misclassified (actually year 5). Seven year 8 students and five year
4 students were absent from school throughout the assessment week.
Some other students were absent from school for some of their assessment
sessions, and a small percentage of performances were lost because
of malfunctions in the video recording process. Some of the students
ran out of time to complete the schedules of tasks. Nevertheless,
for many tasks, over 95 percent of the sampled students were assessed.
No task had less than 90 percent of the sampled students assessed.
Given the complexity of the Project, this is a very acceptable level
of participation.
Composition of the Sample
Because of the sampling approach used, regions were fairly represented
in the sample, in approximate proportion to the number of school
children in the regions.