
Index of Annual NEMP Samples
of Schools and Students

Cycle 1: 1995  1996  1997  1998
Cycle 2: 1999  2000  2001  2002
Cycle 3: 2003  2004  2005  2006
Cycle 4: 2007  2008  2009  2010

2000 Reports

Main samples
In 2000, 2876 children from 260 schools were in the main samples to participate in national monitoring. About half were in year 4, the other half in year 8. At each level, 120 schools were selected randomly from national lists of state, integrated and private schools teaching at that level, with their probability of selection proportional to the number of students enrolled in the level. The process used ensured that each region was fairly represented. Schools with fewer than four students enrolled at the given level were excluded from these main samples, as were special schools and Māori immersion schools (such as Kura Kaupapa Māori).

Late in May 2000, the Ministry of Education provided computer files containing lists of eligible schools with year 4 and year 8 students, organised by region and district, including year 4 and year 8 roll numbers drawn from school statistical returns based on enrolments at 1 March 2000.

From these lists, we randomly selected 120 schools with year 4 students and 120 schools with year 8 students. Schools with four students in year 4 or 8 had about a one percent chance of being selected, while some of the largest intermediate (year 7 and 8) schools had a more than 90 percent chance of inclusion. In the four cases where the same school was chosen at both year 4 and year 8 level, a replacement year 4 school of similar type and size was chosen from the same region and district.
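The report does not spell out the exact selection algorithm, but a draw-by-draw scheme weighted by the year-level roll reproduces the behaviour described above (a four-student school having roughly a one percent chance of selection, the largest intermediates well over 90 percent). The sketch below is a minimal Python illustration under that assumption; the school names and roll figures are invented, not NEMP data.

import random

def select_schools(schools, sample_size, seed=None):
    """Draw schools without replacement, each draw weighted by the number of
    students enrolled at the year level (a simple probability-proportional-to-size
    scheme)."""
    rng = random.Random(seed)
    remaining = list(schools)          # (name, year-level roll) pairs
    chosen = []
    for _ in range(sample_size):
        total_roll = sum(roll for _, roll in remaining)
        pick = rng.uniform(0, total_roll)
        running = 0.0
        for i, (name, roll) in enumerate(remaining):
            running += roll
            if pick <= running:
                chosen.append(name)
                del remaining[i]
                break
    return chosen

# Illustrative only: invented schools and rolls.
schools = [("School A", 4), ("School B", 25), ("School C", 60), ("School D", 180)]
print(select_schools(schools, sample_size=2, seed=1))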

Additional samples
From 1999 onwards, national monitoring has included additional samples of students to allow the performance of special categories of students to be reported.

To allow results for Pacific students to be compared with those of Māori students and other students, 10 additional schools were selected at year 4 level and 10 at year 8 level. These were selected randomly from schools that had not been selected in the main sample, had at least 20 percent Pacific students attending the school, and had at least 12 students at the relevant year level.

To allow results for Māori students learning in Māori immersion programmes to be compared with results for Māori children learning in English, 10 additional schools were selected at year 8 level only. They were selected from Māori immersion schools (such as Kura Kaupapa Māori) that had at least 4 year 8 students, and from other schools that had at least 4 year 8 students in classes classified as Level 1 immersion (80 to 100 percent of instruction taking place in Māori). Only students that the schools reported to be in at least their fifth year of immersion education were included in the sampling process.
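Both additional samples follow the same pattern: filter the national list on the stated eligibility criteria, exclude schools already selected, then draw the extra schools at random. A minimal sketch of that pattern, using the Pacific-sample criteria, is given below; the record fields (name, pct_pacific, year_roll) and the example data are hypothetical, introduced only for illustration.

import random

def additional_sample(schools, already_selected, n=10, seed=None):
    """Randomly pick n eligible schools that are not already in the main sample.
    Eligibility here follows the Pacific-sample criteria described above:
    at least 20 percent Pacific students and at least 12 students at the year level."""
    rng = random.Random(seed)
    eligible = [s for s in schools
                if s["name"] not in already_selected
                and s["pct_pacific"] >= 20
                and s["year_roll"] >= 12]
    return rng.sample(eligible, n)

# Illustrative only: two invented schools, drawing one additional school.
schools = [
    {"name": "School E", "pct_pacific": 35, "year_roll": 40},
    {"name": "School F", "pct_pacific": 10, "year_roll": 25},
]
print(additional_sample(schools, already_selected={"School F"}, n=1, seed=2))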

Pairing small schools
At the year 8 level, 9 of the 120 chosen schools in the main sample had fewer than 12 year 8 students. For each of these schools, we identified the nearest small school meeting our criteria and paired it with the first school. Wherever possible, schools with 8 to 11 students were paired with schools with 4 to 7 students, and vice versa. However, the travelling distances between the schools were also taken into account. Six of the 10 schools in the year 8 Māori immersion sample also needed to be paired with other schools of the same type.

Similar pairing procedures were followed at the year 4 level. Nine pairs were required in the main sample of 120 schools. In one further case, a trio of schools was formed, with four students sampled from each school.
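The pairing rule can be summarised as: for each under-sized school in the sample, take the nearest available school that meets the criteria, preferring to match an 8 to 11 student school with a 4 to 7 student one (and vice versa). The sketch below is an illustrative greedy version of that rule; the distance function and the record fields are assumptions for the example, not part of the report.

def pair_small_schools(small_sample_schools, candidates, distance):
    """For each sampled school with fewer than 12 students at the year level,
    choose the nearest available candidate school, preferring a complementary
    size (8-11 students paired with 4-7 students, and vice versa)."""
    pairs = []
    available = list(candidates)
    for school in small_sample_schools:
        wants_smaller = 8 <= school["year_roll"] <= 11
        def rank(cand):
            complementary = (cand["year_roll"] <= 7) == wants_smaller
            # Complementary sizes sort first, then closer schools.
            return (not complementary, distance(school, cand))
        partner = min(available, key=rank)
        available.remove(partner)
        pairs.append((school["name"], partner["name"]))
    return pairs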

Contacting schools
In the first week of June, we attempted to telephone the principals or acting principals of all schools in the year 8 samples (excluding the 13 schools in the Māori immersion sample). We made contact with all schools during that week or early in the next week.

In our telephone calls with the principals, we briefly explained the purpose of national monitoring, the safeguards for schools and students, and the practical demands that participation would make on schools and students. We informed the principals about the materials that would be arriving in the school (a copy of a 20-minute NEMP videotape plus copies for all staff and trustees of the general NEMP brochure and the information booklet for sample schools). We asked the principals to consult with their staff and Board of Trustees and confirm their participation by the middle of July.

A similar procedure was followed in early August with the principals of the schools selected in the year 4 samples, and they were asked to respond to the invitation by the end of August. The principals of the 16 schools in the Māori immersion sample at year 8 level were contacted towards the end of August, and were sent brochures in both Māori and English.

Response from schools
Of the 296 schools originally invited to participate, 291 agreed. All five schools that declined to participate were in the year 8 sample. Three of these schools said that they needed a break, having participated in 1999. Another had special pressures in 2000, but was willing to participate in 2001. The fifth was a small school dealing with the death of a pupil, and the principal felt under too much pressure. At a later stage, too late for replacements to be organised, two schools in the Māori immersion sample withdrew. One had arranged a two-week field trip overlapping with their chosen assessment week. The other had reservations about participation and decided that they were just too busy.

Sampling of students
With their confirmation of participation, each school sent a list of the names of all year 4 or year 8 students on their roll. Using computer-generated random numbers, we randomly selected the required number of students (12, or 4 plus 8 in a pair of small schools), at the same time clustering them into random groups of four students. The schools were then sent a list of their selected students and invited to inform us if special care would be needed in assessing any of those children (e.g. children with disabilities or limited skills in English).
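A minimal sketch of that within-school step is shown below, assuming an ordinary pseudo-random generator stands in for the computer-generated random numbers mentioned above; the student names and roll size are invented.

import random

def sample_and_group(roll, n_students=12, group_size=4, seed=None):
    """Randomly select n_students from the school roll, then split them
    into random groups of group_size for assessment."""
    rng = random.Random(seed)
    selected = rng.sample(roll, n_students)   # random selection from the roll
    rng.shuffle(selected)                     # random order before grouping
    return [selected[i:i + group_size]
            for i in range(0, n_students, group_size)]

roll = [f"Student {i}" for i in range(1, 31)]   # an invented roll of 30 year 8 students
print(sample_and_group(roll, seed=7))           # three random groups of four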

At the year 8 level, we received 124 comments from schools about particular students. In 55 cases, we randomly selected replacement students because the children initially selected had left the school between the time the roll was provided and the start of the assessment programme in the school, or were expected to be away throughout the assessment week. The remaining 69 comments concerned children with special needs. Each such child was discussed with the school and a decision agreed. Nine students were replaced because they were very recent immigrants or overseas students who had extremely limited English language skills. Sixteen students were replaced because they had disabilities or other problems of such seriousness that it was agreed that the students would be placed at risk if they participated. Participation was agreed upon for the remaining 44 students, but a special note was prepared to give additional guidance to the teachers who would assess them.

In the corresponding operation at year 4 level, we received 145 comments from schools about particular students. Forty-seven students originally selected needed to be replaced because they had left the school, were not actually year 4 students, or were expected to be away throughout the assessment week. Nine students were replaced because of their NESB (non-English-speaking background) status and very limited English. Forty students were replaced because they had disabilities or other problems of such seriousness that the students appeared to be at risk if they participated (31 because of severe disabilities or learning difficulties and 9 because of limited ability to cope emotionally with the assessment situation). Special notes for the assessing teachers were made about 49 children retained in the sample.

Communication with parents
Following these discussions with the school, Project staff prepared letters to all of the parents, including a copy of the NEMP brochure, and asked the schools to address the letters and mail them. Parents were told they could obtain further information from Project staff (using an 0800 number) or their school principal, and advised that they had the right to ask that their child be excluded from the assessment.

At the year 8 level, we received about 20 phone calls, including several from students wanting more information about what would be involved. The main issues raised by parents were our reasons for selection of their child, a wish for fuller details or reiteration of what would be involved, concerns about the use of video equipment, or reluctance of the child to take part. Ten children were replaced as a result of these contacts, two at the child's request, and eight at the parents' request (two families would not allow their child to view videos or use computers on religious grounds, the other six families simply requested that their child not participate).

At the year 4 level we also received about 10 phone calls from parents. Some wanted details confirmed or explained (notably about reasons for selection). Three children were replaced at parents' request (one because of concern about the emotional demands on their child, one because of concern about missing class time, and one because the parents felt the child was not suited to the assessments).

Practical arrangements with schools
On the basis of preferences expressed by the schools, we then allocated each school to one of the five assessment weeks available and gave them contact information for the two teachers who would come to the school for a week to conduct the assessments. We also provided information about the assessment schedule and the space and furniture requirements, offering to pay for hire of a nearby facility if the school was too crowded to accommodate the assessment programme.

Results of the sampling process
As a result of the considerable care taken, and the attractiveness of the assessment arrangements to schools and children, the attrition from the initial sample was quite low. Less than three percent of selected schools did not participate, and less than three percent of the originally sampled children had to be replaced for reasons other than their transfer to another school. The sample can be regarded as very representative of the population from which it was chosen (all children in New Zealand schools at the two class levels except the one to two percent in special schools or schools with fewer than four year 4 or year 8 children).

Of course, not all the children in the sample actually were able to be assessed. Nine year 8 students and 18 year 4 students left school at short notice and could not be replaced. Two year 8 students withdrew too late to be replaced. A further 10 year 8 students and 4 year 4 students were absent from school throughout the assessment week. Some others were absent from school for some of their assessment sessions, and a small percentage of performances were lost because of malfunctions in the video recording process. Some of the students ran out of time to complete the schedules of tasks. Nevertheless, for many tasks over 95 percent of the student sample were assessed. No task had less than 90 percent of the student sample assessed. Given the complexity of the Project, this is a very acceptable level of participation.

Composition of the sample
Because of the sampling approach used, regions were fairly represented in the sample, in approximate proportion to the number of school children in the regions.

Region

Percentages of children from each region

Region                          % of year 4 sample   % of year 8 sample
Northland                               4.2                  5.0
Auckland                               30.8                 30.0
Waikato                                10.0                  9.2
Bay of Plenty/Poverty Bay               8.3                  8.3
Hawkes Bay                              4.2                  5.0
Taranaki                                3.3                  3.3
Wanganui/Manawatu                       5.8                  5.8
Wellington/Wairarapa                   11.7                 10.8
Nelson/Marlborough/W. Coast             4.2                  4.2
Canterbury                             10.8                 11.7
Otago                                   4.2                  4.2
Southland                               2.5                  2.5
Demography

Percentages of children in each category of the demographic variables

Variable            Category                  % of year 4 sample   % of year 8 sample
Gender              Male                              48                   52
                    Female                            52                   48
Ethnicity           Non-Māori                         77                   82
                    Māori                             23                   18
Geographic Zone     Greater Auckland                  29                   30
                    Other North Island                49                   47
                    South Island                      22                   23
Community Size      > 100,000                         57                   55
                    10,000-100,000                    25                   22
                    < 10,000                          18                   23
School SES Index    Bottom 30 percent                 28                   18
                    Middle 40 percent                 36                   46
                    Top 30 percent                    36                   36
Size of School      < 20 y4 students                  15                    –
                    20-35 y4 students                 23                    –
                    > 35 y4 students                  62                    –
                    < 35 y8 students                   –                   25
                    35-150 y8 students                 –                   30
                    > 150 y8 students                  –                   45
Type of School      Full Primary                       –                   33
                    Intermediate                       –                   49
                    Other (not analysed)               –                   18
 
Contact details:      Email : earu@otago.ac.nz   |   Freephone 0800 808 561   |   Fax 64 3 479 7550   |   Updated October 2008
