The learning guarantee programme

S. GIRIDHAR


IT is late night on Saturday, 14 February, and I have just returned after a week in Bellary and Gulbarga. My mind is full of vivid images of unsung heroes and untold stories from some of north-east Karnataka’s most inspirational schools; the pride and joy on the faces of the teachers as 40 outstanding schools that met the criteria defined by the Learning Guarantee Programme are feted and honoured by Azim Premji and Karnataka’s Education Minister, Chandrasekhar, at the learning guarantee award function.

The hall is overflowing with children, head teachers, teachers and members of the school monitoring committees from all parts of north-east Karnataka. The thrill of being winners is all-pervasive. And though the winning schools constitute a small minority, the schools that did not qualify seem determined, even optimistic, about winning next year. Each of the 896 schools that opted for evaluation in the programme in 2003 will receive a detailed analysis of its evaluation, feedback it can use to develop its own action plan.

The Azim Premji Foundation, operational since 2001, implements programmes focused on improving the quality of learning in government schools. The foundation’s approach is to work in partnership with the government; it has a formal memorandum of understanding with the Government of Karnataka to work together for the universalization of elementary education. The foundation chose to concentrate on north-east Karnataka because this area faces pronounced adverse conditions: about 48% of Karnataka’s out-of-school children are from the seven districts of this region and, on every conceivable index of education and economy, the region is clearly disadvantaged.

Data from the state educational census conducted in 2002 showed that 3.19 lakh of the 6.66 lakh out-of-school children in Karnataka come from the north-east districts of the state. The districts with the largest out-of-school population of children in the age group of six to 14 are Yadgir (22.3%), Koppal (16.3%) and Raichur (15.9%).

The literacy rate in this area is 55.78% compared to the state average of 67% and the national average of 65%. The dropout rate is 17% compared to the state average of 13%, and the repetition rate is 6.38% compared to the state average of 4.5%. The region has a teacher to pupil ratio of 1:46 compared to the state average of 1:36.

The poor situation extends to sectors other than education. Child health and nutrition indicators point to higher levels of infant mortality and morbidity and malnutrition than in other parts of the state. Chronically drought prone, this area provides little scope for continuous year-round employment.

 

 

During the course of its work the foundation saw that despite the tough conditions in most of the habitations in north-east Karnataka there were still some schools which stood out from the rest – where there was high enrolment, regular attendance and high learning levels.

The Learning Guarantee Programme was designed to identify schools that achieve the expected learning competencies for all their children; to reward and recognize them; to identify the factors that enable these schools to perform despite constraints; and to communicate their best practices so as to motivate all other schools to emulate them.

Like all programmes of the Azim Premji Foundation, this too is a joint initiative in partnership with the government. The programme is designed to run through a three-year period, with schools evaluated in 2003, ’04 and ’05. Launched in November 2002 in all the districts of north-east Karnataka (Bellary, Bijapur, Bagalkot, Raichur, Bidar, Gulbarga, Yadgir and Koppal), the programme was communicated personally to each of the 9,270 primary and higher primary schools in the region, offering them not only the choice of participation but also the year in which they wished to be evaluated. The criteria for evaluating schools were enrolment, attendance and the learning outcomes of their children.

 

 

Some of the significant aspects of the programme are:

* Participation is voluntary and open to all primary and upper primary schools. There is no screening process except that the schools have to complete all aspects of the application.

* Participating schools have the freedom to choose the year when they wish to be evaluated.

* Schools can decide what they want to do with the award money. They indicate this on the application form.

* Finally, the evaluation of the schools provides an opportunity for teachers and the educational system to understand the status of individual schools. This could feed into plans for strengthening individual schools/clusters of schools.

Given the focus on voluntary participation, it was decided to launch the programme with an emphasis on advocacy and communication: first, to build awareness about learning outcomes and mobilize a broad base of popular support for the programme; second, to reach out to every school in the region to solicit its participation. A significant component of this exercise was the personal communication to each of the 9,270 schools in north-east Karnataka, appealing to the pride and competitive spirit of the habitation and the school. Jingles on the radio and announcements in newspapers reiterated the message so that schools and communities were motivated to come forward and offer themselves for evaluation.

 

 

The objectives and goals of the programme were introduced to the educational officials (DDPIs, BEOs and CRCs) in all the eight districts through a series of district level meetings where representatives of teachers unions, headmasters and other local leaders were also invited.

By December 2002, each of the 9,270 schools in the region had received the programme communication and application forms. An overwhelming 70% of schools (6,484) sent a request for the prospectus. This enthusiastic response showed that schools and teachers were keen to look at learning outcomes, and the awards on offer made the programme all the more attractive. Perhaps the fact that there would be no screening, and that participation was not contingent on current learning levels, also spurred such a response.

The 6,484 schools were given the detailed application forms and prospectus. The participating schools exercised their option of choosing the year in which they wished to be evaluated: 2003, 2004 or 2005. The foundation extended the cut-off date for participation to end-March 2003. In all, 1,888 schools completed the process and qualified as participants; of these, 896 offered themselves for assessment in 2003.

 

 

In the application form, the school was required to provide the following information:

* Medium of instruction, number of classes in the school, number of classrooms and number of teachers.

* Enrolment and attendance data for children from classes 1-7 in the school.

* Data on number of children in the 6-11 age group in the habitation, number enrolled and number of children who were out of school as per the child Census of Karnataka 2002.

* Details of the school examination results.

* The results of a baseline study, done by conducting a test for classes 2 to 5 based on model test papers prepared by the foundation. One purpose of the model test is to give the school an idea of the format of the external evaluation and, more importantly, to enable a self-assessment of learning achievement.

* Requirements of the school on which the award money, if received, would be spent.

* Year/s in which the schools would like to be evaluated under the programme.

Despite the well thought-out steps of reaching out to every school and responding to requests by posting the prospectus, some gaps were still evident during field visits and interactions. Some schools did not receive the applications; others felt they received no help in completing the application process. In some cases, schools that had completed the forms received no confirmation that they were in the programme. In one school the teachers feared that, since the project was for three years, they would not be able to get a transfer. In another, the teachers feared that the evaluation would be used to penalize schools or individual teachers. By and large, however, the participating schools seemed excited about being evaluated.

 

 

After the initial workshops and meetings with education functionaries in November 2002 to introduce the programme, no clear follow-up role was defined for the cluster resource coordinators, block resource persons and block education officers. This shortcoming, sharply brought out in interactions with the block education officers in June 2003, was soon rectified: education functionaries were involved in the smooth conduct and logistics of school evaluation, made responsible for ensuring that schools knew their scheduled week of evaluation, and asked to communicate the teacher awards and give certificates of appreciation to participating schools to keep their morale high.

 

Evaluation criteria

• 100% enrolment: all children in the habitation aged 6 to 11, as per the child census survey 2002, must be enrolled in school;

• 90% of all children enrolled in Class 1 to 5 must attend school on at least 70% of the school working days during the academic year;

and

• learning achievement of children:

* Category A – 80% of all children enrolled in classes 2, 3, 4, 5 should have attained 90% of the prescribed competencies for class 1, 2, 3 and 4 respectively. This entitles the school to an award of Rs 20,000.

* Category B – 70% of all children enrolled in classes 2, 3, 4, 5 should have attained 90% of the prescribed competencies for class 1, 2, 3 and 4 respectively. This entitles the school to an award of Rs 10,000.

* Category C – 60% of all children enrolled in classes 2, 3, 4, 5 should have attained 90% of the prescribed competencies for class 1, 2, 3 and 4 respectively. This entitles the school to an award of Rs 5,000.
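The learning-achievement categories above amount to a simple threshold rule. A minimal sketch of that rule follows; this is an illustration written for this article, not the foundation's actual scoring software, and the function name and inputs are assumptions.

```python
# Illustrative sketch (not the foundation's software) of the award categories
# described above. A child is counted as "attaining" if he or she demonstrates
# at least 90% of the prescribed competencies for the previous class.

def award_category(children_tested, children_attaining):
    """Return (category, award in Rs) for a school's learning results,
    or None if the school does not qualify on learning achievement.
    Enrolment and attendance criteria must also be met for the award."""
    if children_tested == 0:
        return None
    share = children_attaining / children_tested
    if share >= 0.80:
        return ("A", 20000)
    if share >= 0.70:
        return ("B", 10000)
    if share >= 0.60:
        return ("C", 5000)
    return None

# Example: a school where 150 of 200 tested children (75%) attain the
# competencies falls in Category B.
print(award_category(200, 150))  # → ('B', 10000)
```

Note that the rule is applied to the school as a whole only after the enrolment and attendance criteria listed above are satisfied.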

 

The Commissioner for Public Instruction, Government of Karnataka was involved throughout; every communication to the school was signed by him. Similarly, the DSERT and the DIETs helped the foundation in the selection and training of evaluators.

 

 

The next stage involved the preparation and rollout of a massive evaluation exercise: 896 schools in three months, July to September 2003. This called for detailed and meticulous planning, scheduling and logistics; the selection of a large but high quality team of evaluators and their training; close supervision to ensure adherence to quality; and, finally, professional data entry and analysis.

1) A team of 584 evaluators (all graduates, many additionally holding an MSW or B.Ed degree) formed 146 teams. Each school evaluation took an average of five days. The exercise was led by a field project leader, supported by 37 area coordinators positioned in the 37 taluk headquarters to supervise the evaluation teams.

2) The 584 evaluators were selected in June 2003 through a process of preliminary screening of applications and bio-data, a written test, and an interview with a panel consisting of designated officers of DSERT, Karnataka and the Azim Premji Foundation.

3) The foundation trained the evaluators in a five-day residential programme; evaluation team leaders were selected on the basis of their aptitude.

4) The evaluation team members were trained in appropriate child-friendly methods of conducting these tests. A certificate of satisfactory completion of training was a prerequisite for qualifying as a member of the team.

5) An evaluation process manual was prepared (before the training programme for evaluators) following a number of pilot tests and dry runs, and after consultations with Six Sigma quality experts. Professor Nayana Tara of IIM Bangalore provided additional inputs and vetted the manual.

6) The foundation scheduled the 896 school evaluations and informed each school being assessed, in writing, of the dates of its evaluation at least 14 days in advance. These letters were signed by the Commissioner for Public Instruction, underscoring that the programme was as much the Government of Karnataka’s as the foundation’s.

7) Each school evaluation by the four-member team took five working days: three days for testing every child (oral and written tests) in language and mathematics, and two days to collect, verify and record data on the school’s enrolment and attendance records.

8) The foundation used different sets of question papers for each week of the evaluation process.

9) The data was cross-checked and verified at various stages: at the school by the team leader, at the block headquarters by the area coordinator, and at the foundation in Bangalore before data entry and computer processing of results. Surprise checks during evaluation and formal fortnightly reviews of the data collected were part of the quality assurance process.

10) The results were presented for approval to a committee consisting of the CEO of the Azim Premji Foundation; the Commissioner of Public Instruction, Department of Public Instruction, Government of Karnataka; the Director, DSERT Karnataka; and Nayana Tara of IIM Bangalore.

 

The results in a nutshell:

1. 20% of the government primary and higher primary schools in north-east Karnataka (1,888 of 9,270) are voluntarily participating in the programme over the three years.

2. 896 schools (nearly half of the participating schools) chose to be evaluated in 2003.

3. 40 schools have won the Learning Guarantee Programme Award for 2003.

4. There are 12 winning schools in category A (Rs 20,000), 14 in category B (Rs 10,000) and 14 in category C (Rs 5,000).

5. Among the 40 winning schools, 465 girls and 446 boys (12% of the total number of participants) demonstrated attainment of 100% competencies in maths and language.

6. Enrolment and attendance in the schools evaluated were generally high: 53% of the schools met the enrolment criterion and 47% met the attendance criterion. The key differentiator was learning levels: only 7.3% of the schools could meet the learning achievement criterion.

7. The average pupil-teacher ratio (PTR) in the winning schools is 37 against the north-east Karnataka average of 43.

8. The average school strength in the winning schools (class I to V) is 159 as compared to the north-east Karnataka average of 185.

9. Only about 5% of the evaluated primary schools (40 out of 896) ensure learning competencies for a majority of the children in the school.
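The headline ratios in the list above follow directly from the counts reported in this article; a quick sketch recomputing them (rounding explains the small gaps between the exact figures and the rounded ones quoted):

```python
# Recomputing the headline percentages from the counts reported in the article.
participating, total_schools = 1888, 9270  # schools in the programme / in the region
evaluated, winners = 896, 40               # schools assessed in 2003 / award winners

print(f"Participation rate: {participating / total_schools:.0%}")  # → 20%
print(f"Evaluated in 2003:  {evaluated / participating:.0%}")      # → 47% (quoted as "nearly half")
print(f"Winning schools:    {winners / evaluated:.1%}")            # → 4.5% (quoted as "about 5%")
```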

 

 

Will the manpower, time and exhaustiveness of the current model of evaluation be a limiting factor to the speedy expansion of the programme? After all, one of the objectives of a successful experiment is that it should be replicated. If so, what could be the desirable modifications?

One option is to evaluate only the learning achievement of children, and not enrolment and attendance. Since the first year of evaluation shows that most schools are either meeting or close to meeting the enrolment and attendance criteria, why not just test for learning? This approach would save nearly 40% of time and cost.

Additionally, some have suggested evaluating a sample of children instead of testing every child. This would reduce testing time by nearly 50% and, if done scientifically, the results would be statistically reliable. However, there are some points to bear in mind. The children in the participating schools are eager; there are awards for outstanding performances by individual children, and sampling would deny some children that opportunity. Community members may feel that their child has lost a chance to be assessed if he or she is not in the sample. One unique feature of the learning guarantee programme is that, unlike most other programmes, which give us empirical evidence of learning levels based on samples, here every child is tested.
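There is also an arithmetical caveat to the sampling suggestion. A back-of-the-envelope sketch, using the standard sample-size formula with a finite-population correction (the margin of error, confidence level and school size here are illustrative assumptions, not the programme's design), shows that in a school of typical size a reliable per-school estimate still requires testing most of the children:

```python
# Hypothetical illustration: sample size needed per school to estimate the
# share of children attaining competencies, using the standard formula
# n0 = z^2 * p * (1 - p) / e^2 with a finite-population correction.
# All parameter values below are assumptions for illustration only.
import math

def sample_size(pupils, margin=0.05, z=1.96, p=0.5):
    n0 = (z ** 2) * p * (1 - p) / margin ** 2       # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / pupils))  # finite-population correction

# For a school of ~160 pupils (close to the average strength reported above),
# a ±5% margin at 95% confidence still means testing most children:
print(sample_size(160))  # → 114 of 160
```

In other words, sampling saves far more at the aggregate level than within any one small school, which strengthens the article's case for testing every child when the school itself is the unit being awarded.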

Another issue for reflection is the nature of the programme, conceived as an outcome-based exercise to evaluate schools on defined criteria. One school of thought holds that the programme should do no more than hold up a mirror of current achievement to schools, spontaneously spurring them all to strive for quality; that, it argues, is the only way the programme can spread across more states in the country. But is it enough to evaluate schools, give them an analysis of their current learning levels and pinpoint the areas for improvement? Should the programme not also involve itself with schools to help them improve?

 

 

What should be the next phase, and what is the potential for expansion? 1,500 schools have offered themselves for evaluation in 2004. This time, the foundation must ensure early involvement of the government’s block education officers, with well-defined roles and a greater sense of ownership. There is also a plan to assess children on all-round development parameters (even though schools will still be rated on learning achievement as evaluated through competency-based testing), to provide qualitative information on the process of learning and overall development in addition to empirical testing for outcomes. The challenge will be to design appropriate measurement tools.

A study to identify key differences between schools that participated and those who did not has been undertaken by Princeton University. Its findings are expected to be released by March 2004. Dr. Jalaluddin and his organization, NEEV, are conducting case studies of the processes in successful schools as part of the overall aim to gain greater insight and understanding.

The enormous wealth of data collected on current learning levels offers schools a unique basis for implementing their own home-grown improvement programmes. The foundation will inform every school of its class-wise and gender-wise performance and identify the critical areas and competencies where learning is not demonstrated, so that action plans can be prepared on the basis of sharp, granular data.

The programme’s ability to attract spontaneous, voluntary participation from schools could drive others to aspire for quality and thus stand up and be accountable. More schools may refuse to let constraints stand in the way of success. This writer, during his visit to Bellary, ran into a school inspector who said that she is moving away from the routine practice of inspection towards interacting with children to assess their learning levels, and observing and contributing to classroom teaching-learning processes. Some states have expressed interest in the Learning Guarantee Programme. The fact that the foundation has documented every aspect of the programme from the concept stage, recording both what went well and what did not, may enable it to offer improved, custom-built options to these states.

 

 

Finally, in this model of external evaluation, integrity is vital. In adopting such a model it would be imperative that the evaluating agency’s credibility and integrity in the public perception be established beyond doubt and reproach.

The foundation is justifiably cautious, as the programme has completed only its first year of evaluation. As it grows and spreads, will schools move beyond the limited vision of qualifying for the learning guarantee awards to providing a more holistic education? Will experts who examine this model find a golden mean for the evaluation process that reduces its costs? These are areas where the programme needs regular review to help maintain its ideals and objectives of learning guarantee while maximizing its potential for expansion beyond Karnataka.

 

* The author can be contacted at giri@azimpremjifoundation.org. For more details of the programme and the results of the first year of school evaluations, please visit the website www.azimpremjifoundation.org
