
Monday, 3 December 2012

Measuring Impact

by Siddhi Mankud

Siddhi works with Catalyst Management Services (CMS), the firm that will conduct an impact evaluation for Phase 1 of this program.

Impact evaluations are used to measure the change in an outcome that is attributable to a project. This is done by measuring: (a) before-after: the difference in the variable being measured between the beginning and the end of the project; (b) treatment-control: the difference in the variable being measured between a group that receives the project intervention and a group that does not. This second difference gives the impact that is attributable to the project.
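To make the two comparisons concrete, here is a small illustrative sketch (not part of the evaluation itself; all numbers are invented) of how the before-after and treatment-control differences combine, using a hypothetical mean test score:

```python
# Hypothetical mean test scores, purely for illustration.
treatment_before, treatment_after = 42.0, 58.0  # group receiving the intervention
control_before, control_after = 41.0, 49.0      # comparable group without it

# (a) Before-after: change within the treatment group over the project period.
before_after = treatment_after - treatment_before        # 16.0

# (b) Treatment-control: gap between the two groups at the end of the project.
treatment_control = treatment_after - control_after      # 9.0

# Combining both comparisons nets out the change the treatment group
# would likely have seen even without the project:
impact = (treatment_after - treatment_before) - (control_after - control_before)
print(impact)  # 8.0
```

Here the treatment group improved by 16 points, but the control group also improved by 8, so only 8 points of the gain are attributable to the intervention.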

Important to the IE design is the random selection of the sample under study, so that sampling biases do not skew the findings.

IE for the Pratham-Vodafone WebBox
The Pratham-Vodafone WebBox project primarily seeks to improve the learning outcomes of students by providing teachers with technology that facilitates and improves lesson planning and delivery. The first phase of the initiative is being tested in low-cost English-medium schools where students and teachers have limited access to technology, and to modern methods of lesson delivery such as activity-based learning, which makes learning enjoyable and improves comprehension and retention. Catalyst Management Services (CMS), a research and consulting firm in Bangalore, has designed and is implementing the IE.
In its Phase 1, the project covers 5 locations, 142 schools and 8 learning centres, across three academic years, for 6th and 7th Standard mathematics and science. Pratham and the Vodafone Foundation were interested in an Impact Evaluation so that they could understand how the WebBox technology they developed can impact learning outcomes, with a view to scale-up.
However, due to limited funding, the IE has been designed to cover 110 schools across 4 locations of the project’s first phase.
How to control?
When the team started to design the evaluation and identify schools for treatment (where the project is implemented) and control (which receive no project inputs), they realised that getting control schools that do not receive the WebBox intervention would not be possible: why would schools that receive no project benefit agree to be a part of the study?
In such instances a staggered approach is often employed, where some schools get the intervention in the first year, some in the second, some in the third, and so on. Since Phase 1 of the Pratham initiative signed up all of its schools at the beginning, the staggered approach was not feasible.
The team therefore designed an approach where half of the schools (55 in number, divided proportionately across the centres) receive the WebBox for Math and Science in the 6th Standard, and half receive it in the 7th Standard. The IE study covers both Standards - so in schools where the WebBox is given for the 6th Standard, the 7th Standard becomes the control group (not receiving the intervention), and vice-versa for schools where the WebBox is given for the 7th Standard.

Randomisation

Randomisation for the IE has been taken care of in the process of signing up the schools. At the beginning of its intervention, Pratham did not have a definite list of the schools that would be included in the programme. A school they visited, and to which they explained the intervention, could choose whether or not to be a part of it. With no universe of schools available to randomise over, to decide which schools should get the WebBox in the 6th Standard and which in the 7th, a different approach had to be taken. Since there was no systematic order in which schools were approached (which could create a bias), CMS suggested that the first school approached in a centre gets the WebBox for the 6th Standard, the next for the 7th Standard, the third for the 6th Standard, and so on. Every alternate school approached would therefore get the WebBox for the 6th and 7th Standard respectively.
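The alternating rule described above can be sketched as follows. This is a hypothetical illustration of the assignment logic only; the school names are invented:

```python
# Assignment rule: alternate by order of approach within a centre.
# Schools at even positions (0, 2, ...) get the WebBox in the 6th Standard;
# schools at odd positions get it in the 7th Standard. In each school, the
# standard that does not receive the WebBox serves as the control group.

def assign_arm(order_approached: int) -> str:
    """Return the Standard that receives the WebBox, by order of approach."""
    return "6th Standard" if order_approached % 2 == 0 else "7th Standard"

# Invented example: four schools in the order they were approached.
schools_in_order = ["School A", "School B", "School C", "School D"]
assignments = {s: assign_arm(i) for i, s in enumerate(schools_in_order)}
print(assignments)
# {'School A': '6th Standard', 'School B': '7th Standard',
#  'School C': '6th Standard', 'School D': '7th Standard'}
```

Because the order in which schools were approached was not deliberately chosen, alternating down that order approximates a random split between the two arms.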
The tool-box
The IE uses a mixed methodology of quantitative and qualitative methods. The design is quantitative-dominant, sequential and embedded, meaning the quantitative method is the primary method of investigation. The qualitative method will be embedded into the quantitative method, with its areas of inquiry emerging from the quantitative findings. It will therefore be implemented after the data collection and analysis of the quantitative component.
For the quantitative component, CMS has designed (a) tools to collect student, teacher and school profiles, against which learning outcomes will be analysed; and (b) competency learning tests for Maths, Science and English that enable standardised measurement, to be administered at the beginning and end of each academic year. In addition, the quantitative component will use data from school tests (semi-annual/annual), from attendance records, and from the usage data in the WebBox's ERP.

For the qualitative component, CMS will use group interviews and focus group discussions with students and teachers, and will also conduct key informant interviews with school principals and some parents.

The phasing of the IE for Phase 1
Given the timelines of the programme roll-out, the IE has been planned in two phases. The prototype in 2012-13 will develop the tools and test them in 5 selected schools in each of three locations; this will enable testing of the protocols as well as give some insight into how teachers and students initially perceive the project. The second phase will then comprise the Year 1 (2013-14) and Year 2 (2014-15) studies, which will cover the full academic years and provide the data for the IE.
CMS will share its findings with the Pratham-Vodafone Foundation team towards the end of each academic year as all the data collected gets analysed.