Vision Centres (VCs) are permanent, stand-alone, primary eye-care facilities staffed by a vision technician or an optometrist. They are located strategically to maximise access and affordability for underserved communities by offering services at nominal rates. The range of primary eye-care services provided includes promotive, preventive, diagnostic, and referral services.1
In many low- and middle-income countries (LMICs), VCs are increasingly seen as a catalyst for achieving equity in eye care and realising universal health coverage. Mission for Vision, in its effort to eliminate avoidable blindness and narrow the healthcare utilisation gap, has embarked on establishing 500 such centres in India under its flagship programme Mission Jyot.3 Evaluating a healthcare intervention such as a VC is essential to collect evidence about the efficacy of a programme, identify ways to improve practice, justify the use of resources, and detect unexpected outcomes.
Different models for evaluating a vision centre
Evaluation is a key feature of evidence-based healthcare practice and service delivery. It assesses the efficiency and effectiveness of a service and whether Key Performance Indicators (KPIs) are being met. Several evaluation models are available for use in the healthcare setting; however, the underpinning values of accountability and transparency are common to all of them. Let us look at the more commonly used models of evaluation in healthcare.
1. Logic model
The Logic Model is a linear model that can be used both for planning and for evaluating a VC project. Its uncomplicated, user-friendly design makes it an attractive choice. It consists of four basic components (inputs, activities, outputs and outcomes) and can be expanded by adding feedback loops or extra tiers if a more complex analysis is required.4,5
For example, the logic model may posit that the VC technician will use computer management information system (CMIS) software to document patient interactions during clinic visits, while evaluation data may show that they are unable to do so effectively because they do not understand how to use the new software, thus highlighting a need for additional provider-specific training.
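As an illustrative sketch of how the four Logic Model components lend themselves to a planned-versus-observed comparison, a VC evaluation could be organised as simple data structures. All field names and figures below are hypothetical, not drawn from the article:

```python
# Hypothetical sketch: a Logic Model for a vision centre (VC) as plain data,
# so planned outputs can be compared against evaluation data.
logic_model = {
    "inputs":     ["vision technician", "CMIS software", "refraction equipment"],
    "activities": ["eye examinations", "refraction", "patient counselling"],
    "outputs":    {"patients_examined_per_month": 400, "cmis_records_per_month": 400},
    "outcomes":   ["reduced uncorrected refractive error", "timely cataract referrals"],
}

# Evaluation data gathered at the VC (illustrative numbers).
observed_outputs = {"patients_examined_per_month": 380, "cmis_records_per_month": 120}

def output_gaps(planned, observed):
    """Return outputs where observed performance falls short of the plan,
    as {indicator: (planned, observed)}."""
    return {k: (v, observed.get(k, 0)) for k, v in planned.items()
            if observed.get(k, 0) < v}

print(output_gaps(logic_model["outputs"], observed_outputs))
```

A large shortfall in CMIS records relative to patients examined would flag the software-training need described above, which is the kind of gap this component-by-component comparison is meant to surface.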
2. Donabedian model
The Donabedian model is a conceptual model that provides a framework for examining health services and evaluating the quality of healthcare.6 This model assesses the quality of the service provided at a VC based on three main categories: (a) structure (building, staff), (b) processes (patient pathways, referral patterns) and (c) outcomes (quality of life, morbidity, waiting times). The model has been criticised for not taking patient and cultural factors, such as socio-economic status and health beliefs, into account.
For example, evaluation of the structure component can highlight whether the number of certified technicians and the equipment available at a VC are as per the guidelines. Process evaluation could pick up deficiencies in services such as patient counselling, refraction, prescription of spectacles, and identification of cataracts and other eye anomalies (glaucoma, diabetic retinopathy). Outcome evaluation can pick up instances of patients whose vision deteriorated further after cataract surgery, paving the way to kickstart the process of understanding why.
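The structure/process/outcome comparison against guidelines can be sketched as a simple checklist. The categories follow the Donabedian model, but every indicator name, threshold and observed value below is an illustrative assumption rather than an actual VC guideline:

```python
# Hypothetical sketch: a Donabedian-style check of a VC against guideline
# minimums, grouped by the model's three categories.
GUIDELINES = {
    "structure": {"certified_technicians": 1, "slit_lamps": 1},
    "process":   {"refractions_per_day": 15, "counselling_rate_pct": 90},
    "outcome":   {"spectacle_compliance_pct": 70},
}

observed = {
    "structure": {"certified_technicians": 1, "slit_lamps": 0},
    "process":   {"refractions_per_day": 18, "counselling_rate_pct": 72},
    "outcome":   {"spectacle_compliance_pct": 65},
}

def deficiencies(guidelines, observed):
    """List (category, indicator) pairs where the VC falls below guideline."""
    return [(cat, ind)
            for cat, inds in guidelines.items()
            for ind, minimum in inds.items()
            if observed.get(cat, {}).get(ind, 0) < minimum]

for cat, ind in deficiencies(GUIDELINES, observed):
    print(f"{cat}: {ind} below guideline")
```

Grouping indicators by category keeps the evaluation report aligned with the model, so a reviewer can see at a glance whether shortfalls cluster in structure, process or outcomes.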
3. Context Inputs Processes and Products (CIPP) model
This is a thorough, process-based evaluation model that can be applied at any stage of programme development, i.e., planning a new VC, assessing a mature VC, or evaluating a novel intervention within the VC framework after its completion.7 CIPP is a decision-focused approach to evaluation and emphasises the systematic provision of information for programme management and operation. However, it requires careful planning, as multiple data collection methods are often needed to carry out a comprehensive CIPP evaluation.
For instance, the CIPP model allows evaluators to ask formative questions at the beginning of the VC programme, and later guides evaluation of the VC's impact through summative questions on all aspects of the programme.
Context: What needs to be done vs. Were important needs addressed? – Identifying the need for a vision centre in a geographical area.
Input: How should it be done vs. Was a defensible design employed? – Establishment of the vision centre, selection and placement of the required staff, standard operating protocols/guidelines.
Process: Is it being done vs. Was the design well executed? – Is the centre operating according to the guidelines and set quality measures? Is staff capacity sufficient to carry out activities? How many people are undergoing examination, counselling and treatment?
Product: Is it succeeding vs. Did the effort succeed? – Advised versus followed: spectacle purchases, hospital visits for further diagnosis and/or surgical intervention, visual outcomes after treatment, and satisfaction levels of patients who underwent treatment.
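The formative/summative pairing above can be captured as a small question matrix. The question wording follows the list above; the data structure and function are an illustrative sketch, not part of the CIPP literature:

```python
# Hypothetical sketch: CIPP's four components, each pairing a formative
# (planning-stage) question with a summative (end-of-programme) question.
CIPP = {
    "context": ("What needs to be done?", "Were important needs addressed?"),
    "input":   ("How should it be done?", "Was a defensible design employed?"),
    "process": ("Is it being done?",      "Was the design well executed?"),
    "product": ("Is it succeeding?",      "Did the effort succeed?"),
}

def questions(stage):
    """Return the question for each CIPP component at a given stage.

    'formative' questions guide planning a new VC; 'summative' questions
    guide the impact evaluation after completion.
    """
    index = 0 if stage == "formative" else 1
    return {component: pair[index] for component, pair in CIPP.items()}

print(questions("formative")["context"])   # What needs to be done?
print(questions("summative")["product"])   # Did the effort succeed?
```

Keeping both question sets in one structure mirrors how CIPP is applied twice over a programme's life: once prospectively when planning the VC, and once retrospectively when judging its impact.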
While evaluation comes in many shapes and sizes, its key purpose is to help us develop a deeper understanding of how best to improve healthcare. Whichever model is chosen, the underpinning values of accountability and transparency remain common to all.
References
Misra V, Vashist P, Malhotra S, et al. Models for primary eye care services in India. Indian J Community Med. 2015;40(2):79-84.
Khanna RC, Sabherwal S, Sil A, et al. Primary eye care in India – The vision center model. Indian J Ophthalmol. 2020;68(2):333-339.
Millar A, Simeone RS, Carnevale JT. Logic models: a systems tool for performance management. Evaluation and Program Planning. 2001;24:73-81.
Centers for Disease Control and Prevention. Evaluation Guide: Developing and Using a Logic Model. Department of Health and Human Services Centers for Disease Control and Prevention National Center for Chronic Disease Prevention and Health Promotion. USA. Available from: https://www.cdc.gov/dhdsp/docs/logic_model.pdf [Last accessed 04 October 2021].
McDonald KM, Sundaram V, Bravata DM, et al. Closing the Quality Gap: A Critical Analysis of Quality Improvement Strategies. 2007; 7: Care Coordination. Rockville (MD): Agency for Healthcare Research and Quality (US); 2007 June.