By Dun Grover, Director of Monitoring, Evaluation and Learning, Transforming Market Systems Activity (TMS) Honduras, ACDI/VOCA
In this blog, Dun highlights some important learning from his project’s efforts to apply more rigorous monitoring methods when tracking systemic change. A key point is that quantitative methods are valuable for tracking systemic change, but they have to be understood in the context of complexity. As Dun explains, there has been pushback against the use of quantitative methods when trying to monitor and learn about systemic change. The valid part of that concern, however, is narrower than it first appears: it relates to how many practitioners perceive such methods as providing absolute answers. For example, traditional approaches have applied such methods to produce absolute yes-or-no answers about project attribution. While we have learned that such ways of thinking are not valid from a systems thinking perspective, we also have to recognize that quantitative methods are important when applied properly. The blog lays out important considerations and insights into how to ensure quantitative methods add value when trying to understand whether and how systems change.
In Cape Town, I had the opportunity to share some of my experiences working with the Honduras Transforming Market Systems Activity and our market system diagnostic – more specifically, how we were attempting to measure several aspects of market systems, including resilience. One of the points that sparked dialogue at our Symposium session was the feasibility and usefulness of quantitative methods (versus qualitative ones) to measure market systems and their attributes.
Since then, we have completed the market systems diagnostic. I encourage you to check out the dashboards and whitepaper at http://cohep.com/sistemasdemercado/. Now that this process is complete, I’ve had the chance to reflect on the discussion from Cape Town. In addition to the interesting results, which you can read about at the link, here are some of the realizations I’ve had:
Quantifying helps inform management decision making, even if those measures are less precise. I think the challenge most people bump into when attempting to quantify complex systems is that it is impossible to do so with the same level of precision that we can, say, measure yields. USAID’s definition of precision states that “data should have a sufficient level of detail to permit management decision-making.” It turns out that we can measure the number of shocks experienced and the pace of recovery across a sample of enterprises in an industry within an acceptable margin of error. Though these may be proxies for systems-level change, they are very meaningful measures that inform adaptive decision-making for the project and its stakeholders. Data analysis brought to light several hidden features of Honduran market systems, such as the impact of certain shocks on enterprise performance, that have shifted activities and generated insights that have prompted qualitative inquiry.
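To make the "acceptable margin of error" idea concrete, here is a minimal sketch of how a confidence interval around a sample mean might be computed. The data are illustrative numbers I invented for this example, not TMS survey results, and the normal approximation is an assumption that holds reasonably for larger samples.

```python
import math
import statistics

# Hypothetical sample: months each surveyed enterprise took to recover
# from its most recent shock (illustrative numbers, not TMS data).
recovery_months = [3, 5, 4, 6, 2, 8, 5, 4, 7, 3, 6, 5]

n = len(recovery_months)
mean = statistics.mean(recovery_months)
sd = statistics.stdev(recovery_months)

# 95% confidence interval using the normal approximation (z = 1.96).
# For small samples a t-distribution critical value would be safer.
margin_of_error = 1.96 * sd / math.sqrt(n)

print(f"mean recovery: {mean:.1f} months ± {margin_of_error:.1f}")
```

A measure like this is less precise than a yield figure, but it is still sufficient for management decisions: it tells you whether recovery is speeding up or slowing down across survey rounds.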
You should focus on the key variables and their directionality of change – you don’t need to explain it all. In statistics, R-squared is the percent of variance explained by the model: 0 percent indicates the model explains none of the variance, and 100 percent indicates it explains all of it. In modeling our data, we found relatively low R-squared values (note: this is not surprising in the social sciences). Despite this, we also found 17 variables that were statistically predictive of systems performance results. These discoveries are potential levers for systems change. Although these quantitative models might not tell you whether pulling these levers will result in an 11%, 32.5% or even a 500% increase in system ‘performance’, we do have a stronger evidence base to inform our decisions around which levers to pull in which directions. Further, we know with a level of confidence that when we do, the system will materially change to produce more of the results we want now and likely into the future.
Cross-validate to avoid overfitting your models and reduce errors in your predictions. In developing a statistical model, you may produce a model that fits your data perfectly but doesn’t generalize to the real world, leading you to make errors in your predictions. This statistical phenomenon is called overfitting. In real life, overfitting is akin to generalizing experiences from one situation to another and mistakenly applying variables that don’t belong in our understanding. In statistical analysis, it is standard practice to use validation methods to detect and remedy such errors. One of the methods we are applying to avoid overfitting is cross-validation. To do this, we are facilitating workshops with a set of enterprises to validate the measures, identify ones which may have been mistakenly included, and, further, identify variables we missed that we should try to measure the next year.
Quantitative reasoning is integral to constructing knowledge of systems. A core feature of systems that doesn’t change is that our knowledge and understanding of systems must always change. Quantitative reasoning is a process and way of thinking that helps us construct knowledge about systems. It involves the collection and reinterpretation of data and subsequent revisions to models and theories based on new lines of evidence. In our market system diagnostic, we intend to adapt, drop, or replace, on an annual basis, indicators that do not prove statistically or materially predictive of target outcomes. The purpose of this process is to continue to improve the precision and fit of our measurement methods to Honduran market systems. Further, by engaging academia and the private sector in this process, we are strengthening a systems mindset oriented towards exploration and discovery among local stakeholders in constructing collective knowledge of Honduran market systems.
We welcome you to contribute to this process of learning and adaptation. If you have recommendations of variables to include in the 2019 diagnostic or methodologies to model the data, please send them to firstname.lastname@example.org or email@example.com. Please include any evidence and sources explaining why your contribution can help us better explain Honduras’s market system performance.
The MSDHub Blog Series is authored by respected implementers and donors of market systems projects globally.