

Three Things About Three Words: Evaluation, Impact, Scale

We could stop at this title and be bored silly, as each of those words has its own familiar set of images, definitions, and stories that immediately come to mind – even when we use them together. (Maybe, by using them together, I’ve finally stripped them of meaning in the same way yet another product promises “New, Improved, Ultra-Premium!”)

Regardless, as I looked back at this past month’s content on evaluation and how to support that work in the social sector, I didn’t want to leave out a look at what’s working. No need to complicate things: I went straight to Scaling What Works, thankful to GEO for compiling this resource, to find a thought-starter we could use as a jumping-off point.

The “three words” came from starting with evaluation, of course, as the theme for the month, and then adding impact and scale because of the relationships among all three, in many different combinations: one to another, one to all three, and so on.

And, straight to the point, I’d like to offer this paper as a re-post: How do we approach impact and evaluation in the context of scale? This framing places scale as the object, but the paper is also a brief but solid thought-starter on evaluation.

Three things to note when you read it:

  • Evaluation is framed broadly as “any activity that informs learning and drives improvement” (see pages 2–3). As a result, we can find many ways to get off the sidelines when it comes to the persistent and real barriers to full-on technical evaluations, and begin to use the data and resources we have to produce powerful questions and answers, powerful because they’re the most relevant.
  • The intersection of evaluation, impact, and scale helps position evaluation as an organizational discipline, as opposed to just a measurement event.
  • But evaluation isn’t free – not the data, not the time, not the skill. The paper recognizes this, noting (p. 5) that the “best available evidence is preferable to the best possible…”