Why Nonprofit Leaders Need Habits for Measuring Performance
This post originally appeared on the American Evaluation Association's blog.
Over the last two years I worked as the Data and Evaluation Manager at the San Francisco Child Abuse Prevention Center (SFCAPC), a mid-size nonprofit focused on ending child abuse in San Francisco. This was a fantastic learning experience — I worked with 50+ staff members ranging from policy advocates to social workers, helping them use our data to serve clients better.
About two months into this job, I read a book that changed how I approach this work entirely:
I believe great people to be those who know how they got to where they are, and what they need to do to go where they’re going. They go to work on their lives, not just in their lives… They compare what they’ve done with what they intend to do. And when there’s disparity between the two, they don’t wait very long to make up the difference. — Michael E. Gerber, The E-Myth Revisited
Reading Gerber’s book convinced me of the importance of systems and habits in helping people succeed at their jobs. This particular nonprofit had a database that held client information — Efforts to Outcomes — but few habits for improving that system or using its data to serve our clients better. So, I set out to create habits for how we do our “data” work.
I read through books, blogs, and websites, and talked to mentors and friends to get a sense of what other organizations do. I found many resources that shaped my approach to creating these new “data habits”. Some of the best are:
The books, whitepapers, and newsletter from the Leap of Reason Institute — The free e-books on this site by Mario Morino and David Hunter, and the institute’s recent whitepaper, The Performance Imperative, are the resources I recommend most often to others.
Sheri Chaney Jones’ book Impact and Excellence — Jones’ book contains many useful strategies from her consulting practice for helping clients generate insights from their data when resources are scarce.
The Root Cause Blog — Root Cause is another consulting firm that supports nonprofits in creating effective data and evaluation habits; they share some of their tricks on their blog.
The Data Analysts for Social Good professional association — Members get access to 25+ webinars on topics ranging from justifying the return on investment of data analysis to introductory analytical techniques in Excel, R, and other platforms.
After reading these, I was convinced that developing regular habits would be critical to my organization’s ability to use its data to improve the lives of clients. But exactly what these habits should be remained unclear.
So with the support of our programs staff, I began experimenting. Over the last 18 months, we arrived at six specific habits that helped me provide consistent value to our staff.
Keep a reporting calendar: Organizations are often required to submit detailed program participant and activity uploads for certain government contracts. I created a comprehensive calendar of when to submit these uploads and who needed to review the data before it was uploaded.
Define data integrity controls: Data integrity controls minimize the risk that information in a database is incorrect. For example, we scrubbed our database of test data monthly and audited a sample of new families to verify the accuracy of data entry each quarter. We summarized data integrity controls in a spreadsheet outlining each procedure, information source, performer, reviewer, and results.
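In practice these controls lived in our database and a tracking spreadsheet, but the logic is simple enough to sketch. The snippet below is a hypothetical illustration (the record fields and the "test" naming convention are assumptions, not our actual schema): a monthly scrub that drops obvious test records, and a quarterly audit that samples records for manual verification against source files.

```python
import random

# Hypothetical client records; in practice these lived in our
# Efforts to Outcomes database, not a Python list.
records = [
    {"id": 1, "family": "Test Family", "intake_date": "2015-03-02"},
    {"id": 2, "family": "Garcia", "intake_date": "2015-03-05"},
    {"id": 3, "family": "zz test", "intake_date": "2015-03-09"},
    {"id": 4, "family": "Nguyen", "intake_date": "2015-03-11"},
]

def scrub_test_data(rows):
    """Monthly control: drop records whose name flags them as test data."""
    return [r for r in rows if "test" not in r["family"].lower()]

def audit_sample(rows, k, seed=0):
    """Quarterly control: pick k random records to verify by hand
    against the original intake paperwork."""
    rng = random.Random(seed)  # fixed seed so the audit list is reproducible
    return rng.sample(rows, min(k, len(rows)))

clean = scrub_test_data(records)
sample = audit_sample(clean, k=2)
```

Each run of a control, along with who performed it, who reviewed it, and what it found, would then be logged in the summary spreadsheet described above.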
Review dashboards: The first week of each month, I sent a performance dashboard to each of the program managers. Managers discussed these metrics with their teams. Then they shared explanations for variances, and any action items they were going to take, at the managers’ meeting the following week.
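The variance review can be sketched in a few lines. This is a hypothetical illustration (the metric names, targets, and 10% threshold are made up for the example): flag any metric far enough off target that a manager should come prepared to explain it.

```python
# Hypothetical monthly dashboard rows: metric, target, actual.
metrics = [
    {"metric": "New families served", "target": 40, "actual": 33},
    {"metric": "Home visits completed", "target": 120, "actual": 131},
]

def flag_variances(rows, threshold=0.10):
    """Flag metrics more than `threshold` (e.g. 10%) off target,
    for discussion at the following managers' meeting."""
    flagged = []
    for r in rows:
        variance = (r["actual"] - r["target"]) / r["target"]
        if abs(variance) > threshold:
            flagged.append((r["metric"], variance))
    return flagged

to_discuss = flag_variances(metrics)
```

The point of the habit is less the arithmetic than the cadence: the same flags arrive the first week of every month, so explanations become routine rather than ad hoc.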
Schedule time for troubleshooting and report development: To build staff buy-in I needed to be responsive to database troubleshooting and report development needs. I tracked time for these tasks and blocked out time for them weekly. In an average week, I spent 2–10 hours troubleshooting and training, and 5–15 hours developing self-service reports staff could use to access program data themselves.
Automate annual development data pulls: A significant “data” responsibility was pulling data for our development team, including demographics and unduplicated client counts. Working with our development team, I developed a self-service report designed to answer 80% of their “stats” asks, saving everyone time.
Have a data analysis process: Stakeholders across our organization came to me with many good questions to explore in our data which I just didn’t have time to answer. I created a master tracker of these questions and set aside several weeks each year to explore those that were most critical. This “Annual Data Analysis” process set expectations and created a process to focus our limited analysis time.
These habits helped us save time, set expectations, and create lasting systems. They’re far from perfect, but they were a huge step forward for us. Now that I’m no longer working for this organization, I realize these habits not only helped organize the work but also enabled continuity between me and my successor.
If you’ve read this and are working in the social sector, I’d love to know — What are the data & evaluation habits you’ve found valuable at your organization?