My nonprofits in the community - are you planning a donor survey in the next two months? Here are some examples of how you can ensure that the data does not sit silently in your work folders but actually helps you take meaningful action.

Example 1: Say your survey question is: "How likely are you to continue donating to our organization in the next year?"
● Data says: If 60% of donors say they are "very likely" to continue donating, but 30% are "somewhat likely" and 10% are "unlikely," this indicates a potential drop-off in donor retention.
● Turning that data into action: Focus retention efforts on the "somewhat likely" group. Create a targeted campaign that re-engages these donors by highlighting recent successes, impact stories, or new initiatives they might care about. Additionally, reach out to the "unlikely" group to understand their concerns and see if any issues can be addressed.

Example 2: Say your survey question is: "In which of the following areas do you believe your donation has the most impact?"
● Data says: 50% of respondents say their donation has the most impact on "Education Programs," while only 10% say "Healthcare Initiatives."
● Turning that data into action: Understand the why, then promote the success of and need for your "Healthcare Initiatives" more prominently, aiming to increase donor awareness and support in this underfunded area.

Example 3: Say your survey question is: "What is your primary reason for donating to our organization?"
● Data says: The top reason is "Alignment with my values" (40%), followed by "Transparency in how funds are used" (35%).
● Turning that data into action: Emphasize your organization's values and transparency in all communications. Regularly update donors on how their funds are being used with clear, detailed reports, and align your messaging with the core values that resonate with your donor base.

Example 4: Say your survey question is: "How satisfied are you with the level of communication you receive from our organization?"
● Data says: If 70% of donors are "satisfied," 20% are "neutral," and 10% are "dissatisfied," there's room for improvement in communication.
● Turning that data into action: Listen to the "neutral" and "dissatisfied" groups to pinpoint where communication may be lacking. This could involve increasing the frequency of updates, personalizing communications, or providing more opportunities for donor feedback and engagement.

Sit with the data you collect. Read the numbers. Read the stories. Read the hopes, barriers, and interests of the humans in your data. The best possibility of a survey is to make the humans in that data feel included, and that they belong, by listening to and acting on their perspectives. Co-create change with your community through those surveys. #nonprofits #nonprofitleadership #community #inclusion
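The examples above all follow the same pattern: split respondents into segments, then attach a follow-up action to each segment. A minimal sketch of that workflow, using invented donor IDs and a hypothetical "continue_likelihood" field standing in for Example 1's survey question:

```python
# Illustrative sketch only: field names and responses are invented,
# not taken from any real survey tool's export format.
from collections import defaultdict

responses = [
    {"donor_id": 1, "continue_likelihood": "very likely"},
    {"donor_id": 2, "continue_likelihood": "somewhat likely"},
    {"donor_id": 3, "continue_likelihood": "unlikely"},
    {"donor_id": 4, "continue_likelihood": "very likely"},
]

# Group donors by their answer so each segment gets its own action.
segments = defaultdict(list)
for r in responses:
    segments[r["continue_likelihood"]].append(r["donor_id"])

# "Somewhat likely" donors get the re-engagement campaign;
# "unlikely" donors get personal outreach to hear their concerns.
reengage = segments["somewhat likely"]
outreach = segments["unlikely"]

total = len(responses)
for answer, ids in segments.items():
    print(f"{answer}: {len(ids)/total:.0%} of respondents")
```

The point is not the code itself but the shape of it: every survey question should end in named lists of people tied to a concrete next step, not just a percentage on a slide.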
How Nonprofits Assess Data Value
Summary
Nonprofits assess data value by analyzing which information truly advances their mission rather than just looking impressive in reports. This means moving beyond surface-level numbers to focus on metrics and outcomes that reflect meaningful progress for the people and communities they serve.
- Prioritize mission-driven metrics: Choose data points that show how your programs improve lives or create lasting change, instead of tracking numbers that just fill dashboards.
- Connect and collaborate: Work with staff and partner organizations to share outcome definitions and integrate systems so you can track each person’s journey and measure real impact.
- Act on donor feedback: Use survey responses to tailor your communication, understand concerns, and highlight your values—making sure donors feel included and their perspectives guide your actions.
When "Bad" Data is Actually Good

Looks can be deceptive. Ask the nonprofit that pursued a flashy corporate partner to make their Annual Report look good. The corporate partner turned around and pressured the org to make major changes to its programs to better match the partner's brand, and demanded impossible reporting timelines. The same lesson applies to your data, but in reverse: real progress can be mistaken for decline. Consider the following:

Quality > Quantity
• Survey response rates fell: Did the feedback quality improve because only recipients genuinely invested in the cause responded?
• Number of partnerships decreased: Have you started focusing on value alignment instead of logos?
• Email list shrank: Have you invested hours cleaning it up? Gone are all the ghosts. Is your open rate higher?

Clarity > Complexity
• Dashboard KPIs declined from 20 to 10 metrics: Is clarity your new mantra? Have you gotten off the "just in case" data collection bandwagon?
• Revenue decreased: Is it because you said "No" to restricted funds that came with strings attached?

Depth > Breadth
• "Messy" non-numeric data increased: Kudos! Have you started listening and capturing the stories that reflect the intangible impact your programs are having?
• Event attendance reduced: Is headcount no longer your goal? Have you shifted attention to designing experiences that attract authentic engagement and behavioral change?
• Volunteer sign-ups fell: Has your volunteer retention rate increased? Are volunteer satisfaction and engagement up?
• New initiative delayed: Have you started running test pilots before scaling to ensure a higher probability of impact?

The Bottom Line

I could go on, but you get the drift. The drops above reflect an org's growing maturity when it comes to its mission, staff, volunteers, and data. They are indicators that you are making hard choices over vanity metrics. However, not every decline implies progress - context matters.

What's a new mantra in your org that looks like a step back but is actually growth?
-
Some nonprofits obsess over the wrong numbers. Open rates. Social likes. Event RSVPs. And then wonder why revenue is flat and donors are disappearing.

Here's the truth: not all metrics are momentum. I call them vanity metrics in mission clothes. They look good in a dashboard. But they don't move the mission.

Here's what high-performing organizations track instead:
- Donor retention: Because keeping a donor is cheaper, and more powerful, than chasing a new one.
- Second gift rate: Because a second gift turns interest into belief.
- Lifetime value: Because impact multiplies when donors stay, grow, and refer.
- Cost per dollar raised: Because sustainability matters more than the hype of "big numbers."
- Donor engagement depth: Not how many saw it. How many felt it. Shared it. Acted on it.

Data should serve decisions, not just presentations. The best fundraisers don't just measure what's easy. They measure what matters.

What metrics do you track that move your nonprofit forward?
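The first four metrics in that list can be computed directly from a gift ledger. A hedged sketch, using a made-up list of (donor, date, amount) records and an assumed fundraising-spend figure; real definitions vary by organization (e.g. retention windows, what counts as fundraising cost):

```python
# All data here is invented for illustration; swap in your CRM export.
from collections import Counter
from datetime import date

gifts = [  # (donor_id, gift_date, amount)
    ("a", date(2023, 3, 1), 50.0),
    ("a", date(2024, 2, 10), 75.0),
    ("b", date(2023, 6, 5), 100.0),
    ("c", date(2024, 9, 12), 25.0),
    ("c", date(2024, 11, 2), 25.0),
]

donors_2023 = {d for d, dt, _ in gifts if dt.year == 2023}
donors_2024 = {d for d, dt, _ in gifts if dt.year == 2024}

# Donor retention: share of last year's donors who gave again this year.
retention = len(donors_2023 & donors_2024) / len(donors_2023)

# Second gift rate: share of donors who have ever given more than once.
counts = Counter(d for d, _, _ in gifts)
second_gift_rate = sum(1 for c in counts.values() if c > 1) / len(counts)

# Lifetime value (simplest form): average total giving per donor.
totals = {}
for d, _, amt in gifts:
    totals[d] = totals.get(d, 0.0) + amt
lifetime_value = sum(totals.values()) / len(totals)

# Cost per dollar raised: fundraising spend / funds raised (spend assumed).
fundraising_spend = 60.0
cost_per_dollar = fundraising_spend / sum(totals.values())

print(retention, second_gift_rate, lifetime_value, cost_per_dollar)
```

Engagement depth is the one metric in the list that resists a one-liner: it needs behavioral events (shares, volunteering, advocacy actions), not just the gift ledger.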
-
The smartest investment we made as a nonprofit in 2025? It wasn't fundraising. It was data. 📊

I say this as someone who often warns about measurability bias. But over the past few years, I've become one of the strongest advocates for data-informed decision-making.

2025 was New Roots Institute's first full year with a dedicated R&D department, and it has transformed our fundraising, our strategy, and the quality of our programs.

For a long time, "number of students reached" was our primary metric. That incentivized us to simply reach more people. We could theoretically scale volume, sacrifice program quality, have no strategy around who we were reaching, and still look successful on paper while making limited progress toward ending factory farming.

We now evaluate every session, track the efficacy of our campaigns, and identify which tools, training, and support actually help students succeed as organizers and campaigners. That learning feeds directly back into program design and how we support fellows in real time.

Our work is complex, relational, and long-term. Embracing monitoring, evaluation, and learning hasn't flattened that complexity. It's strengthened our ability to navigate it with nuance.

As more nonprofits take on hard-to-measure challenges, I hope we stop treating R&D as a luxury. It's a commitment to learning, humility, and building organizations that get smarter over time. Is R&D part of your work these days? I'm curious how your organization approaches data.

Our fellows are reaching over 3 million people, shifting dining behaviors, and removing plant-milk upcharges. Explore their impact here: https://hubs.ly/Q03YSvbX0

Grateful for our incredible R&D team Sean Rice, Jiwon Joung, and Nichalus Vali who push us, and our movement, to learn faster and adapt smarter. 💜

#Leadership #Nonprofit #Data #MeasurabilityBias #R&D #Impact #Strategy #Evaluation #MovementBuilding
-
Most nonprofits can tell you how many people they served last year, but have a much harder time telling you whether those people are better off.

I've seen this pattern across the sector for years. Organizations collect plenty of data: intake numbers, service hours, referral counts, and demographic breakdowns. It gets reported to funders, and that's usually where it ends. But when you ask whether those services actually stabilized housing, reduced ER visits, or kept a family together, the data to answer those questions can't be found, because everyone tracks their own piece, in their own system, with no way to connect it all.

When I think about what a client's journey actually looks like, they might access employment support from one organization, transitional housing from another, and mental health services from a third. Each provider records its own outputs, but rarely can any of them see the full picture or tell you whether the combination of those services produced a lasting result.

In the social sector, we are very good at counting activities, but we struggle to measure change. Moving from tracking transactions to tracking trajectories is what matters. Not just "we served this person," but "this person moved from crisis to stability over twelve months, and here's what that path looked like." That starts with integrated systems, shared outcome definitions across programs and services, and the analytical capacity to turn data into evidence. If boards, funders, and executive directors can work from the same understanding, then all of the data being collected can be used to determine what's actually helping people and what isn't.
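In data terms, "transactions to trajectories" means merging per-provider event records into one ordered timeline per client. A minimal sketch under a big assumption, namely that the three systems already share a client identifier; in practice, establishing shared IDs, consent, and common outcome definitions is the hard part that precedes any code like this. All records and status labels below are invented:

```python
# Hypothetical records from three separate provider systems,
# each as (client_id, event_date, note). Data is invented.
from datetime import date

employment = [("c1", date(2024, 1, 15), "job training enrolled")]
housing = [("c1", date(2024, 3, 2), "transitional housing placed"),
           ("c1", date(2024, 12, 20), "stable housing, 9 months")]
health = [("c1", date(2024, 2, 10), "counseling intake")]


def trajectory(client_id):
    """All events for one client, across systems, in time order."""
    events = [(d, source, note)
              for source, rows in [("employment", employment),
                                   ("housing", housing),
                                   ("health", health)]
              for cid, d, note in rows if cid == client_id]
    return sorted(events)  # tuples sort by date first


for when, source, note in trajectory("c1"):
    print(when.isoformat(), source, note)
```

Even this toy version makes the sector's gap visible: no single provider's list tells the crisis-to-stability story, but the merged, date-ordered view does.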