Why the commercial security frameworks we’ve inherited are necessary, useful, and not enough, and what they keep missing about the work we actually do.
On Monday we opened this series by highlighting that donors are not customers in the typical B2C sense. Almost every senior security person in our industry came up in commercial software. That is a strength. The frameworks we’ve inherited (SOC 2, ISO 27001, NIST CSF, and the standard set of vendor risk and contractual data protection instruments) are durable, well-tested, and correctly skeptical of the kind of magical thinking that often shows up around security in mission-driven organizations. We rely on them, and we should.
But they were not designed for the work we do. They were designed for B2B contexts in which the controller is a buyer with negotiating power and the data subject is, in most cases, a paying customer of that buyer. The mental model behind them is a triangle: vendor, customer, customer’s customer. The customer can fire the vendor. The customer’s customer can fire the customer. Market discipline operates at every joint.
Our triangle does not work that way. The customer, a nonprofit, often cannot fire us in any practical sense; the switching cost is too high. The customer’s customer, a donor, was never in a contractual relationship with us in the first place. And the data subject behind the donor’s record may be a beneficiary who didn’t even know her data was in a system. That mismatch shows up in three specific places.
1. Risk is modeled in dollars, not in trust.
Commercial frameworks model the cost of a breach mostly in dollars: regulatory fines, breach notification costs, customer churn, brand damage that translates to lost ARR. Those are real costs. They are not the most important costs in our sector. The most important cost in our sector is the long, slow erosion of donor confidence, both in a specific organization and in the act of giving itself. That cost does not show up cleanly on a CFO’s spreadsheet, and it doesn’t get priced into the kind of risk register a SOC 2 auditor reviews.
2. Consent is implied where it shouldn’t be.
Commercial frameworks largely assume that the data subject opted into the relationship β checked a box, signed a TOS, downloaded the app. That assumption is shaky in our sector. A donor opts into a relationship with a nonprofit, not with the seven vendor systems her record will pass through. A beneficiary served by a nonprofit may never have opted into anything; her file exists because she received care. The privacy posture appropriate to a customer who clicked ‘I agree’ is not the privacy posture appropriate to a beneficiary who did not.
3. Incident response is written for customers, not for donors.
Most IR plans we’ve reviewed in this industry are oriented around customer notification: when, how, and with what messaging the nonprofit organization will be told about a breach. Donor and beneficiary notification is treated as the customer’s responsibility. Legally, that may be defensible. Operationally, it’s a rough fit, because the nonprofit often has neither the security context to interpret what happened nor the staff bandwidth to draft communication on a tight timeline. The result, in too many cases, is that donor notification is slower, less specific, and less useful than it should be, not because anyone meant for it to be, but because the framework didn’t ask anyone to design for it.
SOC 2 doesn’t ask whether your incident response plan is written for the donor. We probably should.
What we’d add, not what we’d remove
We don’t think the inherited frameworks should be discarded. They’re load-bearing, and they were hard-won. We think they should be supplemented with a small set of sector-specific commitments that explicitly account for the parts of our work the commercial frame doesn’t see. A short, opinionated list to start the conversation:
- A trust-impact assessment for new features that touch donor or beneficiary data, written in plain English, intended to be readable by a board chair and an executive director, not a security engineer.
- Default retention windows that reflect philanthropic time horizons (lifetime giving, planned gifts, multigenerational relationships) without quietly turning into indefinite retention.
- Beneficiary-aware threat modeling: explicit consideration of harm models for the people whose records exist in the system because they received services, not because they opted in.
- Donor-facing incident communication templates pre-drafted with the nonprofit, so day-of communication is faster and more useful than the legal minimum.
- Public commitments around data minimization and downstream sharing, written at a level a sophisticated donor could read and understand.
None of these are revolutionary. None of them requires a new regulator to write them into rule. They require us to decide, as an industry, that we want them.
Next up: a deeper dive into security and privacy standards and how they apply to the sector.