Confessions of a Developer Surrendering to AI: What Nonprofits Really Deserve

Oct 4, 2025


After 23 years of building software, I'm reinventing myself. It's not just how I work as a professional - it's also how I see platforms interacting with nonprofits.

I need to be honest with you: I'm torn.

I've been writing code for 23 years. It's not just what I do - it's what I've considered "my product". Every feature I've ever built, every database I've architected, every bug I've fixed at 2am - that's been my value. My identity. My proof that I matter.

And now I'm watching AI agents write better code, faster, than I ever could.

This article is supposed to be about how AI can transform nonprofit fundraising. And it is. But it's also about something more uncomfortable: a 23-year veteran developer confronting obsolescence.

The Uncomfortable Truth About Platform Economics

Let me start by clearing up a fantasy that I perpetuated in my first draft of this article - the fantasy of the "free" platform. Platforms cost money to run, and they need to generate value for stakeholders.

Platforms are not nonprofits. They can't be. Someone has to pay for servers, maintain code, provide support, and yes, earn a living or provide returns to investors. There's no shame in that. It's reality.

The problem isn't that platforms charge. The problem is that we've built an entire industry on clever marketing that makes nonprofits feel like they're getting something free while we extract value in ways they don't fully understand.

And then - this is the part that burns me up - we pass that frustration directly on to their donors.

We've Lost Our Way

Here's how it works in most of the fundraising platform industry:

  1. Platform markets itself as "free" or with "low fees"

  2. Platform adds tip screens, upsells, hidden fee padding, and feature gates

  3. Nonprofit thinks they're getting a deal

  4. Donors encounter friction, pressure, and confusion

  5. Platform extracts maximum value while nonprofit bears the reputational cost

At every point where a donor should feel proud to support a cause they care about, we've instead made them feel:

  • Suspicious ("Is this tip really going to cover fees, or is it profit?")

  • Pressured ("I'm already donating, why am I being guilted into paying more?")

  • Confused ("Wait, why do I need to pay extra for them to use a CRM?")

  • Nickel-and-dimed ("Another fee? Really?")

We took what should be a moment of joy - someone giving to something they believe in - and turned it into a moment of friction and doubt.

That's not a bug. That's the business model. And it's wrong.

Building for Nonprofits is a Calling

I need to be clear about something: I didn't build Sapling because I saw a market opportunity. I built it because I'm experimenting with my own personal pivot into ultra-focused, AI-first development.

Working with nonprofits isn't like working with other clients. These organizations are doing impossible work with insufficient resources. They're feeding hungry kids, housing the homeless, protecting the environment, caring for the sick, defending the vulnerable.

When you build technology for these organizations, you're not just providing a service. You're either helping them fulfill their mission, or you're extracting resources that could have gone to that mission.

So yes, platforms need to charge. But we need to charge honestly. We need to charge for what it actually costs to provide the service, not for what we can extract through clever UX psychology and marketing obfuscation.

The Real Cost of Running a Platform

Let me show you what Sapling actually costs to operate. Not projections or estimates, real numbers:

Infrastructure & Services (Monthly):

  • Cloud hosting (AWS/DigitalOcean) = ~$400

  • Email service (SendGrid) = ~$200

  • Video hosting (Mux) = ~$300

  • AI content generation (Google Veo, Imagen) = ~$150

  • AI coding platforms (Claude, OpenAI) = ~$100

  • Database management and backups = ~$100

  • Security monitoring (Rollbar, security audits) = ~$150

  • SSL certificates and domain management = ~$50

  • Total: ~$1,450/month or $17,400/year

Human Resources:

  • Developer time (my time - 1 senior developer) = ~$120K/year value

  • Security oversight and architecture = ~$20K/year value

  • Customer support (part-time) = ~$30K/year

  • Total: ~$170K/year

Grand total: ~$187,400/year to operate (as a startup)

Now, let's say Sapling processes $5 million in donations annually (a reasonable goal for a growing platform). To break even, I need to collect approximately 3.75% in platform fees.
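
As a sanity check, the break-even math is just annual operating cost divided by donations processed. Here's a minimal back-of-the-envelope sketch using the figures above (my own numbers, not industry benchmarks):

```php
<?php
// Back-of-the-envelope break-even check, using the cost figures above.
$annualCosts = 187400;       // ~$187,400/year to operate (infrastructure + people)
$annualDonations = 5000000;  // $5 million in donations processed per year (the stated goal)

$breakEvenRate = $annualCosts / $annualDonations;
printf("Break-even platform fee: %.2f%%\n", $breakEvenRate * 100); // prints ~3.75%
```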

But here's the thing: breaking even isn't enough. I need buffer for growth, unexpected costs, security incidents, and yes—I need to eat and pay my mortgage.

So we set our platform fee at 5%.

Not 5% plus tips. Not 5% plus CRM upsells. Not 5% plus hidden fees. Just 5%. Transparent and honest.

But What About Credit Card Processing?

Here's where it gets important: credit card processing is not our fee.

When Stripe charges 2.9% + $0.30, that money goes to Stripe, Visa, Mastercard, and the banking infrastructure that makes electronic payments possible. That's not our revenue. That's not our margin.

Our 5% platform fee is in addition to actual credit card processing costs, which we pass through at cost: 2.9% + $0.30.

Total cost to the nonprofit: 7.9% + $0.30 per transaction

That's higher than some platforms advertise. But it's honest. And here's the crucial difference: your donors aren't paying an arbitrary tip. They're sponsoring real costs.
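
To make that concrete, here's a minimal sketch of how a $100 donation splits under this full-cost model (illustrative only; actual processing rates can vary by card type and country, and the launch pricing in the note below is capped lower):

```php
<?php
// Illustrative split of a $100 donation under the full model:
// 5% platform fee plus processing passed through at cost (2.9% + $0.30).
$donation = 100.00;
$processing  = $donation * 0.029 + 0.30;  // $3.20 to Stripe and the card networks
$platformFee = $donation * 0.05;          // $5.00 to the platform
$netToNonprofit = $donation - $processing - $platformFee;

printf("Processing: $%.2f, Platform: $%.2f, Nonprofit receives: $%.2f\n",
    $processing, $platformFee, $netToNonprofit); // $3.20, $5.00, $91.80
```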

NOTE: At launch, Sapling is locked in at 5% total per transaction, credit card processing included. Our goal is to determine real costs over time and only increase as needed.

Donors as Sponsors

This is the reframe that changes everything.

When you add a tip screen that says "Help us keep the platform free!" you're:

  1. Creating friction (donors feel guilted)

  2. Obscuring costs (nobody knows where the money really goes)

  3. Treating donors like marks who can be psychologically manipulated

What if instead we were honest?

Old model: "Donate $100! [Screen appears] Add 15% to help us cover fees?"

  • Donor feels tricked

  • Unclear where money goes

  • Emotional manipulation

  • Total donation: $115 (if they don't opt out in confusion)

Sapling model: "Your $100 donation supports [Cause]. The actual cost (at launch) is 5% ($5) for platform infrastructure and credit card processing. You can choose to sponsor these costs so 100% of your intended donation reaches [Cause], or the nonprofit will cover them."

  • Donor understands exactly what's happening

  • Clear breakdown of costs

  • No manipulation, just transparency

  • Donor becomes a sponsor of infrastructure, not a mark

When donors understand what they're paying for, they don't resent it. They feel proud that they're not just supporting the cause—they're supporting the infrastructure that makes the cause sustainable.

That's the difference between a tip and sponsorship. Tips feel optional and guilt-inducing. Sponsorships feel meaningful and transparent.
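
In code, that sponsorship choice is a simple, visible branch rather than a buried dark pattern. Here's a minimal sketch, assuming the 5% launch pricing described above; the function and field names are hypothetical, not Sapling's actual checkout code:

```php
<?php
// Hypothetical sketch of the donor-sponsorship choice at launch pricing (5% total).
// Names are illustrative only; this is not Sapling's real checkout code.
function donationSplit(float $intended, bool $donorSponsors): array
{
    // Costs are 5% of the intended donation, per the checkout copy above.
    $costs = round($intended * 0.05, 2);
    return $donorSponsors
        ? ['donor_pays' => $intended + $costs, 'nonprofit_receives' => $intended]
        : ['donor_pays' => $intended,          'nonprofit_receives' => $intended - $costs];
}

print_r(donationSplit(100.00, true));  // donor pays $105.00, nonprofit receives $100.00
print_r(donationSplit(100.00, false)); // donor pays $100.00, nonprofit receives $95.00
```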

My Fear: Do I Still Have Value?

Here's where I need to be brutally honest about what AI is doing to my profession and my sense of self-worth.

For 23 years, my value was in my ability to write code. I could see a feature request, architect a solution, write clean implementations, catch edge cases, optimize performance. I was good at this.

Now I watch Claude write complete PHP files with proper security patterns, database isolation, and error handling in minutes. I watch AI generate test suites, documentation, and even catch bugs I would have missed.

The code is often better than what I would write. Faster, more consistent, more thoroughly commented.

So what's my value?

I'm still figuring that out. But here's what I'm learning:

AI can write code. I still need to:

  • Decide what to build - Vision still requires human judgment about what problems matter

  • Design security architecture - AI can implement patterns, but I have to decide what "secure enough" means for nonprofits handling sensitive donor data

  • Make ethical calls - Should we collect this data? How long should we keep it? Who gets access? These aren't technical questions

  • Take responsibility - When something breaks, someone with a conscience has to fix it and answer to the people affected

  • Maintain relationships - Nonprofits need to trust a human, not a chatbot, with their fundraising infrastructure

  • Understand context - I know what it's like to frantically set up a campaign before a gala. AI doesn't.

Maybe I'm not a developer anymore. Maybe I'm an architect, a security designer, a product visionary, and a technical ethicist who happens to use AI as a tool.

That's terrifying. Because it means my identity has to change. But it's also liberating. Because maybe I can finally focus on what actually matters instead of spending 80% of my time on implementation details.

What AI Actually Enables (And What It Doesn't)

Let me be specific about what AI agentic coding has done for Sapling:

What AI Built:

  • Complete donation processing flow with Stripe integration

  • Multi-tenant database architecture with uID isolation (see the sketch after this list)

  • Email receipt system with SendGrid

  • Video upload and AI generation features

  • Admin dashboard with analytics

  • Mobile-responsive layouts

  • Security hardening (CSRF tokens, XSS prevention, prepared statements)

  • Automated test suites

  • API webhook handlers

  • Form validation and error handling
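
To make "uID isolation" and "prepared statements" a little more concrete, here is a minimal, hypothetical sketch of a tenant-scoped query; the table and column names are assumptions for illustration, not Sapling's actual schema:

```php
<?php
// Hypothetical illustration of multi-tenant isolation with prepared statements.
// Table and column names are assumed for the example, not Sapling's real schema.
function getDonationsForTenant(PDO $db, int $tenantUid): array
{
    // Every query is scoped by the tenant's uID, so one nonprofit's dashboard
    // can never read another nonprofit's donation records.
    $stmt = $db->prepare(
        'SELECT id, amount_cents, donor_email, created_at
           FROM donations
          WHERE tenant_uid = :uid
          ORDER BY created_at DESC'
    );
    $stmt->execute([':uid' => $tenantUid]);
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}
```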

What I Still Had to Do:

  • Decide on the business model (free platform, 5% fee, no tips)

  • Design the privacy policy (what data to collect, how long to keep it)

  • Make security architecture decisions (encryption standards, session management, multi-tenant isolation requirements)

  • Test edge cases with real nonprofits

  • Write this article and think through the ethics

  • Provide customer support and understand user frustration

  • Decide what features matter and which are distractions

  • Review all AI-generated security-critical code

  • Take responsibility when things break

The AI compressed months of implementation work into weeks. But it didn't tell me what to build or why it matters.

That's still my job. I'm just no longer sure what to call myself.

Why the Industry Has Lost Its Way

The fundraising platform industry hasn't lost its way because people are greedy or evil. It's lost its way because the economics of traditional software development required aggressive monetization.

When you need $1-2 million in annual revenue to support a development team, you're forced to:

  • Maximize revenue per customer

  • Create upsell opportunities

  • Gate features behind paywalls

  • Add tip screens

  • Pad processing fees

  • Find every possible revenue stream

You tell yourself it's justified because you're providing value. And you are! But you're also extracting as much as possible because you have to. Because payroll is due every two weeks and AWS bills don't negotiate.

The marketing department then wraps this in language about "partnership" and "empowering nonprofits" while the pricing team figures out how to extract more without triggering churn.

Everyone's acting rationally within the system. The system itself is just broken.

What AI Changes About the Economics

The cost of development is collapsing.

Not to zero. I'm still needed for architecture, security, vision, and accountability. But the $600K-900K a traditional engineering team costs annually just dropped to under $200K.

When your operational costs drop by 70%, you don't need to be as aggressive about monetization. You can charge what it actually costs plus reasonable margin, rather than charging the maximum the market will bear.

This enables a different business model:

  • Transparent, flat-rate pricing

  • No tip screens or dark patterns

  • No feature gates or upsells

  • Treating donors as sponsors, not marks

  • Actually serving nonprofits instead of extracting from them

Is this sustainable? I think so. Volume makes up for lower per-customer revenue, and platforms that genuinely serve their users tend to grow through word-of-mouth rather than expensive marketing.

But I'll be honest: I don't know for sure. This is an experiment. It might fail. I might have to adjust the fee structure or add some kind of premium tier for larger organizations.

The difference is that if I do that, I'll be honest about why.

What Nonprofits Deserve

After 23 years of building software, here's what I know nonprofits deserve from their platforms:

1. Transparency - Tell them exactly what you're charging and why. Show them the math. No hidden fees, no tip screens disguised as "optional," no bait-and-switch with feature gates.

2. Respect - Don't treat their donors like marks to be psychologically manipulated. Treat them like sponsors who are proud to support both the cause and the infrastructure.

3. Alignment - Your incentives should align with theirs. When they raise more money, you should benefit from fair fees, not from extracting maximum revenue.

4. Accountability - A human who will answer when things break. A human who will take responsibility for security. A human who understands the stakes.

5. Quality - Tools that work, are secure, and don't waste their time. Whether built by AI or humans doesn't matter; what matters is that someone cares enough to ensure quality.

6. Fair Pricing - Charge what it costs plus fair margin. Not what you can get away with. Not what your investors demand. What's fair.

The Sapling Experiment: Radical Transparency

Here's what we're trying with Sapling:

Today's Pricing: 5% total fee = actual credit card processing (2.9% + $0.30) + platform overhead and operations. (A worked example for a $100 donation follows the breakdown below.)

Fee Breakdown:

  • 2.9% + $0.30 = Credit card processing (Stripe, Visa, Mastercard)

  • 5% - (2.9% + $0.30) = Platform costs (infrastructure, tools, AI, human oversight, support)
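
For a $100 donation at today's pricing, that breakdown works out roughly like this - a sketch of the arithmetic above, not production code:

```php
<?php
// Worked example of today's 5%-total pricing on a $100 donation.
$donation = 100.00;
$totalFee   = $donation * 0.05;          // $5.00 total fee at launch
$processing = $donation * 0.029 + 0.30;  // $3.20 to Stripe and the card networks
$platform   = $totalFee - $processing;   // $1.80 left to cover platform costs

printf("Total fee: $%.2f (processing $%.2f + platform $%.2f); nonprofit receives $%.2f\n",
    $totalFee, $processing, $platform, $donation - $totalFee); // $5.00, $3.20, $1.80, $95.00
```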

No Hidden Costs:

  • No tip screens

  • No CRM upsells

  • No feature gates

  • No surprise fees

  • No minimum monthly payments

  • No setup fees

Donor Choice: At checkout, donors can choose to sponsor the platform costs (5%) so 100% of their intended donation reaches the nonprofit. Or the nonprofit covers these costs.

The key is transparency. Donors know exactly what they're paying for and why.

Does This Work? I'm Not Sure Yet

I'm 48 years old. I've been coding since I was 20. For the first time in my career, I genuinely feel like my core competency is shifting.

I can't write code faster than AI. I can't implement features more cleanly than AI. I probably can't even debug as thoroughly as AI with the right prompts.

What I can do is care. I can feel called to serve nonprofits. I can make ethical decisions. I can take responsibility. I can be honest even when it's uncomfortable.

Maybe that's enough. Maybe that's actually more valuable than pure coding ability ever was.

This article is my attempt to be radically honest about:

  • Platform economics (they cost money, need to charge fairly)

  • Industry problems (we've lost our way with dark patterns and extraction)

  • My own fears (I don't know if I have value anymore)

  • What's possible (AI enables different economics, which enables different ethics)

  • What nonprofits deserve (transparency, respect, alignment, accountability)

I'm running an experiment with Sapling to see if a platform can serve nonprofits genuinely while still being sustainable. I'm using AI to collapse operational costs so I can charge fair fees instead of maximum fees.

Will it work? Ask me in two years.

But I'd rather try this and fail than succeed at building another extractive platform that makes donors feel frustrated and manipulated while pretending to serve nonprofits.

A Different Kind of Call to Action

I'm not going to ask you to sign up for Sapling. (Though if you're curious about what radical transparency looks like in practice, you're welcome to sign up.)

Instead, I want to make three requests:

To nonprofit leaders: Stop accepting platforms that aren't transparent about costs. Ask every vendor: "What does this actually cost you to provide, and what are you charging me?" Demand the math. Vote with your budget for transparency.

To donors: When you see a tip screen, ask: "Is this actually covering costs, or is this profit?" When platforms are transparent about costs, reward them. Proudly sponsor infrastructure that genuinely serves causes you care about.

To fellow developers (especially those who, like me, are curious): Our skills aren't obsolete. But our role is changing. We're becoming architects of ethical systems, not just implementers of features. That's scarier but also more important. Use AI to handle the implementation so you can focus on what actually requires human judgment: security, privacy, ethics, accountability, and vision.

The Fear is Real, But So is the Calling

I'm terrified that AI is making me obsolete.

But I'm more terrified of what our industry has become. Of tip screens that manipulate donors. Of platforms that extract maximum value while pretending to serve. Of nonprofits that bear the reputational cost of our greed.

Building for nonprofits was always a calling for me. Now, with AI handling implementation, I can finally focus on what that calling actually requires: radical transparency, ethical design, fair pricing, and genuine service.

My identity as a developer is radically changing. Maybe what's being created is more important.

I don't know if Sapling will succeed. I don't know if this business model works. I don't know if I still have value in an AI-driven world.

But I know this: nonprofits deserve platforms that treat their donors with respect, charge honest fees, and genuinely try to maximize what reaches the mission instead of what's extracted along the way.

If AI enables that, even if it makes me question my own value, then it's worth surrendering to.

The experiment continues.

Speaking of transparency: this article was written by Claude AI and edited by Matt, the founder of Sapling.