Essay
Ethical Technology

The Responsibility That Comes With Building at Scale

What technology teams owe the people who use what they build — beyond legal compliance and user agreements.

The Moment It Changes

There's a moment in every product's lifecycle when it stops being "your thing" and becomes infrastructure. When it's no longer optional. When people structure their lives around it.
That's when responsibility becomes complicated.

What We Tell Ourselves

The tech industry has a convenient narrative: we're just building tools. What people do with them is up to them. We're neutral platforms. We don't make choices for users — we enable choice.

This is true at small scale. A tool used by 1,000 people who opted in and can leave anytime is meaningfully different from a platform used by 100 million people who have no practical alternative.

At scale, design is policy. Every default setting is a nudge. Every algorithm is a subtle shaping force. Saying "we're neutral" is like saying roads are neutral — technically true, but meaningless if the roads only lead certain places.

What We Actually Owe
1. Honest disclosure about what we're optimizing for

Most platforms optimize for engagement because that's what drives revenue. Fine. But we pretend we're optimizing for "connection" or "discovery" or other user-centric goals.

This is dishonest. If the business model requires maximizing time-on-site, say so. Let people make informed choices about whether they want to participate.

2. Reasonable exit paths

When a product becomes infrastructure, leaving becomes expensive. Your photos, your messages, your network — all locked in. We design it that way, then act surprised when people feel trapped.

Interoperability isn't a technical constraint. It's a choice. We could build export tools that actually work. We could support open standards. We don't because retention metrics look better when leaving is hard.

3. Accountability for second-order effects

Products have consequences beyond their intended use. A recommendation algorithm might radicalize users. A social feature might enable harassment. A growth tactic might exploit psychological vulnerabilities.

Saying "we didn't intend that" doesn't absolve responsibility. If you're building at scale, you have an obligation to think several moves ahead. Not perfectly — but genuinely.

Why This Is Hard

I'm a project manager, not an ethicist. I don't have clean answers. What I do know is that "move fast and break things" was always a luxury afforded by building for early adopters who opted in.

When you're building infrastructure, you don't get to break things anymore. The things you break are people's communication, their livelihoods, their sense of reality.

What Actually Changes Behavior

Not ethics training. Not codes of conduct. Those help, but they're not sufficient.

What works:

Structural incentives: If your bonus is tied to engagement metrics, you'll optimize for engagement. Change the metric.

Diverse teams: Homogeneous groups have blind spots. Some harms are only visible if you've experienced them.

Regulation with teeth: Self-regulation doesn't work at scale. External accountability does.

The Question We Avoid

If your product disappeared tomorrow, would the world be better or worse off?

Not "would users complain" — they complain about everything. Not "would the business suffer" — of course it would.

Would the world be better off?

If the honest answer is "I don't know" or "probably not," that's information. What you do with it is up to you.

But at scale, "I was just doing my job" stops being an excuse.
