Excellent Hacker Fallacy

From Noisebridge
Revision as of 17:41, 2 February 2026 by Nthmost (talk | contribs) (Initial publication of Excellent Hacker Fallacy as primary Noisebridge concept)
ESSAY: This is an essay by a Noisebridger expressing their ideas.

TL;DR

"But they're such an excellent hacker! We can't afford to lose them!"

The Excellent Hacker Fallacy is believing that someone's technical brilliance justifies tolerating harmful behavior - or that their work can only be done by them.

Why it's a fallacy: The calculation typically ignores:

  • People already driven away by the harmful behavior
  • Potential contributors who never showed up
  • Opportunity cost: what could happen if others stepped in
  • Cumulative overhead of everyone routing around the problem
  • Long-term community health

The truth: When the "irreplaceable" person is finally removed, others step up and work continues - often better than before. The technical contribution rarely outweighs the total damage.

Hacker space variant: Technical skill gets weighted too heavily against social harm. "They're brilliant" becomes justification for behavior that would be unacceptable from someone less skilled.

The Fallacy Explained

In hacker and maker communities, we revere technical excellence. This creates a dangerous trap: weighing someone's contributions against their harm using a broken scale.

What we see:

  • They have deep technical skills in critical systems
  • They control important infrastructure
  • They possess institutional knowledge others lack
  • They do unglamorous work nobody else wants
  • They bring funding, connections, or legitimacy

What we calculate:

  • Contribution value HIGH → Person is valuable
  • Behavior harm (one incident, a few people affected) → Impact seems LIMITED
  • Therefore: We need to keep them, make it work

What this calculation misses:

Invisible losses: The excellent contributors who already left. The newcomers who showed up once, encountered the person, never returned. The projects that didn't happen because people avoid working with them. You're comparing visible contribution against invisible loss - the scale is rigged.

Monopolization cost: When one person controls infrastructure, institutional knowledge, or critical systems, the community becomes dependent. This isn't contribution - it's monopolization. Real contribution is distribution: teaching others, documenting knowledge, building systems others can maintain. If their "excellence" makes them irreplaceable, that's a failure, not a feature.

Routing overhead: Every person who routes around the harmful behavior pays a continuous cost: projects modified to exclude them, events planned around their presence, emotional labor, warnings to new people, friction absorbed. Multiply this across everyone affected, over years. The community spends more energy managing the problem than the person contributes.

Selection pressure: Tolerating harmful behavior creates a filter. You select against: people who won't tolerate abuse, those with less privilege/power to push back, those who prioritize psychological safety, people from marginalized groups this person targets. You end up with a community that can tolerate this person - not necessarily the community you want.

Sunk cost trap: "We've already invested so much trying to make this work." Each intervention becomes proof of investment rather than evidence of failure. The longer you tolerate it, the harder it becomes to acknowledge the calculation was wrong from the start.

What Actually Happens When They Leave

Prediction: "If they go, the [infrastructure/project/knowledge] will collapse."

Reality (observed pattern at Noisebridge and similar spaces):

  1. Brief disruption (days to weeks) while others learn the systems
  2. Distribution of responsibility - multiple people share the work
  3. Documentation appears - because now it's necessary, not optional
  4. New people step up - those who were deterred by the person's presence
  5. Better systems emerge - designed for maintainability, not monopolization
  6. Community energy shifts - from managing one person to actual work
  7. Departed members return - or new people join who previously avoided the space

The "irreplaceable" person was often preventing the very resilience the community needed.

The Missing Stair

The Excellent Hacker Fallacy often protects what's known as a "Missing Stair" - a person whose harmful behavior everyone routes around instead of addressing.

The term comes from a 2012 essay by Pervocracy: imagine a house with a broken stair. Everyone learns to step over it and warns newcomers about it, but nobody repairs it.

In communities: everyone maintains private warnings, newcomers inherit workarounds without context, and the routing overhead eventually exhausts the community. Originally used to describe sexual predators ("don't let new women be alone with him"), it now applies to any harmful behavior tolerated through workarounds rather than addressed.

The connection: The Excellent Hacker Fallacy is often why communities protect missing stairs. "Yes, there's a problem, but look at their contributions!" The perceived technical value becomes justification for infinite routing overhead.

See Missing Stair for detailed analysis of recognition patterns, system costs, and intervention protocols.

Why This Happens at Noisebridge

We're anarchist. We value autonomy. We celebrate technical excellence. We resist authority. These are features - until they become bugs.

The perfect storm:

  • Do-ocracy rewards action - the person who does the work gets the say. When that person also causes harm, we're paralyzed: they're "doing," which grants legitimacy.
  • Excellence worship - we revere the brilliant hacker. Someone's technical skill can create a reality distortion field around their behavior.
  • Conflict avoidance - nobody wants to be "the authority" who judges someone. In consensus culture, removing someone feels like the opposite of our values.
  • Missing stairs become load-bearing - the longer we tolerate someone, the more infrastructure they control and the harder it seems to act.

Result: We protect the excellent hacker while hemorrhaging everyone else.

Breaking The Pattern

REFRAME THE CALCULATION

Stop asking: "Can we afford to lose them?"

Start asking:

  • Who have we already lost because of them?
  • What's the true cost of everyone routing around them?
  • What could emerge if we weren't spending this energy on management?
  • What kind of community are we selecting for?
  • Are we protecting this person or protecting our ego about "making it work"?

RECOGNIZE THE TRAP

You might be in the Excellent Hacker Fallacy if you hear:

  • "But they built [important thing]!"
  • "Nobody else understands that system"
  • "We need to be patient, they've been here a long time"
  • "It's just their communication style"
  • "They don't mean harm, they're just [neurodivergent/passionate/intense]"
  • "We've already put so much work into helping them change"

These might all be true. They don't change the calculation.

TRUST THE EVIDENCE

When you finally remove the "irreplaceable" person and the space thrives - believe it. When people return after years away. When projects suddenly move forward. When new folks step up. When the infrastructure gets documented. When the room feels lighter.

This isn't luck. This is what happens when you stop spending all your energy routing around one person.

ACT EARLY

The fallacy compounds. The longer someone stays, the more:

  • Infrastructure they control (seems irreplaceable)
  • History they accumulate (seems unfair to remove them now)
  • Energy spent tolerating them (sunk cost)
  • Good people lost (invisible to current members)

Early intervention prevents the fallacy from becoming structural. See Restorative_Communication for frameworks.

For The Excellent Hackers

If you recognize yourself in this pattern:

Your technical skills are real. Your contributions may be genuine. This isn't about your value as a person or a hacker.

But: If your presence makes the community spend more energy managing your impact than building things together, something is wrong. If people route around you, leave because of you, warn newcomers about you - this is data.

The question isn't whether you're "bad" or your contributions are "real." The question is: are you a good fit for this community, right now?

Sometimes the answer is no. That doesn't erase your skills. It might mean:

  • This environment isn't compatible with how you work
  • You need different support than this community can provide
  • Your communication patterns and this culture don't mesh
  • You'd thrive better elsewhere

Recognizing this is strength, not failure. Staying where you're not a good fit costs you too - being the problem nobody will name is its own kind of suffering.

Prevention

Build distributed systems:

  • Cross-train on critical infrastructure
  • Document as you go, not when someone leaves
  • Rotate roles and responsibilities
  • Question monopolization actively
  • Celebrate knowledge-sharing as much as knowledge-having

Make early naming normal:

  • Address concerning behavior early using Restorative_Communication
  • Treat discomfort as a valid signal - don't demand proof before acting
  • Act on patterns, don't wait for crisis
  • "I'm noticing a pattern" should be easy to say

Check your biases:

  • Notice when you weight technical skill over social impact
  • Question "but they're so good at X" as justification
  • Ask who's not in the room and why
  • Count the invisible losses, not just visible contributions

Distribute authority:

  • Don't rely on one or two people for conflict resolution
  • Build decision-making capacity across the community
  • Practice intervention while stakes are low
  • Respect mediator expertise - if they say it's not working, believe them (see Anarchy_Paralysis)

Trust emergence:

  • Others will step up
  • Systems will get rebuilt better
  • Knowledge will distribute
  • The space will survive

We've seen it happen. Every time.

References

  • "Missing Stair" concept: Pervocracy (2012)
  • Noisebridge community experience: Multiple observed cycles 2000s-2020s
  • Cross-space pattern recognition: Similar dynamics in hackerspaces worldwide