Be irreplaceable

tl;dr - you can increase your impact by making sure that it is unique
  • Thinking about what would have happened without you (the counterfactual) helps you know when you are actually helping.
  • To avoid being replaceable, consider options outside of typical do-good careers like being a doctor.
  • Instead, work on problems that others have unfairly neglected.
    • Find a problem we’re biased against
    • Find a problem we’re not incentivised to solve.

You might be thinking about impact wrong

To make a big difference, you need to be thinking about counterfactual impact.

Explanation: The impact which takes other possible worlds into account
Counterfactual impact

A counterfactual compares what happened in our world with what would have happened in a world where the facts were different. That is, it is counter to the facts. Read more.

Your counterfactual impact is the impact you have compared to what would have happened otherwise.

This means asking whether more good is being done in the world you are in, than would be done in a world where you chose a different career, university major, or job.

When you think in this way, you can realise that:

  • Some jobs would be done whether it was you who did them or not. In this case, you are replaceable. Your counterfactual impact is likely to be low.
  • Some would be done best by you. Then, your counterfactual impact is the extra good that you are doing, by doing the job better than your possible replacement.
  • Some wouldn’t be done without you. These are the areas where you can have a unique impact. If these jobs are important, then your counterfactual impact could be very high. The sketch after this list makes the comparison concrete.
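To make the comparison concrete, here is a minimal sketch of the subtraction involved; the numbers are purely illustrative, not estimates of any real career:

```python
# Purely illustrative numbers: "impact" is whatever unit of good you care about.

def counterfactual_impact(impact_with_you: float, impact_without_you: float) -> float:
    """The extra good done because you, rather than your likely replacement, took the role."""
    return impact_with_you - impact_without_you

# A job that would have been done almost as well by the next applicant in line.
replaceable = counterfactual_impact(impact_with_you=100, impact_without_you=95)

# A job nobody else would have done at all.
irreplaceable = counterfactual_impact(impact_with_you=60, impact_without_you=0)

print(replaceable)    # 5  -> low counterfactual impact, despite the "bigger" job
print(irreplaceable)  # 60 -> high counterfactual impact, despite the "smaller" job
```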
Example thought experiment:

You see a child step into the road of a busy street. You can tell that if they don’t move out of the way immediately, they will be hit by an approaching lorry. But the child is clearly distracted by their phone. Without hesitating, you leap into the road, grab the child and pull them out of the way of the lorry with a few seconds to spare. On your way back to the pavement, you almost bump into another passer-by who evidently had the same thought. It is clear that either of you would have grabbed the child in time.

Did you save the child? Yes, most people would say you did.

But did you make a difference to the child’s life? In the possible world where you didn’t see the child, the other passer-by would still have saved them. That child’s future would have been the same, whether or not you specifically pulled them out of the way of the lorry.

You tangibly saved the child, but you had no counterfactual impact.

Typical do-good careers are more likely to be replaceable

Well-known altruistic careers aren’t your only option. In fact, they aren’t always even the best.

Doctors’ donations are less replaceable than their work

Careers in medicine are among the most widely sought-after ways to help people.

Less widely known is that the average medical doctor in western countries can probably save more lives through donating 10% of their income than they do in their job.

Napkin Math
The average US doctor could save around five times as many lives through donations as they could through their career.

The blog “the average doctor” estimates that a primary care physician in the US makes $5,288,870 in career earnings after tax.

GiveWell, a global health and development charity evaluator, estimates that a life can be saved for $4,500 by the Against Malaria Foundation. This life is a counterfactual extra life — in the world where you didn’t donate, that person would have died.

If the average US doctor gave 10% of their income over the course of their career, that would come to roughly $528,887, enough to save around 117 lives.

Dr Greg Lewis estimates that the average doctor in the developed world saves 25 lives in their career.

So that’s around 4.7 times as many lives saved through donations as they save directly through their work.
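The same arithmetic as a short script you can rerun with your own assumptions (all figures are the ones quoted above):

```python
# Napkin math using the figures quoted above (all rough estimates).
career_earnings_after_tax = 5_288_870   # average US primary care physician, lifetime ($)
donation_rate = 0.10                    # giving 10% of income
cost_to_save_a_life = 4_500             # GiveWell estimate for the Against Malaria Foundation ($)
lives_saved_directly = 25               # Dr Greg Lewis's estimate for an average doctor

donations = career_earnings_after_tax * donation_rate       # ~$528,887
lives_saved_by_donating = donations / cost_to_save_a_life   # ~117.5 lives
ratio = lives_saved_by_donating / lives_saved_directly      # ~4.7x

print(f"Donations: ${donations:,.0f}")
print(f"Lives saved by donating: {lives_saved_by_donating:.1f}")
print(f"Compared with direct impact: {ratio:.1f}x as many")
```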

Dr Greg Lewis estimated in 2012 that the average doctor in the developed world could save 25 lives over the course of their career.

This isn’t an impact to be taken lightly — 25 lives is a lot for any one person to save. If that’s all you could do, that would be an amazing feat to be proud of.

But the average doctor helps fewer people than you might expect, because they are partly replaceable. For every student accepted into medical school, there are many more applicants who would do nearly as good a job.

So if you’re talented enough to become a doctor, it’s likely that there are other impactful careers where you can have an even bigger counterfactual impact.

Nerdy aside
Other reasons that a doctor’s impact is lower than expected
  • Diminishing returns: There are already a lot of doctors, so each additional doctor adds less value (diminishing marginal returns).
  • Medicine is only a fraction of our good health: much of the improvement in welfare over the 20th century came not from medicine itself, but from public health measures (sanitation, vaccination, etc.)
  • Find out more.

Other competitive careers are partly replaceable too

This is likely to be similar for other well-known and highly competitive altruistic professions, like human rights law, or working at famous NGOs.

When a line of skilled applicants stands behind you, waiting for the job to open up, your counterfactual impact is lower — the job would probably have been done without you.

That doesn’t mean you shouldn’t pursue competitive paths, degrees, or jobs. You can still have a unique impact, if you focus on neglected problems.

Work on neglected problems to be irreplaceable

To have a large counterfactual impact — one that wouldn’t have happened without you — you need to find opportunities which most people don’t think of.

You can do this by seeking out projects which are neglected. When too few people are working on a problem, every additional person can have a unique counterfactual impact.

Nerdy aside:
Diminishing returns: why finding neglected problems matters

Diminishing returns refers to the phenomenon where investing more and more of some resource (such as money or time) produces benefits at a lower and lower rate — and becomes less efficient the more you invest. In other words, your first dollar spent on something is often far more useful than your thousandth dollar.

We can apply this phenomenon to work on the world’s most impactful problems. The first person working on AI safety is likely to have a more impactful career than the hundred-thousandth.

So this is another reason to look for neglected problems — you are likely to be far nearer the first person working on them than you would be if you worked on the most popular issues.
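As a toy sketch of this, here is a simple model where total progress grows with the logarithm of the number of people working on a problem; the logarithmic shape is an assumption for illustration, not a measured fact about any particular field:

```python
import math

def total_value(workers: int) -> float:
    """Toy model: total progress grows with the log of the number of workers."""
    return math.log(workers + 1)

def marginal_value(workers: int) -> float:
    """Extra progress added by the next person to join."""
    return total_value(workers + 1) - total_value(workers)

print(marginal_value(0))        # ~0.69    -> the first person adds a lot
print(marginal_value(100))      # ~0.0099
print(marginal_value(100_000))  # ~0.00001 -> the hundred-thousandth adds very little
```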

Source: Wikipedia

Because so few people are willing to consider unpopular problems or be guided by what will be the most impactful, we believe we’ve found some exceptionally important problems where you can have an outsized impact.

Examples of important, neglected problems

  • Factory farming
    • Why it’s neglected: Factory farming doesn’t happen in plain sight, so it isn’t often on the mind of the public. We don’t, in general, think about animal welfare enough, and when we do, we give more to domestic animals than to farmed animals.
    • Neglected subproblems: Corporate activism.
  • Catastrophic risks from AI
    • Why it’s neglected: The most severe AI catastrophes might be events without precedent, which contributes to them being particularly neglected, and makes them a problem where you could have an outsized impact.
    • Neglected subproblems: AI safety research to avert worst-case scenarios.
  • Extreme climate change
    • Why it’s neglected: We’re yet to experience the full effects of climate change, and those left worst off by it aren’t in a position to avert it.
    • Neglected subproblems: Quantifying and mitigating the effects of particularly severe climate change; or advocating for green energy research policy.
  • Pandemics
    • Why it’s neglected: Many countries were insufficiently prepared for the Covid pandemic, perhaps partly because a pandemic had not occurred within the lifetimes of their decision-makers.
    • Neglected subproblems: Novel, engineered pandemics.
  • Helping future generations
    • Why it’s neglected: You’re not around to feel good when your donation has an impact in 100+ years.

How to find more neglected problems

Steps to finding a neglected problem you could have a big impact on
How this article fits into your career planning...
  1. Before you do any research, write out your initial ideas of what the most important problems in the world might be.
  2. Expand this list through:
    1. Brainstorming neglected areas people miss because they’re biased against them or aren’t incentivised to work on them (as outlined in this article).
    2. Reading research into global priorities – for example, Open Philanthropy’s cause reports or 80,000 Hours problem profiles.
  3. Make a shortlist by selecting your top 3–5 problems.
  4. Prioritise amongst your shortlist by scoring each problem using this framework (a rough scoring sketch follows this list). If you’re not sure why you need to, read our article: Prioritise causes: make more than a drop in the ocean.
  5. Figure out what your top causes need more of to make progress. Are they most constrained by advocacy, funding, research, delivery of services, or something else?
  6. Consider your personal fit.
  7. Then choose the most promising career, skill-up, and get involved.
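To make the scoring in step 4 concrete, here is a rough sketch; the criteria and the 1–5 scores are illustrative assumptions, not the framework linked above:

```python
# Illustrative only: score each shortlisted problem on a few criteria (1-5)
# and rank by the total. The criteria here are common ones; the framework
# linked in step 4 may use different criteria or weights.

shortlist = {
    "Problem A": {"scale": 5, "neglectedness": 4, "personal_fit": 2},
    "Problem B": {"scale": 3, "neglectedness": 5, "personal_fit": 4},
    "Problem C": {"scale": 4, "neglectedness": 2, "personal_fit": 5},
}

# Sort problems by their total score, highest first.
ranked = sorted(shortlist.items(), key=lambda item: sum(item[1].values()), reverse=True)

for problem, scores in ranked:
    print(problem, sum(scores.values()), scores)
```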

Two key reasons a highly impactful project might be unfairly ignored are that:

  1. We are biased against it, or
  2. We aren’t incentivised to work on it.

Find a problem we’re biased against

Explanation: Why mental shortcuts are sometimes useful
Heuristics and biases

Our brains weren’t designed just for slow, careful, rational thought. Often we use heuristics — rules of thumb which allow us to take shortcuts to quickly solve problems without thinking through every aspect of an issue.

Imagine you are in the forest at night, and you hear a rustle in a bush next to you; you turn and see two reflective orbs amongst the foliage. Heuristics help you start running then, before you figure out whether it is definitely a predator or not.

However, these heuristics aren’t always well-adapted to modern problems.

When heuristics are applied to problems that they didn’t evolve to solve, they cause biases.

Biases are ways that our thinking systematically deviates from what would be rational. If we can figure out the areas where our heuristics fail when we are thinking about having a big impact, we can find opportunities to out-think the crowd.

Two particularly common biases affecting our judgement of important causes:

Availability bias — look for problems that aren’t talked about

A greater exposure to violence on TV leads people to think crime and violence are more likely in the real world (source), due to what’s known as availability bias.

Availability bias is the mental shortcut we use when we rely on immediate examples that come to mind.

This means that the most newsworthy problems like terrorism, wars, droughts and famines, often get the most attention in our assessments of the most important problems.

Not being anchored to commonly discussed problems means we can search for important problems which don’t get enough attention.

Scope neglect — act based on what will be more impactful

Most charity isn’t directed at the very best ways of helping, but at causes that sound good.

One factor is our tendency to value all efforts to help roughly equally, even when some are 100x more impactful than others. This is known as scope neglect.

Scope neglect is paying insufficient attention to the degree of benefit or harm. We have trouble caring ten times more about something when it is ten times as important. Read more.

In reality, some efforts cause little more than a drop in the ocean or even do harm, so we need to prioritise with the degree of benefit in mind.

Most people just aren’t making career decisions this way. If you do, you’ll have one crucial part of the equation for an outsized impact.

Nerdy aside:
Study: People value 2,000 birds as much as 200,000

In 1989, the Exxon Valdez oil tanker ran aground a few kilometres off the coast of Alaska. It spilled 3,700 tonnes of crude oil into the ocean over the next few days, affecting 1,300 miles of coastline.

Those numbers are quite hard to imagine, right? In fact, it was 37,000 tonnes of crude oil that the tanker released — but you probably wouldn’t have reacted with more disgust if I’d told you that the first time.

We know this because after the disaster, Exxon funded research into how much the public cared about the oil spill.

One of the experiments tested how much people were willing to pay to place nets over oil ponds to prevent migratory birds from drowning in them. The experiment asked different groups how much they would pay to save 2,000, 20,000 or 200,000 birds from drowning.

The group asked about 2,000 birds said they would pay $80 on average.

But, perhaps surprisingly, the 20,000 group said they would pay $78, and the 200,000 group said they would pay $88.

The groups were not tying their valuation to the scope of the problem — they weren’t willing to increase their donations even if the problem was 100 times larger.

Find a problem we’re not incentivised to solve

Where there are no incentives for working on a problem, it is unlikely to be solved.

So problems that don’t:

  • Make any money,
  • Pay off for a long time,
  • Or affect well-resourced groups...

… are particularly likely to be ones where we can have a large, unique impact by solving them.

Three ways to beat the altruistic market and make a huge counterfactual difference
Find unprofitable solutions to important problems

If everyone can get something for free, no one can make a profit from providing it.

This means that companies won’t, without any help, try to provide these things. If they did, they’d go bankrupt pretty quickly.

As a result, we see a lot less investment than we need in ‘public goods’.

Definition & examples:

Public goods are things everyone can enjoy but no one can profit from providing.

Examples are clean air, public health, or an educated population.

These are typically undersupplied by private companies because they are non-excludable (you can’t stop anyone from experiencing them) and non-rivalrous (one person enjoying them doesn’t use them up). Read more.

Public goods are often crucial to our survival — like clean air, or a low risk of nuclear war.

This means that there are likely to be opportunities for people who are impact-focused to provide public goods that wouldn’t be taken otherwise — a counterfactual impact.

Activity: Think of 5 public goods that we don’t provide enough of

Our examples:

  • Low risk from emerging technologies (like artificial intelligence).
  • Security from zoonotic cross-over of pathogens in our food supply chain.
  • Surveillance of pathogens currently affecting public health.

Solve long-term problems

We screw over our future selves when we succumb to the temptation of that 4th piece of cake. (Happens to the best of us.)

Democratic governments go through 4–5 year election cycles. If they make long-term plans that only pay off after they leave office, the opposition might get the credit. So, in general, they don’t.

Companies have to show results every quarter for shareholders; if they don’t, their executives will be replaced by people who can.

A charity or foundation won’t look as good if their effects pay off in a century, even if those effects are larger, because the people involved won’t be around to take credit.

Drivers of short-term thinking are everywhere, so if you’re able to plan and act with the longer term in mind, you can have more impact than people who can’t.

Help beings most people don’t care about

In recent centuries, we have seen the Western moral circle of concern expand to include people outside our families, people of other genders and races, and, increasingly, people from other countries and animals of other species.

If we want to have a big counterfactual impact, we could aim to make an impact at the rim of our current moral circle — an impact for the groups that are still neglected.

This means looking for neglected individuals who are only just entering the world’s concern, such as farmed animals, future people, or even artificial minds.

Because few people are concerned with making life better for these groups, there is likely to be low-hanging fruit which won’t be picked by anyone but you.

Nerdy aside:
Caring for our great (great...) grandchildren

People who don’t even exist yet deserve our consideration as well.

Philosophers like Derek Parfit argue that we should care a lot more about future people.

The damage we do to the environment gives them a worse world to live in. The risks we impose through the maintenance of nuclear weapons means we may end up being responsible for future people never coming into existence.

Few people think explicitly about our impact on the future, so you could have a large effect on future generations if you pursue research or policy in their benefit.

If the fact that few people are worrying about the lives of the trillions of people to come concerns you, you’ll be interested in the growing field of longtermism.

How do we affect the long-term future?

Read next