Can Christians identify as Effective Altruists?
2500 words | Reading time 8-12 min
Short summary:
I don’t think it ultimately matters very much whether people identify as Effective Altruists or not. What matters is that they are having a positive impact on the world.
However, engaging with the EA movement seems useful for many, perhaps most, people who want to do good effectively.
Interacting with the movement does not require embracing everything about it or identifying yourself as an Effective Altruist.
That said, I think many reasons people do not identify with EA are not as strong as they may initially appear.
I believe Christians can approach EA with an attitude of using it as a tool to help them do good effectively, and that this approach sidesteps many of the identity problems.
Many Christians feel uneasy about the Effective Altruist label even if they are interested in EA ideas. I explore some reasons for this unease. I think engagement with EA is often beneficial for Christians who want to do good in the world, but it is not necessary to identify as “an EA” to do good effectively or to engage with the movement. I’ll first go through some common stumbling blocks while sketching potential solutions. At the end, I will present what I believe is a perspective that can solve many of the issues.
Problems with EA ideas
Is EA a competing worldview?
A fundamental reason Christians might be uneasy about identifying with the EA movement is that EA feels like a competing worldview. It might feel like EA introduces another set of commitments outside their faith. “Thou shalt have no other gods before me” echoes in the background.
Some of these reservations may be justified. I believe there are risks involved in having EA as the primary meaning-generating mechanism in your life. Having your identity and existential security anchored in something outside EA can help keep you grounded and guard against excesses. For Christians, following Christ comes first. However, the discomfort is built on the assumption that EA is a package deal that has to be accepted as a whole. I want to challenge this line of thinking—more on this below.
Does EA require utilitarianism?
One worldview-related concern Christians might have with EA is its close association with utilitarianism. Utilitarianism is a consequentialist moral theory that argues that the rightness of an action is determined by how much it increases overall happiness or well-being. This may feel alien to those who believe in inviolable moral principles like the Ten Commandments. Utilitarianism can feel unsettling because it seems to lack a sense of moral absolutes. Its focus on maximising happiness or well-being can seem to elevate these concepts to the status of the sole good, sidelining other important Christian values like justice, integrity, and the glory of God.
Even if you ultimately reject consequentialist morality, it is still useful to know that utilitarianism is not the caricature you sometimes see. Two-level utilitarianism recognises the limitations of calculating the consequences of every action and allows the use of common moral rules and principles as guides for practical decision-making, because this leads to better consequences than trying to figure out the value of every single action separately. Additionally, a broader definition of well-being can encompass values like justice, integrity, and even participation in the goodness of God. (For a general overview of the relationship between Christianity and utilitarianism, see here and here.)
Does EA take into account the spiritual dimension?
EA’s focus on observable, physical impact may feel like it neglects the spiritual dimension that is important to Christians. Some Christians resolve this by treating the two as largely separate spheres (while acknowledging that they are not ultimately separate): they use EA to help with problems such as fighting disease and poverty, and they support missions and other spiritual activities outside EA. In the Christian EA community, some want to use insights from the EA methodology to help with evangelism. The Christians for Impact project does have missions among its cause areas.
Problems with the community
EA is non-religious
The predominantly secular character of the EA community can be challenging for Christians. Around 80% of the people in the movement are nonreligious. This can be a culture shock for those used to moving in mostly Christian social circles. Some have experienced EAs as dismissive towards religion. Experiences vary, of course, but the EA ideas themselves do not exclude religious people. Religious subcommunities within EA, such as EA for Christians or EA for Jews, offer opportunities for fellowship, mutual support, and collaboration with fellow believers interested in EA, and some Christians feel more comfortable interacting with this part of the movement.
Scandals and controversial people
EA has had its fair share of scandals over the past two years. Understandably, these might be reasons why some would not want to associate with the community. The collapse of the cryptocurrency exchange FTX in 2022 and Sam Bankman-Fried’s subsequent imprisonment were major events. A lot has been written about these, including in the EA for Christians blog (see Thoughts on the FTX situation, Once again on EA scandals, Thoughts on Wytham Abbey), so I won’t go into detail here. For Christians, it shouldn’t be a surprise that even the best causes can be used to justify immoral actions and that believing, or professing to believe, in helping others is no guarantee that people won’t sin.
Problems with popular EA cause areas
Beyond philosophical and community concerns, some Christians might also feel uneasy about specific cause areas that are prominent within EA, such as longtermism, AI safety, or farmed animal welfare.
Longtermism and x-risk
Longtermism is the belief that positively impacting the long-term future of humanity is a key ethical priority. This view is based on the idea that the future could contain vast numbers of people, and actions taken today could greatly impact their well-being.
The concept of longtermism can be problematic for Christians for eschatological reasons. As the Nicene Creed says, Jesus “shall come again, with glory, to judge the quick and the dead”. From this perspective, worrying about the future millions of years from now might seem irrelevant, especially if you believe Christ’s return is imminent. However, history shows that eschatological expectations can be at least 2,000 years off. This doesn’t negate the possibility that Christ’s return could be near, but it does suggest the need for a balanced approach to concerns about the future.
Preventing existential risks (x-risks) does raise its own eschatological questions, though—traditional Christian eschatology holds, based on verses like Matthew 24:30–31 and 1 Thessalonians 4:1–17, that human beings will be around when Jesus returns. But working to prevent catastrophes like a full-blown nuclear war is clearly a good cause. Extinction threats could also result in terrifying non-extinction catastrophes. And even if humanity cannot go extinct, that doesn’t mean it would be fine for us to set ourselves on a path to self-destruction that only the Second Coming of the Son of God could stop.
Longtermism seems to be less prominent in the Christian EA community than in the general EA community. Some reject it for various reasons like Biblical ethics being focused on the present and nearer-term future, or because they trust God will not allow humans to go extinct. They may still agree that the things longtermists are working on are valuable. Others are more sympathetic to longtermism.
Longtermism got a lot of attention around 2022 with the publication of Will MacAskill’s book What We Owe the Future, but EA seems to be well past “peak longtermism”. The movement is generally using the word less and moving towards framings of global risks that do not refer to the far future. It is now common to group under “catastrophic risk” both what used to be discussed as global catastrophic risk and what used to fall under longtermism.
AI safety
Many in the EA community are deeply concerned about the potential dangers posed by advanced AI, particularly the risk that AI could lead to catastrophic outcomes if not properly controlled. These are sometimes framed as longtermist concerns, but they can just as easily be seen as near-term issues: the AI safety field now generally expects that AI risk could materialise within the next few decades rather than centuries or millennia from now. Some might feel that AI risk is too speculative. However, responsible stewardship of technology is consistent with Christian ethics, and taking steps to prevent potential harm from AI can be seen as a prudent response to emerging powerful technologies. EACH published a piece arguing that Christians should not ignore extreme risks from AI.
A solution: using EA as a tool to do good?
I think viewing EA as a tool rather than a total worldview package solves many of the problems outlined above. It also makes it irrelevant whether you call yourself an EA or not. Viewed as a tool, EA is not a competing core identity but something that can be used to further what Christians already value.
In this perspective, EA is not a comprehensive worldview that dictates your values or ethical principles but rather a set of strategies for doing good in the world. Christians can use these to enact the call to love and serve others. EA becomes a means to an end rather than an end in itself. This allows an eclectic approach of picking what is useful for your project of doing good and leaving that which is not.
This approach is useful for countering any totalising tendencies in EA. Most people, even those deeply involved in the EA movement, have values, interests, and commitments beyond the maximisation of global welfare—and that’s fine.
When identity does and doesn’t matter
An instrumental approach to EA recognises that there are costs and benefits to using the EA label. For some people, fully owning the label and calling oneself an EA is empowering. For others, it can be good to avoid the EA label for strategic reasons, such as avoiding professional repercussions. If identifying as an Effective Altruist would feel like compromising your morals, that too is a valid reason not to do so. And if association with EA is in some ways detrimental to pursuing your altruistic goals effectively, then don’t be an EA!
Most people doing impactful altruistic work in the world don’t identify as EAs. EA doesn’t have a monopoly on effective giving, using reason and evidence, impartiality, and so on. Even among those who engage with the EA community, many don’t call themselves Effective Altruists: an experienced EA community builder once told me that perhaps half of the people working in EA organisations identify as EAs.
EA as a useful community
In addition to a question and a methodology for answering it, EA is also a particular community of people engaged in the project of finding the best ways to achieve altruistic impact. Whether you identify as “an EA” or not, I think the philosophy and the community do have insights that are valuable to most people who want to do good effectively. For this reason, I recommend eclectic engagement even for those who don’t want to identify with EA.
The pre-existing assumptions of people in the community of course influence how they approach finding ways to do good effectively. For example, as a largely secular community, EA generally doesn’t discuss evangelism as a cause area (but the Christian subcommunity does). EA also has a body of commonly accepted knowledge and practices, but these too are open to criticism and revision.
Even if you disagree with the worldviews of people in the movement, you can still find many of their findings useful. In particular, global health and poverty seems to be an area where underlying worldview differences matter relatively little, because the field is so driven by empirical cost-effectiveness estimates and because the goals of helping people suffering from disease and extreme poverty are so uncontroversial.
As long as the EA community produces something useful for your goals, it can be helpful to interact with it. You can use your discernment and keep what is good. Again, this is something that almost everyone in the community does, Christian or not, as the community has a diversity of opinions on many issues. Interacting with the EA community does not mean embracing everything anyone in the community says, or even accepting the prevalent opinions in the community.
Practical benefits of engaging with EA
Engaging with EA can have several practical benefits for Christians. EA can provide resources that are useful for practising good stewardship. Findings about, for example, the effectiveness of various global health interventions can help us use our resources effectively and thereby avoid wasting the talents we are given (see the parable of the talents in Matthew 25:14–30). Worse than mere ineffectiveness and waste, some social programs may actually cause harm (even Christian ones). Avoiding this is an important part of faithful stewardship.
Engagement with EA helps connect you with people passionate about doing good. This kind of networking can be very helpful in your efforts to make a difference. The EA community gathers thoughtful people with diverse perspectives who have spent a long time thinking about various problems and their solutions. This can broaden your perspective on global issues and other pressing problems, help you evaluate the effectiveness of interventions, and thereby serve your neighbour better. There are hundreds of Christians who, whether or not they identify as an “effective altruist”, agree with the basic ideas behind EA.