Longtermism sees history differently: as a forward march toward inevitable progress. MacAskill refers to the past often in What We Owe the Future, but only in the form of case studies on the life-improving impact of technological and moral development. He discusses the abolition of slavery, the Industrial Revolution, and the women’s rights movement as evidence of how important it is to continue humanity’s arc of progress before the wrong values are “locked in” by despots. What are the “correct” values? MacAskill takes a coy approach to articulating them: he argues that “we should focus on promoting more abstract or general moral principles” to ensure that moral changes remain “relevant and robustly positive” into the future.
Ongoing global climate change, which already affects the poor far more than the elite, is notably not a longtermist cause, as philosopher Émile P. Torres points out in his critiques. Though it poses a threat to millions of lives, longtermists argue, it probably won’t wipe out all of humanity; those with the wealth and means to survive could carry on fulfilling our species’ potential. Tech billionaires like Peter Thiel and Larry Page already have plans and real estate in place to ride out a climate apocalypse. (MacAskill, in his new book, calls climate change a serious worry for those alive today, but considers it an existential threat only in the “extreme” form in which agriculture will not survive.)
“To come to the conclusion that to do the most good in the world you need to work on artificial general intelligence is very strange.”
Timnit Gebru
The final mysterious feature of EA’s version of the long view is how its logic ends up at a specific list of distant technology-based threats to civilization that happen to align with many of the original EA cohort’s areas of research. “I’m an AI researcher,” says Gebru, “but to come to the conclusion that to do the most good in the world you need to work on artificial general intelligence is very strange. It’s like trying to justify the fact that you want to think about the sci-fi scenario and don’t want to think about real people, the real world, and current structural problems. You want to justify how you want to make billions of dollars on this while people are starving.”
Some EA leaders seem aware that criticism and change are key to growing the community and strengthening its impact. MacAskill and others have made explicit that their calculations are estimates (“These are our best guesses,” MacAskill offered in a 2020 podcast episode) and said they are eager to improve through critical discourse. Both GiveWell and CEA have pages on their websites titled “Our Mistakes,” and in June, CEA ran a contest inviting critiques on the EA forum; the Future Fund has launched prizes of up to $1.5 million for critical perspectives on AI.
“We recognize that the problems EA is trying to address are really, really big, and we don’t have a hope of solving them with just a small segment of people,” says Julia Wise, GiveWell board member and CEA’s community liaison, of EA’s diversity statistics. “We need the talent that lots of different kinds of people can bring to address these worldwide problems.” Wise also spoke on the topic at the 2020 EA Global conference, and she actively discusses inclusion and community power dynamics on the CEA forum. The Center for Effective Altruism supports a mentorship program for women and nonbinary people (founded, incidentally, by Carrick Flynn’s wife) that Wise says is expanding to other underrepresented groups in the community, and CEA has made an effort to hold conferences in more locations around the world to accommodate a more geographically diverse group. But these efforts appear to be limited in scope and impact; CEA’s public page on diversity and inclusion hasn’t even been updated since 2020. As longtermist tech utopianism takes the front seat on EA’s rocket and a few billionaire donors chart its course into the future, it may be too late to alter the movement’s DNA.
Politics and the future
Despite its science-fiction sheen, effective altruism today is a conservative project, consolidating decision-making behind a technocratic belief system and a small set of individuals, potentially at the expense of local and intersectional visions for the future. But EA’s community and accomplishments were built around clear methodologies that may not transfer to the more nuanced political arena that some EA leaders and a few big donors are pushing into. According to Wise, the broader community remains divided on politics as an avenue for pursuing EA’s goals, with some dissenters believing that politics is too polarized a space for effective change.
But EA is not the only charitable movement looking to political action to reshape the world; the philanthropic field in general has been moving into politics in pursuit of greater impact. “We have an existential political crisis that philanthropy has to deal with. Otherwise, a lot of its other goals will be hard to achieve,” says Inside Philanthropy’s Callahan, using a definition of “existential” that differs from MacAskill’s. But while EA can offer a clear rubric for determining how to give charitably, the political arena presents a messier challenge. “There’s no easy metric for gaining political power or changing policy,” he says. “And Sam Bankman-Fried has so far proven not to be the most effective political giver.”
Bankman-Fried has described his own political giving as “more policy than politics,” and he has given primarily to Democrats through his short-lived Protect Our Future PAC (which backed Carrick Flynn in Oregon) and the Guarding Against Pandemics PAC (which is run by his brother Gabe and publishes a cross-party list of its “champions” to support). Ryan Salame, co-CEO of FTX with Bankman-Fried, funded his own PAC, American Dream Federal Action, which focuses mainly on Republican candidates. (Bankman-Fried has said Salame shares his passion for preventing pandemics.) Guarding Against Pandemics and the Open Philanthropy Action Fund (Open Philanthropy’s political arm) spent more than $18 million to get an initiative on the California state ballot this fall to fund pandemic research and action through a new tax.