EA - SBF, extreme risk-taking, expected value, and effective altruism by vipulnaik

The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund

Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: SBF, extreme risk-taking, expected value, and effective altruism, published by vipulnaik on November 13, 2022 on The Effective Altruism Forum.

NOTE: I have some indirect associations with SBF and his companies, though probably less so than many of the others who've been posting and commenting on the forum. I don't expect anything I write here to meaningfully affect how things play out in the future for me, so I don't think this creates a conflict of interest, but feel free to discount what I say.

NOTE 2: I'm publishing this post without having spent the level of effort polishing and refining it that I normally try to spend. This is due to the time-sensitive nature of the subject matter and because I expect to get more value from being corrected in the comments on the post than from refining the post myself. If errors are pointed out, I will try to correct them, but may not always be able to do so in a timely manner, so if you're reading the post, please also check the comments for flaws identified by commenters.

The collapse of Sam Bankman-Fried (SBF) and his companies FTX and Alameda Research is the topic du jour on the Effective Altruism Forum, and there have been several posts on the Forum discussing what happened and what we can learn from it. The post FTX FAQ provides a good summary of what we know as of the time I'm writing this post. I'm also funding work on a timeline of the FTX collapse (still a work in progress, but with enough coverage already to be useful if you are starting with very little knowledge).

Based on the information so far, fraud and deception on the part of SBF (and/or others at FTX and/or Alameda Research) likely happened and were likely key to the way things played out and to the extent of the damage caused. The trigger seems to be the big loan that FTX provided to Alameda Research to bail it out, using customer funds for the purpose. If FTX hadn't bailed out Alameda, it's quite likely that the spectacular death of FTX we saw (with depositors losing all their money as well) wouldn't have happened. But it's also plausible that without the loan, the situation at Alameda Research was dire enough that Alameda Research, and then FTX, would have died anyway due to the lack of funds. Hopefully that would have been a more graceful death with less pain to depositors, which is a very important difference. Nonetheless, I suspect that by the time of the bailout, we were already at a kind of endgame.

In this post, I try to step back a bit from the endgame, and even get away from the specifics of FTX and Alameda Research (which I know very little about), and in fact even from the specifics of SBF's business practices (where, again, I know very little). Rather, I talk about SBF's overall philosophy around risk and expected value, as he has articulated it himself and as it has been approvingly amplified by several EA websites and groups. I think this philosophy was key to the overall way things played out. I also discuss the relationship between the philosophy and the ideas of effective altruism, both in the abstract and as specifically championed by many leaders in effective altruism (including the team at 80,000 Hours).
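To make the tension between naive expected value maximization and ruin risk concrete, here is a toy simulation. The double-or-nothing framing and the numbers are purely illustrative assumptions of mine and have nothing to do with FTX's or Alameda's actual positions: an agent repeatedly stakes its entire bankroll on a bet with positive expected value but a real chance of going to zero.

    import random

    def simulate(n_bets=10, n_trials=200_000, win_prob=0.51, payoff_multiple=2.0):
        """Repeatedly stake the entire bankroll on a positive expected-value,
        double-or-nothing style gamble.

        Each bet: with probability `win_prob` the bankroll is multiplied by
        `payoff_multiple`; otherwise it drops to zero and the run ends.
        With the default parameters, the per-bet expected multiplier is
        0.51 * 2.0 = 1.02 > 1, so expected wealth grows with every bet,
        yet almost every individual trajectory ends at zero.
        """
        final_wealth = []
        for _ in range(n_trials):
            wealth = 1.0
            for _ in range(n_bets):
                if random.random() < win_prob:
                    wealth *= payoff_multiple
                else:
                    wealth = 0.0
                    break
            final_wealth.append(wealth)

        mean_wealth = sum(final_wealth) / n_trials
        survival_rate = sum(w > 0 for w in final_wealth) / n_trials
        print(f"Mean final wealth (sample expected value): {mean_wealth:.3f}")
        print(f"Fraction of runs not ruined:               {survival_rate:.5f}")

    if __name__ == "__main__":
        simulate()

With these assumed numbers, the sample mean of final wealth stays above the starting stake (the true expected value is 1.02^10, about 1.22), yet roughly 99.9% of runs end in ruin. A Kelly-style bettor, who maximizes expected log wealth instead, would never stake the full bankroll on a single bet of this kind.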
My goal is to encourage people to reassess the philosophy and make appropriate updates. I make two claims:

Claim 1: SBF engages in extreme risk-taking that is a crude approximation to the idea of expected value maximization as perceived by him.

Claim 2: At least part of the motivation for SBF's risk-taking comes from ideas in effective altruism, and in particular specific points made by EA leaders, including people affiliated with 80,000 Hours. While personality probably accounts for a lot of SBF's decisions, the role of EA ideas as a catalyst cannot be dismissed based on the evidence.

Here are a few things I am not claiming (some of these are discussed ...
