EA - On Epistemics and Communities by JP Addison

The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund

Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: On Epistemics and Communities, published by JP Addison on December 16, 2022 on The Effective Altruism Forum.

This is a Draft Amnesty Day post. I wrote it in 2020 and ~haven't looked at it since. I'm posting it as-is.

An invisible project is one of our most important — let's try to reveal it

This community is about doing the most good. We have many conversations about how to do that. In the course of those conversations, we've slowly pushed forward a cultural understanding of "how do we form correct beliefs?" We call that understanding "epistemics."

I want to make a few, hopefully-useful observations around this general space. If you haven't read much about epistemics before, I hope it serves as an accessible introduction. If you're an old hand, I hope it communicates this frame I've found useful.

I. Why is this hard?

Most of the time, figuring out what's true is easy. When was this bridge built? Look it up on Wikipedia (more on that later). When it's not easy, it's often best to just use the tools someone else has already developed.

The situations where you really start needing to attack the problem are when:
- The tools you're used to are inadequate for the task at hand, or
- You and a collaborator disagree on how to figure out what's true.

Once that happens, I claim most people just get really confused. It's like, what the heck is going on? This makes no sense / my collaborator makes no sense. People give up, or can form really deep impasses with those around them. Even if you're fortunate enough to notice the issue for what it is, it can seem really hard to resolve. I think this is what happened when a lot of my friends and I were only half-convinced of AI risk.

How the fk are you supposed to weigh "we literally have an RCT here" versus "this other thing would be big if true"? Many people find the answer obvious, but unfortunately not in the same way. I hope that at some point in your life you've viewed it as a hard problem.

Then, just when you and your best friend have figured out how to weigh evidence between yourselves, there comes a whole lot of other people. Many new complications arise when this is done as a community:
- Not all the participants are able to get complete information, or evaluate all the arguments
- Some participants are probably smarter than others
- Some participants are probably acting adversarially, or with some level of own-view-favoring bias

II. All project-oriented communities do this

Maybe you've heard people talk recently about epistemics, and it's felt like a fuzzy concept. I hope that presenting the ways in which a bunch of different communities go about forming beliefs will make the concept more concrete.

Wikipedia

As we've already mentioned, Wikipedia has some outstanding epistemics. What's really useful for us here is that they've written it down. You can see the way it's tailored to Wikipedia's particular situation, and how it needs to be legible to outsiders and extremely resilient to adversarial action.

Science

You can observe humanity making a huge leap forward by improving its epistemics in one important domain. Our species went from being completely wrong about just about everything in the natural world to methodically making progress in our understanding. That progress has compounded over time to completely transform the world. It's interesting to note that it wasn't obvious what those epistemics should be at first. Are thought experiments valid scientific evidence? And in the soft sciences the epistemics are still controversial.

Some might claim that science has figured it all out. But what's the scientific way to predict who you should pick to lead your company? For most decisions that humans make, there is simply too little high-quality data.

Medicine

Perhaps even more so than science, medicine is extremely conservative in its epistemics. There are so many ac...
