“The Best Argument is not a Simple English Yud Essay” by Jonathan Bostock
EA Forum Podcast (All audio) - A podcast by the EA Forum Team
This is a link post. I was encouraged to post this here, but I don't yet have enough EA Forum karma to crosspost directly!

Epistemic status: these are my own opinions on AI risk communication, based primarily on my own instincts on the subject and on discussions with people less involved in rationality than I am. Communication is highly subjective, and I have not rigorously A/B tested messaging. I am even less confident in the quality of my responses than in the correctness of my critique. If these thoughts turn out to be true, they can probably be applied to all sorts of communication beyond AI risk.

Lots of work has gone into trying to explain AI risk to laypersons. Overall, I think it's been great, but there's a particular trap that I've seen people fall into a few times. I'd summarize it as simplifying and shortening the text of an argument [...]

---

Outline:
(01:22) Failure to Adapt Concepts
(03:48) Failure to Filter Information
(05:17) Failure to Sound Like a Human Being
(07:30) Summary

The original text contained 4 images, which were described by AI.

---

First published: September 19th, 2024

Source: https://forum.effectivealtruism.org/posts/ukE5ZxDGmFyCev9Lb/the-best-argument-is-not-a-simple-english-yud-essay

---

Narrated by TYPE III AUDIO.