AI forecasting — 62 posts
Related tags: Paul Christiano · Eliezer Yudkowsky · AI takeoff · Epoch · Fund for Alignment Research (FAR)
Posts (karma · title · author · posted · comments):

-3 · AGI Isn't Close - Future Fund Worldview Prize · Toni MUENDEL · 2d · 14
23 · Have your timelines changed as a result of ChatGPT? · Chris Leong · 15d · 18
80 · A concern about the "evolutionary anchor" of Ajeya Cotra's report on AI timelines · NunoSempere · 4mo · 43
129 · Samotsvety's AI risk forecasts · elifland · 3mo · 30
53 · Roodman's Thoughts on Biological Anchors · lukeprog · 3mo · 7
23 · Questions on databases of AI Risk estimates · Froolow · 2mo · 12
4 · What is the best source to explain short AI timelines to a skeptical person? · trevor1 · 27d · 3
21 · How does one find out their AGI timelines? · Yadav · 1mo · 4
32 · Forecasting thread: How does AI risk level vary based on timelines? · elifland · 3mo · 8
101 · AI timelines via bioanchors: the debate in one place · Will Aldred · 4mo · 6
79 · 2022 AI expert survey results · Zach Stein-Perlman · 4mo · 7
67 · AI Forecasting Research Ideas · Jaime Sevilla · 1mo · 1
64 · Disagreement with bio anchors that lead to shorter timelines · mariushobbhahn · 1mo · 1
3 · Hacker-AI – Does it already exist? · Erland Wittkotter · 1mo · 1
280 · On Deference and Yudkowsky's AI Risk Estimates · Ben Garfinkel · 6mo · 188
184 · Announcing Epoch: A research organization investigating the road to Transformative AI · Jaime Sevilla · 5mo · 11
31 · MIRI Conversations: Technology Forecasting & Gradualism (Distillation) · TheMcDouglas · 5mo · 9
73 · Introducing the Fund for Alignment Research (We're Hiring!) · AdamGleave · 5mo · 3
36 · Knowing About Biases Can Hurt People (Yudkowsky, 2007) · Will Aldred · 4mo · 1
30 · My Understanding of Paul Christiano's Iterated Amplification AI Safety Research Agenda · Chi · 2y · 3
18 · Epoch is hiring a Research Data Analyst · merilalama · 28d · 0
39 · Grokking "Forecasting TAI with biological anchors" · anson · 6mo · 0
15 · Paul Christiano – Machine intelligence and capital accumulation · Tessa · 8y · 0
6 · BERI, Epoch, and FAR will explain their work & current job openings online this Sunday · Rockwell · 4mo · 0
3 · Conversation with Holden Karnofsky, Nick Beckstead, and Eliezer Yudkowsky on the "long-run" perspective on effective altruism · Nick_Beckstead · 8y · 7
54 · MIRI announces new "Death With Dignity" strategy (Yudkowsky, 2022) · Will Aldred · 5mo · 0
16 · What role should evolutionary analogies play in understanding AI takeoff speeds? · anson · 1y · 0
4 · Paul Christiano on cause prioritization · admin3 · 8y · 2