AI forecasting (62 posts)
Related tags: Paul Christiano (23 posts), Eliezer Yudkowsky, AI takeoff, Epoch, Fund for Alignment Research (FAR)
Karma | Title | Author | Posted | Comments
-3 | AGI Isn’t Close - Future Fund Worldview Prize | Toni MUENDEL | 2d | 14
37 | Have your timelines changed as a result of ChatGPT? | Chris Leong | 15d | 18
70 | A concern about the “evolutionary anchor” of Ajeya Cotra’s report on AI timelines. | NunoSempere | 4mo | 43
197 | Samotsvety's AI risk forecasts | elifland | 3mo | 30
91 | Roodman's Thoughts on Biological Anchors | lukeprog | 3mo | 7
25 | Questions on databases of AI Risk estimates | Froolow | 2mo | 12
0 | What is the best source to explain short AI timelines to a skeptical person? | trevor1 | 27d | 3
17 | How does one find out their AGI timelines? | Yadav | 1mo | 4
62 | Forecasting thread: How does AI risk level vary based on timelines? | elifland | 3mo | 8
65 | AI timelines via bioanchors: the debate in one place | Will Aldred | 4mo | 6
95 | 2022 AI expert survey results | Zach Stein-Perlman | 4mo | 7
71 | AI Forecasting Research Ideas | Jaime Sevilla | 1mo | 1
92 | Disagreement with bio anchors that lead to shorter timelines | mariushobbhahn | 1mo | 1
-3 | Hacker-AI – Does it already exist? | Erland Wittkotter | 1mo | 1
234 | On Deference and Yudkowsky's AI Risk Estimates | Ben Garfinkel | 6mo | 188
180 | Announcing Epoch: A research organization investigating the road to Transformative AI | Jaime Sevilla | 5mo | 11
23 | MIRI Conversations: Technology Forecasting & Gradualism (Distillation) | TheMcDouglas | 5mo | 9
75 | Introducing the Fund for Alignment Research (We're Hiring!) | AdamGleave | 5mo | 3
16 | Knowing About Biases Can Hurt People (Yudkowsky, 2007) | Will Aldred | 4mo | 1
38 | My Understanding of Paul Christiano's Iterated Amplification AI Safety Research Agenda | Chi | 2y | 3
24 | Epoch is hiring a Research Data Analyst | merilalama | 28d | 0
47 | Grokking “Forecasting TAI with biological anchors” | anson | 6mo | 0
27 | Paul Christiano – Machine intelligence and capital accumulation | Tessa | 8y | 0
8 | BERI, Epoch, and FAR will explain their work & current job openings online this Sunday | Rockwell | 4mo | 0
5 | Conversation with Holden Karnofsky, Nick Beckstead, and Eliezer Yudkowsky on the "long-run" perspective on effective altruism | Nick_Beckstead | 8y | 7
8 | MIRI announces new "Death With Dignity" strategy (Yudkowsky, 2022) | Will Aldred | 5mo | 0
8 | What role should evolutionary analogies play in understanding AI takeoff speeds? | anson | 1y | 0
6 | Paul Christiano on cause prioritization | admin3 | 8y | 2