Optimization (41 posts)
Related tags: Adaptation Executors, Selection vs Control
General Intelligence (64 posts)
Related tags: AI Services (CAIS), Superstimuli, Narrow AI, Hope, Delegation
Karma · Title · Author · Posted · Comments
217 · The ground of optimization · Alex Flint · 2y · 74
183 · Utility Maximization = Description Length Minimization · johnswentworth · 1y · 40
139 · Selection vs Control · abramdemski · 3y · 25
103 · What's General-Purpose Search, And Why Might We Expect To See It In Trained ML Systems? · johnswentworth · 4mo · 15
98 · Optimization Amplifies · Scott Garrabrant · 4y · 12
91 · The Optimizer's Curse and How to Beat It · lukeprog · 11y · 82
79 · Bottle Caps Aren't Optimisers · DanielFilan · 4y · 21
77 · "Normal" is the equilibrium state of past optimization processes · Alex_Altair · 1mo · 5
65 · Measuring Optimization Power · Eliezer Yudkowsky · 14y · 35
59 · Fake Optimization Criteria · Eliezer Yudkowsky · 15y · 21
57 · Vingean Agency · abramdemski · 3mo · 13
52 · Humans aren't fitness maximizers · So8res · 2mo · 45
52 · Aligning a toy model of optimization · paulfchristiano · 3y · 26
47 · The Evolutionary-Cognitive Boundary · Eliezer Yudkowsky · 13y · 29
Karma · Title · Author · Posted · Comments
170 · Is Clickbait Destroying Our General Intelligence? · Eliezer Yudkowsky · 4y · 60
119 · Adaptation-Executers, not Fitness-Maximizers · Eliezer Yudkowsky · 15y · 33
118 · Reframing Superintelligence: Comprehensive AI Services as General Intelligence · Rohin Shah · 3y · 75
117 · Just Lose Hope Already · Eliezer Yudkowsky · 15y · 78
87 · Two explanations for variation in human abilities · Matthew Barnett · 3y · 28
85 · Two Neglected Problems in Human-AI Safety · Wei_Dai · 4y · 24
78 · How special are human brains among animal brains? · zhukeepa · 2y · 38
76 · The Octopus, the Dolphin and Us: a Great Filter tale · Stuart_Armstrong · 8y · 236
76 · Comments on CAIS · Richard_Ngo · 3y · 14
73 · AGI and Friendly AI in the dominant AI textbook · lukeprog · 11y · 27
72 · Lotteries: A Waste of Hope · Eliezer Yudkowsky · 15y · 74
68 · Artificial Addition · Eliezer Yudkowsky · 15y · 129
67 · Belief in Intelligence · Eliezer Yudkowsky · 14y · 37
66 · The Power of Intelligence · Eliezer Yudkowsky · 15y · 4