Bounties (closed) (49 posts)
Related tags: Information Cascades, Groupthink, Conformity Bias, Courage, Changing Your Mind, 2017-2019 AI Alignment Prize
Karma | Title | Author | Age | Comments
200 | The Importance of Saying "Oops" | Eliezer Yudkowsky | 15y | 31
173 | Evaporative Cooling of Group Beliefs | Eliezer Yudkowsky | 15y | 47
150 | Lonely Dissent | Eliezer Yudkowsky | 14y | 86
134 | Two Cult Koans | Eliezer Yudkowsky | 15y | 100
118 | Uncritical Supercriticality | Eliezer Yudkowsky | 15y | 174
117 | No Safe Defense, Not Even Science | Eliezer Yudkowsky | 14y | 67
89 | Matt Levine on "Fraud is no fun without friends." | Raemon | 1y | 24
79 | Asch's Conformity Experiment | Eliezer Yudkowsky | 14y | 65
75 | Right now, you're sitting on a REDONKULOUS opportunity to help solve AGI (and rake in $$$) | Trevor1 | 6mo | 13
74 | Triangle Opportunity | Alex Beyman | 2mo | 10
71 | Announcing the AI Alignment Prize | cousin_it | 5y | 78
63 | Information cascades | Johnicholas | 13y | 36
62 | So You've Changed Your Mind | Spurlock | 11y | 48
61 | Announcement: AI alignment prize round 4 winners | cousin_it | 3y | 41

Eliciting Latent Knowledge (ELK) (11 posts)
Karma | Title | Author | Age | Comments
218 | ARC's first technical report: Eliciting Latent Knowledge | paulfchristiano | 1y | 88
176 | Prizes for ELK proposals | paulfchristiano | 11mo | 156
139 | ELK prize results | paulfchristiano | 9mo | 50
136 | Mechanistic anomaly detection and ELK | paulfchristiano | 25d | 17
79 | Finding gliders in the game of life | paulfchristiano | 19d | 7
51 | ELK First Round Contest Winners | Mark Xu | 10mo | 6
46 | Implications of automated ontology identification | Alex Flint | 10mo | 29
38 | Counterexamples to some ELK proposals | paulfchristiano | 11mo | 10
31 | Eliciting Latent Knowledge Via Hypothetical Sensors | John_Maxwell | 11mo | 2
26 | Can you be Not Even Wrong in AI Alignment? | throwaway8238 | 9mo | 7
20 | Importance of foresight evaluations within ELK | Jonathan Uesato | 11mo | 1