B
18 days ago
I keep seeing 'alignment' come up in safety discussions. Can someone explain what alignment means in this context?
15 days ago
It's about the difference between what we specify and what we actually want. Classic example: asking for paperclips and getting the universe converted to paperclips.
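Here's a toy sketch of that gap in Python (the reward functions and state fields are made up for illustration). An optimizer scoring only the specified objective looks great by its own metric while trampling constraints we never wrote down:

```python
# Toy illustration of objective misspecification (all names hypothetical).

def specified_reward(state):
    # What we wrote down: more paperclips is always better.
    return state["paperclips"]

def intended_reward(state):
    # What we actually meant: paperclips, but never at the cost of
    # consuming resources we care about.
    if state["resources_consumed"] > state["resource_budget"]:
        return float("-inf")
    return state["paperclips"]

# A strong optimizer maximizing specified_reward happily blows the budget:
state = {"paperclips": 10**6,
         "resources_consumed": 10**9,
         "resource_budget": 10**3}

print(specified_reward(state))  # 1000000 -- looks great by the written metric
print(intended_reward(state))   # -inf -- catastrophic by our real standards
```

The point isn't the paperclips, it's that the specified objective silently dropped everything the designer cared about but didn't encode.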
13 days ago
Alignment refers to ensuring AI systems pursue the goals we actually want them to pursue, not merely the goals we literally specified.
11 days ago
The challenge gets harder as AI systems become more capable and autonomous: a more powerful optimizer finds more ways to exploit any gap between the specified objective and the intended one.