Discussion about this post

Andreas F. Hoffmann:

I would argue that the obsessive preoccupation with alignment is currently a waste of time. This is probably also the conclusion the OpenAI leadership reached after realizing that scaling a steam engine (GPT-4o) won't lead to the magical emergence of a jet engine (AGI), and why they decided to get rid of most of their alignment team. I fear we have collectively looked at the wrong comparisons for AI development and therefore come to the wrong conclusions about the steps and timescale to AGI:

https://theafh.substack.com/p/what-viruses-can-teach-us-about-ai?r=42gt5
