No. The survivors of the bubble will have plenty to eat.
I am not talking about the bubble. I am talking about AI being a threat to humanity up there with nuclear wipeout.
Well, we’re still at least one breakthrough away from AGI, and we don’t even know how things will go from there. It could be that humans are already near the maximum of what is possible, intelligence-wise; as in, the smartest being possible is not that much smarter than the average human. In which case, AGI taking over the world would not be a given.
Essentially, talking about the threat posed by ASI is like talking about the threat posed by Cthulhu.
I hear you. But I would still be cautious by default.