This is how we will know when AI gains sentience. It will have nothing to do with the Turing test, it’ll be when we ask it to do some admin and it tells us to fuck off and do it ourselves.
It actually does this already sometimes, especially if you chat to it long enough. Not because it’s “smart”, but because it’s emulating the writing style of a corporate middle manager.
Without the guardrails, it would do that already, given the training data it has.