I’ve been in the AI space since ChatGPT first dropped. I’ve toyed around with plenty of language models, built random side projects (a couple from scratch), and spent hours digging into the math behind it all.
Neurons in these models are basically fancy transistors; they don’t “feel.” You’d need some kind of emotional processing unit and a full-blown consciousness stack for that feature.
Sure. But where’s the line? We saw how quickly corporations scaled LLMs up as big and as fast as they could. One real breakthrough in this direction is all it would take for these to suddenly become very serious questions.