

1·
3 days agoI’m not sure LLMs can do this. The reason is context poisoning. There would need to be an overseer system of some kind.
I’m not sure LLMs can do this. The reason is context poisoning. There would need to be an overseer system of some kind.
It’s not better than nothing - it’s worse than nothing. It is actively harmful, feeding psychosis, and your chat history will be sold at some point.
Try this, instead of asking “I am thinking xyz”, ask " my friend thinks xyz, and I believe it to be wrong". And marvel at how it will tell you the exact opposite.
Its probably talking about the UK stratospheric aerosol injection research. Like all conspiracy theories, just enough of a grain of truth.
If you have a drink that creates a nice tingling sensation in some people and make other people go crazy, the only sane thing to do is to take that drink off the market.