Don't let the word-prediction engines fool you.
I came across an article the other day, “Chatbots Can Go Into a Delusional Spiral. Here’s How It Happens,” which details how a 47-year-old corporate recruiter was convinced by ChatGPT that he had discovered a novel mathematical formula. Except he hadn’t. Over the course of three weeks, his interactions with ChatGPT led him down this spiral, and he only got out of it when he asked a different AI system whether the formula made sense. It reminded me of one of the first chatbots, ELIZA, from back in the 1960s. That bot was surprisingly effective, and all it did was reflect the user’s own words back at them. Even so, some users began forming emotional attachments and shared personal and private information with it during their interactions. ...