ChatGPT has ‘systematic political bias’ towards the left: study [More]
That and hallucinations…
[Via Michael G]

That “chatbots” can “hallucinate” sounds like something out of science fiction, and what that actually means, and how it differs from program corruption, coding errors, or simply bad source inputs, is still not clear: This is all new stuff. [More]
GILO: Garbage in, libel out…
Robots told reporters Friday they could be more efficient leaders than humans, but wouldn’t take anyone’s job away and had no intention of rebelling against their creators. [More]
Tell me that doesn’t sound like a John Quincy Adding Machine campaign promise.
Or more likely, GIGO…
Now all we have to do is link all those “smart guns” to Skynet…
OpenAI has been slapped with its first-ever defamation lawsuit after a ChatGPT “hallucination” generated a bogus embezzlement complaint against a Georgia radio host. Mark Walters was shocked to learn ChatGPT created a false case that accused him of “defrauding and embezzling” funds from the Second Amendment Foundation… [More]
It sounded like it was hallucinating — or its programmers were — when I talked to it…
Techno-Hell: Using AI to ‘Identify People at Risk of Mental Illness’? [More]
What? They act like computers can have discriminatory biases or something…
[Via Michael G]

As for changing the world in terms of journalistic content, ChatGPT is only as objective as the content it accesses, and a “test drive” of the bot shows all the old biases and assumptions made by legacy media “real reporters” drive the machine’s “understanding” of politically weighted issues. [More]
Cram buzzwords and talking points into one end and guess what comes out the other.