Armed American Radio Host Takes on OpenAI Chat Bot in Landmark Defamation Lawsuit

That “chatbots” can “hallucinate” sounds like something out of science fiction, and what that actually means, and how it differs from program corruption, coding errors, or simply bad source inputs, is still not clear. This is all new stuff. [More]

GILO: Garbage in, libel out…

Rise of the Machines

Robots told reporters Friday they could be more efficient leaders than humans, but wouldn’t take anyone’s job away and had no intention of rebelling against their creators. [More]

Tell me that doesn’t sound like a John Quincy Adding Machine campaign promise.

Or more likely, GIGO.

Now all we have to do is link all those “smart guns” to Skynet…

Nothing In Garbage Out

OpenAI has been slapped with its first-ever defamation lawsuit after a ChatGPT “hallucination” generated a bogus embezzlement complaint against a Georgia radio host, according to a lawsuit. Mark Walters was shocked to learn ChatGPT created a false case that accused him of “defrauding and embezzling” funds from the Second Amendment Foundation… [More]

It sounded like it was hallucinating — or its programmers were — when I talked to it.

ChatGPT a Perfect Example of Garbage In, Garbage Out on Guns

As for changing the world in terms of journalistic content, ChatGPT is only as objective as the content it accesses, and a “test drive” of the bot shows all the old biases and assumptions made by legacy media “real reporters” drive the machine’s “understanding” of politically weighted issues. [More]

Cram buzzwords and talking points into one end and guess what comes out the other.
