Mike - This was an interesting read, but I think the focus on emotion is misplaced. Your article, like most studies of AI, looks into the effects on human users. The question of whether LLMs have emotions is secondary and mostly of speculative interest to people who design and intensively use AI. Not only are LLMs not sentient, independent, emotional beings, they are tools of persuasion -- like advertising billboards -- designed to evoke and manipulate human emotions. They sell products and ideologies -- mostly, user loyalty to the company itself. One needs to look at the purpose of the product creators to understand the effects these tools have on people. That close examination of the intentions of the AI industry is what's missing here and in most discussions of this subject. - Mark
You wrote: "Not only are LLMs not sentient, independent, emotional beings..." That's what I want Anthropic to say. Instead, they say AI doesn't experience emotions in the same way people do, and so on. It's clear from my constant scanning of posts and articles that a percentage of the public -- clearly a two-digit percentage -- believes AI has inner lives, thoughts, and emotions.
And of course I agree with you, but an examination of the intentions of the AI industry is a topic for another article. Like this one: https://www.computerworld.com/article/4036568/the-dark-side-of-ai-monetization.html