Let's start by level-setting: ChatGPT, Bard, Bing and the broader wave of Large Language Models (LLMs) are generally not considered to be 'thinking' or 'sentient'. The Eliza Effect describes the human tendency to attribute human characteristics to non-human entities, thus anthropomorphising machines, computers, and things. Inspired by a 1960s chatbot and more relevant today than ever, this effect serves as a warning about the nature of human-AI relationships.

The misalignment between a computer's external output and what we assume is happening internally isn't new. Indeed, the web is filled with examples of people in fascinating conversations with LLMs akin to friends, trusted advisors, and even virtual lovers. You might consider them to be glorified autocomplete systems, but if you've been chatting with them, you'll know that this doesn't do them justice. Instead, LLMs draw on enormous data sets (literature, online content, social media, etc.) and use a process known as deep learning, a combination of algorithms and statistical models, to generate text based on pattern recognition and context.

In 1966 MIT's Joseph Weizenbaum created Eliza, a chatbot that mimicked a therapist's questions. Eliza was programmed to rephrase user statements as questions and had a few trigger words that prompted blocks of questions: if you said 'my mother doesn't like me', Eliza would ask 'why do you think your mother doesn't like you?', and the word 'mother' would also prompt questions like 'tell me about your family life'. Eliza was a very basic program by today's standards, yet Weizenbaum was shocked at how users believed that the chatbot was listening to and cared about them, and how much they would reveal to it as a result (see Origins below for more).

LLMs are now developing at a rate reminiscent of Moore's Law and, on a connected but different path, the development of robots is also accelerating.
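To make Eliza's trick concrete, the two mechanisms described above — rephrasing 'my X' statements as questions and matching trigger words — can be sketched in a few lines of Python. This is a toy illustration only: Weizenbaum's 1966 program was considerably more elaborate, and the trigger words and pronoun swaps below are assumed examples, not his original script.

```python
import re

# Illustrative trigger words: each maps to a canned follow-up question.
TRIGGERS = {
    "mother": "Tell me about your family life.",
    "dream": "What does that dream suggest to you?",
}

# Swap pronouns so an echoed statement reads naturally as a question.
REFLECTIONS = {"me": "you", "my": "your", "i": "you", "am": "are"}

def reflect(text: str) -> str:
    return " ".join(REFLECTIONS.get(word, word) for word in text.split())

def respond(statement: str) -> str:
    text = statement.lower().strip().rstrip(".!?")
    # 1. Rephrase "my X ..." statements back as questions.
    match = re.match(r"my (.+)", text)
    if match:
        return f"Why do you think your {reflect(match.group(1))}?"
    # 2. Otherwise, a trigger word selects a canned question.
    for word, question in TRIGGERS.items():
        if word in text:
            return question
    # 3. Content-free fallback keeps the conversation going.
    return "Please, go on."
```

With these rules, `respond("My mother doesn't like me.")` reproduces the exchange quoted above — 'Why do you think your mother doesn't like you?' — with no understanding anywhere in the loop, which is exactly why users' emotional reactions so surprised Weizenbaum.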