this post was submitted on 24 Jun 2023
you are viewing a single comment's thread
Or, they've done it before and gotten away with it.
It's impressive how well ChatGPT hallucinates citations.
I was asking it about a field of law I happen to be quite familiar with (as a layman), and it invented entire sections of statutes that don't exist to support its conclusions.
Large Language Models like ChatGPT are in my view verisimilitude engines. Verisimilitude is the appearance of being true or real. You'll note, however, that it is not being true or real, simply appearing so.
It's trying to produce an answer that looks right. If it happens to know the actual answer, that's what it goes with; if it doesn't, it goes with what a correct answer would statistically look like. In fields with actual right and wrong answers, like law, science, and technology, that tendency to make things up is really harmful if the person using the tool doesn't know it will lie.
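To illustrate the "statistically plausible continuation" point, here's a toy sketch (nothing like ChatGPT's actual architecture, just a frequency table over made-up example data) showing how a model that only tracks what usually follows a phrase will happily emit a confident "citation" with no notion of whether it's true:

```python
from collections import Counter

# Hypothetical toy corpus: pairs of (context, next token) a model
# might have seen. The contexts and section numbers are invented
# purely for illustration.
corpus = [
    ("under Section", "230"),
    ("under Section", "230"),
    ("under Section", "512"),
]

def most_likely_next(context, pairs):
    """Return the continuation seen most often after `context`."""
    counts = Counter(nxt for ctx, nxt in pairs if ctx == context)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

# The "model" fills in whatever followed this context most often in
# training, whether or not that's the correct citation for the
# question actually being asked.
print(most_likely_next("under Section", corpus))  # prints "230"
```

A real LLM is vastly more sophisticated, but the failure mode is the same in spirit: it optimizes for what an answer typically looks like, not for whether the answer is real.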