
What is an optimum degree of LLM hallucination? - Marginal REVOLUTION

  • Article
  • Jan 9, 2023
  • #ArtificialIntelligence
Tyler Cowen
@TylerCowen
(Author)
marginalrevolution.com

Ideally you could adjust a dial and set the degree of hallucination in advance. For fact-checking you would choose zero hallucination; for poetry composition, life advice, and inspiration you might want more hallucination, to varying degrees of course. After all, you don’t choose friends with zero hallucination, do you? And you do read fiction, don’t you?

(Do note that you can ask the current version for references and follow-up — GPT is hardly as epistemically crippled as some people allege.)

In the meantime, I do not want an LLM with less hallucination. The hallucinations are part of what I learn from. I learn what the world would look like, if it were most in tune with the statistical model provided by text. That to me is intrinsically interesting. Does the matrix algebra version of the world not interest you as well?

The hallucinations also give me ideas and show me alternative pathways. “What if…?” They are a form of creativity. Many of these hallucinations are simple factual errors, but many others have embedded in them alternative models of the world. Interesting models of the world. Ideas and inspirations. I feel I know what question to ask or which task to initiate.

Mentions
Alex Tabarrok @ATabarrok · Jan 9, 2023
  • Post
  • From Twitter
“Models too are hallucinations.” Great post by @tylercowen on using GPTs.