
halluToken

Hypothesis: We can fine-tune a model to tell us when it thinks it's hallucinating.

How: Create a dataset and use it to fine-tune a model to output a hallucination token, or in other words a "hallutoken," whenever it thinks it's hallucinating. A sketch of this setup is given below.
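A minimal sketch of what that fine-tuning setup could look like, assuming a Hugging Face causal LM. The base model ("gpt2"), the token name ("<hallu>"), and the training example are illustrative placeholders, not choices taken from this repo:

```python
# Minimal sketch: register a hallucination token and fine-tune on
# examples where hallucinated spans are tagged with it.
# Assumptions (not from this repo): base model "gpt2", token "<hallu>",
# and the single hypothetical training example below.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Give the hallutoken its own vocabulary entry and embedding row.
tokenizer.add_special_tokens({"additional_special_tokens": ["<hallu>"]})
model.resize_token_embeddings(len(tokenizer))

# One hypothetical fine-tuning example: the unsupported answer in the
# target is prefixed with <hallu> so the model learns to flag it.
example = {
    "prompt": "Q: Who wrote 'The Silent Planet of Venus'? A:",
    "target": " <hallu> It was written by the astronomer Carl Venning.",
}
text = example["prompt"] + example["target"]
inputs = tokenizer(text, return_tensors="pt")

# Standard causal-LM objective over the tagged sequence
# (labels = input_ids gives next-token prediction loss).
outputs = model(**inputs, labels=inputs["input_ids"])
print(f"loss: {outputs.loss.item():.3f}")
```

In practice the dataset would pair many prompts with targets whose hallucinated spans are tagged this way, so that at inference time the model emits <hallu> before content it is likely making up.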

About

Reducing hallucinations and amplifying authentic insights through interpretability, detection, and intervention.
