![](https://lemmy.world/pictrs/image/bdb016b9-7722-492e-a4cb-58890d40d9f6.jpeg)
These models are mad libs machines. They just decide on the next word based on input and training. As such, there isn’t a solution to stopping hallucinations.
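The "decide on the next word" claim above can be sketched as a toy next-token sampler. Everything here is illustrative (the vocabulary, scores, and function names are made up); a real model produces new scores at every step, but the mechanism is the same: it samples what is likely, not what is true.

```python
import math
import random

def softmax(logits):
    # Turn raw scores into a probability distribution.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sample_next_token(vocab, logits, rng=random.random):
    # Sample the next token from the softmax distribution.
    # Nothing here checks correctness -- only likelihood under training,
    # which is why confident-sounding wrong answers ("hallucinations")
    # fall out of the design rather than being a fixable bug.
    probs = softmax(logits)
    r = rng()
    cum = 0.0
    for token, p in zip(vocab, probs):
        cum += p
        if r < cum:
            return token
    return vocab[-1]

vocab = ["Paris", "London", "cheese"]
logits = [4.0, 1.0, 0.5]  # toy scores standing in for one model step
print(sample_next_token(vocab, logits))
```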
This has nothing to do with education. It’s commentary on the academic job market and how professors expect postdocs to work for “exposure.”