Hacker News

Hallucinations like this could be a great way to identify missing features or confusing parts of your framework. If the LLM invents it, maybe it ought to exist?


Sometimes that's the case, but frequently the thing doesn't exist because of more complex issues. Not every programming language is PHP or JavaScript :-)


I agree completely… Usually when I catch it hallucinating like this, it's inventing an API or syntax that is far clearer and more intuitive than the actual one.


Maybe it would be good for language design, possibly even language design that would be good for an LLM to read, thus reducing hallucinations.


Only if you wanna optimize exclusively for LLM users in this generation.


I imagine a future where we'll bind a fine-tuned tech-support model to each project and let the general purpose models consult tech support rather than winging it themselves. In that world you'd only have to optimize for whichever one you've chosen.

It'll be like a slack support channel, for robots.
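A minimal sketch of what that might look like, as a tool-calling pattern: the general model defers framework-specific questions to a project-bound support model instead of guessing. Everything here (`ask_support`, `SUPPORT_KB`, the routing heuristic) is hypothetical, with stubs standing in for the actual models.

```python
# Hypothetical "tech support model as a tool" pattern. Both models are
# stubbed out; in practice each would be an API call to a real model.

# Stand-in knowledge base for a fine-tuned, project-specific support model.
SUPPORT_KB = {
    "how do i register a route": (
        "Use app.add_route(path, handler); decorators are not supported."
    ),
}

def ask_support(question: str) -> str:
    """Stub for the project-bound tech-support model."""
    key = question.lower().strip("?! ")
    # Crucially, the support model refuses rather than inventing an API.
    return SUPPORT_KB.get(key, "No documented answer; do not invent an API.")

def general_model(prompt: str) -> str:
    """Stub for a general-purpose model that consults support when unsure."""
    if "route" in prompt.lower():  # crude routing heuristic for the sketch
        return ask_support(prompt)
    return "General answer (no framework specifics needed)."

print(general_model("How do I register a route?"))
```

The point of the design is the fallback: when the support model has no documented answer, it says so explicitly, so the general model relays "this doesn't exist" instead of winging it.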


I like your thinking :)




