
> How have you found it not to be significantly better for those purposes

Not even remotely.

> LLM output is no different

It is different.

A search result might take me to the wrong answer, but an LLM might just invent a nonsense answer.

This is a fundamentally different thing and is more difficult to detect imo.



> A search result might take me to the wrong answer but an LLM might just invent nonsense answers

> This is a fundamentally different thing and is more difficult to detect imo

99% of the time it's not. You validate, then correct or accept, like you would any other suggestion.



